Compare commits

..

36 commits

Author SHA1 Message Date
Simon Willison
785f2ad0bd Apply database-level allow blocks to view-query action, refs #2510
When a database has an "allow" block in the configuration, it should
apply to all queries in that database, not just tables and the database
itself. This fix ensures that queries respect database-level access
controls.

This fixes the test_padlocks_on_database_page test which expects
plugin-defined queries (from_async_hook, from_hook) to show padlock
indicators when the database has restricted access.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 13:44:24 -07:00
Simon Willison
16b2729847 Fix #2509: Settings-based deny rules now override root user privileges
The root user's permission_resources_sql hook was returning early with a
blanket "allow all" rule, preventing settings-based deny rules from being
considered. This caused /-/allowed and /-/rules endpoints to incorrectly
show resources that were denied via settings.

Changed permission_resources_sql to append root permissions to the rules
list instead of returning early, allowing config-based deny rules to be
evaluated. The SQL cascading logic correctly applies: deny rules at the
same depth beat allow rules, so database-level denies override root's
global-level allow.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 11:10:58 -07:00
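The cascading rule this commit relies on (a deny at the same depth beats an allow) can be sketched with a plain sqlite3 session. The `rules` table and its columns below are illustrative stand-ins, not Datasette's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rules (actor_id TEXT, depth INTEGER, allow INTEGER)")
# Root gets a global-level allow (depth 0), but settings add a
# database-level deny (depth 1) for the same actor
conn.executemany(
    "INSERT INTO rules VALUES (?, ?, ?)",
    [("root", 0, 1), ("root", 1, 0)],
)
# Most specific rule wins; on a tie, deny (allow=0) sorts before allow (allow=1)
row = conn.execute(
    """
    SELECT allow FROM rules
    WHERE actor_id = :actor_id
    ORDER BY depth DESC, allow ASC
    LIMIT 1
    """,
    {"actor_id": "root"},
).fetchone()
print("allowed" if row and row[0] else "denied")  # denied
```

The database-level deny (depth 1) outranks root's global allow (depth 0), which is the behavior the commit restores.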
Simon Willison
d1ea067fde Migrate homepage to use bulk allowed_resources() and fix NULL handling in SQL JOINs
- Updated IndexView in datasette/views/index.py to fetch all allowed databases and tables
  in bulk upfront using allowed_resources() instead of calling check_visibility() for each
  database, table, and view individually
- Fixed SQL bug in build_allowed_resources_sql() where USING (parent, child) clauses failed
  for database resources because NULL = NULL evaluates to NULL in SQL, not TRUE
- Changed all INNER JOINs to use explicit ON conditions with NULL-safe comparisons:
  ON b.parent = x.parent AND (b.child = x.child OR (b.child IS NULL AND x.child IS NULL))
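The NULL quirk behind this fix is easy to reproduce directly with sqlite3; the throwaway tables `b` and `x` below stand in for the real catalog tables.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# NULL = NULL evaluates to NULL (falsy), so a plain equality join
# silently drops rows where child is NULL (database-level resources)
naive = conn.execute("SELECT CASE WHEN NULL = NULL THEN 1 ELSE 0 END").fetchone()[0]

conn.execute("CREATE TABLE b (parent TEXT, child TEXT)")
conn.execute("CREATE TABLE x (parent TEXT, child TEXT)")
conn.execute("INSERT INTO b VALUES ('fixtures', NULL)")
conn.execute("INSERT INTO x VALUES ('fixtures', NULL)")

dropped = conn.execute(
    "SELECT COUNT(*) FROM b JOIN x ON b.parent = x.parent AND b.child = x.child"
).fetchone()[0]

# NULL-safe comparison keeps the database-level row
kept = conn.execute(
    """
    SELECT COUNT(*) FROM b JOIN x
    ON b.parent = x.parent
    AND (b.child = x.child OR (b.child IS NULL AND x.child IS NULL))
    """
).fetchone()[0]

print(naive, dropped, kept)  # 0 0 1
```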

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 10:03:40 -07:00
Simon Willison
3adddad6aa Add parent filter and include_is_private to allowed_resources()
Major improvements to the allowed_resources() API:

1. **parent filter**: Filter results to specific database in SQL, not Python
   - Avoids loading thousands of tables into Python memory
   - Filtering happens efficiently in SQLite

2. **include_is_private flag**: Detect private resources in single SQL query
   - Compares actor permissions vs anonymous permissions in SQL
   - LEFT JOIN between actor_allowed and anon_allowed CTEs
   - Returns is_private column: 1 if anonymous blocked, 0 otherwise
   - No individual check_visibility() calls needed

3. **Resource.private property**: Safe access with clear error messages
   - Raises AttributeError if accessed without include_is_private=True
   - Prevents accidental misuse of the property

4. **Database view optimization**: Use new API to eliminate redundant checks
   - Single bulk query replaces N individual permission checks
   - Private flag computed in SQL, not via check_visibility() calls
   - Views filtered from allowed_dict instead of checking db.view_names()

All permission filtering now happens in SQLite where it belongs, with
minimal data transferred to Python.
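The is_private computation described in point 2 can be approximated like this; `actor_allowed` and `anon_allowed` here are hand-filled stand-ins for the real CTEs, which are built from the registered permission rules.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# LEFT JOIN the actor's allowed set against the anonymous allowed set;
# a missing anonymous row means the resource is private
rows = conn.execute(
    """
    WITH actor_allowed(parent, child) AS (
        VALUES ('fixtures', 'public_table'), ('fixtures', 'secret_table')
    ),
    anon_allowed(parent, child) AS (
        VALUES ('fixtures', 'public_table')
    )
    SELECT a.parent, a.child,
           CASE WHEN n.parent IS NULL THEN 1 ELSE 0 END AS is_private
    FROM actor_allowed a
    LEFT JOIN anon_allowed n
        ON a.parent = n.parent AND a.child = n.child
    ORDER BY a.child
    """
).fetchall()
print(rows)  # [('fixtures', 'public_table', 0), ('fixtures', 'secret_table', 1)]
```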

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 09:30:37 -07:00
Simon Willison
1134b22a27 Migrate /database view to use bulk allowed_resources()
Replace one-by-one permission checks with bulk allowed_resources() call:
- DatabaseView and QueryView now fetch all allowed tables once
- Filter views and tables using pre-fetched allowed_table_set
- Update TableResource.resources_sql() to include views from catalog_views

This improves performance by replacing O(n) individual permission checks (one
per table/view, where n is the number of tables in the database) with a single
bulk query.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 00:28:16 -07:00
Simon Willison
11c039d35e Error on startup if invalid setting types 2025-10-24 00:14:28 -07:00
Simon Willison
3e6e8ee047 Simplify types in datasette/permissions.py 2025-10-23 23:54:04 -07:00
Simon Willison
e4f5b5c30f Fix schema mismatch in empty result query
When no permission rules exist, the query was returning 2 columns (parent, child)
but the function contract specifies 3 columns (parent, child, reason). This could
cause schema mismatches in consuming code.

Added 'NULL AS reason' to match the documented 3-column schema.

Added regression test that verifies the schema has 3 columns even when no
permission rules are returned. The test fails without the fix (showing only
2 columns) and passes with it.
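The regression comes down to how many columns an empty-result query declares; `cursor.description` makes the difference visible. The query text below is illustrative, not the actual implementation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Two columns only: consuming code expecting (parent, child, reason) breaks
two_col = conn.execute("SELECT NULL AS parent, NULL AS child WHERE 0")
print(len(two_col.description))  # 2

# Adding NULL AS reason preserves the documented 3-column schema
# even when zero rows are returned
three_col = conn.execute(
    "SELECT NULL AS parent, NULL AS child, NULL AS reason WHERE 0"
)
print(len(three_col.description))  # 3
print(three_col.fetchall())  # []
```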

Thanks to @asg017 for catching this
2025-10-23 21:48:12 -07:00
Simon Willison
bd5e969c8b Address PR #2515 review comments
- Add URL to sqlite-permissions-poc in module docstring
- Replace Optional with | None for modern Python syntax
- Add Datasette type annotations
- Add SQL comment explaining cascading permission logic
- Refactor duplicated plugin result processing into helper function
2025-10-23 16:08:56 -07:00
Simon Willison
e71c083700 Ran blacken-docs 2025-10-23 15:53:49 -07:00
Simon Willison
e5316215aa Ran cog 2025-10-23 15:50:26 -07:00
Simon Willison
092ada7b7d Removed unnecessary isinstance(candidate, PermissionSQL) 2025-10-23 15:48:48 -07:00
Simon Willison
5919de0384 Remove unused methods from Resource base class 2025-10-23 15:48:31 -07:00
Simon Willison
4880102b5d Ran latest prettier 2025-10-23 15:36:16 -07:00
Simon Willison
4b50cc7bc1 Use allowed_resources_sql() with CTE for table filtering 2025-10-23 15:34:36 -07:00
Simon Willison
c4f0365130 Rewrite tables endpoint to use SQL LIKE instead of Python regex 2025-10-23 15:29:51 -07:00
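Pushing the substring filter into SQL might look something like this; the catalog table name, the helper, and the escaping choices here are illustrative assumptions, not the code from this commit.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE catalog_tables (database_name TEXT, table_name TEXT)")
conn.executemany(
    "INSERT INTO catalog_tables VALUES (?, ?)",
    [("fixtures", "facetable"), ("fixtures", "searchable"), ("other", "misc")],
)

def search_tables(conn, q):
    # Escape LIKE wildcards in user input, then match as a substring in SQL
    escaped = q.replace("\\", "\\\\").replace("%", "\\%").replace("_", "\\_")
    return conn.execute(
        """
        SELECT database_name || '/' || table_name FROM catalog_tables
        WHERE table_name LIKE '%' || :q || '%' ESCAPE '\\'
        ORDER BY 1
        """,
        {"q": escaped},
    ).fetchall()

print(search_tables(conn, "able"))  # [('fixtures/facetable',), ('fixtures/searchable',)]
```

Escaping `%` and `_` keeps user input from acting as wildcards, which a Python regex approach would have handled with `re.escape` instead.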
Simon Willison
19a37303c7 Fix /-/tables endpoint: add .json support and correct response format 2025-10-23 15:28:37 -07:00
Simon Willison
8de5b9431c Fix test_tables_endpoint_config_database_allow by using unique database names 2025-10-23 15:26:14 -07:00
Simon Willison
275c06fbe4 Add register_actions hook to test plugin and improve test 2025-10-23 15:24:10 -07:00
Simon Willison
d4dd08933e Fix test_navigation_menu_links by enabling root_enabled for root actor 2025-10-23 15:20:16 -07:00
Simon Willison
28a69d19a2 permission_allowed_default_allow_sql 2025-10-23 15:17:29 -07:00
Simon Willison
8bb07f80b1 Applied Black 2025-10-23 15:08:34 -07:00
Simon Willison
4d93149c2b Fix permission endpoint tests by resolving method signature conflicts
- Renamed internal allowed_resources_sql() to _build_permission_rules_sql()
  to avoid conflict with public method
- Made public allowed_resources_sql() keyword-only to prevent argument order bugs
- Fixed PermissionRulesView to use _build_permission_rules_sql() which returns
  full permission rules (with allow/deny) instead of filtered resources
- Fixed _build_permission_rules_sql() to pass actor dict to build_rules_union()
- Added actor_id extraction in AllowedResourcesView
- Added root_enabled=True to test fixture to grant permissions-debug to root user

All 51 tests in test_permission_endpoints.py now pass.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-23 14:53:07 -07:00
Simon Willison
475f817c5a Fixed some more tests 2025-10-23 14:43:51 -07:00
Simon Willison
2039e238d9 Fix permission_allowed_sql_bridge to not apply defaults, closes #2526
The bridge was incorrectly using the new allowed() method which applies
default allow rules. This caused actors without restrictions to get True
instead of USE_DEFAULT, breaking backward compatibility.

Fixed by:
- Removing the code that converted to resource objects and called allowed()
- Bridge now ONLY checks config-based rules via _config_permission_rules()
- Returns None when no config rules exist, allowing Permission.default to apply
- This maintains backward compatibility with the permission_allowed() API

All 177 permission tests now pass, including test_actor_restricted_permissions
and test_permissions_checked which were previously failing.
2025-10-23 14:34:48 -07:00
Simon Willison
e42b040055 Mark test_permissions_checked database download test as xfail, refs #2526
The test expects ensure_permissions() to check all three permissions
(view-database-download, view-database, view-instance) but the current
implementation short-circuits after the first successful check.

Created issue #2526 to track the investigation of the expected behavior.
2025-10-23 14:23:46 -07:00
Simon Willison
f4245dce66 Eliminate duplicate config checking by removing old permission_allowed hooks
- Removed permission_allowed_default() hook (checked config twice)
- Removed _resolve_config_view_permissions() and _resolve_config_permissions_blocks() helpers
- Added permission_allowed_sql_bridge() to bridge old permission_allowed() API to new SQL system
- Moved default_allow_sql setting check into permission_resources_sql()
- Made root-level allow blocks apply to all view-* actions (view-database, view-table, view-query)
- Added add_row_allow_block() helper for allow blocks that should deny when no match

This resolves the duplicate checking issue where config blocks were evaluated twice:
once in permission_allowed hooks and once in permission_resources_sql hooks.

Note: One test still failing (test_permissions_checked for database download) - needs investigation
2025-10-23 13:53:01 -07:00
Simon Willison
faef51ad05 Document datasette.allowed(), PermissionSQL class, and SQL parameters
- Added documentation for datasette.allowed() method with keyword-only arguments
- Added comprehensive PermissionSQL class documentation with examples
- Documented the three SQL parameters available: :actor, :actor_id, :action
- Included examples of using json_extract() to access actor fields
- Explained permission resolution rules (specificity, deny over allow, implicit deny)
- Fixed RST formatting warnings (escaped asterisk, fixed underline length)
2025-10-23 12:42:10 -07:00
Simon Willison
5fc58c8775 New --root mechanism with datasette.root_enabled, closes #2521 2025-10-23 12:40:50 -07:00
Simon Willison
1d37d30c2a Ensure :actor, :actor_id and :action are all available to permissions SQL, closes #2520
- Updated build_rules_union() to accept actor as dict and provide :actor (JSON) and :actor_id
- Updated resolve_permissions_from_catalog() and resolve_permissions_with_candidates() to accept actor dict
- :actor is now the full actor dict as JSON (use json_extract() to access fields)
- :actor_id is the actor's id field for simple comparisons
- :action continues to be available as before
- Updated all call sites and tests to use new parameter format
- Added test demonstrating all three parameters working together
2025-10-23 09:48:55 -07:00
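The three named parameters can be exercised directly with sqlite3. This mimics how a permission SQL fragment might consume them; the SELECT itself is a demonstration, not a real rule.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
actor = {"id": "simon", "roles": ["staff"]}

# :actor is the full actor dict as JSON, :actor_id its id field,
# :action is the action name being checked
row = conn.execute(
    """
    SELECT
        :actor_id AS actor_id,
        json_extract(:actor, '$.roles[0]') AS first_role,
        :action AS action
    """,
    {
        "actor": json.dumps(actor),
        "actor_id": actor["id"],
        "action": "view-table",
    },
).fetchone()
print(row)  # ('simon', 'staff', 'view-table')
```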
Simon Willison
5ed57607e5 PluginSQL renamed to PermissionSQL, closes #2524 2025-10-23 09:34:19 -07:00
Simon Willison
cf887e0277 ds.allowed() is now keyword-argument only, closes #2519 2025-10-23 09:25:33 -07:00
Simon Willison
7dfd14bb07 Update allowed_resources_sql() and refactor allowed_resources() 2025-10-20 16:26:54 -07:00
Simon Willison
3663b9df2d Moved Resource defaults to datasette/resources.py 2025-10-20 16:23:14 -07:00
Simon Willison
9e5c64c3de Ran prettier 2025-10-20 16:03:22 -07:00
Simon Willison
7db754c284 Implement resource-based permission system with SQL-driven access control
This introduces a new hierarchical permission system that uses SQL queries
for efficient permission checking across resources. The system replaces the
older permission_allowed() pattern with a more flexible resource-based
approach.

Core changes:

- New Resource ABC and Action dataclass in datasette/permissions.py
  * Resources represent hierarchical entities (instance, database, table)
  * Each resource type implements resources_sql() to list all instances
  * Actions define operations on resources with cascading rules

- New plugin hook: register_actions(datasette)
  * Plugins register actions with their associated resource types
  * Replaces register_permissions() and register_resource_types()
  * See docs/plugin_hooks.rst for full documentation

- Three new Datasette methods for permission checks:
  * allowed_resources(action, actor) - returns list[Resource]
  * allowed_resources_with_reasons(action, actor) - for debugging
  * allowed(action, resource, actor) - checks single resource
  * All use SQL for filtering, never Python iteration

- New /-/tables endpoint (TablesView)
  * Returns JSON list of tables user can view
  * Supports ?q= parameter for regex filtering
  * Format: {"matches": [{"name": "db/table", "url": "/db/table"}]}
  * Respects all permission rules from configuration and plugins

- SQL-based permission evaluation (datasette/utils/actions_sql.py)
  * Cascading rules: child-level → parent-level → global-level
  * DENY beats ALLOW at same specificity
  * Uses CTEs for efficient SQL-only filtering
  * Combines permission_resources_sql() hook results

- Default actions in datasette/default_actions.py
  * InstanceResource, DatabaseResource, TableResource, QueryResource
  * Core actions: view-instance, view-database, view-table, etc.

- Fixed default_permissions.py to handle database-level allow blocks
  * Now creates parent-level rules for view-table action
  * Fixes: datasette ... -s databases.fixtures.allow.id root

Documentation:

- Comprehensive register_actions() hook documentation
- Detailed resources_sql() method explanation
- /-/tables endpoint documentation in docs/introspection.rst
- Deprecated register_permissions() with migration guide

Tests:

- tests/test_actions_sql.py: 7 tests for core permission API
- tests/test_tables_endpoint.py: 13 tests for /-/tables endpoint
- All 118 documentation tests pass
- Tests verify SQL does filtering (not Python)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-20 16:00:36 -07:00
118 changed files with 4182 additions and 8443 deletions

View file

@@ -1,11 +1,10 @@
 name: Deploy latest.datasette.io
 on:
-  workflow_dispatch:
   push:
     branches:
     - main
-    # - 1.0-dev
+    - 1.0-dev
 permissions:
   contents: read
@@ -15,12 +14,19 @@ jobs:
     runs-on: ubuntu-latest
     steps:
     - name: Check out datasette
-      uses: actions/checkout@v5
+      uses: actions/checkout@v3
     - name: Set up Python
       uses: actions/setup-python@v6
+      # Using Python 3.10 for gcloud compatibility:
       with:
-        python-version: "3.13"
-        cache: pip
+        python-version: "3.10"
+    - uses: actions/cache@v4
+      name: Configure pip caching
+      with:
+        path: ~/.cache/pip
+        key: ${{ runner.os }}-pip-${{ hashFiles('**/setup.py') }}
+        restore-keys: |
+          ${{ runner.os }}-pip-
     - name: Install Python dependencies
       run: |
         python -m pip install --upgrade pip
@@ -95,13 +101,12 @@ jobs:
       #     jq '.plugins |= . + {"datasette-ephemeral-tables": {"table_ttl": 900}}' \
       #     > metadata.json
       #   cat metadata.json
-    - id: auth
-      name: Authenticate to Google Cloud
-      uses: google-github-actions/auth@v3
+    - name: Set up Cloud Run
+      uses: google-github-actions/setup-gcloud@v0
       with:
-        credentials_json: ${{ secrets.GCP_SA_KEY }}
-    - name: Set up Cloud SDK
-      uses: google-github-actions/setup-gcloud@v3
+        version: '318.0.0'
+        service_account_email: ${{ secrets.GCP_SA_EMAIL }}
+        service_account_key: ${{ secrets.GCP_SA_KEY }}
     - name: Deploy to Cloud Run
       env:
         LATEST_DATASETTE_SECRET: ${{ secrets.LATEST_DATASETTE_SECRET }}

View file

@@ -20,7 +20,7 @@ jobs:
       with:
         python-version: ${{ matrix.python-version }}
         cache: pip
-        cache-dependency-path: pyproject.toml
+        cache-dependency-path: setup.py
     - name: Install dependencies
       run: |
         pip install -e '.[test]'
@@ -41,7 +41,7 @@ jobs:
       with:
         python-version: '3.13'
         cache: pip
-        cache-dependency-path: pyproject.toml
+        cache-dependency-path: setup.py
     - name: Install dependencies
       run: |
         pip install setuptools wheel build
@@ -62,7 +62,7 @@ jobs:
       with:
         python-version: '3.10'
         cache: pip
-        cache-dependency-path: pyproject.toml
+        cache-dependency-path: setup.py
     - name: Install dependencies
       run: |
         python -m pip install -e .[docs]
@@ -73,13 +73,12 @@ jobs:
         DISABLE_SPHINX_INLINE_TABS=1 sphinx-build -b xml . _build
         sphinx-to-sqlite ../docs.db _build
         cd ..
-    - id: auth
-      name: Authenticate to Google Cloud
-      uses: google-github-actions/auth@v2
+    - name: Set up Cloud Run
+      uses: google-github-actions/setup-gcloud@v0
       with:
-        credentials_json: ${{ secrets.GCP_SA_KEY }}
-    - name: Set up Cloud SDK
-      uses: google-github-actions/setup-gcloud@v3
+        version: '318.0.0'
+        service_account_email: ${{ secrets.GCP_SA_EMAIL }}
+        service_account_key: ${{ secrets.GCP_SA_KEY }}
     - name: Deploy stable-docs.datasette.io to Cloud Run
       run: |-
         gcloud config set run/region us-central1

View file

@@ -15,7 +15,7 @@ jobs:
       with:
         python-version: '3.11'
         cache: 'pip'
-        cache-dependency-path: '**/pyproject.toml'
+        cache-dependency-path: '**/setup.py'
     - name: Install dependencies
       run: |
         pip install -e '.[docs]'

View file

@@ -1,76 +0,0 @@
name: Update Stable Docs
on:
release:
types: [published]
push:
branches:
- main
permissions:
contents: write
jobs:
update_stable_docs:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
fetch-depth: 0 # We need all commits to find docs/ changes
- name: Set up Git user
run: |
git config user.name "Automated"
git config user.email "actions@users.noreply.github.com"
- name: Create stable branch if it does not yet exist
run: |
if ! git ls-remote --heads origin stable | grep -qE '\bstable\b'; then
# Make sure we have all tags locally
git fetch --tags --quiet
# Latest tag that is just numbers and dots (optionally prefixed with 'v')
# e.g., 0.65.2 or v0.65.2 — excludes 1.0a20, 1.0-rc1, etc.
LATEST_RELEASE=$(
git tag -l --sort=-v:refname \
| grep -E '^v?[0-9]+(\.[0-9]+){1,3}$' \
| head -n1
)
git checkout -b stable
# If there are any stable releases, copy docs/ from the most recent
if [ -n "$LATEST_RELEASE" ]; then
rm -rf docs/
git checkout "$LATEST_RELEASE" -- docs/ || true
fi
git commit -m "Populate docs/ from $LATEST_RELEASE" || echo "No changes"
git push -u origin stable
fi
- name: Handle Release
if: github.event_name == 'release' && !github.event.release.prerelease
run: |
git fetch --all
git checkout stable
git reset --hard ${GITHUB_REF#refs/tags/}
git push origin stable --force
- name: Handle Commit to Main
if: contains(github.event.head_commit.message, '!stable-docs')
run: |
git fetch origin
git checkout -b stable origin/stable
# Get the list of modified files in docs/ from the current commit
FILES=$(git diff-tree --no-commit-id --name-only -r ${{ github.sha }} -- docs/)
# Check if the list of files is non-empty
if [[ -n "$FILES" ]]; then
# Checkout those files to the stable branch to over-write with their contents
for FILE in $FILES; do
git checkout ${{ github.sha }} -- $FILE
done
git add docs/
git commit -m "Doc changes from ${{ github.sha }}"
git push origin stable
else
echo "No changes to docs/ in this commit."
exit 0
fi

View file

@@ -21,7 +21,7 @@ jobs:
       with:
         python-version: '3.12'
         cache: 'pip'
-        cache-dependency-path: '**/pyproject.toml'
+        cache-dependency-path: '**/setup.py'
     - name: Install Python dependencies
       run: |
         python -m pip install --upgrade pip

View file

@@ -18,7 +18,7 @@ jobs:
       with:
         python-version: "3.10"
         cache: 'pip'
-        cache-dependency-path: '**/pyproject.toml'
+        cache-dependency-path: '**/setup.py'
     - name: Cache Playwright browsers
       uses: actions/cache@v4
       with:

View file

@@ -32,7 +32,7 @@ jobs:
         python-version: ${{ matrix.python-version }}
         allow-prereleases: true
         cache: pip
-        cache-dependency-path: pyproject.toml
+        cache-dependency-path: setup.py
     - name: Set up SQLite ${{ matrix.sqlite-version }}
       uses: asg017/sqlite-versions@71ea0de37ae739c33e447af91ba71dda8fcf22e6
       with:

View file

@@ -19,7 +19,7 @@ jobs:
         python-version: ${{ matrix.python-version }}
         allow-prereleases: true
         cache: pip
-        cache-dependency-path: pyproject.toml
+        cache-dependency-path: setup.py
     - name: Build extension for --load-extension test
       run: |-
         (cd tests && gcc ext.c -fPIC -shared -o ext.so)
@@ -36,8 +36,6 @@ jobs:
     - name: Install docs dependencies
       run: |
         pip install -e '.[docs]'
-    - name: Black
-      run: black --check .
     - name: Check if cog needs to be run
       run: |
         cog --check docs/*.rst

View file

@@ -5,7 +5,6 @@ on:
 permissions:
   contents: read
-  models: read
 jobs:
   build:
@@ -14,5 +13,3 @@ jobs:
     - uses: actions/checkout@v2
     - name: Setup tmate session
       uses: mxschmitt/action-tmate@v3
-      env:
-        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

.gitignore (5 changes)
View file

@@ -5,9 +5,6 @@ scratchpad
 .vscode
-uv.lock
-data.db
 # We don't use Pipfile, so ignore them
 Pipfile
 Pipfile.lock
@@ -126,4 +123,4 @@ node_modules
 # include it in source control.
 tests/*.dylib
 tests/*.so
 tests/*.dll

View file

@@ -5,52 +5,38 @@ export DATASETTE_SECRET := "not_a_secret"
 # Setup project
 @init:
-    uv sync --extra test --extra docs
+    pipenv run pip install -e '.[test,docs]'
 # Run pytest with supplied options
-@test *options: init
-    uv run pytest -n auto {{options}}
+@test *options:
+    pipenv run pytest {{options}}
 @codespell:
-    uv run codespell README.md --ignore-words docs/codespell-ignore-words.txt
-    uv run codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
-    uv run codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
-    uv run codespell tests --ignore-words docs/codespell-ignore-words.txt
+    pipenv run codespell README.md --ignore-words docs/codespell-ignore-words.txt
+    pipenv run codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
+    pipenv run codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
+    pipenv run codespell tests --ignore-words docs/codespell-ignore-words.txt
 # Run linters: black, flake8, mypy, cog
 @lint: codespell
-    uv run black . --check
-    uv run flake8
-    uv run --extra test cog --check README.md docs/*.rst
+    pipenv run black . --check
+    pipenv run flake8
+    pipenv run cog --check README.md docs/*.rst
 # Rebuild docs with cog
 @cog:
-    uv run --extra test cog -r README.md docs/*.rst
+    pipenv run cog -r README.md docs/*.rst
 # Serve live docs on localhost:8000
-@docs: cog blacken-docs
-    uv run --extra docs make -C docs livehtml
+@docs: cog
+    pipenv run blacken-docs -l 60 docs/*.rst
+    cd docs && pipenv run make livehtml
-# Build docs as static HTML
-@docs-build: cog blacken-docs
-    rm -rf docs/_build && cd docs && uv run make html
 # Apply Black
 @black:
-    uv run black .
+    pipenv run black .
-# Apply blacken-docs
-@blacken-docs:
-    uv run blacken-docs -l 60 docs/*.rst
-# Apply prettier
-@prettier:
-    npm run fix
-# Format code with both black and prettier
-@format: black prettier blacken-docs
-@serve *options:
-    uv run sqlite-utils create-database data.db
-    uv run sqlite-utils create-table data.db docs id integer title text --pk id --ignore
-    uv run python -m datasette data.db --root --reload {{options}}
+@serve:
+    pipenv run sqlite-utils create-database data.db
+    pipenv run sqlite-utils create-table data.db docs id integer title text --pk id --ignore
+    pipenv run python -m datasette data.db --root --reload

File diff suppressed because it is too large

View file

@@ -146,6 +146,7 @@ def inspect(files, inspect_file, sqlite_extensions):
     This can then be passed to "datasette --inspect-file" to speed up count
     operations against immutable database files.
     """
+    app = Datasette([], immutables=files, sqlite_extensions=sqlite_extensions)
     inspect_data = run_sync(lambda: inspect_(files, sqlite_extensions))
     if inspect_file == "-":
         sys.stdout.write(json.dumps(inspect_data, indent=2))
@@ -438,20 +439,10 @@
     help="Output URL that sets a cookie authenticating the root user",
     is_flag=True,
 )
-@click.option(
-    "--default-deny",
-    help="Deny all permissions by default",
-    is_flag=True,
-)
 @click.option(
     "--get",
     help="Run an HTTP GET request against this path, print results and exit",
 )
-@click.option(
-    "--headers",
-    is_flag=True,
-    help="Include HTTP headers in --get output",
-)
 @click.option(
     "--token",
     help="API token to send with --get requests",
@@ -519,9 +510,7 @@ def serve(
     settings,
     secret,
     root,
-    default_deny,
     get,
-    headers,
     token,
     actor,
     version_note,
@@ -600,23 +589,15 @@ def serve(
         crossdb=crossdb,
         nolock=nolock,
         internal=internal,
-        default_deny=default_deny,
     )
-    # Separate directories from files
-    directories = [f for f in files if os.path.isdir(f)]
-    file_paths = [f for f in files if not os.path.isdir(f)]
-    # Handle config_dir - only one directory allowed
-    if len(directories) > 1:
-        raise click.ClickException(
-            "Cannot pass multiple directories. Pass a single directory as config_dir."
-        )
-    elif len(directories) == 1:
-        kwargs["config_dir"] = pathlib.Path(directories[0])
+    # if files is a single directory, use that as config_dir=
+    if 1 == len(files) and os.path.isdir(files[0]):
+        kwargs["config_dir"] = pathlib.Path(files[0])
+        files = []
     # Verify list of files, create if needed (and --create)
-    for file in file_paths:
+    for file in files:
         if not pathlib.Path(file).exists():
             if create:
                 sqlite3.connect(file).execute("vacuum")
@@ -627,32 +608,8 @@ def serve(
                 )
             )
-    # Check for duplicate files by resolving all paths to their absolute forms
-    # Collect all database files that will be loaded (explicit files + config_dir files)
-    all_db_files = []
-    # Add explicit files
-    for file in file_paths:
-        all_db_files.append((file, pathlib.Path(file).resolve()))
-    # Add config_dir databases if config_dir is set
-    if "config_dir" in kwargs:
-        config_dir = kwargs["config_dir"]
-        for ext in ("db", "sqlite", "sqlite3"):
-            for db_file in config_dir.glob(f"*.{ext}"):
-                all_db_files.append((str(db_file), db_file.resolve()))
-    # Check for duplicates
-    seen = {}
-    for original_path, resolved_path in all_db_files:
-        if resolved_path in seen:
-            raise click.ClickException(
-                f"Duplicate database file: '{original_path}' and '{seen[resolved_path]}' "
-                f"both refer to {resolved_path}"
-            )
-        seen[resolved_path] = original_path
-    files = file_paths
+    # De-duplicate files so 'datasette db.db db.db' only attaches one /db
+    files = list(dict.fromkeys(files))
     try:
         ds = Datasette(files, **kwargs)
@@ -671,33 +628,19 @@ def serve(
     # Run async soundness checks - but only if we're not under pytest
     run_sync(lambda: check_databases(ds))
-    if headers and not get:
-        raise click.ClickException("--headers can only be used with --get")
     if token and not get:
         raise click.ClickException("--token can only be used with --get")
     if get:
         client = TestClient(ds)
-        request_headers = {}
+        headers = {}
         if token:
-            request_headers["Authorization"] = "Bearer {}".format(token)
+            headers["Authorization"] = "Bearer {}".format(token)
         cookies = {}
         if actor:
             cookies["ds_actor"] = client.actor_cookie(json.loads(actor))
-        response = client.get(get, headers=request_headers, cookies=cookies)
-        if headers:
-            # Output HTTP status code, headers, two newlines, then the response body
-            click.echo(f"HTTP/1.1 {response.status}")
-            for key, value in response.headers.items():
-                click.echo(f"{key}: {value}")
-            if response.text:
-                click.echo()
-            click.echo(response.text)
-        else:
-            click.echo(response.text)
+        response = client.get(get, headers=headers, cookies=cookies)
+        click.echo(response.text)
         exit_code = 0 if response.status == 200 else 1
         sys.exit(exit_code)
     return
@@ -823,7 +766,7 @@ def create_token(
     actions.extend([p[1] for p in databases])
     actions.extend([p[2] for p in resources])
     for action in actions:
-        if not ds.actions.get(action):
+        if not ds.permissions.get(action):
             click.secho(
                 f"  Unknown permission: {action} ",
                 fg="red",

View file

@@ -143,9 +143,7 @@ class Database:
             return conn.executescript(sql)
         with trace("sql", database=self.name, sql=sql.strip(), executescript=True):
-            results = await self.execute_write_fn(
-                _inner, block=block, transaction=False
-            )
+            results = await self.execute_write_fn(_inner, block=block)
         return results
     async def execute_write_many(self, sql, params_seq, block=True):
@@ -410,12 +408,7 @@ class Database:
         # But SQLite prior to 3.16.0 doesn't support pragma functions
         results = await self.execute("PRAGMA database_list;")
         # {'seq': 0, 'name': 'main', 'file': ''}
-        return [
-            AttachedDatabase(*row)
-            for row in results.rows
-            # Filter out the SQLite internal "temp" database, refs #2557
-            if row["seq"] > 0 and row["name"] != "temp"
-        ]
+        return [AttachedDatabase(*row) for row in results.rows if row["seq"] > 0]
     async def table_exists(self, table):
         results = await self.execute(

View file

@@ -1,6 +1,7 @@
 from datasette import hookimpl
 from datasette.permissions import Action
 from datasette.resources import (
+    InstanceResource,
     DatabaseResource,
     TableResource,
     QueryResource,
@@ -11,91 +12,120 @@ from datasette.resources import (
 def register_actions():
     """Register the core Datasette actions."""
     return (
-        # Global actions (no resource_class)
+        # View actions
         Action(
             name="view-instance",
             abbr="vi",
             description="View Datasette instance",
+            takes_parent=False,
+            takes_child=False,
+            resource_class=InstanceResource,
         ),
-        Action(
-            name="permissions-debug",
-            abbr="pd",
-            description="Access permission debug tool",
-        ),
-        Action(
-            name="debug-menu",
-            abbr="dm",
-            description="View debug menu items",
-        ),
-        # Database-level actions (parent-level)
         Action(
             name="view-database",
             abbr="vd",
             description="View database",
+            takes_parent=True,
+            takes_child=False,
             resource_class=DatabaseResource,
         ),
         Action(
             name="view-database-download",
             abbr="vdd",
             description="Download database file",
+            takes_parent=True,
+            takes_child=False,
             resource_class=DatabaseResource,
-            also_requires="view-database",
+        ),
+        Action(
+            name="view-table",
+            abbr="vt",
+            description="View table",
+            takes_parent=True,
+            takes_child=True,
+            resource_class=TableResource,
+        ),
+        Action(
+            name="view-query",
+            abbr="vq",
+            description="View named query results",
+            takes_parent=True,
+            takes_child=True,
+            resource_class=QueryResource,
         ),
         Action(
             name="execute-sql",
             abbr="es",
             description="Execute read-only SQL queries",
-            resource_class=DatabaseResource,
-            also_requires="view-database",
-        ),
-        Action(
-            name="create-table",
-            abbr="ct",
-            description="Create tables",
+            takes_parent=True,
+            takes_child=False,
             resource_class=DatabaseResource,
         ),
-        # Table-level actions (child-level)
+        # Debug actions
         Action(
-            name="view-table",
-            abbr="vt",
-            description="View table",
-            resource_class=TableResource,
+            name="permissions-debug",
+            abbr="pd",
+            description="Access permission debug tool",
+            takes_parent=False,
+            takes_child=False,
+            resource_class=InstanceResource,
         ),
+        Action(
+            name="debug-menu",
+            abbr="dm",
+            description="View debug menu items",
+            takes_parent=False,
+            takes_child=False,
+            resource_class=InstanceResource,
+        ),
+        # Write actions on tables
         Action(
             name="insert-row",
             abbr="ir",
             description="Insert rows",
+            takes_parent=True,
+            takes_child=True,
             resource_class=TableResource,
         ),
         Action(
             name="delete-row",
             abbr="dr",
             description="Delete rows",
+            takes_parent=True,
+            takes_child=True,
             resource_class=TableResource,
         ),
         Action(
             name="update-row",
             abbr="ur",
             description="Update rows",
+            takes_parent=True,
+            takes_child=True,
             resource_class=TableResource,
         ),
         Action(
             name="alter-table",
             abbr="at",
             description="Alter tables",
+            takes_parent=True,
+            takes_child=True,
             resource_class=TableResource,
         ),
         Action(
             name="drop-table",
             abbr="dt",
             description="Drop tables",
+            takes_parent=True,
+            takes_child=True,
             resource_class=TableResource,
         ),
-        # Query-level actions (child-level)
+        # Schema actions on databases
         Action(
-            name="view-query",
-            abbr="vq",
-            description="View named query results",
-            resource_class=QueryResource,
+            name="create-table",
+            abbr="ct",
+            description="Create tables",
+            takes_parent=True,
+            takes_child=False,
+            resource_class=DatabaseResource,
        ),
     )
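Stripped of the diff markers, the right-hand version of this hunk is just a tuple of declarative registrations. A rough sketch of the shape in plain Python, where the `Action` dataclass and `TableResource` class below are illustrative stand-ins rather than the real datasette classes:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Action:
    # Field names mirror the diff above; this is a stand-in, not the
    # real datasette.permissions.Action class.
    name: str
    abbr: Optional[str]
    description: str
    takes_parent: bool = False
    takes_child: bool = False
    resource_class: Optional[type] = None

class TableResource:
    """Stand-in for datasette.resources.TableResource."""

view_table = Action(
    name="view-table",
    abbr="vt",
    description="View table",
    takes_parent=True,
    takes_child=True,
    resource_class=TableResource,
)
print(view_table.name, view_table.takes_parent, view_table.takes_child)
```

The `takes_parent`/`takes_child` booleans declare which levels of the parent/child resource hierarchy an action can attach to, which is what the permission SQL later keys on.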

View file

@@ -4,7 +4,7 @@ from datasette import hookimpl
 @hookimpl
 def menu_links(datasette, actor):
     async def inner():
-        if not await datasette.allowed(action="debug-menu", actor=actor):
+        if not await datasette.permission_allowed(actor, "debug-menu"):
             return []
         return [

View file

@@ -0,0 +1,572 @@
from datasette import hookimpl, Permission
from datasette.permissions import PermissionSQL
from datasette.utils import actor_matches_allow
import itsdangerous
import time
@hookimpl
def register_permissions():
return (
Permission(
name="view-instance",
abbr="vi",
description="View Datasette instance",
takes_database=False,
takes_resource=False,
default=True,
),
Permission(
name="view-database",
abbr="vd",
description="View database",
takes_database=True,
takes_resource=False,
default=True,
implies_can_view=True,
),
Permission(
name="view-database-download",
abbr="vdd",
description="Download database file",
takes_database=True,
takes_resource=False,
default=True,
),
Permission(
name="view-table",
abbr="vt",
description="View table",
takes_database=True,
takes_resource=True,
default=True,
implies_can_view=True,
),
Permission(
name="view-query",
abbr="vq",
description="View named query results",
takes_database=True,
takes_resource=True,
default=True,
implies_can_view=True,
),
Permission(
name="execute-sql",
abbr="es",
description="Execute read-only SQL queries",
takes_database=True,
takes_resource=False,
default=True,
implies_can_view=True,
),
Permission(
name="permissions-debug",
abbr="pd",
description="Access permission debug tool",
takes_database=False,
takes_resource=False,
default=False,
),
Permission(
name="debug-menu",
abbr="dm",
description="View debug menu items",
takes_database=False,
takes_resource=False,
default=False,
),
Permission(
name="insert-row",
abbr="ir",
description="Insert rows",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="delete-row",
abbr="dr",
description="Delete rows",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="update-row",
abbr="ur",
description="Update rows",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="create-table",
abbr="ct",
description="Create tables",
takes_database=True,
takes_resource=False,
default=False,
),
Permission(
name="alter-table",
abbr="at",
description="Alter tables",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="drop-table",
abbr="dt",
description="Drop tables",
takes_database=True,
takes_resource=True,
default=False,
),
)
@hookimpl(tryfirst=True, specname="permission_allowed")
async def permission_allowed_sql_bridge(datasette, actor, action, resource):
"""
Bridge config-based permission rules to the old permission_allowed API.
This allows views using the old string/tuple resource API to benefit from
config blocks defined in datasette.yaml without using the new resource-based system.
Note: This does NOT apply default allow rules - those should come from the
Permission object's default value to maintain backward compatibility.
"""
# Only check config-based rules - don't apply defaults
config_rules = await _config_permission_rules(datasette, actor, action)
if not config_rules:
return None
# Evaluate config rules for this specific resource
for rule in config_rules:
if rule.params: # Has config-based rules
from datasette.utils.permissions import resolve_permissions_with_candidates
# Build candidate based on resource
if resource is None:
candidates = [(None, None)]
elif isinstance(resource, str):
candidates = [(resource, None)]
elif isinstance(resource, tuple):
candidates = [(resource[0], resource[1])]
else:
return None
db = datasette.get_internal_database()
results = await resolve_permissions_with_candidates(
db, actor, [rule], candidates, action, implicit_deny=False
)
if results:
# Use the first result's allow value
for result in results:
if result.get("allow") is not None:
return bool(result["allow"])
return None
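The candidate-building branch in the bridge hook can be restated on its own. This helper is a hypothetical extraction for illustration, not a function that exists in Datasette:

```python
def build_candidates(resource):
    # Mirrors the branching in the bridge hook: old-style resources arrive
    # as None (instance level), a database name string, or a
    # (database, table) tuple.
    if resource is None:
        return [(None, None)]
    if isinstance(resource, str):
        return [(resource, None)]
    if isinstance(resource, tuple):
        return [(resource[0], resource[1])]
    return None  # unknown shape - the hook returns None (no opinion)

print(build_candidates("mydb"))
print(build_candidates(("mydb", "users")))
```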
@hookimpl(tryfirst=True, specname="permission_allowed")
def permission_allowed_default_allow_sql(datasette, actor, action, resource):
"""
Enforce the default_allow_sql setting for execute-sql permission.
When default_allow_sql is set to False, deny all execute-sql permissions.
This runs before other permission checks to ensure the setting is respected.
"""
if action == "execute-sql":
if not datasette.setting("default_allow_sql"):
return False
return None
@hookimpl(tryfirst=True, specname="permission_allowed")
def permission_allowed_root(datasette, actor, action, resource):
"""
Grant all permissions to root user when Datasette started with --root flag.
The --root flag is a localhost development tool. When used, it sets
datasette.root_enabled = True and creates an actor with id="root".
This hook grants that actor all permissions.
Other plugins can use the same pattern: check datasette.root_enabled
to decide whether to honor root users.
"""
if datasette.root_enabled and actor and actor.get("id") == "root":
return True
return None
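The root check reduces to a three-valued predicate. A minimal restatement (a sketch, not the hook itself):

```python
def root_allows(root_enabled, actor):
    # Only honor id="root" when the instance was started with --root,
    # i.e. root_enabled is True; otherwise express no opinion so other
    # permission hooks decide.
    if root_enabled and actor and actor.get("id") == "root":
        return True
    return None

print(root_allows(True, {"id": "root"}))
```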
@hookimpl
async def permission_resources_sql(datasette, actor, action):
rules: list[PermissionSQL] = []
# Root user with root_enabled gets all permissions at global level
# Config rules at more specific levels (database/table) can still override
if datasette.root_enabled and actor and actor.get("id") == "root":
# Add a single global-level allow rule (NULL, NULL) for root
# This allows root to access everything by default, but database-level
# and table-level deny rules in config can still block specific resources
sql = "SELECT NULL AS parent, NULL AS child, 1 AS allow, 'root user' AS reason"
rules.append(
PermissionSQL(
source="root_permissions",
sql=sql,
params={},
)
)
config_rules = await _config_permission_rules(datasette, actor, action)
rules.extend(config_rules)
# Check default_allow_sql setting for execute-sql action
if action == "execute-sql" and not datasette.setting("default_allow_sql"):
# Return a deny rule for all databases
sql = "SELECT NULL AS parent, NULL AS child, 0 AS allow, 'default_allow_sql is false' AS reason"
rules.append(
PermissionSQL(
source="default_allow_sql_setting",
sql=sql,
params={},
)
)
# Early return - don't add default allow rule
if not rules:
return None
if len(rules) == 1:
return rules[0]
return rules
default_allow_actions = {
"view-instance",
"view-database",
"view-table",
"execute-sql",
}
if action in default_allow_actions:
reason = f"default allow for {action}".replace("'", "''")
sql = (
"SELECT NULL AS parent, NULL AS child, 1 AS allow, " f"'{reason}' AS reason"
)
rules.append(
PermissionSQL(
source="default_permissions",
sql=sql,
params={},
)
)
if not rules:
return None
if len(rules) == 1:
return rules[0]
return rules
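The cascading behavior described in the commit message (a database-level deny overrides root's global-level allow, and deny beats allow at equal depth) can be illustrated with a toy model. This is an assumption-laden sketch run against in-memory SQLite, not the actual SQL Datasette generates:

```python
import sqlite3

# Each rule is (parent, child, allow, reason); NULL acts as a wildcard.
rules = [
    (None, None, 1, "root user"),               # global allow from --root
    ("private", None, 0, "config deny on db"),  # database-level deny
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rules (parent TEXT, child TEXT, allow INT, reason TEXT)")
conn.executemany("INSERT INTO rules VALUES (?, ?, ?, ?)", rules)

def allowed(parent, child):
    # Depth = number of non-NULL columns; most specific rule wins, and
    # deny (allow=0) sorts before allow at the same depth.
    row = conn.execute(
        """
        SELECT allow, reason FROM rules
        WHERE (parent IS NULL OR parent = ?) AND (child IS NULL OR child = ?)
        ORDER BY (parent IS NOT NULL) + (child IS NOT NULL) DESC, allow ASC
        LIMIT 1
        """,
        (parent, child),
    ).fetchone()
    return bool(row[0]) if row else False

print(allowed("public", None))   # only the global allow matches
print(allowed("private", None))  # database deny overrides root allow
```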
async def _config_permission_rules(datasette, actor, action) -> list[PermissionSQL]:
config = datasette.config or {}
if actor is None:
actor_dict: dict | None = None
elif isinstance(actor, dict):
actor_dict = actor
else:
actor_lookup = await datasette.actors_from_ids([actor])
actor_dict = actor_lookup.get(actor) or {"id": actor}
def evaluate(allow_block):
if allow_block is None:
return None
return actor_matches_allow(actor_dict, allow_block)
rows = []
def add_row(parent, child, result, scope):
if result is None:
return
rows.append(
(
parent,
child,
bool(result),
f"config {'allow' if result else 'deny'} {scope}",
)
)
def add_row_allow_block(parent, child, allow_block, scope):
"""For 'allow' blocks, always add a row if the block exists - deny if no match"""
if allow_block is None:
return
result = evaluate(allow_block)
# If result is None (no match) or False, treat as deny
rows.append(
(
parent,
child,
bool(result), # None becomes False, False stays False, True stays True
f"config {'allow' if result else 'deny'} {scope}",
)
)
root_perm = (config.get("permissions") or {}).get(action)
add_row(None, None, evaluate(root_perm), f"permissions for {action}")
for db_name, db_config in (config.get("databases") or {}).items():
db_perm = (db_config.get("permissions") or {}).get(action)
add_row(
db_name, None, evaluate(db_perm), f"permissions for {action} on {db_name}"
)
for table_name, table_config in (db_config.get("tables") or {}).items():
table_perm = (table_config.get("permissions") or {}).get(action)
add_row(
db_name,
table_name,
evaluate(table_perm),
f"permissions for {action} on {db_name}/{table_name}",
)
if action == "view-table":
table_allow = (table_config or {}).get("allow")
add_row_allow_block(
db_name,
table_name,
table_allow,
f"allow for {action} on {db_name}/{table_name}",
)
for query_name, query_config in (db_config.get("queries") or {}).items():
# query_config can be a string (just SQL) or a dict (with SQL and options)
if isinstance(query_config, dict):
query_perm = (query_config.get("permissions") or {}).get(action)
add_row(
db_name,
query_name,
evaluate(query_perm),
f"permissions for {action} on {db_name}/{query_name}",
)
if action == "view-query":
query_allow = query_config.get("allow")
add_row_allow_block(
db_name,
query_name,
query_allow,
f"allow for {action} on {db_name}/{query_name}",
)
if action == "view-database":
db_allow = db_config.get("allow")
add_row_allow_block(
db_name, None, db_allow, f"allow for {action} on {db_name}"
)
if action == "execute-sql":
db_allow_sql = db_config.get("allow_sql")
add_row_allow_block(db_name, None, db_allow_sql, f"allow_sql for {db_name}")
if action == "view-table":
# Database-level allow block affects all tables in that database
db_allow = db_config.get("allow")
add_row_allow_block(
db_name, None, db_allow, f"allow for {action} on {db_name}"
)
if action == "view-query":
# Database-level allow block affects all queries in that database
db_allow = db_config.get("allow")
add_row_allow_block(
db_name, None, db_allow, f"allow for {action} on {db_name}"
)
# Root-level allow block applies to all view-* actions
if action == "view-instance":
allow_block = config.get("allow")
add_row_allow_block(None, None, allow_block, "allow for view-instance")
if action == "view-database":
# Root-level allow block also applies to view-database
allow_block = config.get("allow")
add_row_allow_block(None, None, allow_block, "allow for view-database")
if action == "view-table":
# Root-level allow block also applies to view-table
allow_block = config.get("allow")
add_row_allow_block(None, None, allow_block, "allow for view-table")
if action == "view-query":
# Root-level allow block also applies to view-query
allow_block = config.get("allow")
add_row_allow_block(None, None, allow_block, "allow for view-query")
if action == "execute-sql":
allow_sql = config.get("allow_sql")
add_row_allow_block(None, None, allow_sql, "allow_sql")
if not rows:
return []
parts = []
params = {}
for idx, (parent, child, allow, reason) in enumerate(rows):
key = f"cfg_{idx}"
parts.append(
f"SELECT :{key}_parent AS parent, :{key}_child AS child, :{key}_allow AS allow, :{key}_reason AS reason"
)
params[f"{key}_parent"] = parent
params[f"{key}_child"] = child
params[f"{key}_allow"] = 1 if allow else 0
params[f"{key}_reason"] = reason
sql = "\nUNION ALL\n".join(parts)
return [PermissionSQL(source="config_permissions", sql=sql, params=params)]
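The parameter-building loop at the end can be exercised on its own. This sketch reuses the same `cfg_{idx}` naming scheme and runs the generated SQL against an in-memory SQLite connection; the sample rows are hypothetical:

```python
import sqlite3

rows = [
    (None, None, True, "config allow permissions for view-table"),
    ("mydb", "users", False, "config deny permissions for view-table on mydb/users"),
]
parts, params = [], {}
for idx, (parent, child, allow, reason) in enumerate(rows):
    key = f"cfg_{idx}"
    # One SELECT per rule, with uniquely prefixed named parameters so the
    # statements can be safely concatenated with UNION ALL.
    parts.append(
        f"SELECT :{key}_parent AS parent, :{key}_child AS child, "
        f":{key}_allow AS allow, :{key}_reason AS reason"
    )
    params[f"{key}_parent"] = parent
    params[f"{key}_child"] = child
    params[f"{key}_allow"] = 1 if allow else 0
    params[f"{key}_reason"] = reason
sql = "\nUNION ALL\n".join(parts)

results = sqlite3.connect(":memory:").execute(sql, params).fetchall()
print(results)
```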
def restrictions_allow_action(
datasette: "Datasette",
restrictions: dict,
action: str,
resource: str | tuple[str, str],
):
"Do these restrictions allow the requested action against the requested resource?"
if action == "view-instance":
# Special case for view-instance: it's allowed if the restrictions include any
# permissions that have the implies_can_view=True flag set
all_rules = restrictions.get("a") or []
for database_rules in (restrictions.get("d") or {}).values():
all_rules += database_rules
for database_resource_rules in (restrictions.get("r") or {}).values():
for resource_rules in database_resource_rules.values():
all_rules += resource_rules
permissions = [datasette.get_permission(action) for action in all_rules]
if any(p for p in permissions if p.implies_can_view):
return True
if action == "view-database":
# Special case for view-database: it's allowed if the restrictions include any
# permissions that have the implies_can_view=True flag set AND takes_database
all_rules = restrictions.get("a") or []
database_rules = list((restrictions.get("d") or {}).get(resource) or [])
all_rules += database_rules
resource_rules = ((restrictions.get("r") or {}).get(resource) or {}).values()
for table_rules in resource_rules:
all_rules += table_rules
permissions = [datasette.get_permission(action) for action in all_rules]
if any(p for p in permissions if p.implies_can_view and p.takes_database):
return True
# Does this action have an abbreviation?
to_check = {action}
permission = datasette.permissions.get(action)
if permission and permission.abbr:
to_check.add(permission.abbr)
# If restrictions are defined then we use them to further restrict the actor
# Crucially, we only use this to say NO (return False) - we never
# use it to return YES (True) because that might override other
# restrictions placed on this actor
all_allowed = restrictions.get("a")
if all_allowed is not None:
assert isinstance(all_allowed, list)
if to_check.intersection(all_allowed):
return True
# How about for the current database?
if resource:
if isinstance(resource, str):
database_name = resource
else:
database_name = resource[0]
database_allowed = restrictions.get("d", {}).get(database_name)
if database_allowed is not None:
assert isinstance(database_allowed, list)
if to_check.intersection(database_allowed):
return True
# Or the current table? That's any time the resource is (database, table)
if resource is not None and not isinstance(resource, str) and len(resource) == 2:
database, table = resource
table_allowed = restrictions.get("r", {}).get(database, {}).get(table)
# TODO: What should this do for canned queries?
if table_allowed is not None:
assert isinstance(table_allowed, list)
if to_check.intersection(table_allowed):
return True
# This action is not specifically allowed, so reject it
return False
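For reference, here is the `_r` restrictions shape implied by the keys used above (`"a"` global, `"d"` per-database, `"r"` per-table), plus a simplified restatement of the allowlist cascade that skips the abbreviation and `implies_can_view` handling. The data and helper are illustrative:

```python
restrictions = {
    "a": ["vi"],                       # view-instance anywhere
    "d": {"mydb": ["vd"]},             # view-database on mydb only
    "r": {"mydb": {"users": ["vt"]}},  # view-table on mydb/users only
}

def simple_allows(restrictions, to_check, database=None, table=None):
    # Any allowlist hit says yes; no hit says no. Restrictions can only
    # narrow access, never widen it.
    if to_check.intersection(restrictions.get("a") or []):
        return True
    if database and to_check.intersection(
        (restrictions.get("d") or {}).get(database) or []
    ):
        return True
    if database and table:
        table_allowed = (
            ((restrictions.get("r") or {}).get(database) or {}).get(table) or []
        )
        if to_check.intersection(table_allowed):
            return True
    return False

print(simple_allows(restrictions, {"view-table", "vt"}, "mydb", "users"))
print(simple_allows(restrictions, {"view-table", "vt"}, "mydb", "logs"))
```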
@hookimpl(specname="permission_allowed")
def permission_allowed_actor_restrictions(datasette, actor, action, resource):
if actor is None:
return None
if "_r" not in actor:
# No restrictions, so we have no opinion
return None
_r = actor.get("_r")
if restrictions_allow_action(datasette, _r, action, resource):
# Return None because we do not have an opinion here
return None
else:
# Block this permission check
return False
@hookimpl
def actor_from_request(datasette, request):
prefix = "dstok_"
if not datasette.setting("allow_signed_tokens"):
return None
max_signed_tokens_ttl = datasette.setting("max_signed_tokens_ttl")
authorization = request.headers.get("authorization")
if not authorization:
return None
if not authorization.startswith("Bearer "):
return None
token = authorization[len("Bearer ") :]
if not token.startswith(prefix):
return None
token = token[len(prefix) :]
try:
decoded = datasette.unsign(token, namespace="token")
except itsdangerous.BadSignature:
return None
if "t" not in decoded:
# Missing timestamp
return None
created = decoded["t"]
if not isinstance(created, int):
# Invalid timestamp
return None
duration = decoded.get("d")
if duration is not None and not isinstance(duration, int):
# Invalid duration
return None
if (duration is None and max_signed_tokens_ttl) or (
duration is not None
and max_signed_tokens_ttl
and duration > max_signed_tokens_ttl
):
duration = max_signed_tokens_ttl
if duration:
if time.time() - created > duration:
# Expired
return None
actor = {"id": decoded["a"], "token": "dstok"}
if "_r" in decoded:
actor["_r"] = decoded["_r"]
if duration:
actor["token_expires"] = created + duration
return actor
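The TTL clamping in the middle of this hook is easy to misread. A restatement as a standalone function (a sketch of the same condition, not Datasette API):

```python
def effective_duration(duration, max_ttl):
    # A token with no stated duration, or one longer than
    # max_signed_tokens_ttl, is capped to the configured maximum.
    if (duration is None and max_ttl) or (
        duration is not None and max_ttl and duration > max_ttl
    ):
        return max_ttl
    return duration

print(effective_duration(None, 3600), effective_duration(9999, 3600))
```

When no maximum is configured (`max_ttl` falsy), the token's own duration, including "no expiry", passes through unchanged.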
@hookimpl
def skip_csrf(scope):
# Skip CSRF check for requests with content-type: application/json
if scope["type"] == "http":
headers = scope.get("headers") or {}
if dict(headers).get(b"content-type") == b"application/json":
return True

View file

@@ -1,59 +0,0 @@
"""
Default permission implementations for Datasette.
This module provides the built-in permission checking logic through implementations
of the permission_resources_sql hook. The hooks are organized by their purpose:
1. Actor Restrictions - Enforces _r allowlists embedded in actor tokens
2. Root User - Grants full access when --root flag is used
3. Config Rules - Applies permissions from datasette.yaml
4. Default Settings - Enforces default_allow_sql and default view permissions
IMPORTANT: These hooks return PermissionSQL objects that are combined using SQL
UNION/INTERSECT operations. The order of evaluation is:
- restriction_sql fields are INTERSECTed (all must match)
- Regular sql fields are UNIONed and evaluated with cascading priority
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Optional
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
# Re-export all hooks and public utilities
from .restrictions import (
actor_restrictions_sql,
restrictions_allow_action,
ActorRestrictions,
)
from .root import root_user_permissions_sql
from .config import config_permissions_sql
from .defaults import (
default_allow_sql_check,
default_action_permissions_sql,
DEFAULT_ALLOW_ACTIONS,
)
from .tokens import actor_from_signed_api_token
@hookimpl
def skip_csrf(scope) -> Optional[bool]:
"""Skip CSRF check for JSON content-type requests."""
if scope["type"] == "http":
headers = scope.get("headers") or {}
if dict(headers).get(b"content-type") == b"application/json":
return True
return None
@hookimpl
def canned_queries(datasette: "Datasette", database: str, actor) -> dict:
"""Return canned queries defined in datasette.yaml configuration."""
queries = (
((datasette.config or {}).get("databases") or {}).get(database) or {}
).get("queries") or {}
return queries

View file

@@ -1,442 +0,0 @@
"""
Config-based permission handling for Datasette.
Applies permission rules from datasette.yaml configuration.
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Any, List, Optional, Set, Tuple
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
from datasette.permissions import PermissionSQL
from datasette.utils import actor_matches_allow
from .helpers import PermissionRowCollector, get_action_name_variants
class ConfigPermissionProcessor:
"""
Processes permission rules from datasette.yaml configuration.
Configuration structure:
permissions: # Root-level permissions block
view-instance:
id: admin
databases:
mydb:
permissions: # Database-level permissions
view-database:
id: admin
allow: # Database-level allow block (for view-*)
id: viewer
allow_sql: # execute-sql allow block
id: analyst
tables:
users:
permissions: # Table-level permissions
view-table:
id: admin
allow: # Table-level allow block
id: viewer
queries:
my_query:
permissions: # Query-level permissions
view-query:
id: admin
allow: # Query-level allow block
id: viewer
"""
def __init__(
self,
datasette: "Datasette",
actor: Optional[dict],
action: str,
):
self.datasette = datasette
self.actor = actor
self.action = action
self.config = datasette.config or {}
self.collector = PermissionRowCollector(prefix="cfg")
# Pre-compute action variants
self.action_checks = get_action_name_variants(datasette, action)
self.action_obj = datasette.actions.get(action)
# Parse restrictions if present
self.has_restrictions = actor and "_r" in actor if actor else False
self.restrictions = actor.get("_r", {}) if actor else {}
# Pre-compute restriction info for efficiency
self.restricted_databases: Set[str] = set()
self.restricted_tables: Set[Tuple[str, str]] = set()
if self.has_restrictions:
self.restricted_databases = {
db_name
for db_name, db_actions in (self.restrictions.get("d") or {}).items()
if self.action_checks.intersection(db_actions)
}
self.restricted_tables = {
(db_name, table_name)
for db_name, tables in (self.restrictions.get("r") or {}).items()
for table_name, table_actions in tables.items()
if self.action_checks.intersection(table_actions)
}
# Tables implicitly reference their parent databases
self.restricted_databases.update(db for db, _ in self.restricted_tables)
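The two set comprehensions above can be checked with a worked example. The restrictions data here is hypothetical; the comprehensions are the same ones from `__init__`:

```python
restrictions = {
    "d": {"db1": ["vt"]},
    "r": {"db2": {"users": ["vt"]}},
}
action_checks = {"view-table", "vt"}

# Databases whose database-level allowlist names this action
restricted_databases = {
    db
    for db, acts in (restrictions.get("d") or {}).items()
    if action_checks.intersection(acts)
}
# (database, table) pairs whose table-level allowlist names this action
restricted_tables = {
    (db, t)
    for db, tables in (restrictions.get("r") or {}).items()
    for t, acts in tables.items()
    if action_checks.intersection(acts)
}
# Tables implicitly reference their parent databases
restricted_databases.update(db for db, _ in restricted_tables)
print(sorted(restricted_databases), sorted(restricted_tables))
```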
def evaluate_allow_block(self, allow_block: Any) -> Optional[bool]:
"""Evaluate an allow block against the current actor."""
if allow_block is None:
return None
return actor_matches_allow(self.actor, allow_block)
def is_in_restriction_allowlist(
self,
parent: Optional[str],
child: Optional[str],
) -> bool:
"""Check if resource is allowed by actor restrictions."""
if not self.has_restrictions:
return True # No restrictions, all resources allowed
# Check global allowlist
if self.action_checks.intersection(self.restrictions.get("a", [])):
return True
# Check database-level allowlist
if parent and self.action_checks.intersection(
self.restrictions.get("d", {}).get(parent, [])
):
return True
# Check table-level allowlist
if parent:
table_restrictions = (self.restrictions.get("r", {}) or {}).get(parent, {})
if child:
table_actions = table_restrictions.get(child, [])
if self.action_checks.intersection(table_actions):
return True
else:
# Parent query should proceed if any child in this database is allowlisted
for table_actions in table_restrictions.values():
if self.action_checks.intersection(table_actions):
return True
# Parent/child both None: include if any restrictions exist for this action
if parent is None and child is None:
if self.action_checks.intersection(self.restrictions.get("a", [])):
return True
if self.restricted_databases:
return True
if self.restricted_tables:
return True
return False
def add_permissions_rule(
self,
parent: Optional[str],
child: Optional[str],
permissions_block: Optional[dict],
scope_desc: str,
) -> None:
"""Add a rule from a permissions:{action} block."""
if permissions_block is None:
return
action_allow_block = permissions_block.get(self.action)
result = self.evaluate_allow_block(action_allow_block)
self.collector.add(
parent=parent,
child=child,
allow=result,
reason=f"config {'allow' if result else 'deny'} {scope_desc}",
if_not_none=True,
)
def add_allow_block_rule(
self,
parent: Optional[str],
child: Optional[str],
allow_block: Any,
scope_desc: str,
) -> None:
"""
Add rules from an allow:{} block.
For allow blocks, if the block exists but doesn't match the actor,
this is treated as a deny. We also handle the restriction-gate logic.
"""
if allow_block is None:
return
# Skip if resource is not in restriction allowlist
if not self.is_in_restriction_allowlist(parent, child):
return
result = self.evaluate_allow_block(allow_block)
bool_result = bool(result)
self.collector.add(
parent,
child,
bool_result,
f"config {'allow' if result else 'deny'} {scope_desc}",
)
# Handle restriction-gate: add explicit denies for restricted resources
self._add_restriction_gate_denies(parent, child, bool_result, scope_desc)
def _add_restriction_gate_denies(
self,
parent: Optional[str],
child: Optional[str],
is_allowed: bool,
scope_desc: str,
) -> None:
"""
When a config rule denies at a higher level, add explicit denies
for restricted resources to prevent child-level allows from
incorrectly granting access.
"""
if is_allowed or child is not None or not self.has_restrictions:
return
if not self.action_obj:
return
reason = f"config deny {scope_desc} (restriction gate)"
if parent is None:
# Root-level deny: add denies for all restricted resources
if self.action_obj.takes_parent:
for db_name in self.restricted_databases:
self.collector.add(db_name, None, False, reason)
if self.action_obj.takes_child:
for db_name, table_name in self.restricted_tables:
self.collector.add(db_name, table_name, False, reason)
else:
# Database-level deny: add denies for tables in that database
if self.action_obj.takes_child:
for db_name, table_name in self.restricted_tables:
if db_name == parent:
self.collector.add(db_name, table_name, False, reason)
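A toy walk-through of the gate's root-level branch: a deny at `parent=None` combined with actor restrictions naming specific tables yields one explicit per-table deny row each, so a later child-level allow cannot outrank the deny. The data here is hypothetical:

```python
restricted_tables = {("mydb", "logs"), ("mydb", "users")}
parent, is_allowed = None, False
rows = []
if not is_allowed and parent is None:
    # Root-level deny: emit an explicit deny for every restricted table
    for db_name, table_name in sorted(restricted_tables):
        rows.append((db_name, table_name, False, "config deny (restriction gate)"))
print(rows)
```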
def process(self) -> Optional[PermissionSQL]:
"""Process all config rules and return combined PermissionSQL."""
self._process_root_permissions()
self._process_databases()
self._process_root_allow_blocks()
return self.collector.to_permission_sql()
def _process_root_permissions(self) -> None:
"""Process root-level permissions block."""
root_perms = self.config.get("permissions") or {}
self.add_permissions_rule(
None,
None,
root_perms,
f"permissions for {self.action}",
)
def _process_databases(self) -> None:
"""Process database-level and nested configurations."""
databases = self.config.get("databases") or {}
for db_name, db_config in databases.items():
self._process_database(db_name, db_config or {})
def _process_database(self, db_name: str, db_config: dict) -> None:
"""Process a single database's configuration."""
# Database-level permissions block
db_perms = db_config.get("permissions") or {}
self.add_permissions_rule(
db_name,
None,
db_perms,
f"permissions for {self.action} on {db_name}",
)
# Process tables
for table_name, table_config in (db_config.get("tables") or {}).items():
self._process_table(db_name, table_name, table_config or {})
# Process queries
for query_name, query_config in (db_config.get("queries") or {}).items():
self._process_query(db_name, query_name, query_config)
# Database-level allow blocks
self._process_database_allow_blocks(db_name, db_config)
def _process_table(
self,
db_name: str,
table_name: str,
table_config: dict,
) -> None:
"""Process a single table's configuration."""
# Table-level permissions block
table_perms = table_config.get("permissions") or {}
self.add_permissions_rule(
db_name,
table_name,
table_perms,
f"permissions for {self.action} on {db_name}/{table_name}",
)
# Table-level allow block (for view-table)
if self.action == "view-table":
self.add_allow_block_rule(
db_name,
table_name,
table_config.get("allow"),
f"allow for {self.action} on {db_name}/{table_name}",
)
def _process_query(
self,
db_name: str,
query_name: str,
query_config: Any,
) -> None:
"""Process a single query's configuration."""
# Query config can be a string (just SQL) or dict
if not isinstance(query_config, dict):
return
# Query-level permissions block
query_perms = query_config.get("permissions") or {}
self.add_permissions_rule(
db_name,
query_name,
query_perms,
f"permissions for {self.action} on {db_name}/{query_name}",
)
# Query-level allow block (for view-query)
if self.action == "view-query":
self.add_allow_block_rule(
db_name,
query_name,
query_config.get("allow"),
f"allow for {self.action} on {db_name}/{query_name}",
)
def _process_database_allow_blocks(
self,
db_name: str,
db_config: dict,
) -> None:
"""Process database-level allow/allow_sql blocks."""
# view-database allow block
if self.action == "view-database":
self.add_allow_block_rule(
db_name,
None,
db_config.get("allow"),
f"allow for {self.action} on {db_name}",
)
# execute-sql allow_sql block
if self.action == "execute-sql":
self.add_allow_block_rule(
db_name,
None,
db_config.get("allow_sql"),
f"allow_sql for {db_name}",
)
# view-table uses database-level allow for inheritance
if self.action == "view-table":
self.add_allow_block_rule(
db_name,
None,
db_config.get("allow"),
f"allow for {self.action} on {db_name}",
)
# view-query uses database-level allow for inheritance
if self.action == "view-query":
self.add_allow_block_rule(
db_name,
None,
db_config.get("allow"),
f"allow for {self.action} on {db_name}",
)
def _process_root_allow_blocks(self) -> None:
"""Process root-level allow/allow_sql blocks."""
root_allow = self.config.get("allow")
if self.action == "view-instance":
self.add_allow_block_rule(
None,
None,
root_allow,
"allow for view-instance",
)
if self.action == "view-database":
self.add_allow_block_rule(
None,
None,
root_allow,
"allow for view-database",
)
if self.action == "view-table":
self.add_allow_block_rule(
None,
None,
root_allow,
"allow for view-table",
)
if self.action == "view-query":
self.add_allow_block_rule(
None,
None,
root_allow,
"allow for view-query",
)
if self.action == "execute-sql":
self.add_allow_block_rule(
None,
None,
self.config.get("allow_sql"),
"allow_sql",
)
@hookimpl(specname="permission_resources_sql")
async def config_permissions_sql(
datasette: "Datasette",
actor: Optional[dict],
action: str,
) -> Optional[List[PermissionSQL]]:
"""
Apply permission rules from datasette.yaml configuration.
This processes:
- permissions: blocks at root, database, table, and query levels
- allow: blocks for view-* actions
- allow_sql: blocks for execute-sql action
"""
processor = ConfigPermissionProcessor(datasette, actor, action)
result = processor.process()
if result is None:
return []
return [result]
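The allow blocks processed above are matched against the actor dictionary. As a rough, self-contained sketch of the documented allow-block semantics (this is not Datasette's actual `actor_matches_allow` implementation — key names and edge-case handling here are simplified assumptions):

```python
def actor_matches_allow(actor, allow):
    """Simplified sketch of allow-block matching: keys are OR'd together."""
    # No allow block at all: this block imposes no restriction
    if allow is None:
        return True
    # Boolean allow blocks: allow: true / allow: false
    if isinstance(allow, bool):
        return allow
    if allow == {}:
        return False  # an empty block denies everyone
    actor = actor or {}
    for key, values in allow.items():
        if key == "unauthenticated":
            # Special key matching anonymous (no actor) requests
            if not actor and values:
                return True
            continue
        if not isinstance(values, list):
            values = [values]
        actor_value = actor.get(key)
        if actor_value is None:
            continue
        if actor_value in values or "*" in values:
            return True
    return False
```

Under this sketch, `{"id": ["root"]}` matches only the root actor, while `{"id": "*"}` matches any actor that has an `id` key at all.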

View file

@ -1,70 +0,0 @@
"""
Default permission settings for Datasette.
Provides default allow rules for standard view/execute actions.
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Optional
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
from datasette.permissions import PermissionSQL
# Actions that are allowed by default (unless --default-deny is used)
DEFAULT_ALLOW_ACTIONS = frozenset(
{
"view-instance",
"view-database",
"view-database-download",
"view-table",
"view-query",
"execute-sql",
}
)
@hookimpl(specname="permission_resources_sql")
async def default_allow_sql_check(
datasette: "Datasette",
actor: Optional[dict],
action: str,
) -> Optional[PermissionSQL]:
"""
Enforce the default_allow_sql setting.
When default_allow_sql is false (the default), execute-sql is denied
unless explicitly allowed by config or other rules.
"""
if action == "execute-sql":
if not datasette.setting("default_allow_sql"):
return PermissionSQL.deny(reason="default_allow_sql is false")
return None
@hookimpl(specname="permission_resources_sql")
async def default_action_permissions_sql(
datasette: "Datasette",
actor: Optional[dict],
action: str,
) -> Optional[PermissionSQL]:
"""
Provide default allow rules for standard view/execute actions.
These defaults are skipped when datasette is started with --default-deny.
The restriction_sql mechanism (from actor_restrictions_sql) will still
filter these results if the actor has restrictions.
"""
if datasette.default_deny:
return None
if action in DEFAULT_ALLOW_ACTIONS:
reason = f"default allow for {action}".replace("'", "''")
return PermissionSQL.allow(reason=reason)
return None

View file

@ -1,85 +0,0 @@
"""
Shared helper utilities for default permission implementations.
"""
from __future__ import annotations
from dataclasses import dataclass
from typing import TYPE_CHECKING, List, Optional, Set
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette.permissions import PermissionSQL
def get_action_name_variants(datasette: "Datasette", action: str) -> Set[str]:
"""
Get all name variants for an action (full name and abbreviation).
Example:
get_action_name_variants(ds, "view-table") -> {"view-table", "vt"}
"""
variants = {action}
action_obj = datasette.actions.get(action)
if action_obj and action_obj.abbr:
variants.add(action_obj.abbr)
return variants
def action_in_list(datasette: "Datasette", action: str, action_list: list) -> bool:
"""Check if an action (or its abbreviation) is in a list."""
return bool(get_action_name_variants(datasette, action).intersection(action_list))
@dataclass
class PermissionRow:
"""A single permission rule row."""
parent: Optional[str]
child: Optional[str]
allow: bool
reason: str
class PermissionRowCollector:
"""Collects permission rows and converts them to PermissionSQL."""
def __init__(self, prefix: str = "row"):
self.rows: List[PermissionRow] = []
self.prefix = prefix
def add(
self,
parent: Optional[str],
child: Optional[str],
allow: Optional[bool],
reason: str,
if_not_none: bool = False,
) -> None:
"""Add a permission row. If if_not_none=True, only add if allow is not None."""
if if_not_none and allow is None:
return
self.rows.append(PermissionRow(parent, child, allow, reason))
def to_permission_sql(self) -> Optional[PermissionSQL]:
"""Convert collected rows to a PermissionSQL object."""
if not self.rows:
return None
parts = []
params = {}
for idx, row in enumerate(self.rows):
key = f"{self.prefix}_{idx}"
parts.append(
f"SELECT :{key}_parent AS parent, :{key}_child AS child, "
f":{key}_allow AS allow, :{key}_reason AS reason"
)
params[f"{key}_parent"] = row.parent
params[f"{key}_child"] = row.child
params[f"{key}_allow"] = 1 if row.allow else 0
params[f"{key}_reason"] = row.reason
sql = "\nUNION ALL\n".join(parts)
return PermissionSQL(sql=sql, params=params)
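The collector pattern above — one `SELECT` of bound literals per rule, joined with `UNION ALL` — can be exercised standalone against SQLite. A minimal sketch (sample rule rows are illustrative, not from the source):

```python
import sqlite3

# Each rule row becomes a SELECT of named-parameter literals; the rows are
# combined with UNION ALL and the values are bound, never inlined into SQL.
rows = [
    ("mydb", None, True, "database-level allow"),
    ("mydb", "users", False, "table-level deny"),
]
parts = []
params = {}
for idx, (parent, child, allow, reason) in enumerate(rows):
    key = f"row_{idx}"
    parts.append(
        f"SELECT :{key}_parent AS parent, :{key}_child AS child, "
        f":{key}_allow AS allow, :{key}_reason AS reason"
    )
    params[f"{key}_parent"] = parent
    params[f"{key}_child"] = child
    params[f"{key}_allow"] = 1 if allow else 0
    params[f"{key}_reason"] = reason
sql = "\nUNION ALL\n".join(parts)
result = sqlite3.connect(":memory:").execute(sql, params).fetchall()
```

Binding via named parameters is what lets reasons and table names contain arbitrary characters without any escaping in the generated SQL.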

View file

@ -1,195 +0,0 @@
"""
Actor restriction handling for Datasette permissions.
This module handles the _r (restrictions) key in actor dictionaries, which
contains allowlists of resources the actor can access.
"""
from __future__ import annotations
from dataclasses import dataclass
from typing import TYPE_CHECKING, List, Optional, Set, Tuple
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
from datasette.permissions import PermissionSQL
from .helpers import action_in_list, get_action_name_variants
@dataclass
class ActorRestrictions:
"""Parsed actor restrictions from the _r key."""
global_actions: List[str] # _r.a - globally allowed actions
database_actions: dict # _r.d - {db_name: [actions]}
table_actions: dict # _r.r - {db_name: {table: [actions]}}
@classmethod
def from_actor(cls, actor: Optional[dict]) -> Optional["ActorRestrictions"]:
"""Parse restrictions from actor dict. Returns None if no restrictions."""
if not actor:
return None
assert isinstance(actor, dict), "actor must be a dictionary"
restrictions = actor.get("_r")
if restrictions is None:
return None
return cls(
global_actions=restrictions.get("a", []),
database_actions=restrictions.get("d", {}),
table_actions=restrictions.get("r", {}),
)
def is_action_globally_allowed(self, datasette: "Datasette", action: str) -> bool:
"""Check if action is in the global allowlist."""
return action_in_list(datasette, action, self.global_actions)
def get_allowed_databases(self, datasette: "Datasette", action: str) -> Set[str]:
"""Get database names where this action is allowed."""
allowed = set()
for db_name, db_actions in self.database_actions.items():
if action_in_list(datasette, action, db_actions):
allowed.add(db_name)
return allowed
def get_allowed_tables(
self, datasette: "Datasette", action: str
) -> Set[Tuple[str, str]]:
"""Get (database, table) pairs where this action is allowed."""
allowed = set()
for db_name, tables in self.table_actions.items():
for table_name, table_actions in tables.items():
if action_in_list(datasette, action, table_actions):
allowed.add((db_name, table_name))
return allowed
@hookimpl(specname="permission_resources_sql")
async def actor_restrictions_sql(
datasette: "Datasette",
actor: Optional[dict],
action: str,
) -> Optional[List[PermissionSQL]]:
"""
Handle actor restriction-based permission rules.
When an actor has an "_r" key, it contains an allowlist of resources they
can access. This function returns restriction_sql that filters the final
results to only include resources in that allowlist.
The _r structure:
{
"a": ["vi", "pd"], # Global actions allowed
"d": {"mydb": ["vt", "es"]}, # Database-level actions
"r": {"mydb": {"users": ["vt"]}} # Table-level actions
}
"""
if not actor:
return None
restrictions = ActorRestrictions.from_actor(actor)
if restrictions is None:
# No restrictions - all resources allowed
return []
# If globally allowed, no filtering needed
if restrictions.is_action_globally_allowed(datasette, action):
return []
# Build restriction SQL
allowed_dbs = restrictions.get_allowed_databases(datasette, action)
allowed_tables = restrictions.get_allowed_tables(datasette, action)
# If nothing is allowed for this action, return empty-set restriction
if not allowed_dbs and not allowed_tables:
return [
PermissionSQL(
params={"deny": f"actor restrictions: {action} not in allowlist"},
restriction_sql="SELECT NULL AS parent, NULL AS child WHERE 0",
)
]
# Build UNION of allowed resources
selects = []
params = {}
counter = 0
# Database-level entries (parent, NULL) - allows all children
for db_name in allowed_dbs:
key = f"restr_{counter}"
counter += 1
selects.append(f"SELECT :{key}_parent AS parent, NULL AS child")
params[f"{key}_parent"] = db_name
# Table-level entries (parent, child)
for db_name, table_name in allowed_tables:
key = f"restr_{counter}"
counter += 1
selects.append(f"SELECT :{key}_parent AS parent, :{key}_child AS child")
params[f"{key}_parent"] = db_name
params[f"{key}_child"] = table_name
restriction_sql = "\nUNION ALL\n".join(selects)
return [PermissionSQL(params=params, restriction_sql=restriction_sql)]
def restrictions_allow_action(
datasette: "Datasette",
restrictions: dict,
action: str,
resource: Optional[str | Tuple[str, str]],
) -> bool:
"""
Check if restrictions allow the requested action on the requested resource.
This is a synchronous utility function for use by other code that needs
to quickly check restriction allowlists.
Args:
datasette: The Datasette instance
restrictions: The _r dict from an actor
action: The action name to check
resource: None for global, str for database, (db, table) tuple for table
Returns:
True if allowed, False if denied
"""
# Does this action have an abbreviation?
to_check = get_action_name_variants(datasette, action)
# Check global level (any resource)
all_allowed = restrictions.get("a")
if all_allowed is not None:
assert isinstance(all_allowed, list)
if to_check.intersection(all_allowed):
return True
# Check database level
if resource:
if isinstance(resource, str):
database_name = resource
else:
database_name = resource[0]
database_allowed = restrictions.get("d", {}).get(database_name)
if database_allowed is not None:
assert isinstance(database_allowed, list)
if to_check.intersection(database_allowed):
return True
# Check table/resource level
if resource is not None and not isinstance(resource, str) and len(resource) == 2:
database, table = resource
table_allowed = restrictions.get("r", {}).get(database, {}).get(table)
if table_allowed is not None:
assert isinstance(table_allowed, list)
if to_check.intersection(table_allowed):
return True
# This action is not explicitly allowed, so reject it
return False
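The three-level allowlist walk above (global, then database, then table) can be condensed into a small standalone sketch. The `_r` dict and action variants below are illustrative examples, and the helper name is hypothetical:

```python
# Sample _r restrictions dict, shaped as documented above
restrictions = {
    "a": ["vi"],                       # globally allowed actions
    "d": {"mydb": ["vt", "es"]},       # database-level actions
    "r": {"mydb": {"users": ["vt"]}},  # table-level actions
}

def allowed(action_variants, resource):
    """Check action variants (full name + abbreviation) against each level."""
    to_check = set(action_variants)
    # Global level: allowed for any resource
    if to_check.intersection(restrictions.get("a", [])):
        return True
    if resource is None:
        return False
    if isinstance(resource, str):
        db, table = resource, None
    else:
        db, table = resource
    # Database level: allowed anywhere within that database
    if to_check.intersection(restrictions.get("d", {}).get(db, [])):
        return True
    # Table level: allowed for that specific (database, table) pair
    if table and to_check.intersection(
        restrictions.get("r", {}).get(db, {}).get(table, [])
    ):
        return True
    return False
```

Anything not explicitly listed at some level is rejected, which is why `_r` acts as an allowlist rather than a denylist.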

View file

@ -1,29 +0,0 @@
"""
Root user permission handling for Datasette.
Grants full permissions to the root user when --root flag is used.
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Optional
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
from datasette.permissions import PermissionSQL
@hookimpl(specname="permission_resources_sql")
async def root_user_permissions_sql(
datasette: "Datasette",
actor: Optional[dict],
) -> Optional[PermissionSQL]:
"""
Grant root user full permissions when --root flag is used.
"""
if not datasette.root_enabled:
return None
if actor is not None and actor.get("id") == "root":
return PermissionSQL.allow(reason="root user")

View file

@ -1,95 +0,0 @@
"""
Token authentication for Datasette.
Handles signed API tokens (dstok_ prefix).
"""
from __future__ import annotations
import time
from typing import TYPE_CHECKING, Optional
if TYPE_CHECKING:
from datasette.app import Datasette
import itsdangerous
from datasette import hookimpl
@hookimpl(specname="actor_from_request")
def actor_from_signed_api_token(datasette: "Datasette", request) -> Optional[dict]:
"""
Authenticate requests using signed API tokens (dstok_ prefix).
Token structure (signed JSON):
{
"a": "actor_id", # Actor ID
"t": 1234567890, # Timestamp (Unix epoch)
"d": 3600, # Optional: Duration in seconds
"_r": {...} # Optional: Restrictions
}
"""
prefix = "dstok_"
# Check if tokens are enabled
if not datasette.setting("allow_signed_tokens"):
return None
max_signed_tokens_ttl = datasette.setting("max_signed_tokens_ttl")
# Get authorization header
authorization = request.headers.get("authorization")
if not authorization:
return None
if not authorization.startswith("Bearer "):
return None
token = authorization[len("Bearer ") :]
if not token.startswith(prefix):
return None
# Remove prefix and verify signature
token = token[len(prefix) :]
try:
decoded = datasette.unsign(token, namespace="token")
except itsdangerous.BadSignature:
return None
# Validate timestamp
if "t" not in decoded:
return None
created = decoded["t"]
if not isinstance(created, int):
return None
# Handle duration/expiry
duration = decoded.get("d")
if duration is not None and not isinstance(duration, int):
return None
# Apply max TTL if configured
if (duration is None and max_signed_tokens_ttl) or (
duration is not None
and max_signed_tokens_ttl
and duration > max_signed_tokens_ttl
):
duration = max_signed_tokens_ttl
# Check expiry
if duration:
if time.time() - created > duration:
return None
# Build actor dict
actor = {"id": decoded["a"], "token": "dstok"}
# Copy restrictions if present
if "_r" in decoded:
actor["_r"] = decoded["_r"]
# Add expiry timestamp if applicable
if duration:
actor["token_expires"] = created + duration
return actor
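The duration/expiry arithmetic above reduces to two small functions. A hedged sketch — the function names are illustrative, not Datasette's API:

```python
import time

def effective_duration(duration, max_ttl):
    """Cap the token duration at max_ttl when max_ttl is set (truthy)."""
    if (duration is None and max_ttl) or (
        duration is not None and max_ttl and duration > max_ttl
    ):
        return max_ttl
    return duration

def is_expired(created, duration, now=None):
    """A token with no duration never expires; otherwise compare elapsed time."""
    now = time.time() if now is None else now
    return bool(duration) and now - created > duration
```

Note that a `max_signed_tokens_ttl` of 0 is falsy, so it imposes no cap — matching the `if ... and max_signed_tokens_ttl` guards in the hook above.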

View file

@ -2,6 +2,7 @@ from abc import ABC, abstractproperty
 from dataclasses import asdict, dataclass, field
 from datasette.hookspecs import hookimpl
 from datetime import datetime, timezone
+from typing import Optional
 @dataclass
@ -13,7 +14,7 @@ class Event(ABC):
     created: datetime = field(
         init=False, default_factory=lambda: datetime.now(timezone.utc)
     )
-    actor: dict | None
+    actor: Optional[dict]
     def properties(self):
         properties = asdict(self)
@ -62,7 +63,7 @@ class CreateTokenEvent(Event):
     """
     name = "create-token"
-    expires_after: int | None
+    expires_after: Optional[int]
     restrict_all: list
     restrict_database: dict
     restrict_resource: dict

View file

@ -1,8 +1,8 @@
 from datasette import hookimpl
-from datasette.resources import DatabaseResource
 from datasette.views.base import DatasetteError
 from datasette.utils.asgi import BadRequest
 import json
+import numbers
 from .utils import detect_json1, escape_sqlite, path_with_removed_args
@ -13,10 +13,11 @@ def where_filters(request, database, datasette):
     where_clauses = []
     extra_wheres_for_ui = []
     if "_where" in request.args:
-        if not await datasette.allowed(
-            action="execute-sql",
-            resource=DatabaseResource(database=database),
-            actor=request.actor,
+        if not await datasette.permission_allowed(
+            request.actor,
+            "execute-sql",
+            resource=database,
+            default=True,
         ):
             raise DatasetteError("_where= is not allowed", status=403)
     else:

View file

@ -69,6 +69,11 @@ def register_facet_classes():
     """Register Facet subclasses"""
+@hookspec
+def register_permissions(datasette):
+    """Register permissions: returns a list of datasette.permission.Permission named tuples"""
 @hookspec
 def register_actions(datasette):
     """Register actions: returns a list of datasette.permission.Action objects"""
@ -110,6 +115,11 @@ def filters_from_request(request, database, table, datasette):
     ) based on the request"""
+@hookspec
+def permission_allowed(datasette, actor, action, resource):
+    """Check if actor is allowed to perform this action - return True, False or None"""
 @hookspec
 def permission_resources_sql(datasette, actor, action):
     """Return SQL query fragments for permission checks on resources.

View file

@ -1,33 +1,6 @@
 from abc import ABC, abstractmethod
 from dataclasses import dataclass
-from typing import Any, NamedTuple
-import contextvars
-# Context variable to track when permission checks should be skipped
-_skip_permission_checks = contextvars.ContextVar(
-    "skip_permission_checks", default=False
-)
-class SkipPermissions:
-    """Context manager to temporarily skip permission checks.
-    This is not a stable API and may change in future releases.
-    Usage:
-        with SkipPermissions():
-            # Permission checks are skipped within this block
-            response = await datasette.client.get("/protected")
-    """
-    def __enter__(self):
-        self.token = _skip_permission_checks.set(True)
-        return self
-    def __exit__(self, exc_type, exc_val, exc_tb):
-        _skip_permission_checks.reset(self.token)
-        return False
+from typing import Any, Dict, Optional, NamedTuple
 class Resource(ABC):
@ -41,13 +14,9 @@ class Resource(ABC):
     # Class-level metadata (subclasses must define these)
     name: str = None  # e.g., "table", "database", "model"
-    parent_class: type["Resource"] | None = None  # e.g., DatabaseResource for tables
-    # Instance-level optional extra attributes
-    reasons: list[str] | None = None
-    include_reasons: bool | None = None
-    def __init__(self, parent: str | None = None, child: str | None = None):
+    parent_name: Optional[str] = None  # e.g., "database" for tables
+    def __init__(self, parent: Optional[str] = None, child: Optional[str] = None):
         """
         Create a resource instance.
@ -81,29 +50,6 @@ class Resource(ABC):
     def private(self, value: bool):
         self._private = value
-    @classmethod
-    def __init_subclass__(cls):
-        """
-        Validate resource hierarchy doesn't exceed 2 levels.
-        Raises:
-            ValueError: If this resource would create a 3-level hierarchy
-        """
-        super().__init_subclass__()
-        if cls.parent_class is None:
-            return  # Top of hierarchy, nothing to validate
-        # Check if our parent has a parent - that would create 3 levels
-        if cls.parent_class.parent_class is not None:
-            # We have a parent, and that parent has a parent
-            # This creates a 3-level hierarchy, which is not allowed
-            raise ValueError(
-                f"Resource {cls.__name__} creates a 3-level hierarchy: "
-                f"{cls.parent_class.parent_class.__name__} -> {cls.parent_class.__name__} -> {cls.__name__}. "
-                f"Maximum 2 levels allowed (parent -> child)."
-            )
 @classmethod
 @abstractmethod
 def resources_sql(cls) -> str:
@ -122,39 +68,14 @@ class AllowedResource(NamedTuple):
     reason: str
-@dataclass(frozen=True, kw_only=True)
+@dataclass(frozen=True)
 class Action:
     name: str
+    abbr: str | None
     description: str | None
-    abbr: str | None = None
-    resource_class: type[Resource] | None = None
-    also_requires: str | None = None  # Optional action name that must also be allowed
-    @property
-    def takes_parent(self) -> bool:
-        """
-        Whether this action requires a parent identifier when instantiating its resource.
-        Returns False for global-only actions (no resource_class).
-        Returns True for all actions with a resource_class (all resources require a parent identifier).
-        """
-        return self.resource_class is not None
-    @property
-    def takes_child(self) -> bool:
-        """
-        Whether this action requires a child identifier when instantiating its resource.
-        Returns False for global actions (no resource_class).
-        Returns False for parent-level resources (DatabaseResource - parent_class is None).
-        Returns True for child-level resources (TableResource, QueryResource - have a parent_class).
-        """
-        if self.resource_class is None:
-            return False
-        return self.resource_class.parent_class is not None
-_reason_id = 1
+    takes_parent: bool
+    takes_child: bool
+    resource_class: type[Resource]
 @dataclass
@ -165,42 +86,19 @@ class PermissionSQL:
         child TEXT NULL,
         allow INTEGER, -- 1 allow, 0 deny
         reason TEXT
-    For restriction-only plugins, sql can be None and only restriction_sql is provided.
     """
-    sql: str | None = (
-        None  # SQL that SELECTs the 4 columns above (can be None for restriction-only)
-    )
-    params: dict[str, Any] | None = (
-        None  # bound params for the SQL (values only; no ':' prefix)
-    )
-    source: str | None = None  # System will set this to the plugin name
-    restriction_sql: str | None = (
-        None  # Optional SQL that returns (parent, child) for restriction filtering
-    )
-    @classmethod
-    def allow(cls, reason: str, _allow: bool = True) -> "PermissionSQL":
-        global _reason_id
-        i = _reason_id
-        _reason_id += 1
-        return cls(
-            sql=f"SELECT NULL AS parent, NULL AS child, {1 if _allow else 0} AS allow, :reason_{i} AS reason",
-            params={f"reason_{i}": reason},
-        )
-    @classmethod
-    def deny(cls, reason: str) -> "PermissionSQL":
-        return cls.allow(reason=reason, _allow=False)
+    source: str  # identifier used for auditing (e.g., plugin name)
+    sql: str  # SQL that SELECTs the 4 columns above
+    params: Dict[str, Any]  # bound params for the SQL (values only; no ':' prefix)
 # This is obsolete, replaced by Action and ResourceType
 @dataclass
 class Permission:
     name: str
-    abbr: str | None
-    description: str | None
+    abbr: Optional[str]
+    description: Optional[str]
     takes_database: bool
     takes_resource: bool
     default: bool

View file

@ -50,7 +50,7 @@ def after(outcome, hook_name, hook_impls, kwargs):
     results = outcome.get_result()
     if not isinstance(results, list):
         results = [results]
-    print("Results:", file=sys.stderr)
+    print(f"Results:", file=sys.stderr)
     pprint(results, width=40, indent=4, stream=sys.stderr)
@ -94,24 +94,21 @@ def get_plugins():
     for plugin in pm.get_plugins():
         static_path = None
         templates_path = None
-        plugin_name = (
-            plugin.__name__
-            if hasattr(plugin, "__name__")
-            else plugin.__class__.__name__
-        )
-        if plugin_name not in DEFAULT_PLUGINS:
+        if plugin.__name__ not in DEFAULT_PLUGINS:
             try:
-                if (importlib_resources.files(plugin_name) / "static").is_dir():
-                    static_path = str(importlib_resources.files(plugin_name) / "static")
-                if (importlib_resources.files(plugin_name) / "templates").is_dir():
+                if (importlib_resources.files(plugin.__name__) / "static").is_dir():
+                    static_path = str(
+                        importlib_resources.files(plugin.__name__) / "static"
+                    )
+                if (importlib_resources.files(plugin.__name__) / "templates").is_dir():
                     templates_path = str(
-                        importlib_resources.files(plugin_name) / "templates"
+                        importlib_resources.files(plugin.__name__) / "templates"
                     )
             except (TypeError, ModuleNotFoundError):
                 # Caused by --plugins_dir= plugins
                 pass
         plugin_info = {
-            "name": plugin_name,
+            "name": plugin.__name__,
             "static_path": static_path,
             "templates_path": templates_path,
             "hooks": [h.name for h in pm.get_hookcallers(plugin)],

View file

@ -3,7 +3,7 @@ import click
 import json
 import os
 import re
-from subprocess import CalledProcessError, check_call, check_output
+from subprocess import check_call, check_output
 from .common import (
     add_common_publish_arguments_and_options,
@ -23,9 +23,7 @@ def publish_subcommand(publish):
         help="Application name to use when building",
     )
     @click.option(
-        "--service",
-        default="",
-        help="Cloud Run service to deploy (or over-write)",
+        "--service", default="", help="Cloud Run service to deploy (or over-write)"
     )
     @click.option("--spatialite", is_flag=True, help="Enable SpatialLite extension")
     @click.option(
@ -57,32 +55,13 @@ def publish_subcommand(publish):
     @click.option(
         "--max-instances",
         type=int,
-        default=1,
-        show_default=True,
-        help="Maximum Cloud Run instances (use 0 to remove the limit)",
+        help="Maximum Cloud Run instances",
     )
     @click.option(
         "--min-instances",
         type=int,
         help="Minimum Cloud Run instances",
     )
-    @click.option(
-        "--artifact-repository",
-        default="datasette",
-        show_default=True,
-        help="Artifact Registry repository to store the image",
-    )
-    @click.option(
-        "--artifact-region",
-        default="us",
-        show_default=True,
-        help="Artifact Registry location (region or multi-region)",
-    )
-    @click.option(
-        "--artifact-project",
-        default=None,
-        help="Project ID for Artifact Registry (defaults to the active project)",
-    )
     def cloudrun(
         files,
         metadata,
@ -112,9 +91,6 @@ def publish_subcommand(publish):
         apt_get_extras,
         max_instances,
         min_instances,
-        artifact_repository,
-        artifact_region,
-        artifact_project,
     ):
         "Publish databases to Datasette running on Cloud Run"
         fail_if_publish_binary_not_installed(
@ -124,21 +100,6 @@ def publish_subcommand(publish):
             "gcloud config get-value project", shell=True, universal_newlines=True
         ).strip()
-        artifact_project = artifact_project or project
-        # Ensure Artifact Registry exists for the target image
-        _ensure_artifact_registry(
-            artifact_project=artifact_project,
-            artifact_region=artifact_region,
-            artifact_repository=artifact_repository,
-        )
-        artifact_host = (
-            artifact_region
-            if artifact_region.endswith("-docker.pkg.dev")
-            else f"{artifact_region}-docker.pkg.dev"
-        )
         if not service:
             # Show the user their current services, then prompt for one
             click.echo("Please provide a service name for this deployment\n")
@ -156,11 +117,6 @@ def publish_subcommand(publish):
             click.echo("")
             service = click.prompt("Service name", type=str)
-        image_id = (
-            f"{artifact_host}/{artifact_project}/"
-            f"{artifact_repository}/datasette-{service}"
-        )
         extra_metadata = {
             "title": title,
             "license": license,
@ -217,6 +173,7 @@ def publish_subcommand(publish):
             print(fp.read())
         print("\n====================\n")
+        image_id = f"gcr.io/{project}/datasette-{service}"
         check_call(
             "gcloud builds submit --tag {}{}".format(
                 image_id, " --timeout {}".format(timeout) if timeout else ""
@ -230,7 +187,7 @@ def publish_subcommand(publish):
             ("--max-instances", max_instances),
             ("--min-instances", min_instances),
         ):
-            if value is not None:
+            if value:
                 extra_deploy_options.append("{} {}".format(option, value))
         check_call(
             "gcloud run deploy --allow-unauthenticated --platform=managed --image {} {}{}".format(
@ -242,52 +199,6 @@ def publish_subcommand(publish):
     )
-def _ensure_artifact_registry(artifact_project, artifact_region, artifact_repository):
-    """Ensure Artifact Registry API is enabled and the repository exists."""
-    enable_cmd = (
-        "gcloud services enable artifactregistry.googleapis.com "
-        f"--project {artifact_project} --quiet"
-    )
-    try:
-        check_call(enable_cmd, shell=True)
-    except CalledProcessError as exc:
-        raise click.ClickException(
-            "Failed to enable artifactregistry.googleapis.com. "
-            "Please ensure you have permissions to manage services."
-        ) from exc
-    describe_cmd = (
-        "gcloud artifacts repositories describe {repo} --project {project} "
-        "--location {location} --quiet"
-    ).format(
-        repo=artifact_repository,
-        project=artifact_project,
-        location=artifact_region,
-    )
-    try:
-        check_call(describe_cmd, shell=True)
-        return
-    except CalledProcessError:
-        create_cmd = (
-            "gcloud artifacts repositories create {repo} --repository-format=docker "
-            '--location {location} --project {project} --description "Datasette Cloud Run images" --quiet'
-        ).format(
-            repo=artifact_repository,
-            location=artifact_region,
-            project=artifact_project,
-        )
-        try:
-            check_call(create_cmd, shell=True)
-            click.echo(f"Created Artifact Registry repository '{artifact_repository}'")
-        except CalledProcessError as exc:
-            raise click.ClickException(
-                "Failed to create Artifact Registry repository. "
-                "Use --artifact-repository/--artifact-region to point to an existing repo "
-                "or create one manually."
-            ) from exc
 def get_existing_services():
     services = json.loads(
         check_output(
@ -303,7 +214,6 @@ def get_existing_services():
             "url": service["status"]["address"]["url"],
         }
         for service in services
-        if "url" in service["status"]
     ]

View file

@ -20,7 +20,7 @@ def convert_specific_columns_to_json(rows, columns, json_cols):
         if column in json_cols:
             try:
                 value = json.loads(value)
-            except (TypeError, ValueError):
+            except (TypeError, ValueError) as e:
                 pass
         new_row.append(value)
     new_rows.append(new_row)

View file

@ -3,17 +3,31 @@
from datasette.permissions import Resource from datasette.permissions import Resource
class InstanceResource(Resource):
"""The Datasette instance itself."""
name = "instance"
parent_name = None
def __init__(self):
super().__init__(parent=None, child=None)
@classmethod
def resources_sql(cls) -> str:
return "SELECT NULL AS parent, NULL AS child"
class DatabaseResource(Resource): class DatabaseResource(Resource):
"""A database in Datasette.""" """A database in Datasette."""
name = "database" name = "database"
parent_class = None # Top of the resource hierarchy parent_name = "instance"
def __init__(self, database: str): def __init__(self, database: str):
super().__init__(parent=database, child=None) super().__init__(parent=database, child=None)
@classmethod @classmethod
async def resources_sql(cls, datasette) -> str: def resources_sql(cls) -> str:
return """ return """
SELECT database_name AS parent, NULL AS child SELECT database_name AS parent, NULL AS child
FROM catalog_databases FROM catalog_databases
@@ -24,13 +38,13 @@ class TableResource(Resource):
     """A table in a database."""
     name = "table"
-    parent_class = DatabaseResource
+    parent_name = "database"

     def __init__(self, database: str, table: str):
         super().__init__(parent=database, child=table)

     @classmethod
-    async def resources_sql(cls, datasette) -> str:
+    def resources_sql(cls) -> str:
         return """
             SELECT database_name AS parent, table_name AS child
             FROM catalog_tables
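The resource rows above use `NULL` for missing `parent`/`child` columns (the instance has neither, a database has no child), which is a known trap when these rows are later joined on `(parent, child)`: in SQLite, `NULL = NULL` evaluates to `NULL` rather than true, so an equality join silently drops such rows, while `IS` treats two NULLs as equal. A minimal `sqlite3` sketch of just that SQL behavior (not Datasette's actual join):

```python
import sqlite3

# NULL = NULL evaluates to NULL (treated as false in a join condition),
# so the equality join yields no rows; IS matches two NULLs.
conn = sqlite3.connect(":memory:")
eq_count = conn.execute(
    "SELECT count(*) FROM (SELECT NULL AS child) a "
    "JOIN (SELECT NULL AS child) b ON a.child = b.child"
).fetchone()[0]
is_count = conn.execute(
    "SELECT count(*) FROM (SELECT NULL AS child) a "
    "JOIN (SELECT NULL AS child) b ON a.child IS b.child"
).fetchone()[0]
print(eq_count, is_count)  # 0 1
```

The same applies to `JOIN ... USING (parent, child)`, which desugars to equality comparisons and therefore never matches rows whose `child` is `NULL`.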
@@ -44,47 +58,12 @@ class QueryResource(Resource):
     """A canned query in a database."""
     name = "query"
-    parent_class = DatabaseResource
+    parent_name = "database"

     def __init__(self, database: str, query: str):
         super().__init__(parent=database, child=query)

     @classmethod
-    async def resources_sql(cls, datasette) -> str:
-        from datasette.plugins import pm
-        from datasette.utils import await_me_maybe
-
-        # Get all databases from catalog
-        db = datasette.get_internal_database()
-        result = await db.execute("SELECT database_name FROM catalog_databases")
-        databases = [row[0] for row in result.rows]
-
-        # Gather all canned queries from all databases
-        query_pairs = []
-        for database_name in databases:
-            # Call the hook to get queries (including from config via default plugin)
-            for queries_result in pm.hook.canned_queries(
-                datasette=datasette,
-                database=database_name,
-                actor=None,  # Get ALL queries for resource enumeration
-            ):
-                queries = await await_me_maybe(queries_result)
-                if queries:
-                    for query_name in queries.keys():
-                        query_pairs.append((database_name, query_name))
-
-        # Build SQL
-        if not query_pairs:
-            return "SELECT NULL AS parent, NULL AS child WHERE 0"
-
-        # Generate UNION ALL query
-        selects = []
-        for db_name, query_name in query_pairs:
-            # Escape single quotes by doubling them
-            db_escaped = db_name.replace("'", "''")
-            query_escaped = query_name.replace("'", "''")
-            selects.append(
-                f"SELECT '{db_escaped}' AS parent, '{query_escaped}' AS child"
-            )
-        return " UNION ALL ".join(selects)
+    def resources_sql(cls) -> str:
+        # TODO: Need catalog for queries
+        return "SELECT NULL AS parent, NULL AS child WHERE 0"


@@ -93,12 +93,12 @@ const datasetteManager = {
    */
   renderAboveTablePanel: () => {
     const aboveTablePanel = document.querySelector(
-      DOM_SELECTORS.aboveTablePanel,
+      DOM_SELECTORS.aboveTablePanel
    );
     if (!aboveTablePanel) {
       console.warn(
-        "This page does not have a table, the renderAboveTablePanel cannot be used.",
+        "This page does not have a table, the renderAboveTablePanel cannot be used."
      );
       return;
     }


@@ -7,8 +7,8 @@ MIT Licensed
   typeof exports === "object" && typeof module !== "undefined"
     ? (module.exports = factory())
     : typeof define === "function" && define.amd
     ? define(factory)
     : (global.jsonFormatHighlight = factory());
 })(this, function () {
   "use strict";
@@ -42,13 +42,13 @@ MIT Licensed
       color = /true/.test(match)
         ? colors.trueColor
         : /false/.test(match)
         ? colors.falseColor
         : /null/.test(match)
         ? colors.nullColor
         : color;
       }
       return '<span style="color: ' + color + '">' + match + "</span>";
-    },
+    }
   );
 }


@@ -188,8 +188,9 @@ class NavigationSearch extends HTMLElement {
   setupEventListeners() {
     const dialog = this.shadowRoot.querySelector("dialog");
     const input = this.shadowRoot.querySelector(".search-input");
-    const resultsContainer =
-      this.shadowRoot.querySelector(".results-container");
+    const resultsContainer = this.shadowRoot.querySelector(
+      ".results-container"
+    );

     // Global keyboard listener for "/"
     document.addEventListener("keydown", (e) => {
@@ -303,7 +304,7 @@ class NavigationSearch extends HTMLElement {
       this.matches = (this.allItems || []).filter(
         (item) =>
           item.name.toLowerCase().includes(lowerQuery) ||
-          item.url.toLowerCase().includes(lowerQuery),
+          item.url.toLowerCase().includes(lowerQuery)
       );
     }
     this.selectedIndex = this.matches.length > 0 ? 0 : -1;
@@ -335,12 +336,12 @@ class NavigationSearch extends HTMLElement {
         >
           <div>
             <div class="result-name">${this.escapeHtml(
-              match.name,
+              match.name
            )}</div>
             <div class="result-url">${this.escapeHtml(match.url)}</div>
           </div>
         </div>
-      `,
+      `
       )
       .join("");
@@ -376,7 +377,7 @@ class NavigationSearch extends HTMLElement {
           detail: match,
           bubbles: true,
           composed: true,
-        }),
+        })
      );

       // Navigate to URL
// Navigate to URL // Navigate to URL


@@ -132,7 +132,7 @@ const initDatasetteTable = function (manager) {
   /* Only show "Facet by this" if it's not the first column, not selected,
     not a single PK and the Datasette allow_facet setting is True */
   var displayedFacets = Array.from(
-    document.querySelectorAll(".facet-info"),
+    document.querySelectorAll(".facet-info")
   ).map((el) => el.dataset.column);
   var isFirstColumn =
     th.parentElement.querySelector("th:first-of-type") == th;
@@ -152,7 +152,7 @@ const initDatasetteTable = function (manager) {
   }
   /* Show notBlank option if not selected AND at least one visible blank value */
   var tdsForThisColumn = Array.from(
-    th.closest("table").querySelectorAll("td." + th.className),
+    th.closest("table").querySelectorAll("td." + th.className)
   );
   if (
     params.get(`${column}__notblank`) != "1" &&
@@ -191,31 +191,29 @@ const initDatasetteTable = function (manager) {
   // Plugin hook: allow adding JS-based additional menu items
   const columnActionsPayload = {
     columnName: th.dataset.column,
-    columnNotNull: th.dataset.columnNotNull === "1",
+    columnNotNull: th.dataset.columnNotNull === '1',
     columnType: th.dataset.columnType,
-    isPk: th.dataset.isPk === "1",
+    isPk: th.dataset.isPk === '1'
   };
   const columnItemConfigs = manager.makeColumnActions(columnActionsPayload);

-  const menuList = menu.querySelector("ul");
-  columnItemConfigs.forEach((itemConfig) => {
+  const menuList = menu.querySelector('ul');
+  columnItemConfigs.forEach(itemConfig => {
     // Remove items from previous render. We assume entries have unique labels.
     const existingItems = menuList.querySelectorAll(`li`);
-    Array.from(existingItems)
-      .filter((item) => item.innerText === itemConfig.label)
-      .forEach((node) => {
-        node.remove();
-      });
+    Array.from(existingItems).filter(item => item.innerText === itemConfig.label).forEach(node => {
+      node.remove();
+    });

-    const newLink = document.createElement("a");
+    const newLink = document.createElement('a');
     newLink.textContent = itemConfig.label;
-    newLink.href = itemConfig.href ?? "#";
+    newLink.href = itemConfig.href ?? '#';
     if (itemConfig.onClick) {
       newLink.onclick = itemConfig.onClick;
     }

     // Attach new elements to DOM
-    const menuItem = document.createElement("li");
+    const menuItem = document.createElement('li');
     menuItem.appendChild(newLink);
     menuList.appendChild(menuItem);
   });
@@ -227,17 +225,17 @@ const initDatasetteTable = function (manager) {
     menu.style.left = windowWidth - menuWidth - 20 + "px";
   }
   // Align menu .hook arrow with the column cog icon
-  const hook = menu.querySelector(".hook");
-  const icon = th.querySelector(".dropdown-menu-icon");
+  const hook = menu.querySelector('.hook');
+  const icon = th.querySelector('.dropdown-menu-icon');
   const iconRect = icon.getBoundingClientRect();
-  const hookLeft = iconRect.left - menuLeft + 1 + "px";
+  const hookLeft = (iconRect.left - menuLeft + 1) + 'px';
   hook.style.left = hookLeft;
   // Move the whole menu right if the hook is too far right
   const menuRect = menu.getBoundingClientRect();
   if (iconRect.right > menuRect.right) {
-    menu.style.left = iconRect.right - menuWidth + "px";
+    menu.style.left = (iconRect.right - menuWidth) + 'px';
     // And move hook tip as well
-    hook.style.left = menuWidth - 13 + "px";
+    hook.style.left = (menuWidth - 13) + 'px';
   }
 }
@@ -252,9 +250,7 @@ const initDatasetteTable = function (manager) {
   menu.style.display = "none";
   document.body.appendChild(menu);

-  var ths = Array.from(
-    document.querySelectorAll(manager.selectors.tableHeaders),
-  );
+  var ths = Array.from(document.querySelectorAll(manager.selectors.tableHeaders));
   ths.forEach((th) => {
     if (!th.querySelector("a")) {
       return;
@@ -268,9 +264,9 @@ const initDatasetteTable = function (manager) {
 /* Add x buttons to the filter rows */
 function addButtonsToFilterRows(manager) {
   var x = "✖";
-  var rows = Array.from(
-    document.querySelectorAll(manager.selectors.filterRow),
-  ).filter((el) => el.querySelector(".filter-op"));
+  var rows = Array.from(document.querySelectorAll(manager.selectors.filterRow)).filter((el) =>
+    el.querySelector(".filter-op")
+  );
   rows.forEach((row) => {
     var a = document.createElement("a");
     a.setAttribute("href", "#");
@@ -291,18 +287,18 @@ function addButtonsToFilterRows(manager) {
       a.style.display = "none";
     }
   });
-}
+};

 /* Set up datalist autocomplete for filter values */
 function initAutocompleteForFilterValues(manager) {
   function createDataLists() {
     var facetResults = document.querySelectorAll(
-      manager.selectors.facetResults,
+      manager.selectors.facetResults
    );
     Array.from(facetResults).forEach(function (facetResult) {
       // Use link text from all links in the facet result
       var links = Array.from(
-        facetResult.querySelectorAll("li:not(.facet-truncated) a"),
+        facetResult.querySelectorAll("li:not(.facet-truncated) a")
      );
       // Create a datalist element
       var datalist = document.createElement("datalist");
@@ -328,7 +324,7 @@ function initAutocompleteForFilterValues(manager) {
         .setAttribute("list", "datalist-" + event.target.value);
     }
   });
-}
+};

 // Ensures Table UI is initialized only after the Manager is ready.
 document.addEventListener("datasette_init", function (evt) {


@@ -40,6 +40,26 @@ function populateFormFromURL() {
   return params;
 }

+// Update URL with current form values
+function updateURL(formId, page = 1) {
+  const form = document.getElementById(formId);
+  const formData = new FormData(form);
+  const params = new URLSearchParams();
+  for (const [key, value] of formData.entries()) {
+    if (value) {
+      params.append(key, value);
+    }
+  }
+  if (page > 1) {
+    params.set('page', page.toString());
+  }
+  const newURL = window.location.pathname + (params.toString() ? '?' + params.toString() : '');
+  window.history.pushState({}, '', newURL);
+}
+
 // HTML escape function
 function escapeHtml(text) {
   if (text === null || text === undefined) return '';


@@ -1,54 +0,0 @@
-{% if has_debug_permission %}
-{% set query_string = '?' + request.query_string if request.query_string else '' %}
-<style>
-.permissions-debug-tabs {
-  border-bottom: 2px solid #e0e0e0;
-  margin-bottom: 2em;
-  display: flex;
-  flex-wrap: wrap;
-  gap: 0.5em;
-}
-.permissions-debug-tabs a {
-  padding: 0.75em 1.25em;
-  text-decoration: none;
-  color: #333;
-  border-bottom: 3px solid transparent;
-  margin-bottom: -2px;
-  transition: all 0.2s;
-  font-weight: 500;
-}
-.permissions-debug-tabs a:hover {
-  background-color: #f5f5f5;
-  border-bottom-color: #999;
-}
-.permissions-debug-tabs a.active {
-  color: #0066cc;
-  border-bottom-color: #0066cc;
-  background-color: #f0f7ff;
-}
-@media only screen and (max-width: 576px) {
-  .permissions-debug-tabs {
-    flex-direction: column;
-    gap: 0;
-  }
-  .permissions-debug-tabs a {
-    border-bottom: 1px solid #e0e0e0;
-    margin-bottom: 0;
-  }
-  .permissions-debug-tabs a.active {
-    border-left: 3px solid #0066cc;
-    border-bottom: 1px solid #e0e0e0;
-  }
-}
-</style>
-<nav class="permissions-debug-tabs">
-  <a href="{{ urls.path('-/permissions') }}" {% if current_tab == "permissions" %}class="active"{% endif %}>Playground</a>
-  <a href="{{ urls.path('-/check') }}{{ query_string }}" {% if current_tab == "check" %}class="active"{% endif %}>Check</a>
-  <a href="{{ urls.path('-/allowed') }}{{ query_string }}" {% if current_tab == "allowed" %}class="active"{% endif %}>Allowed</a>
-  <a href="{{ urls.path('-/rules') }}{{ query_string }}" {% if current_tab == "rules" %}class="active"{% endif %}>Rules</a>
-  <a href="{{ urls.path('-/actions') }}" {% if current_tab == "actions" %}class="active"{% endif %}>Actions</a>
-  <a href="{{ urls.path('-/allow-debug') }}" {% if current_tab == "allow_debug" %}class="active"{% endif %}>Allow debug</a>
-</nav>
-{% endif %}


@@ -33,9 +33,6 @@ p.message-warning {
 <h1>Debug allow rules</h1>

-{% set current_tab = "allow_debug" %}
-{% include "_permissions_debug_tabs.html" %}
-
 <p>Use this tool to try out different actor and allow combinations. See <a href="https://docs.datasette.io/en/stable/authentication.html#defining-permissions-with-allow-blocks">Defining permissions with "allow" blocks</a> for documentation.</p>

 <form class="core" action="{{ urls.path('-/allow-debug') }}" method="get" style="margin-bottom: 1em">


@@ -57,7 +57,7 @@
       <summary style="cursor: pointer;">Restrict actions that can be performed using this token</summary>
       <h2>All databases and tables</h2>
       <ul>
-        {% for permission in all_actions %}
+        {% for permission in all_permissions %}
         <li><label><input type="checkbox" name="all:{{ permission }}"> {{ permission }}</label></li>
         {% endfor %}
       </ul>
@@ -65,7 +65,7 @@
       {% for database in database_with_tables %}
       <h2>All tables in "{{ database.name }}"</h2>
       <ul>
-        {% for permission in database_actions %}
+        {% for permission in database_permissions %}
         <li><label><input type="checkbox" name="database:{{ database.encoded }}:{{ permission }}"> {{ permission }}</label></li>
         {% endfor %}
       </ul>
@@ -75,7 +75,7 @@
       {% for table in database.tables %}
       <h3>{{ database.name }}: {{ table.name }}</h3>
       <ul>
-        {% for permission in child_actions %}
+        {% for permission in resource_permissions %}
         <li><label><input type="checkbox" name="resource:{{ database.encoded }}:{{ table.encoded }}:{{ permission }}"> {{ permission }}</label></li>
         {% endfor %}
       </ul>


@@ -56,7 +56,7 @@
 {% endif %}

 {% if tables %}
-<h2 id="tables">Tables <a style="font-weight: normal; font-size: 0.75em; padding-left: 0.5em;" href="{{ urls.database(database) }}/-/schema">schema</a></h2>
+<h2 id="tables">Tables</h2>
 {% endif %}
 {% for table in tables %}


@@ -1,43 +0,0 @@
-{% extends "base.html" %}
-
-{% block title %}Registered Actions{% endblock %}
-
-{% block content %}
-
-<h1>Registered actions</h1>
-{% set current_tab = "actions" %}
-{% include "_permissions_debug_tabs.html" %}
-
-<p style="margin-bottom: 2em;">
-  This Datasette instance has registered {{ data|length }} action{{ data|length != 1 and "s" or "" }}.
-  Actions are used by the permission system to control access to different features.
-</p>
-
-<table class="rows-and-columns">
-  <thead>
-    <tr>
-      <th>Name</th>
-      <th>Abbr</th>
-      <th>Description</th>
-      <th>Resource</th>
-      <th>Takes Parent</th>
-      <th>Takes Child</th>
-      <th>Also Requires</th>
-    </tr>
-  </thead>
-  <tbody>
-    {% for action in data %}
-    <tr>
-      <td><strong>{{ action.name }}</strong></td>
-      <td>{% if action.abbr %}<code>{{ action.abbr }}</code>{% endif %}</td>
-      <td>{{ action.description or "" }}</td>
-      <td>{% if action.resource_class %}<code>{{ action.resource_class }}</code>{% endif %}</td>
-      <td>{% if action.takes_parent %}✓{% endif %}</td>
-      <td>{% if action.takes_child %}✓{% endif %}</td>
-      <td>{% if action.also_requires %}<code>{{ action.also_requires }}</code>{% endif %}</td>
-    </tr>
-    {% endfor %}
-  </tbody>
-</table>
-
-{% endblock %}


@@ -9,10 +9,8 @@
 {% endblock %}

 {% block content %}

-<h1>Allowed resources</h1>
-{% set current_tab = "allowed" %}
-{% include "_permissions_debug_tabs.html" %}
+<h1>Allowed Resources</h1>

 <p>Use this tool to check which resources the current actor is allowed to access for a given permission action. It queries the <code>/-/allowed.json</code> API endpoint.</p>
@@ -23,13 +21,13 @@
 {% endif %}

 <div class="permission-form">
-  <form id="allowed-form" method="get" action="{{ urls.path("-/allowed") }}">
+  <form id="allowed-form">
     <div class="form-section">
       <label for="action">Action (permission name):</label>
       <select id="action" name="action" required>
         <option value="">Select an action...</option>
-        {% for action_name in supported_actions %}
-        <option value="{{ action_name }}">{{ action_name }}</option>
+        {% for permission_name in supported_actions %}
+        <option value="{{ permission_name }}">{{ permission_name }}</option>
         {% endfor %}
       </select>
       <small>Only certain actions are supported by this endpoint</small>
@@ -44,7 +42,7 @@
     <div class="form-section">
       <label for="child">Filter by child (optional):</label>
       <input type="text" id="child" name="child" placeholder="e.g., table name">
-      <small>Filter results to a specific child resource (requires parent to be set)</small>
+      <small>Filter results to a specific child resource (requires parent)</small>
     </div>

     <div class="form-section">
@@ -82,7 +80,23 @@ const resultsContent = document.getElementById('results-content');
 const resultsCount = document.getElementById('results-count');
 const pagination = document.getElementById('pagination');
 const submitBtn = document.getElementById('submit-btn');
-const hasDebugPermission = {{ 'true' if has_debug_permission else 'false' }};
+let currentData = null;
+
+form.addEventListener('submit', async (ev) => {
+  ev.preventDefault();
+  updateURL('allowed-form', 1);
+  await fetchResults(1, false);
+});
+
+// Handle browser back/forward
+window.addEventListener('popstate', () => {
+  const params = populateFormFromURL();
+  const action = params.get('action');
+  const page = params.get('page');
+  if (action) {
+    fetchResults(page ? parseInt(page) : 1, false);
+  }
+});

 // Populate form on initial load
 (function() {
@@ -90,11 +104,11 @@
   const action = params.get('action');
   const page = params.get('page');
   if (action) {
-    fetchResults(page ? parseInt(page) : 1);
+    fetchResults(page ? parseInt(page) : 1, false);
   }
 })();

-async function fetchResults(page = 1) {
+async function fetchResults(page = 1, updateHistory = true) {
   submitBtn.disabled = true;
   submitBtn.textContent = 'Loading...';
@@ -122,6 +136,7 @@ async function fetchResults(page = 1) {
     const data = await response.json();

     if (response.ok) {
+      currentData = data;
       displayResults(data);
     } else {
       displayError(data);
@@ -149,9 +164,8 @@ function displayResults(data) {
     html += '<th>Resource Path</th>';
     html += '<th>Parent</th>';
     html += '<th>Child</th>';
-    if (hasDebugPermission) {
-      html += '<th>Reason</th>';
-    }
+    html += '<th>Reason</th>';
+    html += '<th>Source Plugin</th>';
     html += '</tr></thead>';
     html += '<tbody>';
@@ -160,14 +174,8 @@ function displayResults(data) {
       html += `<td><span class="resource-path">${escapeHtml(item.resource || '/')}</span></td>`;
       html += `<td>${escapeHtml(item.parent || '—')}</td>`;
       html += `<td>${escapeHtml(item.child || '—')}</td>`;
-      if (hasDebugPermission) {
-        // Display reason as JSON array
-        let reasonHtml = '—';
-        if (item.reason && Array.isArray(item.reason)) {
-          reasonHtml = `<code>${escapeHtml(JSON.stringify(item.reason))}</code>`;
-        }
-        html += `<td>${reasonHtml}</td>`;
-      }
+      html += `<td>${escapeHtml(item.reason || '—')}</td>`;
+      html += `<td>${escapeHtml(item.source_plugin || '—')}</td>`;
       html += '</tr>';
     }
@@ -180,8 +188,13 @@ function displayResults(data) {
   if (data.previous_url || data.next_url) {
     if (data.previous_url) {
       const prevLink = document.createElement('a');
-      prevLink.href = data.previous_url;
+      prevLink.href = '#';
       prevLink.textContent = '← Previous';
+      prevLink.addEventListener('click', (e) => {
+        e.preventDefault();
+        updateURL('allowed-form', data.page - 1);
+        fetchResults(data.page - 1, false);
+      });
       pagination.appendChild(prevLink);
     }
@@ -191,14 +204,22 @@ function displayResults(data) {

     if (data.next_url) {
       const nextLink = document.createElement('a');
-      nextLink.href = data.next_url;
+      nextLink.href = '#';
       nextLink.textContent = 'Next →';
+      nextLink.addEventListener('click', (e) => {
+        e.preventDefault();
+        updateURL('allowed-form', data.page + 1);
+        fetchResults(data.page + 1, false);
+      });
       pagination.appendChild(nextLink);
     }
   }

   // Update raw JSON
   document.getElementById('raw-json').innerHTML = jsonFormatHighlight(data);
+
+  // Scroll to results
+  resultsContainer.scrollIntoView({ behavior: 'smooth', block: 'nearest' });
 }

 function displayError(data) {
@@ -209,21 +230,20 @@ function displayError(data) {
   resultsContent.innerHTML = `<div class="error-message">Error: ${escapeHtml(data.error || 'Unknown error')}</div>`;
   document.getElementById('raw-json').innerHTML = jsonFormatHighlight(data);
+  resultsContainer.scrollIntoView({ behavior: 'smooth', block: 'nearest' });
 }

 // Disable child input if parent is empty
 const parentInput = document.getElementById('parent');
 const childInput = document.getElementById('child');
-parentInput.addEventListener('input', () => {
-  childInput.disabled = !parentInput.value;
+childInput.addEventListener('focus', () => {
   if (!parentInput.value) {
-    childInput.value = '';
+    alert('Please specify a parent resource first before filtering by child resource.');
+    parentInput.focus();
   }
 });
-
-// Initialize disabled state
-childInput.disabled = !parentInput.value;
 </script>
 {% endblock %}


@@ -4,9 +4,35 @@
 {% block extra_head %}
 <script src="{{ base_url }}-/static/json-format-highlight-1.0.1.js"></script>
-{% include "_permission_ui_styles.html" %}
 {% include "_debug_common_functions.html" %}
 <style>
+.form-section {
+  margin-bottom: 1em;
+}
+.form-section label {
+  display: block;
+  margin-bottom: 0.3em;
+  font-weight: bold;
+}
+.form-section input[type="text"],
+.form-section select {
+  width: 100%;
+  max-width: 500px;
+  padding: 0.5em;
+  box-sizing: border-box;
+  border: 1px solid #ccc;
+  border-radius: 3px;
+}
+.form-section input[type="text"]:focus,
+.form-section select:focus {
+  outline: 2px solid #0066cc;
+  border-color: #0066cc;
+}
+.form-section small {
+  display: block;
+  margin-top: 0.3em;
+  color: #666;
+}
 #output {
   margin-top: 2em;
   padding: 1em;
@@ -48,14 +74,28 @@
 .details-section dd {
   margin-left: 1em;
 }
+#submit-btn {
+  padding: 0.6em 1.5em;
+  font-size: 1em;
+  background-color: #0066cc;
+  color: white;
+  border: none;
+  border-radius: 3px;
+  cursor: pointer;
+}
+#submit-btn:hover {
+  background-color: #0052a3;
+}
+#submit-btn:disabled {
+  background-color: #ccc;
+  cursor: not-allowed;
+}
 </style>
 {% endblock %}

 {% block content %}

-<h1>Permission check</h1>
-{% set current_tab = "check" %}
-{% include "_permissions_debug_tabs.html" %}
+<h1>Permission Check</h1>

 <p>Use this tool to test permission checks for the current actor. It queries the <code>/-/check.json</code> API endpoint.</p>
@@ -65,36 +105,32 @@
 <p>Current actor: <strong>anonymous (not logged in)</strong></p>
 {% endif %}

-<div class="permission-form">
-  <form id="check-form" method="get" action="{{ urls.path("-/check") }}">
-    <div class="form-section">
-      <label for="action">Action (permission name):</label>
-      <select id="action" name="action" required>
-        <option value="">Select an action...</option>
-        {% for action_name in sorted_actions %}
-        <option value="{{ action_name }}">{{ action_name }}</option>
-        {% endfor %}
-      </select>
-      <small>The permission action to check</small>
-    </div>
-
-    <div class="form-section">
-      <label for="parent">Parent resource (optional):</label>
-      <input type="text" id="parent" name="parent" placeholder="e.g., database name">
-      <small>For database-level permissions, specify the database name</small>
-    </div>
-
-    <div class="form-section">
-      <label for="child">Child resource (optional):</label>
-      <input type="text" id="child" name="child" placeholder="e.g., table name">
-      <small>For table-level permissions, specify the table name (requires parent)</small>
-    </div>
-
-    <div class="form-actions">
-      <button type="submit" class="submit-btn" id="submit-btn">Check Permission</button>
-    </div>
-  </form>
-</div>
+<form id="check-form" class="core">
+  <div class="form-section">
+    <label for="action">Action (permission name):</label>
+    <select id="action" name="action" required>
+      <option value="">Select an action...</option>
+      {% for permission_name in sorted_permissions %}
+      <option value="{{ permission_name }}">{{ permission_name }}</option>
+      {% endfor %}
+    </select>
+    <small>The permission action to check</small>
+  </div>
+
+  <div class="form-section">
+    <label for="parent">Parent resource (optional):</label>
+    <input type="text" id="parent" name="parent" placeholder="e.g., database name">
+    <small>For database-level permissions, specify the database name</small>
+  </div>
+
+  <div class="form-section">
+    <label for="child">Child resource (optional):</label>
+    <input type="text" id="child" name="child" placeholder="e.g., table name">
+    <small>For table-level permissions, specify the table name (requires parent)</small>
+  </div>
+
+  <button type="submit" id="submit-btn">Check Permission</button>
+</form>
@@ -159,6 +195,21 @@ async function performCheck() {
   }
 }

+form.addEventListener('submit', async (ev) => {
+  ev.preventDefault();
+  updateURL('check-form');
+  await performCheck();
+});
+
+// Handle browser back/forward
+window.addEventListener('popstate', () => {
+  const params = populateFormFromURL();
+  const action = params.get('action');
+  if (action) {
+    performCheck();
+  }
+});
+
 // Populate form on initial load
 (function() {
   const params = populateFormFromURL();


@@ -1,166 +0,0 @@
{% extends "base.html" %}
{% block title %}Debug permissions{% endblock %}
{% block extra_head %}
{% include "_permission_ui_styles.html" %}
<style type="text/css">
.check-result-true {
color: green;
}
.check-result-false {
color: red;
}
.check-result-no-opinion {
color: #aaa;
}
.check h2 {
font-size: 1em
}
.check-action, .check-when, .check-result {
font-size: 1.3em;
}
textarea {
height: 10em;
width: 95%;
box-sizing: border-box;
padding: 0.5em;
border: 2px dotted black;
}
.two-col {
display: inline-block;
width: 48%;
}
.two-col label {
width: 48%;
}
@media only screen and (max-width: 576px) {
.two-col {
width: 100%;
}
}
</style>
{% endblock %}
{% block content %}
<h1>Permission playground</h1>
{% set current_tab = "permissions" %}
{% include "_permissions_debug_tabs.html" %}
<p>This tool lets you simulate an actor and a permission check for that actor.</p>
<div class="permission-form">
<form action="{{ urls.path('-/permissions') }}" id="debug-post" method="post">
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
<div class="two-col">
<div class="form-section">
<label>Actor</label>
<textarea name="actor">{% if actor_input %}{{ actor_input }}{% else %}{"id": "root"}{% endif %}</textarea>
</div>
</div>
<div class="two-col" style="vertical-align: top">
<div class="form-section">
<label for="permission">Action</label>
<select name="permission" id="permission">
{% for permission in permissions %}
<option value="{{ permission.name }}">{{ permission.name }}</option>
{% endfor %}
</select>
</div>
<div class="form-section">
<label for="resource_1">Parent</label>
<input type="text" id="resource_1" name="resource_1" placeholder="e.g., database name">
</div>
<div class="form-section">
<label for="resource_2">Child</label>
<input type="text" id="resource_2" name="resource_2" placeholder="e.g., table name">
</div>
</div>
<div class="form-actions">
<button type="submit" class="submit-btn">Simulate permission check</button>
</div>
<pre style="margin-top: 1em" id="debugResult"></pre>
</form>
</div>
<script>
var rawPerms = {{ permissions|tojson }};
var permissions = Object.fromEntries(rawPerms.map(p => [p.name, p]));
var permissionSelect = document.getElementById('permission');
var resource1 = document.getElementById('resource_1');
var resource2 = document.getElementById('resource_2');
var resource1Section = resource1.closest('.form-section');
var resource2Section = resource2.closest('.form-section');
function updateResourceVisibility() {
var permission = permissionSelect.value;
var {takes_parent, takes_child} = permissions[permission];
resource1Section.style.display = takes_parent ? 'block' : 'none';
resource2Section.style.display = takes_child ? 'block' : 'none';
}
permissionSelect.addEventListener('change', updateResourceVisibility);
updateResourceVisibility();
// When #debug-post form is submitted, use fetch() to POST data
var debugPost = document.getElementById('debug-post');
var debugResult = document.getElementById('debugResult');
debugPost.addEventListener('submit', function(ev) {
ev.preventDefault();
var formData = new FormData(debugPost);
fetch(debugPost.action, {
method: 'POST',
body: new URLSearchParams(formData),
headers: {
'Accept': 'application/json'
}
}).then(function(response) {
if (!response.ok) {
throw new Error('Request failed with status ' + response.status);
}
return response.json();
}).then(function(data) {
debugResult.innerText = JSON.stringify(data, null, 4);
}).catch(function(error) {
debugResult.innerText = JSON.stringify({ error: error.message }, null, 4);
});
});
</script>
<h1>Recent permissions checks</h1>
<p>
{% if filter != "all" %}<a href="?filter=all">All</a>{% else %}<strong>All</strong>{% endif %},
{% if filter != "exclude-yours" %}<a href="?filter=exclude-yours">Exclude yours</a>{% else %}<strong>Exclude yours</strong>{% endif %},
{% if filter != "only-yours" %}<a href="?filter=only-yours">Only yours</a>{% else %}<strong>Only yours</strong>{% endif %}
</p>
{% if permission_checks %}
<table class="rows-and-columns permission-checks-table" id="permission-checks-table">
<thead>
<tr>
<th>When</th>
<th>Action</th>
<th>Parent</th>
<th>Child</th>
<th>Actor</th>
<th>Result</th>
</tr>
</thead>
<tbody>
{% for check in permission_checks %}
<tr>
<td><span style="font-size: 0.8em">{{ check.when.split('T', 1)[0] }}</span><br>{{ check.when.split('T', 1)[1].split('+', 1)[0].split('-', 1)[0].split('Z', 1)[0] }}</td>
<td><code>{{ check.action }}</code></td>
<td>{{ check.parent or '—' }}</td>
<td>{{ check.child or '—' }}</td>
<td>{% if check.actor %}<code>{{ check.actor|tojson }}</code>{% else %}<span class="check-actor-anon">anonymous</span>{% endif %}</td>
<td>{% if check.result %}<span class="check-result check-result-true">Allowed</span>{% elif check.result is none %}<span class="check-result check-result-no-opinion">No opinion</span>{% else %}<span class="check-result check-result-false">Denied</span>{% endif %}</td>
</tr>
{% endfor %}
</tbody>
</table>
{% else %}
<p class="no-results">No permission checks have been recorded yet.</p>
{% endif %}
{% endblock %}


@@ -9,10 +9,8 @@
{% endblock %}
{% block content %}
-<h1>Permission rules</h1>
-{% set current_tab = "rules" %}
-{% include "_permissions_debug_tabs.html" %}
+<h1>Permission Rules</h1>
<p>Use this tool to view the permission rules that allow the current actor to access resources for a given permission action. It queries the <code>/-/rules.json</code> API endpoint.</p>
@@ -23,13 +21,13 @@
{% endif %}
<div class="permission-form">
-<form id="rules-form" method="get" action="{{ urls.path("-/rules") }}">
+<form id="rules-form">
<div class="form-section">
<label for="action">Action (permission name):</label>
<select id="action" name="action" required>
<option value="">Select an action...</option>
-{% for action_name in sorted_actions %}
-<option value="{{ action_name }}">{{ action_name }}</option>
+{% for permission_name in sorted_permissions %}
+<option value="{{ permission_name }}">{{ permission_name }}</option>
{% endfor %}
</select>
<small>The permission action to check</small>
@@ -70,6 +68,23 @@ const resultsContent = document.getElementById('results-content');
const resultsCount = document.getElementById('results-count');
const pagination = document.getElementById('pagination');
const submitBtn = document.getElementById('submit-btn');
+let currentData = null;
+form.addEventListener('submit', async (ev) => {
+ev.preventDefault();
+updateURL('rules-form', 1);
+await fetchResults(1, false);
+});
+// Handle browser back/forward
+window.addEventListener('popstate', () => {
+const params = populateFormFromURL();
+const action = params.get('action');
+const page = params.get('page');
+if (action) {
+fetchResults(page ? parseInt(page) : 1, false);
+}
+});
// Populate form on initial load
(function() {
@@ -77,11 +92,11 @@ const submitBtn = document.getElementById('submit-btn');
const action = params.get('action');
const page = params.get('page');
if (action) {
-fetchResults(page ? parseInt(page) : 1);
+fetchResults(page ? parseInt(page) : 1, false);
}
})();
-async function fetchResults(page = 1) {
+async function fetchResults(page = 1, updateHistory = true) {
submitBtn.disabled = true;
submitBtn.textContent = 'Loading...';
@@ -109,6 +124,7 @@ async function fetchResults(page = 1) {
const data = await response.json();
if (response.ok) {
+currentData = data;
displayResults(data);
} else {
displayError(data);
@@ -137,8 +153,8 @@ function displayResults(data) {
html += '<th>Resource Path</th>';
html += '<th>Parent</th>';
html += '<th>Child</th>';
-html += '<th>Source Plugin</th>';
html += '<th>Reason</th>';
+html += '<th>Source Plugin</th>';
html += '</tr></thead>';
html += '<tbody>';
@@ -153,8 +169,8 @@ function displayResults(data) {
html += `<td><span class="resource-path">${escapeHtml(item.resource || '/')}</span></td>`;
html += `<td>${escapeHtml(item.parent || '—')}</td>`;
html += `<td>${escapeHtml(item.child || '—')}</td>`;
-html += `<td>${escapeHtml(item.source_plugin || '—')}</td>`;
html += `<td>${escapeHtml(item.reason || '—')}</td>`;
+html += `<td>${escapeHtml(item.source_plugin || '—')}</td>`;
html += '</tr>';
}
@@ -167,8 +183,13 @@ function displayResults(data) {
if (data.previous_url || data.next_url) {
if (data.previous_url) {
const prevLink = document.createElement('a');
-prevLink.href = data.previous_url;
+prevLink.href = '#';
prevLink.textContent = '← Previous';
+prevLink.addEventListener('click', (e) => {
+e.preventDefault();
+updateURL('rules-form', data.page - 1);
+fetchResults(data.page - 1, false);
+});
pagination.appendChild(prevLink);
}
@@ -178,14 +199,22 @@ function displayResults(data) {
if (data.next_url) {
const nextLink = document.createElement('a');
-nextLink.href = data.next_url;
+nextLink.href = '#';
nextLink.textContent = 'Next →';
+nextLink.addEventListener('click', (e) => {
+e.preventDefault();
+updateURL('rules-form', data.page + 1);
+fetchResults(data.page + 1, false);
+});
pagination.appendChild(nextLink);
}
}
// Update raw JSON
document.getElementById('raw-json').innerHTML = jsonFormatHighlight(data);
+// Scroll to results
+resultsContainer.scrollIntoView({ behavior: 'smooth', block: 'nearest' });
}
function displayError(data) {
@@ -196,6 +225,8 @@ function displayError(data) {
resultsContent.innerHTML = `<div class="error-message">Error: ${escapeHtml(data.error || 'Unknown error')}</div>`;
document.getElementById('raw-json').innerHTML = jsonFormatHighlight(data);
+resultsContainer.scrollIntoView({ behavior: 'smooth', block: 'nearest' });
}
</script>


@@ -0,0 +1,149 @@
{% extends "base.html" %}
{% block title %}Debug permissions{% endblock %}
{% block extra_head %}
<style type="text/css">
.check-result-true {
color: green;
}
.check-result-false {
color: red;
}
.check-result-no-opinion {
color: #aaa;
}
.check h2 {
font-size: 1em
}
.check-action, .check-when, .check-result {
font-size: 1.3em;
}
textarea {
height: 10em;
width: 95%;
box-sizing: border-box;
padding: 0.5em;
border: 2px dotted black;
}
.two-col {
display: inline-block;
width: 48%;
}
.two-col label {
width: 48%;
}
@media only screen and (max-width: 576px) {
.two-col {
width: 100%;
}
}
</style>
{% endblock %}
{% block content %}
<h1>Permission check testing tool</h1>
<p>This tool lets you simulate an actor and a permission check for that actor.</p>
<form class="core" action="{{ urls.path('-/permissions') }}" id="debug-post" method="post" style="margin-bottom: 1em">
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
<div class="two-col">
<p><label>Actor</label></p>
<textarea name="actor">{% if actor_input %}{{ actor_input }}{% else %}{"id": "root"}{% endif %}</textarea>
</div>
<div class="two-col" style="vertical-align: top">
<p><label for="permission" style="display:block">Permission</label>
<select name="permission" id="permission">
{% for permission in permissions %}
<option value="{{ permission.name }}">{{ permission.name }} (default {{ permission.default }})</option>
{% endfor %}
</select>
<p><label for="resource_1">Database name</label><input type="text" id="resource_1" name="resource_1"></p>
<p><label for="resource_2">Table or query name</label><input type="text" id="resource_2" name="resource_2"></p>
</div>
<div style="margin-top: 1em;">
<input type="submit" value="Simulate permission check">
</div>
<pre style="margin-top: 1em" id="debugResult"></pre>
</form>
<script>
var rawPerms = {{ permissions|tojson }};
var permissions = Object.fromEntries(rawPerms.map(p => [p.name, p]));
var permissionSelect = document.getElementById('permission');
var resource1 = document.getElementById('resource_1');
var resource2 = document.getElementById('resource_2');
function updateResourceVisibility() {
var permission = permissionSelect.value;
var {takes_database, takes_resource} = permissions[permission];
if (takes_database) {
resource1.closest('p').style.display = 'block';
} else {
resource1.closest('p').style.display = 'none';
}
if (takes_resource) {
resource2.closest('p').style.display = 'block';
} else {
resource2.closest('p').style.display = 'none';
}
}
permissionSelect.addEventListener('change', updateResourceVisibility);
updateResourceVisibility();
// When #debug-post form is submitted, use fetch() to POST data
var debugPost = document.getElementById('debug-post');
var debugResult = document.getElementById('debugResult');
debugPost.addEventListener('submit', function(ev) {
ev.preventDefault();
var formData = new FormData(debugPost);
console.log(formData);
fetch(debugPost.action, {
method: 'POST',
body: new URLSearchParams(formData),
}).then(function(response) {
return response.json();
}).then(function(data) {
debugResult.innerText = JSON.stringify(data, null, 4);
});
});
</script>
<h1>Recent permissions checks</h1>
<p>
{% if filter != "all" %}<a href="?filter=all">All</a>{% else %}<strong>All</strong>{% endif %},
{% if filter != "exclude-yours" %}<a href="?filter=exclude-yours">Exclude yours</a>{% else %}<strong>Exclude yours</strong>{% endif %},
{% if filter != "only-yours" %}<a href="?filter=only-yours">Only yours</a>{% else %}<strong>Only yours</strong>{% endif %}
</p>
{% for check in permission_checks %}
<div class="check">
<h2>
<span class="check-action">{{ check.action }}</span>
checked at
<span class="check-when">{{ check.when }}</span>
{% if check.result %}
<span class="check-result check-result-true"></span>
{% elif check.result is none %}
<span class="check-result check-result-no-opinion">none</span>
{% else %}
<span class="check-result check-result-false"></span>
{% endif %}
{% if check.used_default %}
<span class="check-used-default">(used default)</span>
{% endif %}
</h2>
<p><strong>Actor:</strong> {{ check.actor|tojson }}</p>
{% if check.resource %}
<p><strong>Resource:</strong> {{ check.resource }}</p>
{% endif %}
</div>
{% endfor %}
<h1>All registered permissions</h1>
<pre>{{ permissions|tojson(2) }}</pre>
{% endblock %}


@@ -1,41 +0,0 @@
{% extends "base.html" %}
{% block title %}{% if is_instance %}Schema for all databases{% elif table_name %}Schema for {{ schemas[0].database }}.{{ table_name }}{% else %}Schema for {{ schemas[0].database }}{% endif %}{% endblock %}
{% block body_class %}schema{% endblock %}
{% block crumbs %}
{% if is_instance %}
{{ crumbs.nav(request=request) }}
{% elif table_name %}
{{ crumbs.nav(request=request, database=schemas[0].database, table=table_name) }}
{% else %}
{{ crumbs.nav(request=request, database=schemas[0].database) }}
{% endif %}
{% endblock %}
{% block content %}
<div class="page-header">
<h1>{% if is_instance %}Schema for all databases{% elif table_name %}Schema for {{ table_name }}{% else %}Schema for {{ schemas[0].database }}{% endif %}</h1>
</div>
{% for item in schemas %}
{% if is_instance %}
<h2>{{ item.database }}</h2>
{% endif %}
{% if item.schema %}
<pre style="background-color: #f5f5f5; padding: 1em; overflow-x: auto; border: 1px solid #ddd; border-radius: 4px;"><code>{{ item.schema }}</code></pre>
{% else %}
<p><em>No schema available for this database.</em></p>
{% endif %}
{% if not loop.last %}
<hr style="margin: 2em 0;">
{% endif %}
{% endfor %}
{% if not schemas %}
<p><em>No databases with viewable schemas found.</em></p>
{% endif %}
{% endblock %}


@@ -4,7 +4,6 @@ import aiofiles
import click
from collections import OrderedDict, namedtuple, Counter
import copy
-import dataclasses
import base64
import hashlib
import inspect
@@ -28,58 +27,6 @@ from .sqlite import sqlite3, supports_table_xinfo
if typing.TYPE_CHECKING:
from datasette.database import Database
from datasette.permissions import Resource
@dataclasses.dataclass
class PaginatedResources:
"""Paginated results from allowed_resources query."""
resources: List["Resource"]
next: str | None # Keyset token for next page (None if no more results)
_datasette: typing.Any = dataclasses.field(default=None, repr=False)
_action: str = dataclasses.field(default=None, repr=False)
_actor: typing.Any = dataclasses.field(default=None, repr=False)
_parent: str | None = dataclasses.field(default=None, repr=False)
_include_is_private: bool = dataclasses.field(default=False, repr=False)
_include_reasons: bool = dataclasses.field(default=False, repr=False)
_limit: int = dataclasses.field(default=100, repr=False)
async def all(self):
"""
Async generator that yields all resources across all pages.
Automatically handles pagination under the hood. This is useful when you need
to iterate through all results without manually managing pagination tokens.
Yields:
Resource objects one at a time
Example:
page = await datasette.allowed_resources("view-table", actor)
async for table in page.all():
print(f"{table.parent}/{table.child}")
"""
# Yield all resources from current page
for resource in self.resources:
yield resource
# Continue fetching subsequent pages if there are more
next_token = self.next
while next_token:
page = await self._datasette.allowed_resources(
self._action,
self._actor,
parent=self._parent,
include_is_private=self._include_is_private,
include_reasons=self._include_reasons,
limit=self._limit,
next=next_token,
)
for resource in page.resources:
yield resource
next_token = page.next
# From https://www.sqlite.org/lang_keywords.html
reserved_words = set(
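
The removed `PaginatedResources.all()` helper above wrapped keyset pagination in an async generator: yield the current page, then keep following `next` tokens until exhausted. A minimal self-contained sketch of that pattern (illustrative names and an in-memory datastore, not Datasette's actual API):

```python
import asyncio
from dataclasses import dataclass
from typing import AsyncIterator, List, Optional

# Stand-in for rows in a datastore
DATA: List[int] = list(range(7))

@dataclass
class Page:
    items: List[int]
    next_token: Optional[int]  # keyset token for the next page, None when exhausted

async def fetch_page(token: int = 0, limit: int = 3) -> Page:
    # One "query": return a slice plus the token needed to resume after it
    chunk = DATA[token:token + limit]
    nxt = token + limit if token + limit < len(DATA) else None
    return Page(items=chunk, next_token=nxt)

async def all_items(limit: int = 3) -> AsyncIterator[int]:
    # Mirrors PaginatedResources.all(): yield the current page, then keep
    # fetching subsequent pages while a next token is available
    page = await fetch_page(0, limit)
    for item in page.items:
        yield item
    while page.next_token is not None:
        page = await fetch_page(page.next_token, limit)
        for item in page.items:
            yield item

async def main() -> List[int]:
    return [item async for item in all_items()]

print(asyncio.run(main()))  # [0, 1, 2, 3, 4, 5, 6]
```

The caller never touches tokens; pagination is handled under the hood, which is exactly the convenience the deleted class provided.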


@@ -23,12 +23,42 @@ The core pattern is:
from typing import TYPE_CHECKING
-from datasette.utils.permissions import gather_permission_sql_from_hooks
+from datasette.plugins import pm
+from datasette.utils import await_me_maybe
+from datasette.permissions import PermissionSQL
if TYPE_CHECKING:
from datasette.app import Datasette
def _process_permission_results(results) -> tuple[list[str], dict]:
"""
Process plugin permission results into SQL fragments and parameters.
Args:
results: Results from permission_resources_sql hook (may be list or single PermissionSQL)
Returns:
A tuple of (list of SQL strings, dict of parameters)
"""
rule_sqls = []
all_params = {}
if results is None:
return rule_sqls, all_params
if isinstance(results, list):
for plugin_sql in results:
if isinstance(plugin_sql, PermissionSQL):
rule_sqls.append(plugin_sql.sql)
all_params.update(plugin_sql.params)
elif isinstance(results, PermissionSQL):
rule_sqls.append(results.sql)
all_params.update(results.params)
return rule_sqls, all_params
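
`_process_permission_results()` above flattens the three shapes a hook may return (`None`, a single `PermissionSQL`, or a list of them) into SQL fragments plus one merged params dict. One consequence worth noting: params from different plugins are merged with `dict.update()`, so colliding keys silently overwrite earlier values, which is why the anonymous-rules path later namespaces keys with an `anon_` prefix. A standalone sketch of that normalization (using a local stand-in dataclass, not the real import):

```python
from dataclasses import dataclass, field

@dataclass
class PermissionSQL:  # stand-in for datasette.permissions.PermissionSQL
    sql: str
    params: dict = field(default_factory=dict)

def process(results):
    # Same normalization shape as _process_permission_results:
    # accept None, a single PermissionSQL, or a list of them
    rule_sqls, all_params = [], {}
    if results is None:
        return rule_sqls, all_params
    items = results if isinstance(results, list) else [results]
    for item in items:
        if isinstance(item, PermissionSQL):
            rule_sqls.append(item.sql)
            all_params.update(item.params)  # later plugins win on key collisions
    return rule_sqls, all_params

sqls, params = process([
    PermissionSQL("SELECT :db, NULL, 1, 'ok'", {"db": "db1"}),
    PermissionSQL("SELECT :db, NULL, 0, 'no'", {"db": "db2"}),
])
print(len(sqls), params)  # both fragments kept, but 'db' now holds the second value
```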
async def build_allowed_resources_sql(
datasette: "Datasette",
actor: dict | None,
@@ -70,123 +100,25 @@
if not action_obj:
raise ValueError(f"Unknown action: {action}")
# If this action also_requires another action, we need to combine the queries
if action_obj.also_requires:
# Build both queries
main_sql, main_params = await _build_single_action_sql(
datasette,
actor,
action,
parent=parent,
include_is_private=include_is_private,
)
required_sql, required_params = await _build_single_action_sql(
datasette,
actor,
action_obj.also_requires,
parent=parent,
include_is_private=False,
)
# Merge parameters - they should have identical values for :actor, :actor_id, etc.
all_params = {**main_params, **required_params}
if parent is not None:
all_params["filter_parent"] = parent
# Combine with INNER JOIN - only resources allowed by both actions
combined_sql = f"""
WITH
main_allowed AS (
{main_sql}
),
required_allowed AS (
{required_sql}
)
SELECT m.parent, m.child, m.reason"""
if include_is_private:
combined_sql += ", m.is_private"
combined_sql += """
FROM main_allowed m
INNER JOIN required_allowed r
ON ((m.parent = r.parent) OR (m.parent IS NULL AND r.parent IS NULL))
AND ((m.child = r.child) OR (m.child IS NULL AND r.child IS NULL))
"""
if parent is not None:
combined_sql += "WHERE m.parent = :filter_parent\n"
combined_sql += "ORDER BY m.parent, m.child"
return combined_sql, all_params
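
The `ON` clause above spells out `(m.parent = r.parent) OR (m.parent IS NULL AND r.parent IS NULL)` because `NULL = NULL` evaluates to `NULL` in SQL rather than `TRUE`, so a plain equality join silently drops rows whose join key is `NULL` (here, database-level resources with no child). A quick `sqlite3` demonstration of both behaviours:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# NULL = NULL evaluates to NULL (falsy in a join condition), not TRUE:
assert conn.execute("SELECT NULL = NULL").fetchone()[0] is None

# A plain equality join loses the row whose join key is NULL:
rows = conn.execute("""
    WITH m(parent, child) AS (VALUES (NULL, 'q'), ('db1', 't1')),
         r(parent, child) AS (VALUES (NULL, 'q'), ('db1', 't1'))
    SELECT m.parent, m.child FROM m
    JOIN r ON m.parent = r.parent AND m.child = r.child
""").fetchall()
assert rows == [("db1", "t1")]  # the (NULL, 'q') row is gone

# The NULL-safe condition used in the combined query keeps it:
rows = conn.execute("""
    WITH m(parent, child) AS (VALUES (NULL, 'q'), ('db1', 't1')),
         r(parent, child) AS (VALUES (NULL, 'q'), ('db1', 't1'))
    SELECT m.parent, m.child FROM m
    JOIN r ON ((m.parent = r.parent) OR (m.parent IS NULL AND r.parent IS NULL))
          AND ((m.child = r.child) OR (m.child IS NULL AND r.child IS NULL))
""").fetchall()
assert sorted(rows, key=repr) == sorted([(None, "q"), ("db1", "t1")], key=repr)
```

This is the same NULL-handling bug the homepage migration commit describes fixing for `USING (parent, child)` clauses on database resources.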
# No also_requires, build single action query
return await _build_single_action_sql(
datasette, actor, action, parent=parent, include_is_private=include_is_private
)
async def _build_single_action_sql(
datasette: "Datasette",
actor: dict | None,
action: str,
*,
parent: str | None = None,
include_is_private: bool = False,
) -> tuple[str, dict]:
"""
Build SQL for a single action (internal helper for build_allowed_resources_sql).
This contains the original logic from build_allowed_resources_sql, extracted
to allow combining multiple actions when also_requires is used.
"""
# Get the Action object
action_obj = datasette.actions.get(action)
if not action_obj:
raise ValueError(f"Unknown action: {action}")
# Get base resources SQL from the resource class
-base_resources_sql = await action_obj.resource_class.resources_sql(datasette)
+base_resources_sql = action_obj.resource_class.resources_sql()
-permission_sqls = await gather_permission_sql_from_hooks(
+# Get all permission rule fragments from plugins via the hook
+rule_results = pm.hook.permission_resources_sql(
datasette=datasette,
actor=actor,
action=action,
)
-# If permission_sqls is the sentinel, skip all permission checks
-# Return SQL that allows all resources
-from datasette.utils.permissions import SKIP_PERMISSION_CHECKS
-if permission_sqls is SKIP_PERMISSION_CHECKS:
-cols = "parent, child, 'skip_permission_checks' AS reason"
-if include_is_private:
-cols += ", 0 AS is_private"
-return f"SELECT {cols} FROM ({base_resources_sql})", {}
+# Combine rule fragments and collect parameters
all_params = {}
rule_sqls = []
-restriction_sqls = []
-for permission_sql in permission_sqls:
-# Always collect params (even from restriction-only plugins)
-all_params.update(permission_sql.params or {})
-# Collect restriction SQL filters
-if permission_sql.restriction_sql:
-restriction_sqls.append(permission_sql.restriction_sql)
-# Skip plugins that only provide restriction_sql (no permission rules)
-if permission_sql.sql is None:
-continue
-rule_sqls.append(
-f"""
-SELECT parent, child, allow, reason, '{permission_sql.source}' AS source_plugin FROM (
-{permission_sql.sql}
-)
-""".strip()
-)
+for result in rule_results:
+result = await await_me_maybe(result)
+sqls, params = _process_permission_results(result)
+rule_sqls.extend(sqls)
+all_params.update(params)
# If no rules, return empty result (deny all)
if not rule_sqls:
@@ -211,24 +143,28 @@ async def _build_single_action_sql
# If include_is_private, we need to build anonymous permissions too
if include_is_private:
-anon_permission_sqls = await gather_permission_sql_from_hooks(
+# Get anonymous permission rules
+anon_rule_results = pm.hook.permission_resources_sql(
datasette=datasette,
actor=None,
action=action,
)
-anon_sqls_rewritten = []
+anon_rule_sqls = []
anon_params = {}
+for result in anon_rule_results:
+result = await await_me_maybe(result)
+sqls, params = _process_permission_results(result)
+anon_rule_sqls.extend(sqls)
+# Namespace anonymous params to avoid conflicts
+for key, value in params.items():
+anon_params[f"anon_{key}"] = value
-for permission_sql in anon_permission_sqls:
-# Skip plugins that only provide restriction_sql (no permission rules)
-if permission_sql.sql is None:
-continue
-rewritten_sql = permission_sql.sql
-for key, value in (permission_sql.params or {}).items():
-anon_key = f"anon_{key}"
-anon_params[anon_key] = value
-rewritten_sql = rewritten_sql.replace(f":{key}", f":{anon_key}")
-anon_sqls_rewritten.append(rewritten_sql)
+# Rewrite anonymous SQL to use namespaced params
+anon_sqls_rewritten = []
+for sql in anon_rule_sqls:
+for key in params.keys():
+sql = sql.replace(f":{key}", f":anon_{key}")
+anon_sqls_rewritten.append(sql)
all_params.update(anon_params)
@@ -249,8 +185,8 @@
" SELECT b.parent, b.child,",
" MAX(CASE WHEN ar.allow = 0 THEN 1 ELSE 0 END) AS any_deny,",
" MAX(CASE WHEN ar.allow = 1 THEN 1 ELSE 0 END) AS any_allow,",
-" json_group_array(CASE WHEN ar.allow = 0 THEN ar.source_plugin || ': ' || ar.reason END) AS deny_reasons,",
-" json_group_array(CASE WHEN ar.allow = 1 THEN ar.source_plugin || ': ' || ar.reason END) AS allow_reasons",
+" MAX(CASE WHEN ar.allow = 0 THEN ar.reason ELSE NULL END) AS deny_reason,",
+" MAX(CASE WHEN ar.allow = 1 THEN ar.reason ELSE NULL END) AS allow_reason",
" FROM base b",
" LEFT JOIN all_rules ar ON ar.parent = b.parent AND ar.child = b.child",
" GROUP BY b.parent, b.child",
@@ -259,8 +195,8 @@
" SELECT b.parent, b.child,",
" MAX(CASE WHEN ar.allow = 0 THEN 1 ELSE 0 END) AS any_deny,",
" MAX(CASE WHEN ar.allow = 1 THEN 1 ELSE 0 END) AS any_allow,",
-" json_group_array(CASE WHEN ar.allow = 0 THEN ar.source_plugin || ': ' || ar.reason END) AS deny_reasons,",
-" json_group_array(CASE WHEN ar.allow = 1 THEN ar.source_plugin || ': ' || ar.reason END) AS allow_reasons",
+" MAX(CASE WHEN ar.allow = 0 THEN ar.reason ELSE NULL END) AS deny_reason,",
+" MAX(CASE WHEN ar.allow = 1 THEN ar.reason ELSE NULL END) AS allow_reason",
" FROM base b",
" LEFT JOIN all_rules ar ON ar.parent = b.parent AND ar.child IS NULL",
" GROUP BY b.parent, b.child",
@@ -269,8 +205,8 @@
" SELECT b.parent, b.child,",
" MAX(CASE WHEN ar.allow = 0 THEN 1 ELSE 0 END) AS any_deny,",
" MAX(CASE WHEN ar.allow = 1 THEN 1 ELSE 0 END) AS any_allow,",
-" json_group_array(CASE WHEN ar.allow = 0 THEN ar.source_plugin || ': ' || ar.reason END) AS deny_reasons,",
-" json_group_array(CASE WHEN ar.allow = 1 THEN ar.source_plugin || ': ' || ar.reason END) AS allow_reasons",
+" MAX(CASE WHEN ar.allow = 0 THEN ar.reason ELSE NULL END) AS deny_reason,",
+" MAX(CASE WHEN ar.allow = 1 THEN ar.reason ELSE NULL END) AS allow_reason",
" FROM base b",
" LEFT JOIN all_rules ar ON ar.parent IS NULL AND ar.child IS NULL",
" GROUP BY b.parent, b.child",
@@ -351,13 +287,13 @@
" ELSE 0",
" END AS is_allowed,",
" CASE",
-" WHEN cl.any_deny = 1 THEN cl.deny_reasons",
-" WHEN cl.any_allow = 1 THEN cl.allow_reasons",
-" WHEN pl.any_deny = 1 THEN pl.deny_reasons",
-" WHEN pl.any_allow = 1 THEN pl.allow_reasons",
-" WHEN gl.any_deny = 1 THEN gl.deny_reasons",
-" WHEN gl.any_allow = 1 THEN gl.allow_reasons",
-" ELSE '[]'",
+" WHEN cl.any_deny = 1 THEN cl.deny_reason",
+" WHEN cl.any_allow = 1 THEN cl.allow_reason",
+" WHEN pl.any_deny = 1 THEN pl.deny_reason",
+" WHEN pl.any_allow = 1 THEN pl.allow_reason",
+" WHEN gl.any_deny = 1 THEN gl.deny_reason",
+" WHEN gl.any_allow = 1 THEN gl.allow_reason",
+" ELSE 'default deny'",
" END AS reason",
]
)
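
The `CASE` cascade above checks the child level (`cl`), then the parent level (`pl`), then the global level (`gl`), and within each level a deny beats an allow. That is the logic that lets a database-level deny override a global allow from the root user. A toy `sqlite3` sketch of two levels of that cascade (illustrative schema, not the generated query):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Toy rule table: (parent, child, allow, reason). A deny at a nearer level
# beats an allow further out; at the same level, deny beats allow.
conn.executescript("""
CREATE TABLE rules (parent TEXT, child TEXT, allow INTEGER, reason TEXT);
-- global allow (root-style rule), plus a database-level deny for db1
INSERT INTO rules VALUES (NULL, NULL, 1, 'root allows everything');
INSERT INTO rules VALUES ('db1', NULL, 0, 'db1 denied in settings');
""")
sql = """
SELECT CASE
  -- database level first: deny, then allow
  WHEN EXISTS (SELECT 1 FROM rules WHERE parent = :p AND child IS NULL AND allow = 0) THEN 0
  WHEN EXISTS (SELECT 1 FROM rules WHERE parent = :p AND child IS NULL AND allow = 1) THEN 1
  -- then global level: deny, then allow
  WHEN EXISTS (SELECT 1 FROM rules WHERE parent IS NULL AND child IS NULL AND allow = 0) THEN 0
  WHEN EXISTS (SELECT 1 FROM rules WHERE parent IS NULL AND child IS NULL AND allow = 1) THEN 1
  ELSE 0  -- default deny
END
"""
assert conn.execute(sql, {"p": "db1"}).fetchone()[0] == 0  # db-level deny wins
assert conn.execute(sql, {"p": "db2"}).fetchone()[0] == 1  # falls through to global allow
```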
@@ -383,17 +319,6 @@
query_parts.append(")")
-# Add restriction list CTE if there are restrictions
-if restriction_sqls:
-# Wrap each restriction_sql in a subquery to avoid operator precedence issues
-# with UNION ALL inside the restriction SQL statements
-restriction_intersect = "\nINTERSECT\n".join(
-f"SELECT * FROM ({sql})" for sql in restriction_sqls
-)
-query_parts.extend(
-[",", "restriction_list AS (", f" {restriction_intersect}", ")"]
-)
# Final SELECT
select_cols = "parent, child, reason"
if include_is_private:
@@ -403,17 +328,6 @@
query_parts.append("FROM decisions")
query_parts.append("WHERE is_allowed = 1")
-# Add restriction filter if there are restrictions
-if restriction_sqls:
-query_parts.append(
-"""
-AND EXISTS (
-SELECT 1 FROM restriction_list r
-WHERE (r.parent = decisions.parent OR r.parent IS NULL)
-AND (r.child = decisions.child OR r.child IS NULL)
-)"""
-)
# Add parent filter if specified
if parent is not None:
query_parts.append(" AND parent = :filter_parent")
@@ -425,74 +339,7 @@
return query, all_params
async def build_permission_rules_sql(
datasette: "Datasette", actor: dict | None, action: str
) -> tuple[str, dict]:
"""
Build the UNION SQL and params for all permission rules for a given actor and action.
Returns:
A tuple of (sql, params) where sql is a UNION ALL query that returns
(parent, child, allow, reason, source_plugin) rows.
"""
# Get the Action object
action_obj = datasette.actions.get(action)
if not action_obj:
raise ValueError(f"Unknown action: {action}")
permission_sqls = await gather_permission_sql_from_hooks(
datasette=datasette,
actor=actor,
action=action,
)
# If permission_sqls is the sentinel, skip all permission checks
# Return SQL that allows everything
from datasette.utils.permissions import SKIP_PERMISSION_CHECKS
if permission_sqls is SKIP_PERMISSION_CHECKS:
return (
"SELECT NULL AS parent, NULL AS child, 1 AS allow, 'skip_permission_checks' AS reason, 'skip' AS source_plugin",
{},
[],
)
if not permission_sqls:
return (
"SELECT NULL AS parent, NULL AS child, 0 AS allow, NULL AS reason, NULL AS source_plugin WHERE 0",
{},
[],
)
union_parts = []
all_params = {}
restriction_sqls = []
for permission_sql in permission_sqls:
all_params.update(permission_sql.params or {})
# Collect restriction SQL filters
if permission_sql.restriction_sql:
restriction_sqls.append(permission_sql.restriction_sql)
# Skip plugins that only provide restriction_sql (no permission rules)
if permission_sql.sql is None:
continue
union_parts.append(
f"""
SELECT parent, child, allow, reason, '{permission_sql.source}' AS source_plugin FROM (
{permission_sql.sql}
)
""".strip()
)
rules_union = " UNION ALL ".join(union_parts)
return rules_union, all_params, restriction_sqls
async def check_permission_for_resource(
*,
datasette: "Datasette",
actor: dict | None,
action: str,
@ -515,69 +362,76 @@ async def check_permission_for_resource(
This builds the cascading permission query and checks if the specific
resource is in the allowed set.
"""
rules_union, all_params, restriction_sqls = await build_permission_rules_sql(
datasette, actor, action
)
# If no rules (empty SQL), default deny
if not rules_union:
# Get the Action object
action_obj = datasette.actions.get(action)
if not action_obj:
raise ValueError(f"Unknown action: {action}")
# Get all permission rule fragments from plugins via the hook
rule_results = pm.hook.permission_resources_sql(
datasette=datasette,
actor=actor,
action=action,
)
# Combine rule fragments and collect parameters
all_params = {}
rule_sqls = []
for result in rule_results:
result = await await_me_maybe(result)
sqls, params = _process_permission_results(result)
rule_sqls.extend(sqls)
all_params.update(params)
# If no rules, default deny
if not rule_sqls:
return False
# Build a simplified query that just checks for this one resource
rules_union = " UNION ALL ".join(rule_sqls)
# Add parameters for the resource we're checking
all_params["_check_parent"] = parent
all_params["_check_child"] = child
# If there are restriction filters, check if the resource passes them first
if restriction_sqls:
# Check if resource is in restriction allowlist
# Database-level restrictions (parent, NULL) should match all children (parent, *)
# Wrap each restriction_sql in a subquery to avoid operator precedence issues
restriction_check = "\nINTERSECT\n".join(
f"SELECT * FROM ({sql})" for sql in restriction_sqls
)
restriction_query = f"""
WITH restriction_list AS (
{restriction_check}
)
SELECT EXISTS (
SELECT 1 FROM restriction_list
WHERE (parent = :_check_parent OR parent IS NULL)
AND (child = :_check_child OR child IS NULL)
) AS in_allowlist
"""
result = await datasette.get_internal_database().execute(
restriction_query, all_params
)
if result.rows and not result.rows[0][0]:
# Resource not in restriction allowlist - deny
return False
query = f"""
WITH
all_rules AS (
{rules_union}
),
matched_rules AS (
SELECT ar.*,
CASE
WHEN ar.child IS NOT NULL THEN 2 -- child-level (most specific)
WHEN ar.parent IS NOT NULL THEN 1 -- parent-level
ELSE 0 -- root/global
END AS depth
FROM all_rules ar
WHERE (ar.parent IS NULL OR ar.parent = :_check_parent)
AND (ar.child IS NULL OR ar.child = :_check_child)
),
winner AS (
SELECT *
FROM matched_rules
ORDER BY
depth DESC, -- specificity first (higher depth wins)
CASE WHEN allow=0 THEN 0 ELSE 1 END, -- then deny over allow
source_plugin -- stable tie-break
LIMIT 1
)
SELECT COALESCE((SELECT allow FROM winner), 0) AS is_allowed
child_lvl AS (
SELECT
MAX(CASE WHEN ar.allow = 0 THEN 1 ELSE 0 END) AS any_deny,
MAX(CASE WHEN ar.allow = 1 THEN 1 ELSE 0 END) AS any_allow
FROM all_rules ar
WHERE ar.parent = :_check_parent AND ar.child = :_check_child
),
parent_lvl AS (
SELECT
MAX(CASE WHEN ar.allow = 0 THEN 1 ELSE 0 END) AS any_deny,
MAX(CASE WHEN ar.allow = 1 THEN 1 ELSE 0 END) AS any_allow
FROM all_rules ar
WHERE ar.parent = :_check_parent AND ar.child IS NULL
),
global_lvl AS (
SELECT
MAX(CASE WHEN ar.allow = 0 THEN 1 ELSE 0 END) AS any_deny,
MAX(CASE WHEN ar.allow = 1 THEN 1 ELSE 0 END) AS any_allow
FROM all_rules ar
WHERE ar.parent IS NULL AND ar.child IS NULL
)
SELECT
CASE
WHEN cl.any_deny = 1 THEN 0
WHEN cl.any_allow = 1 THEN 1
WHEN pl.any_deny = 1 THEN 0
WHEN pl.any_allow = 1 THEN 1
WHEN gl.any_deny = 1 THEN 0
WHEN gl.any_allow = 1 THEN 1
ELSE 0
END AS is_allowed
FROM child_lvl cl, parent_lvl pl, global_lvl gl
"""
# Execute the query against the internal database
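The cascade in the rewritten query can be demonstrated on its own: a deny at a more specific level (child, then parent/database, then global) always beats an allow at a less specific one. This is a hedged, standalone sketch using a hypothetical `rules` table, not Datasette's real internal schema:

```python
import sqlite3

# Same shape as the CASE cascade above: child > parent (database) > global,
# with deny beating allow at each level. The `rules` table is illustrative.
CASCADE_SQL = """
WITH
child_lvl AS (
    SELECT MAX(CASE WHEN allow = 0 THEN 1 ELSE 0 END) AS any_deny,
           MAX(CASE WHEN allow = 1 THEN 1 ELSE 0 END) AS any_allow
    FROM rules WHERE parent = :p AND child = :c
),
parent_lvl AS (
    SELECT MAX(CASE WHEN allow = 0 THEN 1 ELSE 0 END) AS any_deny,
           MAX(CASE WHEN allow = 1 THEN 1 ELSE 0 END) AS any_allow
    FROM rules WHERE parent = :p AND child IS NULL
),
global_lvl AS (
    SELECT MAX(CASE WHEN allow = 0 THEN 1 ELSE 0 END) AS any_deny,
           MAX(CASE WHEN allow = 1 THEN 1 ELSE 0 END) AS any_allow
    FROM rules WHERE parent IS NULL AND child IS NULL
)
SELECT CASE
    WHEN cl.any_deny = 1 THEN 0
    WHEN cl.any_allow = 1 THEN 1
    WHEN pl.any_deny = 1 THEN 0
    WHEN pl.any_allow = 1 THEN 1
    WHEN gl.any_deny = 1 THEN 0
    WHEN gl.any_allow = 1 THEN 1
    ELSE 0
END
FROM child_lvl cl, parent_lvl pl, global_lvl gl
"""

def is_allowed(conn, parent, child):
    return bool(conn.execute(CASCADE_SQL, {"p": parent, "c": child}).fetchone()[0])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rules (parent TEXT, child TEXT, allow INTEGER)")
conn.executemany(
    "INSERT INTO rules VALUES (?, ?, ?)",
    [
        (None, None, 1),       # global allow, e.g. the root actor
        ("secrets", None, 0),  # database-level deny from settings
    ],
)
```

This is the behaviour described in the #2509 commit message above: the database-level deny on `secrets` wins over root's global allow, because the deny sits at a more specific depth.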

View file

@ -1,3 +1,4 @@
import hashlib
import json
from datasette.utils import MultiParams, calculate_etag
from mimetypes import guess_type
@ -6,7 +7,6 @@ from pathlib import Path
from http.cookies import SimpleCookie, Morsel
import aiofiles
import aiofiles.os
import re
# Workaround for adding samesite support to pre 3.8 python # Workaround for adding samesite support to pre 3.8 python
Morsel._reserved["samesite"] = "SameSite"
@ -249,9 +249,6 @@ async def asgi_send_html(send, html, status=200, headers=None):
async def asgi_send_redirect(send, location, status=302):
# Prevent open redirect vulnerability: strip multiple leading slashes
# //example.com would be interpreted as a protocol-relative URL (e.g., https://example.com/)
location = re.sub(r"^/+", "/", location)
await asgi_send(
send,
"",

View file

@ -6,82 +6,6 @@ from typing import Any, Dict, Iterable, List, Sequence, Tuple
import sqlite3
from datasette.permissions import PermissionSQL
from datasette.plugins import pm
from datasette.utils import await_me_maybe
# Sentinel object to indicate permission checks should be skipped
SKIP_PERMISSION_CHECKS = object()
async def gather_permission_sql_from_hooks(
*, datasette, actor: dict | None, action: str
) -> List[PermissionSQL] | object:
"""Collect PermissionSQL objects from the permission_resources_sql hook.
Ensures that each returned PermissionSQL has a populated ``source``.
Returns SKIP_PERMISSION_CHECKS sentinel if skip_permission_checks context variable
is set, signaling that all permission checks should be bypassed.
"""
from datasette.permissions import _skip_permission_checks
# Check if we should skip permission checks BEFORE calling hooks
# This avoids creating unawaited coroutines
if _skip_permission_checks.get():
return SKIP_PERMISSION_CHECKS
hook_caller = pm.hook.permission_resources_sql
hookimpls = hook_caller.get_hookimpls()
hook_results = list(hook_caller(datasette=datasette, actor=actor, action=action))
collected: List[PermissionSQL] = []
actor_json = json.dumps(actor) if actor is not None else None
actor_id = actor.get("id") if isinstance(actor, dict) else None
for index, result in enumerate(hook_results):
hookimpl = hookimpls[index]
resolved = await await_me_maybe(result)
default_source = _plugin_name_from_hookimpl(hookimpl)
for permission_sql in _iter_permission_sql_from_result(resolved, action=action):
if not permission_sql.source:
permission_sql.source = default_source
params = permission_sql.params or {}
params.setdefault("action", action)
params.setdefault("actor", actor_json)
params.setdefault("actor_id", actor_id)
collected.append(permission_sql)
return collected
def _plugin_name_from_hookimpl(hookimpl) -> str:
if getattr(hookimpl, "plugin_name", None):
return hookimpl.plugin_name
plugin = getattr(hookimpl, "plugin", None)
if hasattr(plugin, "__name__"):
return plugin.__name__
return repr(plugin)
def _iter_permission_sql_from_result(
result: Any, *, action: str
) -> Iterable[PermissionSQL]:
if result is None:
return []
if isinstance(result, PermissionSQL):
return [result]
if isinstance(result, (list, tuple)):
collected: List[PermissionSQL] = []
for item in result:
collected.extend(_iter_permission_sql_from_result(item, action=action))
return collected
if callable(result):
permission_sql = result(action) # type: ignore[call-arg]
return _iter_permission_sql_from_result(permission_sql, action=action)
raise TypeError(
"Plugin providers must return PermissionSQL instances, sequences, or callables"
)
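The `_iter_permission_sql_from_result` helper above normalizes whatever a hook returns — a single object, a nested sequence, or a callable taking the action — into a flat list. A simplified analog using a plain dataclass instead of Datasette's `PermissionSQL` (names here are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Rule:
    sql: str

def flatten(result, action):
    # Mirror the normalization rules: None -> [], object -> [object],
    # sequence -> recurse into each item, callable -> call with the action.
    if result is None:
        return []
    if isinstance(result, Rule):
        return [result]
    if isinstance(result, (list, tuple)):
        out = []
        for item in result:
            out.extend(flatten(item, action))
        return out
    if callable(result):
        return flatten(result(action), action)
    raise TypeError("expected Rule, sequence, or callable")

rules = flatten([Rule("SELECT 1"), lambda action: Rule(f"-- {action}")], "view-table")
```

The recursion means plugins can freely return nested lists or lazy callables and the caller still sees one flat list of rule objects.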
# -----------------------------
@ -89,37 +13,49 @@ def _iter_permission_sql_from_result(
# -----------------------------
def _namespace_params(i: int, params: Dict[str, Any]) -> Tuple[str, Dict[str, Any]]:
"""
Rewrite parameter placeholders to distinct names per plugin block.
Returns (rewritten_sql, namespaced_params).
"""
replacements = {key: f"{key}_{i}" for key in params.keys()}
def rewrite(s: str) -> str:
for key in sorted(replacements.keys(), key=len, reverse=True):
s = s.replace(f":{key}", f":{replacements[key]}")
return s
namespaced: Dict[str, Any] = {}
for key, value in params.items():
namespaced[replacements[key]] = value
return rewrite, namespaced
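The `_namespace_params` rewriting shown above exists so that two plugins can both bind a parameter named `:id` without colliding once their SQL is combined into one query. A hedged standalone sketch of the same idea (simplified signature, not the exact helper):

```python
from typing import Any, Dict, Tuple

def namespace_params(i: int, sql: str, params: Dict[str, Any]) -> Tuple[str, Dict[str, Any]]:
    # Suffix every parameter name with the plugin's index.
    replacements = {key: f"{key}_{i}" for key in params}
    # Longest keys first, so :id_token is not mangled by the :id rewrite.
    for key in sorted(replacements, key=len, reverse=True):
        sql = sql.replace(f":{key}", f":{replacements[key]}")
    return sql, {replacements[k]: v for k, v in params.items()}

sql0, p0 = namespace_params(0, "SELECT :id", {"id": 1})
sql1, p1 = namespace_params(1, "SELECT :id", {"id": 2})
```

After rewriting, both fragments can be merged into a single UNION ALL with one combined params dict and no clash between the two `:id` bindings.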
def build_rules_union(
actor: dict | None, plugins: Sequence[PermissionSQL]
) -> Tuple[str, Dict[str, Any]]:
"""
Compose plugin SQL into a UNION ALL.
Compose plugin SQL into a UNION ALL with namespaced parameters.
Returns:
union_sql: a SELECT with columns (parent, child, allow, reason, source_plugin)
params: dict of bound parameters including :actor (JSON), :actor_id, and plugin params
params: dict of bound parameters including :actor (JSON), :actor_id, and namespaced plugin params
Note: Plugins are responsible for ensuring their parameter names don't conflict.
The system reserves these parameter names: :actor, :actor_id, :action, :filter_parent
Plugin parameters should be prefixed with a unique identifier (e.g., source name).
"""
parts: List[str] = []
actor_json = json.dumps(actor) if actor else None
actor_id = actor.get("id") if actor else None
params: Dict[str, Any] = {"actor": actor_json, "actor_id": actor_id}
for p in plugins:
for i, p in enumerate(plugins):
# No namespacing - just use plugin params as-is
rewrite, ns_params = _namespace_params(i, p.params)
params.update(p.params or {})
sql_block = rewrite(p.sql)
params.update(ns_params)
# Skip plugins that only provide restriction_sql (no permission rules)
if p.sql is None:
continue
parts.append(
f"""
SELECT parent, child, allow, reason, '{p.source}' AS source_plugin FROM (
{p.sql}
{sql_block}
)
""".strip()
)
@ -172,8 +108,6 @@ async def resolve_permissions_from_catalog(
- resource (rendered "/parent/child" or "/parent" or "/")
"""
resolved_plugins: List[PermissionSQL] = []
restriction_sqls: List[str] = []
for plugin in plugins:
if callable(plugin) and not isinstance(plugin, PermissionSQL):
resolved = plugin(action)  # type: ignore[arg-type]
@ -183,10 +117,6 @@ async def resolve_permissions_from_catalog(
raise TypeError("Plugin providers must return PermissionSQL instances")
resolved_plugins.append(resolved)
# Collect restriction SQL filters
if resolved.restriction_sql:
restriction_sqls.append(resolved.restriction_sql)
union_sql, rule_params = build_rules_union(actor, resolved_plugins)
all_params = {
**(candidate_params or {}),
@ -222,8 +152,8 @@ async def resolve_permissions_from_catalog(
PARTITION BY parent, child
ORDER BY
depth DESC, -- specificity first
CASE WHEN allow=0 THEN 0 ELSE 1 END, -- then deny over allow at same depth
CASE WHEN allow=0 THEN 0 ELSE 1 END, -- deny over allow at same depth
source_plugin -- stable tie-break
) AS rn
FROM matched
),
@ -251,145 +181,6 @@ async def resolve_permissions_from_catalog(
ORDER BY c.parent, c.child
"""
# If there are restriction filters, wrap the query with INTERSECT
# This ensures only resources in the restriction allowlist are returned
if restriction_sqls:
# Start with the main query, but select only parent/child for the INTERSECT
main_query_for_intersect = f"""
WITH
cands AS (
{candidate_sql}
),
rules AS (
{union_sql}
),
matched AS (
SELECT
c.parent, c.child,
r.allow, r.reason, r.source_plugin,
CASE
WHEN r.child IS NOT NULL THEN 2 -- child-level (most specific)
WHEN r.parent IS NOT NULL THEN 1 -- parent-level
ELSE 0 -- root/global
END AS depth
FROM cands c
JOIN rules r
ON (r.parent IS NULL OR r.parent = c.parent)
AND (r.child IS NULL OR r.child = c.child)
),
ranked AS (
SELECT *,
ROW_NUMBER() OVER (
PARTITION BY parent, child
ORDER BY
depth DESC, -- specificity first
CASE WHEN allow=0 THEN 0 ELSE 1 END, -- then deny over allow at same depth
source_plugin -- stable tie-break
) AS rn
FROM matched
),
winner AS (
SELECT parent, child,
allow, reason, source_plugin, depth
FROM ranked WHERE rn = 1
),
permitted_resources AS (
SELECT c.parent, c.child
FROM cands c
LEFT JOIN winner w
ON ((w.parent = c.parent) OR (w.parent IS NULL AND c.parent IS NULL))
AND ((w.child = c.child ) OR (w.child IS NULL AND c.child IS NULL))
WHERE COALESCE(w.allow, CASE WHEN :implicit_deny THEN 0 ELSE NULL END) = 1
)
SELECT parent, child FROM permitted_resources
"""
# Build restriction list with INTERSECT (all must match)
# Then filter to resources that match hierarchically
# Wrap each restriction_sql in a subquery to avoid operator precedence issues
# with UNION ALL inside the restriction SQL statements
restriction_intersect = "\nINTERSECT\n".join(
f"SELECT * FROM ({sql})" for sql in restriction_sqls
)
# Combine: resources allowed by permissions AND in restriction allowlist
# Database-level restrictions (parent, NULL) should match all children (parent, *)
filtered_resources = f"""
WITH restriction_list AS (
{restriction_intersect}
),
permitted AS (
{main_query_for_intersect}
),
filtered AS (
SELECT p.parent, p.child
FROM permitted p
WHERE EXISTS (
SELECT 1 FROM restriction_list r
WHERE (r.parent = p.parent OR r.parent IS NULL)
AND (r.child = p.child OR r.child IS NULL)
)
)
"""
# Now join back to get full results for only the filtered resources
sql = f"""
{filtered_resources}
, cands AS (
{candidate_sql}
),
rules AS (
{union_sql}
),
matched AS (
SELECT
c.parent, c.child,
r.allow, r.reason, r.source_plugin,
CASE
WHEN r.child IS NOT NULL THEN 2 -- child-level (most specific)
WHEN r.parent IS NOT NULL THEN 1 -- parent-level
ELSE 0 -- root/global
END AS depth
FROM cands c
JOIN rules r
ON (r.parent IS NULL OR r.parent = c.parent)
AND (r.child IS NULL OR r.child = c.child)
),
ranked AS (
SELECT *,
ROW_NUMBER() OVER (
PARTITION BY parent, child
ORDER BY
depth DESC, -- specificity first
CASE WHEN allow=0 THEN 0 ELSE 1 END, -- then deny over allow at same depth
source_plugin -- stable tie-break
) AS rn
FROM matched
),
winner AS (
SELECT parent, child,
allow, reason, source_plugin, depth
FROM ranked WHERE rn = 1
)
SELECT
c.parent, c.child,
COALESCE(w.allow, CASE WHEN :implicit_deny THEN 0 ELSE NULL END) AS allow,
COALESCE(w.reason, CASE WHEN :implicit_deny THEN 'implicit deny' ELSE NULL END) AS reason,
w.source_plugin,
COALESCE(w.depth, -1) AS depth,
:action AS action,
CASE
WHEN c.parent IS NULL THEN '/'
WHEN c.child IS NULL THEN '/' || c.parent
ELSE '/' || c.parent || '/' || c.child
END AS resource
FROM filtered c
LEFT JOIN winner w
ON ((w.parent = c.parent) OR (w.parent IS NULL AND c.parent IS NULL))
AND ((w.child = c.child ) OR (w.child IS NULL AND c.child IS NULL))
ORDER BY c.parent, c.child
"""
rows_iter: Iterable[sqlite3.Row] = await db.execute(
sql,
{**all_params, "implicit_deny": 1 if implicit_deny else 0},

View file

@ -1,2 +1,2 @@
__version__ = "1.0a23" __version__ = "1.0a19"
__version_info__ = tuple(__version__.split(".")) __version_info__ = tuple(__version__.split("."))

View file

@ -174,6 +174,24 @@ class BaseView:
headers=headers,
)
async def respond_json_or_html(self, request, data, filename):
"""Return JSON or HTML with pretty JSON depending on format parameter."""
as_format = request.url_vars.get("format")
if as_format:
headers = {}
if self.ds.cors:
add_cors_headers(headers)
return Response.json(data, headers=headers)
else:
return await self.render(
["show_json.html"],
request=request,
context={
"filename": filename,
"data_json": json.dumps(data, indent=4, default=repr),
},
)
@classmethod
def as_view(cls, *class_args, **class_kwargs):
async def view(request, send):

View file

@ -9,10 +9,10 @@ import os
import re
import sqlite_utils
import textwrap
from typing import List
from datasette.events import AlterTableEvent, CreateTableEvent, InsertRowsEvent
from datasette.database import QueryInterrupted
from datasette.resources import DatabaseResource, QueryResource
from datasette.utils import (
add_cors_headers,
await_me_maybe,
@ -49,8 +49,10 @@ class DatabaseView(View):
visible, private = await datasette.check_visibility(
request.actor,
action="view-database",
resource=DatabaseResource(database=database),
permissions=[
("view-database", database),
"view-instance",
],
)
if not visible:
raise Forbidden("You do not have permission to view this database")
@ -70,15 +72,13 @@ class DatabaseView(View):
metadata = await datasette.get_database_metadata(database)
# Get all tables/views this actor can see in bulk with private flag
allowed_tables_page = await datasette.allowed_resources(
"view-table",
request.actor,
parent=database,
include_is_private=True,
limit=1000,
)
from datasette.resources import TableResource
allowed_tables = await datasette.allowed_resources(
"view-table", request.actor, parent=database, include_is_private=True
)
# Create lookup dict for quick access
allowed_dict = {r.child: r for r in allowed_tables_page.resources}
allowed_dict = {r.child: r for r in allowed_tables}
# Filter to just views
view_names_set = set(await db.view_names())
@ -89,25 +89,20 @@ class DatabaseView(View):
]
tables = await get_tables(datasette, request, db, allowed_dict)
# Get allowed queries using the new permission system
allowed_query_page = await datasette.allowed_resources(
"view-query",
request.actor,
parent=database,
include_is_private=True,
limit=1000,
)
# Build canned_queries list by looking up each allowed query
all_queries = await datasette.get_canned_queries(database, request.actor)
canned_queries = []
for query_resource in allowed_query_page.resources:
query_name = query_resource.child
if query_name in all_queries:
canned_queries.append(
dict(all_queries[query_name], private=query_resource.private)
)
for query in (
await datasette.get_canned_queries(database, request.actor)
).values():
query_visible, query_private = await datasette.check_visibility(
request.actor,
permissions=[
("view-query", (database, query["name"])),
("view-database", database),
"view-instance",
],
)
if query_visible:
canned_queries.append(dict(query, private=query_private))
async def database_actions():
links = []
@ -124,10 +119,8 @@ class DatabaseView(View):
attached_databases = [d.name for d in await db.attached_databases()]
allow_execute_sql = await datasette.allowed(
action="execute-sql",
resource=DatabaseResource(database=database),
actor=request.actor,
)
allow_execute_sql = await datasette.permission_allowed(
request.actor, "execute-sql", database
)
json_data = {
"database": database,
@ -349,6 +342,7 @@ async def get_tables(datasette, request, db, allowed_dict):
allowed_dict: Dict mapping table name -> Resource object with .private attribute
"""
tables = []
database = db.name
table_counts = await db.table_counts(100)
hidden_table_names = set(await db.hidden_table_names())
all_foreign_keys = await db.get_all_foreign_keys()
@ -375,13 +369,14 @@ async def get_tables(datasette, request, db, allowed_dict):
async def database_download(request, datasette):
from datasette.resources import DatabaseResource
database = tilde_decode(request.url_vars["database"])
await datasette.ensure_permission(
action="view-database-download",
resource=DatabaseResource(database=database),
actor=request.actor,
)
await datasette.ensure_permissions(
request.actor,
[
("view-database-download", database),
("view-database", database),
"view-instance",
],
)
try:
db = datasette.get_database(route=database)
@ -516,15 +511,13 @@ class QueryView(View):
database = db.name
# Get all tables/views this actor can see in bulk with private flag
allowed_tables_page = await datasette.allowed_resources(
"view-table",
request.actor,
parent=database,
include_is_private=True,
limit=1000,
)
from datasette.resources import TableResource
allowed_tables = await datasette.allowed_resources(
"view-table", request.actor, parent=database, include_is_private=True
)
# Create lookup dict for quick access
allowed_dict = {r.child: r for r in allowed_tables_page.resources}
allowed_dict = {r.child: r for r in allowed_tables}
# Are we a canned query? # Are we a canned query?
canned_query = None
@ -546,17 +539,18 @@ class QueryView(View):
# Respect canned query permissions
visible, private = await datasette.check_visibility(
request.actor,
action="view-query",
resource=QueryResource(database=database, query=canned_query["name"]),
permissions=[
("view-query", (database, canned_query["name"])),
("view-database", database),
"view-instance",
],
)
if not visible:
raise Forbidden("You do not have permission to view this query")
else:
await datasette.ensure_permission(
action="execute-sql",
resource=DatabaseResource(database=database),
actor=request.actor,
)
await datasette.ensure_permissions(
request.actor, [("execute-sql", database)]
)
# Flattened because of ?sql=&name1=value1&name2=value2 feature
@ -735,10 +729,8 @@ class QueryView(View):
path_with_format(request=request, format=key)
)
allow_execute_sql = await datasette.allowed(
action="execute-sql",
resource=DatabaseResource(database=database),
actor=request.actor,
)
allow_execute_sql = await datasette.permission_allowed(
request.actor, "execute-sql", database
)
show_hide_hidden = ""
@ -948,10 +940,8 @@ class TableCreateView(BaseView):
database_name = db.name
# Must have create-table permission
if not await self.ds.allowed(
action="create-table",
resource=DatabaseResource(database=database_name),
actor=request.actor,
):
if not await self.ds.permission_allowed(
request.actor, "create-table", resource=database_name
):
return _error(["Permission denied"], 403)
@ -987,10 +977,8 @@ class TableCreateView(BaseView):
if replace:
# Must have update-row permission
if not await self.ds.allowed(
action="update-row",
resource=DatabaseResource(database=database_name),
actor=request.actor,
):
if not await self.ds.permission_allowed(
request.actor, "update-row", resource=database_name
):
return _error(["Permission denied: need update-row"], 403)
@ -1013,10 +1001,8 @@ class TableCreateView(BaseView):
if rows or row:
# Must have insert-row permission
if not await self.ds.allowed(
action="insert-row",
resource=DatabaseResource(database=database_name),
actor=request.actor,
):
if not await self.ds.permission_allowed(
request.actor, "insert-row", resource=database_name
):
return _error(["Permission denied: need insert-row"], 403)
@ -1028,10 +1014,8 @@ class TableCreateView(BaseView):
else:
# alter=True only if they request it AND they have permission
if data.get("alter"):
if not await self.ds.allowed(
action="alter-table",
resource=DatabaseResource(database=database_name),
actor=request.actor,
):
if not await self.ds.permission_allowed(
request.actor, "alter-table", resource=database_name
):
return _error(["Permission denied: need alter-table"], 403)
alter = True

View file

@ -25,21 +25,20 @@ class IndexView(BaseView):
async def get(self, request):
as_format = request.url_vars["format"]
await self.ds.ensure_permission(action="view-instance", actor=request.actor)
await self.ds.ensure_permissions(request.actor, ["view-instance"])
# Get all allowed databases and tables in bulk
db_page = await self.ds.allowed_resources(
allowed_databases = await self.ds.allowed_resources(
"view-database", request.actor, include_is_private=True
)
allowed_databases = [r async for r in db_page.all()]
allowed_db_dict = {r.parent: r for r in allowed_databases}
# Group tables by database
tables_by_db = {}
table_page = await self.ds.allowed_resources(
allowed_tables = await self.ds.allowed_resources(
"view-table", request.actor, include_is_private=True
)
async for t in table_page.all():
# Group by database
tables_by_db = {}
for t in allowed_tables:
if t.parent not in tables_by_db:
tables_by_db[t.parent] = {}
tables_by_db[t.parent][t.child] = t
@ -178,8 +177,8 @@ class IndexView(BaseView):
"databases": databases, "databases": databases,
"metadata": await self.ds.get_instance_metadata(), "metadata": await self.ds.get_instance_metadata(),
"datasette_version": __version__, "datasette_version": __version__,
"private": not await self.ds.allowed( "private": not await self.ds.permission_allowed(
action="view-instance", actor=None None, "view-instance"
), ),
"top_homepage": make_slot_function( "top_homepage": make_slot_function(
"top_homepage", self.ds, request "top_homepage", self.ds, request

View file

@ -1,7 +1,6 @@
from datasette.utils.asgi import NotFound, Forbidden, Response
from datasette.database import QueryInterrupted
from datasette.events import UpdateRowEvent, DeleteRowEvent
from datasette.resources import TableResource
from .base import DataView, BaseView, _error
from datasette.utils import (
await_me_maybe,
@ -28,8 +27,11 @@ class RowView(DataView):
# Ensure user has permission to view this row
visible, private = await self.ds.check_visibility(
request.actor,
action="view-table",
resource=TableResource(database=database, table=table),
permissions=[
("view-table", (database, table)),
("view-database", database),
"view-instance",
],
)
if not visible:
raise Forbidden("You do not have permission to view this table")
@ -182,10 +184,8 @@ async def _resolve_row_and_check_permission(datasette, request, permission):
return False, _error(["Record not found: {}".format(e.pk_values)], 404)
# Ensure user has permission to delete this row
if not await datasette.allowed(
action=permission,
resource=TableResource(database=resolved.db.name, table=resolved.table),
actor=request.actor,
):
if not await datasette.permission_allowed(
request.actor, permission, resource=(resolved.db.name, resolved.table)
):
return False, _error(["Permission denied"], 403)
@ -247,7 +247,7 @@ class RowUpdateView(BaseView):
if not isinstance(data, dict):
return _error(["JSON must be a dictionary"])
if "update" not in data or not isinstance(data["update"], dict):
if not "update" in data or not isinstance(data["update"], dict):
return _error(["JSON must contain an update dictionary"])
invalid_keys = set(data.keys()) - {"update", "return", "alter"}
@@ -257,10 +257,8 @@ class RowUpdateView(BaseView):
         update = data["update"]
         alter = data.get("alter")
-        if alter and not await self.ds.allowed(
-            action="alter-table",
-            resource=TableResource(database=resolved.db.name, table=resolved.table),
-            actor=request.actor,
+        if alter and not await self.ds.permission_allowed(
+            request.actor, "alter-table", resource=(resolved.db.name, resolved.table)
         ):
             return _error(["Permission denied for alter-table"], 403)

File diff suppressed because it is too large.


@@ -15,7 +15,6 @@ from datasette.events import (
     UpsertRowsEvent,
 )
 from datasette import tracer
-from datasette.resources import DatabaseResource, TableResource
 from datasette.utils import (
     add_cors_headers,
     await_me_maybe,
@@ -166,6 +165,7 @@ async def display_columns_and_rows(
     column_details = {
         col.name: col for col in await db.table_column_details(table_name)
     }
+    table_config = await datasette.table_config(database_name, table_name)
     pks = await db.primary_keys(table_name)
     pks_for_display = pks
     if not pks_for_display:
@@ -449,15 +449,11 @@ class TableInsertView(BaseView):
         if upsert:
             # Must have insert-row AND upsert-row permissions
             if not (
-                await self.ds.allowed(
-                    action="insert-row",
-                    resource=TableResource(database=database_name, table=table_name),
-                    actor=request.actor,
+                await self.ds.permission_allowed(
+                    request.actor, "insert-row", resource=(database_name, table_name)
                 )
-                and await self.ds.allowed(
-                    action="update-row",
-                    resource=TableResource(database=database_name, table=table_name),
-                    actor=request.actor,
+                and await self.ds.permission_allowed(
+                    request.actor, "update-row", resource=(database_name, table_name)
                 )
             ):
                 return _error(
@@ -465,10 +461,8 @@ class TableInsertView(BaseView):
                 )
         else:
             # Must have insert-row permission
-            if not await self.ds.allowed(
-                action="insert-row",
-                resource=TableResource(database=database_name, table=table_name),
-                actor=request.actor,
+            if not await self.ds.permission_allowed(
+                request.actor, "insert-row", resource=(database_name, table_name)
             ):
                 return _error(["Permission denied"], 403)
@@ -497,20 +491,16 @@ class TableInsertView(BaseView):
         if upsert and (ignore or replace):
             return _error(["Upsert does not support ignore or replace"], 400)

-        if replace and not await self.ds.allowed(
-            action="update-row",
-            resource=TableResource(database=database_name, table=table_name),
-            actor=request.actor,
+        if replace and not await self.ds.permission_allowed(
+            request.actor, "update-row", resource=(database_name, table_name)
         ):
             return _error(['Permission denied: need update-row to use "replace"'], 403)

         initial_schema = None
         if alter:
             # Must have alter-table permission
-            if not await self.ds.allowed(
-                action="alter-table",
-                resource=TableResource(database=database_name, table=table_name),
-                actor=request.actor,
+            if not await self.ds.permission_allowed(
+                request.actor, "alter-table", resource=(database_name, table_name)
             ):
                 return _error(["Permission denied for alter-table"], 403)
         # Track initial schema to check if it changed later
@@ -637,10 +627,8 @@ class TableDropView(BaseView):
         db = self.ds.get_database(database_name)
         if not await db.table_exists(table_name):
             return _error(["Table not found: {}".format(table_name)], 404)
-        if not await self.ds.allowed(
-            action="drop-table",
-            resource=TableResource(database=database_name, table=table_name),
-            actor=request.actor,
+        if not await self.ds.permission_allowed(
+            request.actor, "drop-table", resource=(database_name, table_name)
         ):
             return _error(["Permission denied"], 403)
         if not db.is_mutable:
@@ -926,10 +914,8 @@ async def table_view_traced(datasette, request):
                 "true" if datasette.setting("allow_facet") else "false"
             ),
             is_sortable=any(c["sortable"] for c in data["display_columns"]),
-            allow_execute_sql=await datasette.allowed(
-                action="execute-sql",
-                resource=DatabaseResource(database=resolved.db.name),
-                actor=request.actor,
+            allow_execute_sql=await datasette.permission_allowed(
+                request.actor, "execute-sql", resolved.db.name
             ),
             query_ms=1.2,
             select_templates=[
@@ -976,8 +962,11 @@ async def table_view_data(
     # Can this user view it?
     visible, private = await datasette.check_visibility(
         request.actor,
-        action="view-table",
-        resource=TableResource(database=database_name, table=table_name),
+        permissions=[
+            ("view-table", (database_name, table_name)),
+            ("view-database", database_name),
+            "view-instance",
+        ],
     )
     if not visible:
         raise Forbidden("You do not have permission to view this table")


@@ -6,18 +6,18 @@

 Datasette doesn't require authentication by default. Any visitor to a Datasette instance can explore the full data and execute read-only SQL queries.

-Datasette can be configured to only allow authenticated users, or to control which databases, tables, and queries can be accessed by the public or by specific users.
+Datasette's plugin system can be used to add many different styles of authentication, such as user accounts, single sign-on or API keys.

 .. _authentication_actor:

 Actors
 ======

-Through plugins, Datasette can support both authenticated users (with cookies) and authenticated API clients (via authentication tokens). The word "actor" is used to cover both of these cases.
+Through plugins, Datasette can support both authenticated users (with cookies) and authenticated API agents (via authentication tokens). The word "actor" is used to cover both of these cases.

-Every request to Datasette has an associated actor value, available in the code as ``request.actor``. This can be ``None`` for unauthenticated requests, or a JSON compatible Python dictionary for authenticated users or API clients.
+Every request to Datasette has an associated actor value, available in the code as ``request.actor``. This can be ``None`` for unauthenticated requests, or a JSON compatible Python dictionary for authenticated users or API agents.

-The actor dictionary can be any shape - the design of that data structure is left up to the plugins. Actors should always include a unique ``"id"`` string, as demonstrated by the "root" actor below.
+The actor dictionary can be any shape - the design of that data structure is left up to the plugins. A useful convention is to include an ``"id"`` string, as demonstrated by the "root" actor below.

 Plugins can use the :ref:`plugin_hook_actor_from_request` hook to implement custom logic for authenticating an actor based on the incoming HTTP request.
@@ -32,21 +32,19 @@ The one exception is the "root" account, which you can sign into while using Dat

 The ``--root`` flag is designed for local development and testing. When you start Datasette with ``--root``, the root user automatically receives every permission, including:

-* All view permissions (``view-instance``, ``view-database``, ``view-table``, etc.)
-* All write permissions (``insert-row``, ``update-row``, ``delete-row``, ``create-table``, ``alter-table``, ``drop-table``)
-* Debug permissions (``permissions-debug``, ``debug-menu``)
+* All view permissions (view-instance, view-database, view-table, etc.)
+* All write permissions (insert-row, update-row, delete-row, create-table, alter-table, drop-table)
+* Debug permissions (permissions-debug, debug-menu)
 * Any custom permissions defined by plugins

-If you add explicit deny rules in ``datasette.yaml`` those can still block the
-root actor from specific databases or tables.
-
-The ``--root`` flag sets an internal ``root_enabled`` switch - without it, a signed-in user with ``{"id": "root"}`` is treated like any other actor.
+.. warning::
+    The ``--root`` flag should only be used for local development. Never use it in production or on publicly accessible servers.

 To sign in as root, start Datasette using the ``--root`` command-line option, like this::

     datasette --root

-Datasette will output a single-use-only login URL on startup::
+::

     http://127.0.0.1:8001/-/auth-token?token=786fc524e0199d70dc9a581d851f466244e114ca92f33aa3b42a139e9388daa7
     INFO:     Started server process [25801]
@@ -54,7 +52,7 @@ Datasette will output a single-use-only login URL on startup::
     INFO:     Application startup complete.
     INFO:     Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)

-Click on that link and then visit ``http://127.0.0.1:8001/-/actor`` to confirm that you are authenticated as an actor that looks like this:
+The URL on the first line includes a one-use token which can be used to sign in as the "root" actor in your browser. Click on that link and then visit ``http://127.0.0.1:8001/-/actor`` to confirm that you are authenticated as an actor that looks like this:

 .. code-block:: json
@@ -67,7 +65,7 @@ Click on that link and then visit ``http://127.0.0.1:8001/-/actor`` to confirm t

 Permissions
 ===========

-Datasette's permissions system is built around SQL queries. Datasette and its plugins construct SQL queries to resolve the list of resources that an actor can access.
+Datasette has an extensive permissions system built-in, which can be further extended and customized by plugins.

 The key question the permissions system answers is this:
@@ -75,80 +73,37 @@ The key question the permissions system answers is this:

 **Actors** are :ref:`described above <authentication_actor>`.

-An **action** is a string describing the action the actor would like to perform. A full list is :ref:`provided below <actions>` - examples include ``view-table`` and ``execute-sql``.
+An **action** is a string describing the action the actor would like to perform. A full list is :ref:`provided below <permissions>` - examples include ``view-table`` and ``execute-sql``.

 A **resource** is the item the actor wishes to interact with - for example a specific database or table. Some actions, such as ``permissions-debug``, are not associated with a particular resource.

-Datasette's built-in view actions (``view-database``, ``view-table`` etc) are allowed by Datasette's default configuration: unless you :ref:`configure additional permission rules <authentication_permissions_config>` unauthenticated users will be allowed to access content.
+Datasette's built-in view permissions (``view-database``, ``view-table`` etc) default to *allow* - unless you :ref:`configure additional permission rules <authentication_permissions_config>` unauthenticated users will be allowed to access content.

-Other actions, including those introduced by plugins, will default to *deny*.
+Permissions with potentially harmful effects should default to *deny*. Plugin authors should account for this when designing new plugins - for example, the `datasette-upload-csvs <https://github.com/simonw/datasette-upload-csvs>`__ plugin defaults to deny so that installations don't accidentally allow unauthenticated users to create new tables by uploading a CSV file.

-.. _authentication_default_deny:
-
-Denying all permissions by default
-----------------------------------
-
-By default, Datasette allows unauthenticated access to view databases, tables, and execute SQL queries.
-
-You may want to run Datasette in a mode where **all** access is denied by default, and you explicitly grant permissions only to authenticated users, either using the :ref:`--root mechanism <authentication_root>` or through :ref:`configuration file rules <authentication_permissions_config>` or plugins.
-
-Use the ``--default-deny`` command-line option to run Datasette in this mode::
-
-    datasette --default-deny data.db --root
-
-With ``--default-deny`` enabled:
-
-* Anonymous users are denied access to view the instance, databases, tables, and queries
-* Authenticated users are also denied access unless they're explicitly granted permissions
-* The root user (when using ``--root``) still has access to everything
-* You can grant permissions using :ref:`configuration file rules <authentication_permissions_config>` or plugins
-
-For example, to allow only a specific user to access your instance::
-
-    datasette --default-deny data.db --config datasette.yaml
-
-Where ``datasette.yaml`` contains:
-
-.. code-block:: yaml
-
-    allow:
-      id: alice
-
-This configuration will deny access to everyone except the user with ``id`` of ``alice``.
-
 .. _authentication_permissions_explained:

 How permissions are resolved
 ----------------------------

-Datasette performs permission checks using the internal :ref:`datasette_allowed` method, which accepts keyword arguments for ``action``, ``resource`` and an optional ``actor``.
+The :ref:`datasette.permission_allowed(actor, action, resource=None, default=...)<datasette_permission_allowed>` method is called to check if an actor is allowed to perform a specific action.

-``resource`` should be an instance of the appropriate ``Resource`` subclass from :mod:`datasette.resources` - for example ``InstanceResource()``, ``DatabaseResource(database="...")`` or ``TableResource(database="...", table="...")``. This defaults to ``InstanceResource()`` if not specified.
+This method asks every plugin that implements the :ref:`plugin_hook_permission_allowed` hook if the actor is allowed to perform the action.

-When a check runs Datasette gathers allow/deny rules from multiple sources and
-compiles them into a SQL query. The resulting query describes all of the
-resources an actor may access for that action, together with the reasons those
-resources were allowed or denied. The combined sources are:
+Each plugin can return ``True`` to indicate that the actor is allowed to perform the action, ``False`` if they are not allowed and ``None`` if the plugin has no opinion on the matter.

-* ``allow`` blocks configured in :ref:`datasette.yaml <authentication_permissions_config>`.
-* :ref:`Actor restrictions <authentication_cli_create_token_restrict>` encoded into the actor dictionary or API token.
-* The "root" user shortcut when ``--root`` (or :attr:`Datasette.root_enabled <datasette.app.Datasette.root_enabled>`) is active, replying ``True`` to all permission checks unless configuration rules deny them at a more specific level.
-* Any additional SQL provided by plugins implementing :ref:`plugin_hook_permission_resources_sql`.
+``False`` acts as a veto - if any plugin returns ``False`` then the permission check is denied. Otherwise, if any plugin returns ``True`` then the permission check is allowed.

-Datasette evaluates the SQL to determine if the requested ``resource`` is
-included. Explicit deny rules returned by configuration or plugins will block
-access even if other rules allowed it.
+The ``resource`` argument can be used to specify a specific resource that the action is being performed against. Some permissions, such as ``view-instance``, do not involve a resource. Others such as ``view-database`` have a resource that is a string naming the database. Permissions that take both a database name and the name of a table, view or canned query within that database use a resource that is a tuple of two strings, ``(database_name, resource_name)``.
+
+Plugins that implement the ``permission_allowed()`` hook can decide if they are going to consider the provided resource or not.
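The hook resolution order described in this hunk (any ``False`` vetoes, then any ``True`` allows, otherwise the default applies) can be sketched as follows - a simplified model for illustration, not Datasette's actual implementation:

```python
from typing import Optional, Sequence


def resolve_permission(opinions: Sequence[Optional[bool]], default: bool = False) -> bool:
    # Each plugin returns True (allow), False (deny) or None (no opinion).
    if any(opinion is False for opinion in opinions):
        return False  # False acts as a veto
    if any(opinion is True for opinion in opinions):
        return True
    return default  # no plugin had an opinion
```

For example, `resolve_permission([None, True])` allows, while `resolve_permission([True, False])` denies because the single `False` vetoes every other opinion.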
 .. _authentication_permissions_allow:

 Defining permissions with "allow" blocks
 ----------------------------------------

-One way to define permissions in Datasette is to use an ``"allow"`` block :ref:`in the datasette.yaml file <authentication_permissions_config>`. This is a JSON document describing which actors are allowed to perform an action against a specific resource.
-
-Each ``allow`` block is compiled into SQL and combined with any
-:ref:`plugin-provided rules <plugin_hook_permission_resources_sql>` to produce
-the cascading allow/deny decisions that power :ref:`datasette_allowed`.
+The standard way to define permissions in Datasette is to use an ``"allow"`` block :ref:`in the datasette.yaml file <authentication_permissions_config>`. This is a JSON document describing which actors are allowed to perform a permission.

 The most basic form of allow block is this (`allow demo <https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22root%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D>`__, `deny demo <https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22trevor%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D>`__):
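The allow/deny demo links above match an allow block against an actor dictionary. A minimal sketch of that matching rule (simplified - Datasette's real ``actor_matches_allow`` helper also handles special keys such as ``unauthenticated``):

```python
def actor_matches_allow(actor, allow):
    # Simplified sketch: an allow block of True allows everyone, False allows
    # nobody; otherwise any key whose value matches the actor's corresponding
    # field grants access. List values mean "any of these".
    if allow is True:
        return True
    if allow is False or actor is None:
        return False
    for key, value in allow.items():
        expected = value if isinstance(value, list) else [value]
        if actor.get(key) in expected:
            return True
    return False
```

With `allow = {"id": "root"}`, the actor `{"id": "root"}` matches (the "allow demo") while `{"id": "trevor"}` does not (the "deny demo").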
@@ -470,7 +425,7 @@ You can control the following:

 * Access to specific tables and views
 * Access to specific :ref:`canned_queries`

-If a user has permission to view a table they will be able to view that table, independently of whether they have permission to view the database or instance that the table exists within.
+If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries.
@@ -708,7 +663,7 @@ Controlling the ability to execute arbitrary SQL

 Datasette defaults to allowing any site visitor to execute their own custom SQL queries, for example using the form on `the database page <https://latest.datasette.io/fixtures>`__ or by appending a ``?_where=`` parameter to the table page `like this <https://latest.datasette.io/fixtures/facetable?_where=_city_id=1>`__.

-Access to this ability is controlled by the :ref:`actions_execute_sql` permission.
+Access to this ability is controlled by the :ref:`permissions_execute_sql` permission.

 The easiest way to disable arbitrary SQL queries is using the :ref:`default_allow_sql setting <setting_default_allow_sql>` when you first start Datasette running.
@@ -1066,37 +1021,15 @@ This example outputs the following::

         }
     }

-Restrictions act as an allowlist layered on top of the actor's existing
-permissions. They can only remove access the actor would otherwise have - they
-cannot grant new access. If the underlying actor is denied by ``allow`` rules in
-``datasette.yaml`` or by a plugin, a token that lists that resource in its
-``"_r"`` section will still be denied.
-
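The removed paragraph states the rule precisely: a restricted token can only narrow access, never widen it. As a toy sketch (hypothetical helper, not part of Datasette's API):

```python
def allowed_with_restrictions(base_allowed, restrictions, resource):
    # restrictions is None for an unrestricted token; otherwise it is the set
    # of resources listed in the token's "_r" block. A restricted token can
    # only narrow access: both the underlying permission check and the
    # restriction lookup must pass.
    if restrictions is None:
        return base_allowed
    return base_allowed and resource in restrictions
```

Note that when `base_allowed` is `False`, no restriction set can flip the result to `True`.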
 .. _permissions_plugins:

 Checking permissions in plugins
 ===============================

-Datasette plugins can check if an actor has permission to perform an action using :ref:`datasette_allowed` - for example::
-
-    from datasette.resources import TableResource
-
-    can_edit = await datasette.allowed(
-        action="update-row",
-        resource=TableResource(database="fixtures", table="facetable"),
-        actor=request.actor,
-    )
-
-Use :ref:`datasette_ensure_permission` when you need to enforce a permission and
-raise a ``Forbidden`` error automatically.
-
-Plugins that define new operations should return :class:`~datasette.permissions.Action`
-objects from :ref:`plugin_register_actions` and can supply additional allow/deny
-rules by returning :class:`~datasette.permissions.PermissionSQL` objects from the
-:ref:`plugin_hook_permission_resources_sql` hook. Those rules are merged with
-configuration ``allow`` blocks and actor restrictions to determine the final
-result for each check.
+Datasette plugins can check if an actor has permission to perform an action using the :ref:`datasette.permission_allowed(...)<datasette_permission_allowed>` method.
+
+Datasette core performs a number of permission checks, :ref:`documented below <permissions>`. Plugins can implement the :ref:`plugin_hook_permission_allowed` plugin hook to participate in decisions about whether an actor should be able to perform a specified action.

 .. _authentication_actor_matches_allow:
@@ -1116,14 +1049,12 @@ The currently authenticated actor is made available to plugins as ``request.acto

 .. _PermissionsDebugView:

-Permissions debug tools
-=======================
+The permissions debug tool
+==========================

-The debug tool at ``/-/permissions`` is available to any actor with the ``permissions-debug`` permission. By default this is just the :ref:`authenticated root user <authentication_root>` but you can open it up to all users by starting Datasette like this::
-
-    datasette -s permissions.permissions-debug true data.db
-
-The page shows the permission checks that have been carried out by the Datasette instance.
+The debug tool at ``/-/permissions`` is only available to the :ref:`authenticated root user <authentication_root>` (or any actor granted the ``permissions-debug`` action).
+
+It shows the thirty most recent permission checks that have been carried out by the Datasette instance.

 It also provides an interface for running hypothetical permission checks against a hypothetical actor. This is a useful way of confirming that your configured permissions work in the way you expect.
@@ -1132,20 +1063,37 @@ This is designed to help administrators and plugin authors understand exactly ho

 .. _AllowedResourcesView:

 Allowed resources view
-----------------------
+======================

-The ``/-/allowed`` endpoint displays resources that the current actor can access for a specified ``action``.
+The ``/-/allowed`` endpoint displays resources that the current actor can access for a supplied ``action`` query string argument.

 This endpoint provides an interactive HTML form interface. Add ``.json`` to the URL path (e.g. ``/-/allowed.json``) to get the raw JSON response instead.

 Pass ``?action=view-table`` (or another action) to select the action. Optional ``parent=`` and ``child=`` query parameters can narrow the results to a specific database/table pair.

-This endpoint is publicly accessible to help users understand their own permissions. The potentially sensitive ``reason`` field is only shown to users with the ``permissions-debug`` permission - it shows the plugins and explanatory reasons that were responsible for each decision.
+This endpoint is publicly accessible to help users understand their own permissions. However, potentially sensitive fields (``reason`` and ``source_plugin``) are only included in responses for users with the ``permissions-debug`` permission.
+
+Datasette includes helper endpoints for exploring the action-based permission resolver:
+
+``/-/allowed``
+    Returns a paginated list of resources that the current actor is allowed to access for a given action. Pass ``?action=view-table`` (or another action) to select the action, and optional ``parent=``/``child=`` query parameters to narrow the results to a specific database/table pair.
+
+``/-/rules``
+    Lists the raw permission rules (both allow and deny) contributing to each resource for the supplied action. This includes configuration-derived and plugin-provided rules. **Requires the permissions-debug permission** (only available to the root user by default).
+
+``/-/check``
+    Evaluates whether the current actor can perform ``action`` against an optional ``parent``/``child`` resource tuple, returning the winning rule and reason.
+
+These endpoints work in conjunction with :ref:`plugin_hook_permission_resources_sql` and make it easier to verify that configuration allow blocks and plugins are behaving as intended.
+
+All three endpoints support both HTML and JSON responses. Visit the endpoint directly for an interactive HTML form interface, or add ``.json`` to the URL for a raw JSON response.
+
+**Security note:** The ``/-/check`` and ``/-/allowed`` endpoints are publicly accessible to help users understand their own permissions. However, potentially sensitive fields (``reason`` and ``source_plugin``) are only included in responses for users with the ``permissions-debug`` permission. The ``/-/rules`` endpoint requires the ``permissions-debug`` permission for all access.

 .. _PermissionRulesView:

 Permission rules view
----------------------
+=====================

 The ``/-/rules`` endpoint displays all permission rules (both allow and deny) for each candidate resource for the requested action.
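One of the commits in this compare describes the cascade these rules feed into: deny rules at the same depth beat allow rules, so a database-level deny overrides a global-level allow. A toy model of that resolution, for illustration only (the names and depth encoding are hypothetical, not Datasette's SQL implementation):

```python
def resolve_rules(rules):
    # rules is a list of (depth, is_allow) pairs, where a higher depth means a
    # more specific resource (e.g. 0 = instance, 1 = database, 2 = table).
    # The most specific rules win; at equal depth a deny beats an allow.
    if not rules:
        return False
    deepest = max(depth for depth, _ in rules)
    return all(is_allow for depth, is_allow in rules if depth == deepest)
```

Under this model a root-style global allow at depth 0 loses to a configuration deny at depth 1, matching the behaviour the fix for issue #2509 describes.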
@@ -1153,12 +1101,12 @@ This endpoint provides an interactive HTML form interface. Add ``.json`` to the

 Pass ``?action=`` as a query parameter to specify which action to check.

-This endpoint requires the ``permissions-debug`` permission.
+**Requires the permissions-debug permission** - this endpoint returns a 403 Forbidden error for users without this permission.

 .. _PermissionCheckView:

 Permission check view
----------------------
+=====================

 The ``/-/check`` endpoint evaluates a single action/resource pair and returns information indicating whether the access was allowed along with diagnostic information.
@@ -1166,6 +1114,8 @@ This endpoint provides an interactive HTML form interface. Add ``.json`` to the

 Pass ``?action=`` to specify the action to check, and optional ``?parent=`` and ``?child=`` parameters to specify the resource.

+This endpoint is publicly accessible to help users understand their own permissions. However, potentially sensitive fields (``reason`` and ``source_plugin``) are only included in responses for users with the ``permissions-debug`` permission.
+
 .. _authentication_ds_actor:

 The ds_actor cookie
@@ -1231,156 +1181,168 @@ The /-/logout page

 The page at ``/-/logout`` provides the ability to log out of a ``ds_actor`` cookie authentication session.

-.. _actions:
+.. _permissions:

-Built-in actions
-================
+Built-in permissions
+====================

 This section lists all of the permission checks that are carried out by Datasette core, along with the ``resource`` if it was passed.

-.. _actions_view_instance:
+.. _permissions_view_instance:

 view-instance
 -------------

 Top level permission - Actor is allowed to view any pages within this instance, starting at https://latest.datasette.io/

+Default *allow*.
+
-.. _actions_view_database:
+.. _permissions_view_database:

 view-database
 -------------

 Actor is allowed to view a database page, e.g. https://latest.datasette.io/fixtures

-``resource`` - ``datasette.resources.DatabaseResource(database)``
-    ``database`` is the name of the database (string)
+``resource`` - string
+    The name of the database
+
+Default *allow*.

-.. _actions_view_database_download:
+.. _permissions_view_database_download:

 view-database-download
 ----------------------

 Actor is allowed to download a database, e.g. https://latest.datasette.io/fixtures.db

-``resource`` - ``datasette.resources.DatabaseResource(database)``
-    ``database`` is the name of the database (string)
+``resource`` - string
+    The name of the database
+
+Default *allow*.

-.. _actions_view_table:
+.. _permissions_view_table:

 view-table
 ----------

 Actor is allowed to view a table (or view) page, e.g. https://latest.datasette.io/fixtures/complex_foreign_keys

-``resource`` - ``datasette.resources.TableResource(database, table)``
-    ``database`` is the name of the database (string)
-    ``table`` is the name of the table (string)
+``resource`` - tuple: (string, string)
+    The name of the database, then the name of the table
+
+Default *allow*.

-.. _actions_view_query:
+.. _permissions_view_query:

 view-query
 ----------

 Actor is allowed to view (and execute) a :ref:`canned query <canned_queries>` page, e.g. https://latest.datasette.io/fixtures/pragma_cache_size - this includes executing :ref:`canned_queries_writable`.

-``resource`` - ``datasette.resources.QueryResource(database, query)``
-    ``database`` is the name of the database (string)
-    ``query`` is the name of the canned query (string)
+``resource`` - tuple: (string, string)
+    The name of the database, then the name of the canned query
+
+Default *allow*.

-.. _actions_insert_row:
+.. _permissions_insert_row:

 insert-row
 ----------

 Actor is allowed to insert rows into a table.

-``resource`` - ``datasette.resources.TableResource(database, table)``
-    ``database`` is the name of the database (string)
-    ``table`` is the name of the table (string)
+``resource`` - tuple: (string, string)
+    The name of the database, then the name of the table
+
+Default *deny*.

-.. _actions_delete_row:
+.. _permissions_delete_row:

 delete-row
 ----------

 Actor is allowed to delete rows from a table.

-``resource`` - ``datasette.resources.TableResource(database, table)``
-    ``database`` is the name of the database (string)
-    ``table`` is the name of the table (string)
+``resource`` - tuple: (string, string)
+    The name of the database, then the name of the table
+
+Default *deny*.

-.. _actions_update_row:
+.. _permissions_update_row:
update-row update-row
---------- ----------
Actor is allowed to update rows in a table. Actor is allowed to update rows in a table.
``resource`` - ``datasette.resources.TableResource(database, table)`` ``resource`` - tuple: (string, string)
``database`` is the name of the database (string) The name of the database, then the name of the table
``table`` is the name of the table (string) Default *deny*.
.. _actions_create_table: .. _permissions_create_table:
create-table create-table
------------ ------------
Actor is allowed to create a database table. Actor is allowed to create a database table.
``resource`` - ``datasette.resources.DatabaseResource(database)`` ``resource`` - string
``database`` is the name of the database (string) The name of the database
.. _actions_alter_table: Default *deny*.
.. _permissions_alter_table:
alter-table alter-table
----------- -----------
Actor is allowed to alter a database table. Actor is allowed to alter a database table.
``resource`` - ``datasette.resources.TableResource(database, table)`` ``resource`` - tuple: (string, string)
``database`` is the name of the database (string) The name of the database, then the name of the table
``table`` is the name of the table (string) Default *deny*.
.. _actions_drop_table: .. _permissions_drop_table:
drop-table drop-table
---------- ----------
Actor is allowed to drop a database table. Actor is allowed to drop a database table.
``resource`` - ``datasette.resources.TableResource(database, table)`` ``resource`` - tuple: (string, string)
``database`` is the name of the database (string) The name of the database, then the name of the table
``table`` is the name of the table (string) Default *deny*.
.. _actions_execute_sql: .. _permissions_execute_sql:
execute-sql execute-sql
----------- -----------
Actor is allowed to run arbitrary SQL queries against a specific database, e.g. https://latest.datasette.io/fixtures/-/query?sql=select+100 Actor is allowed to run arbitrary SQL queries against a specific database, e.g. https://latest.datasette.io/fixtures?sql=select+100
``resource`` - ``datasette.resources.DatabaseResource(database)`` ``resource`` - string
``database`` is the name of the database (string) The name of the database
See also :ref:`the default_allow_sql setting <setting_default_allow_sql>`. Default *allow*. See also :ref:`the default_allow_sql setting <setting_default_allow_sql>`.
.. _actions_permissions_debug: .. _permissions_permissions_debug:
permissions-debug permissions-debug
----------------- -----------------
Actor is allowed to view the ``/-/permissions`` debug tools. Actor is allowed to view the ``/-/permissions`` debug page.
.. _actions_debug_menu: Default *deny*.
.. _permissions_debug_menu:
debug-menu debug-menu
---------- ----------
Controls if the various debug pages are displayed in the navigation menu. Controls if the various debug pages are displayed in the navigation menu.
Default *deny*.
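Each check above pairs an action name with a resource object built from the documented constructor arguments. As a rough illustration, here are stand-in dataclasses mirroring those signatures - these are not the real ``datasette.resources`` implementations, just a sketch of the shape:

```python
from dataclasses import dataclass


# Illustrative stand-ins for the resource types named above, based only on
# the constructor signatures documented in this section.
@dataclass
class DatabaseResource:
    database: str


@dataclass
class TableResource:
    database: str
    table: str


@dataclass
class QueryResource:
    database: str
    query: str


# A permission check pairs an action name with one of these resources:
check = ("view-table", TableResource(database="fixtures", table="complex_foreign_keys"))
print(check[1].database)  # fixtures
```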


@@ -4,93 +4,6 @@

Changelog
=========
.. _v1_0_a23:
1.0a23 (2025-12-02)
-------------------
- Fix for bug where a stale database entry in ``internal.db`` could cause a 500 error on the homepage. (:issue:`2605`)
- Cosmetic improvement to ``/-/actions`` page. (:issue:`2599`)
.. _v1_0_a22:
1.0a22 (2025-11-13)
-------------------
- ``datasette serve --default-deny`` option for running Datasette configured to :ref:`deny all permissions by default <authentication_default_deny>`. (:issue:`2592`)
- ``datasette.is_client()`` method for detecting if code is :ref:`executing inside a datasette.client request <internals_datasette_is_client>`. (:issue:`2594`)
- ``datasette.pm`` property can now be used to :ref:`register and unregister plugins in tests <testing_plugins_register_in_test>`. (:issue:`2595`)
.. _v1_0_a21:
1.0a21 (2025-11-05)
-------------------
- Fixes an **open redirect** security issue: Datasette instances would redirect to ``example.com/foo/bar`` if you accessed the path ``//example.com/foo/bar``. Thanks to `James Jefferies <https://github.com/jamesjefferies>`__ for the fix. (:issue:`2429`)
- Fixed ``datasette publish cloudrun`` to work with changes to the underlying Cloud Run architecture. (:issue:`2511`)
- New ``datasette --get /path --headers`` option for inspecting the headers returned by a path. (:issue:`2578`)
- New ``datasette.client.get(..., skip_permission_checks=True)`` parameter to bypass permission checks when making requests using the internal client. (:issue:`2583`)
.. _v0_65_2:
0.65.2 (2025-11-05)
-------------------
- Fixes an **open redirect** security issue: Datasette instances would redirect to ``example.com/foo/bar`` if you accessed the path ``//example.com/foo/bar``. Thanks to `James Jefferies <https://github.com/jamesjefferies>`__ for the fix. (:issue:`2429`)
- Upgraded for compatibility with Python 3.14.
- Fixed ``datasette publish cloudrun`` to work with changes to the underlying Cloud Run architecture. (:issue:`2511`)
- Minor upgrades to fix warnings, including ``pkg_resources`` deprecation.
.. _v1_0_a20:
1.0a20 (2025-11-03)
-------------------
This alpha introduces a major breaking change prior to the 1.0 release of Datasette concerning how Datasette's permission system works.
Permission system redesign
~~~~~~~~~~~~~~~~~~~~~~~~~~
Previously the permission system worked using ``datasette.permission_allowed()`` checks which consulted all available plugins in turn to determine whether a given actor was allowed to perform a given action on a given resource.
This approach could become prohibitively expensive for large lists of items - for example, to determine the list of tables that a user could view in a large Datasette instance, each plugin implementation of that hook would be fired for every table.
The new design uses SQL queries against Datasette's internal :ref:`catalog tables <internals_internal>` to derive the list of resources for which an actor has permission for a given action. This turns an N x M problem (N resources, M plugins) into a single SQL query.
Plugins can use the new :ref:`plugin_hook_permission_resources_sql` hook to return SQL fragments which will be used as part of that query.
Plugins that use any of the following features will need to be updated to work with this and following alphas (and Datasette 1.0 stable itself):
- Checking permissions with ``datasette.permission_allowed()`` - this method has been replaced with :ref:`datasette.allowed() <datasette_allowed>`.
- Implementing the ``permission_allowed()`` plugin hook - this hook has been removed in favor of :ref:`permission_resources_sql() <plugin_hook_permission_resources_sql>`.
- Using ``register_permissions()`` to register permissions - this hook has been removed in favor of :ref:`register_actions() <plugin_register_actions>`.
Consult the :ref:`v1.0a20 upgrade guide <upgrade_guide_v1_a20>` for further details on how to upgrade affected plugins.
Plugins can now make use of two new internal methods to help resolve permission checks:
- :ref:`datasette.allowed_resources() <datasette_allowed_resources>` returns a ``PaginatedResources`` object with a ``.resources`` list of ``Resource`` instances that an actor is allowed to access for a given action (and a ``.next`` token for pagination).
- :ref:`datasette.allowed_resources_sql() <datasette_allowed_resources_sql>` returns the SQL and parameters that can be executed against the internal catalog tables to determine which resources an actor is allowed to access for a given action. This can be combined with further SQL to perform advanced custom filtering.
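The single-query idea can be sketched with a toy schema. The table and column names below are invented for illustration - this is not Datasette's actual catalog schema - but it shows how allow/deny rules at different depths can be resolved for every table at once, with a deny beating an allow at the winning depth:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE catalog_tables (database_name TEXT, table_name TEXT);
CREATE TABLE rules (database_name TEXT, table_name TEXT, allow INTEGER, depth INTEGER);
INSERT INTO catalog_tables VALUES
  ('fixtures', 'facetable'), ('fixtures', 'secret'), ('other', 'notes');
-- depth 0 = instance-wide, depth 2 = table-specific
INSERT INTO rules VALUES (NULL, NULL, 1, 0);           -- allow everything
INSERT INTO rules VALUES ('fixtures', 'secret', 0, 2); -- but deny this table
""")

# One query answers "which tables may this actor view?" for every table,
# instead of firing a Python hook once per table.
sql = """
SELECT t.database_name, t.table_name
FROM catalog_tables t
WHERE (
  SELECT min(r.allow)  -- deny (0) beats allow (1) at the winning depth
  FROM rules r
  WHERE (r.database_name IS NULL OR r.database_name = t.database_name)
    AND (r.table_name IS NULL OR r.table_name = t.table_name)
    AND r.depth = (
      SELECT max(r2.depth) FROM rules r2
      WHERE (r2.database_name IS NULL OR r2.database_name = t.database_name)
        AND (r2.table_name IS NULL OR r2.table_name = t.table_name)
    )
) = 1
"""
print(conn.execute(sql).fetchall())
# fixtures/secret is excluded: its table-level deny is deeper than the global allow
```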
Related changes:
- The way ``datasette --root`` works has changed. Running Datasette with this flag now causes the root actor to pass *all* permission checks. (:issue:`2521`)
- Permission debugging improvements:

  - The ``/-/allowed`` endpoint shows resources the user is allowed to interact with for different actions.
  - ``/-/rules`` shows the raw allow/deny rules that apply to different permission checks.
  - ``/-/actions`` lists every available action.
  - ``/-/check`` can be used to try out different permission checks for the current actor.
Other changes
~~~~~~~~~~~~~
- The internal ``catalog_views`` table now tracks SQLite views alongside tables in the introspection database. (:issue:`2495`)
- Hitting the ``/`` key brings up a search interface for navigating to tables that the current user can view. A new ``/-/tables`` endpoint supports this functionality. (:issue:`2523`)
- Datasette attempts to detect some configuration errors on startup.
- Datasette now supports Python 3.14 and no longer tests against Python 3.9.
.. _v1_0_a19:

1.0a19 (2025-04-21)

@@ -275,7 +188,7 @@ This alpha release adds basic alter table support to the Datasette Write API and

Alter table support for create, insert, upsert and update
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The :ref:`JSON write API <json_api_write>` can now be used to apply simple alter table schema changes, provided the acting actor has the new :ref:`actions_alter_table` permission. (:issue:`2101`)

The only alter operation supported so far is adding new columns to an existing table.

@@ -290,12 +203,12 @@ Permissions fix for the upsert API

The :ref:`/database/table/-/upsert API <TableUpsertView>` had a minor permissions bug, only affecting Datasette instances that had configured the ``insert-row`` and ``update-row`` permissions to apply to a specific table rather than the database or instance as a whole. Full details in issue :issue:`2262`.

To avoid similar mistakes in the future the ``datasette.permission_allowed()`` method now specifies ``default=`` as a keyword-only argument.

Permission checks now consider opinions from every plugin
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The ``datasette.permission_allowed()`` method previously consulted every plugin that implemented the ``permission_allowed()`` plugin hook and obeyed the opinion of the last plugin to return a value. (:issue:`2275`)

Datasette now consults every plugin and checks to see if any of them returned ``False`` (the veto rule), and if none of them did, it then checks to see if any of them returned ``True``.
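That resolution order can be sketched in a few lines of Python. This is an illustrative model of the rule described above, not Datasette's actual implementation:

```python
def resolve_permission(opinions, default=False):
    """Resolve a list of plugin opinions (True, False or None each)."""
    opinions = [o for o in opinions if o is not None]
    if any(o is False for o in opinions):
        return False  # the veto rule: any False wins
    if any(o is True for o in opinions):
        return True   # otherwise any True allows
    return default    # no opinions: fall back to the default

print(resolve_permission([True, None, False]))  # False - one plugin vetoed
print(resolve_permission([None, True]))         # True
print(resolve_permission([None, None]))         # False - the default
```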
@@ -555,7 +468,7 @@ The third Datasette 1.0 alpha release adds upsert support to the JSON API, plus

See `Datasette 1.0a2: Upserts and finely grained permissions <https://simonwillison.net/2022/Dec/15/datasette-1a2/>`__ for an extended, annotated version of these release notes.

- New ``/db/table/-/upsert`` API, :ref:`documented here <TableUpsertView>`. upsert is an update-or-insert: existing rows will have specified keys updated, but if no row matches the incoming primary key a brand new row will be inserted instead. (:issue:`1878`)
- New ``register_permissions()`` plugin hook. Plugins can now register named permissions, which will then be listed in various interfaces that show available permissions. (:issue:`1940`)
- The ``/db/-/create`` API for :ref:`creating a table <TableCreateView>` now accepts ``"ignore": true`` and ``"replace": true`` options when called with the ``"rows"`` property that creates a new table based on an example set of rows. This means the API can be called multiple times with different rows, setting rules for what should happen if a primary key collides with an existing row. (:issue:`1927`)
- Arbitrary permissions can now be configured at the instance, database and resource (table, SQL view or canned query) level in Datasette's :ref:`metadata` JSON and YAML files. The new ``"permissions"`` key can be used to specify which actors should have which permissions. See :ref:`authentication_permissions_other` for details. (:issue:`1636`)
- The ``/-/create-token`` page can now be used to create API tokens which are restricted to just a subset of actions, including against specific databases or resources. See :ref:`CreateTokenView` for details. (:issue:`1947`)
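The update-or-insert behaviour behind the upsert API corresponds to SQLite's own ``INSERT ... ON CONFLICT`` upsert clause. A minimal sketch of that underlying mechanism (this demonstrates the SQLite feature, not the Datasette API itself):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, title TEXT, score INTEGER)")
conn.execute("INSERT INTO docs VALUES (1, 'old title', 5)")

# Update-or-insert keyed on the primary key: row 1 exists so only its
# specified key (title) is updated; row 2 is brand new and gets inserted.
for row in [(1, "new title"), (2, "second doc")]:
    conn.execute(
        "INSERT INTO docs (id, title) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET title = excluded.title",
        row,
    )

print(conn.execute("SELECT id, title, score FROM docs ORDER BY id").fetchall())
# [(1, 'new title', 5), (2, 'second doc', None)] - row 1 keeps its score
```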
@@ -664,7 +577,7 @@ Documentation

.. _v0_62:

0.62 (2022-08-14)
-----------------

Datasette can now run entirely in your browser using WebAssembly. Try out `Datasette Lite <https://lite.datasette.io/>`__, take a look `at the code <https://github.com/simonw/datasette-lite>`__ or read more about it in `Datasette Lite: a server-side Python web application running in a browser <https://simonwillison.net/2022/May/4/datasette-lite/>`__.

@@ -730,7 +643,7 @@ Datasette also now requires Python 3.7 or higher.

- Datasette is now covered by a `Code of Conduct <https://github.com/simonw/datasette/blob/main/CODE_OF_CONDUCT.md>`__. (:issue:`1654`)
- Python 3.6 is no longer supported. (:issue:`1577`)
- Tests now run against Python 3.11-dev. (:issue:`1621`)
- New ``datasette.ensure_permissions(actor, permissions)`` internal method for checking multiple permissions at once. (:issue:`1675`)
- New :ref:`datasette.check_visibility(actor, action, resource=None) <datasette_check_visibility>` internal method for checking if a user can see a resource that would otherwise be invisible to unauthenticated users. (:issue:`1678`)
- Table and row HTML pages now include a ``<link rel="alternate" type="application/json+datasette" href="...">`` element and return a ``Link: URL; rel="alternate"; type="application/json+datasette"`` HTTP header pointing to the JSON version of those pages. (:issue:`1533`)
- ``Access-Control-Expose-Headers: Link`` is now added to the CORS headers, allowing remote JavaScript to access that header.

@@ -1155,7 +1068,7 @@ Smaller changes

~~~~~~~~~~~~~~~

- Wide tables shown within Datasette now scroll horizontally (:issue:`998`). This is achieved using a new ``<div class="table-wrapper">`` element which may impact the implementation of some plugins (for example `this change to datasette-cluster-map <https://github.com/simonw/datasette-cluster-map/commit/fcb4abbe7df9071c5ab57defd39147de7145b34e>`__).
- New :ref:`actions_debug_menu` permission. (:issue:`1068`)
- Removed ``--debug`` option, which didn't do anything. (:issue:`814`)
- ``Link:`` HTTP header pagination. (:issue:`1014`)
- ``x`` button for clearing filters. (:issue:`1016`)
@@ -1414,7 +1327,7 @@ You can use the new ``"allow"`` block syntax in ``metadata.json`` (or ``metadata

See :ref:`authentication_permissions_allow` for more details.
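For reference, an ``"allow"`` block takes this general shape in the configuration file - a minimal sketch, with the actor key (``id: root``) chosen purely for illustration:

```yaml
# Only the actor whose id is "root" may view this database
databases:
  fixtures:
    allow:
      id: root
```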
Plugins can implement their own custom permission checks using the new ``permission_allowed()`` plugin hook.

A new debug page at ``/-/permissions`` shows recent permission checks, to help administrators and plugin authors understand exactly what checks are being performed. This tool defaults to only being available to the root user, but can be exposed to other users by plugins that respond to the ``permissions-debug`` permission. (:issue:`788`)

@@ -1490,7 +1403,7 @@ Smaller changes

- New :ref:`datasette.get_database() <datasette_get_database>` method.
- Added ``_`` prefix to many private, undocumented methods of the Datasette class. (:issue:`576`)
- Removed the ``db.get_outbound_foreign_keys()`` method which duplicated the behaviour of ``db.foreign_keys_for_table()``.
- New ``await datasette.permission_allowed()`` method.
- ``/-/actor`` debugging endpoint for viewing the currently authenticated actor.
- New ``request.cookies`` property.
- ``/-/plugins`` endpoint now shows a list of hooks implemented by each plugin, e.g. https://latest.datasette.io/-/plugins?all=1


@@ -119,10 +119,8 @@ Once started you can access it at ``http://localhost:8001``

                            signed cookies
  --root                    Output URL that sets a cookie authenticating
                            the root user
  --default-deny            Deny all permissions by default
  --get TEXT                Run an HTTP GET request against this path,
                            print results and exit
  --headers                 Include HTTP headers in --get output
  --token TEXT              API token to send with --get requests
  --actor TEXT              Actor to use for --get requests (JSON string)
  --version-note TEXT       Additional note to show on /-/versions
@@ -490,15 +488,8 @@ See :ref:`publish_cloud_run`.

  --cpu [1|2|4]               Number of vCPUs to allocate in Cloud Run
  --timeout INTEGER           Build timeout in seconds
  --apt-get-install TEXT      Additional packages to apt-get install
  --max-instances INTEGER     Maximum Cloud Run instances (use 0 to remove
                              the limit)  [default: 1]
  --min-instances INTEGER     Minimum Cloud Run instances
  --artifact-repository TEXT  Artifact Registry repository to store the
                              image  [default: datasette]
  --artifact-region TEXT      Artifact Registry location (region or multi-
                              region)  [default: us]
  --artifact-project TEXT     Project ID for Artifact Registry (defaults to
                              the active project)
  --help                      Show this message and exit.


@@ -36,19 +36,12 @@ extensions = [

    "sphinx.ext.extlinks",
    "sphinx.ext.autodoc",
    "sphinx_copybutton",
    "myst_parser",
    "sphinx_markdown_builder",
]

if not os.environ.get("DISABLE_SPHINX_INLINE_TABS"):
    extensions += ["sphinx_inline_tabs"]

autodoc_member_order = "bysource"

myst_enable_extensions = ["colon_fence"]

markdown_http_base = "https://docs.datasette.io/en/stable"
markdown_uri_doc_suffix = ".html"

extlinks = {
    "issue": ("https://github.com/simonw/datasette/issues/%s", "#%s"),
}

@@ -60,10 +53,7 @@ templates_path = ["_templates"]

# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = {
    ".rst": "restructuredtext",
    ".md": "markdown",
}

# The master toctree document.
master_doc = "index"


@@ -20,7 +20,7 @@ General guidelines

Setting up a development environment
------------------------------------

If you have Python 3.10 or higher installed on your computer (on OS X the quickest way to do this `is using homebrew <https://docs.python-guide.org/starting/install3/osx/>`__) you can install an editable copy of Datasette using the following steps.

If you want to use GitHub to publish your changes, first `create a fork of datasette <https://github.com/simonw/datasette/fork>`__ under your own GitHub account.

@@ -42,7 +42,7 @@ The next step is to create a virtual environment for your project and use it to

    # Install Datasette and its testing dependencies
    python3 -m pip install -e '.[test]'

That last line does most of the work: ``pip install -e`` means "install this package in a way that allows me to edit the source code in place". The ``.[test]`` option means "install the optional testing dependencies as well".

.. _contributing_running_tests:

@@ -131,15 +131,6 @@ These formatters are enforced by Datasette's continuous integration: if a commit

When developing locally, you can verify and correct the formatting of your code using these tools.

If you are using `Just <https://github.com/casey/just>`__ the quickest way to run these is like so::

    just black
    just prettier

Or run both at the same time::

    just format

.. _contributing_formatting_black:

Running Black

@@ -160,7 +151,7 @@ If any of your code does not conform to Black you can run this to automatically

::

    reformatted ../datasette/app.py
    All done! ✨ 🍰 ✨
    1 file reformatted, 94 files left unchanged.


@@ -79,7 +79,7 @@ Datasette will not be accessible from outside the server because it is listening

.. _deploying_openrc:

Running Datasette using OpenRC
==============================

OpenRC is the service manager on non-systemd Linux distributions like `Alpine Linux <https://www.alpinelinux.org/>`__ and `Gentoo <https://www.gentoo.org/>`__.

Create an init script at ``/etc/init.d/datasette`` with the following contents:
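The script contents themselves are cut off in this excerpt. A minimal OpenRC sketch might look like the following - the paths, database file and port here are assumptions to adapt, not the documented script:

```shell
#!/sbin/openrc-run
# Minimal OpenRC service sketch for Datasette - adjust paths and port.

name="datasette"
command="/usr/bin/datasette"
command_args="serve /var/lib/datasette/data.db --host 127.0.0.1 --port 8001"
command_background=true
pidfile="/run/datasette.pid"

depend() {
    need net
}
```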


@@ -1,14 +1,14 @@

(events)=

# Events

Datasette includes a mechanism for tracking events that occur while the software is running. This is primarily intended to be used by plugins, which can both trigger events and listen for events.

The core Datasette application triggers events when certain things happen. This page describes those events.

Plugins can listen for events using the {ref}`plugin_hook_track_event` plugin hook, which will be called with instances of the following classes - or additional classes {ref}`registered by other plugins <plugin_hook_register_events>`.

```{eval-rst}
.. automodule:: datasette.events
   :members:
   :exclude-members: Event
```


@ -272,14 +272,14 @@ The dictionary keys are the name of the database that is used in the URL - e.g.
All databases are listed, irrespective of user permissions.
.. _datasette_actions:
.actions
--------
Property exposing a dictionary of actions that have been registered using the :ref:`plugin_register_actions` plugin hook.
The dictionary keys are the action names - e.g. ``view-instance`` - and the values are ``Action()`` objects describing the action.
.. _datasette_plugin_config:
@ -342,6 +342,33 @@ If no plugins that implement that hook are installed, the default return value l
"2": {"id": "2"}
}
.. _datasette_permission_allowed:
await .permission_allowed(actor, action, resource=None, default=...)
--------------------------------------------------------------------
``actor`` - dictionary
The authenticated actor. This is usually ``request.actor``.
``action`` - string
The name of the action that is being permission checked.
``resource`` - string or tuple, optional
The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
``default`` - optional: True, False or None
What value should be returned by default if nothing provides an opinion on this permission check.
Set to ``True`` for default allow or ``False`` for default deny.
If not specified the ``default`` from the ``Permission()`` tuple that was registered using :ref:`plugin_register_permissions` will be used.
Check if the given actor has :ref:`permission <authentication_permissions>` to perform the given action on the given resource.
Some permission checks are carried out against :ref:`rules defined in datasette.yaml <authentication_permissions_config>`, while other custom permissions may be decided by plugins that implement the :ref:`plugin_hook_permission_allowed` plugin hook.
If neither ``datasette.yaml`` nor any of the plugins provide an answer to the permission query the ``default`` argument will be returned.
See :ref:`permissions` for a full list of permission actions included in Datasette core.
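The resolution order described above can be modeled as a small standalone sketch - a simplification under stated assumptions, not Datasette's actual implementation: each consulted source returns ``True`` (allow), ``False`` (deny) or ``None`` (abstain), and the registered ``default`` only applies when every source abstains.

```python
def resolve_permission(opinions, default):
    """Simplified model: the first explicit allow/deny opinion wins,
    otherwise fall back to the registered default."""
    for opinion in opinions:
        if opinion is not None:
            return opinion
    return default


# A single explicit deny beats the default
print(resolve_permission([None, False], default=True))  # False
# No opinions at all: the default is returned
print(resolve_permission([None, None], default=True))  # True
```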
.. _datasette_allowed:
await .allowed(\*, action, resource, actor=None)
@ -358,6 +385,8 @@ await .allowed(\*, action, resource, actor=None)
This method checks if the given actor has permission to perform the given action on the given resource. All parameters must be passed as keyword arguments.
This is the modern resource-based permission checking method. It works with Resource objects that provide structured information about what is being accessed.
Example usage:
.. code-block:: python
@ -385,148 +414,50 @@ Example usage:
The method returns ``True`` if the permission is granted, ``False`` if denied.
.. _datasette_allowed_resources:
await .allowed_resources(action, actor=None, \*, parent=None, include_is_private=False, include_reasons=False, limit=100, next=None)
------------------------------------------------------------------------------------------------------------------------------------
Returns a ``PaginatedResources`` object containing resources that the actor can access for the specified action, with support for keyset pagination.
``action`` - string
The action name (e.g., "view-table", "view-database")
``actor`` - dictionary, optional
The authenticated actor. Defaults to ``None`` for unauthenticated requests.
``parent`` - string, optional
Optional parent filter (e.g., database name) to limit results
``include_is_private`` - boolean, optional
If True, adds a ``.private`` attribute to each Resource indicating whether anonymous users can access it
``include_reasons`` - boolean, optional
If True, adds a ``.reasons`` attribute with a list of strings describing why access was granted (useful for debugging)
``limit`` - integer, optional
Maximum number of results to return per page (1-1000, default 100)
``next`` - string, optional
Keyset token from a previous page for pagination
The method returns a ``PaginatedResources`` object (from ``datasette.utils``) with the following attributes:
``resources`` - list
List of ``Resource`` objects for the current page
``next`` - string or None
Token for the next page, or ``None`` if no more results exist
Example usage:
.. code-block:: python
# Get first page of tables
page = await datasette.allowed_resources(
    "view-table",
    actor=request.actor,
    parent="fixtures",
    limit=50,
)
for table in page.resources:
    print(table.parent, table.child)
    if hasattr(table, "private"):
        print(f"  Private: {table.private}")

# Get next page if available
if page.next:
    next_page = await datasette.allowed_resources(
        "view-table", actor=request.actor, next=page.next
    )

# Iterate through all results automatically
page = await datasette.allowed_resources(
    "view-table", actor=request.actor
)
async for table in page.all():
    print(table.parent, table.child)

# With reasons for debugging
page = await datasette.allowed_resources(
    "view-table", actor=request.actor, include_reasons=True
)
for table in page.resources:
    print(f"{table.child}: {table.reasons}")
The ``page.all()`` async generator automatically handles pagination, fetching additional pages and yielding all resources one at a time.
This method uses :ref:`datasette_allowed_resources_sql` under the hood and is an efficient way to list the databases, tables or other resources that an actor can access for a specific action.
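The keyset pagination contract behind ``page.all()`` - a ``limit`` plus an opaque ``next`` token that resumes strictly after the last row returned - can be illustrated in isolation. This is a self-contained sketch with hypothetical names and fake catalog data, not the real ``PaginatedResources`` class:

```python
import asyncio

# Stand-in catalog of (parent, child) resources - hypothetical data
ROWS = [("fixtures", f"table_{i}") for i in range(7)]


async def fetch_page(limit=3, next_token=None):
    # Keyset pagination: resume strictly after the row named by the token
    start = 0 if next_token is None else ROWS.index(next_token) + 1
    chunk = ROWS[start : start + limit]
    token = chunk[-1] if start + limit < len(ROWS) else None
    return chunk, token


async def all_rows(limit=3):
    # Mirrors the page.all() pattern: fetch pages until the token runs out
    token = None
    while True:
        chunk, token = await fetch_page(limit, token)
        for row in chunk:
            yield row
        if token is None:
            break


async def collect():
    return [row async for row in all_rows()]


collected = asyncio.run(collect())
```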
.. _datasette_allowed_resources_sql:
await .allowed_resources_sql(\*, action, actor=None, parent=None, include_is_private=False)
-------------------------------------------------------------------------------------------
Builds the SQL query that Datasette uses to determine which resources an actor may access for a specific action. Returns a ``(sql: str, params: dict)`` namedtuple that can be executed against the internal ``catalog_*`` database tables. ``parent`` can be used to limit results to a specific database, and ``include_is_private`` adds a column indicating whether anonymous users would be denied access to that resource.
Plugins that need to execute custom analysis over the raw allow/deny rules can use this helper to run the same query that powers the ``/-/allowed`` debugging interface.
The SQL query built by this method will return the following columns:
- ``parent``: The parent resource identifier (or NULL)
- ``child``: The child resource identifier (or NULL)
- ``reason``: The reason from the rule that granted access
- ``is_private``: (if ``include_is_private``) 1 if anonymous users cannot access, 0 otherwise
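One subtlety in SQL built over catalog tables: a database-level resource has a NULL ``child`` column, and in SQL ``NULL = NULL`` evaluates to NULL rather than true, so an equality join - including a ``USING (parent, child)`` clause - silently drops those rows. SQLite's ``IS`` operator compares two NULLs as equal and keeps them. A standalone sqlite3 sketch with hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# NULL = NULL is NULL (not true); NULL IS NULL is true
assert conn.execute("SELECT NULL = NULL").fetchone()[0] is None
assert conn.execute("SELECT NULL IS NULL").fetchone()[0] == 1

conn.executescript("""
CREATE TABLE rules (parent TEXT, child TEXT, allowed INTEGER);
CREATE TABLE resources (parent TEXT, child TEXT);
INSERT INTO rules VALUES ('fixtures', NULL, 1);   -- database-level rule
INSERT INTO resources VALUES ('fixtures', NULL);  -- database resource
""")

# Equality join: the NULL child row never matches itself
dropped = conn.execute("""
    SELECT count(*) FROM resources r
    JOIN rules u ON r.parent = u.parent AND r.child = u.child
""").fetchone()[0]

# IS join: NULLs compare equal, so the database-level row survives
kept = conn.execute("""
    SELECT count(*) FROM resources r
    JOIN rules u ON r.parent IS u.parent AND r.child IS u.child
""").fetchone()[0]
```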
.. _datasette_ensure_permission:
await .ensure_permission(action, resource=None, actor=None)
-----------------------------------------------------------
``action`` - string
The action to check. See :ref:`actions` for a list of available actions.
``resource`` - Resource object (optional)
The resource to check the permission against. Must be an instance of ``InstanceResource``, ``DatabaseResource``, or ``TableResource`` from the ``datasette.resources`` module. If omitted, defaults to ``InstanceResource()`` for instance-level permissions.
``actor`` - dictionary (optional)
The authenticated actor. This is usually ``request.actor``.
This is a convenience wrapper around :ref:`datasette_allowed` that raises a ``datasette.Forbidden`` exception if the permission check fails. Use this when you want to enforce a permission check and halt execution if the actor is not authorized.
Example:
.. code-block:: python
from datasette.resources import TableResource

# Will raise Forbidden if actor cannot view the table
await datasette.ensure_permission(
    action="view-table",
    resource=TableResource(
        database="fixtures", table="cities"
    ),
    actor=request.actor,
)

# For instance-level actions, resource can be omitted:
await datasette.ensure_permission(
    action="permissions-debug", actor=request.actor
)
.. _datasette_check_visibility:
await .check_visibility(actor, action, resource=None)
-----------------------------------------------------
``actor`` - dictionary
The authenticated actor. This is usually ``request.actor``.
``action`` - string
The name of the action that is being permission checked.
``resource`` - Resource object, optional
The resource being checked, as a Resource object such as ``DatabaseResource(database=...)``, ``TableResource(database=..., table=...)``, or ``QueryResource(database=..., query=...)``. Only some permissions apply to a resource.
This convenience method can be used to answer the question "should this item be considered private, in that it is visible to me but it is not visible to anonymous users?"
@ -536,12 +467,23 @@ This example checks if the user can access a specific table, and sets ``private`
.. code-block:: python
from datasette.resources import TableResource

visible, private = await datasette.check_visibility(
    request.actor,
    action="view-table",
    resource=TableResource(database=database, table=table),
)
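The two returned booleans amount to two permission checks - one for the real actor and one for the anonymous actor. A standalone sketch of that relationship, using a toy ``allowed`` policy rather than Datasette's internals:

```python
def check_visibility(allowed, actor, action, resource):
    # visible: can this actor see the resource at all?
    visible = allowed(actor, action, resource)
    if not visible:
        return False, False
    # private: visible to this actor but hidden from anonymous users
    private = not allowed(None, action, resource)
    return visible, private


# Toy policy (an assumption for this sketch): anonymous users may
# only view the "public" table
def allowed(actor, action, resource):
    return actor is not None or resource == ("fixtures", "public")


print(check_visibility(allowed, {"id": "root"}, "view-table", ("fixtures", "secret")))  # (True, True)
```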
.. _datasette_create_token:
@ -594,6 +536,16 @@ The following example creates a token that can access ``view-instance`` and ``vi
},
)
.. _datasette_get_permission:
.get_permission(name_or_abbr)
-----------------------------
``name_or_abbr`` - string
The name or abbreviation of the permission to look up, e.g. ``view-table`` or ``vt``.
Returns a :ref:`Permission object <plugin_register_permissions>` representing the permission, or raises a ``KeyError`` if one is not found.
.. _datasette_get_database:
.get_database(name)
@ -781,8 +733,8 @@ Use ``is_mutable=False`` to add an immutable database.
.. _datasette_add_memory_database:
.add_memory_database(memory_name, name=None, route=None)
--------------------------------------------------------
Adds a shared in-memory database with the specified name:
@ -800,9 +752,7 @@ This is a shortcut for the following:
    Database(datasette, memory_name="statistics")
)
Using either of these patterns will result in the in-memory database being served at ``/statistics``.
The ``name`` and ``route`` parameters are optional and work the same way as they do for :ref:`datasette_add_database`.
.. _datasette_remove_database:
@ -1047,60 +997,6 @@ These methods can be used with :ref:`internals_datasette_urls` - for example:
For documentation on available ``**kwargs`` options and the shape of the HTTPX Response object refer to the `HTTPX Async documentation <https://www.python-httpx.org/async/>`__.
Bypassing permission checks
~~~~~~~~~~~~~~~~~~~~~~~~~~~
All ``datasette.client`` methods accept an optional ``skip_permission_checks=True`` parameter. When set, all permission checks will be bypassed for that request, allowing access to any resource regardless of the configured permissions.
This is useful for plugins and internal operations that need to access all resources without being subject to permission restrictions.
Example usage:
.. code-block:: python
# Regular request - respects permissions
response = await datasette.client.get(
    "/private-db/secret-table.json"
)
# May return 403 Forbidden if access is denied

# With skip_permission_checks - bypasses all permission checks
response = await datasette.client.get(
    "/private-db/secret-table.json",
    skip_permission_checks=True,
)
# Will return 200 OK and the data, regardless of permissions
This parameter works with all HTTP methods (``get``, ``post``, ``put``, ``patch``, ``delete``, ``options``, ``head``) and the generic ``request`` method.
.. warning::
    Use ``skip_permission_checks=True`` with caution. It completely bypasses Datasette's permission system and should only be used in trusted plugin code or internal operations where you need guaranteed access to resources.
.. _internals_datasette_is_client:
Detecting internal client requests
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
``datasette.in_client()`` - returns bool
Returns ``True`` if the current code is executing within a ``datasette.client`` request, ``False`` otherwise.
This method is useful for plugins that need to behave differently when called through ``datasette.client`` versus when handling external HTTP requests.
Example usage:
.. code-block:: python
from datasette import Response


async def fetch_documents(datasette):
    if not datasette.in_client():
        return Response.text(
            "Only available via internal client requests",
            status=403,
        )
    ...
Note that ``datasette.in_client()`` is independent of ``skip_permission_checks``. A request made through ``datasette.client`` will always have ``in_client()`` return ``True``, regardless of whether ``skip_permission_checks`` is set.
.. _internals_datasette_urls:
datasette.urls
@ -1155,7 +1051,7 @@ These methods each return a ``datasette.utils.PrefixedUrlString`` object, which
.. _internals_permission_classes:
Permission classes and utilities
================================
.. _internals_permission_sql:
@ -1404,7 +1300,7 @@ Example usage:
.. _database_execute_write:
await db.execute_write(sql, params=None, block=True)
----------------------------------------------------
SQLite only allows one database connection to write at a time. Datasette handles this for you by maintaining a queue of writes to be executed against a given database. Plugins can submit write operations to this queue and they will be executed in the order in which they are received.
@ -1421,7 +1317,7 @@ Each call to ``execute_write()`` will be executed inside a transaction.
.. _database_execute_write_script:
await db.execute_write_script(sql, block=True)
----------------------------------------------
Like ``execute_write()`` but can be used to send multiple SQL statements in a single string separated by semicolons, using the ``sqlite3`` `conn.executescript() <https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executescript>`__ method.
@ -1430,7 +1326,7 @@ Each call to ``execute_write_script()`` will be executed inside a transaction.
.. _database_execute_write_many:
await db.execute_write_many(sql, params_seq, block=True)
--------------------------------------------------------
Like ``execute_write()`` but uses the ``sqlite3`` `conn.executemany() <https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executemany>`__ method. This will efficiently execute the same SQL statement against each of the parameters in the ``params_seq`` iterator, for example:
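At the sqlite3 level the wrapped ``executemany()`` pattern looks like this - a self-contained sketch with made-up table and data, not Datasette's own example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE characters (id INTEGER PRIMARY KEY, name TEXT)")

# executemany() runs the same statement once per parameter tuple
conn.executemany(
    "INSERT INTO characters (id, name) VALUES (?, ?)",
    [(1, "Cleo"), (2, "Azi"), (3, "Pancakes")],
)
rows = conn.execute("SELECT name FROM characters ORDER BY id").fetchall()
```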


@ -347,7 +347,7 @@ Special table arguments
though this could potentially result in errors if the wrong syntax is used.
``?_where=SQL-fragment``
If the :ref:`actions_execute_sql` permission is enabled, this parameter
can be used to pass one or more additional SQL fragments to be used in the
`WHERE` clause of the SQL used to query the table.
@ -510,7 +510,7 @@ Datasette provides a write API for JSON data. This is a POST-only API that requi
Inserting rows
~~~~~~~~~~~~~~
This requires the :ref:`actions_insert_row` permission.
A single row can be inserted using the ``"row"`` key:
@ -621,9 +621,9 @@ Pass ``"ignore": true`` to ignore these errors and insert the other rows:
"ignore": true
}
Or you can pass ``"replace": true`` to replace any rows with conflicting primary keys with the new values. This requires the :ref:`actions_update_row` permission.
Pass ``"alter": true`` to automatically add any missing columns to the table. This requires the :ref:`actions_alter_table` permission.
.. _TableUpsertView:
@ -632,7 +632,7 @@ Upserting rows
An upsert is an insert or update operation. If a row with a matching primary key already exists it will be updated - otherwise a new row will be inserted.
The upsert API is mostly the same shape as the :ref:`insert API <TableInsertView>`. It requires both the :ref:`actions_insert_row` and :ref:`actions_update_row` permissions.
::
@ -735,14 +735,14 @@ When using upsert you must provide the primary key column (or columns if the tab
If your table does not have an explicit primary key you should pass the SQLite ``rowid`` key instead.
Pass ``"alter": true`` to automatically add any missing columns to the table. This requires the :ref:`actions_alter_table` permission.
.. _RowUpdateView:
Updating a row
~~~~~~~~~~~~~~
To update a row, make a ``POST`` to ``/<database>/<table>/<row-pks>/-/update``. This requires the :ref:`actions_update_row` permission.
::
@ -792,14 +792,14 @@ The returned JSON will look like this:
Any errors will return ``{"errors": ["... descriptive message ..."], "ok": false}``, and a ``400`` status code for a bad input or a ``403`` status code for an authentication or permission error.
Pass ``"alter": true`` to automatically add any missing columns to the table. This requires the :ref:`actions_alter_table` permission.
.. _RowDeleteView:
Deleting a row
~~~~~~~~~~~~~~
To delete a row, make a ``POST`` to ``/<database>/<table>/<row-pks>/-/delete``. This requires the :ref:`actions_delete_row` permission.
::
@ -818,7 +818,7 @@ Any errors will return ``{"errors": ["... descriptive message ..."], "ok": false
Creating a table
~~~~~~~~~~~~~~~~
To create a table, make a ``POST`` to ``/<database>/-/create``. This requires the :ref:`actions_create_table` permission.
::
@ -859,8 +859,8 @@ The JSON here describes the table that will be created:
* ``pks`` can be used instead of ``pk`` to create a compound primary key. It should be a JSON list of column names to use in that primary key.
* ``ignore`` can be set to ``true`` to ignore existing rows by primary key if the table already exists.
* ``replace`` can be set to ``true`` to replace existing rows by primary key if the table already exists. This requires the :ref:`actions_update_row` permission.
* ``alter`` can be set to ``true`` if you want to automatically add any missing columns to the table. This requires the :ref:`actions_alter_table` permission.
If the table is successfully created this will return a ``201`` status code and the following response:
@ -906,7 +906,7 @@ Datasette will create a table with a schema that matches those rows and insert t
"pk": "id"
}
Doing this requires both the :ref:`actions_create_table` and :ref:`actions_insert_row` permissions.
The ``201`` response here will be similar to the ``columns`` form, but will also include the number of rows that were inserted as ``row_count``:
@ -937,16 +937,16 @@ If you pass a row to the create endpoint with a primary key that already exists
You can avoid this error by passing the same ``"ignore": true`` or ``"replace": true`` options to the create endpoint as you can to the :ref:`insert endpoint <TableInsertView>`.
To use the ``"replace": true`` option you will also need the :ref:`actions_update_row` permission.
Pass ``"alter": true`` to automatically add any missing columns to the existing table that are present in the rows you are submitting. This requires the :ref:`actions_alter_table` permission.
.. _TableDropView:
Dropping tables
~~~~~~~~~~~~~~~
To drop a table, make a ``POST`` to ``/<database>/<table>/-/drop``. This requires the :ref:`actions_drop_table` permission.
::


@ -28,7 +28,7 @@ The index page can also be accessed at ``/-/``, useful for if the default index
Database
========
Each database has a page listing the tables, views and canned queries available for that database. If the :ref:`actions_execute_sql` permission is enabled (it's on by default) there will also be an interface for executing arbitrary SQL select queries against the data.
Examples:
@ -60,7 +60,7 @@ The following tables are hidden by default:
Queries
=======
The ``/database-name/-/query`` page can be used to execute an arbitrary SQL query against that database, if the :ref:`actions_execute_sql` permission is enabled. This query is passed as the ``?sql=`` query string parameter.
This means you can link directly to a query by constructing the following URL:
@ -107,46 +107,3 @@ Note that this URL includes the encoded primary key of the record.
Here's that same page as JSON:
`../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json <https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json>`_
.. _pages_schemas:
Schemas
=======
Datasette offers ``/-/schema`` endpoints to expose the SQL schema for databases and tables.
.. _InstanceSchemaView:
Instance schema
---------------
Access ``/-/schema`` to see the complete schema for all attached databases in the Datasette instance.
Use ``/-/schema.md`` to get the same information as Markdown.
Use ``/-/schema.json`` to get the same information as JSON, which looks like this:
.. code-block:: json
{
    "schemas": [
        {
            "database": "content",
            "schema": "create table posts ..."
        }
    ]
}
.. _DatabaseSchemaView:
Database schema
---------------
Use ``/database-name/-/schema`` to see the complete schema for a specific database. The ``.md`` and ``.json`` extensions work here too. The JSON returns an object with ``"database"`` and ``"schema"`` keys.
.. _TableSchemaView:
Table schema
------------
Use ``/database-name/table-name/-/schema`` to see the schema for a specific table. The ``.md`` and ``.json`` extensions work here too. The JSON returns an object with ``"database"``, ``"table"``, and ``"schema"`` keys.


@ -691,7 +691,7 @@ Help text (from the docstring for the function plus any defined Click arguments
Plugins can register multiple commands by making multiple calls to the ``@cli.command()`` decorator. Consult the `Click documentation <https://click.palletsprojects.com/>`__ for full details on how to build a CLI command, including how to define arguments and options.
Note that ``register_commands()`` plugins cannot be used with the :ref:`--plugins-dir mechanism <writing_plugins_one_off>` - they need to be installed into the same virtual environment as Datasette using ``pip install``. Provided it has a ``pyproject.toml`` file (see :ref:`writing_plugins_packaging`) you can run ``pip install`` directly against the directory in which you are developing your plugin like so::
    pip install -e path/to/my/datasette-plugin
@ -777,10 +777,60 @@ The plugin hook can then be used to register the new facet class like this:
def register_facet_classes():
    return [SpecialFacet]
.. _plugin_register_permissions:
register_permissions(datasette)
--------------------------------
.. note::
This hook is deprecated. Use :ref:`plugin_register_actions` instead, which provides a more flexible resource-based permission system.
If your plugin needs to register additional permissions unique to that plugin - ``upload-csvs`` for example - you can return a list of those permissions from this hook.
.. code-block:: python

    from datasette import hookimpl, Permission


    @hookimpl
    def register_permissions(datasette):
        return [
            Permission(
                name="upload-csvs",
                abbr=None,
                description="Upload CSV files",
                takes_database=True,
                takes_resource=False,
                default=False,
            )
        ]
The fields of the ``Permission`` class are as follows:
``name`` - string
The name of the permission, e.g. ``upload-csvs``. This should be unique across all plugins that the user might have installed, so choose carefully.
``abbr`` - string or None
An abbreviation of the permission, e.g. ``uc``. This is optional - you can set it to ``None`` if you do not want to pick an abbreviation. Since this needs to be unique across all installed plugins it's best not to specify an abbreviation at all. If an abbreviation is provided it will be used when creating restricted signed API tokens.
``description`` - string or None
A human-readable description of what the permission lets you do. Should make sense as the second part of a sentence that starts "A user with this permission can ...".
``takes_database`` - boolean
``True`` if this permission can be granted on a per-database basis, ``False`` if it is only valid at the overall Datasette instance level.
``takes_resource`` - boolean
``True`` if this permission can be granted on a per-resource basis. A resource is a database table, SQL view or :ref:`canned query <canned_queries>`.
``default`` - boolean
The default value for this permission if it is not explicitly granted to a user. ``True`` means the permission is granted by default, ``False`` means it is not.
This should only be ``True`` if you want anonymous users to be able to take this action.
.. _plugin_register_actions:
register_actions(datasette)
---------------------------
If your plugin needs to register actions that can be checked with Datasette's new resource-based permission system, return a list of those actions from this hook.
@ -833,18 +883,24 @@ Actions define what operations can be performed on resources (like viewing a tab
            name="list-documents",
            abbr="ld",
            description="List documents in a collection",
            resource_class=DocumentCollectionResource,
        ),
        Action(
            name="view-document",
            abbr="vdoc",
            description="View document",
            resource_class=DocumentResource,
        ),
        Action(
            name="edit-document",
            abbr="edoc",
            description="Edit document",
            resource_class=DocumentResource,
        ),
    ]
@ -855,25 +911,31 @@ The fields of the ``Action`` dataclass are as follows:
The name of the action, e.g. ``view-document``. This should be unique across all plugins.
``abbr`` - string or None
An abbreviation of the action, e.g. ``vdoc``. This is optional. Since it needs to be unique across all installed plugins, it's best to choose carefully or omit it entirely (the same as setting it to ``None``).
``description`` - string or None
A human-readable description of what the action allows you to do.
``resource_class`` - type[Resource] or None
The Resource subclass that defines what kind of resource this action applies to. Omit this (or set it to ``None``) for global actions that apply only at the instance level with no associated resources (like ``debug-menu`` or ``permissions-debug``). Your Resource subclass must:
- Define a ``name`` class attribute (e.g., ``"document"``)
- Define a ``parent_class`` class attribute (``None`` for top-level resources like databases, or the parent ``Resource`` subclass for child resources)
- Implement a ``resources_sql()`` classmethod that returns SQL returning all resources as ``(parent, child)`` columns
- Have an ``__init__`` method that accepts appropriate parameters and calls ``super().__init__(parent=..., child=...)``
The ``resources_sql()`` method
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The ``resources_sql()`` classmethod returns a SQL query that lists all resources of that type that exist in the system.
This query is used by Datasette to efficiently check permissions across multiple resources at once. When a user requests a list of resources (like tables, documents, or other entities), Datasette uses this SQL to:
1. Get all resources of this type from your data catalog
2. Combine it with permission rules from the ``permission_resources_sql`` hook
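To make these requirements concrete, here is a minimal, runnable sketch of the contract. The ``Resource`` base class below is a simplified stand-in for Datasette's own, just enough to execute the example, and the ``DocumentCollectionResource``/``DocumentResource`` classes with their ``collections`` and ``documents`` tables are hypothetical:

```python
# Stand-in for Datasette's Resource base class - just enough to run this sketch
class Resource:
    name = None
    parent_class = None

    def __init__(self, parent=None, child=None):
        self.parent = parent
        self.child = child


class DocumentCollectionResource(Resource):
    name = "collection"
    parent_class = None  # top-level resource, like a database

    @classmethod
    def resources_sql(cls):
        # Every collection as (parent, child) - parent is NULL for top-level resources
        return "SELECT NULL AS parent, name AS child FROM collections"

    def __init__(self, collection):
        super().__init__(parent=None, child=collection)


class DocumentResource(Resource):
    name = "document"
    parent_class = DocumentCollectionResource

    @classmethod
    def resources_sql(cls):
        # Every document, qualified by its parent collection
        return "SELECT collection AS parent, doc_id AS child FROM documents"

    def __init__(self, collection, document):
        super().__init__(parent=collection, child=document)


doc = DocumentResource("reports", "q3-summary")
print(doc.parent, doc.child)  # reports q3-summary
```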
@ -1314,10 +1376,76 @@ This example plugin causes 0 results to be returned if ``?_nothing=1`` is added
Example: `datasette-leaflet-freedraw <https://datasette.io/plugins/datasette-leaflet-freedraw>`_
.. _plugin_hook_permission_allowed:
permission_allowed(datasette, actor, action, resource)
------------------------------------------------------
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
``actor`` - dictionary
The current actor, as decided by :ref:`plugin_hook_actor_from_request`.
``action`` - string
The action to be performed, e.g. ``"edit-table"``.
``resource`` - string or None
An identifier for the individual resource, e.g. the name of the table.
Called to check that an actor has permission to perform an action on a resource. Can return ``True`` if the action is allowed, ``False`` if the action is not allowed or ``None`` if the plugin does not have an opinion one way or the other.
Here's an example plugin which randomly selects if a permission should be allowed or denied, except for ``view-instance`` which always uses the default permission scheme instead.
.. code-block:: python

    from datasette import hookimpl
    import random


    @hookimpl
    def permission_allowed(action):
        if action != "view-instance":
            # Return True or False at random
            return random.random() > 0.5
        # Returning None falls back to default permissions
This function can alternatively return an awaitable function which itself returns ``True``, ``False`` or ``None``. You can use this option if you need to execute additional database queries using ``await datasette.execute(...)``.
Here's an example that allows users to view the ``admin_log`` table only if their actor ``id`` is present in the ``admin_users`` table. It also disallows arbitrary SQL queries for the ``staff.db`` database for all users.
.. code-block:: python

    @hookimpl
    def permission_allowed(datasette, actor, action, resource):
        async def inner():
            if action == "execute-sql" and resource == "staff":
                return False
            if action == "view-table" and resource == (
                "staff",
                "admin_log",
            ):
                if not actor:
                    return False
                user_id = actor["id"]
                result = await datasette.get_database(
                    "staff"
                ).execute(
                    "select count(*) from admin_users where user_id = :user_id",
                    {"user_id": user_id},
                )
                return result.first()[0] > 0

        return inner
See :ref:`built-in permissions <permissions>` for a full list of permissions that are included in Datasette core.
Example: `datasette-permissions-sql <https://datasette.io/plugins/datasette-permissions-sql>`_
.. _plugin_hook_permission_resources_sql:
permission_resources_sql(datasette, actor, action)
--------------------------------------------------
``datasette`` - :ref:`internals_datasette`
Access to the Datasette instance.
@ -1329,28 +1457,13 @@ permission_resources_sql(datasette, actor, action)
The permission action being evaluated. Examples include ``"view-table"`` or ``"insert-row"``.
Return value
A :class:`datasette.permissions.PermissionSQL` object, ``None`` or an iterable of ``PermissionSQL`` objects.
Datasette's action-based permission resolver calls this hook to gather SQL rows describing which
resources an actor may access (``allow = 1``) or should be denied (``allow = 0``) for a specific action.
Each SQL snippet should return ``parent``, ``child``, ``allow`` and ``reason`` columns.
**Parameter naming convention:** Plugin parameters in ``PermissionSQL.params`` should use unique names
to avoid conflicts with other plugins. The recommended convention is to prefix parameters with your
plugin's source name (e.g., ``myplugin_user_id``). The system reserves these parameter names:
``:actor``, ``:actor_id``, ``:action``, and ``:filter_parent``.
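To see why the prefix matters: Datasette combines every plugin's SQL snippet and parameters into a single query, so two plugins that both bound a parameter named ``:user_id`` would collide. A self-contained ``sqlite3`` sketch (the ``myplugin_``/``otherplugin_`` names are hypothetical) shows two snippets coexisting because their parameter names are distinct:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Two rule snippets combined into one query - each plugin's parameters
# must have unique names because they share a single parameter dictionary
combined_sql = """
SELECT 'db1' AS parent, NULL AS child, 1 AS allow, 'plugin a' AS reason
WHERE :myplugin_user_id = 'alice'
UNION ALL
SELECT 'db2' AS parent, NULL AS child, 0 AS allow, 'plugin b' AS reason
WHERE :otherplugin_user_id = 'alice'
"""
params = {"myplugin_user_id": "alice", "otherplugin_user_id": "alice"}
rows = conn.execute(combined_sql, params).fetchall()
print(rows)
```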
You can also return ``PermissionSQL.allow(reason="reason goes here")`` or ``PermissionSQL.deny(reason="reason goes here")`` as shortcuts for simple root-level allow or deny rules. These will create SQL snippets that look like this:
.. code-block:: sql

    SELECT
        NULL AS parent,
        NULL AS child,
        1 AS allow,
        'reason goes here' AS reason
Or ``0 AS allow`` for denies.
Permission plugin examples
~~~~~~~~~~~~~~~~~~~~~~~~~~
@ -1358,7 +1471,7 @@ Permission plugin examples
These snippets show how to use the new ``permission_resources_sql`` hook to
contribute rows to the action-based permission resolver. Each hook receives the
current actor dictionary (or ``None``) and must return ``None`` or an instance or list of
``datasette.permissions.PermissionSQL`` (or a coroutine that resolves to that).
Allow Alice to view a specific table
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@ -1369,7 +1482,7 @@ This plugin grants the actor with ``id == "alice"`` permission to perform the
.. code-block:: python

    from datasette import hookimpl
    from datasette.permissions import PermissionSQL


    @hookimpl
@ -1379,7 +1492,8 @@ This plugin grants the actor with ``id == "alice"`` permission to perform the
    if not actor or actor.get("id") != "alice":
        return None
    return PermissionSQL(
        sql="""
        SELECT
            'accounting' AS parent,
@ -1387,6 +1501,7 @@ This plugin grants the actor with ``id == "alice"`` permission to perform the
            1 AS allow,
            'alice can view accounting/sales' AS reason
        """,
    )
Restrict execute-sql to a database prefix
@ -1399,7 +1514,7 @@ will pass through to the SQL snippet.
.. code-block:: python

    from datasette import hookimpl
    from datasette.permissions import PermissionSQL


    @hookimpl
@ -1407,7 +1522,8 @@ will pass through to the SQL snippet.
    if action != "execute-sql":
        return None
    return PermissionSQL(
        sql="""
        SELECT
            parent,
@ -1415,10 +1531,10 @@ will pass through to the SQL snippet.
            1 AS allow,
            'execute-sql allowed for analytics_*' AS reason
        FROM catalog_databases
        WHERE database_name LIKE :analytics_prefix
        """,
        params={
            "analytics_prefix": "analytics_%",
        },
    )
@ -1431,7 +1547,7 @@ with columns ``(actor_id, action, parent, child, allow, reason)``.
.. code-block:: python

    from datasette import hookimpl
    from datasette.permissions import PermissionSQL


    @hookimpl
@ -1439,7 +1555,8 @@ with columns ``(actor_id, action, parent, child, allow, reason)``.
    if not actor:
        return None
    return PermissionSQL(
        sql="""
        SELECT
            parent,
@ -1447,12 +1564,12 @@ with columns ``(actor_id, action, parent, child, allow, reason)``.
            allow,
            COALESCE(reason, 'permission_grants table') AS reason
        FROM permission_grants
        WHERE actor_id = :grants_actor_id
          AND action = :grants_action
        """,
        params={
            "grants_actor_id": actor.get("id"),
            "grants_action": action,
        },
    )
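For illustration, here is a self-contained ``sqlite3`` sketch of that table and lookup, run outside Datasette. The schema in this sketch - ``(actor_id, action, parent, child, allow, reason)`` - follows the columns described above; in a real plugin the query runs inside Datasette's permission resolver, not directly like this:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE permission_grants (
        actor_id TEXT, action TEXT,
        parent TEXT, child TEXT,
        allow INTEGER, reason TEXT
    )
    """
)
conn.execute(
    "INSERT INTO permission_grants VALUES (?, ?, ?, ?, ?, ?)",
    ("alice", "view-table", "accounting", "sales", 1, None),
)

# The same SELECT the hook contributes, with its namespaced parameters
rows = conn.execute(
    """
    SELECT parent, child, allow,
           COALESCE(reason, 'permission_grants table') AS reason
    FROM permission_grants
    WHERE actor_id = :grants_actor_id
      AND action = :grants_action
    """,
    {"grants_actor_id": "alice", "grants_action": "view-table"},
).fetchall()
print(rows)  # [('accounting', 'sales', 1, 'permission_grants table')]
```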
@ -1465,7 +1582,7 @@ The resolver will automatically apply the most specific rule.
.. code-block:: python

    from datasette import hookimpl
    from datasette.permissions import PermissionSQL

    TRUSTED = {"alice", "bob"}
@ -1479,14 +1596,17 @@ The resolver will automatically apply the most specific rule.
    actor_id = (actor or {}).get("id")
    if actor_id not in TRUSTED:
        return PermissionSQL(
            sql="""
            SELECT NULL AS parent, NULL AS child, 0 AS allow,
                'default deny view-table' AS reason
            """,
        )
    return PermissionSQL(
        sql="""
        SELECT NULL AS parent, NULL AS child, 1 AS allow,
            'view-table allowed for trusted actor' AS reason
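The actual resolution happens inside Datasette's SQL-based resolver, but the "most specific rule wins" idea can be pictured with a toy Python sketch. This is illustrative only, not Datasette's implementation; it assumes deeper rules beat shallower ones and that a deny beats an allow at the same depth:

```python
# Rule rows as (parent, child, allow) - None means "applies at every level"
rules = [
    (None, None, 0),                # instance-level default deny
    ("accounting", None, 1),        # database-level allow
    ("accounting", "salaries", 0),  # table-level deny
]


def resolve(parent, child):
    """Pick the most specific matching rule; deny wins ties."""

    def depth(rule):
        # 0 = instance-wide, 1 = database-level, 2 = table-level
        return (rule[0] is not None) + (rule[1] is not None)

    matching = [
        r for r in rules
        if r[0] in (None, parent) and r[1] in (None, child)
    ]
    # Deepest rule wins; at equal depth, max() prefers the deny
    best = max(matching, key=lambda r: (depth(r), r[2] == 0))
    return bool(best[2])


print(resolve("accounting", "sales"))     # True  - database-level allow
print(resolve("accounting", "salaries"))  # False - table-level deny
print(resolve("marketing", "leads"))      # False - instance default deny
```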
@ -1915,16 +2035,16 @@ This example adds a new database action for creating a table, if the user has th
.. code-block:: python

    from datasette import hookimpl
    from datasette.resources import DatabaseResource


    @hookimpl
    def database_actions(datasette, actor, database):
        async def inner():
            if not await datasette.allowed(
                actor,
                "edit-schema",
                resource=DatabaseResource(database),
            ):
                return []
            return [


@ -232,8 +232,9 @@ If you run ``datasette plugins --all`` it will include default plugins that ship
    "version": null,
    "hooks": [
        "actor_from_request",
        "canned_queries",
        "permission_resources_sql",
        "skip_csrf"
    ]
},


@ -69,7 +69,7 @@ default_allow_sql
Should users be able to execute arbitrary SQL queries by default?
Setting this to ``off`` causes permission checks for :ref:`actions_execute_sql` to fail by default.
::


@ -7,7 +7,7 @@ Datasette treats SQLite database files as read-only and immutable. This means it
The easiest way to execute custom SQL against Datasette is through the web UI. The database index page includes a SQL editor that lets you run any SELECT query you like. You can also construct queries using the filter interface on the tables page, then click "View and edit SQL" to open that query in the custom SQL editor.
Note that this interface is only available if the :ref:`actions_execute_sql` permission is allowed. See :ref:`authentication_permissions_execute_sql`.
Any Datasette SQL query is reflected in the URL of the page, allowing you to bookmark them, share them with others and navigate through previous queries using your browser back button.


@ -33,16 +33,16 @@ You can install these packages like so::
    pip install pytest pytest-asyncio
If you are building an installable package you can add them as test dependencies to your ``pyproject.toml`` file like this:
.. code-block:: toml

    [project]
    name = "datasette-my-plugin"
    # ...

    [project.optional-dependencies]
    test = ["pytest", "pytest-asyncio"]
You can then install the test dependencies like so::
@ -283,12 +283,13 @@ Here's a test for that plugin that mocks the HTTPX outbound request:
Registering a plugin for the duration of a test
-----------------------------------------------
When writing tests for plugins you may find it useful to register a test plugin just for the duration of a single test. You can do this using ``datasette.pm.register()`` and ``datasette.pm.unregister()`` like this:
.. code-block:: python

    from datasette import hookimpl
    from datasette.app import Datasette
    import pytest
@ -304,14 +305,14 @@ When writing tests for plugins you may find it useful to register a test plugin
            (r"^/error$", lambda: 1 / 0),
        ]

    datasette = Datasette()
    try:
        # The test implementation goes here
        datasette.pm.register(TestPlugin(), name="undo")
        response = await datasette.client.get("/error")
        assert response.status_code == 500
    finally:
        datasette.pm.unregister(name="undo")
To reuse the same temporary plugin in multiple tests, you can register it inside a fixture in your ``conftest.py`` file like this:


@ -1,289 +0,0 @@
---
orphan: true
---
(upgrade_guide_v1_a20)=
# Datasette 1.0a20 plugin upgrade guide
Datasette 1.0a20 makes some breaking changes to Datasette's permission system. Plugins need to be updated if they use **any of the following**:
- The `register_permissions()` plugin hook - this should be replaced with `register_actions()`
- The `permission_allowed()` plugin hook - this should be upgraded to use `permission_resources_sql()`.
- The `datasette.permission_allowed()` internal method - this should be replaced with `datasette.allowed()`
- Logic that grants access to the `"root"` actor can be removed.
## Permissions are now actions
The `register_permissions()` hook should be replaced with `register_actions()`.
Old code:
```python
@hookimpl
def register_permissions(datasette):
return [
Permission(
name="explain-sql",
abbr=None,
description="Can explain SQL queries",
takes_database=True,
takes_resource=False,
default=False,
),
Permission(
name="annotate-rows",
abbr=None,
description="Can annotate rows",
takes_database=True,
takes_resource=True,
default=False,
),
Permission(
name="view-debug-info",
abbr=None,
description="Can view debug information",
takes_database=False,
takes_resource=False,
default=False,
),
]
```
The new `Action` does not have a `default=` parameter.
Here's the equivalent new code:
```python
from datasette import hookimpl
from datasette.permissions import Action
from datasette.resources import DatabaseResource, TableResource
@hookimpl
def register_actions(datasette):
return [
Action(
name="explain-sql",
description="Explain SQL queries",
resource_class=DatabaseResource,
),
Action(
name="annotate-rows",
description="Annotate rows",
resource_class=TableResource,
),
Action(
name="view-debug-info",
description="View debug information",
),
]
```
The `abbr=` is now optional and defaults to `None`.
For actions that apply to specific resources (like databases or tables), specify the `resource_class` instead of `takes_parent` and `takes_child`. Note that `view-debug-info` does not specify a `resource_class` because it applies globally.
## permission_allowed() hook is replaced by permission_resources_sql()
The following old code:
```python
@hookimpl
def permission_allowed(action):
if action == "permissions-debug":
return True
```
Can be replaced by:
```python
from datasette.permissions import PermissionSQL
@hookimpl
def permission_resources_sql(action):
return PermissionSQL.allow(reason="datasette-allow-permissions-debug")
```
A `.deny(reason="")` class method is also available.
For more complex permission checks consult the documentation for that plugin hook:
<https://docs.datasette.io/en/latest/plugin_hooks.html#permission-resources-sql-datasette-actor-action>
## Using datasette.allowed() to check permissions instead of datasette.permission_allowed()
The internal method `datasette.permission_allowed()` has been replaced by `datasette.allowed()`.
The old method looked like this:
```python
can_debug = await datasette.permission_allowed(
request.actor,
"view-debug-info",
)
can_explain_sql = await datasette.permission_allowed(
request.actor,
"explain-sql",
resource="database_name",
)
can_annotate_rows = await datasette.permission_allowed(
request.actor,
"annotate-rows",
resource=(database_name, table_name),
)
```
Note the confusing design here where `resource` could be either a string or a tuple depending on the permission being checked.
The new keyword-only design makes this a lot more clear:
```python
from datasette.resources import DatabaseResource, TableResource
can_debug = await datasette.allowed(
actor=request.actor,
action="view-debug-info",
)
can_explain_sql = await datasette.allowed(
actor=request.actor,
action="explain-sql",
resource=DatabaseResource(database_name),
)
can_annotate_rows = await datasette.allowed(
actor=request.actor,
action="annotate-rows",
resource=TableResource(database_name, table_name),
)
```
## Root user checks are no longer necessary
Some plugins would introduce their own custom permission and then ensure the `"root"` actor had access to it using a pattern like this:
```python
@hookimpl
def register_permissions(datasette):
return [
Permission(
name="upload-dbs",
abbr=None,
description="Upload SQLite database files",
takes_database=False,
takes_resource=False,
default=False,
)
]
@hookimpl
def permission_allowed(actor, action):
if action == "upload-dbs" and actor and actor.get("id") == "root":
return True
```
This is no longer necessary in Datasette 1.0a20 - the `"root"` actor automatically has all permissions when Datasette is started with the `datasette --root` option.
The `permission_allowed()` hook in this example can be entirely removed.
### Root-enabled instances during testing
When writing tests that exercise root-only functionality, make sure to set `datasette.root_enabled = True` on the `Datasette` instance. Root permissions are only granted automatically when Datasette is started with `datasette --root` or when the flag is enabled directly in tests.
## Target the new APIs exclusively
Datasette 1.0a20's permission system is substantially different from previous releases. Attempting to keep plugin code compatible with both the old `permission_allowed()` and the new `allowed()` interfaces leads to brittle workarounds. Prefer to adopt the 1.0a20 APIs (`register_actions()`, `permission_resources_sql()`, and `datasette.allowed()`) outright and drop legacy fallbacks.
## Fixing async with httpx.AsyncClient(app=app)
Some older plugins may use the following pattern in their tests, which is no longer supported:
```python
app = Datasette([], memory=True).app()
async with httpx.AsyncClient(app=app) as client:
response = await client.get("http://localhost/path")
```
The new pattern is to use `ds.client` like this:
```python
ds = Datasette([], memory=True)
response = await ds.client.get("/path")
```
## Migrating from metadata= to config=
Datasette 1.0 separates metadata (titles, descriptions, licenses) from configuration (settings, plugins, queries, permissions). Plugin tests and code need to be updated accordingly.
### Update test constructors
Old code:
```python
ds = Datasette(
memory=True,
metadata={
"databases": {
"_memory": {"queries": {"my_query": {"sql": "select 1", "title": "My Query"}}}
},
"plugins": {
"my-plugin": {"setting": "value"}
}
}
)
```
New code:
```python
ds = Datasette(
memory=True,
config={
"databases": {
"_memory": {"queries": {"my_query": {"sql": "select 1", "title": "My Query"}}}
},
"plugins": {
"my-plugin": {"setting": "value"}
}
}
)
```
### Update datasette.metadata() calls
The `datasette.metadata()` method has been removed. Use these methods instead:
Old code:
```python
try:
title = datasette.metadata(database=database)["queries"][query_name]["title"]
except (KeyError, TypeError):
pass
```
New code:
```python
try:
query_info = await datasette.get_canned_query(database, query_name, request.actor)
if query_info and "title" in query_info:
title = query_info["title"]
except (KeyError, TypeError):
pass
```
### Update render functions to async
If your plugin's render function needs to call `datasette.get_canned_query()` or other async Datasette methods, it must be declared as async:
Old code:
```python
def render_atom(datasette, request, sql, columns, rows, database, table, query_name, view_name, data):
# ...
if query_name:
title = datasette.metadata(database=database)["queries"][query_name]["title"]
```
New code:
```python
async def render_atom(datasette, request, sql, columns, rows, database, table, query_name, view_name, data):
# ...
if query_name:
query_info = await datasette.get_canned_query(database, query_name, request.actor)
if query_info and "title" in query_info:
title = query_info["title"]
```
### Update query URLs in tests
Datasette now redirects `?sql=` parameters from database pages to the query view:
Old code:
```python
response = await ds.client.get("/_memory.atom?sql=select+1")
```
New code:
```python
response = await ds.client.get("/_memory/-/query.atom?sql=select+1")
```


@ -1,116 +0,0 @@
(upgrade_guide)=
# Upgrade guide
(upgrade_guide_v1)=
## Datasette 0.X -> 1.0
This section reviews breaking changes Datasette ``1.0`` has when upgrading from a ``0.XX`` version. For new features that ``1.0`` offers, see the {ref}`changelog`.
(upgrade_guide_v1_sql_queries)=
### New URL for SQL queries
Prior to ``1.0a14`` the URL for executing a SQL query looked like this:
```text
/databasename?sql=select+1
# Or for JSON:
/databasename.json?sql=select+1
```
This endpoint served two purposes: without a ``?sql=`` it would list the tables in the database, but with that option it would return results of a query instead.
The URL for executing a SQL query now looks like this:
```text
/databasename/-/query?sql=select+1
# Or for JSON:
/databasename/-/query.json?sql=select+1
```
**This isn't a breaking change.** API calls to the older ``/databasename?sql=...`` endpoint will redirect to the new ``/databasename/-/query?sql=...`` endpoint. Upgrading to the new URL is recommended to avoid the overhead of the additional redirect.
(upgrade_guide_v1_metadata)=
### Metadata changes
Metadata was completely revamped for Datasette 1.0. There are a number of related breaking changes, from the ``metadata.yaml`` file to Python APIs, that you'll need to consider when upgrading.
(upgrade_guide_v1_metadata_split)=
#### ``metadata.yaml`` split into ``datasette.yaml``
Before Datasette 1.0, the ``metadata.yaml`` file became a kitchen sink, mixing metadata, configuration, and settings. Now ``metadata.yaml`` is strictly for metadata (e.g. titles and descriptions of databases and tables, licensing info, etc.). Other settings have been moved to a ``datasette.yaml`` configuration file, described in {ref}`configuration`.
To start Datasette with both metadata and configuration files, run it like this:
```bash
datasette --metadata metadata.yaml --config datasette.yaml
# Or the shortened version:
datasette -m metadata.yml -c datasette.yml
```
(upgrade_guide_v1_metadata_upgrade)=
#### Upgrading an existing ``metadata.yaml`` file
The [datasette-upgrade plugin](https://github.com/datasette/datasette-upgrade) can be used to split a Datasette 0.x.x ``metadata.yaml`` (or ``.json``) file into separate ``metadata.yaml`` and ``datasette.yaml`` files. First, install the plugin:
```bash
datasette install datasette-upgrade
```
Then run it like this to produce the two new files:
```bash
datasette upgrade metadata-to-config metadata.json -m metadata.yml -c datasette.yml
```
#### Metadata "fallback" has been removed
Certain keys in metadata like ``license`` used to "fall back" up the chain of ownership.
For example, if you set an ``MIT`` license on a database and a table within that database did not have a specified license, then that table would inherit the ``MIT`` license.
This behavior has been removed in Datasette 1.0. License fields must now be set explicitly on every item, including individual databases and tables.
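As a minimal sketch of what that looks like in practice (the ``content`` database and ``articles`` table names here are invented for illustration), the license must now be repeated at each level of ``metadata.yaml`` rather than inherited:

```yaml
# metadata.yaml - licenses are no longer inherited, so declare them explicitly
title: My instance
license: MIT
databases:
  content:            # hypothetical database name
    license: MIT      # no longer inherited from the instance
    tables:
      articles:       # hypothetical table name
        license: MIT  # no longer inherited from the database
```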
(upgrade_guide_v1_metadata_removed)=
#### The ``get_metadata()`` plugin hook has been removed
In Datasette ``0.x`` plugins could implement a ``get_metadata()`` plugin hook to customize how metadata was retrieved for different instances, databases and tables.
This hook could be inefficient, since some pages might load metadata for many different items (to list a large number of tables, for example) which could result in a large number of calls to potentially expensive plugin hook implementations.
As of Datasette ``1.0a14`` (2024-08-05), the ``get_metadata()`` hook has been deprecated:
```python
# ❌ DEPRECATED in Datasette 1.0
@hookimpl
def get_metadata(datasette, key, database, table):
pass
```
Instead, plugins are encouraged to interact directly with Datasette's in-memory metadata tables in SQLite using the following methods on the {ref}`internals_datasette`:
- {ref}`get_instance_metadata() <datasette_get_instance_metadata>` and {ref}`set_instance_metadata() <datasette_set_instance_metadata>`
- {ref}`get_database_metadata() <datasette_get_database_metadata>` and {ref}`set_database_metadata() <datasette_set_database_metadata>`
- {ref}`get_resource_metadata() <datasette_get_resource_metadata>` and {ref}`set_resource_metadata() <datasette_set_resource_metadata>`
- {ref}`get_column_metadata() <datasette_get_column_metadata>` and {ref}`set_column_metadata() <datasette_set_column_metadata>`
A plugin that stores or calculates its own metadata can implement the {ref}`plugin_hook_startup` hook to populate those items on startup, and then call those methods while it is running to persist any new metadata changes.
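The shape of that pattern can be sketched as follows. ``FakeDatasette`` here is a stand-in object so the sketch runs on its own; the real Datasette class provides the ``set_instance_metadata()`` method described above, and a real plugin would decorate ``startup`` with ``@hookimpl``:

```python
import asyncio


# Stand-in for the real Datasette object, so this sketch is self-contained;
# the real class provides set_instance_metadata() as described above.
class FakeDatasette:
    def __init__(self):
        self.metadata = {}

    async def set_instance_metadata(self, key, value):
        self.metadata[key] = value


# Sketch of a plugin startup hook that persists metadata on startup
def startup(datasette):
    async def inner():
        await datasette.set_instance_metadata("title", "Set by my plugin")

    return inner


ds = FakeDatasette()
asyncio.run(startup(ds)())
```

After startup the metadata is persisted and can be read back with the corresponding ``get_instance_metadata()`` method.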
(upgrade_guide_v1_metadata_json_removed)=
#### The ``/metadata.json`` endpoint has been removed
As of Datasette ``1.0a14``, the root-level ``/metadata.json`` endpoint has been removed. Metadata for tables will instead become available through extras, which are still in development and will arrive in a future alpha.
(upgrade_guide_v1_metadata_method_removed)=
#### The ``metadata()`` method on the Datasette class has been removed
As of Datasette ``1.0a14``, the ``.metadata()`` method on the Datasette Python API has been removed.
Instead, use the following methods on the Datasette class:
- {ref}`get_instance_metadata() <datasette_get_instance_metadata>`
- {ref}`get_database_metadata() <datasette_get_database_metadata>`
- {ref}`get_resource_metadata() <datasette_get_resource_metadata>`
- {ref}`get_column_metadata() <datasette_get_column_metadata>`
```{include} upgrade-1.0a20.md
:heading-offset: 1
```

docs/upgrade_guide.rst Normal file
View file

@ -0,0 +1,130 @@
.. _upgrade_guide:
===============
Upgrade guide
===============
.. _upgrade_guide_v1:
Datasette 0.X -> 1.0
====================
This section reviews breaking changes Datasette ``1.0`` has when upgrading from a ``0.XX`` version. For new features that ``1.0`` offers, see the :ref:`changelog`.
.. _upgrade_guide_v1_sql_queries:
New URL for SQL queries
-----------------------
Prior to ``1.0a14`` the URL for executing a SQL query looked like this:
::
/databasename?sql=select+1
# Or for JSON:
/databasename.json?sql=select+1
This endpoint served two purposes: without a ``?sql=`` it would list the tables in the database, but with that option it would return results of a query instead.
The URL for executing a SQL query now looks like this::
/databasename/-/query?sql=select+1
# Or for JSON:
/databasename/-/query.json?sql=select+1
**This isn't a breaking change.** API calls to the older ``/databasename?sql=...`` endpoint will redirect to the new ``databasename/-/query?sql=...`` endpoint. Upgrading to the new URL is recommended to avoid the overhead of the additional redirect.
.. _upgrade_guide_v1_metadata:
Metadata changes
----------------
Metadata was completely revamped for Datasette 1.0. There are a number of related breaking changes, from the ``metadata.yaml`` file to Python APIs, that you'll need to consider when upgrading.
.. _upgrade_guide_v1_metadata_split:
``metadata.yaml`` split into ``datasette.yaml``
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Before Datasette 1.0, the ``metadata.yaml`` file became a kitchen sink, mixing metadata, configuration, and settings. Now ``metadata.yaml`` is strictly for metadata (e.g. titles and descriptions of databases and tables, licensing information, etc.). Other settings have been moved to a ``datasette.yml`` configuration file, described in :ref:`configuration`.
To start Datasette with both metadata and configuration files, run it like this:
.. code-block:: bash
datasette --metadata metadata.yaml --config datasette.yaml
# Or the shortened version:
datasette -m metadata.yml -c datasette.yml
.. _upgrade_guide_v1_metadata_upgrade:
Upgrading an existing ``metadata.yaml`` file
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The `datasette-upgrade plugin <https://github.com/datasette/datasette-upgrade>`__ can be used to split a Datasette 0.x.x ``metadata.yaml`` (or ``.json``) file into separate ``metadata.yaml`` and ``datasette.yaml`` files. First, install the plugin:
.. code-block:: bash
datasette install datasette-upgrade
Then run it like this to produce the two new files:
.. code-block:: bash
datasette upgrade metadata-to-config metadata.json -m metadata.yml -c datasette.yml
Metadata "fallback" has been removed
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Certain keys in metadata like ``license`` used to "fall back" up the chain of ownership.
For example, if you set an ``MIT`` license on a database and a table within that database did not have a specified license, then that table would inherit the ``MIT`` license.
This behavior has been removed in Datasette 1.0. License fields must now be set explicitly on every item, including individual databases and tables.
.. _upgrade_guide_v1_metadata_removed:
The ``get_metadata()`` plugin hook has been removed
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
In Datasette ``0.x`` plugins could implement a ``get_metadata()`` plugin hook to customize how metadata was retrieved for different instances, databases and tables.
This hook could be inefficient, since some pages might load metadata for many different items (to list a large number of tables, for example) which could result in a large number of calls to potentially expensive plugin hook implementations.
As of Datasette ``1.0a14`` (2024-08-05), the ``get_metadata()`` hook has been deprecated:
.. code-block:: python
# ❌ DEPRECATED in Datasette 1.0
@hookimpl
def get_metadata(datasette, key, database, table):
pass
Instead, plugins are encouraged to interact directly with Datasette's in-memory metadata tables in SQLite using the following methods on the :ref:`internals_datasette`:
- :ref:`get_instance_metadata() <datasette_get_instance_metadata>` and :ref:`set_instance_metadata() <datasette_set_instance_metadata>`
- :ref:`get_database_metadata() <datasette_get_database_metadata>` and :ref:`set_database_metadata() <datasette_set_database_metadata>`
- :ref:`get_resource_metadata() <datasette_get_resource_metadata>` and :ref:`set_resource_metadata() <datasette_set_resource_metadata>`
- :ref:`get_column_metadata() <datasette_get_column_metadata>` and :ref:`set_column_metadata() <datasette_set_column_metadata>`
A plugin that stores or calculates its own metadata can implement the :ref:`plugin_hook_startup` hook to populate those items on startup, and then call those methods while it is running to persist any new metadata changes.
.. _upgrade_guide_v1_metadata_json_removed:
The ``/metadata.json`` endpoint has been removed
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
As of Datasette ``1.0a14``, the root-level ``/metadata.json`` endpoint has been removed. Metadata for tables will instead become available through extras, which are still in development and will arrive in a future alpha.
.. _upgrade_guide_v1_metadata_method_removed:
The ``metadata()`` method on the Datasette class has been removed
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
As of Datasette ``1.0a14``, the ``.metadata()`` method on the Datasette Python API has been removed.
Instead, use the following methods on the Datasette class:
- :ref:`get_instance_metadata() <datasette_get_instance_metadata>`
- :ref:`get_database_metadata() <datasette_get_database_metadata>`
- :ref:`get_resource_metadata() <datasette_get_resource_metadata>`
- :ref:`get_column_metadata() <datasette_get_column_metadata>`

package-lock.json generated
View file

@ -13,7 +13,7 @@
"rollup": "^3.29.5" "rollup": "^3.29.5"
}, },
"devDependencies": { "devDependencies": {
"prettier": "^3.0.0" "prettier": "^2.2.1"
} }
}, },
"node_modules/@codemirror/autocomplete": { "node_modules/@codemirror/autocomplete": {
@ -391,19 +391,15 @@
} }
}, },
"node_modules/prettier": { "node_modules/prettier": {
"version": "3.6.2", "version": "2.2.1",
"resolved": "https://registry.npmjs.org/prettier/-/prettier-3.6.2.tgz", "resolved": "https://registry.npmjs.org/prettier/-/prettier-2.2.1.tgz",
"integrity": "sha512-I7AIg5boAr5R0FFtJ6rCfD+LFsWHp81dolrFD8S79U9tb8Az2nGrJncnMSnys+bpQJfRUzqs9hnA81OAA3hCuQ==", "integrity": "sha512-PqyhM2yCjg/oKkFPtTGUojv7gnZAoG80ttl45O6x2Ug/rMJw4wcc9k6aaf2hibP7BGVCCM33gZoGjyvt9mm16Q==",
"dev": true, "dev": true,
"license": "MIT",
"bin": { "bin": {
"prettier": "bin/prettier.cjs" "prettier": "bin-prettier.js"
}, },
"engines": { "engines": {
"node": ">=14" "node": ">=10.13.0"
},
"funding": {
"url": "https://github.com/prettier/prettier?sponsor=1"
} }
}, },
"node_modules/resolve": { "node_modules/resolve": {
@ -781,9 +777,9 @@
"integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==" "integrity": "sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA=="
}, },
"prettier": { "prettier": {
"version": "3.6.2", "version": "2.2.1",
"resolved": "https://registry.npmjs.org/prettier/-/prettier-3.6.2.tgz", "resolved": "https://registry.npmjs.org/prettier/-/prettier-2.2.1.tgz",
"integrity": "sha512-I7AIg5boAr5R0FFtJ6rCfD+LFsWHp81dolrFD8S79U9tb8Az2nGrJncnMSnys+bpQJfRUzqs9hnA81OAA3hCuQ==", "integrity": "sha512-PqyhM2yCjg/oKkFPtTGUojv7gnZAoG80ttl45O6x2Ug/rMJw4wcc9k6aaf2hibP7BGVCCM33gZoGjyvt9mm16Q==",
"dev": true "dev": true
}, },
"resolve": { "resolve": {

View file

@ -2,7 +2,7 @@
"name": "datasette", "name": "datasette",
"private": true, "private": true,
"devDependencies": { "devDependencies": {
"prettier": "^3.0.0" "prettier": "^2.2.1"
}, },
"scripts": { "scripts": {
"fix": "npm run prettier -- --write", "fix": "npm run prettier -- --write",

View file

@ -1,98 +0,0 @@
[project]
name = "datasette"
dynamic = ["version"]
description = "An open source multi-tool for exploring and publishing data"
readme = { file = "README.md", content-type = "text/markdown" }
authors = [
{ name = "Simon Willison" },
]
license = "Apache-2.0"
requires-python = ">=3.10"
classifiers = [
"Development Status :: 4 - Beta",
"Framework :: Datasette",
"Intended Audience :: Developers",
"Intended Audience :: Science/Research",
"Intended Audience :: End Users/Desktop",
"Topic :: Database",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: 3.14",
]
dependencies = [
"asgiref>=3.2.10",
"click>=7.1.1",
"click-default-group>=1.2.3",
"Jinja2>=2.10.3",
"hupper>=1.9",
"httpx>=0.20,<1.0",
"pluggy>=1.0",
"uvicorn>=0.11",
"aiofiles>=0.4",
"janus>=0.6.2",
"asgi-csrf>=0.10",
"PyYAML>=5.3",
"mergedeep>=1.1.1",
"itsdangerous>=1.1",
"sqlite-utils>=3.30",
"asyncinject>=0.6.1",
"setuptools",
"pip",
]
[project.urls]
Homepage = "https://datasette.io/"
Documentation = "https://docs.datasette.io/en/stable/"
Changelog = "https://docs.datasette.io/en/stable/changelog.html"
"Live demo" = "https://latest.datasette.io/"
"Source code" = "https://github.com/simonw/datasette"
Issues = "https://github.com/simonw/datasette/issues"
CI = "https://github.com/simonw/datasette/actions?query=workflow%3ATest"
[project.scripts]
datasette = "datasette.cli:cli"
[project.optional-dependencies]
docs = [
"Sphinx==7.4.7",
"furo==2025.9.25",
"sphinx-autobuild",
"codespell>=2.2.5",
"blacken-docs",
"sphinx-copybutton",
"sphinx-inline-tabs",
"myst-parser",
"sphinx-markdown-builder",
"ruamel.yaml",
]
test = [
"pytest>=9",
"pytest-xdist>=2.2.1",
"pytest-asyncio>=1.2.0",
"beautifulsoup4>=4.8.1",
"black==25.11.0",
"blacken-docs==1.20.0",
"pytest-timeout>=1.4.2",
"trustme>=0.7",
"cogapp>=3.3.0",
]
rich = ["rich"]
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[tool.setuptools.packages.find]
include = ["datasette*"]
[tool.setuptools.package-data]
datasette = ["templates/*.html"]
[tool.setuptools.dynamic]
version = {attr = "datasette.version.__version__"}
[tool.uv]
package = true

View file

@ -1,2 +1 @@
line-length = 160 line-length = 160
target-version = "py310"

setup.py Normal file
View file

@ -0,0 +1,107 @@
from setuptools import setup, find_packages
import os
def get_long_description():
with open(
os.path.join(os.path.dirname(os.path.abspath(__file__)), "README.md"),
encoding="utf8",
) as fp:
return fp.read()
def get_version():
path = os.path.join(
os.path.dirname(os.path.abspath(__file__)), "datasette", "version.py"
)
g = {}
with open(path) as fp:
exec(fp.read(), g)
return g["__version__"]
setup(
name="datasette",
version=get_version(),
description="An open source multi-tool for exploring and publishing data",
long_description=get_long_description(),
long_description_content_type="text/markdown",
author="Simon Willison",
license="Apache License, Version 2.0",
url="https://datasette.io/",
project_urls={
"Documentation": "https://docs.datasette.io/en/stable/",
"Changelog": "https://docs.datasette.io/en/stable/changelog.html",
"Live demo": "https://latest.datasette.io/",
"Source code": "https://github.com/simonw/datasette",
"Issues": "https://github.com/simonw/datasette/issues",
"CI": "https://github.com/simonw/datasette/actions?query=workflow%3ATest",
},
packages=find_packages(exclude=("tests",)),
package_data={"datasette": ["templates/*.html"]},
include_package_data=True,
python_requires=">=3.10",
install_requires=[
"asgiref>=3.2.10",
"click>=7.1.1",
"click-default-group>=1.2.3",
"Jinja2>=2.10.3",
"hupper>=1.9",
"httpx>=0.20",
'importlib_metadata>=4.6; python_version < "3.10"',
"pluggy>=1.0",
"uvicorn>=0.11",
"aiofiles>=0.4",
"janus>=0.6.2",
"asgi-csrf>=0.10",
"PyYAML>=5.3",
"mergedeep>=1.1.1",
"itsdangerous>=1.1",
"sqlite-utils>=3.30",
"asyncinject>=0.6.1",
"setuptools",
"pip",
],
entry_points="""
[console_scripts]
datasette=datasette.cli:cli
""",
extras_require={
"docs": [
"Sphinx==7.4.7",
"furo==2024.8.6",
"sphinx-autobuild",
"codespell>=2.2.5",
"blacken-docs",
"sphinx-copybutton",
"sphinx-inline-tabs",
"ruamel.yaml",
],
"test": [
"pytest>=5.2.2",
"pytest-xdist>=2.2.1",
"pytest-asyncio>=1.2.0",
"beautifulsoup4>=4.8.1",
"black==25.1.0",
"blacken-docs==1.19.1",
"pytest-timeout>=1.4.2",
"trustme>=0.7",
"cogapp>=3.3.0",
],
"rich": ["rich"],
},
classifiers=[
"Development Status :: 4 - Beta",
"Framework :: Datasette",
"Intended Audience :: Developers",
"Intended Audience :: Science/Research",
"Intended Audience :: End Users/Desktop",
"Topic :: Database",
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3.14",
"Programming Language :: Python :: 3.13",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.10",
],
)

View file

@ -23,10 +23,6 @@ UNDOCUMENTED_PERMISSIONS = {
"this_is_allowed_async", "this_is_allowed_async",
"this_is_denied_async", "this_is_denied_async",
"no_match", "no_match",
# Test actions from test_hook_register_actions_with_custom_resources
"manage_documents",
"view_document_collection",
"view_document",
} }
_ds_client = None _ds_client = None
@ -46,9 +42,7 @@ def wait_until_responds(url, timeout=5.0, client=httpx, **kwargs):
@pytest_asyncio.fixture @pytest_asyncio.fixture
async def ds_client(): async def ds_client():
from datasette.app import Datasette from datasette.app import Datasette
from datasette.database import Database
from .fixtures import CONFIG, METADATA, PLUGINS_DIR from .fixtures import CONFIG, METADATA, PLUGINS_DIR
import secrets
global _ds_client global _ds_client
if _ds_client is not None: if _ds_client is not None:
@ -62,7 +56,6 @@ async def ds_client():
"default_page_size": 50, "default_page_size": 50,
"max_returned_rows": 100, "max_returned_rows": 100,
"sql_time_limit_ms": 200, "sql_time_limit_ms": 200,
"facet_suggest_time_limit_ms": 200, # Up from 50 default
# Default is 3 but this results in "too many open files" # Default is 3 but this results in "too many open files"
# errors when running the full test suite: # errors when running the full test suite:
"num_sql_threads": 1, "num_sql_threads": 1,
@ -70,10 +63,7 @@ async def ds_client():
) )
from .fixtures import TABLES, TABLE_PARAMETERIZED_SQL from .fixtures import TABLES, TABLE_PARAMETERIZED_SQL
# Use a unique memory_name to avoid collisions between different db = ds.add_memory_database("fixtures")
# Datasette instances in the same process, but use "fixtures" for routing
unique_memory_name = f"fixtures_{secrets.token_hex(8)}"
db = ds.add_database(Database(ds, memory_name=unique_memory_name), name="fixtures")
ds.remove_database("_memory") ds.remove_database("_memory")
def prepare(conn): def prepare(conn):
@ -143,30 +133,32 @@ def restore_working_directory(tmpdir, request):
@pytest.fixture(scope="session", autouse=True) @pytest.fixture(scope="session", autouse=True)
def check_actions_are_documented(): def check_permission_actions_are_documented():
from datasette.plugins import pm from datasette.plugins import pm
content = ( content = (
pathlib.Path(__file__).parent.parent / "docs" / "authentication.rst" pathlib.Path(__file__).parent.parent / "docs" / "authentication.rst"
).read_text() ).read_text()
permissions_re = re.compile(r"\.\. _actions_([^\s:]+):") permissions_re = re.compile(r"\.\. _permissions_([^\s:]+):")
documented_actions = set(permissions_re.findall(content)).union( documented_permission_actions = set(permissions_re.findall(content)).union(
UNDOCUMENTED_PERMISSIONS UNDOCUMENTED_PERMISSIONS
) )
def before(hook_name, hook_impls, kwargs): def before(hook_name, hook_impls, kwargs):
if hook_name == "permission_resources_sql": if hook_name == "permission_allowed":
datasette = kwargs["datasette"] datasette = kwargs["datasette"]
assert kwargs["action"] in datasette.actions, ( assert kwargs["action"] in datasette.permissions, (
"'{}' has not been registered with register_actions()".format( "'{}' has not been registered with register_permissions()".format(
kwargs["action"] kwargs["action"]
) )
+ " (or maybe a test forgot to do await ds.invoke_startup())" + " (or maybe a test forgot to do await ds.invoke_startup())"
) )
action = kwargs.get("action").replace("-", "_") action = kwargs.get("action").replace("-", "_")
assert ( assert (
action in documented_actions action in documented_permission_actions
), "Undocumented permission action: {}".format(action) ), "Undocumented permission action: {}, resource: {}".format(
action, kwargs["resource"]
)
pm.add_hookcall_monitoring( pm.add_hookcall_monitoring(
before=before, after=lambda outcome, hook_name, hook_impls, kwargs: None before=before, after=lambda outcome, hook_name, hook_impls, kwargs: None
@ -241,27 +233,3 @@ def ds_unix_domain_socket_server(tmp_path_factory):
yield ds_proc, uds yield ds_proc, uds
# Shut it down at the end of the pytest session # Shut it down at the end of the pytest session
ds_proc.terminate() ds_proc.terminate()
# Import fixtures from fixtures.py to make them available
from .fixtures import ( # noqa: E402, F401
app_client,
app_client_base_url_prefix,
app_client_conflicting_database_names,
app_client_csv_max_mb_one,
app_client_immutable_and_inspect_file,
app_client_larger_cache_size,
app_client_no_files,
app_client_returned_rows_matches_page_size,
app_client_shorter_time_limit,
app_client_two_attached_databases,
app_client_two_attached_databases_crossdb_enabled,
app_client_two_attached_databases_one_immutable,
app_client_with_cors,
app_client_with_dot,
app_client_with_trace,
generate_compound_rows,
generate_sortable_rows,
make_app_client,
TEMP_PLUGIN_SECRET_FILE,
)

View file

@ -44,13 +44,14 @@ EXPECTED_PLUGINS = [
"forbidden", "forbidden",
"homepage_actions", "homepage_actions",
"menu_links", "menu_links",
"permission_resources_sql", "permission_allowed",
"prepare_connection", "prepare_connection",
"prepare_jinja2_environment", "prepare_jinja2_environment",
"query_actions", "query_actions",
"register_actions", "register_actions",
"register_facet_classes", "register_facet_classes",
"register_magic_parameters", "register_magic_parameters",
"register_permissions",
"register_routes", "register_routes",
"render_cell", "render_cell",
"row_actions", "row_actions",
@ -73,6 +74,7 @@ EXPECTED_PLUGINS = [
"extra_template_vars", "extra_template_vars",
"handle_exception", "handle_exception",
"menu_links", "menu_links",
"permission_allowed",
"prepare_jinja2_environment", "prepare_jinja2_environment",
"register_routes", "register_routes",
"render_cell", "render_cell",
@ -755,41 +757,16 @@ def assert_permissions_checked(datasette, actions):
resource = None resource = None
else: else:
action, resource = action action, resource = action
# Convert PermissionCheck dataclass to old resource format for comparison
def check_matches(pc, action, resource):
if pc.action != action:
return False
# Convert parent/child to old resource format
if pc.parent and pc.child:
pc_resource = (pc.parent, pc.child)
elif pc.parent:
pc_resource = pc.parent
else:
pc_resource = None
return pc_resource == resource
assert [ assert [
pc pc
for pc in datasette._permission_checks for pc in datasette._permission_checks
if check_matches(pc, action, resource) if pc["action"] == action and pc["resource"] == resource
], """Missing expected permission check: action={}, resource={} ], """Missing expected permission check: action={}, resource={}
Permission checks seen: {} Permission checks seen: {}
""".format( """.format(
action, action,
resource, resource,
json.dumps( json.dumps(list(datasette._permission_checks), indent=4),
[
{
"action": pc.action,
"parent": pc.parent,
"child": pc.child,
"result": pc.result,
}
for pc in datasette._permission_checks
],
indent=4,
),
) )

View file

@ -1,5 +1,5 @@
import asyncio import asyncio
from datasette import hookimpl from datasette import hookimpl, Permission
from datasette.facets import Facet from datasette.facets import Facet
from datasette import tracer from datasette import tracer
from datasette.permissions import Action from datasette.permissions import Action
@ -214,6 +214,29 @@ def asgi_wrapper():
return wrap return wrap
@hookimpl
def permission_allowed(actor, action):
if action == "this_is_allowed":
return True
elif action == "this_is_denied":
return False
elif action == "view-database-download":
return actor.get("can_download") if actor else None
# Special permissions for latest.datasette.io demos
# See https://github.com/simonw/todomvc-datasette/issues/2
actor_id = None
if actor:
actor_id = actor.get("id")
if actor_id == "todomvc" and action in (
"insert-row",
"create-table",
"drop-table",
"delete-row",
"update-row",
):
return True
@hookimpl @hookimpl
def register_routes(): def register_routes():
async def one(datasette): async def one(datasette):
@ -452,140 +475,42 @@ def skip_csrf(scope):
@hookimpl @hookimpl
def register_actions(datasette): def register_permissions(datasette):
extras_old = datasette.plugin_config("datasette-register-permissions") or {} extras = datasette.plugin_config("datasette-register-permissions") or {}
extras_new = datasette.plugin_config("datasette-register-actions") or {} permissions = [
Permission(
name="permission-from-plugin",
abbr="np",
description="New permission added by a plugin",
takes_database=True,
takes_resource=False,
default=False,
)
]
if extras:
permissions.extend(
Permission(
name=p["name"],
abbr=p["abbr"],
description=p["description"],
takes_database=p["takes_database"],
takes_resource=p["takes_resource"],
default=p["default"],
)
for p in extras["permissions"]
)
return permissions
actions = [
Action( @hookimpl
name="action-from-plugin", def register_actions(datasette):
abbr="ap", return [
description="New action added by a plugin",
resource_class=DatabaseResource,
),
Action( Action(
name="view-collection", name="view-collection",
abbr="vc", abbr="vc",
description="View a collection", description="View a collection",
takes_parent=True,
takes_child=False,
resource_class=DatabaseResource, resource_class=DatabaseResource,
), )
# Test actions for test_hook_custom_allowed (global actions - no resource_class)
Action(
name="this_is_allowed",
abbr=None,
description=None,
),
Action(
name="this_is_denied",
abbr=None,
description=None,
),
Action(
name="this_is_allowed_async",
abbr=None,
description=None,
),
Action(
name="this_is_denied_async",
abbr=None,
description=None,
),
] ]
# Support old-style config for backwards compatibility
if extras_old:
for p in extras_old["permissions"]:
# Map old takes_database/takes_resource to new global/resource_class
if p.get("takes_database"):
# Has database -> DatabaseResource
actions.append(
Action(
name=p["name"],
abbr=p["abbr"],
description=p["description"],
resource_class=DatabaseResource,
)
)
else:
# No database -> global action (no resource_class)
actions.append(
Action(
name=p["name"],
abbr=p["abbr"],
description=p["description"],
)
)
# Support new-style config
if extras_new:
for a in extras_new["actions"]:
# Check if this is a global action (no resource_class specified)
if not a.get("resource_class"):
actions.append(
Action(
name=a["name"],
abbr=a["abbr"],
description=a["description"],
)
)
else:
# Map string resource_class to actual class
resource_class_map = {
"DatabaseResource": DatabaseResource,
}
resource_class = resource_class_map.get(
a.get("resource_class", "DatabaseResource"), DatabaseResource
)
actions.append(
Action(
name=a["name"],
abbr=a["abbr"],
description=a["description"],
resource_class=resource_class,
)
)
return actions
@hookimpl
def permission_resources_sql(datasette, actor, action):
from datasette.permissions import PermissionSQL
# Handle test actions used in test_hook_custom_allowed
if action == "this_is_allowed":
return PermissionSQL.allow(reason="test plugin allows this_is_allowed")
elif action == "this_is_denied":
return PermissionSQL.deny(reason="test plugin denies this_is_denied")
elif action == "this_is_allowed_async":
return PermissionSQL.allow(reason="test plugin allows this_is_allowed_async")
elif action == "this_is_denied_async":
return PermissionSQL.deny(reason="test plugin denies this_is_denied_async")
elif action == "view-database-download":
# Return rule based on actor's can_download permission
if actor and actor.get("can_download"):
return PermissionSQL.allow(reason="actor has can_download")
else:
return None # No opinion
elif action == "view-database":
# Also grant view-database if actor has can_download (needed for download to work)
if actor and actor.get("can_download"):
return PermissionSQL.allow(
reason="actor has can_download, grants view-database"
)
else:
return None
elif action in (
"insert-row",
"create-table",
"drop-table",
"delete-row",
"update-row",
):
# Special permissions for latest.datasette.io demos
actor_id = actor.get("id") if actor else None
if actor_id == "todomvc":
return PermissionSQL.allow(reason=f"todomvc actor allowed for {action}")
return None

View file

@ -113,6 +113,24 @@ def actor_from_request(datasette, request):
return inner return inner
@hookimpl
def permission_allowed(datasette, actor, action):
# Testing asyncio version of permission_allowed
async def inner():
assert (
2
== (
await datasette.get_internal_database().execute("select 1 + 1")
).first()[0]
)
if action == "this_is_allowed_async":
return True
elif action == "this_is_denied_async":
return False
return inner
@hookimpl @hookimpl
def prepare_jinja2_environment(env, datasette): def prepare_jinja2_environment(env, datasette):
env.filters["format_numeric"] = lambda s: f"{float(s):,.0f}" env.filters["format_numeric"] = lambda s: f"{float(s):,.0f}"

View file

@ -2,15 +2,16 @@
Tests for the new Resource-based permission system. Tests for the new Resource-based permission system.
These tests verify: These tests verify:
1. The new Datasette.allowed_resources() method (with pagination) 1. The new Datasette.allowed_resources() method
2. The new Datasette.allowed() method 2. The new Datasette.allowed() method
3. The include_reasons parameter for debugging 3. The new Datasette.allowed_resources_with_reasons() method
4. That SQL does the heavy lifting (no Python filtering) 4. That SQL does the heavy lifting (no Python filtering)
""" """
import pytest import pytest
import pytest_asyncio import pytest_asyncio
from datasette.app import Datasette from datasette.app import Datasette
from datasette.plugins import pm
from datasette.permissions import PermissionSQL from datasette.permissions import PermissionSQL
from datasette.resources import TableResource from datasette.resources import TableResource
from datasette import hookimpl from datasette import hookimpl
@ -62,16 +63,15 @@ async def test_allowed_resources_global_allow(test_ds):
def rules_callback(datasette, actor, action): def rules_callback(datasette, actor, action):
if actor and actor.get("id") == "alice": if actor and actor.get("id") == "alice":
sql = "SELECT NULL AS parent, NULL AS child, 1 AS allow, 'global: alice has access' AS reason" sql = "SELECT NULL AS parent, NULL AS child, 1 AS allow, 'global: alice has access' AS reason"
return PermissionSQL(sql=sql) return PermissionSQL(source="test", sql=sql, params={})
return None return None
plugin = PermissionRulesPlugin(rules_callback) plugin = PermissionRulesPlugin(rules_callback)
test_ds.pm.register(plugin, name="test_plugin") pm.register(plugin, name="test_plugin")
try: try:
# Use the new allowed_resources() method # Use the new allowed_resources() method
result = await test_ds.allowed_resources("view-table", {"id": "alice"}) tables = await test_ds.allowed_resources("view-table", {"id": "alice"})
tables = result.resources
# Alice should see all tables # Alice should see all tables
assert len(tables) == 5 assert len(tables) == 5
@ -86,7 +86,7 @@ async def test_allowed_resources_global_allow(test_ds):
assert ("production", "orders") in table_set assert ("production", "orders") in table_set
finally: finally:
test_ds.pm.unregister(plugin, name="test_plugin") pm.unregister(plugin, name="test_plugin")
@pytest.mark.asyncio @pytest.mark.asyncio
@@ -101,11 +101,11 @@ async def test_allowed_specific_resource(test_ds):
             UNION ALL
             SELECT 'analytics' AS parent, NULL AS child, 1 AS allow, 'analyst access' AS reason
             """
-            return PermissionSQL(sql=sql)
+            return PermissionSQL(source="test", sql=sql, params={})
         return None

     plugin = PermissionRulesPlugin(rules_callback)
-    test_ds.pm.register(plugin, name="test_plugin")
+    pm.register(plugin, name="test_plugin")
     try:
         actor = {"id": "bob", "role": "analyst"}
@@ -129,11 +129,13 @@ async def test_allowed_specific_resource(test_ds):
         )
     finally:
-        test_ds.pm.unregister(plugin, name="test_plugin")
+        pm.unregister(plugin, name="test_plugin")


 @pytest.mark.asyncio
-async def test_allowed_resources_include_reasons(test_ds):
+async def test_allowed_resources_with_reasons(test_ds):
+    """Test allowed_resources_with_reasons() exposes debugging info"""

     def rules_callback(datasette, actor, action):
         if actor and actor.get("role") == "analyst":
             sql = """
@@ -143,33 +145,31 @@ async def test_allowed_resources_include_reasons(test_ds):
             SELECT 'analytics' AS parent, 'sensitive' AS child, 0 AS allow,
                 'child: sensitive data denied' AS reason
             """
-            return PermissionSQL(sql=sql)
+            return PermissionSQL(source="test", sql=sql, params={})
         return None

     plugin = PermissionRulesPlugin(rules_callback)
-    test_ds.pm.register(plugin, name="test_plugin")
+    pm.register(plugin, name="test_plugin")
     try:
-        # Use allowed_resources with include_reasons to get debugging info
-        result = await test_ds.allowed_resources(
-            "view-table", {"id": "bob", "role": "analyst"}, include_reasons=True
-        )
-        allowed = result.resources
+        # Use allowed_resources_with_reasons to get debugging info
+        allowed = await test_ds.allowed_resources_with_reasons(
+            "view-table", {"id": "bob", "role": "analyst"}
+        )

         # Should get analytics tables except sensitive
         assert len(allowed) >= 2  # At least users and events

         # Check we can access both resource and reason
-        for resource in allowed:
-            assert isinstance(resource, TableResource)
-            assert isinstance(resource.reasons, list)
-            if resource.parent == "analytics":
-                # Should mention parent-level reason in at least one of the reasons
-                reasons_text = " ".join(resource.reasons).lower()
-                assert "analyst access" in reasons_text
+        for item in allowed:
+            assert isinstance(item.resource, TableResource)
+            assert isinstance(item.reason, str)
+            if item.resource.parent == "analytics":
+                # Should mention parent-level reason
+                assert "analyst access" in item.reason.lower()
     finally:
-        test_ds.pm.unregister(plugin, name="test_plugin")
+        pm.unregister(plugin, name="test_plugin")
 @pytest.mark.asyncio
@@ -185,16 +185,15 @@ async def test_child_deny_overrides_parent_allow(test_ds):
             SELECT 'analytics' AS parent, 'sensitive' AS child, 0 AS allow,
                 'child: deny sensitive' AS reason
             """
-            return PermissionSQL(sql=sql)
+            return PermissionSQL(source="test", sql=sql, params={})
         return None

     plugin = PermissionRulesPlugin(rules_callback)
-    test_ds.pm.register(plugin, name="test_plugin")
+    pm.register(plugin, name="test_plugin")
     try:
         actor = {"id": "bob", "role": "analyst"}
-        result = await test_ds.allowed_resources("view-table", actor)
-        tables = result.resources
+        tables = await test_ds.allowed_resources("view-table", actor)

         # Should see analytics tables except sensitive
         analytics_tables = [t for t in tables if t.parent == "analytics"]
@@ -218,7 +217,7 @@ async def test_child_deny_overrides_parent_allow(test_ds):
         )
     finally:
-        test_ds.pm.unregister(plugin, name="test_plugin")
+        pm.unregister(plugin, name="test_plugin")
 @pytest.mark.asyncio
@@ -234,16 +233,15 @@ async def test_child_allow_overrides_parent_deny(test_ds):
             SELECT 'production' AS parent, 'orders' AS child, 1 AS allow,
                 'child: carol can see orders' AS reason
             """
-            return PermissionSQL(sql=sql)
+            return PermissionSQL(source="test", sql=sql, params={})
         return None

     plugin = PermissionRulesPlugin(rules_callback)
-    test_ds.pm.register(plugin, name="test_plugin")
+    pm.register(plugin, name="test_plugin")
     try:
         actor = {"id": "carol"}
-        result = await test_ds.allowed_resources("view-table", actor)
-        tables = result.resources
+        tables = await test_ds.allowed_resources("view-table", actor)

         # Should only see production.orders
         production_tables = [t for t in tables if t.parent == "production"]
@@ -263,7 +261,7 @@ async def test_child_allow_overrides_parent_deny(test_ds):
         )
     finally:
-        test_ds.pm.unregister(plugin, name="test_plugin")
+        pm.unregister(plugin, name="test_plugin")
 @pytest.mark.asyncio
@@ -284,10 +282,10 @@ async def test_sql_does_filtering_not_python(test_ds):
             SELECT 'analytics' AS parent, 'users' AS child, 1 AS allow,
                 'specific allow' AS reason
             """
-            return PermissionSQL(sql=sql)
+            return PermissionSQL(source="test", sql=sql, params={})

     plugin = PermissionRulesPlugin(rules_callback)
-    test_ds.pm.register(plugin, name="test_plugin")
+    pm.register(plugin, name="test_plugin")
     try:
         actor = {"id": "dave"}
@@ -306,11 +304,66 @@ async def test_sql_does_filtering_not_python(test_ds):
         )

         # allowed_resources() should also use SQL filtering
-        result = await test_ds.allowed_resources("view-table", actor)
-        tables = result.resources
+        tables = await test_ds.allowed_resources("view-table", actor)
         assert len(tables) == 1
         assert tables[0].parent == "analytics"
         assert tables[0].child == "users"
     finally:
-        test_ds.pm.unregister(plugin, name="test_plugin")
+        pm.unregister(plugin, name="test_plugin")
+
+
+@pytest.mark.asyncio
+async def test_no_permission_rules_returns_correct_schema():
+    """
+    Test that when no permission rules exist, the empty result has correct schema.
+
+    This is a regression test for a bug where the empty result returned only
+    2 columns (parent, child) instead of the documented 3 columns
+    (parent, child, reason), causing schema mismatches.
+
+    See: https://github.com/simonw/datasette/pull/2515#discussion_r2457803901
+    """
+    from datasette.utils.actions_sql import build_allowed_resources_sql
+
+    # Create a fresh datasette instance
+    ds = Datasette()
+    await ds.invoke_startup()
+
+    # Add a test database
+    db = ds.add_memory_database("testdb")
+    await db.execute_write(
+        "CREATE TABLE IF NOT EXISTS test_table (id INTEGER PRIMARY KEY)"
+    )
+    await ds._refresh_schemas()
+
+    # Temporarily block all permission_resources_sql hooks to simulate no rules
+    original_hook = pm.hook.permission_resources_sql
+
+    def empty_hook(*args, **kwargs):
+        return []
+
+    pm.hook.permission_resources_sql = empty_hook
+    try:
+        # Call build_allowed_resources_sql directly which will hit the no-rules code path
+        sql, params = await build_allowed_resources_sql(
+            ds, actor={"id": "nobody"}, action="view-table"
+        )
+        # Execute the query to verify it has correct column structure
+        result = await ds.get_internal_database().execute(sql, params)
+
+        # Should have 3 columns: parent, child, reason
+        # This assertion would fail if the empty result only had 2 columns
+        assert (
+            len(result.columns) == 3
+        ), f"Expected 3 columns, got {len(result.columns)}: {result.columns}"
+        assert result.columns == ["parent", "child", "reason"]
+
+        # Should have no rows (no rules = no access)
+        assert len(result.rows) == 0
+    finally:
+        # Restore original hook
+        pm.hook.permission_resources_sql = original_hook
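The schema fix above sits next to a related SQL gotcha noted in the commit messages: `NULL = NULL` evaluates to NULL (falsy in a join condition), so joining permission rows on a nullable `parent`/`child` key silently drops database-level rules. A minimal, self-contained sketch of that behavior (plain `sqlite3`, not Datasette's actual query):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# NULL = NULL is not true in SQL...
assert conn.execute("SELECT NULL = NULL").fetchone()[0] is None
# ...but NULL IS NULL is
assert conn.execute("SELECT NULL IS NULL").fetchone()[0] == 1

# An equality join therefore fails to match rows whose key columns are NULL:
conn.execute("CREATE TABLE rules (parent TEXT, child TEXT, allow INTEGER)")
conn.execute("INSERT INTO rules VALUES (NULL, NULL, 1)")  # database-level rule
matched = conn.execute(
    """
    SELECT count(*) FROM rules a
    JOIN rules b ON a.parent = b.parent AND a.child = b.child
    """
).fetchone()[0]
assert matched == 0  # the NULL-keyed row never matches itself

# Rewriting the join with IS makes NULL keys compare as equal:
matched_is = conn.execute(
    """
    SELECT count(*) FROM rules a
    JOIN rules b ON a.parent IS b.parent AND a.child IS b.child
    """
).fetchone()[0]
assert matched_is == 1
```

This is why a `JOIN ... USING (parent, child)` over rule rows needs NULL-safe comparison when database-level rules use NULL for the missing child.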


@@ -1,133 +0,0 @@
-"""
-Test for actor restrictions bug with database-level config.
-
-This test currently FAILS, demonstrating the bug where database-level
-config allow blocks can bypass table-level restrictions.
-"""
-
-import pytest
-from datasette.app import Datasette
-from datasette.resources import TableResource
-
-
-@pytest.mark.asyncio
-async def test_table_restrictions_not_bypassed_by_database_level_config():
-    """
-    Actor restrictions should act as hard limits that config cannot override.
-
-    BUG: When an actor has table-level restrictions (e.g., only table2 and table3)
-    but config has a database-level allow block, the database-level config rule
-    currently allows ALL tables, not just those in the restriction allowlist.
-
-    This test documents the expected behavior and will FAIL until the bug is fixed.
-    """
-    # Config grants access at DATABASE level (not table level)
-    config = {
-        "databases": {
-            "test_db_rnbbdlc": {
-                "allow": {
-                    "id": "user"
-                }  # Database-level allow - grants access to all tables
-            }
-        }
-    }
-    ds = Datasette(config=config)
-    await ds.invoke_startup()
-    db = ds.add_memory_database("test_db_rnbbdlc")
-    await db.execute_write("create table table1 (id integer primary key)")
-    await db.execute_write("create table table2 (id integer primary key)")
-    await db.execute_write("create table table3 (id integer primary key)")
-    await db.execute_write("create table table4 (id integer primary key)")
-
-    # Actor restricted to ONLY table2 and table3
-    # Even though config allows the whole database, restrictions should limit access
-    actor = {
-        "id": "user",
-        "_r": {
-            "r": {  # Resource-level (table-level) restrictions
-                "test_db_rnbbdlc": {
-                    "table2": ["vt"],  # vt = view-table abbreviation
-                    "table3": ["vt"],
-                }
-            }
-        },
-    }
-
-    # table2 should be allowed (in restriction allowlist AND config allows)
-    result = await ds.allowed(
-        action="view-table",
-        resource=TableResource("test_db_rnbbdlc", "table2"),
-        actor=actor,
-    )
-    assert result is True, "table2 should be allowed - in restriction allowlist"
-
-    # table3 should be allowed (in restriction allowlist AND config allows)
-    result = await ds.allowed(
-        action="view-table",
-        resource=TableResource("test_db_rnbbdlc", "table3"),
-        actor=actor,
-    )
-    assert result is True, "table3 should be allowed - in restriction allowlist"
-
-    # table1 should be DENIED (NOT in restriction allowlist)
-    # Even though database-level config allows it, restrictions should deny it
-    result = await ds.allowed(
-        action="view-table",
-        resource=TableResource("test_db_rnbbdlc", "table1"),
-        actor=actor,
-    )
-    assert (
-        result is False
-    ), "table1 should be DENIED - not in restriction allowlist, config cannot override"
-
-    # table4 should be DENIED (NOT in restriction allowlist)
-    # Even though database-level config allows it, restrictions should deny it
-    result = await ds.allowed(
-        action="view-table",
-        resource=TableResource("test_db_rnbbdlc", "table4"),
-        actor=actor,
-    )
-    assert (
-        result is False
-    ), "table4 should be DENIED - not in restriction allowlist, config cannot override"
-
-
-@pytest.mark.asyncio
-async def test_database_restrictions_with_database_level_config():
-    """
-    Verify that database-level restrictions work correctly with database-level config.
-
-    This should pass - it's testing the case where restriction granularity
-    matches config granularity.
-    """
-    config = {
-        "databases": {"test_db_rwdl": {"allow": {"id": "user"}}}  # Database-level allow
-    }
-    ds = Datasette(config=config)
-    await ds.invoke_startup()
-    db = ds.add_memory_database("test_db_rwdl")
-    await db.execute_write("create table table1 (id integer primary key)")
-    await db.execute_write("create table table2 (id integer primary key)")
-
-    # Actor has database-level restriction (all tables in test_db_rwdl)
-    actor = {
-        "id": "user",
-        "_r": {"d": {"test_db_rwdl": ["vt"]}},  # Database-level restrictions
-    }
-
-    # Both tables should be allowed (database-level restriction matches database-level config)
-    result = await ds.allowed(
-        action="view-table",
-        resource=TableResource("test_db_rwdl", "table1"),
-        actor=actor,
-    )
-    assert result is True, "table1 should be allowed"
-
-    result = await ds.allowed(
-        action="view-table",
-        resource=TableResource("test_db_rwdl", "table2"),
-        actor=actor,
-    )
-    assert result is True, "table2 should be allowed"
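The semantics that the removed test file documents — actor restrictions are a hard upper bound that a config `allow` block can never widen — amount to an AND gate. A hypothetical sketch of that rule (illustrative helper only, not Datasette's implementation):

```python
# Hypothetical sketch: restrictions intersect with, never extend, config access.
def effective_allow(config_allows, restriction_allowlist, table):
    """A table is visible only if config allows it AND, when the actor
    carries a restriction allowlist, the table is in that allowlist."""
    if not config_allows:
        return False
    if restriction_allowlist is None:  # actor carries no restrictions
        return True
    return table in restriction_allowlist

# Database-level config allow, actor restricted to table2/table3:
allowlist = {"table2", "table3"}
assert effective_allow(True, allowlist, "table2") is True
assert effective_allow(True, allowlist, "table1") is False  # config cannot override
assert effective_allow(True, None, "table1") is True  # unrestricted actor
```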


@@ -1,5 +1,6 @@
 from datasette.app import Datasette
 from datasette.plugins import DEFAULT_PLUGINS
+from datasette.utils.sqlite import supports_table_xinfo
 from datasette.version import __version__
 from .fixtures import (  # noqa
     app_client,
@@ -833,41 +834,6 @@ async def test_versions_json(ds_client):
     assert data["sqlite"]["extensions"]["json1"]

-
-@pytest.mark.asyncio
-async def test_actions_json(ds_client):
-    original_root_enabled = ds_client.ds.root_enabled
-    try:
-        ds_client.ds.root_enabled = True
-        cookies = {"ds_actor": ds_client.actor_cookie({"id": "root"})}
-        response = await ds_client.get("/-/actions.json", cookies=cookies)
-        data = response.json()
-    finally:
-        ds_client.ds.root_enabled = original_root_enabled
-    assert isinstance(data, list)
-    assert len(data) > 0
-    # Check structure of first action
-    action = data[0]
-    for key in (
-        "name",
-        "abbr",
-        "description",
-        "takes_parent",
-        "takes_child",
-        "resource_class",
-        "also_requires",
-    ):
-        assert key in action
-    # Check that some expected actions exist
-    action_names = {a["name"] for a in data}
-    for expected_action in (
-        "view-instance",
-        "view-database",
-        "view-table",
-        "execute-sql",
-    ):
-        assert expected_action in action_names

 @pytest.mark.asyncio
 async def test_settings_json(ds_client):
     response = await ds_client.get("/-/settings.json")
@@ -875,7 +841,7 @@ async def test_settings_json(ds_client):
         "default_page_size": 50,
         "default_facet_size": 30,
         "default_allow_sql": True,
-        "facet_suggest_time_limit_ms": 200,
+        "facet_suggest_time_limit_ms": 50,
         "facet_time_limit_ms": 200,
         "max_returned_rows": 100,
         "max_insert_rows": 100,


@@ -1,9 +1,11 @@
 from bs4 import BeautifulSoup as Soup
+from .fixtures import app_client
 from .utils import cookie_was_deleted, last_event
 from click.testing import CliRunner
 from datasette.utils import baseconv
 from datasette.cli import cli
 from datasette.resources import (
+    InstanceResource,
     DatabaseResource,
     TableResource,
 )
@@ -356,12 +358,18 @@ async def test_root_with_root_enabled_gets_all_permissions(ds_client):
     # Test instance-level permissions (no resource)
     assert (
-        await ds_client.ds.allowed(action="permissions-debug", actor=root_actor) is True
+        await ds_client.ds.permission_allowed(root_actor, "permissions-debug", None)
+        is True
     )
-    assert await ds_client.ds.allowed(action="debug-menu", actor=root_actor) is True
+    assert await ds_client.ds.permission_allowed(root_actor, "debug-menu", None) is True

     # Test view permissions using the new ds.allowed() method
-    assert await ds_client.ds.allowed(action="view-instance", actor=root_actor) is True
+    assert (
+        await ds_client.ds.allowed(
+            action="view-instance", resource=InstanceResource(), actor=root_actor
+        )
+        is True
+    )
     assert (
         await ds_client.ds.allowed(
@@ -454,7 +462,10 @@ async def test_root_without_root_enabled_no_special_permissions(ds_client):
     # View permissions should still work (default=True)
     assert (
-        await ds_client.ds.allowed(action="view-instance", actor=root_actor) is True
+        await ds_client.ds.allowed(
+            action="view-instance", resource=InstanceResource(), actor=root_actor
+        )
+        is True
     )  # Default permission
     assert (
@@ -468,7 +479,9 @@ async def test_root_without_root_enabled_no_special_permissions(ds_client):
     # But restricted permissions should NOT automatically be granted
     # Test with instance-level permission (no resource class)
-    result = await ds_client.ds.allowed(action="permissions-debug", actor=root_actor)
+    result = await ds_client.ds.permission_allowed(
+        root_actor, "permissions-debug", None
+    )
     assert (
         result is not True
     ), "Root without root_enabled should not automatically get permissions-debug"

tests/test_black.py (new file)

@@ -0,0 +1,11 @@
+import black
+from click.testing import CliRunner
+from pathlib import Path
+
+code_root = Path(__file__).parent.parent
+
+
+def test_black():
+    runner = CliRunner()
+    result = runner.invoke(black.main, [str(code_root), "--check"])
+    assert result.exit_code == 0, result.output


@@ -2,7 +2,7 @@ from bs4 import BeautifulSoup as Soup
 import json
 import pytest
 import re
-from .fixtures import make_app_client
+from .fixtures import make_app_client, app_client


 @pytest.fixture


@@ -1,4 +1,5 @@
 from .fixtures import (
+    app_client,
     make_app_client,
     TestClient as _TestClient,
     EXPECTED_PLUGINS,
@@ -142,12 +143,10 @@ def test_metadata_yaml():
         settings=[],
         secret=None,
         root=False,
-        default_deny=False,
         token=None,
         actor=None,
         version_note=None,
         get=None,
-        headers=False,
         help_settings=False,
         pdb=False,
         crossdb=False,
@@ -449,6 +448,17 @@ def test_serve_duplicate_database_names(tmpdir):
     assert {db["name"] for db in databases} == {"db", "db_2"}


+def test_serve_deduplicate_same_database_path(tmpdir):
+    "'datasette db.db db.db' should only attach one database, /db"
+    runner = CliRunner()
+    db_path = str(tmpdir / "db.db")
+    sqlite3.connect(db_path).execute("vacuum")
+    result = runner.invoke(cli, [db_path, db_path, "--get", "/-/databases.json"])
+    assert result.exit_code == 0, result.output
+    databases = json.loads(result.output)
+    assert {db["name"] for db in databases} == {"db"}
+
+
 @pytest.mark.parametrize(
     "filename", ["test-database (1).sqlite", "database (1).sqlite"]
 )
@@ -487,57 +497,3 @@ def test_internal_db(tmpdir):
     )
     assert result.exit_code == 0
     assert internal_path.exists()
-
-
-def test_duplicate_database_files_error(tmpdir):
-    """Test that passing the same database file multiple times raises an error"""
-    runner = CliRunner()
-    db_path = str(tmpdir / "test.db")
-    sqlite3.connect(db_path).execute("vacuum")
-
-    # Test with exact duplicate
-    result = runner.invoke(cli, ["serve", db_path, db_path, "--get", "/"])
-    assert result.exit_code == 1
-    assert "Duplicate database file" in result.output
-    assert "both refer to" in result.output
-
-    # Test with different paths to same file (relative vs absolute)
-    result2 = runner.invoke(
-        cli, ["serve", db_path, str(pathlib.Path(db_path).resolve()), "--get", "/"]
-    )
-    assert result2.exit_code == 1
-    assert "Duplicate database file" in result2.output
-
-    # Test that a file in the config_dir can't also be passed explicitly
-    config_dir = tmpdir / "config"
-    config_dir.mkdir()
-    config_db_path = str(config_dir / "data.db")
-    sqlite3.connect(config_db_path).execute("vacuum")
-    result3 = runner.invoke(
-        cli, ["serve", config_db_path, str(config_dir), "--get", "/"]
-    )
-    assert result3.exit_code == 1
-    assert "Duplicate database file" in result3.output
-    assert "both refer to" in result3.output
-
-    # Test that mixing a file NOT in the directory with a directory works fine
-    other_db_path = str(tmpdir / "other.db")
-    sqlite3.connect(other_db_path).execute("vacuum")
-    result4 = runner.invoke(
-        cli, ["serve", other_db_path, str(config_dir), "--get", "/-/databases.json"]
-    )
-    assert result4.exit_code == 0
-    databases = json.loads(result4.output)
-    assert {db["name"] for db in databases} == {"other", "data"}
-
-    # Test that multiple directories raise an error
-    config_dir2 = tmpdir / "config2"
-    config_dir2.mkdir()
-    result5 = runner.invoke(
-        cli, ["serve", str(config_dir), str(config_dir2), "--get", "/"]
-    )
-    assert result5.exit_code == 1
-    assert "Cannot pass multiple directories" in result5.output
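This diff swaps the hard "Duplicate database file" error for silent deduplication: `datasette db.db db.db` now attaches a single `/db`. The core of that behavior can be sketched as resolving each path and keeping the first occurrence (an assumed simplification, not Datasette's exact code):

```python
from pathlib import Path

# Sketch (assumed behavior): deduplicate database files by resolved
# filesystem path, preserving the order of first appearance, so that
# "datasette db.db db.db" attaches only one database.
def dedupe_database_paths(paths):
    seen = set()
    unique = []
    for p in paths:
        resolved = Path(p).resolve()  # normalizes relative vs absolute forms
        if resolved not in seen:
            seen.add(resolved)
            unique.append(p)
    return unique

assert dedupe_database_paths(["db.db", "db.db"]) == ["db.db"]
assert dedupe_database_paths(["a.db", "./a.db", "b.db"]) == ["a.db", "b.db"]
```

Resolving before comparison is what catches the relative-versus-absolute case the removed test used to exercise.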


@@ -52,26 +52,6 @@ def test_serve_with_get(tmp_path_factory):
         pm.unregister(to_unregister)


-def test_serve_with_get_headers():
-    runner = CliRunner()
-    result = runner.invoke(
-        cli,
-        [
-            "serve",
-            "--memory",
-            "--get",
-            "/_memory/",
-            "--headers",
-        ],
-    )
-    # exit_code is 1 because it wasn't a 200 response
-    assert result.exit_code == 1, result.output
-    lines = result.output.splitlines()
-    assert lines and lines[0] == "HTTP/1.1 302"
-    assert "location: /_memory" in lines
-    assert "content-type: text/html; charset=utf-8" in lines


 def test_serve_with_get_and_token():
     runner = CliRunner()
     result1 = runner.invoke(


@@ -2,7 +2,6 @@ import pytest
 from datasette.app import Datasette
 from datasette.database import Database
-from datasette.resources import DatabaseResource, TableResource


 async def setup_datasette(config=None, databases=None):
@@ -19,16 +18,8 @@ async def test_root_permissions_allow():
     config = {"permissions": {"execute-sql": {"id": "alice"}}}
     ds = await setup_datasette(config=config, databases=["content"])
-    assert await ds.allowed(
-        action="execute-sql",
-        resource=DatabaseResource(database="content"),
-        actor={"id": "alice"},
-    )
-    assert not await ds.allowed(
-        action="execute-sql",
-        resource=DatabaseResource(database="content"),
-        actor={"id": "bob"},
-    )
+    assert await ds.permission_allowed_2({"id": "alice"}, "execute-sql", "content")
+    assert not await ds.permission_allowed_2({"id": "bob"}, "execute-sql", "content")


 @pytest.mark.asyncio
@@ -44,15 +35,11 @@ async def test_database_permission():
     }
     ds = await setup_datasette(config=config, databases=["content"])
-    assert await ds.allowed(
-        action="insert-row",
-        resource=TableResource(database="content", table="repos"),
-        actor={"id": "alice"},
+    assert await ds.permission_allowed_2(
+        {"id": "alice"}, "insert-row", ("content", "repos")
     )
-    assert not await ds.allowed(
-        action="insert-row",
-        resource=TableResource(database="content", table="repos"),
-        actor={"id": "bob"},
+    assert not await ds.permission_allowed_2(
+        {"id": "bob"}, "insert-row", ("content", "repos")
     )
@@ -67,15 +54,11 @@ async def test_table_permission():
     }
     ds = await setup_datasette(config=config, databases=["content"])
-    assert await ds.allowed(
-        action="delete-row",
-        resource=TableResource(database="content", table="repos"),
-        actor={"id": "alice"},
+    assert await ds.permission_allowed_2(
+        {"id": "alice"}, "delete-row", ("content", "repos")
     )
-    assert not await ds.allowed(
-        action="delete-row",
-        resource=TableResource(database="content", table="repos"),
-        actor={"id": "bob"},
+    assert not await ds.permission_allowed_2(
+        {"id": "bob"}, "delete-row", ("content", "repos")
     )
@@ -86,20 +69,14 @@ async def test_view_table_allow_block():
     }
     ds = await setup_datasette(config=config, databases=["content"])
-    assert await ds.allowed(
-        action="view-table",
-        resource=TableResource(database="content", table="repos"),
-        actor={"id": "alice"},
+    assert await ds.permission_allowed_2(
+        {"id": "alice"}, "view-table", ("content", "repos")
     )
-    assert not await ds.allowed(
-        action="view-table",
-        resource=TableResource(database="content", table="repos"),
-        actor={"id": "bob"},
+    assert not await ds.permission_allowed_2(
+        {"id": "bob"}, "view-table", ("content", "repos")
     )
-    assert await ds.allowed(
-        action="view-table",
-        resource=TableResource(database="content", table="other"),
-        actor={"id": "bob"},
+    assert await ds.permission_allowed_2(
+        {"id": "bob"}, "view-table", ("content", "other")
     )
@@ -108,10 +85,8 @@ async def test_view_table_allow_false_blocks():
     config = {"databases": {"content": {"tables": {"repos": {"allow": False}}}}}
     ds = await setup_datasette(config=config, databases=["content"])
-    assert not await ds.allowed(
-        action="view-table",
-        resource=TableResource(database="content", table="repos"),
-        actor={"id": "alice"},
+    assert not await ds.permission_allowed_2(
+        {"id": "alice"}, "view-table", ("content", "repos")
     )
@@ -120,38 +95,18 @@ async def test_allow_sql_blocks():
     config = {"allow_sql": {"id": "alice"}}
     ds = await setup_datasette(config=config, databases=["content"])
-    assert await ds.allowed(
-        action="execute-sql",
-        resource=DatabaseResource(database="content"),
-        actor={"id": "alice"},
-    )
-    assert not await ds.allowed(
-        action="execute-sql",
-        resource=DatabaseResource(database="content"),
-        actor={"id": "bob"},
-    )
+    assert await ds.permission_allowed_2({"id": "alice"}, "execute-sql", "content")
+    assert not await ds.permission_allowed_2({"id": "bob"}, "execute-sql", "content")

     config = {"databases": {"content": {"allow_sql": {"id": "bob"}}}}
     ds = await setup_datasette(config=config, databases=["content"])
-    assert await ds.allowed(
-        action="execute-sql",
-        resource=DatabaseResource(database="content"),
-        actor={"id": "bob"},
-    )
-    assert not await ds.allowed(
-        action="execute-sql",
-        resource=DatabaseResource(database="content"),
-        actor={"id": "alice"},
-    )
+    assert await ds.permission_allowed_2({"id": "bob"}, "execute-sql", "content")
+    assert not await ds.permission_allowed_2({"id": "alice"}, "execute-sql", "content")

     config = {"allow_sql": False}
     ds = await setup_datasette(config=config, databases=["content"])
-    assert not await ds.allowed(
-        action="execute-sql",
-        resource=DatabaseResource(database="content"),
-        actor={"id": "alice"},
-    )
+    assert not await ds.permission_allowed_2({"id": "alice"}, "execute-sql", "content")


 @pytest.mark.asyncio
@@ -159,5 +114,5 @@ async def test_view_instance_allow_block():
     config = {"allow": {"id": "alice"}}
     ds = await setup_datasette(config=config)
-    assert await ds.allowed(action="view-instance", actor={"id": "alice"})
-    assert not await ds.allowed(action="view-instance", actor={"id": "bob"})
+    assert await ds.permission_allowed_2({"id": "alice"}, "view-instance")
+    assert not await ds.permission_allowed_2({"id": "bob"}, "view-instance")
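All of the config blocks exercised in this file (`allow`, `allow_sql`, per-table `allow`) share the same matching rule: an allow block grants access when any of its keys matches the actor. A simplified sketch of that rule, modeled on Datasette's documented allow-block semantics (the real helper is `datasette.utils.actor_matches_allow`; this version omits special keys like `unauthenticated`):

```python
# Simplified sketch of allow-block matching; not the production implementation.
def actor_matches_allow(actor, allow):
    if allow is True:
        return True
    if allow is False or allow is None:
        return False  # allow: false blocks everyone
    actor = actor or {}
    for key, values in allow.items():
        if not isinstance(values, list):
            values = [values]  # {"id": "alice"} behaves like {"id": ["alice"]}
        if actor.get(key) in values:
            return True  # any matching key grants access
    return False

assert actor_matches_allow({"id": "alice"}, {"id": "alice"}) is True
assert actor_matches_allow({"id": "bob"}, {"id": "alice"}) is False
assert actor_matches_allow({"id": "bob"}, {"id": ["alice", "bob"]}) is True
assert actor_matches_allow({"id": "alice"}, False) is False
```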


@@ -2,6 +2,7 @@ from datasette.cli import cli
 from click.testing import CliRunner
 import urllib
 import sqlite3
+from .fixtures import app_client_two_attached_databases_crossdb_enabled


 def test_crossdb_join(app_client_two_attached_databases_crossdb_enabled):


@@ -97,10 +97,3 @@ def test_custom_route_pattern_404(custom_pages_client):
     assert response.status == 404
     assert "<h1>Error 404</h1>" in response.text
     assert ">Oh no</" in response.text
-
-
-def test_custom_route_pattern_with_slash_slash_302(custom_pages_client):
-    # https://github.com/simonw/datasette/issues/2429
-    response = custom_pages_client.get("//example.com/")
-    assert response.status == 302
-    assert response.headers["location"] == "/example.com"
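The removed test guarded against a subtle issue: a path beginning with `//` is treated by browsers as a protocol-relative URL, so linking or redirecting to it can send users off-site. A minimal sketch of the kind of normalization involved (illustrative only; per the removed assertion, the real redirect also drops the trailing slash):

```python
# Collapse leading double slashes so a location stays on the same host.
# "//example.com/" would otherwise be read by browsers as an external URL.
def safe_location(path):
    while path.startswith("//"):
        path = path[1:]
    return path

assert safe_location("//example.com/") == "/example.com/"
assert safe_location("/normal/path") == "/normal/path"
```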


@ -1,129 +0,0 @@
import pytest
from datasette.app import Datasette
from datasette.resources import DatabaseResource, TableResource
@pytest.mark.asyncio
async def test_default_deny_denies_default_permissions():
"""Test that default_deny=True denies default permissions"""
# Without default_deny, anonymous users can view instance/database/tables
ds_normal = Datasette()
await ds_normal.invoke_startup()
# Add a test database
db = ds_normal.add_memory_database("test_db_normal")
await db.execute_write("create table test_table (id integer primary key)")
await ds_normal._refresh_schemas() # Trigger catalog refresh
# Test default behavior - anonymous user should be able to view
response = await ds_normal.client.get("/")
assert response.status_code == 200
response = await ds_normal.client.get("/test_db_normal")
assert response.status_code == 200
response = await ds_normal.client.get("/test_db_normal/test_table")
assert response.status_code == 200
    # With default_deny=True, anonymous users should be denied
    ds_deny = Datasette(default_deny=True)
    await ds_deny.invoke_startup()
    # Add the same test database
    db = ds_deny.add_memory_database("test_db_deny")
    await db.execute_write("create table test_table (id integer primary key)")
    await ds_deny._refresh_schemas()  # Trigger catalog refresh
    # Anonymous user should be denied
    response = await ds_deny.client.get("/")
    assert response.status_code == 403
    response = await ds_deny.client.get("/test_db_deny")
    assert response.status_code == 403
    response = await ds_deny.client.get("/test_db_deny/test_table")
    assert response.status_code == 403


@pytest.mark.asyncio
async def test_default_deny_with_root_user():
    """Test that root user still has access when default_deny=True"""
    ds = Datasette(default_deny=True)
    ds.root_enabled = True
    await ds.invoke_startup()
    root_actor = {"id": "root"}
    # Root user should have all permissions even with default_deny
    assert await ds.allowed(action="view-instance", actor=root_actor) is True
    assert (
        await ds.allowed(
            action="view-database",
            actor=root_actor,
            resource=DatabaseResource("test_db"),
        )
        is True
    )
    assert (
        await ds.allowed(
            action="view-table",
            actor=root_actor,
            resource=TableResource("test_db", "test_table"),
        )
        is True
    )
    assert (
        await ds.allowed(
            action="execute-sql", actor=root_actor, resource=DatabaseResource("test_db")
        )
        is True
    )


@pytest.mark.asyncio
async def test_default_deny_with_config_allow():
    """Test that config allow rules still work with default_deny=True"""
    ds = Datasette(default_deny=True, config={"allow": {"id": "user1"}})
    await ds.invoke_startup()
    # Anonymous user should be denied
    assert await ds.allowed(action="view-instance", actor=None) is False
    # Authenticated user with explicit permission should have access
    assert await ds.allowed(action="view-instance", actor={"id": "user1"}) is True
    # Different user should be denied
    assert await ds.allowed(action="view-instance", actor={"id": "user2"}) is False


@pytest.mark.asyncio
async def test_default_deny_basic_permissions():
    """Test that default_deny=True denies basic permissions"""
    ds = Datasette(default_deny=True)
    await ds.invoke_startup()
    # Anonymous user should be denied all default permissions
    assert await ds.allowed(action="view-instance", actor=None) is False
    assert (
        await ds.allowed(
            action="view-database", actor=None, resource=DatabaseResource("test_db")
        )
        is False
    )
    assert (
        await ds.allowed(
            action="view-table",
            actor=None,
            resource=TableResource("test_db", "test_table"),
        )
        is False
    )
    assert (
        await ds.allowed(
            action="execute-sql", actor=None, resource=DatabaseResource("test_db")
        )
        is False
    )
    # Authenticated user without explicit permission should also be denied
    assert await ds.allowed(action="view-instance", actor={"id": "user"}) is False
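The cascade these tests depend on (per the commit messages: deeper rules win, and a deny beats an allow at the same depth, so a database-level deny overrides root's global allow) can be sketched independently of Datasette. This is a hypothetical toy model — `Rule`, `resolve`, and the depth numbering are made up for illustration, not Datasette's actual API:

```python
# Hypothetical sketch of cascading permission resolution. Rules carry a depth
# (0 = instance, 1 = database, 2 = table) and a verdict; the deepest depth with
# any matching rule decides, and a single deny there beats any allow.
from dataclasses import dataclass


@dataclass
class Rule:
    depth: int  # 0 = instance, 1 = database, 2 = table
    allow: bool


def resolve(rules, default_deny=False):
    if not rules:
        # No matching rules: fall back to the instance default
        return not default_deny
    deepest = max(r.depth for r in rules)
    at_depth = [r for r in rules if r.depth == deepest]
    # Deny beats allow at the same depth
    return all(r.allow for r in at_depth)


# Root's global allow (depth 0) is overridden by a database-level deny (depth 1):
assert resolve([Rule(0, True), Rule(1, False)]) is False
# A lone global allow still allows:
assert resolve([Rule(0, True)]) is True
# With default_deny and no rules, access is refused:
assert resolve([], default_deny=True) is False
```

Under this model a table-level allow would in turn override a database-level deny, since depth 2 is considered before depth 1.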


@@ -28,10 +28,9 @@ def settings_headings():
     return get_headings((docs_path / "settings.rst").read_text(), "~")


-def test_settings_are_documented(settings_headings, subtests):
-    for setting in app.SETTINGS:
-        with subtests.test(setting=setting.name):
-            assert setting.name in settings_headings
+@pytest.mark.parametrize("setting", app.SETTINGS)
+def test_settings_are_documented(settings_headings, setting):
+    assert setting.name in settings_headings


 @pytest.fixture(scope="session")
@@ -39,21 +38,21 @@ def plugin_hooks_content():
     return (docs_path / "plugin_hooks.rst").read_text()


-def test_plugin_hooks_are_documented(plugin_hooks_content, subtests):
+@pytest.mark.parametrize(
+    "plugin", [name for name in dir(app.pm.hook) if not name.startswith("_")]
+)
+def test_plugin_hooks_are_documented(plugin, plugin_hooks_content):
     headings = set()
     headings.update(get_headings(plugin_hooks_content, "-"))
     headings.update(get_headings(plugin_hooks_content, "~"))
-    plugins = [name for name in dir(app.pm.hook) if not name.startswith("_")]
-    for plugin in plugins:
-        with subtests.test(plugin=plugin):
-            assert plugin in headings
-            hook_caller = getattr(app.pm.hook, plugin)
-            arg_names = [a for a in hook_caller.spec.argnames if a != "__multicall__"]
-            # Check for plugin_name(arg1, arg2, arg3)
-            expected = f"{plugin}({', '.join(arg_names)})"
-            assert (
-                expected in plugin_hooks_content
-            ), f"Missing from plugin hook documentation: {expected}"
+    assert plugin in headings
+    hook_caller = getattr(app.pm.hook, plugin)
+    arg_names = [a for a in hook_caller.spec.argnames if a != "__multicall__"]
+    # Check for plugin_name(arg1, arg2, arg3)
+    expected = f"{plugin}({', '.join(arg_names)})"
+    assert (
+        expected in plugin_hooks_content
+    ), f"Missing from plugin hook documentation: {expected}"


 @pytest.fixture(scope="session")
@@ -69,11 +68,9 @@ def documented_views():
     return view_labels


-def test_view_classes_are_documented(documented_views, subtests):
-    view_classes = [v for v in dir(app) if v.endswith("View")]
-    for view_class in view_classes:
-        with subtests.test(view_class=view_class):
-            assert view_class in documented_views
+@pytest.mark.parametrize("view_class", [v for v in dir(app) if v.endswith("View")])
+def test_view_classes_are_documented(documented_views, view_class):
+    assert view_class in documented_views


 @pytest.fixture(scope="session")
@@ -88,10 +85,9 @@ def documented_table_filters():
     }


-def test_table_filters_are_documented(documented_table_filters, subtests):
-    for f in Filters._filters:
-        with subtests.test(filter=f.key):
-            assert f.key in documented_table_filters
+@pytest.mark.parametrize("filter", [f.key for f in Filters._filters])
+def test_table_filters_are_documented(documented_table_filters, filter):
+    assert filter in documented_table_filters


 @pytest.fixture(scope="session")
@@ -105,69 +101,9 @@ def documented_fns():
     }


-def test_functions_marked_with_documented_are_documented(documented_fns, subtests):
-    for fn in utils.functions_marked_as_documented:
-        with subtests.test(fn=fn.__name__):
-            assert fn.__name__ in documented_fns
-
-
-def test_rst_heading_underlines_match_title_length():
-    """Test that RST heading underlines are the same length as their titles."""
-    # Common RST underline characters
-    underline_chars = ["-", "=", "~", "^", "+", "*", "#"]
-    errors = []
-    for rst_file in docs_path.glob("*.rst"):
-        content = rst_file.read_text()
-        lines = content.split("\n")
-        for i in range(len(lines) - 1):
-            current_line = lines[i]
-            next_line = lines[i + 1]
-            # Check if next line is entirely made of a single underline character
-            # and is at least 5 characters long (to avoid false positives)
-            if (
-                next_line
-                and len(next_line) >= 5
-                and len(set(next_line)) == 1
-                and next_line[0] in underline_chars
-            ):
-                # Skip if the previous line is empty (blank line before underline)
-                if not current_line:
-                    continue
-                # Check if this is an overline+underline style heading
-                # Look at the line before current_line to see if it's also an underline
-                if i > 0:
-                    prev_line = lines[i - 1]
-                    if (
-                        prev_line
-                        and len(prev_line) >= 5
-                        and len(set(prev_line)) == 1
-                        and prev_line[0] in underline_chars
-                        and len(prev_line) == len(next_line)
-                    ):
-                        # This is overline+underline style, skip it
-                        continue
-                # This is a heading underline
-                title_length = len(current_line)
-                underline_length = len(next_line)
-                if title_length != underline_length:
-                    errors.append(
-                        f"{rst_file.name}:{i+1}: Title length {title_length} != underline length {underline_length}\n"
-                        f"  Title: {current_line!r}\n"
-                        f"  Underline: {next_line!r}"
-                    )
-    if errors:
-        raise AssertionError(
-            f"Found {len(errors)} RST heading(s) with mismatched underline length:\n\n"
-            + "\n\n".join(errors)
-        )
+@pytest.mark.parametrize("fn", utils.functions_marked_as_documented)
+def test_functions_marked_with_documented_are_documented(documented_fns, fn):
+    assert fn.__name__ in documented_fns


 # Tests for testing_plugins.rst documentation
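The change repeated throughout this file — replacing an internal loop with `subtests.test(...)` by one `@pytest.mark.parametrize` decorator, so pytest generates an independent test per case — can be sketched on a toy example. The `HEADINGS` set and test names here are made up for illustration:

```python
import pytest

# Stand-in for the documented-headings set extracted from the docs
HEADINGS = {"default_page_size", "sql_time_limit_ms"}


# Before: one test iterating internally; every case runs inside a single test item.
def test_settings_documented_loop():
    for name in ["default_page_size", "sql_time_limit_ms"]:
        assert name in HEADINGS


# After: pytest collects one test per parameter, so each case passes or fails
# independently and gets its own readable test ID.
@pytest.mark.parametrize("name", ["default_page_size", "sql_time_limit_ms"])
def test_settings_documented(name):
    assert name in HEADINGS
```

The parametrized form also removes the dependency on the `pytest-subtests` plugin, since `parametrize` is built into pytest.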


@@ -2,6 +2,7 @@
 # -- start datasette_with_plugin_fixture --
 from datasette import hookimpl
 from datasette.app import Datasette
+from datasette.plugins import pm
 import pytest
 import pytest_asyncio

@@ -17,12 +18,11 @@ async def datasette_with_plugin():
             (r"^/error$", lambda: 1 / 0),
         ]

-    datasette = Datasette()
-    datasette.pm.register(TestPlugin(), name="undo")
+    pm.register(TestPlugin(), name="undo")
     try:
-        yield datasette
+        yield Datasette()
     finally:
-        datasette.pm.unregister(name="undo")
+        pm.unregister(name="undo")
 # -- end datasette_with_plugin_fixture --

Some files were not shown because too many files have changed in this diff.