Compare commits

...

2,108 commits

Author SHA1 Message Date
Simon Willison
1d4448fc56
Use subtests in tests/test_docs.py (#2609)
Closes #2608
2025-12-04 21:36:39 -08:00
Simon Willison
2ca00b6c75 Release 1.0a23
Refs #2605, #2599
2025-12-02 19:20:43 -08:00
Simon Willison
03ab359208 tool.uv.package = true 2025-12-02 19:19:48 -08:00
Simon Willison
3eca3ad6d4 Better recipe for 'just docs' 2025-12-02 19:16:39 -08:00
Simon Willison
0a924524be
Split default_permissions.py into a package (#2603)
* Split default_permissions.py into a package, refs #2602

* Remove unused is_resource_allowed() method, improve test coverage

- Remove dead code: is_resource_allowed() method was never called
- Change isinstance check to assertion with error message
- Add test cases for table-level restrictions in restrictions_allow_action()
- Coverage for restrictions.py improved from 79% to 99%

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* Additional permission test for gap spotted by coverage
2025-12-02 19:11:31 -08:00
Simon Willison
170b3ff61c Better fix for stale catalog_databases, closes #2606
Refs #2605
2025-12-02 19:00:13 -08:00
Simon Willison
c6c2a238c3 Fix for stale internal database bug, closes #2605 2025-12-02 16:22:42 -08:00
Simon Willison
68f1179bac Fix for text None shown on /-/actions, closes #2599 2025-11-26 17:12:52 -08:00
Simon Willison
2125115cd9 Release 1.0a22
Refs #2592, #2594, #2595, #2596
2025-11-13 10:41:02 -08:00
Simon Willison
93b455239a Release notes for 1.0a22, closes #2596 2025-11-13 10:40:24 -08:00
Simon Willison
4b4add4d31 datasette.pm property, closes #2595 2025-11-13 10:31:03 -08:00
Simon Willison
5125bef573 datasette.in_client() method, closes #2594 2025-11-13 10:00:04 -08:00
Simon Willison
23a640d38b
datasette serve --default-deny option (#2593)
Closes #2592
2025-11-12 16:14:21 -08:00
dependabot[bot]
32a425868c
Bump black from 25.9.0 to 25.11.0 in the python-packages group (#2590)
Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).


Updates `black` from 25.9.0 to 25.11.0
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/25.9.0...25.11.0)

---
updated-dependencies:
- dependency-name: black
  dependency-version: 25.11.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-12 06:07:16 -08:00
Simon Willison
291f71ec6b
Remove outdated plugin_hook_permission_allowed references 2025-11-11 21:59:26 -08:00
Simon Willison
354d7a2873
Bump a few versions, deploy on push to main
Refs:
- #2511
2025-11-09 15:42:11 -08:00
Simon Willison
a508fc4a8e Remove permission_allowed hook docs, closes #2588
Refs #2528
2025-11-07 16:50:00 -08:00
Simon Willison
8bc9b1ee03
/-/schema and /db/-/schema and /db/table/-/schema pages (plus .json/.md)
* Add schema endpoints for databases, instances, and tables

Closes: #2586

This commit adds new endpoints to view database schemas in multiple formats:

- /-/schema - View schemas for all databases (HTML, JSON, MD)
- /database/-/schema - View schema for a specific database (HTML, JSON, MD)
- /database/table/-/schema - View schema for a specific table (JSON, MD)

Features:
- Supports HTML, JSON, and Markdown output formats
- Respects view-database and view-table permissions
- Uses group_concat(sql, ';' || CHAR(10)) from sqlite_master to retrieve schemas
- Includes comprehensive tests covering all formats and permission checks

The JSON endpoints return:
- Instance level: {"schemas": [{"database": "name", "schema": "sql"}, ...]}
- Database level: {"database": "name", "schema": "sql"}
- Table level: {"database": "name", "table": "name", "schema": "sql"}

Markdown format provides formatted output with headings and SQL code blocks.

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-07 12:01:23 -08:00
Simon Willison
1df4028d78 add_memory_database(memory_name, name=None, route=None) 2025-11-05 15:18:17 -08:00
Simon Willison
257e1c1b1b Release 1.0a21
Refs #2429, #2511, #2578, #2583
2025-11-05 13:51:58 -08:00
Simon Willison
d814e81b32
datasette.client.get(..., skip_permission_checks=True)
Closes #2580
2025-11-05 13:38:01 -08:00
Simon Willison
ec99bb46f8 stable-docs YAML workflow, refs #2582 2025-11-05 10:51:46 -08:00
Simon Willison
3c2254463b
Release notes for 0.65.2
Adding those to main. Refs #2579
2025-11-05 10:25:37 -08:00
Simon Willison
f12f6cc2ab
Get publish cloudrun working with latest Cloud Run (#2581)
Refs:
- #2511

Filter out bad services, refs:
- https://github.com/simonw/datasette/pull/2581#issuecomment-3492243400
2025-11-05 09:28:41 -08:00
Simon Willison
12016342e7 Fix test_metadata_yaml I broke in #2578 2025-11-04 18:40:58 -08:00
Simon Willison
b4385a3ff7 Made test_serve_with_get_headers a bit more forgiving 2025-11-04 18:39:25 -08:00
Simon Willison
ce464da34b datasette --get --headers option, closes #2578 2025-11-04 18:12:15 -08:00
Simon Willison
9f74dc22a8 Run cog with --extra test
Previously it kept on adding stuff to cli-reference.rst
that came from other plugins installed for my global environment
2025-11-04 18:11:24 -08:00
Simon Willison
8b371495dc Move open redirect fix to asgi_send_redirect, refs #2429
See https://github.com/simonw/datasette/pull/2500#issuecomment-3488632278
2025-11-04 17:08:06 -08:00
James Jefferies
f257ca6edb
Fix for open redirect - identified in Issue 2429 (#2500)
* Issue 2429 indicates the possibility of an open redirect

The 404 processing ends up redirecting a request with multiple leading
path slashes to an external site, e.g.

https://my-site//shedcode.co.uk will redirect to https://shedcode.co.uk

This commit uses a regular expression to remove the multiple leading
slashes before redirecting.
2025-11-04 17:04:12 -08:00
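A standalone sketch of the regex idea described in this commit (the helper name is ours; the real fix lives in Datasette's 404/redirect handling):

```python
import re

def collapse_leading_slashes(path):
    # Browsers treat "//shedcode.co.uk" as a protocol-relative URL, so a
    # redirect to it leaves the site entirely. Collapsing the leading run
    # of slashes keeps the redirect on the same host.
    return re.sub(r"^/+", "/", path)

print(collapse_leading_slashes("//shedcode.co.uk"))
```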
Simon Willison
295e4a2e87 Pin to httpx<1.0
Refs https://github.com/encode/httpx/issues/3635
Closes #2576
2025-11-03 15:05:17 -08:00
Simon Willison
95a1fef280 Release 1.0a20
Refs #2488, #2495, #2503, #2505, #2509, #2510, #2513, #2515, #2517, #2519, #2520, #2521,
#2524, #2525, #2526, #2528, #2530, #2531, #2534, #2537, #2543, #2544, #2550, #2551,
#2555, #2558, #2561, #2562, #2564, #2565, #2567, #2569, #2570, #2571, #2574
2025-11-03 14:47:24 -08:00
Simon Willison
dc3f9fe9e4 Python 3.10, not 3.8 2025-11-03 14:42:59 -08:00
Simon Willison
5d4dfcec6b Fix for link from changelog not working
Annoyingly we now get a warning in the docs build about a duplicate label,
but it seems harmless enough.
2025-11-03 14:38:57 -08:00
Simon Willison
b3b8c5831b Fixed some broken reference links on upgrade guide 2025-11-03 14:34:29 -08:00
Simon Willison
b212895b97 Updated release notes for 1.0a20
Refs #2550
2025-11-03 14:27:41 -08:00
Simon Willison
18fd373a8f
New PermissionSQL.restriction_sql mechanism for actor restrictions
Implement INTERSECT-based actor restrictions to prevent permission bypass

Actor restrictions are now implemented as SQL filters using INTERSECT rather
than as deny/allow permission rules. This ensures restrictions act as hard
limits that cannot be overridden by other permission plugins or config blocks.

Previously, actor restrictions (_r in actor dict) were implemented by 
generating permission rules with deny/allow logic. This approach had a 
critical flaw: database-level config allow blocks could bypass table-level 
restrictions, granting access to tables not in the actor's allowlist.

The new approach separates concerns:

- Permission rules determine what's allowed based on config and plugins
- Restriction filters limit the result set to only allowlisted resources
- Restrictions use INTERSECT to ensure all restriction criteria are met
- Database-level restrictions (parent, NULL) properly match all child tables

Implementation details:

- Added restriction_sql field to PermissionSQL dataclass
- Made PermissionSQL.sql optional to support restriction-only plugins
- Updated actor_restrictions_sql() to return restriction filters instead of rules
- Modified SQL builders to apply restrictions via INTERSECT and EXISTS clauses

Closes #2572
2025-11-03 14:17:51 -08:00
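The "hard limit" property of INTERSECT described above can be seen in a toy query; table and column names here are illustrative, not Datasette's internal schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    create table rule_allowed (db text, tbl text);   -- what permission rules grant
    create table restriction (db text, tbl text);    -- the actor's allowlist
    insert into rule_allowed values ('db1', 'a'), ('db1', 'b'), ('db2', 'c');
    insert into restriction values ('db1', 'a');
    """
)
# INTERSECT means the result can never exceed the allowlist, no matter
# how generous the permission rules on the other side are.
allowed = conn.execute(
    "select db, tbl from rule_allowed intersect select db, tbl from restriction"
).fetchall()
print(allowed)
```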
Simon Willison
c76c3e6e6f facet_suggest_time_limit_ms 200ms in tests, closes #2574 2025-11-03 11:52:12 -08:00
Simon Willison
fa978ec100 More upgrade tips, written by Claude Code
Refs #2549

From the datasette-atom upgrade, https://gistpreview.github.io/?d5047e04bbd9c20c59437916e21754ae
2025-11-02 12:02:45 -08:00
Simon Willison
2459285052 Additional upgrade notes by Codex CLI
Refs https://github.com/simonw/datasette/issues/2549#issuecomment-3477398336

Refs #2564
2025-11-01 20:32:42 -07:00
Simon Willison
506ce5b0ac Remove docs for obsolete register_permissions() hook, refs #2528
Also removed docs for datasette.get_permission() method which no longer exists.
2025-11-01 20:23:37 -07:00
Simon Willison
063bf7a96f Action() is kw_only, abbr= is optional, closes #2571 2025-11-01 20:20:17 -07:00
Simon Willison
7e09e1bf1b Removed obsolete actor ID vs. actor dict code, refs #2570 2025-11-01 19:30:56 -07:00
Simon Willison
e37aa37edc Further refactor to collapse some utility functions
Refs #2570
2025-11-01 19:28:31 -07:00
Simon Willison
b8cee8768e Completed upgrade guide, closes #2564 2025-11-01 18:57:56 -07:00
Simon Willison
5c16c6687d Split permissions_resources_sql() into 5 for readability
Also remove an obsolete test that caused trouble with the new split plugin hook.

Closes #2570
2025-11-01 18:38:47 -07:00
Simon Willison
a528555e84
Additional actor restriction should not grant access to additional actions (#2569)
Closes #2568
2025-11-01 18:38:29 -07:00
Simon Willison
2b962beaeb Fix permissions_execute_sql warnings in documentation 2025-11-01 11:52:23 -07:00
Simon Willison
5705ce0d95
Move takes_child/takes_parent information from Action to Resource (#2567)
Simplified Action by moving takes_child/takes_parent logic to Resource

- Removed InstanceResource - global actions are now simply those with resource_class=None
- Resource.parent_class - Replaced parent_name: str with parent_class: type[Resource] | None for direct class references
- Simplified Action dataclass - No more redundant fields, everything is derived from the Resource class structure
- Validation - The __init_subclass__ method now checks parent_class.parent_class to enforce the 2-level hierarchy

Closes #2563
2025-11-01 11:35:08 -07:00
Simon Willison
1f8995e776 upgrade-1.0a20.md, refs #2564
And another Markdown conversion, refs #2565
2025-10-31 19:13:41 -07:00
Simon Willison
47e4060469 Enable MyST Markdown docs, port events.rst, refs #2565 2025-10-31 16:38:04 -07:00
Simon Willison
48982a0ff5 Mark 1.0a20 unreleased
Refs #2550
2025-10-31 16:12:54 -07:00
Simon Willison
223dcc7c0e Remove unused link 2025-10-31 16:11:53 -07:00
Simon Willison
3184bfae54 Release notes for 1.0a20, refs #2550 2025-10-31 15:37:30 -07:00
Simon Willison
e5f392ae7a datasette.allowed_resources_sql() returns namedtuple 2025-10-31 15:07:37 -07:00
Simon Willison
400fa08e4c
Add keyset pagination to allowed_resources() (#2562)
* Add keyset pagination to allowed_resources()

This replaces the unbounded list return with PaginatedResources,
which supports efficient keyset pagination for handling thousands
of resources.

Closes #2560

Changes:
- allowed_resources() now returns PaginatedResources instead of list
- Added limit (1-1000, default 100) and next (keyset token) parameters
- Added include_reasons parameter (replaces allowed_resources_with_reasons)
- Removed allowed_resources_with_reasons() method entirely
- PaginatedResources.all() async generator for automatic pagination
- Uses tilde-encoding for tokens (matching table pagination)
- Updated all callers to use .resources accessor
- Updated documentation with new API and examples

The PaginatedResources object has:
- resources: List of Resource objects for current page
- next: Token for next page (None if no more results)
- all(): Async generator that yields all resources across pages

Example usage:
    page = await ds.allowed_resources("view-table", actor, limit=100)
    for table in page.resources:
        print(table.child)

    # Iterate all pages automatically
    async for table in page.all():
        print(table.child)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-31 14:50:46 -07:00
Simon Willison
b7ef968c6f Fixed some rST labels I broke 2025-10-31 09:15:39 -07:00
Simon Willison
ba654b5576 Forbid same DB passed twice or via config_dir, closes #2561 2025-10-30 21:40:09 -07:00
Simon Willison
e4be95b16c
Update permissions documentation for new action system (#2551) 2025-10-30 17:59:54 -07:00
Simon Willison
87aa798148 Permission tabs include allow debug page
Closes #2559
2025-10-30 17:54:07 -07:00
Simon Willison
6a71bde37f
Permissions SQL API improvements (#2558)
* Neater design for PermissionSQL class, refs #2556
  - source is now automatically set to the source plugin
  - params is optional
* PermissionSQL.allow() and PermissionSQL.deny() shortcuts

Closes #2556

* Filter out temp database from attached_databases()

Refs https://github.com/simonw/datasette/issues/2557#issuecomment-3470510837
2025-10-30 15:48:46 -07:00
Simon Willison
5247856bd4 Filter out temp database from attached_databases()
Refs https://github.com/simonw/datasette/issues/2557#issuecomment-3470510837
2025-10-30 15:48:10 -07:00
Simon Willison
ce4b0794b2
Ported setup.py to pyproject.toml (#2555)
* Ported setup.py to pyproject.toml, refs #2553

* Make fixtures tests less flaky

The in-memory fixtures table was being shared between different
instances of the test client, leading to occasional errors when
running the full test suite.
2025-10-30 10:41:41 -07:00
Simon Willison
53e6a72a95 Move black to YAML, not pytest 2025-10-30 10:40:46 -07:00
Simon Willison
1289eb0589 Fix SQLite locking issue in execute_write_script
The execute_write_script() method was causing SQLite database locking
errors when multiple executescript() calls ran in quick succession.

Root cause: SQLite's executescript() method has special behavior - it
implicitly commits any pending transaction and operates in autocommit
mode. However, execute_write_script() was passing these calls through
execute_write_fn() with the default transaction=True, which wrapped
the executescript() call in a transaction context (with conn:).

This created a conflict where sequential executescript() calls would
cause the second call to fail with "OperationalError: database table
is locked: sqlite_master" because the sqlite_master table was still
locked from the first operation's implicit commit.

Fix: Pass transaction=False to execute_write_fn() since executescript()
manages its own transactions and should not be wrapped in an additional
transaction context.

This was causing test_hook_extra_body_script to fail because the
internal database initialization (which calls executescript twice in
succession) would fail, preventing the application from rendering
pages correctly.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-30 10:30:09 -07:00
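The autocommit behaviour described in this commit is observable with plain `sqlite3`; a minimal sketch of the fixed call pattern (the function name mirrors the commit, but the body is heavily simplified):

```python
import sqlite3

def execute_write_script(conn, sql):
    # The fix: call executescript() directly rather than inside an outer
    # "with conn:" transaction block. executescript() issues an implicit
    # COMMIT and runs in autocommit mode, so wrapping it in another
    # transaction can leave sqlite_master locked for the next call.
    conn.executescript(sql)

conn = sqlite3.connect(":memory:")
execute_write_script(conn, "create table a (id integer);")
execute_write_script(conn, "create table b (id integer);")  # back-to-back call succeeds
```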
Simon Willison
5da3c9f4bd Better display of recent permissions checks, refs #2543 2025-10-30 10:28:04 -07:00
Simon Willison
b018eb3171 Simplified the code for the permission debug pages
Decided not to use as much JavaScript

Used Codex CLI for this. Refs #2543
2025-10-30 10:28:04 -07:00
Simon Willison
73014abe8b Improved permissions UI WIP 2025-10-30 10:28:04 -07:00
Simon Willison
b3721eaf50 Add /-/actions endpoint to list registered actions
This adds a new endpoint at /-/actions that lists all registered actions
in the permission system. The endpoint supports both JSON and HTML output.

Changes:
- Added _actions() method to Datasette class to return action list
- Added route for /-/actions with JsonDataView
- Created actions.html template for nice HTML display
- Added template parameter to JsonDataView for custom templates
- Moved respond_json_or_html from BaseView to JsonDataView
- Added test for the new endpoint

The endpoint requires view-instance permission and provides details about
each action including name, abbreviation, description, resource class,
and parent/child requirements.

Closes #2547

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-26 16:14:58 -07:00
Simon Willison
5c537e0a3e Fix type annotation bugs and remove unused imports
This fixes issues introduced by the ruff commit e57f391a which converted
Optional[x] to x | None:

- Fixed datasette/app.py line 1024: Dict[id | str, Dict] -> Dict[int | str, Dict]
  (was using id built-in function instead of int type)
- Fixed datasette/app.py line 1074: Optional["Resource"] -> "Resource" | None
- Added 'from __future__ import annotations' for Python 3.10 compatibility
- Added TYPE_CHECKING blocks to avoid circular imports
- Removed dead code (unused variable assignments) from cli.py and views
- Removed unused imports flagged by ruff across multiple files
- Fixed test fixtures: moved app_client fixture imports to conftest.py
  (fixed 71 test errors caused by fixtures not being registered)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-26 16:03:13 -07:00
Simon Willison
2c8e92acf2 Require permissions-debug permission for /-/check endpoint
The /-/check endpoint now requires the permissions-debug permission
to access. This prevents unauthorized users from probing the permission
system. Administrators can grant this permission to specific users or
anonymous users if they want to allow open access.

Added test to verify anonymous and regular users are denied access,
while root user (who has all permissions) can access the endpoint.

Closes #2546
2025-10-26 11:16:07 -07:00
Simon Willison
e7ed948238 Use ruff to upgrade Optional[x] to x | None
Refs #2545
2025-10-26 10:50:29 -07:00
Simon Willison
06b442c894 Applied Black, refs #2544 2025-10-26 10:05:12 -07:00
Simon Willison
6de83bf3a9 Make deploy-latest.yml workflow dispatch-only
It is currently broken, will revert once I fix it.
2025-10-26 09:51:09 -07:00
Simon Willison
4fe1765dc3 Add test for RST heading underline lengths, closes #2544
Added test_rst_heading_underlines_match_title_length() to verify that RST
heading underlines match their title lengths. The test properly handles:
- Overline+underline style headings (skips validation for those)
- Empty lines before underlines (ignores them)
- Minimum 5-character underline length (avoids false positives)

Running this test identified 14 heading underline mismatches which have
been fixed across 5 documentation files:
- docs/authentication.rst (3 headings)
- docs/plugin_hooks.rst (4 headings)
- docs/internals.rst (5 headings)
- docs/deploying.rst (1 heading)
- docs/changelog.rst (1 heading)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-26 09:49:49 -07:00
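A simplified sketch of the check this commit describes; it handles the overline style and the minimum-length rule, but unlike the real test it does not tolerate blank lines between title and underline:

```python
def underline_mismatches(rst_text, min_len=5):
    # Return headings whose =/-/~ underline length differs from the title.
    lines = [line.rstrip() for line in rst_text.splitlines()]
    bad = []
    for i in range(1, len(lines)):
        title, under = lines[i - 1], lines[i]
        if len(under) < min_len or len(set(under)) != 1 or under[0] not in '=-~^".':
            continue  # not a plausible underline row (or too short to trust)
        if not title or len(set(title)) == 1:
            continue  # blank line, or this row sits under an overline
        if i >= 2 and lines[i - 2] == under:
            continue  # overline+underline style heading: skip validation
        if len(title) != len(under):
            bad.append(title)
    return bad
```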
Simon Willison
653c94209c Remove broken reference to datasette_ensure_permissions in changelog 2025-10-26 09:49:49 -07:00
Simon Willison
95286fbb60 Refactor check_visibility() to use Resource objects, refs #2537
Updated check_visibility() method signature to accept Resource objects
(DatabaseResource, TableResource, QueryResource) instead of plain strings
and tuples.

Changes:
- Updated check_visibility() signature to only accept Resource objects
- Added validation with helpful error message for incorrect types
- Updated all check_visibility() calls throughout the codebase:
  - datasette/views/database.py: Use DatabaseResource and QueryResource
  - datasette/views/special.py: Use DatabaseResource and TableResource
  - datasette/views/row.py: Use TableResource
  - datasette/views/table.py: Use TableResource
  - datasette/app.py: Use TableResource in expand_foreign_keys
- Updated tests to use Resource objects
- Updated documentation in docs/internals.rst:
  - Removed outdated permissions parameter
  - Updated examples to use Resource objects

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-26 09:49:49 -07:00
Simon Willison
653475edde Fix permissions_debug.html to use takes_parent/takes_child, refs #2530
The JavaScript was still referencing the old field names takes_database
and takes_resource instead of the new takes_parent and takes_child. This
caused the resource input fields to not show/hide properly when selecting
different permission actions.
2025-10-26 09:49:49 -07:00
dependabot[bot]
c652e92049 Bump the python-packages group across 1 directory with 3 updates
Bumps the python-packages group with 3 updates in the / directory: [furo](https://github.com/pradyunsg/furo), [blacken-docs](https://github.com/asottile/blacken-docs) and [black](https://github.com/psf/black).


Updates `furo` from 2024.8.6 to 2025.7.19
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2024.08.06...2025.07.19)

Updates `blacken-docs` from 1.19.1 to 1.20.0
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/1.19.1...1.20.0)

Updates `black` from 25.1.0 to 25.9.0
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/25.1.0...25.9.0)

---
updated-dependencies:
- dependency-name: furo
  dependency-version: 2025.7.19
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: python-packages
- dependency-name: blacken-docs
  dependency-version: 1.20.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: black
  dependency-version: 25.9.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-25 21:32:52 -07:00
Simon Willison
d769e97ab8 Show multiple permission reasons as JSON arrays, refs #2531
- Modified /-/allowed to show all reasons that grant access to a resource
- Changed from MAX(reason) to json_group_array() in SQL to collect all reasons
- Reasons now displayed as JSON arrays in both HTML and JSON responses
- Only show Reason column to users with permissions-debug permission
- Removed obsolete "Source Plugin" column from /-/rules interface
- Updated allowed_resources_with_reasons() to parse and return reason lists
- Fixed alert() on /-/allowed by replacing with disabled input state
2025-10-25 21:24:05 -07:00
Simon Willison
ee4fcff5c0 On /-/allowed show reason column if visible to user 2025-10-25 21:08:59 -07:00
Simon Willison
ee62bf9bdc Fix minor irritation with /-/allowed UI 2025-10-25 18:02:26 -07:00
Simon Willison
7d9d7acb0b Rename test_tables_endpoint.py and remove outdated tests
- Renamed test_tables_endpoint.py to test_allowed_resources.py to better
  reflect that it tests the allowed_resources() API, not the HTTP endpoint
- Removed three outdated tests from test_search_tables.py that expected
  the old behavior where /-/tables.json with no query returned empty results
- The new behavior (from commit bda69ff1) returns all tables with pagination
  when no query is provided

Fixes test failures in CI from PR #2539

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 17:32:48 -07:00
Simon Willison
5530a19d9f Remove Plugin Source column from /-/allowed 2025-10-25 17:32:48 -07:00
Simon Willison
6854270da3 Fix for actor restrictions + config bug
Described here: https://github.com/simonw/datasette/pull/2539#issuecomment-3447870261
2025-10-25 17:32:48 -07:00
Simon Willison
fb9cd5c72c Transform actor restrictions into SQL permission rules
Actor restrictions (_r) now integrate with the SQL permission layer via
the permission_resources_sql() hook instead of acting as a post-filter.

This fixes the issue where allowed_resources() didn't respect restrictions,
causing incorrect database/table listings at /.json and /database.json
endpoints for restricted actors.

Key changes:
- Add _restriction_permission_rules() function to generate SQL rules from _r
- Restrictions create global DENY + specific ALLOW rules using allowlist
- Restrictions act as gating filter BEFORE config/root/default permissions
- Remove post-filter check from allowed() method (now redundant)
- Skip default allow rules when actor has restrictions
- Add comprehensive tests for restriction filtering behavior

The cascading permission logic (child → parent → global) ensures that
allowlisted resources override the global deny, while non-allowlisted
resources are blocked.

Closes #2534
2025-10-25 17:32:48 -07:00
Simon Willison
bda69ff1c9 /-/tables.json with no ?q= returns tables
Closes #2541
2025-10-25 16:48:19 -07:00
Simon Willison
59994e18e4 Fix for intermittent failing test
It was failing when calculating coverage, I think because an in-memory database
was being reused.
2025-10-25 15:38:07 -07:00
Simon Willison
62b99b1f55 Ran black 2025-10-25 15:38:07 -07:00
Simon Willison
f18d1ecac6 Better failure message to help debug test 2025-10-25 15:38:07 -07:00
Simon Willison
e7c7e21277 Ran blacken-docs 2025-10-25 15:38:07 -07:00
Simon Willison
d7d7ead0ef Ran cog 2025-10-25 15:38:07 -07:00
Simon Willison
20ed5a00e7 Ran Black 2025-10-25 15:38:07 -07:00
Simon Willison
e4f549301b Remove stale self.permissions dictionary and get_permission() method
The self.permissions dictionary was declared in __init__ but never
populated - only self.actions gets populated during startup.

The get_permission() method was unused legacy code that tried to look
up permissions from the empty self.permissions dictionary.

Changes:
- Removed self.permissions = {} from Datasette.__init__
- Removed get_permission() method (unused)
- Renamed test_get_permission → test_get_action to match actual method being tested

All tests pass, confirming these were unused artifacts.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
deb0b87e1b Fix cli.py to use ds.actions instead of ds.permissions
The create-token CLI command was checking ds.permissions.get(action)
instead of ds.actions.get(action) when validating action names. This
caused false "Unknown permission" warnings for valid actions like
"debug-menu".

This is the same bug we fixed in app.py:685. The Action objects are
stored in ds.actions, not ds.permissions.

The warnings were being printed to stderr (correctly) but CliRunner
mixes stderr and stdout, so the warnings contaminated the token output,
causing token authentication to fail in tests.

Fixes all 6 test_cli_create_token tests.

Refs #2534

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
86ea2d2c99 Fix test_actor_restricted_permissions to match current API behavior
Updated test expectations to match the actual /-/permissions POST endpoint:

1. **Resource format**: Changed from empty list `[]` to `None` when no resources,
   and from tuple `(a, b)` to list `[a, b]` for two resources (JSON serialization)

2. **Result values**: Changed from sentinel "USE_DEFAULT" to actual boolean True/False

3. **also_requires dependencies**: Fixed tests for actions with dependencies:
   - view-database-download now requires both "vdd" and "vd" in restrictions
   - execute-sql now requires both "es" and "vd" in restrictions

4. **No upward cascading**: view-database does NOT grant view-instance
   (changed expected result from True to False)

All 20 test_actor_restricted_permissions test cases now pass.

Refs #2534

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
c3eeecfb22 Restore xfail markers for test_actor_restricted_permissions and test_cli_create_token
These tests were expecting an old API behavior from the /-/permissions debug endpoint
that no longer exists. The tests expect:
- A "default" field in the response (removed when migrating to new permission system)
- "USE_DEFAULT" sentinel values instead of actual True/False results
- Empty list `[]` for no resource instead of `None`

The /-/permissions POST endpoint was updated (views/special.py:151-185) to return
simpler responses without the "default" field, but these tests weren't updated to match.

These tests need to be rewritten to test the new permission system correctly.

Refs #2534
2025-10-25 15:38:07 -07:00
Simon Willison
ca435d16f6 Fix test_auth_create_token - template variables and action abbreviation
Fixed two bugs preventing the create token UI and tests from working:

1. **Template variable mismatch**: create_token.html was using undefined variables
   - Changed `all_permissions` → `all_actions`
   - Changed `database_permissions` → `database_actions`
   - Changed `resource_permissions` → `child_actions`

   These match what CreateTokenView.shared() actually provides to the template.

2. **Action abbreviation bug**: app.py:685 was checking the wrong dictionary
   - Changed `self.permissions.get(action)` → `self.actions.get(action)`

   The abbreviate_action() function needs to look up Action objects (which have
   the `abbr` attribute), not Permission objects. This bug prevented action names
   like "view-instance" from being abbreviated to "vi" in token restrictions.

Refs #2534

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
11fb528958 Fix test_actor_restrictions to match non-cascading permission design
The test was expecting upward permission cascading (e.g., view-table permission
granting view-database access), but the actual implementation in
restrictions_allow_action() uses exact-match, non-cascading checks.

Updated 5 test cases to expect 403 (Forbidden) instead of 200 when:
- Actor has view-database permission but accesses instance page
- Actor has database-level view-table permission but accesses instance/database pages
- Actor has table-level view-table permission but accesses instance/database pages

This matches the documented behavior: "Restrictions work on an exact-match basis:
if an actor has view-table permission, they can view tables, but NOT automatically
view-instance or view-database."

Refs #2534
https://github.com/simonw/datasette/issues/2534#issuecomment-3447774464

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
08014c9732 Rename permission_name to action_name 2025-10-25 15:38:07 -07:00
Simon Willison
de21a4209c Apply database-level allow blocks to view-query action, refs #2510
When a database has an "allow" block in the configuration, it should
apply to all queries in that database, not just tables and the database
itself. This fix ensures that queries respect database-level access
controls.

This fixes the test_padlocks_on_database_page test which expects
plugin-defined queries (from_async_hook, from_hook) to show padlock
indicators when the database has restricted access.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
d300200ba5 Add datasette.resource_for_action() helper method, refs #2510
Added a new helper method resource_for_action() that creates Resource
instances for a given action by looking up the action's resource_class.
This eliminates the ugly object.__new__() pattern throughout the codebase.

Refactored all places that were using object.__new__() to create Resource
instances:
- check_visibility()
- allowed_resources()
- allowed_resources_with_reasons()

Also refactored database view to use allowed_resources() with
include_is_private=True to get canned queries, rather than manually
checking each one.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
eff4f931af Fix check_visibility to use action's resource_class, refs #2510
Updated check_visibility() to use the action's resource_class to determine
the correct Resource type to instantiate, rather than hardcoding based on
the action name. This follows the pattern used elsewhere in the codebase
and properly supports QueryResource for view-query actions.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
82cc3d5c86 Migrate view-query permission to SQL-based system, refs #2510
This change integrates canned queries with Datasette's new SQL-based
permissions system by making the following changes:

1. **Default canned_queries plugin hook**: Added a new hookimpl in
   default_permissions.py that returns canned queries from datasette
   configuration. This extracts config-reading logic into a plugin hook,
   allowing QueryResource to discover all queries.

2. **Async resources_sql()**: Converted Resource.resources_sql() from a
   synchronous class method returning a string to an async method that
   receives the datasette instance. This allows QueryResource to call
   plugin hooks and query the database.

3. **QueryResource implementation**: Implemented QueryResource.resources_sql()
   to gather all canned queries by:
   - Querying catalog_databases for all databases
   - Calling canned_queries hooks for each database with actor=None
   - Building a UNION ALL SQL query of all (database, query_name) pairs
   - Properly escaping single quotes in resource names

4. **Simplified get_canned_queries()**: Removed config-reading logic since
   it's now handled by the default plugin hook.

5. **Added view-query to default allow**: Added "view-query" to the
   default_allow_actions set so canned queries are accessible by default.

6. **Removed xfail markers**: Removed test xfail markers from:
   - tests/test_canned_queries.py (entire module)
   - tests/test_html.py (2 tests)
   - tests/test_permissions.py (1 test)
   - tests/test_plugins.py (1 test)

All canned query tests now pass with the new permission system.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
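A minimal sketch of the UNION ALL construction from step 3, using the standard SQL convention of doubling single quotes to escape them (`queries_union_sql` is a hypothetical helper name, not the real method):

```python
import sqlite3

def queries_union_sql(pairs):
    """Build a UNION ALL selecting one literal (database, query_name) row
    per canned query, doubling single quotes to escape them."""
    def quote(s):
        return "'" + s.replace("'", "''") + "'"
    return "\nUNION ALL\n".join(
        f"SELECT {quote(db)} AS parent, {quote(name)} AS child"
        for db, name in pairs
    )

# Query names containing single quotes are escaped safely
sql = queries_union_sql([("fixtures", "from_hook"), ("fixtures", "it's quoted")])
rows = sqlite3.connect(":memory:").execute(sql).fetchall()
```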
Simon Willison
60ed646d45 Ran Black 2025-10-25 15:38:07 -07:00
Simon Willison
66f2dbb64a Fix assert_permissions_checked to handle PermissionCheck dataclass
Updated the assert_permissions_checked() helper function to work with the
new PermissionCheck dataclass instead of dictionaries. The function now:
- Uses dataclass attributes (pc.action) instead of dict subscripting
- Converts parent/child to old resource format for comparison
- Updates error message formatting to show dataclass fields

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
10ea23a59c Add PermissionCheck dataclass with parent/child fields, refs #2528
Instead of logging permission checks as dicts with a 'resource' key,
use a typed dataclass with separate parent and child fields.

Changes:
- Created PermissionCheck dataclass in app.py
- Updated permission check logging to use dataclass
- Updated PermissionsDebugView to use dataclass attributes
- Updated PermissionCheckView to check parent/child instead of resource
- Updated permissions_debug.html template to display parent/child
- Updated test expectations to use dataclass attributes

This provides better type safety and cleaner separation between
parent and child resource identifiers.
2025-10-25 15:38:07 -07:00
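A rough sketch of what such a dataclass might look like, with the parent/child split and a helper converting back to the old resource format. Field names beyond parent/child, and the helper itself, are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PermissionCheck:
    # Illustrative fields; the real class may carry more (timestamps etc.)
    actor: Optional[dict]
    action: str
    parent: Optional[str]  # e.g. database name
    child: Optional[str]   # e.g. table or query name
    result: bool

    def old_resource_format(self):
        """Convert parent/child back to the old 'resource' shape:
        a (parent, child) tuple, a bare parent string, or None."""
        if self.parent and self.child:
            return (self.parent, self.child)
        return self.parent
```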
Simon Willison
4760cb9e06 Refactor CreateTokenView to use allowed_resources() and rename variables, refs #2528
Changes:
- Use allowed_resources() instead of manual iteration with allowed() checks
- Rename all_permissions → all_actions
- Rename database_permissions → database_actions
- Rename resource_permissions → child_actions
- Update to use takes_parent/takes_child instead of takes_database/takes_resource

This makes the code more efficient (bulk permission checking) and uses
consistent naming throughout.
2025-10-25 15:38:07 -07:00
Simon Willison
13318feb8e Use action.takes_parent/takes_child for resource object creation, refs #2528
Instead of manually checking resource_class types, use the action's
takes_parent and takes_child properties to determine how to instantiate
the resource object. This is more maintainable and works with any
resource class that follows the pattern.

Updated in:
- PermissionsDebugView.post()
- PermissionCheckView.get()
2025-10-25 15:38:07 -07:00
Simon Willison
a5910f200e Code cleanup: rename variables, remove WHERE 0 check, cleanup files, refs #2528
- Rename permission_name to action_name in debug templates for consistency
- Remove confusing WHERE 0 check from check_permission_for_resource()
- Rename tests/test_special.py to tests/test_search_tables.py
- Remove tests/vec.db that shouldn't have been committed
2025-10-25 15:38:07 -07:00
Simon Willison
fabcfd68ad Add datasette.ensure_permission() method, refs #2525, refs #2528
Implements a new ensure_permission() method that is a convenience wrapper
around allowed() that raises Forbidden instead of returning False.

Changes:
- Added ensure_permission() method to datasette/app.py
- Updated all views to use ensure_permission() instead of the pattern:
  if not await self.ds.allowed(...): raise Forbidden(...)
- Updated docs/internals.rst to document the new method
- Removed old ensure_permissions() documentation (that method was already removed)

The new method simplifies permission enforcement in views and makes the
code more concise and consistent.
2025-10-25 15:38:07 -07:00
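The convenience-wrapper pattern described here can be sketched as follows. `FakeDatasette` and this `Forbidden` class are stand-ins for illustration; the real method lives on the Datasette class:

```python
import asyncio

class Forbidden(Exception):
    pass

class PermissionMixin:
    """Same arguments as allowed(), but raises Forbidden instead of
    returning False — so views need a single awaited call, not an if."""
    async def ensure_permission(self, *, actor=None, action, resource=None):
        if not await self.allowed(actor=actor, action=action, resource=resource):
            raise Forbidden(action)

class FakeDatasette(PermissionMixin):
    # Toy policy: only view-instance is allowed
    async def allowed(self, *, actor=None, action, resource=None):
        return action == "view-instance"
```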
Simon Willison
6df364cb2c Ran cog 2025-10-25 15:38:07 -07:00
Simon Willison
d0237187c4 Ran prettier 2025-10-25 15:38:07 -07:00
Simon Willison
ee1d7983ba Mark canned query tests as xfail, refs #2510, refs #2528
Canned queries are not accessible because view-query permission
has not yet been migrated to the SQL-based permission system.

Marks the following tests with xfail:
- test_config_cache_size (test_api.py)
- test_edit_sql_link_not_shown_if_user_lacks_permission (test_html.py)
- test_database_color - removes canned query path (test_html.py)
- test_hook_register_output_renderer_* (test_plugins.py - 3 tests)
- test_hook_query_actions canned query parameter (test_plugins.py)
- test_custom_query_with_unicode_characters (test_table_api.py)
- test_permissions_checked neighborhood_search (test_permissions.py)
- test_padlocks_on_database_page (test_permissions.py)

All reference issue #2510 for tracking view-query migration.
2025-10-25 15:38:07 -07:00
Simon Willison
bc81975d85 Remove used_default feature from permission system, refs #2528
The new SQL-based permission system always resolves to True or False,
so the concept of "used default" (tracking when no hook had an opinion)
is no longer relevant. Removes:

- used_default from permission check logging in app.py
- used_default from permission debug responses in special.py
- used_default display from permissions_debug.html template
- used_default from test expectations in test_permissions.py

This simplifies the permission system by eliminating the "no opinion" state.
2025-10-25 15:38:07 -07:00
Simon Willison
5c6b76f2f0 Migrate views from ds.permissions to ds.actions, refs #2528
Updates all permission debugging views to use the new ds.actions dict
instead of the old ds.permissions dict. Changes include:

- Replace all ds.permissions references with ds.actions
- Update field references: takes_database/takes_resource → takes_parent/takes_child
- Remove default field from permission display
- Rename sorted_permissions to sorted_actions in templates
- Remove source_plugin from SQL queries and responses
- Update test expectations to not check for source_plugin field

This aligns the views with the new Action dataclass structure.
2025-10-25 15:38:07 -07:00
Simon Willison
5feb5fcf5d Remove permission_allowed hook entirely, refs #2528
The permission_allowed hook has been fully replaced by permission_resources_sql.
This commit removes:
- hookspec definition from hookspecs.py
- 4 implementations from default_permissions.py
- implementations from test plugins (my_plugin.py, my_plugin_2.py)
- hook monitoring infrastructure from conftest.py
- references from fixtures.py
- Also fixes test_get_permission to use ds.get_action() instead of ds.get_permission()
- Removes 5th column (source_plugin) from PermissionSQL queries

This completes the migration to the SQL-based permission system.
2025-10-25 15:38:07 -07:00
Simon Willison
60a38cee85 Run black formatter 2025-10-25 15:38:07 -07:00
Simon Willison
b5f41772ca Fix view-database-download permission handling
Three fixes for database download permissions:
Three fixes for database download permissions:

1. Added also_requires="view-database" to view-database-download action
   - You should only be able to download a database if you can view it

2. Added view-database-download to default_allow_actions list
   - This action should be allowed by default, like view-database

3. Implemented also_requires checking in allowed() method
   - The allowed() method now checks action.also_requires before
     checking the action itself
   - This ensures execute-sql requires view-database, etc.

Fixes test_database_download_for_immutable and
test_database_download_disallowed_for_memory.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
ad00bb11f6 Mark test_auth_create_token as xfail, refs #2534
This test creates tokens with actor restrictions (_r) for various
permissions. The create-token UI needs work to properly integrate
with the new permission system.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
559a13a8c6 Mark additional canned query tests in test_html.py as xfail, refs #2510
Marked specific parameter combinations that test canned queries:
- test_css_classes_on_body with /fixtures/neighborhood_search
- test_alternate_url_json with /fixtures/neighborhood_search

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
09194c72f8 Replace permission_allowed_2() with allowed() in test_config_permission_rules.py
Updated all test_config_permission_rules.py tests to use the new allowed()
method with Resource objects instead of the old permission_allowed_2()
method.

Also marked test_database_page in test_html.py as xfail since it expects
to see canned queries (view-query permission not yet migrated).

All 7 config_permission_rules tests now pass.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
d07f8944fa Mark canned query and magic parameter tests as xfail, refs #2510
These tests involve canned queries which use the view-query permission
that has not yet been migrated to the new SQL-based permission system.

Tests marked:
- test_hook_canned_queries (4 tests in test_plugins.py)
- test_hook_register_magic_parameters (test_plugins.py)
- test_hook_top_canned_query (test_plugins.py)
- test_canned_query_* (4 tests in test_html.py)
- test_edit_sql_link_on_canned_queries (test_html.py)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
562a84e3f9 Mark test_cli_create_token as xfail, refs #2534
This test creates tokens with actor restrictions (_r) which need
additional work to properly integrate with the new permission system.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
0fb148b1f4 Mark test_canned_queries.py module as xfail, refs #2534
Canned queries use view-query permission which has not yet been migrated
to the new SQL-based permission system.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
e5762b1f22 Mark actor restriction tests as xfail, refs #2534
Actor restrictions (_r in actor dict) need additional work to properly
integrate with the new SQL-based permission system. Marking these tests
as expected to fail until that work is completed.

Tests marked as xfail:
- test_actor_restricted_permissions (20 test cases)
- test_actor_restrictions (5 specific parameter combinations)

Test improvements: 37 failures → 12 failures

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
182bfaed8e Fix expand_foreign_keys and filters to use new check_visibility() and allowed() signatures
Changes:
- Fixed expand_foreign_keys() to use new check_visibility() signature
  without the 'permissions' keyword argument
- Removed 'default' parameter from allowed() call in filters.py
- Marked view-query tests as xfail since view-query permission is not yet
  migrated to the new SQL-based permission system

Test improvements: 41 failures → 37 failures

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
6584c9e03f Remove ensure_permissions() and simplify check_visibility()
This commit removes the ensure_permissions() method entirely and updates
all code to use direct allowed() checks instead.

Key changes:
- Removed ensure_permissions() method from datasette/app.py
- Simplified check_visibility() to check single permissions directly
- Replaced all ensure_permissions() calls with direct allowed() checks
- Updated all check_visibility() calls to use only primary permission
- Added Forbidden import to index.py

Why this change:
- ensure_permissions() used OR logic (any permission passes) which
  conflicted with explicit denies in the config
- For example, check_visibility() called ensure_permissions() with
  ["view-database", "view-instance"] and if view-instance passed,
  it would show pages even with explicit database deny
- The new approach checks only the specific permission needed for
  each resource, respecting explicit denies

Test improvements: 64 failures → 41 failures

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
30e2f9064b Remove implies_can_view logic from actor restrictions
Simplified restrictions_allow_action() to work on exact-match basis only.
Actor restrictions no longer use permission implication logic - if an actor
has view-table permission, they can view tables but NOT automatically
view-instance or view-database.

Updated test_restrictions_allow_action test cases to reflect new behavior:
- Removed test cases expecting view-table to imply view-instance
- Removed test cases expecting view-database to imply view-instance
- Removed test cases expecting execute-sql to imply view-instance/view-database
- Added test cases verifying exact matches work correctly
- Added test case verifying abbreviations work (es -> execute-sql)

This aligns actor restrictions with the new permission model where each
action is checked independently without hierarchical implications.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
e1582c1424 Fix actor restrictions to work with new actions system
- Updated restrictions_allow_action() to use datasette.actions instead of datasette.permissions
- Changed references from Permission to Action objects
- Updated takes_database checks to takes_parent
- Added get_action() method to Datasette class for looking up actions by name or abbreviation
- Integrated actor restriction checking into allowed() method
- Actor restrictions (_r in actor dict) are now properly enforced after SQL permission checks

This fixes tests in test_api_write.py where actors with restricted permissions
were incorrectly being granted access to actions outside their restrictions.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
dc241e8691 Remove deprecated register_permissions hook
- Removed register_permissions hook definition from hookspecs.py
- Removed register_permissions implementation from default_permissions.py
- Removed pm.hook.register_permissions() call from app.py invoke_startup()
- The register_actions hook now serves as the sole mechanism for registering actions
- Removed Permission import from default_permissions.py as it's no longer needed

This completes the migration from the old register_permissions hook to the new
register_actions hook. All permission definitions should now use Action objects
via register_actions, and permission checking should use permission_resources_sql
to provide SQL-based permission rules.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
fe2084df66 Update test infrastructure to use register_actions hook
- Consolidated register_permissions and register_actions hooks in my_plugin.py
- Added permission_resources_sql hook to provide SQL-based permission rules
- Updated conftest.py to reference datasette.actions instead of datasette.permissions
- Updated fixtures.py to include permission_resources_sql hook and remove register_permissions
- Added backwards compatibility support for old datasette-register-permissions config
- Converted test actions (this_is_allowed, this_is_denied, etc.) to use permission_resources_sql

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
59ccf797c4 Rename register_permissions tests to register_actions
- Renamed test_hook_register_permissions to test_hook_register_actions
- Renamed test_hook_register_permissions_no_duplicates to test_hook_register_actions_no_duplicates
- Renamed test_hook_register_permissions_allows_identical_duplicates to test_hook_register_actions_allows_identical_duplicates
- Updated all tests to use Action objects instead of Permission objects
- Updated config structures from datasette-register-permissions to datasette-register-actions
- Changed assertions from ds.permissions to ds.actions
- Updated test_hook_permission_allowed to register custom actions

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-25 15:38:07 -07:00
Simon Willison
7aaff5e3d2 Update tests to use new allowed() method instead of permission_allowed() 2025-10-25 15:38:07 -07:00
Simon Willison
387afb0f69 Update tests to use simplified allowed() calls
- Removed explicit InstanceResource() parameters for instance-level checks
- Removed unused InstanceResource import
2025-10-25 15:38:07 -07:00
Simon Willison
cde1624d0a Update permission hooks to include source_plugin column and simplify menu_links
- Added source_plugin column to all permission SQL queries (required by new system)
- Removed unused InstanceResource import from default_menu_links.py
- Fixed SQL format to match (parent, child, allow, reason, source_plugin) schema
2025-10-25 15:38:07 -07:00
Simon Willison
a0659075a3 Migrate all view files to use new allowed() method with Resource objects
- Converted all permission_allowed() calls to allowed()
- Use proper Resource objects (InstanceResource, DatabaseResource, TableResource)
- Removed explicit InstanceResource() parameters where default applies
- Updated PermissionRulesView to use build_permission_rules_sql() helper
2025-10-25 15:38:07 -07:00
Simon Willison
224084facc Make allowed() and check_permission_for_resource keyword-only, add default resource
- Made allowed() accept resource=None with InstanceResource() as default
- Made both functions keyword-argument only
- Added logging to _permission_checks for debug endpoints
- Fixed check_permission_for_resource to handle empty params correctly
- Created build_permission_rules_sql() helper function for debug views
2025-10-25 15:38:07 -07:00
Simon Willison
235962cd35 just blacken-docs 2025-10-25 15:02:49 -07:00
Simon Willison
4be7eece8c just prettier, just format shortcuts 2025-10-25 09:04:04 -07:00
Simon Willison
4d03e8c12e Refactor AllowedResourcesView to use datasette.allowed_resources()
Refs https://github.com/simonw/datasette/issues/2527#issuecomment-3444586698
2025-10-24 12:21:48 -07:00
Simon Willison
e8b79970fb Implement also_requires to enforce view-database for execute-sql
Adds Action.also_requires field to specify dependencies between permissions.
When an action has also_requires set, users must have permission for BOTH
the main action AND the required action on a resource.

Applies this to execute-sql, which now requires view-database permission.
This prevents the illogical scenario where users can execute SQL on a
database they cannot view.

Changes:
- Add also_requires field to Action dataclass in datasette/permissions.py
- Update execute-sql action with also_requires="view-database"
- Implement also_requires handling in build_allowed_resources_sql()
- Implement also_requires handling in AllowedResourcesView endpoint
- Add test verifying execute-sql requires view-database permission

Fixes #2527
2025-10-24 12:14:52 -07:00
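The dependency check described above can be sketched as a small conditional: when an action declares `also_requires`, both permissions must hold on the resource before the action is allowed. The `Action` fields follow the commit; the `allowed()` helper here is a toy, not the real SQL-backed method:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Action:
    name: str
    also_requires: Optional[str] = None

ACTIONS = {
    "view-database": Action("view-database"),
    "execute-sql": Action("execute-sql", also_requires="view-database"),
}

def allowed(granted, action_name, resource):
    """granted is a set of (action, resource) pairs the actor holds."""
    action = ACTIONS[action_name]
    # Dependency first: also_requires must hold on the same resource
    if action.also_requires and (action.also_requires, resource) not in granted:
        return False
    return (action_name, resource) in granted
```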
Simon Willison
a2994cc5bb Remove automatic parameter namespacing from permission plugins
Simplifies the permission system by removing automatic parameter namespacing.
Plugins are now responsible for using unique parameter names. The recommended
convention is to prefix parameters with the plugin source name (e.g.,
:myplugin_user_id). System reserves :actor, :actor_id, :action, :filter_parent.

- Remove _namespace_params() function from datasette/utils/permissions.py
- Update build_rules_union() to use plugin params directly
- Document parameter naming convention in plugin_hooks.rst
- Update example plugins to use prefixed parameters
- Add test_multiple_plugins_with_own_parameters() to verify convention works
2025-10-24 11:44:43 -07:00
Simon Willison
7c6bc0b902 Fix #2509: Settings-based deny rules now override root user privileges
The root user's permission_resources_sql hook was returning early with a
blanket "allow all" rule, preventing settings-based deny rules from being
considered. This caused /-/allowed and /-/rules endpoints to incorrectly
show resources that were denied via settings.

Changed permission_resources_sql to append root permissions to the rules
list instead of returning early, allowing config-based deny rules to be
evaluated. The SQL cascading logic correctly applies: deny rules at the
same depth beat allow rules, so database-level denies override root's
global-level allow.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 11:13:19 -07:00
Simon Willison
5138e95d69 Migrate homepage to use bulk allowed_resources() and fix NULL handling in SQL JOINs
- Updated IndexView in datasette/views/index.py to fetch all allowed databases and tables
  in bulk upfront using allowed_resources() instead of calling check_visibility() for each
  database, table, and view individually
- Fixed SQL bug in build_allowed_resources_sql() where USING (parent, child) clauses failed
  for database resources because NULL = NULL evaluates to NULL in SQL, not TRUE
- Changed all INNER JOINs to use explicit ON conditions with NULL-safe comparisons:
  ON b.parent = x.parent AND (b.child = x.child OR (b.child IS NULL AND x.child IS NULL))

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 10:32:18 -07:00
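The NULL-comparison pitfall this commit fixes is easy to demonstrate directly in SQLite: a plain equality join drops database-level rows where child is NULL on both sides, while the NULL-safe ON condition from the commit keeps them. (The `a`/`b` tables below are illustrative.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE a (parent TEXT, child TEXT);
CREATE TABLE b (parent TEXT, child TEXT);
-- child IS NULL represents a database-level resource
INSERT INTO a VALUES ('db1', NULL), ('db1', 't1');
INSERT INTO b VALUES ('db1', NULL), ('db1', 't1');
""")
# NULL = NULL evaluates to NULL, so the database-level row is lost
plain = conn.execute(
    "SELECT count(*) FROM a JOIN b ON a.parent = b.parent AND a.child = b.child"
).fetchone()[0]
# NULL-safe comparison matches the (db1, NULL) pair as well
safe = conn.execute(
    "SELECT count(*) FROM a JOIN b ON a.parent = b.parent "
    "AND (a.child = b.child OR (a.child IS NULL AND b.child IS NULL))"
).fetchone()[0]
```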
Simon Willison
8674aaa392 Add parent filter and include_is_private to allowed_resources()
Major improvements to the allowed_resources() API:

1. **parent filter**: Filter results to specific database in SQL, not Python
   - Avoids loading thousands of tables into Python memory
   - Filtering happens efficiently in SQLite

2. **include_is_private flag**: Detect private resources in single SQL query
   - Compares actor permissions vs anonymous permissions in SQL
   - LEFT JOIN between actor_allowed and anon_allowed CTEs
   - Returns is_private column: 1 if anonymous blocked, 0 otherwise
   - No individual check_visibility() calls needed

3. **Resource.private property**: Safe access with clear error messages
   - Raises AttributeError if accessed without include_is_private=True
   - Prevents accidental misuse of the property

4. **Database view optimization**: Use new API to eliminate redundant checks
   - Single bulk query replaces N individual permission checks
   - Private flag computed in SQL, not via check_visibility() calls
   - Views filtered from allowed_dict instead of checking db.view_names()

All permission filtering now happens in SQLite where it belongs, with
minimal data transferred to Python.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 10:32:18 -07:00
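The LEFT JOIN technique from point 2 can be sketched in miniature: resources the actor can see that the anonymous user cannot come back with is_private = 1, all computed in a single query. Table and column names here are illustrative, standing in for the actor_allowed/anon_allowed CTEs:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE actor_allowed (parent TEXT, child TEXT);
CREATE TABLE anon_allowed (parent TEXT, child TEXT);
INSERT INTO actor_allowed VALUES ('db1', 'public_t'), ('db1', 'secret_t');
INSERT INTO anon_allowed VALUES ('db1', 'public_t');
""")
rows = conn.execute("""
SELECT a.parent, a.child,
       CASE WHEN n.parent IS NULL THEN 1 ELSE 0 END AS is_private
FROM actor_allowed a
LEFT JOIN anon_allowed n ON a.parent = n.parent AND a.child = n.child
ORDER BY a.child
""").fetchall()
```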
Simon Willison
2620938661 Migrate /database view to use bulk allowed_resources()
Replace one-by-one permission checks with bulk allowed_resources() call:
- DatabaseView and QueryView now fetch all allowed tables once
- Filter views and tables using pre-fetched allowed_table_set
- Update TableResource.resources_sql() to include views from catalog_views

This improves performance by reducing permission checks from O(n) to O(1) per
table/view, where n is the number of tables in the database.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 10:32:18 -07:00
Simon Willison
23715d6c00 Error on startup if invalid setting types 2025-10-24 10:32:18 -07:00
Simon Willison
58ac5ccd6e Simplify types in datasette/permissions.py 2025-10-24 10:32:18 -07:00
Simon Willison
b311f735f9 Fix schema mismatch in empty result query
When no permission rules exist, the query was returning 2 columns (parent, child)
but the function contract specifies 3 columns (parent, child, reason). This could
cause schema mismatches in consuming code.

Added 'NULL AS reason' to match the documented 3-column schema.

Added regression test that verifies the schema has 3 columns even when no
permission rules are returned. The test fails without the fix (showing only
2 columns) and passes with it.

Thanks to @asg017 for catching this
2025-10-24 10:32:18 -07:00
Simon Willison
79879b834a Address PR #2515 review comments
- Add URL to sqlite-permissions-poc in module docstring
- Replace Optional with | None for modern Python syntax
- Add Datasette type annotations
- Add SQL comment explaining cascading permission logic
- Refactor duplicated plugin result processing into helper function
2025-10-24 10:32:18 -07:00
Simon Willison
a21a1b6c14 Ran blacken-docs 2025-10-24 10:32:18 -07:00
Simon Willison
c0b5ce04c3 Ran cog 2025-10-24 10:32:18 -07:00
Simon Willison
9172020535 Removed unnecessary isinstance(candidate, PermissionSQL) 2025-10-24 10:32:18 -07:00
Simon Willison
4d6730e3c4 Remove unused methods from Resource base class 2025-10-24 10:32:18 -07:00
Simon Willison
c7278c73f3 Ran latest prettier 2025-10-24 10:32:18 -07:00
Simon Willison
96d2e16e83 Use allowed_resources_sql() with CTE for table filtering 2025-10-24 10:32:18 -07:00
Simon Willison
eb5a95ee6e Rewrite tables endpoint to use SQL LIKE instead of Python regex 2025-10-24 10:32:18 -07:00
Simon Willison
8e47f99874 Fix /-/tables endpoint: add .json support and correct response format 2025-10-24 10:32:18 -07:00
Simon Willison
7d04211559 Fix test_tables_endpoint_config_database_allow by using unique database names 2025-10-24 10:32:18 -07:00
Simon Willison
d73b6f169f Add register_actions hook to test plugin and improve test 2025-10-24 10:32:18 -07:00
Simon Willison
130dad268d Fix test_navigation_menu_links by enabling root_enabled for root actor 2025-10-24 10:32:18 -07:00
Simon Willison
e333827687 permission_allowed_default_allow_sql 2025-10-24 10:32:18 -07:00
Simon Willison
8b098e4b3e Applied Black 2025-10-24 10:32:18 -07:00
Simon Willison
06af34240f Fix permission endpoint tests by resolving method signature conflicts
- Renamed internal allowed_resources_sql() to _build_permission_rules_sql()
  to avoid conflict with public method
- Made public allowed_resources_sql() keyword-only to prevent argument order bugs
- Fixed PermissionRulesView to use _build_permission_rules_sql() which returns
  full permission rules (with allow/deny) instead of filtered resources
- Fixed _build_permission_rules_sql() to pass actor dict to build_rules_union()
- Added actor_id extraction in AllowedResourcesView
- Added root_enabled=True to test fixture to grant permissions-debug to root user

All 51 tests in test_permission_endpoints.py now pass.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 10:32:18 -07:00
Simon Willison
7423c1a999 Fixed some more tests 2025-10-24 10:32:18 -07:00
Simon Willison
98493b7587 Fix permission_allowed_sql_bridge to not apply defaults, closes #2526
The bridge was incorrectly using the new allowed() method which applies
default allow rules. This caused actors without restrictions to get True
instead of USE_DEFAULT, breaking backward compatibility.

Fixed by:
- Removing the code that converted to resource objects and called allowed()
- Bridge now ONLY checks config-based rules via _config_permission_rules()
- Returns None when no config rules exist, allowing Permission.default to apply
- This maintains backward compatibility with the permission_allowed() API

All 177 permission tests now pass, including test_actor_restricted_permissions
and test_permissions_checked which were previously failing.
2025-10-24 10:32:18 -07:00
Simon Willison
8b5bf3e487 Mark test_permissions_checked database download test as xfail, refs #2526
The test expects ensure_permissions() to check all three permissions
(view-database-download, view-database, view-instance) but the current
implementation short-circuits after the first successful check.

Created issue #2526 to track the investigation of the expected behavior.
2025-10-24 10:32:18 -07:00
Simon Willison
2ed2849a14 Eliminate duplicate config checking by removing old permission_allowed hooks
- Removed permission_allowed_default() hook (checked config twice)
- Removed _resolve_config_view_permissions() and _resolve_config_permissions_blocks() helpers
- Added permission_allowed_sql_bridge() to bridge old permission_allowed() API to new SQL system
- Moved default_allow_sql setting check into permission_resources_sql()
- Made root-level allow blocks apply to all view-* actions (view-database, view-table, view-query)
- Added add_row_allow_block() helper for allow blocks that should deny when no match

This resolves the duplicate checking issue where config blocks were evaluated twice:
once in permission_allowed hooks and once in permission_resources_sql hooks.

Note: One test still failing (test_permissions_checked for database download) - needs investigation
2025-10-24 10:32:18 -07:00
Simon Willison
b8d26754df Document datasette.allowed(), PermissionSQL class, and SQL parameters
- Added documentation for datasette.allowed() method with keyword-only arguments
- Added comprehensive PermissionSQL class documentation with examples
- Documented the three SQL parameters available: :actor, :actor_id, :action
- Included examples of using json_extract() to access actor fields
- Explained permission resolution rules (specificity, deny over allow, implicit deny)
- Fixed RST formatting warnings (escaped asterisk, fixed underline length)
2025-10-24 10:32:18 -07:00
Simon Willison
c06e05b7db New --root mechanism with datasette.root_enabled, closes #2521 2025-10-24 10:32:18 -07:00
Simon Willison
65c427e4ee Ensure :actor, :actor_id and :action are all available to permissions SQL, closes #2520
- Updated build_rules_union() to accept actor as dict and provide :actor (JSON) and :actor_id
- Updated resolve_permissions_from_catalog() and resolve_permissions_with_candidates() to accept actor dict
- :actor is now the full actor dict as JSON (use json_extract() to access fields)
- :actor_id is the actor's id field for simple comparisons
- :action continues to be available as before
- Updated all call sites and tests to use new parameter format
- Added test demonstrating all three parameters working together
2025-10-24 10:32:18 -07:00
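The commit above describes three named parameters made available to permissions SQL. A minimal illustration of how such a fragment could consume them, using plain `sqlite3` and `json_extract()` (the actor dict and action name here are made up for the example):

```python
import json
import sqlite3

# :actor is the full actor dict serialized as JSON, :actor_id is its "id"
# field, and :action is the permission action being checked.
conn = sqlite3.connect(":memory:")
actor = {"id": "simon", "roles": ["staff"]}

sql = """
select
  :actor_id as actor_id,
  :action as action,
  json_extract(:actor, '$.roles[0]') as first_role
"""
row = conn.execute(
    sql,
    {"actor": json.dumps(actor), "actor_id": actor["id"], "action": "view-table"},
).fetchone()
print(row)  # ('simon', 'view-table', 'staff')
```

The `json_extract()` call is the pattern the docs commit below recommends for reaching into actor fields beyond the id.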
Simon Willison
b9c6e7a0f6 PluginSQL renamed to PermissionSQL, closes #2524 2025-10-24 10:32:18 -07:00
Simon Willison
159b9f3fec ds.allowed() is now keyword-argument only, closes #2519 2025-10-24 10:32:18 -07:00
Simon Willison
8e9916b286 Update allowed_resources_sql() and refactor allowed_resources() 2025-10-24 10:32:18 -07:00
Simon Willison
b1080e7d30 Moved Resource defaults to datasette/resources.py 2025-10-24 10:32:18 -07:00
Simon Willison
5b0baf7cd5 Ran prettier 2025-10-24 10:32:18 -07:00
Simon Willison
2b879e462f Implement resource-based permission system with SQL-driven access control
This introduces a new hierarchical permission system that uses SQL queries
for efficient permission checking across resources. The system replaces the
older permission_allowed() pattern with a more flexible resource-based
approach.

Core changes:

- New Resource ABC and Action dataclass in datasette/permissions.py
  * Resources represent hierarchical entities (instance, database, table)
  * Each resource type implements resources_sql() to list all instances
  * Actions define operations on resources with cascading rules

- New plugin hook: register_actions(datasette)
  * Plugins register actions with their associated resource types
  * Replaces register_permissions() and register_resource_types()
  * See docs/plugin_hooks.rst for full documentation

- Three new Datasette methods for permission checks:
  * allowed_resources(action, actor) - returns list[Resource]
  * allowed_resources_with_reasons(action, actor) - for debugging
  * allowed(action, resource, actor) - checks single resource
  * All use SQL for filtering, never Python iteration

- New /-/tables endpoint (TablesView)
  * Returns JSON list of tables user can view
  * Supports ?q= parameter for regex filtering
  * Format: {"matches": [{"name": "db/table", "url": "/db/table"}]}
  * Respects all permission rules from configuration and plugins

- SQL-based permission evaluation (datasette/utils/actions_sql.py)
  * Cascading rules: child-level → parent-level → global-level
  * DENY beats ALLOW at same specificity
  * Uses CTEs for efficient SQL-only filtering
  * Combines permission_resources_sql() hook results

- Default actions in datasette/default_actions.py
  * InstanceResource, DatabaseResource, TableResource, QueryResource
  * Core actions: view-instance, view-database, view-table, etc.

- Fixed default_permissions.py to handle database-level allow blocks
  * Now creates parent-level rules for view-table action
  * Fixes: datasette ... -s databases.fixtures.allow.id root

Documentation:

- Comprehensive register_actions() hook documentation
- Detailed resources_sql() method explanation
- /-/tables endpoint documentation in docs/introspection.rst
- Deprecated register_permissions() with migration guide

Tests:

- tests/test_actions_sql.py: 7 tests for core permission API
- tests/test_tables_endpoint.py: 13 tests for /-/tables endpoint
- All 118 documentation tests pass
- Tests verify SQL does filtering (not Python)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-24 10:32:18 -07:00
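The cascading evaluation named in the commit above (child-level → parent-level → global-level, with DENY beating ALLOW at the same specificity and an implicit deny when nothing matches) can be sketched in a few lines. This is a toy model of the resolution order only, not Datasette's SQL/CTE implementation:

```python
# rules: list of (level, verdict) pairs, verdict is "allow" or "deny".
def resolve(rules, levels=("child", "parent", "global")):
    for level in levels:  # most specific level first
        verdicts = {v for lvl, v in rules if lvl == level}
        if "deny" in verdicts:  # deny wins at the same specificity
            return False
        if "allow" in verdicts:
            return True
    return False  # implicit deny when no rule matches

print(resolve([("global", "allow"), ("child", "deny")]))  # False
print(resolve([("parent", "allow")]))                     # True
print(resolve([]))                                        # False
```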
Simon Willison
e951f7e81f
models: read permission for tmate 2025-10-22 16:16:49 -07:00
Simon Willison
2df06e1fda
GITHUB_TOKEN env for tmate.yml 2025-10-22 16:14:27 -07:00
Simon Willison
7ce723edcf
Reformat JavaScript files with Prettier (#2517)
* Reformat JavaScript files with Prettier

Ran `npm run fix` to apply consistent code formatting across JavaScript
files using the project's Prettier configuration (2 spaces, no tabs).

Files reformatted:
- datasette/static/datasette-manager.js
- datasette/static/json-format-highlight-1.0.1.js
- datasette/static/table.js

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* Upgrade Prettier from 2.2.1 to 3.6.2

Updated package.json and package-lock.json to use Prettier 3.6.2,
ensuring consistent formatting between local development and CI.

The existing JavaScript files are already formatted with Prettier 3.x
style from the previous commit.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-20 16:41:09 -07:00
Simon Willison
ec38ad3768
Add DatabaseContext dataclass for consistent template context documentation (#2513)
Refs:
- #1510
- #2333

Claude Code:

Created DatabaseContext as a documented dataclass following the same pattern
as the existing QueryContext. This change replaces the inline dictionary
context creation with an explicit dataclass that:

- Documents all 21 template context variables with help metadata
- Inherits from the Context base class for identification
- Provides better IDE support and type safety
- Makes template variables discoverable without reading code

Also updated QueryContext to inherit from Context for consistency.
2025-10-09 12:54:02 -07:00
Simon Willison
659673614a Refactor debug templates to use shared JavaScript functions
Extracted common JavaScript utilities from debug_allowed.html, debug_check.html, and debug_rules.html into a new _debug_common_functions.html include template. This eliminates code duplication and improves maintainability.

The shared functions include:
- populateFormFromURL(): Populates form fields from URL query parameters
- updateURL(formId, page): Updates browser URL with form values
- escapeHtml(text): HTML escaping utility

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-08 21:53:34 -07:00
Simon Willison
e2a739c496 Fix for asyncio.iscoroutinefunction deprecation warnings
Closes #2512

Refs https://github.com/simonw/asyncinject/issues/18
2025-10-08 20:32:16 -07:00
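The deprecation fix above amounts to swapping `asyncio.iscoroutinefunction` (deprecated as of Python 3.14) for `inspect.iscoroutinefunction`, which gives the same answer for coroutine functions:

```python
import inspect

async def fetch():
    return 42

def plain():
    return 42

# inspect.iscoroutinefunction is the non-deprecated replacement
print(inspect.iscoroutinefunction(fetch))  # True
print(inspect.iscoroutinefunction(plain))  # False
```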
Simon Willison
27084caa04
New allowed_resources_sql plugin hook and debug tools (#2505)
* allowed_resources_sql plugin hook and infrastructure
* New methods for checking permissions with the new system
* New /-/allowed and /-/check and /-/rules special endpoints

Still needs to be integrated more deeply into Datasette, especially for listing visible tables.

Refs: #2502

---------

Co-authored-by: Claude <noreply@anthropic.com>
2025-10-08 14:27:51 -07:00
Simon Willison
85da8474d4
Python 3.14, drop Python 3.9
Closes #2506
2025-10-08 13:11:32 -07:00
Simon Willison
909448fb7a Run CLI coroutines on explicit event loops
With the help of Codex CLI: https://gist.github.com/simonw/d2de93bfdf85a014a29093720c511093
2025-10-01 12:59:14 -07:00
Simon Willison
5d09ab3ff1 Remove legacy event_loop fixture usage 2025-10-01 12:51:23 -07:00
Simon Willison
571ce651c1 Use venv Python to launch datasette fixtures 2025-10-01 12:49:09 -07:00
Simon Willison
d87bd12dbc Remove obsolete mix_stderr=False 2025-09-30 14:33:24 -07:00
Simon Willison
9dc2a3ffe5 Removed broken refs to Glitch, closes #2503 2025-09-28 21:15:58 -07:00
Simon Willison
7a602140df catalog_views table, closes #2495
Refs https://github.com/datasette/datasette-queries/issues/1#issuecomment-3074491003
2025-07-15 10:22:56 -07:00
Simon Willison
e2497fdb59 Replace Glitch with Codespaces, closes #2488 2025-05-28 19:17:22 -07:00
Simon Willison
1c77a7e33f Fix global-power-points references
Refs https://github.com/simonw/datasette.io/issues/167
2025-05-28 19:07:46 -07:00
Simon Willison
6f7f4c7d89 Release 1.0a19
Refs #2479
2025-04-21 22:38:53 -07:00
Simon Willison
f4274e7a2e CSS fix for table headings on mobile, closes #2479 2025-04-21 22:33:34 -07:00
Simon Willison
271aa09056 Release 1.0a18
Refs #2466, #2468, #2470, #2476, #2477
2025-04-16 22:16:25 -07:00
Jack Stratton
d5c6e502fb
fix: tilde encode database name in expanded foreign key links (#2476)
* Tilde encode database for expanded foreign key links
* Test for foreign key fix in #2476

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2025-04-16 22:15:11 -07:00
Simon Willison
f2485dce9c
Hide FTS tables that have content=
* Hide FTS tables that have content=, closes #2477
2025-04-16 21:44:09 -07:00
Simon Willison
f6446b3095 Further wording tweaks 2025-04-16 08:25:03 -07:00
Simon Willison
d03273e205
Wording tweak 2025-04-16 08:19:22 -07:00
Simon Willison
d021ce97aa
Note that only first actor_from_request value is respected
https://github.com/datasette/datasette-profiles/issues/4#issuecomment-2758588167
2025-03-27 09:09:57 -07:00
Simon Willison
7945f4fbf2 Improved docs for db.get_all_foreign_keys() 2025-03-12 15:42:11 -07:00
dependabot[bot]
da209ed2ba
Drop 3.8 testing, add 3.13 testing, upgrade Black
Also bump some GitHub Actions versions.
2025-03-09 20:45:18 -07:00
Simon Willison
333f786cb0 Correct syntax for link headers, closes #2470 2025-03-09 20:05:43 -05:00
Simon Willison
6e512caa59 Upgrade to actions/cache@v4
v2 no longer works.
2025-02-28 22:57:22 -08:00
Simon Willison
209bdee0e8 Don't run prepare_connection() on internal database, closes #2468 2025-02-18 10:23:23 -08:00
Simon Willison
e59fd01757 Fix for incorrect REFERENCES in internal DB
Refs #2466
2025-02-12 19:40:43 -08:00
Simon Willison
cd9182a551 Release 1.0a17
Refs #1690, #1943, #2422, #2424, #2441, #2454, #2455, #2458, #2460, #2465
2025-02-06 11:12:34 -08:00
Simon Willison
7f23411002 Call db.close() in ds.remove_database()
https://github.com/simonw/datasette/issues/2465#issuecomment-2640712713
2025-02-06 10:46:11 -08:00
Simon Willison
f95ac19e71 Fix to support replacing a database, closes #2465 2025-02-06 10:32:47 -08:00
Simon Willison
53a3b3c80e
Test improvements and fixed deprecation warnings (#2464)
* `asyncio_default_fixture_loop_scope = function`
* Fix a bunch of BeautifulSoup deprecation warnings
* Fix for PytestUnraisableExceptionWarning: Exception ignored in: <_io.FileIO [closed]>
* xfail for sql_time_limit tests (these can be flaky in CI)

Refs #2461
2025-02-04 14:49:52 -08:00
Simon Willison
962da77d61
Try the event_loop fixture (#2463)
Refs https://github.com/simonw/datasette/issues/2461#issuecomment-2634920351
2025-02-04 11:56:19 -08:00
Simon Willison
b9047d812a Skip the serial marked tests in pytest coverage
Refs https://github.com/simonw/datasette/issues/2461#issuecomment-2634896235
2025-02-04 11:37:01 -08:00
Simon Willison
9e41d19f73 pytest.mark.serial on CLI tests, refs #2461 2025-02-04 11:28:16 -08:00
Simon Willison
f57977a08f /-/permissions?filter=exclude-yours/only-yours - closes #2460 2025-02-04 11:09:44 -08:00
Simon Willison
4dff846271 simple_primary_key now uses integer id, helps close #2458 2025-02-01 21:44:53 -08:00
dependabot[bot]
d48e5ae0ce
Bump rollup from 3.3.0 to 3.29.5 (#2432)
Bumps [rollup](https://github.com/rollup/rollup) from 3.3.0 to 3.29.5.
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v3.3.0...v3.29.5)

---
updated-dependencies:
- dependency-name: rollup
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-02-01 17:03:30 -08:00
Simon Willison
b190b87ec6 Detect single unique text column in label_column_for_table, closes #2458
Also added new tests for label_column_for_table()
2025-02-01 17:02:49 -08:00
Simon Willison
d9a450b197 Show registered permissions on /-/permissions
Closes #1943
2025-01-15 17:42:13 -08:00
Simon Willison
308c243cfd datasette.set_actor_cookie() and datasette.delete_actor_cookie(), closes #1690 2025-01-15 17:37:25 -08:00
Simon Willison
37873e02b0 Better breadcrumbs on database and table page, closes #2454 2025-01-09 10:07:03 -08:00
Simon Willison
34390bbed8 Fix for params metadata error, closes #2455 2025-01-09 09:54:06 -08:00
Solomon Himelbloom
1902735c63
docs: fix time travel bug via changelog.rst (#2449) 2025-01-01 15:41:42 -08:00
Simon Willison
72f8ac680a CI against Python 3.13 2024-11-28 17:15:54 -08:00
Simon Willison
7077b8b1ba Changelog for 0.65.1, refs #2443 2024-11-28 17:14:27 -08:00
Simon Willison
e85517dab3 blacken-docs, refs #2441 2024-11-15 13:34:45 -08:00
Simon Willison
dce718961c Async support for magic parameters
Closes #2441
2024-11-15 13:17:45 -08:00
Simon Willison
b0b600b79f Release notes for 0.65 in main branch
Refs #2434
2024-10-07 10:40:57 -07:00
Simon Willison
832f76ce26 Documentation for datasette serve environment variables
Refs #2422, #2424
2024-09-09 09:18:47 -07:00
Simon Willison
ea9f66f9fb Rename SQLITE_EXTENSIONS to DATASETTE_LOAD_EXTENSION
Closes #2424
2024-09-09 09:16:23 -07:00
Alex Garcia
a542870bfb
Add DATASETTE_SSL_KEYFILE and DATASETTE_SSL_CERTFILE envvars to datasette serve flags (#2423)
Closes #2422
2024-09-09 08:58:33 -07:00
Simon Willison
0bc6a2af89 Release 1.0a16
Refs #2320, #2342, #2398, #2399, #2400, #2403, #2404, #2405, #2406, #2407, #2408, #2414, #2415, #2420
2024-09-05 20:56:46 -07:00
Simon Willison
2ec4d8a4d5 Removed a img styles, closes #2420 2024-09-05 20:45:07 -07:00
Simon Willison
f601425015 Table styles now only apply to table.rows-and-columns, refs #2420 2024-09-05 20:11:23 -07:00
Simon Willison
6da8d09a14 header.hd and footer.ft, refs #2420 2024-09-05 19:57:27 -07:00
Simon Willison
deb482a41e .core label, refs #2420 2024-09-05 19:53:06 -07:00
Simon Willison
2170269258
New .core CSS class for inputs and buttons
* Initial .core input/button classes, refs #2415
* Docs for the new .core CSS class, refs #2415
* Applied .core class everywhere that needs it, closes #2415
2024-09-03 08:37:26 -07:00
Simon Willison
92c4d41ca6 results.dicts() method, closes #2414 2024-09-01 17:20:41 -07:00
Simon Willison
dc288056b8 Better handling of errors for count all button, refs #2408 2024-08-21 19:56:02 -07:00
Simon Willison
9ecce07b08 count all rows button on table page, refs #2408 2024-08-21 19:09:25 -07:00
Simon Willison
dc1d152476 Stop counting at 10,000 rows when listing tables, refs #2398 2024-08-21 14:58:29 -07:00
Simon Willison
bc46066f9d Fix huge performance bug in DateFacet, refs #2407 2024-08-21 14:38:11 -07:00
Simon Willison
f28ff8e4f0 Consider just 1000 rows for suggest facet, closes #2406 2024-08-21 13:36:42 -07:00
Simon Willison
8a63cdccc7 Tracer now catches errors, closes #2405 2024-08-21 12:19:18 -07:00
Simon Willison
34a6b2ac84 Fixed bug with ?_trace=1 and large responses, closes #2404 2024-08-21 10:58:17 -07:00
Simon Willison
9028d7f805 Support nested JSON in metadata.json, closes #2403 2024-08-21 09:53:52 -07:00
Tiago Ilieve
1f3fb5f96b
debugger: load 'ipdb' if present
* debugger: load 'ipdb' if present

Transparently chooses between the IPython-enhanced 'ipdb' or the
standard 'pdb'.

* datasette install ipdb

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2024-08-20 20:02:35 -07:00
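The "load 'ipdb' if present" commit above describes a simple import fallback: prefer the IPython-enhanced debugger when installed, otherwise use the stdlib `pdb`:

```python
try:
    import ipdb as debugger  # IPython-enhanced debugger, if installed
except ImportError:
    import pdb as debugger   # stdlib fallback

# debugger.set_trace() now drops into whichever debugger was found.
print(debugger.__name__)
```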
Simon Willison
4efcc29d02
Test against Python "3.13-dev"
Refs:
- #2320
2024-08-20 19:15:36 -07:00
Simon Willison
39dfc7d7d7
Removed units functionality and Pint dependency
Closes #2400, unblocks #2320
2024-08-20 19:03:33 -07:00
Simon Willison
d444b6aad5 Fix for spacing on index page, closes #2399 2024-08-20 09:36:02 -07:00
Simon Willison
7d8dd2ac7f Release 1.0a15
Refs #2296, #2326, #2384, #2386, #2389, #2390, #2393, #2394
2024-08-15 22:04:04 -07:00
Alex Garcia
0dd41efce6
skip over "queries" blocks when processing database-level metadata items (#2386) 2024-08-15 21:48:07 -07:00
Simon Willison
53a8ae1871 Applied Black, refs #2327, #2326 2024-08-15 17:16:47 -07:00
Seb Bacon
9cb5700d60
bugfix: correctly detect json1 in versions.json (#2327)
Fixes #2326
2024-08-15 13:20:26 -07:00
Alex Garcia
6d91d082e0
Hide shadow tables, don't hide virtual tables
Closes #2296
2024-08-15 13:19:22 -07:00
Simon Willison
05dfd34fd0 Use text/html for CSRF error page, refs #2390 2024-08-15 08:48:47 -07:00
dependabot[bot]
160d82f06e
Bump furo and black (#2385)
Updates `furo` from 2024.7.18 to 2024.8.6
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2024.07.18...2024.08.06)

Updates `black` from 24.4.2 to 24.8.0
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.4.2...24.8.0)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>

* Pin Sphinx==7.4.7

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Simon Willison <swillison@gmail.com>
2024-08-14 21:38:33 -07:00
Simon Willison
492378c2a0 Test for application/json; charset=utf-8
Refs #2384, #2392
2024-08-14 21:37:40 -07:00
Alex Garcia
cf4274f2a3
less strict requirements to content-type=application/json (#2392) 2024-08-14 21:33:58 -07:00
Simon Willison
e9d34a99b8 Missing template from previous commit, refs #2389 2024-08-14 21:32:57 -07:00
Simon Willison
06d4ffb92e Custom error on CSRF failures, closes #2390
Uses https://github.com/simonw/asgi-csrf/issues/28
2024-08-14 21:29:16 -07:00
Simon Willison
93067668fe /-/ alternative URL for homepage, closes #2393 2024-08-14 17:57:13 -07:00
Simon Willison
bf953628bb Fix bug where -s could reset settings to defaults, closes #2389 2024-08-14 14:28:48 -07:00
Simon Willison
f6bd2bf8b0 Release 1.0a14
Refs #2306, #2307, #2311, #2319, #2341, #2348, #2352, #2353, #2358, #2359, #2360, #2375

Closes #2381
2024-08-05 14:30:02 -07:00
Simon Willison
2e82eb108c Group docs on get_/set_ metadata methods, refs #2381 2024-08-05 14:16:34 -07:00
Simon Willison
e9f598609b Fix for codespell recipe 2024-08-05 14:16:11 -07:00
Simon Willison
ff710ed7eb Typo fix, refs #2381 2024-08-05 14:09:12 -07:00
Simon Willison
5bd6853bf4 Release notes for 1.0a14, refs #2381
Refs #2306, #2307, #2311, #2319, #2341, #2348, #2352, #2353, #2358, #2359, #2360, #2375
2024-08-05 14:08:15 -07:00
Simon Willison
78ce105413 Markup fix 2024-08-05 14:07:16 -07:00
Simon Willison
2b0a61ee19 Rename metadata tables and add schema to docs, refs #2382 2024-08-05 13:53:55 -07:00
Simon Willison
8dc9bfa2ab Markup tweak for track_event docs 2024-08-05 12:58:10 -07:00
Simon Willison
2ad51baa31 Move /db?sql= redirect to top of changes
Refs #2381
2024-08-05 12:16:30 -07:00
Simon Willison
bd7d3bb70f Tweaks and improvements to upgrade guide, refs #2374
Also refs #2381
2024-08-05 12:11:17 -07:00
Alex Garcia
169ee5d710
Initial upgrade guide for v0.XX to v1 2024-08-05 10:35:38 -07:00
Simon Willison
81b68a143a /-/auth-token as root redirects to /, closes #2375 2024-07-26 14:09:20 -07:00
dependabot[bot]
feccfa2a4d
Bump the python-packages group across 1 directory with 2 updates (#2371)
Bumps the python-packages group with 2 updates in the / directory: [sphinx](https://github.com/sphinx-doc/sphinx) and [furo](https://github.com/pradyunsg/furo).


Updates `sphinx` from 7.3.7 to 7.4.7
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.3.7...v7.4.7)

Updates `furo` from 2024.5.6 to 2024.7.18
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2024.05.06...2024.07.18)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-25 16:04:29 -07:00
Simon Willison
2edf45b1b6 Use isolation_level=IMMEDIATE, refs #2358 2024-07-16 14:20:54 -07:00
Alex Garcia
a23c2aee00
Introduce new /$DB/-/query endpoint, soft replaces /$DB?sql=... (#2363)
* Introduce new default /$DB/-/query endpoint
* Fix a lot of tests
* Update pyodide test to use query endpoint
* Link to /fixtures/-/query in a few places
* Documentation for QueryView

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2024-07-15 10:33:51 -07:00
dependabot[bot]
56adfff8d2
Bump the python-packages group across 1 directory with 4 updates (#2362)
Bumps the python-packages group with 4 updates in the / directory: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo), [blacken-docs](https://github.com/adamchainz/blacken-docs) and [black](https://github.com/psf/black).


Updates `sphinx` from 7.2.6 to 7.3.7
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.6...v7.3.7)

Updates `furo` from 2024.1.29 to 2024.5.6
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2024.01.29...2024.05.06)

Updates `blacken-docs` from 1.16.0 to 1.18.0
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/adamchainz/blacken-docs/compare/1.16.0...1.18.0)

Updates `black` from 24.2.0 to 24.4.2
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.2.0...24.4.2)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-02 09:45:15 -07:00
Simon Willison
c2e8e5085b Release notes for 0.64.8 on main 2024-06-21 16:36:58 -07:00
Simon Willison
263788906a Fix for RowNotFound, refs #2359 2024-06-21 16:10:16 -07:00
Simon Willison
7316dd4ac6 Fix for TableNotFound, refs #2359 2024-06-21 16:09:20 -07:00
Simon Willison
62686114ee Do not show database name in Database Not Found error, refs #2359 2024-06-21 16:02:15 -07:00
Simon Willison
93534fd3d0 Show response.text on test_upsert failure, refs #2356 2024-06-13 10:19:26 -07:00
Simon Willison
45c27603d2 xfail two flaky tests, #2355, #2356 2024-06-13 10:15:38 -07:00
Alex Garcia
8f86d2af6a
Test against multiple SQLite versions (#2352)
* Use sqlite-versions action for testing multiple versions
2024-06-13 10:09:45 -07:00
Simon Willison
64a125b860 Removed unnecessary comments, refs #2354 2024-06-12 16:56:59 -07:00
Simon Willison
d118d5c5bb named_parameters(sql) sync function, refs #2354
Also refs #2353 and #2352
2024-06-12 16:51:07 -07:00
Simon Willison
b39b01a890 Copy across release notes from 0.64.7
Refs #2353
2024-06-12 16:21:07 -07:00
Simon Willison
780deaa275 Reminder about how to deploy a release branch 2024-06-12 16:12:05 -07:00
Simon Willison
2b6bfddafc Workaround for #2353 2024-06-11 14:04:55 -07:00
Simon Willison
7437d40e5d <html lang="en">, closes #2348 2024-06-11 10:17:02 -07:00
Simon Willison
9a3c3bfcc7 Fix for pyodide test failure, refs #2351 2024-06-11 10:11:34 -07:00
Simon Willison
c698d008e0 Only test first wheel, fixes surprise bug
https://github.com/simonw/datasette/issues/2351#issuecomment-2161211173
2024-06-11 10:04:05 -07:00
Alex Garcia
e1bfab3fca
Move Metadata to --internal database
Refs:
- https://github.com/simonw/datasette/pull/2343
- https://github.com/simonw/datasette/issues/2341
2024-06-11 09:33:23 -07:00
Simon Willison
8f9509f00c
datasette, not self.ds, in internals documentation 2024-04-22 16:01:37 -07:00
Simon Willison
7d6d471dc5 Include actor in track_event async example, refs #2319 2024-04-11 18:53:07 -07:00
Simon Willison
2a08ffed5c
Async example for track_event hook
Closes #2319
2024-04-11 18:47:01 -07:00
Simon Willison
63714cb2b7 Fixed some typos spotted by Gemini Pro 1.5, closes #2318 2024-04-10 17:05:15 -07:00
Simon Willison
d32176c5b8
Typo fix triggera -> triggers 2024-04-10 16:50:09 -07:00
Simon Willison
19b6a37336 z-index: 10000 on dropdown menu, closes #2311 2024-03-21 10:15:57 -07:00
Simon Willison
1edb24f124 Docs for 100 max rows in an insert, closes #2310 2024-03-19 09:15:39 -07:00
Simon Willison
da68662767
datasette-enrichments is example of row_actions
Refs:
- https://github.com/simonw/datasette/issues/2299
- https://github.com/datasette/datasette-enrichments/issues/41
2024-03-17 14:40:47 -07:00
Agustin Bacigalup
67e66f36c1
Add ETag header for static responses (#2306)
* add etag to static responses

* fix RuntimeError related to static headers

* Remove unnecessary import

---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2024-03-17 12:18:40 -07:00
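The ETag PR above follows the standard conditional-request pattern. A hedged sketch (hypothetical helpers, not Datasette's actual code): hash the static file's bytes into a quoted ETag, and answer `304 Not Modified` with an empty body when the client's `If-None-Match` matches:

```python
import hashlib

def etag_for(body):
    # ETags are quoted strings in HTTP
    return '"{}"'.format(hashlib.md5(body).hexdigest())

def respond(body, if_none_match):
    etag = etag_for(body)
    if if_none_match == etag:
        return 304, b""  # client cache is fresh; send no body
    return 200, body

css = b"body { color: red }"
status, _ = respond(css, None)
status2, body2 = respond(css, etag_for(css))
print(status, status2)  # 200 304
```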
Simon Willison
261fc8d875 Fix datetime.utcnow deprecation warning 2024-03-15 15:32:12 -07:00
Simon Willison
eb8545c172 Refactor duplicate code in DatasetteClient, closes #2307 2024-03-15 15:29:03 -07:00
Simon Willison
54f5604caf Fixed cookies= httpx warning, refs #2307 2024-03-15 15:19:23 -07:00
Simon Willison
5af6837725 Fix httpx warning about app=self.app, refs #2307 2024-03-15 15:15:31 -07:00
Simon Willison
8b6f155b45 Added two things I left out of the 1.0a13 release notes
Refs #2104, #2294

Closes #2303
2024-03-12 19:19:51 -07:00
Simon Willison
c92f326ed1 Release 1.0a13
#2104, #2286, #2293, #2297, #2298, #2299, #2300, #2301, #2302
2024-03-12 19:10:53 -07:00
Simon Willison
feddd61789 Fix tests I broke in #2302 2024-03-12 17:01:51 -07:00
Simon Willison
9cc6f1908f Gradient on header and footer, closes #2302 2024-03-12 16:54:03 -07:00
Simon Willison
e088abdb46 Refactored action menus to a shared include, closes #2301 2024-03-12 16:35:34 -07:00
Simon Willison
828ef9899f Ran blacken-docs, refs #2299 2024-03-12 16:25:25 -07:00
Simon Willison
8d456aae45 Fix spelling of displayed, refs #2299 2024-03-12 16:17:53 -07:00
Simon Willison
b8711988b9 row_actions() plugin hook, closes #2299 2024-03-12 16:16:05 -07:00
Simon Willison
7339cc51de Rearrange plugin hooks page with more sections, closes #2300 2024-03-12 15:44:10 -07:00
Simon Willison
06281a0b8e Test for labels on Table/View action buttons, refs #2297 2024-03-12 14:32:48 -07:00
Simon Willison
909c85cd2b view_actions plugin hook, closes #2297 2024-03-12 14:25:28 -07:00
Simon Willison
daf5ca02ca homepage_actions() plugin hook, closes #2298 2024-03-12 13:46:06 -07:00
Simon Willison
7b32d5f7d8 datasette-create-view as example of query_actions hook 2024-03-07 00:11:14 -05:00
Simon Willison
7818e8b9d1 Hide tables starting with an _, refs #2104 2024-03-07 00:03:42 -05:00
Simon Willison
a395256c8c Allow-list select * from pragma_table_list()
Refs https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475
2024-03-07 00:03:20 -05:00
Simon Willison
090dff542b
Action menu descriptions
* Refactor tests to extract get_actions_links() helper
* Table, database and query action menu items now support optional descriptions

Closes #2294
2024-03-06 22:54:06 -05:00
Simon Willison
c6e8a4a76c
margin-bottom on .page-action-menu, refs #2286 2024-03-05 19:34:57 -08:00
Simon Willison
4d24bf6b34 Don't explain an explain even in the demo, refs #2293 2024-03-05 18:14:55 -08:00
Simon Willison
5de6797d4a Better demo plugin for query_actions, refs #2293 2024-03-05 18:06:38 -08:00
Simon Willison
86335dc722 Release 1.0a12
Refs #2281, #2283, #2287, #2289
2024-02-29 14:35:28 -08:00
Simon Willison
57c1ce0e8b Reset column menu on every click, closes #2289 2024-02-29 14:25:50 -08:00
Simon Willison
6ec0081f5d
query_actions plugin hook
* New query_actions plugin hook, closes #2283
2024-02-27 21:55:16 -08:00
Simon Willison
f99c2f5f8c ?column_notcontains= table filter, closes #2287 2024-02-27 16:07:41 -08:00
Simon Willison
c863443ea1 Documentation for derive_named_parameters()
Closes #2284

Refs https://github.com/simonw/datasette-write/issues/7#issuecomment-1967593883
2024-02-27 13:24:47 -08:00
Simon Willison
dfd4ad558b
New design for table and database action menus
Closes #2281
2024-02-25 12:54:16 -08:00
Simon Willison
434123425f Release 1.0a11
Refs #2263, #2278, #2279

Closes #2280
2024-02-19 14:48:37 -08:00
Jeroen Van Goey
103b4decbd
fix (typo): Corrected spelling of 'environments' (#2268)
* fix (typo): Corrected spelling of 'environments'

* ci: add test folder to codespell workflow
2024-02-19 14:41:32 -08:00
dependabot[bot]
158d5d96e9
Bump the python-packages group with 1 update (#2269)
Bumps the python-packages group with 1 update: [black](https://github.com/psf/black).


Updates `black` from 24.1.1 to 24.2.0
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/24.1.1...24.2.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-19 14:23:12 -08:00
Simon Willison
28bf3a933f Applied Black, refs #2278 2024-02-19 14:22:59 -08:00
Simon Willison
26300738e3 Fixes for permissions debug page, closes #2278 2024-02-19 14:17:37 -08:00
Simon Willison
27409a7892 Fix for hook position in wide column names, refs #2263 2024-02-19 14:01:55 -08:00
Simon Willison
392ca2e24c Improvements to table column cog menu display, closes #2263
- Repositions if menu would cause a horizontal scrollbar
- Arrow tip on menu now attempts to align with cog icon on column
2024-02-19 13:40:48 -08:00
Simon Willison
b36a2d8f4b Require update-row to use insert replace, closes #2279 2024-02-19 12:55:51 -08:00
Simon Willison
3856a8cb24 Consistent Permission denied:, refs #2279 2024-02-19 12:51:14 -08:00
Simon Willison
81629dbeff Upgrade GitHub Actions, including PyPI publishing 2024-02-17 21:03:41 -08:00
Simon Willison
a4fa1ef3bd Release 1.0a10
Refs #2277
2024-02-17 20:56:15 -08:00
Simon Willison
10f9ba1a00 Take advantage of execute_write_fn(transaction=True)
A bunch of places no longer need to do manual transaction handling
thanks to this change. Refs #2277
2024-02-17 20:51:19 -08:00
Simon Willison
5e0e440f2c database.execute_write_fn(transaction=True) parameter, closes #2277 2024-02-17 20:28:15 -08:00
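The `transaction=True` parameter described above wraps a write function in a single transaction, so callers no longer need manual BEGIN/COMMIT handling. A minimal sketch of that pattern using plain `sqlite3` (illustrative only, not Datasette's actual `Database.execute_write_fn()` implementation):

```python
import sqlite3

def execute_write_fn(conn, fn, transaction=True):
    # With transaction=True the function runs inside a transaction that
    # commits on success and rolls back if it raises; with False it is
    # called directly and must manage its own transactions.
    if transaction:
        with conn:  # sqlite3 connection context manager = one transaction
            return fn(conn)
    return fn(conn)

conn = sqlite3.connect(":memory:")
conn.execute("create table counters (name text primary key, value integer)")

def good(conn):
    conn.execute("insert into counters values ('hits', 1)")

def bad(conn):
    conn.execute("insert into counters values ('misses', 0)")
    raise ValueError("boom")  # the whole function's writes are rolled back

execute_write_fn(conn, good)
try:
    execute_write_fn(conn, bad)
except ValueError:
    pass

names = [row[0] for row in conn.execute("select name from counters")]
# only "hits" survives - the failed function's insert was rolled back
```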
Simon Willison
e1c80efff8 Note about activating alpha documentation versions on ReadTheDocs 2024-02-16 14:43:36 -08:00
Simon Willison
9906f937d9 Release 1.0a9
Refs #2101, #2260, #2262, #2265, #2270, #2273, #2274, #2275

Closes #2276
2024-02-16 14:36:12 -08:00
Simon Willison
3a999a85fb Fire insert-rows on /db/-/create if rows were inserted, refs #2260 2024-02-16 13:59:56 -08:00
Simon Willison
244f3ff83a Test demonstrating fix for permissions bug in #2262 2024-02-16 13:39:57 -08:00
Simon Willison
8bfa3a51c2 Consider every plugins opinion in datasette.permission_allowed()
Closes #2275, refs #2262
2024-02-16 13:29:39 -08:00
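The change above means `datasette.permission_allowed()` gathers an opinion from every plugin rather than stopping at the first. A rough sketch of that resolution logic (function names hypothetical, not Datasette's actual code): an explicit deny wins, then an explicit allow, otherwise the default applies.

```python
def resolve_permission(opinions, default=False):
    # Fold together every plugin's opinion: any explicit deny (False)
    # wins, otherwise any explicit allow (True) wins, otherwise fall
    # back to the default. None means "no opinion".
    votes = [o for o in opinions if o is not None]
    if any(v is False for v in votes):
        return False
    if any(v is True for v in votes):
        return True
    return default

resolve_permission([None, True])                 # allowed
resolve_permission([True, False])                # denied - deny wins
resolve_permission([None, None], default=True)   # falls back to default
```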
Simon Willison
232a30459b DATASETTE_TRACE_PLUGINS setting, closes #2274 2024-02-16 13:00:24 -08:00
Simon Willison
47e29e948b Better comments in permission_allowed_default() 2024-02-16 10:05:18 -08:00
Simon Willison
97de4d6362 Use transaction in delete_everything(), closes #2273 2024-02-15 21:35:49 -08:00
Simon Willison
b89cac3b6a
Use MD5 usedforsecurity=False on Python 3.9 and higher to pass FIPS
Closes #2270
2024-02-13 18:23:54 -08:00
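The fix above relies on the `usedforsecurity=False` flag that `hashlib` accepts on Python 3.9 and higher, which tells FIPS-enabled OpenSSL builds that the MD5 use is non-cryptographic. A small version-guarded sketch of the pattern:

```python
import hashlib
import sys

def content_hash(data: bytes) -> str:
    # usedforsecurity=False (Python 3.9+) marks this MD5 call as
    # non-cryptographic (e.g. a cache key) so FIPS builds permit it.
    if sys.version_info >= (3, 9):
        return hashlib.md5(data, usedforsecurity=False).hexdigest()
    return hashlib.md5(data).hexdigest()

content_hash(b"hello")  # '5d41402abc4b2a76b9719d911017c592'
```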
Simon Willison
5d79974186
Call them "notable events" 2024-02-10 07:19:47 -08:00
Simon Willison
398a92cf1e Include database in name of _execute_writes thread, closes #2265 2024-02-08 20:12:31 -08:00
Simon Willison
bd9ed62e5d Make ds.permission_allowed(..., default=) a keyword-only argument, refs #2262 2024-02-08 20:12:31 -08:00
Simon Willison
dcd9ea3622
datasette-events-db as an example of track_events() 2024-02-08 14:14:58 -08:00
Simon Willison
c62cfa6de8 Fix upsert test to detect new alter-table event 2024-02-08 13:36:17 -08:00
Simon Willison
c954795f9a alter: true for row/-/update, refs #2101 2024-02-08 13:36:17 -08:00
Simon Willison
4e944c29e4 Corrected path used in test_update_row_check_permission 2024-02-08 13:36:17 -08:00
Simon Willison
528d89d1a3 alter: true support for /-/insert and /-/upsert, refs #2101 2024-02-08 13:36:17 -08:00
Simon Willison
b5ccc4d608 Test for Permission denied - need alter-table 2024-02-08 13:36:17 -08:00
Simon Willison
574687834f Docs for /db/-/create alter: true option, refs #2101 2024-02-08 13:36:17 -08:00
Simon Willison
900d15bcb8 alter table support for /db/-/create API, refs #2101 2024-02-08 13:36:17 -08:00
Simon Willison
569aacd39b
Link to /en/latest/ changelog 2024-02-07 22:53:14 -08:00
Simon Willison
9989f25709 Release 1.0a8
Refs #2052, #2156, #2243, #2247, #2249, #2252, #2254, #2258
2024-02-07 08:34:05 -08:00
Simon Willison
e0794ddd52 Link to annotated release notes blog post 2024-02-07 08:32:47 -08:00
Simon Willison
1e31821d9f Link to events docs from changelog 2024-02-07 08:31:26 -08:00
Simon Willison
df8d1c055a
Mention JS plugins in release intro 2024-02-06 22:59:58 -08:00
Simon Willison
d0089ba776 Note in changelog about datasette publish, refs #2195 2024-02-06 22:30:30 -08:00
Simon Willison
c64453a4a1 Fix the date on the 1.0a8 release (due to go tomorrow)
Refs #2258
2024-02-06 22:28:22 -08:00
Simon Willison
ad01f9d321
1.0a8 release notes
Closes #2243

* Changelog for jinja2_environment_from_request and plugin_hook_slots
* track_event() in changelog
* Remove Using YAML for metadata section - no longer necessary now we show YAML and JSON examples everywhere.
* Configuration via the command-line section - #2252
* JavaScript plugins in release notes, refs #2052
* /-/config in changelog, refs #2254

Refs #2052, #2156, #2243, #2247, #2249, #2252, #2254
2024-02-06 22:24:24 -08:00
Simon Willison
9ac9f0152f Migrate allow from metadata to config if necessary, closes #2249 2024-02-06 22:18:38 -08:00
Simon Willison
60c6692f68
table_config instead of table_metadata (#2257)
Table configuration that was incorrectly placed in metadata is now treated as if it were in config.

New await datasette.table_config() method.

Closes #2247
2024-02-06 21:57:09 -08:00
Simon Willison
52a1dac5d2 Test proving $env works for datasette.yml, closes #2255 2024-02-06 21:00:55 -08:00
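The `$env` mechanism tested above lets a config value be written as `{"$env": "NAME"}` and replaced with that environment variable at load time. A minimal sketch of the substitution idea (simplified, not Datasette's actual loader):

```python
import os

def resolve_env(config):
    # Recursively replace {"$env": "NAME"} markers with the value of
    # that environment variable, leaving everything else untouched.
    if isinstance(config, dict):
        if set(config) == {"$env"}:
            return os.environ.get(config["$env"])
        return {key: resolve_env(value) for key, value in config.items()}
    if isinstance(config, list):
        return [resolve_env(value) for value in config]
    return config

os.environ["RIPGREP_PATH"] = "/home/simon/code"
config = {"plugins": {"datasette-ripgrep": {"path": {"$env": "RIPGREP_PATH"}}}}
resolved = resolve_env(config)
# {'plugins': {'datasette-ripgrep': {'path': '/home/simon/code'}}}
```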
Simon Willison
f049103852 datasette.table_metadata() is now await datasette.table_config(), refs #2247 2024-02-06 17:33:18 -08:00
Simon Willison
69c6e95323 Fixed a bunch of unused imports spotted with ruff 2024-02-06 17:27:20 -08:00
Simon Willison
5d21057cf1 /-/config example, refs #2254 2024-02-06 15:22:03 -08:00
Simon Willison
5a63ecc557 Rename metadata= to table_config= in facet code, refs #2247 2024-02-06 15:03:19 -08:00
Simon Willison
1e901aa690 /-/config page, closes #2254 2024-02-06 12:33:46 -08:00
Simon Willison
85a1dfe6e0 Configuration via the command-line section
Closes #2252

Closes #2156
2024-02-05 13:43:50 -08:00
Simon Willison
efc7357554 Remove Using YAML for metadata section
No longer necessary now we show YAML and JSON examples everywhere.
2024-02-05 13:01:03 -08:00
Simon Willison
503545b203 JavaScript plugins documentation, closes #2250 2024-02-05 11:47:17 -08:00
Simon Willison
7219a56d1e 3 space indent, not 2 2024-02-05 10:34:10 -08:00
Simon Willison
5ea7098e4d Fixed an unnecessary f-string 2024-02-04 10:15:21 -08:00
Simon Willison
4ea109ac4d Two spaces is aesthetically more pleasing here 2024-02-01 15:47:41 -08:00
Simon Willison
6ccef35cc9 More links between events documentation 2024-02-01 15:42:45 -08:00
Simon Willison
be4f02335f Treat plugins in metadata as if they were in config, closes #2248 2024-02-01 15:33:33 -08:00
Simon Willison
d4bc2b2dfc Remove fail_if_plugins_in_metadata, part of #2248 2024-02-01 14:44:16 -08:00
Simon Willison
4da581d09b Link to config reference 2024-02-01 14:40:49 -08:00
Simon Willison
b466749e88 Filled out docs/configuration.rst, closes #2246 2024-01-31 20:03:19 -08:00
Simon Willison
bcf7ef963f YAML/JSON examples for allow blocks 2024-01-31 19:45:05 -08:00
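An allow block maps actor properties to permitted values, where a value can be a single string, a list, or the `"*"` wildcard. A simplified sketch of how such a block is evaluated against an actor (a reduced version of the documented semantics, not the full implementation):

```python
def actor_matches_allow(actor, allow):
    # Simplified allow-block check: allow=True permits everyone;
    # otherwise any key whose value (or list of values) matches the
    # actor's property grants access, with "*" matching anything.
    if allow is True:
        return True
    if actor is None or not allow:
        return False
    for key, values in allow.items():
        if not isinstance(values, list):
            values = [values]
        if actor.get(key) in values or "*" in values:
            return True
    return False

actor_matches_allow({"id": "simon"}, {"id": ["simon", "cleo"]})  # True
actor_matches_allow({"id": "other"}, {"id": "simon"})            # False
actor_matches_allow({"id": "anyone"}, {"id": "*"})               # True
```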
Simon Willison
2e4a03b2c4
Run coverage on Python 3.12
- #2245

I hoped this would run slightly faster than 3.9 but there doesn't appear to be a performance improvement.
2024-01-31 15:31:26 -08:00
Simon Willison
bcc4f6bf1f
track_event() mechanism for analytics and plugins
* Closes #2240
* Documentation for event plugin hooks, refs #2240
* Include example track_event plugin in docs, refs #2240
* Tests for track_event() and register_events() hooks, refs #2240
* Initial documentation for core events, refs #2240
* Internals documentation for datasette.track_event()
2024-01-31 15:21:40 -08:00
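The general shape of the `track_event()` mechanism introduced above: events are dataclasses and registered listeners receive each one as it is tracked. A generic dispatcher sketch in that spirit (not Datasette's actual plugin-hook API):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Event:
    # A named event with the actor who triggered it and a timestamp.
    name: str
    actor: dict = None
    created: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

listeners = []

def track_event(event):
    # Fan the event out to every registered listener (e.g. an
    # analytics plugin writing events to a database table).
    for listener in listeners:
        listener(event)

seen = []
listeners.append(lambda e: seen.append((e.name, e.actor)))
track_event(Event("login", {"id": "simon"}))
# seen == [("login", {"id": "simon"})]
```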
dependabot[bot]
890615b3f2
Bump the python-packages group with 1 update (#2241)
Bumps the python-packages group with 1 update: [furo](https://github.com/pradyunsg/furo).


Updates `furo` from 2023.9.10 to 2024.1.29
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.09.10...2024.01.29)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-31 10:53:57 -08:00
Simon Willison
959e020297 Ran blacken-docs 2024-01-30 20:40:18 -08:00
gerrymanoim
04e8835297
Remove deprecated/unused args from setup.py (#2222) 2024-01-30 19:56:32 -08:00
Forest Gregg
b8230694ff
Set link to download db to nofollow 2024-01-30 19:56:05 -08:00
Simon Willison
5c64af6936 Upgrade to latest Black, closes #2239 2024-01-30 19:55:26 -08:00
Simon Willison
c3caf36af7
Template slot family of plugin hooks - top_homepage() and others
New plugin hooks:

top_homepage
top_database
top_table
top_row
top_query
top_canned_query

New datasette.utils.make_slot_function()

Closes #1191
2024-01-30 19:54:03 -08:00
Simon Willison
7a5adb592a Docs on temporary plugins in fixtures, closes #2234 2024-01-12 14:12:14 -08:00
Simon Willison
a25bf6bea7 fmt: off to fix problem with Black, closes #2231 2024-01-10 14:12:20 -08:00
Simon Willison
0f63cb83ed
Typo fix 2024-01-10 13:08:52 -08:00
Simon Willison
7506a89be0 Docs on datasette.client for tests, closes #1830
Also covers ds.client.actor_cookie() helper
2024-01-10 13:04:34 -08:00
Simon Willison
48148e66a8 Link from actors_from_ids plugin hook docs to datasette.actors_from_ids() 2024-01-10 10:42:36 -08:00
Simon Willison
2ff4d4a60a Test for ?_extra=count, refs #262 2024-01-08 13:14:25 -08:00
Simon Willison
0b2c6a7ebd Fix for ?_extra=columns bug, closes #2230
Also refs #262 - started a test suite for extras.
2024-01-08 13:12:57 -08:00
Simon Willison
1fc76fee62 1.0a8.dev1 version number
Not going to release this to PyPI but I will build my own wheel of it
2024-01-05 16:59:25 -08:00
Simon Willison
c7a4706bcc
jinja2_environment_from_request() plugin hook
Closes #2225
2024-01-05 14:33:23 -08:00
Simon Willison
45b88f2056 Release notes from 0.64.6, refs #2214 2023-12-22 15:24:26 -08:00
Simon Willison
872dae1e1a Fix for CSV labels=on missing foreign key bug, closes #2214 2023-12-22 15:08:11 -08:00
Simon Willison
978249beda Removed rogue print("max_csv_mb")
Found this while working on #2214
2023-12-22 15:07:42 -08:00
Simon Willison
4284c74bc1
db.execute_isolated_fn() method (#2220)
Closes #2218
2023-12-19 10:51:03 -08:00
Simon Willison
89c8ca0f3f Fix for round_trip_load() YAML error, refs #2219 2023-12-19 10:32:55 -08:00
Simon Willison
067cc75dfa
Fixed broken example links in row page documentation 2023-12-12 09:49:04 -08:00
Cameron Yick
452a587e23
JavaScript Plugin API, providing custom panels and column menu items
Thanks, Cameron Yick.

https://github.com/simonw/datasette/pull/2052

Co-authored-by: Simon Willison <swillison@gmail.com>
2023-10-12 17:00:27 -07:00
Simon Willison
4b534b89a5 Ran cog
Refs #2052
2023-10-12 16:48:22 -07:00
Simon Willison
11f7fd38a4 Fixed some rST header warnings 2023-10-12 15:05:02 -07:00
Simon Willison
a4b401f470 Updated Discord link, refs #2196
This issue reminded me to use the datasette.io/discord redirect URL.
2023-10-12 14:57:04 -07:00
Alex Garcia
3d6d1e3050
Raise an exception if a "plugins" block exists in metadata.json 2023-10-12 09:20:50 -07:00
Alex Garcia
35deaabcb1
Move non-metadata configuration from metadata.yaml to datasette.yaml
* Allow and permission blocks moved to datasette.yaml
* Documentation updates, initial framework for configuration reference
2023-10-12 09:16:37 -07:00
Simon Willison
4e1188f60f Upgrade spellcheck.yml workflow 2023-10-08 09:09:45 -07:00
Simon Willison
85a41987c7 Fixed typo acepts -> accepts 2023-10-08 09:07:11 -07:00
Simon Willison
d51e63d3bb Release notes for 0.64.5, refs #2197 2023-10-08 09:06:43 -07:00
Simon Willison
836b1587f0 Release notes for 1.0a7
Refs #2189
2023-09-21 15:27:27 -07:00
Simon Willison
e4f868801a Use importlib_metadata for 3.9 as well, refs #2057 2023-09-21 14:58:39 -07:00
Simon Willison
f130c7c0a8 Deploy with fixtures-metadata.json, refs #2194, #2195 2023-09-21 14:09:57 -07:00
Simon Willison
2da1a6acec Use importlib_metadata for Python 3.8, refs #2057 2023-09-21 13:26:13 -07:00
Simon Willison
b7cf0200e2 Swap order of config and metadata options, refs #2194 2023-09-21 13:22:40 -07:00
Simon Willison
80a9cd9620 test-datasette-load-plugins now fails correctly, refs #2193 2023-09-21 12:55:50 -07:00
Simon Willison
b0d0a0e5de importlib_resources for Python < 3.9, refs #2057 2023-09-21 12:42:15 -07:00
Simon Willison
947520c1fe Release notes for 0.64.4 on main 2023-09-21 12:31:32 -07:00
Simon Willison
10bc805473 Finish removing pkg_resources, closes #2057 2023-09-21 12:13:16 -07:00
dependabot[bot]
6763572948
Bump sphinx, furo, black
Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [black](https://github.com/psf/black).


Updates `sphinx` from 7.2.5 to 7.2.6
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES.rst)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.5...v7.2.6)

Updates `furo` from 2023.8.19 to 2023.9.10
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.08.19...2023.09.10)

Updates `black` from 23.7.0 to 23.9.1
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.7.0...23.9.1)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: python-packages
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-20 15:11:24 -07:00
Simon Willison
b0e5d8afa3
Stop using parallel SQL queries for tables
Refs:
- #2189
2023-09-20 15:10:55 -07:00
Simon Willison
6ed7908580 Simplified test for #2189
This now executes two facets, in the hope that parallel facet execution
would illustrate the bug, but it did not.
2023-09-18 10:44:13 -07:00
Simon Willison
f56e043747 test_facet_against_in_memory_database, refs #2189
This is meant to illustrate a crashing bug but it does not trigger it.
2023-09-18 10:39:11 -07:00
Simon Willison
852f501485 Switch from pkg_resources to importlib.metadata in app.py, refs #2057 2023-09-16 09:35:18 -07:00
Simon Willison
16f0b6d822 JSON/YAML tabs on configuration docs page 2023-09-13 14:16:36 -07:00
Alex Garcia
b2ec8717c3
Plugin configuration now lives in datasette.yaml/json
* Checkpoint, moving top-level plugin config to datasette.json
* Support database-level and table-level plugin configuration in datasette.yaml

Refs #2093
2023-09-13 14:06:25 -07:00
Simon Willison
a4c96d01b2 Release 1.0a6
Refs #1765, #2164, #2169, #2175, #2178, #2181
2023-09-07 21:44:08 -07:00
Simon Willison
b645174271
actors_from_ids plugin hook and datasette.actors_from_ids() method (#2181)
* Prototype of actors_from_ids plugin hook, refs #2180
* datasette-remote-actors example plugin, refs #2180
2023-09-07 21:23:59 -07:00
Simon Willison
c26370485a Label expand permission check respects cascade, closes #2178 2023-09-07 16:28:30 -07:00
Simon Willison
ab040470e2 Applied blacken-docs 2023-09-07 15:57:27 -07:00
Simon Willison
dbfad6d220 Foreign key label expanding respects table permissions, closes #2178 2023-09-07 15:51:09 -07:00
Simon Willison
2200abfa17 Fix for flaky test_hidden_sqlite_stat1_table, closes #2179 2023-09-07 15:49:50 -07:00
Simon Willison
fbcb103c0c Added example code to database_actions hook documentation 2023-09-07 07:47:24 -07:00
dependabot[bot]
e4abae3fd7
Bump Sphinx (#2166)
Bumps the python-packages group with 1 update: [sphinx](https://github.com/sphinx-doc/sphinx).

- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.2.4...v7.2.5)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-09-06 09:34:31 -07:00
Simon Willison
e86eaaa4f3
Test against Python 3.12 preview (#2175)
https://dev.to/hugovk/help-test-python-312-beta-1508/
2023-09-06 09:16:27 -07:00
Simon Willison
05707aa16b
click-default-group>=1.2.3 (#2173)
* click-default-group>=1.2.3

Now available as a wheel:
- https://github.com/click-contrib/click-default-group/issues/21

* Fix for blacken-docs
2023-09-05 19:50:09 -07:00
Simon Willison
31d5c4ec05 Contraction - Google and Microsoft styleguides like it
I was trying out https://github.com/errata-ai/vale
2023-09-05 19:43:01 -07:00
Simon Willison
fd083e37ec Docs for plugins that define more plugin hooks, closes #1765 2023-08-31 16:06:30 -07:00
Simon Willison
98ffad9aed execute-sql now implies can view instance/database, closes #2169 2023-08-31 15:46:26 -07:00
Simon Willison
9cead33fb9
OperationalError: database table is locked fix
See also:
- https://til.simonwillison.net/datasette/remember-to-commit
2023-08-31 10:46:07 -07:00
Simon Willison
4c3ef03311
Another ReST fix 2023-08-30 16:19:59 -07:00
Simon Willison
2caa53a52a
ReST fix 2023-08-30 16:19:24 -07:00
Simon Willison
6bfe104d47
DATASETTE_LOAD_PLUGINS environment variable for loading specific plugins
Closes #2164

* Load only specified plugins for DATASETTE_LOAD_PLUGINS=datasette-one,datasette-two
* Load no plugins if DATASETTE_LOAD_PLUGINS=''
* Automated tests in a Bash script for DATASETTE_LOAD_PLUGINS
2023-08-30 15:12:24 -07:00
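The three behaviors listed above (unset = load everything, empty string = load nothing, comma-separated list = load only those) can be sketched as a small parser (illustrative only, not Datasette's actual plugin-loading code):

```python
import os

def plugins_to_load():
    # DATASETTE_LOAD_PLUGINS unset: no filter (load every plugin).
    # Set to '': load no plugins at all.
    # Set to a comma-separated list: load only the named plugins.
    value = os.environ.get("DATASETTE_LOAD_PLUGINS")
    if value is None:
        return None
    return [name.strip() for name in value.split(",") if name.strip()]

os.environ["DATASETTE_LOAD_PLUGINS"] = "datasette-one,datasette-two"
only_two = plugins_to_load()     # ['datasette-one', 'datasette-two']
os.environ["DATASETTE_LOAD_PLUGINS"] = ""
none_at_all = plugins_to_load()  # []
```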
Simon Willison
30b28c8367 Release 1.0a5
Refs #2093, #2102, #2153, #2156, #2157
2023-08-29 10:17:54 -07:00
Simon Willison
bb12229794 Rename core_ to catalog_, closes #2163 2023-08-29 10:01:28 -07:00
Simon Willison
50da908213
Cascade for restricted token view-table/view-database/view-instance operations (#2154)
Closes #2102

* Permission is now a dataclass, not a namedtuple - refs https://github.com/simonw/datasette/pull/2154/#discussion_r1308087800
* datasette.get_permission() method
2023-08-29 09:32:34 -07:00
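The cascade above means a token restricted to a narrow permission like view-table still passes the broader view-database and view-instance checks needed to reach that table. A toy sketch of the upward cascade (names and structure hypothetical):

```python
# Each action implies the broader "container" actions needed to use it.
CASCADE = {
    "view-table": {"view-database", "view-instance"},
    "view-database": {"view-instance"},
}

def token_allows(restrictions, action):
    # restrictions: the set of actions the token was explicitly granted.
    if action in restrictions:
        return True
    # A granted narrow action cascades up to its implied broader actions.
    return any(action in CASCADE.get(granted, ()) for granted in restrictions)

token_allows({"view-table"}, "view-instance")  # True - cascades upward
token_allows({"view-instance"}, "view-table")  # False - never downward
```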
Simon Willison
a1f3d75a52
Need to stick to Python 3.9 for gcloud 2023-08-28 20:46:12 -07:00
Alex Garcia
92b8bf38c0
Add new --internal internal.db option, deprecate legacy _internal database
Refs:
- #2157 
---------

Co-authored-by: Simon Willison <swillison@gmail.com>
2023-08-28 20:24:23 -07:00
dependabot[bot]
d28f12092d
Bump sphinx, furo, blacken-docs dependencies (#2160)
* Bump the python-packages group with 3 updates

Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs).


Updates `sphinx` from 7.1.2 to 7.2.4
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v7.1.2...v7.2.4)

Updates `furo` from 2023.7.26 to 2023.8.19
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.07.26...2023.08.19)

Updates `blacken-docs` from 1.15.0 to 1.16.0
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/1.15.0...1.16.0)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Simon Willison <swillison@gmail.com>
2023-08-28 17:38:32 -07:00
Simon Willison
2e2825869f Test for --get --actor, refs #2153 2023-08-28 13:18:24 -07:00
Simon Willison
d8351b08ed datasette --get --actor 'JSON' option, closes #2153
Refs #2154
2023-08-28 13:15:38 -07:00
Simon Willison
d9aad1fd04
-s/--setting x y gets merged into datasette.yml, refs #2143, #2156
This change updates the `-s/--setting` option to `datasette serve` to allow it to be used to set arbitrarily complex nested settings in a way that is compatible with the new `-c datasette.yml` work happening in:
- #2143

It will enable things like this:
```
datasette data.db --setting plugins.datasette-ripgrep.path "/home/simon/code"
```
For the moment though it just affects [settings](https://docs.datasette.io/en/1.0a4/settings.html) - so you can do this:
```
datasette data.db --setting settings.sql_time_limit_ms 3500
```
I've also implemented a backwards compatibility mechanism, so if you use it this way (the old way):
```
datasette data.db --setting sql_time_limit_ms 3500
```
It will notice that the setting you passed is one of Datasette's core settings, and will treat that as if you said `settings.sql_time_limit_ms` instead.
2023-08-28 13:06:14 -07:00
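The merging described above (dotted keys become nested config, bare core-setting names fall back to `settings.<name>`) can be sketched as follows; `known_settings` here is a hypothetical stand-in for Datasette's real list of core settings, and this is not the actual `pairs_to_nested_config()` implementation:

```python
def pairs_to_nested(pairs, known_settings=("sql_time_limit_ms", "max_returned_rows")):
    # Dotted keys become nested dictionaries; a bare key matching a
    # known core setting is rewritten to settings.<key> for backwards
    # compatibility with the old --setting behavior.
    config = {}
    for key, value in pairs:
        if "." not in key and key in known_settings:
            key = "settings." + key
        node = config
        *parents, last = key.split(".")
        for part in parents:
            node = node.setdefault(part, {})
        node[last] = value
    return config

nested = pairs_to_nested([
    ("plugins.datasette-ripgrep.path", "/home/simon/code"),
    ("sql_time_limit_ms", 3500),
])
# {'plugins': {'datasette-ripgrep': {'path': '/home/simon/code'}},
#  'settings': {'sql_time_limit_ms': 3500}}
```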
Simon Willison
527cec66b0 utils.pairs_to_nested_config(), refs #2156, #2143 2023-08-24 11:21:15 -07:00
Simon Willison
bdf59eb7db No more default to 15% on labels, closes #2150 2023-08-23 11:35:42 -07:00
Simon Willison
64fd1d788e Applied Cog, refs #2143, #2149 2023-08-22 19:57:46 -07:00
Simon Willison
2ce7872e3b -c shortcut for --config - refs #2143, #2149 2023-08-22 19:33:26 -07:00
Alex Garcia
17ec309e14
Start datasette.json, re-add --config, rm settings.json
The first step in defining the new `datasette.json/yaml` configuration mechanism.

Refs #2093, #2143, #493
2023-08-22 18:26:11 -07:00
Simon Willison
01e0558825
Merge pull request from GHSA-7ch3-7pp7-7cpq
* API explorer requires view-instance permission

* Check database/table permissions on /-/api page

* Release notes for 1.0a4

Refs #2119, #2133, #2138, #2140

Refs https://github.com/simonw/datasette/security/advisories/GHSA-7ch3-7pp7-7cpq
2023-08-22 10:10:01 -07:00
Simon Willison
943df09dcc Remove all remaining "$ " prefixes from docs, closes #2140
Also document sqlite-utils create-view
2023-08-11 10:44:34 -07:00
Simon Willison
4535568f2c Fixed display of database color
Closes #2139, closes #2119
2023-08-10 22:16:19 -07:00
Simon Willison
33251d04e7 Canned query write counters demo, refs #2134 2023-08-09 17:56:27 -07:00
Simon Willison
a3593c9015 on_success_message_sql, closes #2138 2023-08-09 17:32:07 -07:00
Simon Willison
4a42476bb7 datasette plugins --requirements, closes #2133 2023-08-09 15:04:16 -07:00
Simon Willison
19ab4552e2 Release 1.0a3
Closes #2135

Refs #262, #782, #1153, #1970, #2007, #2079, #2106, #2127, #2130
2023-08-09 12:13:11 -07:00
Simon Willison
90cb9ca58d JSON changes in release notes, refs #2135 2023-08-09 12:11:16 -07:00
Simon Willison
856ca68d94 Update default JSON representation docs, refs #2135 2023-08-09 12:04:40 -07:00
Simon Willison
e34d09c6ec Don't include columns in query JSON, refs #2136 2023-08-09 12:01:59 -07:00
Simon Willison
8920d425f4 1.0a3 release notes, smaller changes section - refs #2135 2023-08-09 10:20:58 -07:00
Simon Willison
26be9f0445 Refactored canned query code, replaced old QueryView, closes #2114 2023-08-09 08:26:52 -07:00
Simon Willison
cd57b0f712 Brought back parameter fields, closes #2132 2023-08-08 06:45:04 -07:00
Simon Willison
1377a290cd
New JSON design for query views (#2118)
* Refs #2111, closes #2110
* New Context dataclass/subclass mechanism, refs #2127
* Define QueryContext and extract get_tables() method, refs #2127
* Fix OPTIONS bug by porting DatabaseView to be a View subclass
* Expose async_view_for_class.view_class for test_routes test
* Error/truncated arguments for renderers, closes #2130
2023-08-07 18:47:39 -07:00
dependabot[bot]
5139c0886a
Bump the python-packages group with 3 updates (#2128)
Bumps the python-packages group with 3 updates: [sphinx](https://github.com/sphinx-doc/sphinx), [furo](https://github.com/pradyunsg/furo) and [blacken-docs](https://github.com/asottile/blacken-docs).

Updates `sphinx` from 6.1.3 to 7.1.2
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v6.1.3...v7.1.2)

Updates `furo` from 2023.3.27 to 2023.7.26
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.03.27...2023.07.26)

Updates `blacken-docs` from 1.14.0 to 1.15.0
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/1.14.0...1.15.0)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: python-packages
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: python-packages
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-08-07 09:19:23 -07:00
Simon Willison
adf54f5c80
Use dependabot grouped updates 2023-08-07 08:45:10 -07:00
Simon Willison
0818182399 Update cli-reference for editable change, refs #2106 2023-07-26 11:52:57 -07:00
Simon Willison
18dd88ee4d Refactored DatabaseDownload to database_download, closes #2116 2023-07-26 11:43:55 -07:00
Simon Willison
dc5171eb1b Make editable work with -e '.[test]', refs #2106 2023-07-26 11:28:03 -07:00
Simon Willison
278ac91a4d datasette install -e option, closes #2106 2023-07-22 11:42:46 -07:00
dependabot[bot]
3a51ca9014
Bump black from 23.3.0 to 23.7.0 (#2099)
Bumps [black](https://github.com/psf/black) from 23.3.0 to 23.7.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/23.3.0...23.7.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-07-21 14:19:24 -07:00
Simon Willison
0f7192b615 One last YAML/JSON change, closes #1153 2023-07-08 13:08:09 -07:00
Simon Willison
42ca574720 Removed accidental test code I added, refs #1153 2023-07-08 12:50:22 -07:00
Simon Willison
2fd871a906 Drop support for Python 3.7, refs #2097 2023-07-08 11:40:19 -07:00
Simon Willison
45e6d370ce Install docs dependencies for tests, refs #1153 2023-07-08 11:35:15 -07:00
Simon Willison
50a6355c08 Workaround to get sphinx-build working again, refs #1153 2023-07-08 11:22:21 -07:00
Simon Willison
c076fb65e0 Applied sphinx-inline-tabs to remaining examples, refs #1153 2023-07-08 11:00:08 -07:00
Simon Willison
0183e1a72d Preserve JSON key order in YAML, refs #1153 2023-07-08 10:27:36 -07:00
Simon Willison
38fcc96e67 Removed duplicate imports, refs #1153 2023-07-08 10:09:26 -07:00
Simon Willison
3b336d8071 Utility function for cog for generating YAML/JSON tabs, refs #1153 2023-07-08 09:37:47 -07:00
Simon Willison
d7b21a8623 metadata.yaml now treated as default in docs
Added sphinx-inline-tabs to provide JSON and YAML tabs to show examples.

Refs #1153
2023-07-08 09:37:01 -07:00
Simon Willison
8cd60fd1d8 Homepage test now just asserts isinstance(x, int) - closes #2092 2023-06-29 08:24:09 -07:00
Simon Willison
c39d600aef Fix all E741 Ambiguous variable name warnings, refs #2090 2023-06-29 08:05:24 -07:00
Simon Willison
99ba051188 Fixed spelling error, refs #2089
Also ensure codespell runs as part of just lint
2023-06-29 07:46:22 -07:00
Simon Willison
84b32b447a Justfile I use for local development
Now with codespell, refs #2089
2023-06-29 07:44:13 -07:00
Simon Willison
d45a7213ed codespell>=2.5.5, also spellcheck README - refs #2089 2023-06-29 07:43:01 -07:00
dependabot[bot]
ede6203618
Bump blacken-docs from 1.13.0 to 1.14.0 (#2083)
Bumps [blacken-docs](https://github.com/asottile/blacken-docs) from 1.13.0 to 1.14.0.
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/1.13.0...1.14.0)

---
updated-dependencies:
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-06-29 07:31:54 -07:00
Simon Willison
d1d78ec0eb
Better docs for startup() hook 2023-06-23 13:06:35 -07:00
Simon Willison
dda99fc09f
New View base class (#2080)
* New View base class, closes #2078
* Use new View subclass for PatternPortfolioView
2023-05-25 17:18:43 -07:00
Simon Willison
b49fa446d6 --cors Access-Control-Max-Age: 3600, closes #2079 2023-05-25 15:05:58 -07:00
Simon Willison
9584879534 Rename callable.py to check_callable.py, refs #2078 2023-05-25 11:49:40 -07:00
Simon Willison
2e43a14da1 datasette.utils.check_callable(obj) - refs #2078 2023-05-25 11:35:34 -07:00
Simon Willison
49184c569c
Action: Deploy a Datasette branch preview to Vercel
Closes #2070
2023-05-09 09:24:28 -07:00
Simon Willison
d3d16b5ccf
Build docs with 3.11 on ReadTheDocs
Inspired by https://github.com/simonw/sqlite-utils/issues/540
2023-05-07 11:44:27 -07:00
Simon Willison
55c526a537 Add pip as a dependency too, for Rye - refs #2065 2023-04-26 22:07:35 -07:00
Simon Willison
0b0c5cd7a9 Hopeful fix for Python 3.7 httpx failure, refs #2066 2023-04-26 21:20:38 -07:00
Simon Willison
249fcf8e3e
Add setuptools to dependencies
Refs #2065
2023-04-26 20:36:10 -07:00
Simon Willison
5890a20c37 Mention API tokens in DATASETTE_SECRET docs 2023-03-31 09:45:16 -07:00
Simon Willison
4c1e277edb Updated JSON API shape documentation, refs #262 2023-03-28 23:21:42 -07:00
dependabot[bot]
30c88e3570
Bump black from 22.12.0 to 23.3.0 (#2047)
Bumps [black](https://github.com/psf/black) from 22.12.0 to 23.3.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.12.0...23.3.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Simon Willison <swillison@gmail.com>
2023-03-28 23:12:05 -07:00
dependabot[bot]
bbd5489dbc
Bump blacken-docs from 1.12.1 to 1.13.0 (#1992)
Bumps [blacken-docs](https://github.com/asottile/blacken-docs) from 1.12.1 to 1.13.0.
- [Release notes](https://github.com/asottile/blacken-docs/releases)
- [Changelog](https://github.com/adamchainz/blacken-docs/blob/main/HISTORY.rst)
- [Commits](https://github.com/asottile/blacken-docs/compare/v1.12.1...1.13.0)

---
updated-dependencies:
- dependency-name: blacken-docs
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-28 23:11:33 -07:00
dependabot[bot]
d52402447e
Bump sphinx from 6.1.2 to 6.1.3 (#1986)
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.2 to 6.1.3.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v6.1.2...v6.1.3)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-28 23:09:48 -07:00
dependabot[bot]
848a9a420d
Bump furo from 2022.12.7 to 2023.3.27 (#2046)
Bumps [furo](https://github.com/pradyunsg/furo) from 2022.12.7 to 2023.3.27.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2022.12.07...2023.03.27)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-03-28 23:08:00 -07:00
Simon Willison
651b78d8e6 Redesign ?_extra=extras a bit, refs #262 2023-03-28 23:07:30 -07:00
Simon Willison
c025b0180f
Drop jQuery dependency 2023-03-26 16:38:58 -07:00
Simon Willison
db8cf899e2
Use block scripts instead, refs #1608 2023-03-26 16:27:58 -07:00
Simon Willison
5c1cfa451d
Link docs /latest/ to /stable/ again
Re-implementing the pattern from https://til.simonwillison.net/readthedocs/link-from-latest-to-stable

Refs #1608
2023-03-26 16:23:28 -07:00
Simon Willison
3feed1f66e Re-applied Black 2023-03-22 15:54:35 -07:00
Simon Willison
d97e82df3c
?_extra= support and TableView refactor to table_view
* Implemented ?_extra= option for JSON views, refs #262
* New dependency: asyncinject
* Remove now-obsolete TableView class
2023-03-22 15:49:39 -07:00
Simon Willison
56b0758a5f 0.64 release notes, refs #2036 2023-03-08 12:52:37 -08:00
Simon Willison
25fdbe6b27 use tmpdir instead of isolated_filesystem, refs #2037
Should hopefully get tests passing for #2036 too.
2023-03-08 12:33:23 -08:00
Simon Willison
bd39cb4805 Use service-specific image ID for Cloud Run deploys, refs #2036 2023-03-08 12:25:55 -08:00
Simon Willison
1ad92a1d87 datasette install -r requirements.txt, closes #2033 2023-03-06 14:27:30 -08:00
Dustin Rodrigues
a53b893c46
Add Python 3.11 classifier (#2028)
Thanks, @dtrodrigues
2023-03-06 13:01:19 -08:00
Simon Willison
0b4a286914 render_cell(..., request) argument, closes #2007 2023-01-27 19:34:14 -08:00
Simon Willison
e4ebef082d
Fixed link text 2023-01-21 07:37:29 -08:00
Simon Willison
6a352e99ab
Added missing import to example 2023-01-11 11:04:11 -08:00
Simon Willison
25a612fe09 Release 0.64.1
Refs #1985, #1987
2023-01-11 10:23:49 -08:00
Simon Willison
50fd94e04f Raise ValueError if Datasette(files=) is a string, refs #1985 2023-01-11 10:13:20 -08:00
Simon Willison
2c86774179
Link to non-spam Python 3 setup instructions
Refs #1987
2023-01-11 09:59:40 -08:00
Simon Willison
8e70734043
Upgrade Sphinx, closes #1971 2023-01-09 18:02:32 -08:00
Simon Willison
4880638f13
setup-gcloud 318.0.0
Refs https://til.simonwillison.net/googlecloud/gcloud-error-workaround
2023-01-09 16:02:02 -08:00
Simon Willison
7dd671310a Release notes for 0.64, with a warning against arbitrary SQL with SpatiaLite
Refs #1409, #1771, #1979

Refs https://github.com/simonw/datasette.io/issues/132
2023-01-09 08:40:24 -08:00
Simon Willison
5e672df168 Explicitly explain allow_sql: false 2023-01-09 08:25:07 -08:00
Simon Willison
7b48664d75 Better error for --load-extensions, refs #1979 2023-01-07 15:56:03 -08:00
Simon Willison
0f7c71a86f What to do if extensions will not load, refs #1979 2023-01-07 15:49:28 -08:00
Simon Willison
fee658ad05 Improved wording in allow_sql docs 2023-01-05 09:22:49 -08:00
Simon Willison
c41278b46f default_allow_sql setting, closes #1409
Refs #1410
2023-01-04 16:51:26 -08:00
Simon Willison
adfcec51d6 Fixed broken example links in _where= docs 2023-01-04 16:51:26 -08:00
Simon Willison
deb5fcbed4
Fixed table_action example in docs 2023-01-04 10:25:04 -08:00
Simon Willison
572bdb5b80 Applied Black, refs #782 2022-12-31 19:32:07 -08:00
Simon Willison
d94a3c4326
No need to link to _shape=objects any more
It's the default now. Refs #782
2022-12-31 17:42:48 -08:00
Simon Willison
3c352b7132 Applied Black, refs #782 2022-12-31 13:17:54 -08:00
Simon Willison
5bbe2bcc50 Rename filtered_table_rows_count to count, refs #782 2022-12-31 12:52:57 -08:00
Simon Willison
a2dca62360 Fix for extension tests I broke, refs #782 2022-12-31 11:21:15 -08:00
Simon Willison
ca07fff3e2 Pin Sphinx 5.3.0, refs #1971
Furo is not yet compatible with Sphinx 6.0
2022-12-31 11:13:56 -08:00
Simon Willison
3af313e165 Fix for Sphinx extlinks warning, closes #1972 2022-12-31 11:13:14 -08:00
Chris Holdgraf
994ce46ed4
Add favicon to documentation (#1967)
Co-authored-by: Simon Willison <swillison@gmail.com>
2022-12-31 11:00:31 -08:00
Simon Willison
8059c8a27c Fixed typo 2022-12-31 10:54:25 -08:00
Simon Willison
8aa9cf629c Store null instead of 'None' in _internal database table, closes #1970 2022-12-31 10:52:37 -08:00
Simon Willison
234230e595 Default JSON shape is now objects - refs #1914, #1709 2022-12-31 10:52:37 -08:00
Simon Willison
1fda4806d4 Small documentation tweaks 2022-12-31 10:52:37 -08:00
Simon Willison
c635f6ebac Moved CORS bit to its own documentation section 2022-12-31 10:52:37 -08:00
Simon Willison
3bd05b854a -e/--expires-after in create-token docs 2022-12-31 10:52:37 -08:00
Simon Willison
677ba9dddd Fix rST warning in changelog 2022-12-31 10:52:37 -08:00
Jan Lehnardt
e03aed0002 Detect server start/stop more reliably.
This is useful, especially in testing, since your test
hosts might not reliably start the server within two
seconds; we now do a definite check before progressing.

By the same token, after `kill $server_pid` wait for
the pid to be gone from the process list.

Since now the script can end prematurely, I also added
a cleanup function to make sure the temporary certs are
removed in any case.

n.b. this could also be done with the use of `trap 'fn'
ERR` but that felt like a bit too much magic for this
short a script.
2022-12-18 08:01:51 -08:00
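The commit message above describes a pattern worth sketching: poll until the server responds instead of sleeping a fixed two seconds, poll again after `kill` until the pid is gone, and register a cleanup function so temporary certs are removed even on early exit. A minimal illustration of that pattern (hypothetical helper and file names, not the actual script from the repo):

```shell
#!/bin/bash
# Poll a command until it succeeds, giving up after a timeout in seconds.
# Hypothetical helper illustrating the start/stop detection described above.
wait_for() {
  local timeout=$1; shift
  local start
  start=$(date +%s)
  while ! "$@" >/dev/null 2>&1; do
    if (( $(date +%s) - start >= timeout )); then
      echo "timed out waiting for: $*" >&2
      return 1
    fi
    sleep 0.2
  done
}

cleanup() {
  # Remove temporary certs even if the script exits early (paths are examples)
  rm -f /tmp/example-cert.pem /tmp/example-key.pem
}
trap cleanup EXIT

# Usage sketch:
#   datasette ... & server_pid=$!
#   wait_for 10 curl -sf https://localhost:8152/_memory.json   # definite start check
#   kill "$server_pid"
#   wait_for 10 bash -c "! kill -0 $server_pid 2>/dev/null"    # wait for pid to be gone
```

As the commit note says, `trap cleanup ERR` would also work, but an explicit `trap ... EXIT` plus polling keeps the script's control flow obvious.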
Simon Willison
a21c00b54d .select-wrapper:focus-within for accessibility, closes #1771 2022-12-17 22:28:07 -08:00
Simon Willison
23335e123b Release notes for 0.63.3
Refs #1963
2022-12-17 19:26:25 -08:00
Simon Willison
a27c0a0124 Deploy docs on publish using Python 3.9
A workaround for gcloud setup, see:

https://til.simonwillison.net/googlecloud/gcloud-error-workaround

Refs #1963
2022-12-17 19:24:48 -08:00
Simon Willison
0ea139dfe5 Run new HTTPS test in CI, refs #1955 2022-12-17 18:38:26 -08:00
Simon Willison
d1d369456a Move HTTPS test to a bash script
See https://github.com/simonw/datasette/issues/1955#issuecomment-1356627931
2022-12-17 18:33:07 -08:00
Simon Willison
8b73fc6b47 Put AsgiLifespan back so server starts up again, refs #1955 2022-12-17 17:22:00 -08:00
Simon Willison
63fb750f39 Replace AsgiLifespan with AsgiRunOnFirstRequest, refs #1955 2022-12-17 14:14:34 -08:00
Simon Willison
89cffcf14c Reset _metadata_local in a couple of tests
Refs https://github.com/simonw/datasette/pull/1960#issuecomment-1356476886
2022-12-17 13:47:55 -08:00
Simon Willison
9c43b4164d Removed @pytest.mark.ds_client mark - refs #1959
I don't need it - can run 'pytest -k ds_client' instead.

See https://github.com/simonw/datasette/pull/1960#issuecomment-1355685828
2022-12-17 13:47:55 -08:00
Simon Willison
0e42444866 invoke_startup() inside ds_client fixture, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
e70974a4f1 Ran Black, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
42a66c2f04 A bunch of remaining ds_client conversions, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
be95359a80 ds_client for test_permissions.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
ef74d0ff70 ds_client for test_internal_db.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
4a151b15cc ds_client for test_filters.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
30f1a0705b ds_client for test_plugins.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
b998c2793f test_facets.py using ds_client, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
bc88491cb7 ds_client for test_table_api.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
1335bcb893 Use my own global variable instead of scope=session
Refs https://github.com/simonw/datasette/pull/1960#issuecomment-1354148139
2022-12-17 13:47:55 -08:00
Simon Willison
ebd3358e49 ds_client for test_table_html.py 2022-12-17 13:47:55 -08:00
Simon Willison
d94d363ec0 Don't use pytest_asyncio.fixture(scope="session") any more, refs #1959
Also got rid of the weird memory=False hack:

https://github.com/simonw/datasette/pull/1960#issuecomment-1354053151
2022-12-17 13:47:55 -08:00
Simon Willison
95900b9d02 Port app_client to ds_client for most of test_html.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
3001eec66a ds_client for test_csv.py and test_canned_queries.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
425ac4357f Ported app_client to ds_client where possible in test_auth.py, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
b077e63dc6 Ported test_api.py app_client test to ds_client, refs #1959 2022-12-17 13:47:55 -08:00
Simon Willison
5ee954e34b
Link to annotated release notes for 1.0a2 2022-12-15 17:03:37 -08:00
Simon Willison
013496862f
Try click.echo() instead
This ensures the URL is output correctly when running under Docker.

Closes #1958
2022-12-15 16:55:17 -08:00
Simon Willison
0b68996cc5 Revert "Replace AsgiLifespan with AsgiRunOnFirstRequest, refs #1955"
This reverts commit dc18f62089.
2022-12-15 13:06:45 -08:00
Simon Willison
38d28dd958 Revert "Try running every test at once, refs #1955"
This reverts commit 51ee8caa4a.
2022-12-15 13:05:33 -08:00
Simon Willison
51ee8caa4a Try running every test at once, refs #1955 2022-12-15 12:51:18 -08:00
Simon Willison
dc18f62089 Replace AsgiLifespan with AsgiRunOnFirstRequest, refs #1955 2022-12-15 09:34:07 -08:00
Simon Willison
e054704fb6 Added missing rST label 2022-12-14 21:38:28 -08:00
Simon Willison
6e1e815c78
It's an update-or-insert 2022-12-14 18:41:30 -08:00
Simon Willison
8b9d7fdbd8 Fixed typo in release notes, refs #1953 2022-12-14 18:02:42 -08:00
Simon Willison
8cac6ff301 Release 1.0a2
Refs #1636, #1855, #1878, #1927, #1937, #1940, #1947, #1951

Closes #1953
2022-12-14 18:01:02 -08:00
Simon Willison
9ad76d279e Applied blacken-docs, refs #1937 2022-12-14 14:49:13 -08:00
Simon Willison
c094dde3ff Extra permission rules for /-/create, closes #1937 2022-12-14 12:21:18 -08:00
Simon Willison
e238df3959 Handle non-initials in permission_allowed_actor_restrictions, closes #1956 2022-12-14 12:04:23 -08:00
Simon Willison
1a3dcf4943 Don't include _memory on /-/create-token, refs #1947 2022-12-13 21:19:31 -08:00
Simon Willison
420d0a0ee2 Tests for /-/create-token with restrictions, closes #1947 2022-12-13 21:13:20 -08:00
Simon Willison
6e5ab9e7b3 Note in docs about new /-/create-token features, refs #1947 2022-12-13 21:07:03 -08:00
Simon Willison
d98a8effb1 UI for restricting permissions on /-/create-token, refs #1947
Also fixes test failures I introduced in #1951
2022-12-13 21:03:17 -08:00
Simon Willison
fdf7c27b54 datasette.create_token() method, closes #1951 2022-12-13 18:42:01 -08:00
Simon Willison
d4cc1374f4 Improved --help for create-token, refs #1947 2022-12-13 14:28:59 -08:00
Simon Willison
f84acae98e Return 400 errors for ?_sort errors, closes #1950 2022-12-13 14:23:17 -08:00
dependabot[bot]
d4b98d3924
Bump black from 22.10.0 to 22.12.0 (#1944)
Bumps [black](https://github.com/psf/black) from 22.10.0 to 22.12.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.10.0...22.12.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-12 21:23:30 -08:00
Simon Willison
45979eb723 Rename permission created by demo plugin
It was showing up as 'new-permission' on https://latest.datasette.io/-/permissions
which I thought was confusing
2022-12-12 21:21:01 -08:00
Simon Willison
34ad574bac Don't hard-code permissions in permission_allowed_actor_restrictions, refs #1855 2022-12-12 21:14:40 -08:00
Simon Willison
a1a372f179 /-/actor no longer requires view-instance, refs #1945 2022-12-12 21:06:30 -08:00
Simon Willison
260fbb598e Fix some failing tests, refs #1855 2022-12-12 21:00:40 -08:00
Simon Willison
2aa2adaa8b Docs for new create-token options, refs #1855 2022-12-12 20:56:40 -08:00
Simon Willison
809fad2392 Tests for datasette create-token restrictions, refs #1855 2022-12-12 20:44:19 -08:00
Simon Willison
c13dada2f8 datasette --get --token option, closes #1946, refs #1855 2022-12-12 20:36:42 -08:00
Simon Willison
14f1cc4984 Update CLI reference help, refs #1855 2022-12-12 20:21:48 -08:00
Simon Willison
98eff2cde9 Ignore spelling of alls, refs #1855 2022-12-12 20:19:17 -08:00
Simon Willison
e95b490d88 Move create-token command into cli.py, refs #1855 2022-12-12 20:18:42 -08:00
Simon Willison
9cc1a7c4c8 create-token command can now create restricted tokens, refs #1855 2022-12-12 20:15:56 -08:00
Simon Willison
c6a811237c /-/actor.json no longer requires view-instance, closes #1945 2022-12-12 20:11:51 -08:00
Simon Willison
3e6a208ba3 Rename 't' to 'r' in '_r' actor format, refs #1855 2022-12-12 19:27:34 -08:00
Simon Willison
c5d30b58a1 Implemented metadata permissions: property, closes #1636 2022-12-12 18:40:45 -08:00
Simon Willison
8bf06a76b5
register_permissions() plugin hook (#1940)
* Docs for permissions: in metadata, refs #1636
* Refactor default_permissions.py to help with implementation of #1636
* register_permissions() plugin hook, closes #1939 - also refs #1938
* Tests for register_permissions() hook, refs #1939
* Documentation for datasette.permissions, refs #1939
* permission_allowed() falls back on Permission.default, refs #1939
* Raise StartupError on duplicate permissions
* Allow dupe permissions if exact matches
2022-12-12 18:05:54 -08:00
David Larlet
e539c1c024
Typo in JSON API Updating a row documentation (#1930) 2022-12-08 13:12:34 -08:00
dependabot[bot]
bffefc7db0
Bump furo from 2022.9.29 to 2022.12.7 (#1935)
Bumps [furo](https://github.com/pradyunsg/furo) from 2022.9.29 to 2022.12.7.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2022.09.29...2022.12.07)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-12-08 13:12:07 -08:00
Simon Willison
05daa15aac Documentation for /-/create ignore/replace, closes #1927 2022-12-07 17:42:54 -08:00
Simon Willison
34cffff02a Refactor _headers() for write API tests 2022-12-07 17:39:07 -08:00
Simon Willison
dee18ed8ce test_create_table_error_rows_twice_with_duplicates, refs #1927 2022-12-07 17:29:24 -08:00
Simon Willison
9342b60f14 test_create_table_error_if_pk_changed, refs #1927 2022-12-07 17:27:01 -08:00
Simon Willison
6b27537988 ignore/replace to create requires pk, refs #1927 2022-12-07 17:18:40 -08:00
Simon Willison
272982e8a6
/db/table/-/upsert API
Close #1878

Also made a few tweaks to how _r works in tokens and actors,
refs #1855 - I needed that mechanism for the tests.
2022-12-07 17:12:15 -08:00
Simon Willison
93ababe6f7 Initial attempt at insert/replace for /-/create, refs #1927 2022-12-02 23:00:18 -08:00
Simon Willison
cab5b60e09
datasette-auth-passwords is another actor_from_request example 2022-12-02 08:39:52 -08:00
Simon Willison
d7e5e3c9f9 Fix for todomvc permission check
Refs https://github.com/simonw/todomvc-datasette/issues/2
2022-12-01 17:38:23 -08:00
Simon Willison
27efa8c381 todomvc permissions and fixed DATASETTE_SECRET for new demo
Refs https://github.com/simonw/todomvc-datasette/issues/2
2022-12-01 17:29:44 -08:00
Simon Willison
03f247845e
datasette-ephemeral-tables>=0.2.2
Refs https://github.com/simonw/datasette-ephemeral-tables/issues/5
2022-12-01 16:37:53 -08:00
Simon Willison
e2f71c6f81
Bump ephemeral limit up to 15 minutes per table
Refs #1915
2022-12-01 15:44:43 -08:00
Simon Willison
692fbfc40a Release 1.0a1
Refs #1922, #1917, #1915, #1916, #1918, #1924
2022-12-01 13:30:39 -08:00
Simon Willison
f3c8da7acd Make the sign in as root button bigger on latest.datasette.io 2022-12-01 13:29:31 -08:00
Simon Willison
99da46f725 Docs for insert API ignore/replace - closes #1924 2022-11-30 18:07:48 -08:00
Simon Willison
7fde34cfcb Documentation and test for UNIQUE constraint failed, refs #1924 2022-11-30 18:05:29 -08:00
Simon Willison
9a1536b52a Move CORS headers into base class, refs #1922 2022-11-30 15:48:32 -08:00
Simon Willison
31d6a0bc5e Applied Black, refs #1922 2022-11-30 15:17:39 -08:00
Simon Willison
f0fadc28dd Access-Control-Allow-Headers: Authorization, Content-Type - refs #1922 2022-11-30 15:11:18 -08:00
Simon Willison
418eb7c5c6
Try Python 3.9 for Cloud Run deploy, refs #1923 2022-11-30 14:59:17 -08:00
Simon Willison
ec1dde5dd2
Try version 318.0.0 of google-github-actions/setup-gcloud
Refs #1923
2022-11-30 14:50:53 -08:00
Simon Willison
2cd7ecaa0a Apply Black, refs #1922 2022-11-30 13:54:47 -08:00
Simon Willison
6bfd71f5c6 Access-Control-Allow-Methods: GET, POST, HEAD, OPTIONS - refs #1922 2022-11-30 12:25:12 -08:00
Simon Willison
4c18730e71 Update tests to expect 200 for OPTIONS calls, refs #1922 2022-11-30 10:29:48 -08:00
Simon Willison
48725bb4ea CORS headers for write APIs, refs #1922 2022-11-30 09:27:10 -08:00
Simon Willison
4ddd77e512
No need for pkginfo pin any more
The upstream issue was fixed. Refs #1913
2022-11-29 21:25:40 -08:00
Simon Willison
8404b21556 405 method not allowed for GET to POST endpoints, closes #1916 2022-11-29 21:15:13 -08:00
Simon Willison
5518397338 Show mutable DBs first in API explorer, closes #1918 2022-11-29 21:07:51 -08:00
Simon Willison
6b47734c04 _memory database should not be mutable, closes #1917 2022-11-29 21:06:52 -08:00
Simon Willison
9f5321ff1e
latest now uses datasette-ephemeral-tables>=0.2.1
Fix for https://github.com/simonw/datasette-ephemeral-tables/issues/4
2022-11-29 20:43:27 -08:00
Simon Willison
7588d27f4a
latest.datasette.io uses datasette-ephemeral-tables>=0.2
To show the countdown timer from:
https://github.com/simonw/datasette-ephemeral-tables/issues/3

Refs #1915
2022-11-29 17:51:15 -08:00
Simon Willison
53a8e5bae5 Deploy datasette-ephemeral-tables plugin
Refs #1915
2022-11-29 15:58:25 -08:00
Simon Willison
4a0bd960e9 Pin pkginfo==1.8.3 as workaround for #1913 2022-11-29 11:57:54 -08:00
Simon Willison
07aad51176
Merge pull request #1912 from simonw/1.0-dev
Merge 1.0-dev (with initial write API) back into main
2022-11-29 11:39:36 -08:00
Simon Willison
b8fc8e2cd7
Merge branch 'main' into 1.0-dev 2022-11-29 11:34:39 -08:00
Simon Willison
4d49a5a397 Release 1.0a0
Refs #1850, #1851, #1852, #1856, #1858, #1863, #1864, #1871, #1874, #1882

Closes #1891
2022-11-29 11:22:54 -08:00
Simon Willison
6bda225786 Tests for rowid and compound pk row deletion, closes #1864 2022-11-29 10:53:55 -08:00
Simon Willison
1154048f79 Compound primary key support for /db/-/create - closes #1911
Needed for tests in #1864
2022-11-29 10:47:48 -08:00
Simon Willison
484bef0d3b /db/table/pk/-/update endpoint, closes #1863 2022-11-29 10:06:19 -08:00
Simon Willison
21f8aab531 Release 0.63.2
Refs #1904, #1905
2022-11-18 16:59:05 -08:00
Simon Willison
733447d7c7 Upgrade to Python 3.11 on Heroku, refs #1905 2022-11-18 16:44:46 -08:00
Simon Willison
72ac9bf82f --generate-dir option to publish heroku, refs #1905 2022-11-18 16:34:33 -08:00
Simon Willison
5be728c2dd Pin httpx in Pyodide test, refs #1904
Should help get tests to pass for #1896 too
2022-11-18 14:52:05 -08:00
Simon Willison
0fe1619910 Pin httpx in Pyodide test, refs #1904
Should help get tests to pass for #1896 too
2022-11-18 14:50:19 -08:00
Simon Willison
ee64130fa8 Refactor to use new resolve_database/table/row methods, refs #1896 2022-11-18 14:46:25 -08:00
Simon Willison
c588a89f26 db.view_exists() method, needed by #1896 2022-11-18 14:16:38 -08:00
Simon Willison
b29ccb59c7 Add test for db.view_names() 2022-11-18 14:13:48 -08:00
Brian Grinstead
3ecd131e57
Use DOMContentLoaded instead of load event for CodeMirror initialization. Closes #1894 (#1898) 2022-11-17 23:29:00 -08:00
Simon Willison
63f923d013 Remove min-height on CodeMirror, closes #1899 2022-11-17 23:21:00 -08:00
Simon Willison
3db37e9a21 Remove min-height on CodeMirror, closes #1899 2022-11-17 23:20:49 -08:00
Simon Willison
83a6872d1b Include views in SQL autocomplete, refs #1897 2022-11-17 18:53:48 -08:00
Simon Willison
52bf222d48 /db/-/create API endpoint, closes #1882 2022-11-17 17:24:46 -08:00
Simon Willison
98611b3da0 Include SQL schema for CodeMirror on query pages, closes #1897
Refs #1893
2022-11-17 17:24:44 -08:00
Simon Willison
22bade4562 Use table_columns context for CodeMirror schema, if available - refs #1897 2022-11-17 17:23:35 -08:00
Simon Willison
8494be07ae Prettier should ignore bundle.js file - refs #1893 2022-11-17 17:23:35 -08:00
Brian Grinstead
710be684b8 Upgrade to CodeMirror 6, add SQL autocomplete (#1893)
* Upgrade to CodeMirror 6
* Update contributing docs
* Change how resizing works
* Define a custom SQLite autocomplete dialect
* Add meta-enter to submit
* Add fixture schema for testing
2022-11-17 17:23:35 -08:00
Simon Willison
b35522c6dd Updated test, refs #1890 2022-11-17 17:23:35 -08:00
Simon Willison
b470ab5c41 Fix for datalist against foreign key facets
Refs https://github.com/simonw/datasette/issues/1890#issuecomment-1314850524
2022-11-17 17:23:35 -08:00
Simon Willison
df2cc923c6 Applied prettier, refs #1890 2022-11-17 17:23:35 -08:00
Simon Willison
e15ff2d86e datalist autocomplete for facet filters, refs #1890 2022-11-17 17:23:35 -08:00
Simon Willison
3e61a41b9b Include SQL schema for CodeMirror on query pages, closes #1897
Refs #1893
2022-11-17 17:19:37 -08:00
Simon Willison
aff7a6985e Use table_columns context for CodeMirror schema, if available - refs #1897 2022-11-17 16:41:25 -08:00
Simon Willison
00e233d7a7 Prettier should ignore bundle.js file - refs #1893 2022-11-16 15:53:27 -08:00
Brian Grinstead
ae11fa5887
Upgrade to CodeMirror 6, add SQL autocomplete (#1893)
* Upgrade to CodeMirror 6
* Update contributing docs
* Change how resizing works
* Define a custom SQLite autocomplete dialect
* Add meta-enter to submit
* Add fixture schema for testing
2022-11-16 15:49:06 -08:00
Simon Willison
6f610e1d94 Updated test, refs #1890 2022-11-15 19:04:24 -08:00
Simon Willison
eac028d3f7 Fix for datalist against foreign key facets
Refs https://github.com/simonw/datasette/issues/1890#issuecomment-1314850524
2022-11-14 22:57:11 -08:00
Simon Willison
3652b7472a Applied prettier, refs #1890 2022-11-14 22:41:10 -08:00
Simon Willison
f156bf9e6b datalist autocomplete for facet filters, refs #1890 2022-11-14 22:31:29 -08:00
Simon Willison
187d91d686 /db/-/create API endpoint, closes #1882 2022-11-14 21:57:28 -08:00
Simon Willison
518fc63224 API explorer: no error if you format JSON on empty string
Refs #1871
2022-11-13 22:06:45 -08:00
Simon Willison
575a29c424 API explorer: respect immutability, closes #1888 2022-11-13 22:01:56 -08:00
Simon Willison
264d0ab471 Renamed return_rows to return in insert API
Refs https://github.com/simonw/datasette/issues/1866#issuecomment-1313128913
2022-11-13 21:49:23 -08:00
Simon Willison
65521f03db Error for drop against immutable database, closes #1874 2022-11-13 21:40:10 -08:00
Simon Willison
612da8eae6 confirm: true mechanism for drop table API, closes #1887 2022-11-13 21:17:18 -08:00
Simon Willison
db796771e2 Example links for API explorer, closes #1871 2022-11-13 20:58:45 -08:00
Simon Willison
c603faac5b API explorer: persist form state in # in URL, refs #1871 2022-11-13 20:12:36 -08:00
Simon Willison
ca66ea57d2 GET and POST areas toggle each other, refs #1871 2022-11-13 13:12:51 -08:00
Simon Willison
51d60d7ddf details-menu class to avoid accidental details closure
Refs https://github.com/simonw/datasette/issues/1871#issuecomment-1312821031
2022-11-13 13:06:58 -08:00
Simon Willison
f832435b88 Release 0.63.1
Refs #1843, #1876, #1883
2022-11-12 12:29:59 -08:00
Simon Willison
fa9cc9efaf Fix for redirects ignoring base_url, refs #1883 2022-11-12 12:29:59 -08:00
Simon Willison
26262d08f3 Test form actions use prefix, refs #1883 2022-11-12 12:29:59 -08:00
Simon Willison
9f54f00a50 Release 0.63.1
Refs #1843, #1876, #1883
2022-11-10 23:01:20 -08:00
Simon Willison
8d9a957c63 Fix for redirects ignoring base_url, refs #1883 2022-11-10 22:49:54 -08:00
Simon Willison
bbaab3b38e Test form actions use prefix, refs #1883 2022-11-10 22:20:40 -08:00
Simon Willison
aacf25cf19
Improvements to API token docs, refs #1852 2022-11-05 23:54:32 -07:00
Simon Willison
bcc781f4c5 Implementation and tests for _r field on actor, refs #1855
New mechanism for restricting permissions further for a given actor.

This still needs documentation. It will eventually be used by the mechanism to issue
signed API tokens that are only able to perform a subset of actions.

This also adds tests that exercise the POST /-/permissions tool, refs #1881
2022-11-03 17:12:23 -07:00
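The `_r` mechanism introduced above restricts an actor to a subset of permissions, with allowed actions recorded instance-wide, per database, or per table. A rough Python sketch of such a restriction check, under the assumption that `_r` uses keys `a` (all), `d` (per-database) and `r` (per-resource) as in the later `restrictions_allow_action()` helper; this is an illustration, not Datasette's actual implementation:

```python
def restrictions_allow_action(restrictions, action, database=None, resource=None):
    """Return True if an actor's _r restrictions permit the given action."""
    if restrictions is None:
        return True  # no _r key on the actor means no extra restrictions
    if action in restrictions.get("a", []):
        return True  # allowed against the whole instance
    if database and action in restrictions.get("d", {}).get(database, []):
        return True  # allowed against this specific database
    if database and resource:
        tables = restrictions.get("r", {}).get(database, {})
        if action in tables.get(resource, []):
            return True  # allowed against this specific table or view
    return False
```

A restricted token can then only perform an action when both the underlying actor *and* its `_r` restrictions allow it, which is what makes the restricted signed-token feature safe to expose.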
Simon Willison
fb8b6b2311 Refactor _error helper function 2022-11-03 16:36:43 -07:00
Simon Willison
867e0abd34 Tests now close SQLite database connections and files explicitly, refs #1843
Also added a db.close() method to the Database class.
2022-11-03 13:37:26 -07:00
Simon Willison
2355067ef5 Tests now close SQLite database connections and files explicitly, refs #1843
Also added a db.close() method to the Database class.
2022-11-03 13:36:11 -07:00
Simon Willison
bb030ba46f More margin on /-/allow-debug page 2022-11-02 22:10:59 -07:00
Simon Willison
c51d9246b9 Permission check testing tool, refs #1881 2022-11-02 22:10:07 -07:00
Simon Willison
9b5a73ba4c Applied Black 2022-11-02 21:46:05 -07:00
Simon Willison
719e757252 Return method not allowed error in JSON in some situations
Added this while playing with the new API explorer, refs #1871
2022-11-02 20:12:13 -07:00
Simon Willison
000eeb4464 Link to Datasette API docs from /-/api, refs #1871 2022-11-01 22:45:05 -07:00
Simon Willison
042881a522 Ran Prettier, refs #1871 2022-11-01 22:30:16 -07:00
Simon Willison
0b166befc0 API explorer can now do GET, has JSON syntax highlighting
Refs #1871
2022-11-01 17:31:22 -07:00
Simon Willison
497290beaf Handle database errors in /-/insert, refs #1866, #1873
Also improved API explorer to show HTTP status of response, refs #1871
2022-11-01 12:59:17 -07:00
Simon Willison
9bec7c38eb ignore and replace options for bulk inserts, refs #1873
Also removed the rule that you cannot include primary keys in the rows you insert.

And added validation that catches invalid parameters in the incoming JSON.

And renamed "inserted" to "rows" in the returned JSON for return_rows: true
2022-11-01 11:08:17 -07:00
Simon Willison
93a02281da Show interrupted query in resizing textarea, closes #1876 2022-11-01 10:38:24 -07:00
Simon Willison
2ec5583629 Show interrupted query in resizing textarea, closes #1876 2022-11-01 10:22:26 -07:00
Simon Willison
00632ded30 Initial attempt at /db/table/row/-/delete, refs #1864 2022-10-30 16:16:00 -07:00
Simon Willison
2865d3956f /db/table/-/drop API, closes #1874 2022-10-30 15:17:21 -07:00
Simon Willison
4f16e14d7a Update cog 2022-10-30 14:53:33 -07:00
Simon Willison
fedbfcc368 Neater display of output and errors in API explorer, refs #1871 2022-10-30 14:49:07 -07:00
Simon Willison
9eb9ffae3d Drop API token requirement from API explorer, refs #1871 2022-10-30 13:09:55 -07:00
Simon Willison
f6bf2d8045 Initial prototype of API explorer at /-/api, refs #1871 2022-10-29 23:20:11 -07:00
Simon Willison
c35859ae3d API for bulk inserts, closes #1866 2022-10-29 23:03:45 -07:00
Simon Willison
c9b5f5d598 Depend on sqlite-utils>=3.30
Decided to use the most recent version in case I decide later to
use the flatten() utility function.

Refs #1850
2022-10-27 17:58:41 -07:00
Simon Willison
61171f0154 Release 0.63
Refs #1646, #1786, #1787, #1789, #1794, #1800, #1804, #1805, #1808, #1809, #1816, #1819, #1825, #1829, #1831, #1834, #1844, #1853, #1860

Closes #1869
2022-10-27 17:58:41 -07:00
Simon Willison
26af9b9c4a Release notes for 0.63, refs #1869 2022-10-27 17:58:41 -07:00
dependabot[bot]
641bc4453b Bump black from 22.8.0 to 22.10.0 (#1839)
Bumps [black](https://github.com/psf/black) from 22.8.0 to 22.10.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.8.0...22.10.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-10-27 17:58:41 -07:00
Forest Gregg
2ea60e12d9 Make hash and size a lazy property (#1837)
* use inspect data for hash and file size
* make hash and cached_size lazy properties
* move hash property near size
2022-10-27 17:58:41 -07:00
Simon Willison
bf00b0b59b Release 0.63
Refs #1646, #1786, #1787, #1789, #1794, #1800, #1804, #1805, #1808, #1809, #1816, #1819, #1825, #1829, #1831, #1834, #1844, #1853, #1860

Closes #1869
2022-10-27 15:11:26 -07:00
Simon Willison
e5e0459a0b Release notes for 0.63, refs #1869 2022-10-27 13:58:00 -07:00
dependabot[bot]
2c36e45447
Bump black from 22.8.0 to 22.10.0 (#1839)
Bumps [black](https://github.com/psf/black) from 22.8.0 to 22.10.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.8.0...22.10.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-10-27 13:51:45 -07:00
Forest Gregg
b912d92b65
Make hash and size a lazy property (#1837)
* use inspect data for hash and file size
* make hash and cached_size lazy properties
* move hash property near size
2022-10-27 13:51:20 -07:00
Simon Willison
6e788b49ed New URL design /db/table/-/insert, refs #1851 2022-10-27 13:18:05 -07:00
Simon Willison
a2a5dff709 Missing tests for insert row API, refs #1851 2022-10-27 12:08:26 -07:00
Simon Willison
a51608090b Slight tweak to insert row API design, refs #1851
https://github.com/simonw/datasette/issues/1851#issuecomment-1292997608
2022-10-27 12:06:18 -07:00
Simon Willison
6958e21b5c Add test for /* multi line */ comment, refs #1860 2022-10-27 11:52:06 -07:00
Simon Willison
b597bb6b3e Better comment handling in SQL regex, refs #1860 2022-10-27 11:52:06 -07:00
Simon Willison
918f356120 Delete mirror-master-and-main.yml
Closes #1865
2022-10-27 11:52:06 -07:00
Simon Willison
d2ca13b699 Add test for /* multi line */ comment, refs #1860 2022-10-27 11:50:54 -07:00
Simon Willison
5f6be3c48b Better comment handling in SQL regex, refs #1860 2022-10-27 11:47:48 -07:00
Simon Willison
f6ca86987b
Delete mirror-master-and-main.yml
Closes #1865
2022-10-27 06:56:11 -07:00
Simon Willison
51c436fed2 First draft of insert row write API, refs #1851 2022-10-26 20:57:02 -07:00
Simon Willison
382a871583 max_signed_tokens_ttl setting, closes #1858
Also redesigned token format to include creation time and optional duration.
2022-10-26 20:14:59 -07:00
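The redesigned token format above embeds a creation time and an optional duration, so tokens can expire on their own and a server-side TTL cap can also be enforced. A rough stdlib-only sketch of that idea (the names, secret, and payload layout here are assumptions for illustration, not Datasette's actual code):

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # assumption: stand-in for the app's signing secret


def create_token(actor_id, duration=None):
    # Payload records who the token is for ("a"), when it was created ("t"),
    # and optionally how long it is valid for ("d", in seconds).
    payload = {"a": actor_id, "t": int(time.time())}
    if duration is not None:
        payload["d"] = duration
    body = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()[:16]
    return "dstok_{}.{}".format(body, sig)


def verify_token(token, max_ttl=None):
    # Returns the payload dict if the token is valid, otherwise None.
    assert token.startswith("dstok_")
    body, sig = token[len("dstok_"):].rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()[:16]
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch: token was tampered with
    payload = json.loads(base64.urlsafe_b64decode(body))
    age = int(time.time()) - payload["t"]
    duration = payload.get("d")
    if duration is not None and age > duration:
        return None  # token expired by its own embedded duration
    if max_ttl is not None and age > max_ttl:
        return None  # rejected by a server-side max TTL setting
    return payload
```

Because the creation time lives inside the signed payload, a setting like max_signed_tokens_ttl can invalidate old tokens retroactively without any server-side token storage.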
Simon Willison
af5d5d0243 Allow leading comments on SQL queries, refs #1860 2022-10-26 20:14:59 -07:00
Simon Willison
55f860c304 Fix bug with breadcrumbs and request=None, closes #1849 2022-10-26 20:14:59 -07:00
Simon Willison
55a709c480 Allow leading comments on SQL queries, refs #1860 2022-10-26 14:34:33 -07:00
Simon Willison
df7bf0b2fc Fix bug with breadcrumbs and request=None, closes #1849 2022-10-26 14:13:31 -07:00
Simon Willison
c7956eed77 datasette create-token command, refs #1859 2022-10-25 21:26:12 -07:00
Simon Willison
c556fad65d Try to address too many files error again, refs #1843 2022-10-25 21:25:47 -07:00
Simon Willison
c36a74ece1 Try shutting down executor in tests to free up thread local SQLite connections, refs #1843 2022-10-25 21:04:39 -07:00
Simon Willison
c23fa850e7 allow_signed_tokens setting, closes #1856 2022-10-25 19:55:47 -07:00
Simon Willison
0f013ff497 Mechanism to prevent tokens creating tokens, closes #1857 2022-10-25 19:43:55 -07:00
Simon Willison
b29e487bc3 actor_from_request for dstok_ tokens, refs #1852 2022-10-25 19:18:41 -07:00
Simon Willison
7ab091e8ef Tests and docs for /-/create-token, refs #1852 2022-10-25 19:04:05 -07:00
Simon Willison
68ccb7578b dstok_ prefix for tokens
Refs https://github.com/simonw/datasette/issues/1852#issuecomment-1291290451
2022-10-25 18:40:07 -07:00
Simon Willison
42f8b402e6 Initial prototype of create API token page, refs #1852 2022-10-25 17:07:58 -07:00
Simon Willison
f9ae92b377 Poll until servers start, refs #1854 2022-10-25 16:03:36 -07:00
Simon Willison
05b479224f Don't need pysqlite3-binary any more, refs #1853 2022-10-25 16:03:36 -07:00
Simon Willison
6d085af28c Python 3.11 in CI 2022-10-25 16:03:36 -07:00
Simon Willison
c7dd76c262 Poll until servers start, refs #1854 2022-10-25 12:42:45 -07:00
Simon Willison
613ad05c09
Don't need pysqlite3-binary any more, refs #1853 2022-10-25 12:16:48 -07:00
Simon Willison
9676b2deb0 Upgrade Docker images to Python 3.11, closes #1853 2022-10-25 12:04:53 -07:00
Simon Willison
02ae1a0029 Upgrade Docker images to Python 3.11, closes #1853 2022-10-25 12:04:25 -07:00
Simon Willison
e135da8efe
Python 3.11 in CI 2022-10-25 07:13:43 -07:00
Simon Willison
83adf55b2d Deploy one-dot-zero branch preview 2022-10-23 20:28:15 -07:00
Simon Willison
a0dd5fa02f Fixed typo in release notes 2022-10-23 20:14:49 -07:00
Simon Willison
602c0888ce Release 0.63a1
Refs #1646, #1819, #1825, #1829, #1831, #1832, #1834, #1844, #1848
2022-10-23 20:07:09 -07:00
Simon Willison
5be86d48b2 Fix display of padlocks on database page, closes #1848 2022-10-23 19:42:30 -07:00
Simon Willison
78dad236df
check_visibility can now take multiple permissions into account
Closes #1829
2022-10-23 19:11:33 -07:00
Simon Willison
6887c12ea3 Workaround for 'too many open files' error, refs #1843 2022-10-23 15:17:33 -07:00
Simon Willison
fdf9891c3f Use shot-scraper images from datasette-screenshots repo, closes #1844 2022-10-14 12:57:00 -07:00
Simon Willison
79aa0de083 Test that breadcrumbs respect permissions, closes #1831 2022-10-13 14:51:59 -07:00
Simon Willison
1a5e5f2aa9 Refactor breadcrumbs to respect permissions, refs #1831 2022-10-13 14:42:52 -07:00
Simon Willison
b7fec7f902 .sqlite/.sqlite3 extensions for config directory mode
Closes #1646
2022-10-07 16:03:30 -07:00
Forest Gregg
eff112498e
Use inspect data for hash and file size on startup
Thanks, @fgregg

Closes #1834
2022-10-06 13:06:06 -07:00
Simon Willison
bbf33a7635 Test for bool(results), closes #1832 2022-10-04 21:32:29 -07:00
Simon Willison
b6ba117b79
Clarify request or None for two hooks 2022-10-04 18:25:52 -07:00
Simon Willison
4218c9cd74
reST markup fix 2022-10-04 11:45:36 -07:00
Simon Willison
883e326dd6
Drop word-wrap: anywhere, refs #1828, #1805 2022-10-02 14:26:16 -07:00
dependabot[bot]
c92c4318e9
Bump furo from 2022.9.15 to 2022.9.29 (#1827)
Bumps [furo](https://github.com/pradyunsg/furo) from 2022.9.15 to 2022.9.29.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2022.09.15...2022.09.29)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-09-30 10:55:40 -07:00
Simon Willison
34defdc10a
Browse the plugins directory 2022-09-28 17:39:36 -07:00
Adam Simpson
984b1df12c
Add documentation for serving via OpenRC (#1825)
* Add documentation for serving via OpenRC
2022-09-27 21:21:36 -07:00
Simon Willison
7fb4ea4e39
Update note about render_cell signature, refs #1826 2022-09-27 21:06:40 -07:00
Simon Willison
5f9f567acb Show SQL query when reporting time limit error, closes #1819 2022-09-26 16:06:01 -07:00
Simon Willison
212137a90b Release 0.63a0
Refs #1786, #1787, #1789, #1794, #1800, #1804, #1805, #1808, #1809, #1816
2022-09-26 14:14:25 -07:00
Simon Willison
cb1e093fd3 Fixed error message, closes #1816 2022-09-19 18:15:40 -07:00
Simon Willison
df851c117d Validate settings.json keys on startup, closes #1816
Refs #1814
2022-09-19 16:46:39 -07:00
Simon Willison
ddc999ad12 Async support for prepare_jinja2_environment, closes #1809 2022-09-16 20:38:24 -07:00
dependabot[bot]
2ebcffe222
Bump furo from 2022.6.21 to 2022.9.15 (#1812)
Bumps [furo](https://github.com/pradyunsg/furo) from 2022.6.21 to 2022.9.15.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2022.06.21...2022.09.15)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-09-16 12:50:52 -07:00
Simon Willison
b40872f5e5 prepare_jinja2_environment(datasette) argument, refs #1809 2022-09-14 14:31:54 -07:00
Simon Willison
610425460b
Add --nolock to the README Chrome demo
Refs #1744
2022-09-10 14:24:26 -07:00
Simon Willison
fb7e70d5e7 Database(is_mutable=) now defaults to True, closes #1808
Refs https://github.com/simonw/datasette-upload-dbs/issues/6
2022-09-09 09:19:20 -07:00
Simon Willison
bf8d84af54 word-wrap: anywhere on links in cells, refs #1805 2022-09-06 20:34:59 -07:00
Simon Willison
5aa359b869 Apply cell truncation on query page too, refs #1805 2022-09-06 16:58:30 -07:00
Simon Willison
d0737e4de5 truncate_cells_html now affects URLs too, refs #1805 2022-09-06 16:50:43 -07:00
Simon Willison
ff9c87197d Fixed Sphinx warnings on cli-reference page 2022-09-06 11:26:21 -07:00
Simon Willison
d0476897e1 Fixed Sphinx warning about language = None 2022-09-06 11:24:30 -07:00
Simon Willison
0a7815d203 Documentation for facet_size in metadata, closes #1804 2022-09-06 11:06:49 -07:00
Simon Willison
303c6c733d Fix for incorrectly handled _facet_size=max, refs #1804 2022-09-06 11:05:00 -07:00
Simon Willison
8430c3bc7d table facet_size in metadata, refs #1804 2022-09-06 08:59:19 -07:00
Simon Willison
d80775a48d Raise error if it's not about loops, refs #1802 2022-09-06 08:29:07 -07:00
Daniel Rech
c9d1943aed
Fix word break in facets by adding ul.tight-bullets li word-break: break-all (#1794)
Thanks, @dmr
2022-09-05 17:45:41 -07:00
Simon Willison
64288d827f
Workaround for test failure: RuntimeError: There is no current event loop (#1803)
* Remove ensure_eventloop hack
* Hack to recover from intermittent RuntimeError calling asyncio.Lock()
2022-09-05 17:40:19 -07:00
Simon Willison
1c29b925d3 Run tests in serial again
Because this didn't fix the issue I'm seeing in #1802

Revert "Run tests in serial, refs #1802"

This reverts commit b91e17280c.
2022-09-05 17:10:52 -07:00
Simon Willison
b2b901e8c4 Skip SpatiaLite test if no conn.enable_load_extension()
Ran into this problem while working on #1802
2022-09-05 17:09:57 -07:00
Simon Willison
b91e17280c
Run tests in serial, refs #1802 2022-09-05 16:50:53 -07:00
dependabot[bot]
294ecd45f7
Bump black from 22.6.0 to 22.8.0 (#1797)
Bumps [black](https://github.com/psf/black) from 22.6.0 to 22.8.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.6.0...22.8.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-09-05 11:51:51 -07:00
Simon Willison
51030df186
Don't use upper bound dependencies any more
See https://iscinumpy.dev/post/bound-version-constraints/ for the rationale behind this change.

Closes #1800
2022-09-05 11:35:40 -07:00
Simon Willison
ba35105eee
Test --load-extension in GitHub Actions (#1792)
* Run the --load-extension test, refs #1789
* Ran cog, refs #1789
2022-08-23 17:11:45 -07:00
Simon Willison
456dc155d4 Ran cog, refs #1789 2022-08-23 11:40:48 -07:00
Simon Willison
fd1086c686 Applied Black, refs #1789 2022-08-23 11:35:41 -07:00
Alex Garcia
1d64c9a8da
Add new entrypoint option to --load-extensions. (#1789)
Thanks, @asg017
2022-08-23 11:34:30 -07:00
Manuel Kaufmann
663ac431fe
Use Read the Docs action v1 (#1778)
Read the Docs repository was renamed from `readthedocs/readthedocs-preview` to `readthedocs/actions/`. Now, the `preview` action is under `readthedocs/actions/preview` and is tagged as `v1`
2022-08-19 17:04:16 -07:00
Simon Willison
0d9d33955b Clarify you can publish multiple files, closes #1788 2022-08-18 16:06:12 -07:00
Simon Willison
aff3df03d4 Ignore "ro", which stands for read only
Refs #1787 where it caused tests to break
2022-08-18 14:55:08 -07:00
Simon Willison
6c0ba7c00c Improved CLI reference documentation, refs #1787 2022-08-18 14:52:04 -07:00
Simon Willison
09a41662e7 Fix typo 2022-08-18 09:10:48 -07:00
Simon Willison
a3e6f1b167 Increase height of non-JS textarea to fit query
Closes #1786
2022-08-18 09:06:02 -07:00
Simon Willison
481eb96d85
https://datasette.io/tutorials/clean-data tutorial
Refs #1783
2022-08-15 13:17:28 -07:00
Simon Willison
a107e3a028
datasette-sentry is an example of handle_exception 2022-08-14 16:07:46 -07:00
Simon Willison
815162cf02 Release 0.62
Refs #903, #1300, #1683, #1701, #1712, #1717, #1718, #1728, #1733, #1738, #1739, #1744, #1746, #1748, #1759, #1766, #1768, #1770, #1773, #1779

Closes #1782
2022-08-14 10:32:42 -07:00
Simon Willison
5e6c5c9e31 Document datasette.config_dir, refs #1766 2022-08-14 10:18:47 -07:00
Simon Willison
82167105ee --min-instances and --max-instances Cloud Run publish options, closes #1779 2022-08-14 10:07:30 -07:00
Simon Willison
c1396bf860 Don't allow canned write queries on immutable DBs, closes #1728 2022-08-14 09:34:31 -07:00
Simon Willison
1563c22a8c Don't duplicate _sort_desc, refs #1738 2022-08-14 09:13:12 -07:00
Simon Willison
080d4b3e06 Switch to python:3.10.6-slim-bullseye for datasette publish - refs #1768 2022-08-14 08:49:14 -07:00
Simon Willison
668415df9f Upgrade Docker bases to 3.10.6-slim-bullseye - refs #1768 2022-08-14 08:47:17 -07:00
Simon Willison
df4fd2d7dd _sort= works even if sort column not selected, closes #1773 2022-08-14 08:44:02 -07:00
Simon Willison
8eb699de7b
Datasette Lite in Getting Started docs, closes #1781 2022-08-14 08:24:39 -07:00
Simon Willison
db00c00f63
Promote Datasette Lite in the README, refs #1781 2022-08-14 08:19:30 -07:00
Simon Willison
05d9c68268
Promote Discord more in the README 2022-08-14 08:16:53 -07:00
Simon Willison
8cfc723368 Ran blacken-docs 2022-08-09 11:21:53 -07:00
Simon Willison
bca2d95d02
Configure readthedocs/readthedocs-preview 2022-08-02 16:38:02 -07:00
Simon Willison
7af67b54b7 How to register temporary plugins in tests, closes #903 2022-07-18 14:31:17 -07:00
Chris Amico
01369176b0
Keep track of datasette.config_dir (#1766)
Thanks, @eyeseast - closes #1764
2022-07-17 18:12:45 -07:00
dependabot[bot]
22354c48ce
Update pytest-asyncio requirement from <0.19,>=0.17 to >=0.17,<0.20 (#1769)
Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Changelog](https://github.com/pytest-dev/pytest-asyncio/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest-asyncio/compare/v0.17.0...v0.19.0)

---
updated-dependencies:
- dependency-name: pytest-asyncio
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-07-17 18:06:37 -07:00
dependabot[bot]
ea6161f847
Bump furo from 2022.4.7 to 2022.6.21 (#1760)
Bumps [furo](https://github.com/pradyunsg/furo) from 2022.4.7 to 2022.6.21.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2022.04.07...2022.06.21)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-07-17 18:06:26 -07:00
Simon Willison
ed1ebc0f1d Run blacken-docs, refs #1770 2022-07-17 18:03:33 -07:00
Simon Willison
6d5e195547 Release 0.62a1
Refs #1300, #1739, #1744, #1746, #1748, #1759, #1770
2022-07-17 17:59:20 -07:00
Simon Willison
e543a095cc Updated default plugins in docs, refs #1770 2022-07-17 17:57:41 -07:00
Simon Willison
58fd1e33ec Hint that you can render templates for these hooks, refs #1770 2022-07-17 16:30:58 -07:00
Simon Willison
c09c53f345 New handle_exception plugin hook, refs #1770
Also refs:
- https://github.com/simonw/datasette-sentry/issues/1
- https://github.com/simonw/datasette-show-errors/issues/2
2022-07-17 16:24:39 -07:00
Simon Willison
8188f55efc Rename handle_500 to handle_exception, refs #1770 2022-07-17 15:24:16 -07:00
Simon Willison
950cc7677f
Fix missing Discord image
Refs https://github.com/simonw/datasette.io/issues/112
2022-07-14 15:18:28 -07:00
Simon Willison
c133545fe9
Make discord badge lowercase
Refs https://github.com/simonw/datasette.io/issues/112
2022-07-14 15:04:38 -07:00
Simon Willison
5d76c1f81b
Discord badge
Refs https://github.com/simonw/datasette.io/issues/112
2022-07-14 15:03:33 -07:00
Simon Willison
035dc5e7b9
More than 90 plugins now 2022-07-09 10:25:37 -07:00
Simon Willison
6373bb3414 Expose current SQLite row to render_cell hook, closes #1300 2022-07-07 09:30:49 -07:00
dependabot[bot]
9f1eb0d4ea
Bump black from 22.1.0 to 22.6.0 (#1763)
Bumps [black](https://github.com/psf/black) from 22.1.0 to 22.6.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/compare/22.1.0...22.6.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-06-28 10:40:24 -07:00
M. Nasimul Haque
00e59ec461
Extract facet pieces of table.html into included templates
Thanks, @nsmgr8
2022-06-20 11:05:44 -07:00
Simon Willison
e780b2f5d6
Trying out one-sentence-per-line
As suggested here: https://sive.rs/1s

Markdown and reStructuredText will display this as if it is a single paragraph, even though the sentences themselves are separated by newlines.

This could result in more useful diffs. Trying it out on this page first.
2022-06-20 10:54:23 -07:00
Naveen
2e9751672d
chore: Set permissions for GitHub actions (#1740)
Restrict the GitHub token permissions to only the required ones; this way, even if attackers succeed in compromising your workflow, they won’t be able to do much.

- Included permissions for the action. https://github.com/ossf/scorecard/blob/main/docs/checks.md#token-permissions

https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#permissions

https://docs.github.com/en/actions/using-jobs/assigning-permissions-to-jobs

[Keeping your GitHub Actions and workflows secure Part 1: Preventing pwn requests](https://securitylab.github.com/research/github-actions-preventing-pwn-requests/)

Signed-off-by: naveen <172697+naveensrinivasan@users.noreply.github.com>
2022-05-31 12:28:40 -07:00
Simon Willison
8dd816bc76 Applied Black 2022-05-30 15:42:38 -07:00
Simon Willison
adedd85b68
Clarify that request.headers names are converted to lowercase 2022-05-28 18:42:31 -07:00
Simon Willison
b010af7bb8
Updated copyright years in documentation footer 2022-05-20 15:23:09 -07:00
Simon Willison
4446075334 Append warning to the write element, refs #1746 2022-05-20 13:44:23 -07:00
Simon Willison
1d33fd03b3 Switch docs theme to Furo, refs #1746 2022-05-20 13:34:51 -07:00
Simon Willison
1465fea479 sphinx-copybutton for docs, closes #1748 2022-05-20 12:11:28 -07:00
Simon Willison
18a6e05887
Added "follow a tutorial" to getting started docs
Closes #1747
2022-05-20 12:05:33 -07:00
Simon Willison
0e2f6f1f82
datasette-copyable is an example of register_output_renderer 2022-05-18 17:37:46 -07:00
Simon Willison
7d1e004ff6 Fix test I broke in #1744 2022-05-17 13:09:07 -07:00
Simon Willison
b393e164dc
ReST fix 2022-05-17 12:45:28 -07:00
Simon Willison
5555bc8aef
How to run cog, closes #1745 2022-05-17 12:43:44 -07:00
Simon Willison
3508bf7875 --nolock mode to ignore locked files, closes #1744 2022-05-17 12:40:25 -07:00
Simon Willison
a5acfff4bd
Empty Datasette([]) list is no longer required 2022-05-16 17:06:40 -07:00
Simon Willison
280ff372ab ETag support for .db downloads, closes #1739 2022-05-03 07:59:46 -07:00
Simon Willison
d60f163528
Run on push and PR, closes #1737 2022-05-02 16:40:49 -07:00
Simon Willison
c0cbcf2aba Tweaks to test scripts, refs #1737 2022-05-02 16:36:58 -07:00
Simon Willison
847d6b1aac Test wheel against Pyodide, refs #1737, #1733 2022-05-02 16:32:24 -07:00
Simon Willison
943aa2e1f7 Release 0.62a0
Refs #1683, #1701, #1712, #1717, #1718, #1733
2022-05-02 14:38:34 -07:00
Simon Willison
3f00a29141
Clean up compatibility with Pyodide (#1736)
* Optional uvicorn import for Pyodide, refs #1733
* --setting num_sql_threads 0 to disable threading, refs #1735
2022-05-02 13:15:27 -07:00
Simon Willison
a29c127789 Rename to_decimal/from_decimal to decode/encode, refs #1734 2022-05-02 12:44:09 -07:00
Simon Willison
687907aa2b Remove python-baseconv dependency, refs #1733, closes #1734 2022-05-02 12:39:06 -07:00
Simon Willison
7e03394734 Optional uvicorn import for Pyodide, refs #1733 2022-05-02 12:20:14 -07:00
Simon Willison
4afc1afc72
Depend on click-default-group-wheel>=1.2.2
Refs #1733
2022-05-02 12:13:11 -07:00
Simon Willison
94a3171b01
.plugin_config() can return None 2022-04-28 13:29:11 -07:00
Simon Willison
942411ef94 Execute some TableView queries in parallel
Use ?_noparallel=1 to opt out (undocumented, useful for benchmark comparisons)

Refs #1723, #1715
2022-04-26 15:50:02 -07:00
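Running independent queries in parallel with an opt-out flag, as in the TableView commit above, can be sketched like this (an illustrative toy, not the actual TableView code; the fetch functions stand in for real SQL queries):

```python
import asyncio


async def fetch_count():
    # Stand-in for a COUNT(*) query.
    await asyncio.sleep(0.01)
    return 100


async def fetch_facets():
    # Stand-in for a facet-calculation query.
    await asyncio.sleep(0.01)
    return {"city": 3}


async def table_view(noparallel=False):
    if noparallel:
        # Sequential path, e.g. when ?_noparallel=1 is passed for benchmarking.
        return [await fetch_count(), await fetch_facets()]
    # Parallel path: both coroutines run concurrently.
    return list(await asyncio.gather(fetch_count(), fetch_facets()))


count, facets = asyncio.run(table_view())
```

The ?_noparallel=1 escape hatch makes it easy to compare wall-clock time between the two paths on the same request.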
Simon Willison
8a0c38f0b8 Rename database->database_name and table-> table_name, refs #1715 2022-04-26 15:50:02 -07:00
Simon Willison
c101f0efee
datasette-total-page-time example of asgi_wrapper 2022-04-26 15:34:29 -07:00
Simon Willison
579f59dcec Refactor to remove RowTableShared class, closes #1719
Refs #1715
2022-04-25 11:33:35 -07:00
Simon Willison
7463b051cf Cosmetic tweaks after blacken-docs, refs #1718 2022-04-24 09:59:20 -07:00
Simon Willison
289e4cf80a Finished applying blacken-docs, closes #1718 2022-04-24 09:17:59 -07:00
Simon Willison
498e1536f5 One more blacken-docs test, refs #1718 2022-04-24 09:08:56 -07:00
Simon Willison
92b26673d8 Fix blacken-docs errors and warnings, refs #1718 2022-04-24 09:03:14 -07:00
Simon Willison
36573638b0 Apply Black to code examples in documentation, refs #1718
Uses blacken-docs. This has a deliberate error which I hope will fail CI.
2022-04-24 08:50:43 -07:00
Simon Willison
40ef8ebac2
Run tests on pull requests 2022-04-24 07:10:13 -07:00
Simon Willison
e64d14e413 Use type integer for --timeout, refs #1717 2022-04-24 07:09:08 -07:00
Simon Willison
4bd3a30e1e Update cog docs for publish cloudrun, refs #1717 2022-04-24 07:04:11 -07:00
Tim Sherratt
3001e1e394
Add timeout option to Cloudrun build (#1717)
* Add timeout option for build phase
* Make the --timeout setting optional
* Add test for --timeout setting

Thanks, @wragge
2022-04-24 07:03:08 -07:00
Simon Willison
d57c347f35
Ignore Black commits in git blame, refs #1716 2022-04-22 14:58:46 -07:00
Simon Willison
8338c66a57
datasette-geojson is an example of register_output_renderer 2022-04-21 11:05:43 -07:00
Simon Willison
0bc5186b7b Tooltip and commas for byte length display, closes #1712 2022-04-12 11:44:12 -07:00
Simon Willison
143c105f87 Removed rogue print 2022-04-12 11:43:32 -07:00
dependabot[bot]
138e4d9a53
Update click requirement from <8.1.0,>=7.1.1 to >=7.1.1,<8.2.0 (#1694)
Updates the requirements on [click](https://github.com/pallets/click) to permit the latest version.
- [Release notes](https://github.com/pallets/click/releases)
- [Changelog](https://github.com/pallets/click/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/click/compare/7.1.1...8.1.0)

---
updated-dependencies:
- dependency-name: click
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-04-08 16:05:09 -07:00
dependabot[bot]
247e460e08
Update beautifulsoup4 requirement (#1703)
Updates the requirements on [beautifulsoup4](https://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version.

---
updated-dependencies:
- dependency-name: beautifulsoup4
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-04-08 15:51:04 -07:00
Simon Willison
90d1be9952 Tilde encoding now encodes space as plus, closes #1701
Refs #1657
2022-04-06 08:55:01 -07:00
Simon Willison
df88d03298
Warn about Cloud Run and bots
Refs #1698
2022-04-02 23:05:10 -07:00
Simon Willison
5c5e9b3657 Request.fake(... url_vars), plus .fake() is now documented
Also made 'from datasette import Request' shortcut work.

Closes #1697
2022-03-31 19:01:58 -07:00
Simon Willison
e73fa72917
Fixed bug in httpx_mock example, closes #1691 2022-03-26 15:46:08 -07:00
Simon Willison
bd8a58ae61
Fix message_type in documentation, closes #1689 2022-03-26 13:51:20 -07:00
Simon Willison
6b99e4a66b
Added missing hookimpl import
Useful for copying and pasting to create a quick plugin
2022-03-25 16:44:35 -07:00
Simon Willison
c496f2b663 Don't show facet in cog menu if not allow_facet, closes #1683 2022-03-24 12:16:19 -07:00
Simon Willison
d431a9055e Release 0.61.1
Refs #1682

Refs https://github.com/simonw/datasette-hashed-urls/issues/13
2022-03-23 11:54:10 -07:00
Simon Willison
0159662ab8 Fix for bug running ?sql= against databases with a different route, closes #1682 2022-03-23 11:48:10 -07:00
Simon Willison
d7c793d799 Release 0.61
Refs #957, #1228, #1533, #1545, #1576, #1577, #1587, #1601, #1603, #1607, #1612, #1621, #1649, #1654, #1657, #1661, #1668, #1675, #1678
2022-03-23 11:12:26 -07:00
Simon Willison
c4c9dbd038
google-github-actions/setup-gcloud@v0 2022-03-22 09:49:26 -07:00
Simon Willison
12f3ca7995
google-github-actions/setup-gcloud@v0 2022-03-21 18:42:03 -07:00
Simon Willison
72bfd75fb7 Drop n=1 threshold down to <= 20ms, closes #1679 2022-03-21 14:55:50 -07:00
Simon Willison
1a7750eb29 Documented datasette.check_visibility() method, closes #1678 2022-03-21 12:01:37 -07:00
Simon Willison
194e4f6c3f Removed check_permission() from BaseView, closes #1677
Refs #1660
2022-03-21 11:41:56 -07:00
Simon Willison
dfafce6d96 Display no-opinion permission checks on /-/permissions 2022-03-21 11:37:27 -07:00
Simon Willison
e627510b76 BaseView.check_permissions is now datasette.ensure_permissions, closes #1675
Refs #1660
2022-03-21 10:13:16 -07:00
Simon Willison
4a4164b811 Added another note to the 0.61a0 release notes, refs #1228 2022-03-19 18:23:03 -07:00
Simon Willison
cb4854a435 Fixed typo 2022-03-19 18:17:58 -07:00
Simon Willison
5471e3c491 Release 0.61a0
Refs #957, #1533, #1545, #1576, #1577, #1587, #1601, #1603, #1607, #1612, #1621, #1649, #1654, #1657, #1661, #1668
2022-03-19 18:14:40 -07:00
Simon Willison
cdbae2b93f Fixed internal links to respect db.route, refs #1668 2022-03-19 17:31:23 -07:00
Simon Willison
e10da9af35 alternative-route demo, refs #1668 2022-03-19 17:21:56 -07:00
Simon Willison
7a6654a253 Databases can now have a .route separate from their .name, refs #1668 2022-03-19 17:11:17 -07:00
Simon Willison
798f075ef9 Read format from route captures, closes #1667
Refs #1660
2022-03-19 13:32:29 -07:00
Simon Willison
b9c2b1cfc8 Consistent treatment of format in route capturing, refs #1667
Also refs #1660
2022-03-19 13:29:10 -07:00
Simon Willison
61419388c1 Rename route match groups for consistency, refs #1667, #1660 2022-03-19 09:52:08 -07:00
Simon Willison
764738dfcb test_routes also now asserts matches, refs #1666 2022-03-19 09:30:22 -07:00
Simon Willison
711767bcd3 Refactored URL routing to add tests, closes #1666
Refs #1660
2022-03-18 21:03:08 -07:00
Simon Willison
4e47a2d894 Fixed bug where tables with a column called n caused 500 errors
Closes #1228
2022-03-18 18:37:54 -07:00
Simon Willison
32963018e7 Updated documentation to remove hash_urls, refs #1661 2022-03-18 17:33:06 -07:00
Simon Willison
9979dcd07f Also remove default_cache_ttl_hashed setting, refs #1661 2022-03-18 17:25:14 -07:00
Simon Willison
8658c66438 Show error if --setting hash_urls 1 used, refs #1661 2022-03-18 17:19:31 -07:00
Simon Willison
d4f60c2388
Remove hashed URL mode
Also simplified how view class routing works.

Refs #1661
2022-03-18 17:12:03 -07:00
Simon Willison
30e5f0e67c
Documented internals used by datasette-hashed-urls
Closes #1663
2022-03-17 14:30:02 -07:00
dependabot[bot]
77a904fea1
Update pytest requirement from <7.1.0,>=5.2.2 to >=5.2.2,<7.2.0 (#1656)
Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.2...7.1.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-15 11:03:01 -07:00
Simon Willison
a35393b29c
Tilde encoding (#1659)
Closes #1657

Refs #1439
2022-03-15 11:01:57 -07:00
Simon Willison
c10cd48baf Min pytest-asyncio of 0.17
So that the asyncio_mode in pytest.ini does not produce
a warning on older versions of that library.
2022-03-15 08:43:47 -07:00
Simon Willison
645381a5ed Add code of conduct again
Refs #1658
2022-03-15 08:38:42 -07:00
Simon Willison
77e718c3ff Revert "Fix bug with percentage redirects, close #1650"
This reverts commit c85d669de3.

Refs #1658
2022-03-15 08:37:31 -07:00
Simon Willison
5a353a32b9 Revert "Fixed tests for urlsafe_components, refs #1650"
This reverts commit bb499942c1.

Refs #1658
2022-03-15 08:37:14 -07:00
Simon Willison
239aed1820 Revert "Code of conduct, refs #1654"
This reverts commit c5791156d9.

Refs #1658
2022-03-15 08:36:42 -07:00
Simon Willison
c5791156d9
Code of conduct, refs #1654 2022-03-07 14:04:10 -08:00
Simon Willison
bb499942c1 Fixed tests for urlsafe_components, refs #1650 2022-03-07 11:33:31 -08:00
Simon Willison
c85d669de3 Fix bug with percentage redirects, close #1650 2022-03-07 11:26:08 -08:00
Simon Willison
020effe47b Preserve query string in % to - redirects, refs #1650 2022-03-07 08:18:07 -08:00
Simon Willison
d714c67d65 asyncio_mode = strict to avoid pytest warnings 2022-03-07 08:09:15 -08:00
Simon Willison
644d25d1de Redirect old % URLs to new - encoded URLs, closes #1650
Refs #1439
2022-03-07 08:01:42 -08:00
Simon Willison
1baa030eca
Switch to dash encoding for table/database/row-pk in paths
* Dash encoding functions, tests and docs, refs #1439
* dash encoding is now like percent encoding but with dashes
* Use dash-encoding for row PKs and ?_next=, refs #1439
* Use dash encoding for table names, refs #1439
* Use dash encoding for database names, too, refs #1439

See also https://simonwillison.net/2022/Mar/5/dash-encoding/
2022-03-07 07:38:29 -08:00
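"Percent encoding but with dashes", as described in the commit above, can be sketched roughly as follows (the function names and the exact reserved-character set here are assumptions; see the linked blog post for the real scheme):

```python
def dash_encode(s: str) -> str:
    # Reserved characters become -XX hex escapes; a literal dash is doubled
    # so the encoding is unambiguous and reversible.
    out = []
    for ch in s:
        if ch == "-":
            out.append("--")
        elif ch in "/.%" or ord(ch) < 0x20:
            out.append("-{:02x}".format(ord(ch)))
        else:
            out.append(ch)
    return "".join(out)


def dash_decode(s: str) -> str:
    out = []
    i = 0
    while i < len(s):
        if s[i] == "-":
            if s[i + 1] == "-":
                out.append("-")
                i += 2
            else:
                out.append(chr(int(s[i + 1 : i + 3], 16)))
                i += 3
        else:
            out.append(s[i])
            i += 1
    return "".join(out)
```

The payoff over percent encoding is that proxies and frameworks tend to leave dashes alone in URL paths, whereas %2F and friends are frequently decoded or rejected before the application ever sees them.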
Dan Peterson
de810f49cc
Add /opt/homebrew to where spatialite extension can be found (#1649)
Helps homebrew on Apple Silicon setups find spatialite without needing
a full path.

Similar to #1114

Thanks, @danp
2022-03-06 11:39:15 -08:00
David Larlet
0499f174c0
Typo in docs about default redirect status code (#1589) 2022-03-05 17:58:31 -08:00
dependabot[bot]
7b78279b93
Update pytest-timeout requirement from <2.1,>=1.4.2 to >=1.4.2,<2.2 (#1602)
Updates the requirements on [pytest-timeout](https://github.com/pytest-dev/pytest-timeout) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-timeout/releases)
- [Commits](https://github.com/pytest-dev/pytest-timeout/compare/1.4.2...2.1.0)

---
updated-dependencies:
- dependency-name: pytest-timeout
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-05 17:41:49 -08:00
dependabot[bot]
73f2d25f70
Update asgiref requirement from <3.5.0,>=3.2.10 to >=3.2.10,<3.6.0 (#1610)
Updates the requirements on [asgiref](https://github.com/django/asgiref) to permit the latest version.
- [Release notes](https://github.com/django/asgiref/releases)
- [Changelog](https://github.com/django/asgiref/blob/main/CHANGELOG.txt)
- [Commits](https://github.com/django/asgiref/compare/3.2.10...3.5.0)

---
updated-dependencies:
- dependency-name: asgiref
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-05 17:30:27 -08:00
dependabot[bot]
b21839dd1a
Update pytest requirement from <6.3.0,>=5.2.2 to >=5.2.2,<7.1.0 (#1629)
Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.2...7.0.0)

---
updated-dependencies:
- dependency-name: pytest
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-05 17:30:05 -08:00
dependabot[bot]
a22ec96c3a
Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.19 (#1631)
Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Commits](https://github.com/pytest-dev/pytest-asyncio/compare/v0.10.0...v0.18.0)

---
updated-dependencies:
- dependency-name: pytest-asyncio
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-05 17:29:53 -08:00
Simon Willison
5010d1359b Fix for test failure caused by SQLite 3.37.0+, closes #1647 2022-03-05 11:46:59 -08:00
Simon Willison
dd94157f89
Link to tutorials from documentation index page 2022-02-27 10:04:03 -08:00
Simon Willison
7d24fd405f
datasette-auth-passwords is now an example of register_commands
Refs https://github.com/simonw/datasette-auth-passwords/issues/19
2022-02-09 09:47:54 -08:00
Simon Willison
458f03ad3a More SpatiaLite details on /-/versions, closes #1607 2022-02-08 22:32:19 -08:00
Simon Willison
1b2f0ab6bb Revert "Use de-dupe idiom that works with Python 3.6, refs #1632"
This reverts commit 5bfd001b55.

No need for this on the main branch because it doesn't support Python 3.6 any more.
2022-02-07 15:43:45 -08:00
Simon Willison
5bfd001b55 Use de-dupe idiom that works with Python 3.6, refs #1632 2022-02-07 15:42:37 -08:00
Simon Willison
fa5fc327ad Release 0.60.2
Refs #1632
2022-02-07 15:34:01 -08:00
Simon Willison
0cd982fc6a De-duplicate 'datasette db.db db.db', closes #1632
Refs https://github.com/simonw/datasette-publish-fly/pull/12
2022-02-07 15:28:59 -08:00
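The de-duplication fix above (and the Python 3.6-compatible idiom referenced by the revert commits) relies on preserving first-seen order. A minimal sketch of an order-preserving de-dupe, as an illustration rather than Datasette's exact code:

```python
def dedupe(items):
    """Remove duplicates while preserving first-seen order.

    dict preserves insertion order as a language guarantee on Python 3.7+;
    on 3.6 that was an implementation detail, which is why the 0.60.2
    backport swapped idioms.
    """
    return list(dict.fromkeys(items))


print(dedupe(["db.db", "db.db", "other.db"]))  # → ['db.db', 'other.db']
```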
Simon Willison
03305ea183 Remove python.version, refs #1176 2022-02-06 22:40:47 -08:00
Simon Willison
fdce6f29e1 Reconfigure ReadTheDocs, refs #1176 2022-02-06 22:38:27 -08:00
Simon Willison
d9b508ffaa @documented decorator plus unit test plus sphinx.ext.autodoc
New mechanism for marking datasette.utils functions that should be covered by the
documentation, then testing that they have indeed been documented.

Also enabled sphinx.ext.autodoc which can now be used to embed the documented
versions of those functions.

Refs #1176
2022-02-06 22:31:06 -08:00
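One plausible shape for the `@documented` marker described above — a registry decorator that a test can later check against the rendered docs. Names here are illustrative, not necessarily Datasette's actual implementation:

```python
documented_fns = []


def documented(fn):
    """Mark a function as one that must be covered by the documentation."""
    documented_fns.append(fn)
    return fn


@documented
def example_util(s):
    """A hypothetical datasette.utils function."""
    return s


# A unit test can then assert every marked function appears in the docs text:
#     for fn in documented_fns:
#         assert fn.__name__ in docs_text
print([fn.__name__ for fn in documented_fns])  # → ['example_util']
```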
Simon Willison
9b83ff2ee4
Fixed spelling of "raise" 2022-02-05 22:46:33 -08:00
Simon Willison
8a25ea9bca Implemented import shortcuts, closes #957 2022-02-05 22:34:33 -08:00
Simon Willison
d25b55ab5e Fixed rST warnings 2022-02-05 22:32:23 -08:00
Simon Willison
1c6b297e3e Link to datasette.tracer from trace_debug docs, refs #1576 2022-02-04 21:28:35 -08:00
Simon Willison
da53e0360d tracer.trace_child_tasks() for asyncio.gather tracing
Also added documentation for datasette.tracer module.

Closes #1576
2022-02-04 21:19:49 -08:00
Simon Willison
ac239d34ab Refactor test_trace into separate test module, refs #1576 2022-02-04 20:45:13 -08:00
Robert Christie
1af1041f91
Jinja template_name should use "/" even on Windows (#1617)
Closes #1545. Thanks, Robert Christie
2022-02-02 17:58:35 -08:00
dependabot[bot]
b5e6b1a9e1
Bump black from 21.12b0 to 22.1.0 (#1616)
Bumps [black](https://github.com/psf/black) from 21.12b0 to 22.1.0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/commits/22.1.0)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-02-02 14:23:51 -08:00
Simon Willison
a9d8824617
Test against Python 3.11-dev
Closes #1621
2022-02-02 13:58:52 -08:00
Simon Willison
23a09b0f6a Remove JSON rel=alternate from some pages, closes #1623 2022-02-02 13:48:52 -08:00
Simon Willison
8d5779acf0 Refactored alternate_url_json mechanism, refs #1620, #1533 2022-02-02 13:32:47 -08:00
Simon Willison
b72b2423c7 rel=alternate JSON for queries and database pages, closes #1620 2022-02-02 13:22:45 -08:00
Simon Willison
3ef47a0896 Link rel=alternate header for tables and rows
Also added Access-Control-Expose-Headers: Link to --cors mode.

Closes #1533

Refs https://github.com/simonw/datasette-notebook/issues/2

2022-02-01 23:49:09 -08:00
Simon Willison
2aa686c655
It's not a weekly newsletter 2022-01-26 10:21:05 -08:00
Simon Willison
84391763a8
Clarify that magic parameters don't work for custom SQL 2022-01-25 10:39:03 -08:00
Simon Willison
68cc1e2dbb Move queries to top of database page, refs #1612 2022-01-25 10:28:15 -08:00
Simon Willison
d194db4204 Output pip freeze to show installed packages, refs #1609 2022-01-20 18:02:04 -08:00
Simon Willison
ffca55dfd7 Show link to /stable/ on /latest/ pages, refs #1608 2022-01-20 14:40:44 -08:00
Simon Willison
150967d98e Hand-edited pixel favicon, refs #1603 2022-01-20 10:43:15 -08:00
Simon Willison
7c67483f5e Make test_favicon flexible to changing icon sizes, refs #1603 2022-01-19 21:57:14 -08:00
Simon Willison
b01c9b68d1 Oops I pushed the wrong favicon, refs #1603 2022-01-19 21:54:41 -08:00
Simon Willison
b2eebf5ebf No need to send this, it's got a default, refs #1603 2022-01-19 21:52:00 -08:00
Simon Willison
0467723ee5 New, improved favicon - refs #1603 2022-01-19 21:46:03 -08:00
Simon Willison
e1770766ce Return plugins and hooks in predictable order 2022-01-19 21:14:04 -08:00
Simon Willison
43c30ce023 Use cog to maintain default plugin list in plugins.rst, closes #1600
Also fixed a bug I spotted where datasette.filters showed the same hook three times.
2022-01-19 21:04:09 -08:00
Simon Willison
14e320329f Hidden tables data_licenses, KNN, KNN2 for SpatiaLite, closes #1601 2022-01-19 20:38:49 -08:00
Simon Willison
fae3983c51 Drop support for Python 3.6, closes #1577
Refs #1606
2022-01-19 20:31:22 -08:00
Simon Willison
58652dd925 Hidden tables sqlite1/2/3/4, closes #1587 2022-01-19 20:12:46 -08:00
Simon Willison
cb29119db9 Release 0.60
Refs #473, #625, #1527, #1544, #1547, #1551, #1552, #1555, #1556, #1557,
#1563, #1564, #1568, #1570, #1575, #1579, #1588, #1594
2022-01-13 17:36:51 -08:00
Simon Willison
3664ddd400 Replace update-docs-help.py with cog, closes #1598 2022-01-13 16:47:53 -08:00
Simon Willison
10659c3f1f datasette-debug-asgi plugin to help investigate #1590 2022-01-13 16:38:53 -08:00
Simon Willison
ab7d6a7179 Updated settings help URL to avoid redirect 2022-01-13 16:38:16 -08:00
Simon Willison
714b4df1b1 Fixed reStructuredText warning, refs #1594 2022-01-13 16:36:28 -08:00
Simon Willison
76d66d5b2b Tweak order of documentation contents 2022-01-13 16:30:00 -08:00
Simon Willison
3a0f7d6488 Fixed hidden form fields bug #1527 2022-01-13 16:27:21 -08:00
Simon Willison
515f8d38eb Help summaries for publish cloudrun/heroku 2022-01-13 16:12:54 -08:00
Simon Willison
8cf4b77a92 Better copy for 'datasette plugins --help' 2022-01-13 16:11:07 -08:00
Simon Willison
8f5c44a166 Better --help summaries for install and uninstall 2022-01-13 16:09:38 -08:00
Simon Willison
88bc2ceae1 --help summary for 'datasette inspect', closes #1597 2022-01-13 16:07:30 -08:00
Simon Willison
3658e57ac2 Fixed bug with table title element, closes #1560 2022-01-13 14:20:07 -08:00
Simon Willison
5698e2af01 Promote Datasette Desktop in installation docs, closes #1466 2022-01-13 13:55:13 -08:00
Simon Willison
4b23f01f3e CLI reference docs, maintained by cog - refs #1594 2022-01-13 13:35:54 -08:00
Simon Willison
63537dd3de Allow 'explain query plan' with more whitespace, closes #1588 2022-01-13 12:34:55 -08:00
Simon Willison
8c401ee0f0 Fixed remaining code and docs for new block=True default, closes #1579 2021-12-23 11:18:20 -08:00
Simon Willison
75153ea9b9 Updated db.execute_write_fn() docs for block=True default, refs #1579 2021-12-23 11:16:31 -08:00
Simon Willison
00a2895cd2 execute_write default is now block=True, closes #1579 2021-12-23 11:03:49 -08:00
Simon Willison
ace86566b2 Remove concept of special_args, re-arrange TableView a bit, refs #1518 2021-12-22 12:23:05 -08:00
Simon Willison
6b1384b2f5
Track plausible for docs.datasette.io not datasette.io 2021-12-20 15:55:17 -08:00
Simon Willison
554aae5c51
Plausible analytics for the documentation 2021-12-20 09:23:05 -08:00
Simon Willison
f36e010b3b Upgrade to Pluggy>=1.0, refs #1575 2021-12-19 17:25:40 -08:00
Simon Willison
dbaac79946 Release 0.60a1
Refs #1547, #1555, #1562, #1563, #1564, #1567, #1568, #1569, #1570, #1571, #1572
2021-12-19 14:08:10 -08:00
Simon Willison
4094741c28 Fixed bug with custom templates for writable canned queries, closes #1547 2021-12-19 13:11:57 -08:00
Simon Willison
5fac26aa22 Another populate_schema_tables optimization, refs #1555 2021-12-19 12:54:12 -08:00
Simon Willison
f65817000f Include count in execute_write_many traces, closes #1571 2021-12-19 12:30:34 -08:00
Simon Willison
c6ff1f23e6 Queries took rather than query took, closes #1572 2021-12-18 20:03:21 -08:00
Simon Willison
97b1723dd0 Optimize init_internal_db by running PRAGMA in a single function
Refs #1555
2021-12-18 19:49:11 -08:00
Simon Willison
d637ed4676 Use execute_write_many to optimize internal DB, refs #1555, #1570 2021-12-18 11:11:08 -08:00
Simon Willison
5cadc24489 db.execute_write_script() and db.execute_write_many(), closes #1570
Refs #1555
2021-12-18 10:57:22 -08:00
Simon Willison
2e4ba71b53 Optimize create table calls using executescript=True
Refs #1555, #1569
2021-12-18 10:34:15 -08:00
Simon Willison
9e094b7c9d db.execute_write(executescript=True) option, closes #1569 2021-12-18 10:34:15 -08:00
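The two optimizations above lean on the bulk-execution APIs in Python's sqlite3 module, which `db.execute_write(..., executescript=True)` and `db.execute_write_many()` wrap. A stdlib sketch of the difference (table names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# executescript: run several statements in one call (e.g. schema creation)
conn.executescript("""
    CREATE TABLE catalog_tables (name TEXT);
    CREATE TABLE catalog_columns (table_name TEXT, name TEXT);
""")

# executemany: run one parameterized statement against many rows
rows = [("t1", "id"), ("t1", "title"), ("t2", "id")]
conn.executemany(
    "INSERT INTO catalog_columns (table_name, name) VALUES (?, ?)", rows
)
conn.commit()

print(conn.execute("SELECT count(*) FROM catalog_columns").fetchone()[0])  # → 3
```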
Simon Willison
85c22f4fbc
Corrected Datasette(files=) example from #1563 2021-12-18 10:10:37 -08:00
Simon Willison
f81d9d0cd9 Trace write SQL queries in addition to read ones, closes #1568 2021-12-17 18:42:29 -08:00
Simon Willison
7c8f8aa209 Documentation for Datasette() constructor, closes #1563 2021-12-17 18:19:36 -08:00
Simon Willison
3a0cae4d7f Fix bug introduced by refactor in c35b84a2aa 2021-12-17 18:19:09 -08:00
Simon Willison
359140ceda Datasette() constructor no longer requires files=, closes #1563 2021-12-17 18:09:00 -08:00
Simon Willison
83bacfa945 Call _prepare_connection() on write connections, closes #1564 2021-12-17 17:58:39 -08:00
Simon Willison
c35b84a2aa Remove undocumented sqlite_functions mechanism, closes #1567 2021-12-17 17:54:39 -08:00
Simon Willison
0c91e59d2b datasette-leaflet-freedraw is an example of filters_from_request 2021-12-17 15:55:06 -08:00
Simon Willison
d0f24f9bbc Clarifying comment
The new filters stuff is a little bit action-at-a-distance
2021-12-17 15:55:06 -08:00
dependabot[bot]
35cba9e85a
Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1 (#1562)
Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
- [Release notes](https://github.com/aio-libs/janus/releases)
- [Changelog](https://github.com/aio-libs/janus/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/janus/compare/v0.6.2...v1.0.0)

---
updated-dependencies:
- dependency-name: janus
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-12-17 15:08:28 -08:00
Simon Willison
f000a7bd75
Use load_extension(?) instead of fstring 2021-12-17 12:15:29 -08:00
Simon Willison
92a5280d2e Release 0.60a0
Refs #473, #625, #1544, #1551, #1552, #1556, #1557
2021-12-17 11:13:51 -08:00
Simon Willison
aa7f0037a4
filters_from_request plugin hook, now used in TableView
- New `filters_from_request` plugin hook, closes #473
- Used it to extract the logic from TableView that handles `_search` and
`_through` and `_where` - refs #1518

Also needed for this plugin work: https://github.com/simonw/datasette-leaflet-freedraw/issues/7
2021-12-17 11:02:14 -08:00
Simon Willison
0663d5525c More comments in TableView.data(), refs #1518 2021-12-16 14:00:29 -08:00
Simon Willison
2c07327d23 Move columns_to_select to TableView class, add lots of comments, refs #1518 2021-12-16 13:43:44 -08:00
Simon Willison
0d4145d0f4 Additional test for #625 2021-12-16 12:30:31 -08:00
Simon Willison
95d0dd7a1c Fix for colliding facet types bug, closes #625
Refs #830
2021-12-16 12:12:04 -08:00
Simon Willison
992496f261 ?_nosuggest=1 parameter for table views, closes #1557 2021-12-16 11:24:54 -08:00
Simon Willison
20a2ed6bec Fixed bug with metadata config of array/date facets, closes #1552
Thanks @davidbgk for spotting the fix for the bug.
2021-12-16 10:47:40 -08:00
Simon Willison
40e5b0a5b5
How to create indexes with sqlite-utils 2021-12-16 10:03:10 -08:00
Simon Willison
eb53837d2a Always show count of distinct facet values, closes #1556
Refs #1423
2021-12-15 09:58:01 -08:00
Simon Willison
4f02c8d4d7 Test for JSON in query_string name, refs #621
Plus simplified implementation of test_request_blank_values
2021-12-14 12:29:05 -08:00
dependabot[bot]
f5538e7161
Bump black from 21.11b1 to 21.12b0 (#1543)
Bumps [black](https://github.com/psf/black) from 21.11b1 to 21.12b0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/commits)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-12-13 15:22:29 -08:00
dependabot[bot]
8b411a6b70
Update pytest-xdist requirement from <2.5,>=2.2.1 to >=2.2.1,<2.6 (#1548)
Updates the requirements on [pytest-xdist](https://github.com/pytest-dev/pytest-xdist) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-xdist/releases)
- [Changelog](https://github.com/pytest-dev/pytest-xdist/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest-xdist/compare/v2.2.1...v2.5.0)

---
updated-dependencies:
- dependency-name: pytest-xdist
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-12-13 15:22:21 -08:00
Simon Willison
a6ff123de5 keep_blank_values=True when parsing query_string, closes #1551
Refs #1518
2021-12-12 12:01:51 -08:00
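The `keep_blank_values` fix maps directly onto Python's query-string parser: by default a parameter with an empty value, such as `?_search=`, is dropped entirely, which broke table filters. A stdlib illustration:

```python
from urllib.parse import parse_qs

qs = "_search=&_sort=name"

# Default: blank values are discarded, so _search= disappears entirely
print(parse_qs(qs))  # → {'_sort': ['name']}

# keep_blank_values=True retains them, which the table filters rely on
print(parse_qs(qs, keep_blank_values=True))
# → {'_search': [''], '_sort': ['name']}
```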
Simon Willison
492f9835aa Refactor table view API tests to test_table_api.py
Refs #1518
2021-12-11 19:07:19 -08:00
Simon Willison
1876975e3b Refactor table view HTML tests to test_table_html.py
Refs #1518
2021-12-11 19:06:45 -08:00
Simon Willison
737115ea14
Label column finder is now case-insensitive
Closes #1544
2021-12-07 12:03:42 -08:00
Simon Willison
36b596e383
Framework :: Datasette Trove classifier 2021-12-07 11:41:56 -08:00
Simon Willison
7c02be2ee9 Release 0.59.4
Refs #1525, #1527
2021-11-29 22:45:37 -08:00
Simon Willison
ca66246438 Updated JSON foreign key tables test for #1525 2021-11-29 22:45:04 -08:00
Simon Willison
35b12746ba Fixed CSV test I broke in #1525 2021-11-29 22:37:22 -08:00
Simon Willison
a37ee74891 Correct link to _ prefix on row page, closes #1525 2021-11-29 22:34:31 -08:00
Simon Willison
69244a617b Rename city_id to _city_id in fixtures, refs #1525 2021-11-29 22:20:42 -08:00
Simon Willison
06762776f7 Fix for incorrect hidden for fields for _columns, refs #1527 2021-11-29 19:04:35 -08:00
dependabot[bot]
83eb29dece
Update janus requirement from <0.7,>=0.6.2 to >=0.6.2,<0.8 (#1529)
Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
- [Release notes](https://github.com/aio-libs/janus/releases)
- [Changelog](https://github.com/aio-libs/janus/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/janus/compare/v0.6.2...v0.7.0)

---
updated-dependencies:
- dependency-name: janus
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-29 18:37:13 -08:00
dependabot[bot]
cc4c70b367
Bump black from 21.9b0 to 21.11b1 (#1516)
Bumps [black](https://github.com/psf/black) from 21.9b0 to 21.11b1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/commits)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-29 18:35:28 -08:00
dependabot[bot]
3303514a52
Update docutils requirement from <0.18 to <0.19 (#1508)
Updates the requirements on [docutils](http://docutils.sourceforge.net/) to permit the latest version.

---
updated-dependencies:
- dependency-name: docutils
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-29 18:35:18 -08:00
dependabot[bot]
1beb7d9399
Update aiofiles requirement from <0.8,>=0.4 to >=0.4,<0.9 (#1537)
Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
- [Release notes](https://github.com/Tinche/aiofiles/releases)
- [Commits](https://github.com/Tinche/aiofiles/compare/v0.4.0...v0.8.0)

---
updated-dependencies:
- dependency-name: aiofiles
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-29 18:29:54 -08:00
Simon Willison
48f11998b7 Release 0.59.3
Refs #448, #838, #1519
2021-11-20 15:40:21 -08:00
Simon Willison
d8c79b1340 Link to Apache proxy demo from documentation, closes #1524 2021-11-20 15:33:58 -08:00
Simon Willison
ed77eda6d8 Add datasette-redirect-to-https plugin
Also configured supervisord children to log to stdout, so that I
can see them with flyctl logs -a datasette-apache-proxy-demo

Refs #1524
2021-11-20 15:30:25 -08:00
Simon Willison
f11a13d73f Extract out Apache config to separate file, refs #1524 2021-11-20 12:23:40 -08:00
Simon Willison
250db8192c Hopefully last fix relating to #1519, #838 2021-11-20 11:09:05 -08:00
Simon Willison
08947fa764 Fix more broken base_url links
Refs #1519, #838
2021-11-20 11:03:08 -08:00
Simon Willison
48951e4304 Switch to hosting demo on Fly, closes #1522 2021-11-20 10:51:51 -08:00
Simon Willison
494f11d5cc Switch from Alpine to Debian, refs #1522 2021-11-20 10:51:14 -08:00
Simon Willison
24b5006ad7 ProxyPreserveHost On for apache-proxy demo, refs #1522 2021-11-19 17:11:13 -08:00
Simon Willison
640031edfd Fixed bug introduced in #1519 2021-11-19 17:01:17 -08:00
Simon Willison
fe687fd020 Fixed a whole bunch of broken base_url links
Refs #1519, #838
2021-11-19 16:53:11 -08:00
Simon Willison
a1ba6cd6bb Use build arguments, refs #1522 2021-11-19 16:34:35 -08:00
Simon Willison
c617e1769e Fixed test I broke with new repr() in #1519 2021-11-19 15:13:17 -08:00
Simon Willison
c76bbd4066 New live demo with Apache proxying, refs #1522 2021-11-19 14:50:06 -08:00
Simon Willison
ff0dd4da38 repr() method for Request, refs #1519 2021-11-19 12:29:37 -08:00
Simon Willison
3025505515 functools.wraps to help investigate #1517 2021-11-18 19:19:43 -08:00
Simon Willison
6e971b4ac1 Test confirming plugins can over-ride default routes, closes #1517 2021-11-18 19:07:21 -08:00
Simon Willison
0156c6b5e5 Facet in predictable order for tests, refs #448 2021-11-15 17:31:33 -08:00
Simon Willison
55024b5301 _facet_array no longer confused by duplicate array items, closes #448 2021-11-15 17:19:33 -08:00
Simon Willison
07044bd130 SQL view-friendly arraycontains/arraynotcontains implementation, refs #448 2021-11-15 15:41:07 -08:00
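The view-friendly `arraycontains` approach above can be sketched with SQLite's `json_each` table-valued function in an `EXISTS` subquery, which also avoids counting a value twice when it appears more than once in a single row's array. This is an illustration of the pattern, not Datasette's exact SQL, and assumes a SQLite build with the JSON1 functions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE places (name TEXT, tags TEXT)")
conn.executemany(
    "INSERT INTO places VALUES (?, ?)",
    [("a", '["park", "park", "dog"]'), ("b", '["dog"]')],
)

# EXISTS matches each row at most once, so duplicate items inside one
# row's JSON array no longer inflate the result
matches = conn.execute(
    """
    SELECT name FROM places
    WHERE EXISTS (
        SELECT 1 FROM json_each(places.tags) WHERE value = ?
    )
    """,
    ("park",),
).fetchall()
print(matches)  # → [('a',)]
```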
Simon Willison
502c02fa6d Pin to docutils<0.18 in ReadTheDocs, refs #1507 2021-11-13 21:37:40 -08:00
Simon Willison
030390fd4a
.readthedocs.yaml configuration, refs #1507 2021-11-13 21:29:43 -08:00
Simon Willison
de1e031713 Release 0.59.2
Refs #1497, #1503, #1506
2021-11-13 21:14:43 -08:00
Simon Willison
1c13e1af06 Ensure query columns are included too, ref #1503 2021-11-13 21:08:33 -08:00
Simon Willison
c9e3cfecc8 Columns in filters now ignore ?_nocol, closes #1503 2021-11-13 20:53:00 -08:00
Simon Willison
c306b696de Correct facet links for columns with a leading underscore, closes #1506 2021-11-13 20:44:54 -08:00
Simon Willison
c92ab51b3c
Logo at top of README 2021-11-12 06:18:31 -08:00
Simon Willison
2c31d1cd9c Upgrade Docker base to Debian buster, refs #1497 2021-10-24 16:24:41 -07:00
Simon Willison
3a2ed6300d Run tests on 3.10 during publish, refs #1482 2021-10-24 15:37:43 -07:00
Simon Willison
e6e44372b3 Release 0.59.1
Refs #1482, #1496
2021-10-24 15:29:56 -07:00
dependabot[bot]
03cc697b6b
Update pytest-asyncio requirement from <0.16,>=0.10 to >=0.10,<0.17 (#1494)
Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Commits](https://github.com/pytest-dev/pytest-asyncio/compare/v0.10.0...v0.16.0)

---
updated-dependencies:
- dependency-name: pytest-asyncio
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-10-24 15:22:39 -07:00
Simon Willison
96a823f283
Fix compatibility with Python 3.10 (#1481)
* Run tests against Python 3.10
* Upgrade to Janus 0.6.2 for Python 3.10
* Add 3.10 to classifiers

Closes #1482
2021-10-24 15:19:54 -07:00
Simon Willison
15a9d4abff Docs on named parameters with cast as real/integer, closes #1496 2021-10-22 12:34:23 -07:00
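The docs commit above covers named parameters, which arrive from the URL as strings; wrapping them in SQL `cast()` coerces them to numbers. A minimal sqlite3 sketch of the technique (the parameter name is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# :distance is passed as a string, as it would be from a query string;
# cast() coerces it to an integer inside the SQL itself
sql = "SELECT cast(:distance as integer) + 10"
result = conn.execute(sql, {"distance": "5"}).fetchone()[0]
print(result)  # → 15
```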
Simon Willison
a1b20852db Unwrapped some documentation text 2021-10-22 12:00:00 -07:00
Simon Willison
e5d01ca583 Fixed typo in release notes 2021-10-22 11:56:27 -07:00
Simon Willison
ff9ccfb031 Fixed typo in release notes 2021-10-14 12:23:43 -07:00
Simon Willison
8934507cdc Release 0.59
Refs #942, #1404, #1405, #1416, #1420, #1421, #1422, #1423, #1425, #1431, #1443, #1446, #1449, #1467, #1469, #1470, #1488
2021-10-14 12:22:19 -07:00
Simon Willison
81bf9a9f3c Updated --cors documentation, refs #1467 2021-10-14 12:19:03 -07:00
Simon Willison
8584993529 --cors Access-Control-Allow-Headers: Authorization
Refs #1467, refs https://github.com/simonw/datasette-auth-tokens/issues/4
2021-10-14 12:03:28 -07:00
C. Titus Brown
0fdbf00484
Rework the --static documentation
Rework the `--static` documentation to better differentiate between the filesystem and serving locations. Closes #1457

Co-authored-by: Simon Willison <swillison@gmail.com>
2021-10-14 11:39:55 -07:00
dependabot[bot]
827fa823d1
Update pyyaml requirement from ~=5.3 to >=5.3,<7.0 (#1489)
Updates the requirements on [pyyaml](https://github.com/yaml/pyyaml) to permit the latest version.
- [Release notes](https://github.com/yaml/pyyaml/releases)
- [Changelog](https://github.com/yaml/pyyaml/blob/master/CHANGES)
- [Commits](https://github.com/yaml/pyyaml/compare/5.3...6.0)

---
updated-dependencies:
- dependency-name: pyyaml
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-10-14 11:10:42 -07:00
Simon Willison
b267b57754
Upgrade to httpx 0.20
* Upgrade to httpx 0.20, closes #1488
* TestClient.post() should not default to following redirects
2021-10-14 11:03:44 -07:00
dependabot[bot]
2a8c669039
Update beautifulsoup4 requirement (#1463)
Updates the requirements on [beautifulsoup4](http://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version.

---
updated-dependencies:
- dependency-name: beautifulsoup4
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-10-13 15:35:36 -07:00
dependabot[bot]
e1012e7098
Bump black from 21.7b0 to 21.9b0 (#1471)
Bumps [black](https://github.com/psf/black) from 21.7b0 to 21.9b0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/commits)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-10-13 14:47:42 -07:00
Simon Willison
763d0a0faa Fix for cog menu default facet bug, closes #1469 2021-10-13 14:20:03 -07:00
dependabot[bot]
a673a93b57
Update pluggy requirement from ~=0.13.0 to >=0.13,<1.1 (#1448)
Updates the requirements on [pluggy](https://github.com/pytest-dev/pluggy) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pluggy/releases)
- [Changelog](https://github.com/pytest-dev/pluggy/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pluggy/compare/0.13.0...1.0.0)

---
updated-dependencies:
- dependency-name: pluggy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-10-13 14:11:00 -07:00
Michael Tiemann
31352914c4
Update full_text_search.rst (#1474)
Change "above" to "below" to correct correspondence of reference to example.
2021-10-13 14:10:23 -07:00
dependabot[bot]
6aab0217f0
Update pytest-xdist requirement from <2.4,>=2.2.1 to >=2.2.1,<2.5 (#1476)
Updates the requirements on [pytest-xdist](https://github.com/pytest-dev/pytest-xdist) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-xdist/releases)
- [Changelog](https://github.com/pytest-dev/pytest-xdist/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest-xdist/compare/v2.2.1...v2.4.0)

---
updated-dependencies:
- dependency-name: pytest-xdist
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-10-13 14:10:03 -07:00
dependabot[bot]
759fd97a54
Update pytest-timeout requirement from <1.5,>=1.4.2 to >=1.4.2,<2.1 (#1485)
Updates the requirements on [pytest-timeout](https://github.com/pytest-dev/pytest-timeout) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-timeout/releases)
- [Commits](https://github.com/pytest-dev/pytest-timeout/commits)

---
updated-dependencies:
- dependency-name: pytest-timeout
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-10-13 14:09:23 -07:00
Rhet Turnbull
68087440b3
Added instructions for installing plugins via pipx
Closes #1486
2021-10-13 14:09:10 -07:00
Simon Willison
b50bf5d13f Don't persist _next in hidden field, closes #1483 2021-10-13 14:08:06 -07:00
Simon Willison
0d5cc20aef Revert "asyncio_run helper to deal with a 3.10 warning, refs #1482"
This reverts commit 98dcabccbb.
2021-10-09 18:25:33 -07:00
Simon Willison
875117c343 Fix bug with ?_next=x&_sort=rowid, closes #1470 2021-10-09 18:14:56 -07:00
Simon Willison
1163da8916 Update test to handle Python 3.10 error message difference, refs #1482 2021-10-08 17:32:52 -07:00
Simon Willison
98dcabccbb asyncio_run helper to deal with a 3.10 warning, refs #1482 2021-10-08 17:32:11 -07:00
Simon Willison
63886178a6
Describe a common mistake using csrftoken() 2021-09-22 15:44:28 -07:00
Simon Willison
b28b6cd2fe
Warn that execute_write_fn(fn) should be a non-async function 2021-09-12 13:13:52 -07:00
Simon Willison
d57ab156b3
Added researchers too, refs #1455 2021-09-04 09:33:20 -07:00
Robert Gieseke
772f9a07ce
Add scientists to target groups (#1455) 2021-09-04 09:31:38 -07:00
Simon Willison
67cbf0ae72
Example for register_commands, refs #1449 2021-08-28 04:17:03 -07:00
Simon Willison
50c35b66a4
Added missing space 2021-08-28 04:14:38 -07:00
Simon Willison
d3ea367131 Release 0.59a2
Refs #942, #1421, #1423, #1431, #1443, #1446, #1449
2021-08-27 18:55:54 -07:00
Simon Willison
30c18576d6 register_commands() plugin hook, closes #1449 2021-08-27 18:39:42 -07:00
Simon Willison
3655bb49a4 Better default help text, closes #1450 2021-08-27 17:49:03 -07:00
dependabot[bot]
a1a33bb582
Bump black from 21.6b0 to 21.7b0 (#1400)
Bumps [black](https://github.com/psf/black) from 21.6b0 to 21.7b0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/commits)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-08-24 18:29:55 -07:00
dependabot[bot]
5161422b7f
Update trustme requirement from <0.9,>=0.7 to >=0.7,<0.10 (#1433)
Updates the requirements on [trustme](https://github.com/python-trio/trustme) to permit the latest version.
- [Release notes](https://github.com/python-trio/trustme/releases)
- [Commits](https://github.com/python-trio/trustme/compare/v0.7.0...v0.9.0)

---
updated-dependencies:
- dependency-name: trustme
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-08-24 18:29:26 -07:00
Tim Sherratt
93c3a7ffbf
Remove underscore from search mode parameter name (#1447)
The text refers to the parameter as `searchmode` but the `metadata.json` example uses `search_mode`. The latter doesn't actually seem to work.
2021-08-24 18:28:58 -07:00
Simon Willison
92a99d969c Added not-footer wrapper div, refs #1446 2021-08-24 11:13:42 -07:00
Simon Willison
7e15422aac Documentation for datasette.databases property, closes #1443 2021-08-19 14:23:43 -07:00
Simon Willison
4eb3ae40fb
Don't bother building docs if not on main
Refs #1442
2021-08-19 14:17:44 -07:00
Simon Willison
d84e574e59
Ability to deploy demos of branches
* Ability to deploy additional branch demos, closes #1442
* Only run tests before deploy on main branch
* Documentation for continuous deployment
2021-08-19 14:09:38 -07:00
Simon Willison
adb5b70de5 Show count of facet values if ?_facet_size=max, closes #1423 2021-08-16 11:56:32 -07:00
Simon Willison
2883098770 Fixed config_dir mode, refs #1432 2021-08-12 22:17:40 -07:00
Simon Willison
bbc4756f9e
Settings fix, refs #1433 2021-08-12 20:54:25 -07:00
Simon Willison
ca4f83dc7b Rename config= to settings=, refs #1432 2021-08-12 18:10:36 -07:00
Simon Willison
77f46297a8 Rename --help-config to --help-settings, closes #1431 2021-08-12 18:01:57 -07:00
Simon Willison
e837095ef3
Column metadata, closes #942 2021-08-12 16:53:23 -07:00
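Column metadata (#942) lets metadata.json describe individual columns, with the descriptions shown on table pages. A minimal fragment, with database, table, and column names chosen for illustration:

```json
{
  "databases": {
    "fixtures": {
      "tables": {
        "attractions": {
          "columns": {
            "name": "The name of the attraction"
          }
        }
      }
    }
  }
}
```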
Simon Willison
b1fed48a95 derive_named_parameters falls back to regex on SQL error, refs #1421 2021-08-08 20:26:08 -07:00
Simon Willison
fc4846850f New way of deriving named parameters using explain, refs #1421 2021-08-08 20:21:13 -07:00
Simon Willison
ad90a72afa Release 0.59a1
Refs #1425
2021-08-08 18:13:03 -07:00
Simon Willison
a390bdf9ce Stop using firstresult=True on render_cell, refs #1425
See https://github.com/simonw/datasette/issues/1425#issuecomment-894883664
2021-08-08 17:38:42 -07:00
Simon Willison
f3c9edb376 Fixed some tests I broke in #1425 2021-08-08 16:11:40 -07:00
Simon Willison
818b0b76a2 Test table render_cell async as well as query results, refs #1425 2021-08-08 16:07:52 -07:00
Simon Willison
3bb6409a6c render_cell() can now return an awaitable, refs 2021-08-08 16:05:00 -07:00
Simon Willison
de5ce2e563
datasette-pyinstrument 2021-08-08 10:37:51 -07:00
Simon Willison
61505dd0c6 Release 0.59a0
Refs #1404, #1405, #1416, #1420, #1422
2021-08-06 22:40:07 -07:00
Simon Willison
6dd14a1221 Improved links to example plugins 2021-08-06 22:38:47 -07:00
Simon Willison
a21853c9da Fix for rich.console sometimes not being available, refs #1416 2021-08-06 22:17:36 -07:00
Simon Willison
66e143c76e New hide_sql canned query option, refs #1422 2021-08-06 22:17:36 -07:00
Simon Willison
b7037f5ece Bit of breathing space on https://latest.datasette.io/fixtures/pragma_cache_size 2021-08-06 22:17:36 -07:00
Simon Willison
acc2243662
Quotes around '.[test]' for zsh 2021-08-05 08:47:18 -07:00
Simon Willison
a1f3830356 --cpu option for datasette publish cloudrun, closes #1420 2021-08-03 22:20:50 -07:00
Simon Willison
cd8b7bee8f Run codespell against datasette source code too, refs #1417 2021-08-03 10:03:48 -07:00
Simon Willison
2208c3c68e
Spelling corrections plus CI job for codespell
* Use codespell to check spelling in documentation, refs #1417
* Fixed spelling errors spotted by codespell, closes #1417
* Make codespell a docs dependency

See also this TIL:  https://til.simonwillison.net/python/codespell
2021-08-03 09:36:38 -07:00
Simon Willison
54b6e96ee8 Use optional rich dependency to render tracebacks, closes #1416 2021-08-03 09:12:48 -07:00
Simon Willison
a679d0de87 Fixed spelling of 'receive' in a bunch of places 2021-08-03 09:11:18 -07:00
Simon Willison
4adca0d850 No hidden SQL on canned query pages, closes #1411 2021-07-31 17:58:11 -07:00
Simon Willison
ff253f5242 Replace all uses of runner.isolated_filesystem, refs #1406 2021-07-31 11:49:08 -07:00
Simon Willison
96b1d0b7b4 Attempted fix for too-long UDS bug in #1407 2021-07-31 11:48:33 -07:00
Simon Willison
b46856391d pytest.mark.serial for any test using isolated_filesystem(), refs #1406 2021-07-30 16:46:41 -07:00
Simon Willison
e55cd9dc3f Try passing a directory to isolated_filesystem(), refs #1406 2021-07-29 18:16:58 -07:00
Simon Willison
74b775e20f Use consistent pattern for test before deploy, refs #1406 2021-07-29 17:50:45 -07:00
Simon Willison
2b1c535c12 pytest.mark.serial for any test using isolated_filesystem(), refs #1406 2021-07-29 17:44:16 -07:00
Simon Willison
121e10c29c Documentation and test for utils.parse_metadata(), closes #1405 2021-07-29 16:30:12 -07:00
Simon Willison
eccfeb0871 register_routes() plugin hook datasette argument, closes #1404 2021-07-26 16:16:46 -07:00
Simon Willison
6f1731f305
Updated cookiecutter installation link 2021-07-23 12:38:09 -07:00
Simon Willison
c73af5dd72 Release 0.58.1
Refs #1231, #1396
2021-07-16 12:50:06 -07:00
Simon Willison
c00f29affc
Fix for race condition in refresh_schemas(), closes #1231 2021-07-16 12:44:58 -07:00
Simon Willison
dd5ee8e668 Removed some unused imports
I found these with:

    flake8 datasette | grep unus
2021-07-15 23:26:12 -07:00
Simon Willison
721a8d3cd4 Hopeful fix for publish problem in #1396 2021-07-14 18:51:36 -07:00
Simon Willison
084cfe1e00
Removed out-of-date datasette serve help from README 2021-07-14 18:00:39 -07:00
Simon Willison
e27dd7c12c Release 0.58
Refs #1365, #1371, #1377, #1384, #1387, #1388, #1389, #1394
2021-07-14 17:33:04 -07:00
Simon Willison
7ea678db22
Warn about potential changes to get_metadata hook, refs #1384 2021-07-14 17:19:31 -07:00
Simon Willison
a6c8e7fa4c Big performance boost for faceting, closes #1394 2021-07-14 17:05:43 -07:00
Simon Willison
ba11ef27ed
Clarify when to use systemd restart 2021-07-13 22:43:13 -07:00
Simon Willison
2c4cd7141a
Consistently use /my-datasette in examples 2021-07-13 16:15:48 -07:00
Simon Willison
7f4c854db1
rST fix 2021-07-13 11:45:32 -07:00
Aslak Raanes
d71cac4981
How to configure Unix domain sockets with Apache
Example of how to use the Unix domain socket option with Apache. Not tested.

(Usually I would have used [`ProxyPassReverse`](https://httpd.apache.org/docs/current/mod/mod_proxy.html#proxypassreverse) in combination with `ProxyPass` , i.e.

```apache
ProxyPass /my-datasette/ http://127.0.0.1:8009/my-datasette/
ProxyPassReverse /my-datasette/ http://127.0.0.1:8009/my-datasette/
```

and

```apache
ProxyPass /my-datasette/ unix:/tmp/datasette.sock|http://localhost/my-datasette/
ProxyPassReverse /my-datasette/ unix:/tmp/datasette.sock|http://localhost/my-datasette/
```
)
2021-07-13 11:32:49 -07:00
Aslak Raanes
4054e96a39
Update deploying.rst (#1392)
Use same base url for Apache as in the example
2021-07-13 10:42:27 -07:00
dependabot[bot]
f83c84fd51
Update asgiref requirement from <3.4.0,>=3.2.10 to >=3.2.10,<3.5.0 (#1386)
Updates the requirements on [asgiref](https://github.com/django/asgiref) to permit the latest version.
- [Release notes](https://github.com/django/asgiref/releases)
- [Changelog](https://github.com/django/asgiref/blob/main/CHANGELOG.txt)
- [Commits](https://github.com/django/asgiref/commits)

---
updated-dependencies:
- dependency-name: asgiref
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-07-10 18:36:18 -07:00
Simon Willison
d792fc7cf5
Simplified nginx config examples 2021-07-10 17:29:42 -07:00
Simon Willison
de2a106328 Ran Black, refs #1388 2021-07-10 16:46:49 -07:00
Simon Willison
180c7a5328 --uds option for binding to Unix domain socket, closes #1388 2021-07-10 16:37:30 -07:00
Simon Willison
e0064ba7b0 Fixes for test_generated_columns_are_visible_in_datasette, refs #1391 2021-07-10 12:14:14 -07:00
Simon Willison
2e8d924cdc Refactored generated_columns test, no longer in fixtures.db - refs #1391 2021-07-10 12:03:19 -07:00
Simon Willison
83f6799a96 searchmode: raw table metadata property, closes #1389 2021-07-10 11:33:08 -07:00
Simon Willison
c8feaf0b62
systemctl restart datasette.service, closes #1390 2021-07-09 09:32:32 -07:00
Simon Willison
dbc61a1fd3
Documented ProxyPreserveHost On for Apache, closes #1387 2021-07-02 10:33:03 -07:00
Simon Willison
ea627baccf Removed fallback parameter from get_metadata, refs #1384 2021-06-26 17:02:42 -07:00
Simon Willison
0d339a4897 Removed text about executing SQL, refs #1384 2021-06-26 16:04:39 -07:00
Simon Willison
089278b8db rST fix, refs #1384 2021-06-26 15:49:07 -07:00
Simon Willison
05a312caf3 Applied Black, refs #1368 2021-06-26 15:25:28 -07:00
Brandon Roberts
baf986c871
New get_metadata() plugin hook for dynamic metadata
The following hook is added:

    get_metadata(
      datasette=self, key=key, database=database, table=table,
      fallback=fallback
    )

This gets called when we're building our metadata for the rest
of the system to use. We merge whatever the plugins return
with any local metadata (from metadata.yml/yaml/json), allowing
for a live-editable dynamic Datasette.

As a security precaution, local metadata is *not* overwritable by
plugin hooks. The workflow for transitioning to live metadata would
be to load the plugin with the full metadata.yaml and save.
Then remove the parts of the metadata that you want to be able
to change from the file.

* Avoid race condition: don't mutate databases list

This avoids the nasty "RuntimeError: OrderedDict mutated during
iteration" error that randomly happens when a plugin adds a
new database to Datasette, using `add_database`. This change
makes the add and remove database functions more expensive, but
it prevents the random explosion race conditions that make for
confusing user experience when importing live databases.

Thanks, @brandonrobertz
2021-06-26 15:24:54 -07:00
Simon Willison
953a64467d
Only publish stable docs on non-preview release
Refs https://github.com/simonw/datasette.io/issues/67
2021-06-24 09:42:02 -07:00
Simon Willison
ff17970ed4 Release 0.58a1
Refs #1365, #1377
2021-06-24 09:24:59 -07:00
Simon Willison
02b19c7a9a Removed rogue pdb=True, refs #1377 2021-06-23 15:50:48 -07:00
Simon Willison
b1fd24ac9f skip_csrf(datasette, scope) plugin hook, refs #1377 2021-06-23 15:40:09 -07:00
Simon Willison
4a3e8561ab Default 405 for POST, plus tests 2021-06-23 15:40:09 -07:00
Simon Willison
3a50015566
datasette-publish-now is now called datasette-publish-vercel 2021-06-23 12:51:19 -07:00
Simon Willison
403e370e5a
Fixed reference to default publish implementation 2021-06-23 12:50:19 -07:00
Simon Willison
7bc85b26d6 Deploy stable-docs.datasette.io on publish
Refs https://github.com/simonw/datasette.io/issues/67
2021-06-23 12:30:03 -07:00
Chris Amico
a6c55afe8c
Ensure db.path is a string before trying to insert into internal database (#1370)
Thanks, @eyeseast
2021-06-21 08:57:38 -07:00
dependabot[bot]
5335f360f4
Update pytest-xdist requirement from <2.3,>=2.2.1 to >=2.2.1,<2.4 (#1378)
Updates the requirements on [pytest-xdist](https://github.com/pytest-dev/pytest-xdist) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-xdist/releases)
- [Changelog](https://github.com/pytest-dev/pytest-xdist/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest-xdist/compare/v2.2.1...v2.3.0)

---
updated-dependencies:
- dependency-name: pytest-xdist
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-06-19 17:17:06 -07:00
dependabot[bot]
83e9c8bc75
Update trustme requirement from <0.8,>=0.7 to >=0.7,<0.9 (#1373)
Updates the requirements on [trustme](https://github.com/python-trio/trustme) to permit the latest version.
- [Release notes](https://github.com/python-trio/trustme/releases)
- [Commits](https://github.com/python-trio/trustme/compare/v0.7.0...v0.8.0)

---
updated-dependencies:
- dependency-name: trustme
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-06-13 08:38:47 -07:00
dependabot[bot]
e797565765
Bump black from 21.5b2 to 21.6b0 (#1374)
Bumps [black](https://github.com/psf/black) from 21.5b2 to 21.6b0.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/commits)

---
updated-dependencies:
- dependency-name: black
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-06-13 08:33:22 -07:00
Simon Willison
cd7678fde6 Release 0.58a0
Refs #1371
2021-06-09 21:51:14 -07:00
Simon Willison
d23a267138 Make request available to menu plugin hooks, closes #1371 2021-06-09 21:45:24 -07:00
Simon Willison
a3faf37883 Release 0.57.1
Refs #1364, #1367
2021-06-08 09:26:45 -07:00
Simon Willison
f4c5777c7e Fix visual glitch in nav menu, closes #1367 2021-06-07 11:24:14 -07:00
Simon Willison
03ec71193b Don't truncate list of columns on /db page, closes #1364 2021-06-06 15:07:45 -07:00
Simon Willison
030deb4b25 Try to handle intermittent FileNotFoundError in tests
Refs #1361
2021-06-05 16:02:03 -07:00
Simon Willison
0dfb924171
Temporarily reverting buildx support
I need to push a container for 0.57 using this action, and I'm not ready to ship other architecture builds until we have tested them in #1344.
2021-06-05 15:55:07 -07:00
Simon Willison
58746d3c51 Release 0.57
Refs #263, #615, #619, #1238, #1257, #1305, #1308, #1320, #1332, #1337, #1349, #1353, #1359, #1360
2021-06-05 15:06:55 -07:00
Simon Willison
8f311d6c1d Correctly escape output of ?_trace, refs #1360 2021-06-05 15:03:38 -07:00
Simon Willison
ff29dd55fa ?_trace=1 now depends on trace_debug setting, closes #1359 2021-06-05 13:18:37 -07:00
louispotok
368aa5f1b1
Update docs: explain allow_download setting (#1291)
* Update docs: explain allow_download setting

This fixes one possible source of confusion seen in #502 and clarifies
when database downloads will be shown and allowed.
2021-06-05 12:48:51 -07:00
Simon Willison
a634121525
Make custom pages compatible with base_url setting
Closes #1238

- base_url no longer causes custom page routing to fail
- new route_path key in request.scope storing the path that was used for routing with the base_url prefix stripped
- TestClient used by tests now avoids accidentally double processing of the base_url prefix
2021-06-05 11:59:54 -07:00
Simon Willison
6e9b07be92 More inclusive language 2021-06-02 21:45:03 -07:00
Simon Willison
f78ebdc045
Better "uploading and publishing your own CSV data" link 2021-06-02 10:00:30 -07:00
Simon Willison
d5d387abfe Applied Black, refs #1305 2021-06-01 21:30:44 -07:00
Simon Willison
80d8b0eb41 Test demonstrating fixed #1305, refs #1306 2021-06-01 21:26:25 -07:00
Guy Freeman
0f41db1ba8
Avoid error sorting by relationships if related tables are not allowed
Refs #1306
2021-06-01 21:25:27 -07:00
Simon Willison
f40d1b99d6 Don't show '0 results' on error page, refs #619 2021-06-01 21:09:10 -07:00
Simon Willison
ea5b237800 Show error message on bad query, closes #619 2021-06-01 20:59:29 -07:00
Simon Willison
9552414e1f
Re-display user's query with an error message if an error occurs (#1346)
* Ignore _shape when returning errors
2021-06-01 20:46:20 -07:00
Simon Willison
0f1e47287c Fixed bug with detect_fts for table with single quote in name, closes #1257 2021-06-01 20:27:04 -07:00
Simon Willison
807de378d0 /-/databases and homepage maintain connection order, closes #1216 2021-06-01 20:10:15 -07:00
dependabot[bot]
03b35d70e2
Bump black from 21.5b1 to 21.5b2 (#1352)
Bumps [black](https://github.com/psf/black) from 21.5b1 to 21.5b2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/commits)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-06-01 19:56:44 -07:00
Simon Willison
0539bf0816 Don't execute facets/counts for _shape=array or object, closes #263 2021-06-01 19:53:00 -07:00
Simon Willison
a18e8641bc Don't reflect nofacet=1 and nocount=1 in BLOB URLs, refs #1353 2021-06-01 15:35:33 -07:00
Simon Willison
ff45ed0ce5 Updated --help output for latest Click, closes #1354 2021-06-01 09:16:58 -07:00
Simon Willison
fd368d3b2c New _nocount=1 option, used to speed up CSVs - closes #1353 2021-06-01 09:12:32 -07:00
Simon Willison
8bde6c5461 Rename ?_nofacets=1 to ?_nofacet=1, refs #1353 2021-06-01 08:56:00 -07:00
Simon Willison
d1d06ace49 ?_trace=1 for CSV, plus ?_nofacets=1 when rendering CSV
Closes #1351, closes #1350
2021-06-01 08:49:50 -07:00
Simon Willison
c5ae1197a2 ?_nofacets=1 option, closes #1350 2021-05-30 22:39:14 -04:00
Simon Willison
f7d3e76fb3 Facets now execute ignoring ?_col and ?_nocol, fixes #1345 2021-05-30 22:31:14 -04:00
Simon Willison
7b106e1060 Release 0.57a1
Refs #1319, #1320, #1331, #1337, #1338, #1341
2021-05-27 09:54:21 -07:00
Blair Drummond
89822d10be
Docker multi-arch support with Buildx (#1319)
Thanks, @blairdrummond
2021-05-27 09:49:23 -07:00
Simon Willison
1a8972f9c0
Upgrade Heroku runtime to python-3.8.10 2021-05-27 09:11:03 -07:00
Simon Willison
4545120c92 Test and docs for ?_facet_size=max, refs #1337 2021-05-27 09:04:26 -07:00
Simon Willison
7e983fede6 ?_facet_size=max, ... now links to that, closes #1337
Refs #1332
2021-05-27 09:00:58 -07:00
Simon Willison
51d7881140 'Show all columns' menu item if any _col= set, closes #1341
Refs #615
2021-05-26 21:31:12 -07:00
Simon Willison
f1c29fd6a1
?_col=/?_nocol= to show/hide columns on the table page
Closes #615

* Cog icon for hiding columns
* Show all columns cog menu item
* Do not allow hide column on primary keys
* Allow both ?_col= and ?_nocol=
* De-duplicate if ?_col= passed multiple times
* 400 error if user tries to ?_nocol= a primary key
* Documentation for ?_col= and ?_nocol=
2021-05-26 21:17:43 -07:00
Simon Willison
c0a748e5c3
Markup fix, refs #1320 2021-05-24 11:15:15 -07:00
Simon Willison
56af118fc1
How to apt-get install in Docker container, refs #1320 2021-05-24 11:14:45 -07:00
Simon Willison
fc972350a8 Docker image should now allow apt-get install, closes #1320 2021-05-24 11:07:03 -07:00
Simon Willison
eae3084b46 Fixed another Jinja warning, refs #1338 2021-05-24 10:52:09 -07:00
Simon Willison
2bd9d54b27 Fix Jinja warnings, closes #1338, refs #1331 2021-05-23 18:41:50 -07:00
Simon Willison
a443dba82f Release 0.57a0
Refs #1281, #1282, #1289, #1290, #1308, #1313, #1314, #1321, #1323, #1325, #1330, #1332, #1335
2021-05-22 17:45:54 -07:00
Simon Willison
9789b94da4 ?_facet_size=100 parameter, closes #1332 2021-05-22 17:34:33 -07:00
dependabot[bot]
5e9672c9bb
Bump black from 21.4b2 to 21.5b1 (#1321)
Bumps [black](https://github.com/psf/black) from 21.4b2 to 21.5b1.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/main/CHANGES.md)
- [Commits](https://github.com/psf/black/commits)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-05-22 16:55:39 -07:00
dependabot[bot]
5c3b3ef97e
Update click requirement from ~=7.1.1 to >=7.1.1,<8.1.0 (#1323)
Updates the requirements on [click](https://github.com/pallets/click) to permit the latest version.
- [Release notes](https://github.com/pallets/click/releases)
- [Changelog](https://github.com/pallets/click/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/click/compare/7.1.1...8.0.0)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-05-22 16:54:48 -07:00
dependabot[bot]
b64d872046
Update itsdangerous requirement from ~=1.1 to >=1.1,<3.0 (#1325)
Updates the requirements on [itsdangerous](https://github.com/pallets/itsdangerous) to permit the latest version.
- [Release notes](https://github.com/pallets/itsdangerous/releases)
- [Changelog](https://github.com/pallets/itsdangerous/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/itsdangerous/compare/1.1.0...2.0.0)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-05-22 16:54:24 -07:00
dependabot[bot]
593d3e8173
Update aiofiles requirement from <0.7,>=0.4 to >=0.4,<0.8 (#1330)
Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
- [Release notes](https://github.com/Tinche/aiofiles/releases)
- [Commits](https://github.com/Tinche/aiofiles/compare/v0.4.0...v0.7.0)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-05-22 16:53:56 -07:00
Abdussamet Koçak
459259175e
Fix small typo (#1335) 2021-05-22 16:53:34 -07:00
dependabot[bot]
9b3b7e280c
Update jinja2 requirement from <2.12.0,>=2.10.3 to >=2.10.3,<3.1.0 (#1324)
Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/2.10.3...3.0.0)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-05-17 10:19:40 -07:00
dependabot-preview[bot]
1b697539f5
Bump black from 20.8b1 to 21.4b2 (#1313)
Bumps [black](https://github.com/psf/black) from 20.8b1 to 21.4b2.
- [Release notes](https://github.com/psf/black/releases)
- [Changelog](https://github.com/psf/black/blob/master/CHANGES.md)
- [Commits](https://github.com/psf/black/commits)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2021-04-29 08:47:49 -07:00
dependabot-preview[bot]
5e60bad404
Upgrade to GitHub-native Dependabot (#1314)
Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2021-04-29 08:47:21 -07:00
Simon Willison
a4bb2abce0 Show primary key cells in bold without affecting columns called 'link', closes #1308 2021-04-23 23:07:37 -07:00
dependabot-preview[bot]
6ed9238178
Update pytest-asyncio requirement from <0.15,>=0.10 to >=0.10,<0.16 (#1303)
Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Commits](https://github.com/pytest-dev/pytest-asyncio/compare/v0.10.0...v0.15.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2021-04-19 11:18:17 -07:00
Simon Willison
0a7621f96f
Use pytest-xdist to speed up tests (#1290)
* Run tests in CI using pytest-xdist
* Documentation for pytest-xdist

Closes #1289
2021-04-02 20:42:28 -07:00
Simon Willison
59ef4a20cb
© 2017-2021 2021-04-02 13:27:03 -07:00
Simon Willison
87b583a128 Clearer help text for --reload
Immutable databases are not commonly used, but it's useful to clarify
that --reload will pick up on changes to metadata.
2021-04-02 13:20:51 -07:00
Marjorie Roswell
7b1a9a1999
Fix little typo (#1282) 2021-03-29 12:57:34 -07:00
Simon Willison
0486303b60 Explicitly push version tag, refs #1281 2021-03-28 18:42:42 -07:00
Simon Willison
8291065b13 Hopeful fix for Docker tag error, refs #1281 2021-03-28 18:39:02 -07:00
Simon Willison
849c4f06ea Workflow for manually pushing a Docker tag, refs #1281 2021-03-28 18:36:07 -07:00
Simon Willison
13fd9bdf01
docker push --all-tags, refs #1281 2021-03-28 18:07:49 -07:00
Simon Willison
af5a7f1c09 Release 0.56
Refs #1005, #1031, #1141, #1229, #1236, #1239, #1246, #1247, #1252, #1266, #1276, #1278
2021-03-28 17:41:12 -07:00
Simon Willison
d579fcf4f7 Applied some fixes suggested by @withshubh in #1260 2021-03-28 17:20:55 -07:00
Campbell Allen
f92d823766
Ensure immutable databases when starting in configuration directory mode (#1229)
* check if immutables is an empty list or None
* update docs on how to create the inspect-data.json file
2021-03-28 17:17:31 -07:00
Bob Whitelock
e72397d65b
Add styling to lists within table cells (fixes #1141) (#1252)
This overrides the Datasette reset (see
d0fd833b8c/datasette/static/app.css (L35-L38)),
to add back the default styling of list items displayed within Datasette
table cells.
2021-03-28 17:14:04 -07:00
vincent d warmerdam
c96a3826cf
Added --app to fly install command. (#1279) 2021-03-28 17:11:55 -07:00
Simon Willison
48d5e0e6ac Fix for no such table: pragma_database_list, refs #1276 2021-03-28 16:44:29 -07:00
Simon Willison
3fcfc85134 Fix links in SpatiaLite tutorial, closes #1278 2021-03-27 09:16:45 -07:00
Simon Willison
8ebdcc916d Remove obsolete note about building SpatiaLite from source, refs #1249 2021-03-26 21:33:15 -07:00
Simon Willison
5fd0289065 Build Dockerfile with SpatiaLite 5, refs #1249 2021-03-26 21:27:40 -07:00
Simon Willison
6ad544df5e Fixed master -> main in a bunch of places, mainly docs 2021-03-23 09:19:41 -07:00
Simon Willison
c4f1ec7f33 Documentation for Response.asgi_send(), closes #1266 2021-03-20 14:32:23 -07:00
Konstantin Baikov
8e18c79431
Use context manager instead of plain open (#1211)
Using open as a context manager closes the files after use.

When the object is already a pathlib.Path, the read_text and
write_text functions are used instead.

In some cases pathlib.Path.open was used in a context manager,
which is basically the same as the builtin open.

Thanks, Konstantin Baikov!
2021-03-11 08:15:49 -08:00
Jean-Baptiste Pressac
a1bcd2fbe5
Minor typo in IP adress (#1256)
127.0.01 replaced by 127.0.0.1
2021-03-10 10:26:39 -08:00
Bob Whitelock
d0fd833b8c
Add compile option to Dockerfile to fix failing test (fixes #696) (#1223)
This test was failing when run inside the Docker container:
`test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]`,

with this error:

```
    def test_searchable(app_client, path, expected_rows):
        response = app_client.get(path)
>       assert expected_rows == response.json["rows"]
E       AssertionError: assert [[1, 'barry c...sel', 'puma']] == []
E         Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther']
E         Full diff:
E         + []
E         - [[1, 'barry cat', 'terry dog', 'panther'],
E         -  [2, 'terry dog', 'sara weasel', 'puma']]
```

The issue was that the version of sqlite3 built inside the Docker
container was built with FTS3 and FTS4 enabled, but without the
`SQLITE_ENABLE_FTS3_PARENTHESIS` compile option passed, which adds
support for using `AND` and `NOT` within `match` expressions (see
https://sqlite.org/fts3.html#compiling_and_enabling_fts3_and_fts4 and
https://www.sqlite.org/compile.html).

Without this, the `AND` used in the search in this test was being
interpreted as a literal string, and so no matches were found. Adding
this compile option fixes this.

Thanks, @bobwhitelock
2021-03-06 23:41:17 -08:00
David Boucha
4f9a2f1f47
Fix small typo (#1243)
Thanks, @UtahDave
2021-03-03 21:46:10 -08:00
Simon Willison
7c87532acc New .add_memory_database() method, closes #1247 2021-02-28 20:02:18 -08:00
Simon Willison
47eb885cc2 JSON faceting now suggested even if column has blank strings, closes #1246 2021-02-28 19:44:04 -08:00
Simon Willison
cc6774cbaa Upgrade httpx and remove xfail from tests, refs #1005 2021-02-28 14:34:44 -08:00
Simon Willison
afed51b1e3 Note about where to find plugin examples, closes #1244 2021-02-26 09:27:09 -08:00
Simon Willison
726f781c50 Fix for arraycontains bug, closes #1239 2021-02-22 16:22:47 -08:00
Simon Willison
42caabf7e9
Fixed typo 2021-02-22 09:35:41 -08:00
Simon Willison
1f9cca33b4 Resizable SQL editor using cm-resize, refs #1236 2021-02-19 15:47:52 -08:00
Simon Willison
cb8a293bd7 Release 0.55
Refs #1205, #1207, #1214, #1221, #1226, #1227, #1232, #1235
2021-02-18 18:01:06 -08:00
Simon Willison
a4239309b1 Bump Dockerfile to using Python 3.7.10, closes #1235 2021-02-18 17:48:20 -08:00
Simon Willison
73bed17563
Corrected documentation for datasette.urls.static_plugins 2021-02-18 15:25:01 -08:00
Simon Willison
6f41c8a2be
--crossdb option for joining across databases (#1232)
* Test for cross-database join, refs #283
* Warn if --crossdb used with more than 10 DBs, refs #283
* latest.datasette.io demo of --crossdb joins, refs #283
* Show attached databases on /_memory page, refs #283
* Documentation for cross-database queries, refs #283
2021-02-18 14:09:12 -08:00
Simon Willison
4df548e766 Update documentation, refs #1226 2021-02-18 10:32:04 -08:00
Simon Willison
5af2b99111
Create FUNDING.yml 2021-02-18 10:22:01 -08:00
Simon Willison
36a44bffbf Validation for --port, closes #1226 2021-02-18 10:05:27 -08:00
Simon Willison
d2d53a5559 New :issue: Sphinx macro, closes #1227 2021-02-17 17:20:15 -08:00
Simon Willison
9603d893b9 Tests for --ssl-keyfile and --ssl-certfile, refs #1221 2021-02-11 16:53:20 -08:00
Simon Willison
eda652cf6e
--ssl-keyfile and --ssl-certfile options to "datasette serve"
Closes #1221
2021-02-11 16:52:16 -08:00
Simon Willison
aa1fe0692c
Updated demo and video links 2021-02-07 19:27:02 -08:00
Simon Willison
3a3de76009 Release 0.54.1
Refs #1214
2021-02-02 13:24:05 -08:00
Simon Willison
7a2ed9f8a1 Fixed bug with ?_sort= and ?_search=, closes #1214 2021-02-02 13:21:03 -08:00
Simon Willison
beb98bf454
Fixed typo in code example 2021-01-31 00:49:09 -08:00
Simon Willison
dde3c500c7
Using pdb for errors thrown inside Datasette
Closes #1207
2021-01-28 18:12:32 -08:00
Simon Willison
1600d2a3ec Renamed /:memory: to /_memory, with redirects - closes #1205 2021-01-28 14:48:56 -08:00
Simon Willison
382e9ecd1d Removed a rogue full-stop 2021-01-25 09:35:06 -08:00
Simon Willison
0b9ac1b2e9
Release 0.54
Refs #509, #1091, #1150, #1151, #1166, #1167, #1178, #1181, #1182, #1184, #1185, #1186, #1187, #1194, #1198
2021-01-25 09:33:29 -08:00
Simon Willison
a5ede3cdd4 Fixed bug loading database called 'test-database (1).sqlite'
Closes #1181.

Also now ensures that database URLs have special characters URL-quoted.
2021-01-24 21:13:05 -08:00
Simon Willison
07e1635615 All ?_ parameters now copied to hidden form fields, closes #1194 2021-01-24 19:10:10 -08:00
Simon Willison
f3a1555318 Contributing docs for Black and Prettier, closes #1167
Refs #1203
2021-01-24 17:58:15 -08:00
Simon Willison
ffff3a4c53
Easier way to run Prettier locally (#1203)
Thanks, Ben Pickles - refs #1167
2021-01-24 17:41:46 -08:00
Simon Willison
b6a7b58fa0 Initial docs for _internal database, closes #1154 2021-01-24 16:08:29 -08:00
Simon Willison
f78e956eca Plugin testing documentation on using pytest-httpx
Closes #1198
2021-01-24 12:38:29 -08:00
Simon Willison
25c2933667 publish heroku now uses python-3.8.7 2021-01-22 16:46:25 -08:00
Simon Willison
5378f02352
Better tool for extracting issue numbers 2021-01-19 12:50:12 -08:00
Simon Willison
57f4d7b82f Release 0.54a0
Refs #1091, #1145, #1151, #1156, #1157, #1158, #1166, #1170, #1178, #1182, #1184, #1185, #1186, #1187
2021-01-19 12:47:30 -08:00
Simon Willison
7e3cfd9cf7
Clarify the name of plugin used in /-/static-plugins/ 2021-01-19 12:27:45 -08:00
Simon Willison
c38c42948c extra_body_script module support, closes #1187 2021-01-13 18:14:33 -08:00
Simon Willison
fa0c3777b8 script type=module support, closes #1186 2021-01-13 17:50:52 -08:00
Simon Willison
640ac7071b Better PRAGMA error message, closes #1185 2021-01-12 14:26:19 -08:00
Simon Willison
8e8fc5cee5 Applied Black 2021-01-11 13:34:38 -08:00
Simon Willison
ef2ecc1b89 Standardize on 'query string', not 'querystring', in docs
The request property is request.query_string so this is more consistent.
2021-01-11 13:33:54 -08:00
Simon Willison
649f48cd70 request.full_path property, closes #1184 2021-01-11 13:32:58 -08:00
Simon Willison
ed15c9908e Shrunk ecosystem docs in favour of datasette.io, closes #1182 2021-01-09 14:17:18 -08:00
Simon Willison
faa76390a0 Fixed bug introduced in e1efa9b7, refs #1178 2021-01-07 16:01:01 -08:00
Simon Willison
4c0995ed60
Fixed bug in example nginx config, refs #1091 2021-01-07 15:42:14 -08:00
Simon Willison
97fb10c17d Applied Black, refs #1178 2021-01-06 10:22:20 -08:00
Simon Willison
e1efa9b7a3 force_https_urls on for publish cloudrun, refs #1178 2021-01-06 10:13:34 -08:00
Simon Willison
ab7767acbe
tmate session mac
So I can test https://github.com/simonw/datasette/issues/93
2021-01-04 13:31:55 -08:00
Ben Pickles
3054e0f730
Install Prettier via package.json (#1170)
* Error if Prettier isn't already installed
* Temporarily run Prettier check on every commit
* Install and run Prettier via package.json
* Trigger another prettier check on CI
2021-01-04 11:52:33 -08:00
Simon Willison
1e8fa3ac7c
Only run prettier on changes to datasette/static
Refs #1166
2021-01-01 13:45:55 -08:00
Simon Willison
a93a65b027
Fixed Prettier formatting, closes #1166 2020-12-31 13:46:32 -08:00
Simon Willison
80870911de
Trying out bad formatting, refs #1166 2020-12-31 13:44:47 -08:00
Simon Willison
9cbc099492
GitHub Actions workflow for Prettier, refs #1166 2020-12-31 13:42:14 -08:00
Simon Willison
5193d0b3e4 Apply prettier to table.js, refs #1166 2020-12-31 13:27:39 -08:00
Simon Willison
1a513ed092 Ignore node_modules 2020-12-31 13:26:37 -08:00
Simon Willison
03933b3084 .prettierrc, refs #1166 2020-12-31 13:25:44 -08:00
Simon Willison
6705560148 Refactor out sqlite_extensions option 2020-12-29 14:16:05 -08:00
Simon Willison
8df116b24c sqlite-utils now lives at sqlite-utils.datasette.io 2020-12-29 13:38:53 -08:00
Miroslav Šedivý
a882d67962
Modernize code to Python 3.6+ (#1158)
* Compact dict and set building
* Remove redundant parentheses
* Simplify chained conditions
* Change method name to lowercase
* Use triple double quotes for docstrings

Thanks, @eumiro!
2020-12-23 09:04:32 -08:00
Simon Willison
90eba4c3ca Prettier CREATE TABLE SQL for _internal 2020-12-22 15:55:43 -08:00
Simon Willison
8919f99c2f Improved .add_database() method design
Closes #1155 - _internal now has a sensible name

Closes #509 - Support opening multiple databases with the same stem
2020-12-22 12:04:18 -08:00
Simon Willison
270de6527b Foreign keys for _internal database
Refs #1099 - Datasette now uses compound foreign keys internally,
so it would be great to link them correctly.
2020-12-22 11:48:54 -08:00
Simon Willison
bc1f1e1ce8 Compound primary key for foreign_keys table in _internal 2020-12-22 11:04:29 -08:00
Simon Willison
810853c5f2 Use time.perf_counter() instead of time.time(), closes #1157 2020-12-21 13:49:14 -08:00
Simon Willison
dcdfb2c301 Rename _schemas to _internal, closes #1156 2020-12-21 11:48:06 -08:00
Simon Willison
ebc7aa287c In-memory _schemas database tracking schemas of attached tables, closes #1150 2020-12-18 14:34:05 -08:00
Simon Willison
5e9895c67f Database(memory_name=) for shared in-memory databases, closes #1151 2020-12-17 17:01:18 -08:00
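Named shared in-memory databases rest on SQLite's shared-cache URI syntax, which lets multiple connections open the same in-memory database by name — the mechanism a feature like `Database(memory_name=...)` can build on. A sketch using the standard library directly (the database name `shared_demo` is illustrative; this is not Datasette's implementation):

```python
import sqlite3

# Two independent connections to one named in-memory database. The
# database lives as long as at least one connection stays open.
uri = "file:shared_demo?mode=memory&cache=shared"
conn_a = sqlite3.connect(uri, uri=True)
conn_b = sqlite3.connect(uri, uri=True)

conn_a.execute("CREATE TABLE notes (body TEXT)")
conn_a.execute("INSERT INTO notes VALUES ('hello')")
conn_a.commit()

# The second connection sees data written by the first
rows = conn_b.execute("SELECT body FROM notes").fetchall()
print(rows)  # [('hello',)]
```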
dependabot-preview[bot]
6119bd7973
Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0 (#1145)
Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.2...6.2.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-12-16 13:44:39 -08:00
Simon Willison
0c616f732c Release 0.53
Refs #1132, #1135, #1133, #1138, #1137
2020-12-10 17:44:36 -08:00
Simon Willison
02bb373194 Updated release process 2020-12-10 17:38:16 -08:00
Simon Willison
967cc05545 Powered by links to datasette.io, closes #1138 2020-12-10 15:37:08 -08:00
Simon Willison
2c0aca4887 _header=off option for CSV export, closes #1133 2020-12-10 15:28:44 -08:00
Simon Willison
7ef80d0145 News is now on datasette.io/news
Closes #1137, closes #659
2020-12-10 15:24:16 -08:00
Simon Willison
e0b54d0911 No longer using Wiki for examples 2020-12-10 15:20:43 -08:00
Simon Willison
4c6407cd74 Releasing bug fixes from a branch, closes #1136 2020-12-09 12:14:33 -08:00
Simon Willison
387b471b88 Release 0.52.5
Refs #1134
2020-12-09 12:13:14 -08:00
Simon Willison
6000d1a724 Fix for combining ?_search_x and ?_searchmode=raw, closes #1134 2020-12-09 11:56:44 -08:00
Simon Willison
fe86d85308 datasette serve --create option, closes #1135 2020-12-09 11:45:45 -08:00
Simon Willison
4c25b035b2 arraynotcontains filter, closes #1132 2020-12-07 14:41:03 -08:00
Simon Willison
8ae0f9f7f0
Fixed spelling of Janary 2020-12-07 12:16:13 -08:00
Simon Willison
62a6f70c64
Fixed Markdown indentation of news
To make it easier to programmatically extract.
2020-12-07 12:10:05 -08:00
Simon Willison
e3143700a2 Custom template for docs, linking to datasette.io 2020-12-07 11:00:10 -08:00
Simon Willison
e5930e6f88 Typo fix in release notes 2020-12-05 11:42:42 -08:00
Simon Willison
2dc281645a Release 0.52.4
Refs #1125, #1131, #1094
2020-12-05 11:41:40 -08:00
Abdussamet Koçak
705d1a1555
Fix startup error on windows (#1128)
Fixes https://github.com/simonw/datasette/issues/1094

This import isn't used at all, and causes an error on startup on Windows.
2020-12-05 11:35:03 -08:00
Simon Willison
eae103a82b Write errors to stderr, closes #1131 2020-12-04 21:21:11 -08:00
Simon Willison
42efb799ea Fixed invalid test for generated columns, refs #1119 2020-12-04 21:20:12 -08:00
Simon Willison
49d8fc0568 Try pysqlite3-binary version as well, refs #1125 2020-12-03 20:07:16 -08:00
Simon Willison
e2fea36540
Switch to google-github-actions/setup-gcloud - refs #1126 2020-12-03 19:12:33 -08:00
Simon Willison
00185af74a Show pysqlite3 version on /-/versions, if installed - #1125 2020-12-03 14:08:50 -08:00
Simon Willison
4cce551666 Release 0.52.3
Refs #1124
2020-12-03 11:07:05 -08:00
Simon Willison
ca6e8e53dc More helpful 404 messages, refs #1124 2020-12-03 11:05:12 -08:00
Simon Willison
63efcb35ce More tweaks to root_path handling, refs #1124 2020-12-03 11:02:53 -08:00
Simon Willison
6b4c55efea Fix for Amazon Linux static assets 404ing, refs #1124 2020-12-03 10:53:26 -08:00
Simon Willison
e048791a9a Release 0.52.2
Refs #1116, #1115, #1100, #749, #1121
2020-12-02 16:57:40 -08:00
Simon Willison
13c960c03b Test is no longer order dependent, closes #1123 2020-12-02 16:49:55 -08:00
Simon Willison
a45a3dff3e Fix for OPTIONS request against /db, closes #1100 2020-12-02 16:49:55 -08:00
Abdussamet Koçak
daae35be46
Fix misaligned table actions cog
Closes #1121. Thanks, @abdusco
2020-12-02 16:33:36 -08:00
Simon Willison
88ac538b41 transfer-encoding: chunked for DB downloads, refs #749
This should get >32MB downloads working on Cloud Run.
2020-12-02 15:47:37 -08:00
Simon Willison
a970276b99
Try pysqlite3 on latest.datasette.io
--install=pysqlite3-binary to get a working demo of generated columns, refs #1119
2020-11-30 17:19:09 -08:00
Simon Willison
17cbbb1f7f
generated_columns table in fixtures.py, closes #1119 2020-11-30 16:28:02 -08:00
Simon Willison
461670a0b8
Support for generated columns
* Support for generated columns, closes #1116
* Show SQLite version in pytest report header
* Use table_info() if SQLite < 3.26.0
* Cache sqlite_version() rather than re-calculate every time
* Adjust test_database_page for SQLite 3.26.0 or higher
2020-11-30 13:29:57 -08:00
Simon Willison
49b6297fb7 Typo fix: messagge_is_html, closes #1118 2020-11-30 13:24:23 -08:00
Simon Willison
dea3c508b3 Revert "Support for generated columns, closes #1116" - it failed CI
This reverts commit 37f87b5e52.
2020-11-30 12:09:32 -08:00
Simon Willison
37f87b5e52 Support for generated columns, closes #1116 2020-11-30 12:01:15 -08:00
Simon Willison
c745c2715a Moved comment for clarity 2020-11-29 12:27:34 -08:00
Simon Willison
4777362bf2 Work around CI bug with ensure_eventloop, refs #1115 2020-11-29 12:19:24 -08:00
Simon Willison
09033c08be Suggest --load-extension=spatialite, closes #1115 2020-11-29 12:13:16 -08:00
Simon Willison
242bc89fdf Release 0.52.1
Refs #1098, #1102, #1114
2020-11-29 11:38:29 -08:00
Simon Willison
deb0be4ae5 Fix bug where compound foreign keys produced broken links, closes #1098 2020-11-29 11:30:17 -08:00
Simon Willison
e800ffcf7c
/usr/local/lib/mod_spatialite.so
Closes #1114
2020-11-29 09:37:43 -08:00
Simon Willison
12877d7a48
Plugin testing docs now recommend datasette.client, closes #1102 2020-11-28 23:44:57 -08:00
Simon Willison
a8e66f9065 Release 0.52
Refs #992, #1103, #1104, #1107, #1077, #1110, #1089, #1086, #1088, #1084
2020-11-28 15:54:35 -08:00
Simon Willison
50cc6af016 Fixed some broken internal links, refs #1106 2020-11-28 15:34:56 -08:00
Jeff Triplett
bbde835a1f
Fix --metadata doc usage (#1112)
Thanks, @jefftriplett.
2020-11-28 11:53:48 -08:00
Simon Willison
37d18a5bce datasette publish cloudrun --apt-get-install, closes #1110 2020-11-24 19:05:35 -08:00
Simon Willison
f2e2bfcdd9 Renamed datasette.config() to .setting(), closes #1107 2020-11-24 14:06:32 -08:00
Simon Willison
5a77f7a649 Updated docs renaming config to settings
- config.html is now settings.html
- ConfigOption in app.py is now Setting
- updated documentation unit tests

Refs #1106
2020-11-24 13:22:33 -08:00
Simon Willison
33eadb8782 config.json is now settings.json, closes #1104 2020-11-24 12:37:29 -08:00
Simon Willison
2a3d5b720b Redirect /-/config to /-/settings, closes #1103 2020-11-24 12:19:14 -08:00
Simon Willison
3159263f05 New --setting to replace --config, closes #992 2020-11-24 12:01:47 -08:00
Simon Willison
4bac9f18f9 Fix off-screen action menu bug, refs #1084 2020-11-21 15:33:04 -08:00
Simon Willison
30e64c8d3b
Use f-strings in place of .format()
Code transformed like so:

    pip install flynt
    flynt .
    black .
2020-11-15 15:24:22 -08:00
Simon Willison
6fd35be64d
Fixed invalid JSON in example 2020-11-15 08:45:26 -08:00
Simon Willison
200284e1a7
Clarified how --plugin-secret works 2020-11-15 08:43:13 -08:00
Simon Willison
5eb8e9bf25 Removed words that minimize involved difficulty, closes #1089 2020-11-12 12:07:19 -08:00
Simon Willison
253f2d9a3c Use correct QueryInterrupted exception on row page, closes #1088 2020-11-11 20:36:44 -08:00
Simon Willison
e8e0a6f284
Use FTS4 in fixtures
Closes #1081
2020-11-11 16:02:58 -08:00
Simon Willison
2a981e2ac1 Blank foreign key labels now show as hyphens, closes #1086 2020-11-11 15:44:04 -08:00
Simon Willison
13d1228d80
/dbname/tablename/-/modify-table-schema is OK after all
Refs #1053, #296
2020-11-02 12:02:50 -08:00
Simon Willison
d6257e3a7b Add database/table actions to pattern portfolio
Refs #1066, #1077
2020-11-02 10:53:52 -08:00
Simon Willison
7b19492070 database_actions() plugin hook, closes #1077 2020-11-02 10:27:25 -08:00
Simon Willison
b61f6cceb5 Add nav menu to pattern portfolio 2020-11-01 09:22:13 -08:00
Simon Willison
59b252a0c0
Link to annotated release notes for 0.51 2020-10-31 21:45:42 -07:00
Simon Willison
4785172bbc Release 0.51.1 2020-10-31 20:33:47 -07:00
Simon Willison
7788d62fa6
Expanded the Binary plugins section 2020-10-31 20:28:16 -07:00
Simon Willison
f0bd2d05f5 Link to global-power-plants demo instead of sf-trees 2020-10-31 15:24:54 -07:00
Simon Willison
d53d747e6a Release 0.51
Refs #1014, #1016, #1019, #1023, #1027, #1028, #1033, #1034, #1036, #1039

Closes #1076
2020-10-31 15:21:49 -07:00
Simon Willison
fa4de7551c Binary data documentation, closes #1047 2020-10-31 14:37:58 -07:00
Simon Willison
1fe15f4dc1 Docs: Running Datasette behind a proxy, closes #1027 2020-10-31 14:13:57 -07:00
Simon Willison
6bb41c4b33 Fix for test_paginate_using_link_header 2020-10-31 13:48:39 -07:00
Simon Willison
a4ca26a265 Address PrefixedUrlString bug in #1075 2020-10-31 13:35:47 -07:00
Simon Willison
bf18b9ba17 Stop using plugin-example.com, closes #1074 2020-10-31 12:47:42 -07:00
Simon Willison
84bc7244c1 datasette.client now applies base_url, closes #1026 2020-10-31 12:29:42 -07:00
Simon Willison
7a67bc7a56 datasette.urls methods will not apply base_url prefix twice, refs #1026 2020-10-31 12:11:40 -07:00
Simon Willison
c1d386ef67 Refactor Urls into url_builder.py
Refs #1026
2020-10-31 11:43:36 -07:00
Simon Willison
11eb1e026f datasette.urls.table(..., format="json"), closes #1035
Also improved tests for datasette.urls and added format= to some other methods
2020-10-31 11:16:28 -07:00
Simon Willison
b84cfe1b08 Confirm table actions work on views, closes #1067 2020-10-31 10:40:09 -07:00
Simon Willison
d6db47f5c1 Deploy demo plugins to latest.datasette.io, refs #1074 2020-10-31 10:36:46 -07:00
Simon Willison
f0a740ac21 Remove load_plugin hook - closes #1073
Refs #1042

This reverts commit 81dea4b07a.
2020-10-31 09:21:22 -07:00
Simon Willison
a2a7090720 Display messages in right place, closes #1071 2020-10-30 13:12:57 -07:00
Simon Willison
393f1b49d7 Updated nav in pattern portfolio 2020-10-30 13:12:01 -07:00
Simon Willison
59ab24af6b Release 0.51a2
Refs #1068, #1042, #1054
2020-10-30 10:56:02 -07:00
Simon Willison
0cb29498c7 Fixed bug with python tests/fixtures.py
https://github.com/simonw/datasette/runs/1333357885?check_suite_focus=true
2020-10-30 10:54:47 -07:00
Simon Willison
a7d9e24ece Update release process with explicit version, refs #1054 2020-10-30 10:52:45 -07:00
Simon Willison
81dea4b07a
load_template() plugin hook
Closes #1042
2020-10-30 10:47:18 -07:00
Simon Willison
fcf43589eb Link to homepage in nav on show-json page 2020-10-30 08:54:01 -07:00
Simon Willison
222f79bb4c debug-menu permission, closes #1068
Also added tests for navigation menu logic.
2020-10-30 08:41:57 -07:00
Simon Willison
9f0987cb57 cursor: pointer; on the new menu icons
Refs #1064, #1066
2020-10-29 22:55:10 -07:00
Simon Willison
0e1e89c6ba Release 0.51a1
Refs #1056, #1039, #998, #1045, #1033, #1036, #1034, #976, #1057, #1058, #1053, #1064, #1066
2020-10-29 22:35:23 -07:00
Simon Willison
2f7731e9e5 table_actions() plugin hook plus menu, closes #1066
Refs #690
2020-10-29 22:16:41 -07:00
Simon Willison
8a4639bc43 Applied Black 2020-10-29 22:14:33 -07:00
Simon Willison
561c1d2d36 Show logout link if they are logged in AND have ds_actor cookie
Otherwise an expired cookie will still cause the logout link to show.
2020-10-29 20:51:37 -07:00
Simon Willison
18a64fbb29
Navigation menu plus menu_links() hook
Closes #1064, refs #690.
2020-10-29 20:45:15 -07:00
Simon Willison
1a861be19e Fixed test_max_csv_mb test that I just broke, refs #1063 2020-10-29 15:58:40 -07:00
Simon Willison
178b7e8749 .csv now links to .blob downloads
Closes #1063, closes #1034
2020-10-29 15:47:32 -07:00
Simon Willison
78b3eeaad9
.blob output renderer
* _blob_hash= checking plus refactored to use new BadRequest class, refs #1050
* Replace BlobView with new .blob renderer, closes #1050
* .blob downloads on arbitrary queries, closes #1051
2020-10-29 15:01:38 -07:00
Simon Willison
d6f9ff7137 Docs on Designing URLs for your plugin - closes #1053 2020-10-29 12:35:25 -07:00
Simon Willison
89519f9a37 Fixed bug with download of BLOB null, refs #1050 2020-10-28 21:05:40 -07:00
Simon Willison
cefd058c1c
New explicit versioning mechanism
Closes #1054
2020-10-28 20:38:15 -07:00
Simon Willison
abcf022249
Margin bottom on metadata description 2020-10-28 10:11:07 -07:00
dependabot-preview[bot]
8796172652
Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7 (#1059)
Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
- [Release notes](https://github.com/Tinche/aiofiles/releases)
- [Commits](https://github.com/Tinche/aiofiles/compare/v0.4.0...v0.6.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-10-28 10:08:27 -07:00
Simon Willison
7d9fedc176 Cascading permissions for .db download, closes #1058 2020-10-27 20:15:41 -07:00
Simon Willison
c3aba4aa98 --cors for /name.db downloads, refs #1057 2020-10-27 13:39:57 -07:00
Simon Willison
e5f5034bcd Fixed broken footer test 2020-10-27 12:39:55 -07:00
Simon Willison
e7dd3434e1 No underline on nav links in header 2020-10-27 12:39:55 -07:00
Simon Willison
18977ce802 Off-white yellow is now off-white blue 2020-10-27 12:39:55 -07:00
Simon Willison
c069d481af Mobile view cards now have rounded corners 2020-10-27 12:39:55 -07:00
Simon Willison
f49d15a758 word-break: break-word; 2020-10-27 12:39:55 -07:00
Simon Willison
dab4b73f7d White cards on mobile 2020-10-27 12:39:55 -07:00
Simon Willison
62286b46a9 Tighten up table column CSS 2020-10-27 12:39:55 -07:00
Simon Willison
fe5e813f06 Styled facets with different bullets 2020-10-27 12:39:55 -07:00
Natalie Downe
df19a48a3b Implemented new Natalie design 2020-10-27 12:39:55 -07:00
Natalie Downe
6dff22eff8 Visited link colours 2020-10-27 12:39:55 -07:00
Natalie Downe
7d69f1ac02 New header and footer 2020-10-27 12:39:55 -07:00
Simon Willison
26bb4a2681 table-wrapper on query page too, refs #998 2020-10-27 00:56:35 -07:00
Simon Willison
f5dbe61a45 -o now opens to most relevant page, closes #976 2020-10-25 22:06:20 -07:00
Simon Willison
105a2c10fd Fix z-index issue with dropdown menu, closes #1052 2020-10-25 19:19:21 -07:00
Simon Willison
42f4851e3e Documentation for .absolute_url(request, path), refs #1034 2020-10-24 18:17:30 -07:00
Simon Willison
6c9fd4ef1b Better download link display on mobile, refs #1046 2020-10-24 18:00:38 -07:00
Simon Willison
5db7ae3ce1 Link to BLOB downloads, closes #1046 2020-10-24 17:13:14 -07:00
Simon Willison
a96ad967e4 Cleaned up some rogue full-stops 2020-10-24 16:11:14 -07:00
Simon Willison
5a15197960
/db/table/-/blob/pk/column.blob download URL, refs #1036 2020-10-24 16:09:18 -07:00
Simon Willison
10c35bd371 urls.static_plugins() method, closes #1033
Also documented how to package static assets and templates in plugins, closes #575
2020-10-24 13:03:40 -07:00
Simon Willison
7f728d4a37 Extra tests for datasette.urls, refs #1025 2020-10-24 12:21:23 -07:00
Simon Willison
29a977a74e New app_client_base_url_prefix fixture 2020-10-24 12:03:24 -07:00
Simon Willison
d3e9b0aecb Document render_template() can take a Template, refs #1045 2020-10-23 17:26:15 -07:00
Simon Willison
8148c9e265 Document render_template(templates) list, closes #1045 2020-10-23 17:22:00 -07:00
Nicholas Bollweg
976e5f74aa
Include LICENSE in sdist (#1043) 2020-10-23 13:54:34 -07:00
Nicholas Bollweg
cab8e65261
Add minimum supported python (#1044) 2020-10-23 13:53:07 -07:00
Simon Willison
d0cc6f4c32
Use sphinx-to-sqlite==0.1a1
To address this bug: https://github.com/simonw/sphinx-to-sqlite/issues/2
2020-10-21 21:57:00 -07:00
Simon Willison
20f8659e2a Wide tables now scroll horizontally, refs #998 2020-10-21 18:09:01 -07:00
gerrymanoim
6e26b05799
Fix syntax error in register_routes docs (#1038)
Thanks, @gerrymanoim
2020-10-21 15:44:16 -07:00
Simon Willison
bf82b3d6a6 scale-in animation for column action menu, closes #1039 2020-10-21 10:02:26 -07:00
Simon Willison
66120a7a1c Release 0.51a0
Refs #1023, #904, #814, #1014, #1016, #1019, #1028
2020-10-19 22:31:14 -07:00
Simon Willison
091441a444 Fixed remaining places that needed datasette.urls, closes #1025 2020-10-19 22:21:19 -07:00
Simon Willison
0d1763fb2f More datasette.urls usage, refs #1025 2020-10-19 21:24:47 -07:00
Simon Willison
837d0bc995 Tiny typo, refs #904 2020-10-19 18:04:43 -07:00
Simon Willison
5aacc021b5 Docs for datasette.urls, closes #904 2020-10-19 17:51:39 -07:00
Simon Willison
310c3a3e05 New datasette.urls URL builders, refs #904 2020-10-19 17:33:59 -07:00
Simon Willison
c440ffc65a Updated serve help, refs #1028 2020-10-19 17:33:04 -07:00
Simon Willison
6aa5886379 --load-extension=spatialite shortcut, closes #1028 2020-10-19 15:37:43 -07:00
Simon Willison
a4def0b8db Clearer _sort_by_desc comment 2020-10-19 15:37:43 -07:00
Simon Willison
c37a0a93ec
Build and deploy docs.db to datasette-docs-latest 2020-10-18 14:35:26 -07:00
Simon Willison
f7147260a4
Added datasette-atom and datasette-ics 2020-10-18 13:56:35 -07:00
Simon Willison
b0b04bb7c1
Delete .readthedocs.yml
It worked fine without configuration, and my attempt to build the xml version failed with an error message:

    Problem in your project's configuration. Invalid "formats": expected one of (htmlzip, pdf, epub), got xml
2020-10-18 11:37:35 -07:00
Simon Willison
a0e9ae3c25
Build extra formats with Read the Docs 2020-10-18 11:20:33 -07:00
Taylor Hodge
568bd7bbf5
Fix broken link in publish docs (#1029) 2020-10-17 13:05:03 -07:00
Jacob Fenton
4f7c0ebd85
Fix table name in spatialite example command (#1022)
The example query for creating a new point geometry seems to be using a table called 'museums' but at one point it instead uses 'events'. I *believe* it is intended to be museums.
2020-10-14 16:46:46 -07:00
dependabot-preview[bot]
7f2edb5dd2
Update janus requirement from <0.6,>=0.4 to >=0.4,<0.7 (#1017)
Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
- [Release notes](https://github.com/aio-libs/janus/releases)
- [Changelog](https://github.com/aio-libs/janus/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/janus/compare/v0.4.0...v0.6.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-10-14 14:52:07 -07:00
dependabot-preview[bot]
b4a8e70957
Update asgiref requirement from ~=3.2.10 to >=3.2.10,<3.4.0 (#1018)
Updates the requirements on [asgiref](https://github.com/django/asgiref) to permit the latest version.
- [Release notes](https://github.com/django/asgiref/releases)
- [Changelog](https://github.com/django/asgiref/blob/master/CHANGELOG.txt)
- [Commits](https://github.com/django/asgiref/compare/3.2.10...3.3.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-10-14 14:51:34 -07:00
Simon Willison
f3a087a578 Edit SQL button on canned queries, closes #1019 2020-10-13 20:44:18 -07:00
Simon Willison
acf07a6772 x button for clearing filters, refs #1016 2020-10-11 19:53:26 -07:00
Simon Willison
e34e84901d Link: HTTP header pagination, closes #1014 2020-10-10 17:18:45 -07:00
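`Link:` header pagination follows the RFC 8288 convention of `<url>; rel="next"`. A minimal client-side parser for the subset pagination needs (an illustrative sketch, assuming a single `rel` parameter per link — the example URL is hypothetical):

```python
import re

def parse_link_header(value):
    # Parse headers of the form:
    #   <https://example.com/db/table.json?_next=abc>; rel="next"
    # into a {rel: url} mapping.
    links = {}
    for part in value.split(","):
        match = re.match(r'\s*<([^>]+)>;\s*rel="([^"]+)"', part)
        if match:
            url, rel = match.groups()
            links[rel] = url
    return links

links = parse_link_header('<https://example.com/db/table.json?_next=abc>; rel="next"')
print(links["next"])  # https://example.com/db/table.json?_next=abc
```

A client can follow `links["next"]` repeatedly until no `next` relation is returned.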
Simon Willison
7e70643852 Removed --debug option, which didn't do anything - closes #814 2020-10-10 16:39:38 -07:00
Simon Willison
822260fb30
Improved homebrew instructions 2020-10-10 16:19:39 -07:00
Simon Willison
a67cb536f1
Promote the Datasette Weekly newsletter 2020-10-10 13:54:27 -07:00
Simon Willison
0e58ae7600 Release 0.50.2
Refs #1011
2020-10-09 20:53:47 -07:00
Simon Willison
7239175f63 Fixed broken column header links, closes #1011 2020-10-09 20:51:56 -07:00
Simon Willison
6fe30c348c Release 0.50.1
Refs #1010
2020-10-09 17:41:35 -07:00
Simon Willison
9f6dd985bc Fix broken CSV/JSON export on query page, refs #1010 2020-10-09 17:39:45 -07:00
Simon Willison
c13d184704 Emergency fix for broken links in 0.50, closes #1010 2020-10-09 17:33:13 -07:00
Simon Willison
549a007683
Clarify that datasette.client HTTP calls are simulated 2020-10-09 16:13:41 -07:00
Simon Willison
99488de329
Link to 0.50 annotated release notes 2020-10-09 14:50:19 -07:00
Simon Willison
ef76c9ea57
Link to annotated release notes 2020-10-09 14:49:13 -07:00
Simon Willison
1bdbc8aa7f Datasette now supports Python 3.9 2020-10-09 10:57:55 -07:00
Simon Willison
a61f0e4e15 Release 0.50
Refs #1001, #514, #891, #943, #969, #970, #978, #980, #996, #997

Closes #1002
2020-10-09 10:52:44 -07:00
Simon Willison
c12b7a5def Documentation for datasette.client, closes #1006
Refs #1000
2020-10-09 10:20:25 -07:00
Simon Willison
6421ca2b22
Use actions/setup-python@v2 to deploy latest
This should fix an error with Python 3.9.
2020-10-09 09:28:17 -07:00
Simon Willison
896cc2c6ac Replace MockRequest with Request.fake()
Close #1004
2020-10-09 09:26:17 -07:00
Simon Willison
6e091b14b6
Run tests against Python 3.9 2020-10-09 09:22:49 -07:00
Simon Willison
8f97b9b58e
datasette.client internal requests mechanism
Closes #943

* Datasette now requires httpx>=0.15
* Support OPTIONS without 500, closes #1001
* Added internals tests for datasette.client methods
* Datasette's own test mechanism now uses httpx to simulate requests
* Tests simulate HTTP 1.1 now
* Added base_url in a bunch more places
* Mark some tests as xfail - will remove that when new httpx release ships: #1005
2020-10-09 09:11:24 -07:00
Simon Willison
7249ac5ca0 Support OPTIONS without 500, closes #1001 2020-10-08 18:43:53 -07:00
Simon Willison
703439bdc3 Don't suggest datasette-graphql in buildpacks demo
Refs #997 - it's not a great suggestion because the fivethirtyeight.db
database has so many tables.
2020-10-08 16:50:43 -07:00
Simon Willison
7a029d1eda Link to hosting providers, refs #997 2020-10-08 16:36:22 -07:00
Simon Willison
e4f18fbd37 Deploying using buildpacks docs, closes #997 2020-10-08 16:32:04 -07:00
Simon Willison
e4554c37b7 datasette publish heroku --tar option, closes #969 2020-10-08 16:30:46 -07:00
Simon Willison
107d0887a6 datasette publish heroku now uses Python 3.8.6 2020-10-08 16:22:11 -07:00
Simon Willison
86823ae6f7 Default to Uvicorn workers=1, refs #999 2020-10-08 16:16:55 -07:00
Simon Willison
2458d7b766 Docs on deploying with systemd, refs #514 2020-10-08 15:47:37 -07:00
Simon Willison
b47ac37114 Applied Black 2020-10-07 15:51:25 -07:00
Simon Willison
5070425817 Fix handling of nested custom page wildcard paths, closes #996 2020-10-07 15:51:11 -07:00
Simon Willison
b37431976c custom pages tests templates now in repo 2020-10-07 15:16:41 -07:00
Simon Willison
e02f6c1300 Tests for db.table_columns() and db.table_column_details() 2020-10-06 14:02:30 -07:00
Simon Willison
14982bd900 Release 0.50a1
Refs #995, #993, #989
2020-10-06 13:50:54 -07:00
Geoffrey Hing
ca5ba6b77b
Document setting Google Cloud SDK properties (#995)
Document setting Google Cloud SDK properties to avoid having to respond to interactive prompts when running `datasette publish cloudrun`.

Thanks, @ghing!
2020-10-06 09:25:37 -07:00
Simon Willison
5a184a5d21 Display column type in column action menu, closes #993
Also added new documented db.table_column_details() introspection method.
2020-10-05 17:32:10 -07:00
Simon Willison
e807c4eac0 Sort links remove _next=, closes #989 2020-10-04 11:05:20 -07:00
Simon Willison
b68cc1c6d4 Release 0.50a0
Refs #891, #970, #978, #980, #981
2020-10-01 16:35:04 -07:00
Simon Willison
5d6bc4c268 Allow faceting on compound primary keys, closes #985 2020-10-01 09:50:35 -07:00
Simon Willison
141544613f Extract out menu icon CSS, refs #981 2020-09-30 16:55:00 -07:00
Simon Willison
64127a4593 Show not-blank rows column action, refs #981 2020-09-30 16:43:34 -07:00
Simon Willison
765e8f0209 Close menu when clicked outside, refs #981 2020-09-30 16:21:44 -07:00
Simon Willison
0f2626868b Much improved column menu display logic, refs #981
* Menu links now take into account existing querystring
* No longer shows facet option for primary key columns
* Conditionally displays sort/sort-desc if already sorted
* Does not show facet option if already faceted by this
2020-09-30 16:01:37 -07:00
Simon Willison
fd0b00330f Don't show cog on Link column, refs #981
Also show ascending option before descending option
2020-09-30 15:31:17 -07:00
Simon Willison
97c71c3a3b Fixed test for column sorting, refs #981 2020-09-30 14:51:10 -07:00
Simon Willison
ae1f7c3870 Column action menu for sort/faceting, refs #981 2020-09-30 14:43:39 -07:00
Simon Willison
5b8b8ae597 Handle \r\n correctly in CSS escapes, refs #980 2020-09-29 12:16:30 -07:00
Simon Willison
c11383e628 Fix rendering glitch with columns on mobile, closes #978 2020-09-28 15:42:50 -07:00
dependabot-preview[bot]
1f021c3711
Update pytest requirement from <6.1.0,>=5.2.2 to >=5.2.2,<6.2.0 (#977)
Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.2...6.1.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-09-28 15:16:34 -07:00
Simon Willison
9a6d0dce28
datasette-json-html as render_cell example 2020-09-23 22:25:06 -07:00
Simon Willison
cac051bb8a Fix for 'open' bug, closes #973 2020-09-22 08:39:48 -07:00
Simon Willison
a980199e61 New -o option for opening Datasette in your browser, closes #970 2020-09-22 07:26:47 -07:00
Simon Willison
a258339a93
Fixed typo 2020-09-18 23:33:09 -07:00
Simon Willison
368be14c8b Link to annotated release notes 2020-09-15 17:01:11 -07:00
Simon Willison
432a3d675f sqlite3.enable_callback_tracebacks(True), closes #891 2020-09-15 14:59:17 -07:00
Simon Willison
d456b25032 Release 0.49.1
Refs #967, #966, #956
2020-09-15 13:20:15 -07:00
Simon Willison
448d13ea6b Fix for MagicParameters error with no POST body, closes #967 2020-09-15 13:12:57 -07:00
Simon Willison
65ca17d729 Fix for DeprecationWarning: invalid escape sequence 2020-09-15 13:10:38 -07:00
Simon Willison
853c5fc370
Fixed incorrect canned query example, closes #966 2020-09-14 20:52:44 -07:00
Simon Willison
cb515a9d75
Don't push prereleases to Docker Hub, refs #940 2020-09-14 15:09:03 -07:00
Simon Willison
26de3a18bc
tmate debugging tool 2020-09-14 14:53:54 -07:00
Simon Willison
c0249525d7 Release 0.49
Refs #880, #944, #945, #947, #948, #953, #958, #962, #963, #964, #965
2020-09-14 14:38:24 -07:00
Simon Willison
72ac2fd32c JSON API for writable canned queries, closes #880 2020-09-14 14:23:18 -07:00
Simon Willison
894999a14e Improved test for JSON POST, refs #880 2020-09-14 13:25:09 -07:00
Simon Willison
896fce228f Canned query writes support JSON POST body, refs #880 2020-09-14 13:18:15 -07:00
Simon Willison
1552ac931e Documented custom error pages, closes #965 2020-09-14 11:47:16 -07:00
Simon Willison
3817152e31 Rename default error template to error.html, refs #965 2020-09-14 11:30:31 -07:00
Simon Willison
699be7dea9 raise_404() function for use in custom templates, closes #964 2020-09-14 10:39:25 -07:00
Simon Willison
30b98e4d29
Single, not double quotes - refs #940 2020-09-13 19:47:21 -07:00
Simon Willison
c18117cf08 Release notes for 0.49a1
Refs #948 #958 #962 #947 #963 #944
2020-09-13 19:40:10 -07:00
Simon Willison
cc77fcd133 Optional path parameters for custom pages, closes #944 2020-09-13 19:34:43 -07:00
Simon Willison
ea340cf320 Correctly persist selected facets in hidden fields
Closes #963
2020-09-12 14:54:01 -07:00
Simon Willison
20b1de86a1 Fix for test I broke in #947 2020-09-11 15:04:23 -07:00
Simon Willison
d02f6151da datasette --get status code for error pages, closes #947 2020-09-11 14:32:54 -07:00
Simon Willison
77521c6cd7
Documentation for --pdb, refs #962 2020-09-11 11:40:39 -07:00
Simon Willison
ca5c405d0f New 'datasette --pdb' option, closes #962 2020-09-11 11:37:55 -07:00
Simon Willison
d0c752d50c Fixed a couple of tiny HTML bugs, thanks curlylint
curlylint datasette/templates

https://github.com/thibaudcolas/curlylint
2020-09-07 08:43:37 -07:00
Simon Willison
a648bb82ba Upgrade to Black 20.8b1, closes #958 2020-09-02 15:24:55 -07:00
Simon Willison
26b2922f17 await_me_maybe utility function 2020-09-02 15:21:12 -07:00
Simon Willison
f65c45674d Notes on upgrading CodeMirror, refs #948 2020-08-30 11:11:04 -07:00
Simon Willison
9dbbfa1f0b Upgrade CodeMirror to 5.57.0, refs #948 2020-08-30 10:39:16 -07:00
Simon Willison
44cf424a94
Remove double colon, refs #956 2020-08-28 18:33:05 -07:00
Simon Willison
c36e287d71
Don't deploy alpha/betas to Docker Hub
Refs #956
2020-08-28 18:18:52 -07:00
Simon Willison
7178126d90 Release notes for 0.49a0
Refs #953, #945
2020-08-28 16:12:47 -07:00
Simon Willison
799ecae948 register_output_renderer can now return Response, closes #953 2020-08-27 21:02:50 -07:00
Simon Willison
86aefc39c5 Fixed undefined reference in index.rst 2020-08-19 10:22:33 -07:00
Simon Willison
69033c6ec4 datasette install --upgrade option, closes #945 2020-08-19 10:20:41 -07:00
Simon Willison
b21ed237ab
publish heroku now deploys with Python 3.8.5 2020-08-18 13:49:13 -07:00
Simon Willison
5e0b72247e
Run CI on GitHub Actions, not Travis
* Run CI on GitHub Actions, not Travis - refs #940
* Update documentation refs to Travis
* Release action now runs parallel tests, then pushes to PyPI, then Docker Hub
2020-08-17 22:09:34 -07:00
Simon Willison
52eabb019d Release 0.48
Refs #939, #938, #935, #914
2020-08-16 11:56:31 -07:00
Simon Willison
8e7e6458a6 Fix bug with ?_nl=on and binary data, closes #914 2020-08-16 11:26:49 -07:00
Simon Willison
3a4c8ed36a Added columns argument to various extra_ plugin hooks, closes #938 2020-08-16 11:09:53 -07:00
Simon Willison
94ae840fe3 Plugin tests now start with test_hook_ 2020-08-16 10:49:33 -07:00
Simon Willison
2da4144c57 Applied Black 2020-08-16 10:35:14 -07:00
Simon Willison
ac69d151c3 Test that plugin hooks are documented with correct arguments 2020-08-16 10:33:44 -07:00
Simon Willison
e3639247cd Standard arguments for extra_ plugin hooks, closes #939 2020-08-16 09:50:23 -07:00
Simon Willison
41ddc19756
Docs now live at docs.datasette.io (#937) 2020-08-15 16:57:05 -07:00
Simon Willison
af12f45c2b Documentation and tests for db.is_mutable 2020-08-15 16:27:32 -07:00
Simon Willison
b86f94883b
Don't hang in db.execute_write_fn() if connection fails
Closes #935

Refs https://github.com/simonw/latest-datasette-with-all-plugins/issues/3
2020-08-15 15:35:31 -07:00
Simon Willison
13b3b51087 Release 0.47.3
Refs #934, https://github.com/simonw/latest-datasette-with-all-plugins/issues/3
2020-08-15 13:56:08 -07:00
Simon Willison
45414f8412 --get now calls startup() plugin hooks, closes #934 2020-08-15 13:52:41 -07:00
Simon Willison
7702ea6021 Release 0.47.2
Refs #931
2020-08-12 13:54:33 -07:00
Simon Willison
e3e387fae7 Fixed URLs to SpatiaLite files, refs #931 2020-08-12 13:49:50 -07:00
Simon Willison
b8c09a9334
Suggest "allow": false instead of "allow": {} 2020-08-11 22:56:52 -07:00
Simon Willison
309d7191a1
Fixed broken rST link 2020-08-11 22:11:08 -07:00
Simon Willison
cd8c79d30a Release 0.47.1
Refs #930
2020-08-11 19:37:24 -07:00
Simon Willison
10ce9ce3bc Include templates/ in MANIFEST, refs #930 2020-08-11 19:34:39 -07:00
Simon Willison
03418ee037 Release 0.47
Refs #335, #923, #925, #926, #928
2020-08-11 17:42:47 -07:00
Simon Willison
1a805288ab Updating homebrew plugin installation instructions
This will start working as soon as Datasette 0.47 ships. Refs #923
2020-08-11 17:31:56 -07:00
Simon Willison
e139a7619f
'datasette --get' option, closes #926
Also made a start on the datasette.utils.testing module, refs #898
2020-08-11 17:24:40 -07:00
Simon Willison
83eda049af Fixed rST bug 2020-08-11 17:10:12 -07:00
Simon Willison
afdeda8216 Use runpy in install/uninstall, refs #928 2020-08-11 16:54:52 -07:00
Simon Willison
adfe304281 Upgrade pip in GitHub Actions runs 2020-08-11 16:12:22 -07:00
Simon Willison
6a126fa25f
Removed aiohttp from test dependencies
It wasn't being used.
2020-08-11 16:05:00 -07:00
Simon Willison
5126ecb126 Re-arranged installation docs, added Homebrew - closes #923 2020-08-11 15:52:41 -07:00
Simon Willison
f7fddc9019 Fixed typo in help text, refs #925 2020-08-11 15:33:16 -07:00
Simon Willison
01fe5b7401 datasette install / datasette uninstall commands, closes #925 2020-08-11 15:32:06 -07:00
Simon Willison
3fa261d1d2
Removed Python 3.5 installation instructions
Suggested here: https://github.com/simonw/datasette/discussions/921#discussioncomment-49362
2020-08-10 20:26:42 -07:00
Simon Willison
e21face1d7
Link to discussions forum 2020-08-10 19:49:22 -07:00
Simon Willison
be1fcd34b3
Link to GitHub Discussions
Also fixed a couple of master => main references
2020-08-10 19:47:09 -07:00
Simon Willison
fca723ab2a
Fixed order of master and main in release notes 2020-08-09 12:30:55 -07:00
Simon Willison
2955e7ea51
One last update of the new tagline 2020-08-09 09:40:17 -07:00
Simon Willison
860b97b5a4
Build main but not master, refs #919 2020-08-09 09:19:01 -07:00
Simon Willison
b597aa07e6 Fixed link in release notes, refs #918 2020-08-09 09:09:07 -07:00
Simon Willison
f25391de1f Release 0.46
Refs #849, #908, #896, #897, #905, #909, #456, #887, #890
2020-08-09 09:06:34 -07:00
Simon Willison
6d29210cf4 Updated docs on what happens when a release goes out 2020-08-09 09:05:09 -07:00
Simon Willison
7f10f0f766 Fix for security issue #918 2020-08-09 09:03:35 -07:00
Simon Willison
de90b7568c Fixed incorrect link reference 2020-08-09 08:41:16 -07:00
Simon Willison
602390081c
Package as sdist as well as bdist_wheel 2020-08-08 21:58:24 -07:00
Simon Willison
dc86ce7d3a
Added datasette-graphql 2020-08-06 21:32:46 -07:00
Simon Willison
daf1b50d13
datasette-graphql in news 2020-08-06 21:30:59 -07:00
Simon Willison
7ca8c0521a
Calculate coverage on pushes to main
Refs #849
2020-07-31 16:23:02 -07:00
Simon Willison
84c162dec3
Deploy latest on pushes to main
Refs #849
2020-07-31 16:22:31 -07:00
Simon Willison
73bb59a9b5 Mirror master and main, refs #849 2020-07-31 16:19:43 -07:00
fcatus
2d7fa8b905
Use None as a default arg (#901)
Thanks, @fcatus!

* Use None as a default arg
* Black formatting fix

Co-authored-by: Simon Willison <swillison@gmail.com>
2020-07-31 11:42:38 -07:00
Simon Willison
d71b0c0cb9
Publishing to Vercel section
Closes #912
2020-07-31 10:06:32 -07:00
Simon Willison
8d02f1dfcf
An open source multi-tool for exploring and publishing data 2020-07-29 18:20:24 -07:00
Simon Willison
0748a65a22 Fixed content-disposition header on DB download, closes #909 2020-07-29 14:34:22 -07:00
dependabot-preview[bot]
c5c12a797f
Update pytest requirement from <5.5.0,>=5.2.2 to >=5.2.2,<6.1.0 (#910)
Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.2...6.0.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-07-29 14:26:03 -07:00
Simon Willison
3c33b42132
Documenting both false and {} for "deny all" is confusing
Refs #906
2020-07-25 14:44:42 -07:00
Simon Willison
980600564c
Updated news section 2020-07-24 18:09:36 -07:00
Simon Willison
092874202c Improvements to allow block logic and debug tool
true and false allow block values are now supported, closes #906

Added a bunch of demo links to the documentation, refs #908
2020-07-24 17:04:06 -07:00
Simon Willison
88065fb74f Increase size of allow/actor fields, refs #908 2020-07-24 16:52:16 -07:00
Simon Willison
12c0bc09cc /-/allow-debug tool, closes #908 2020-07-24 15:55:10 -07:00
abeyerpath
6be5654ffa
Exclude tests from package, properly this time
The `exclude` argument to `find_packages` needs an iterable of package
names.

Closes #456 - thanks, @abeyerpath!
2020-07-24 13:39:53 -07:00
Simon Willison
028f193dd6
How to use a custom domain with Cloud Run 2020-07-22 11:17:05 -07:00
Simon Willison
213e6a8926 content-length for DB downloads, closes #905 2020-07-21 21:52:35 -07:00
Simon Willison
02dc6298bd permission_allowed resource can be a tuple 2020-07-21 08:22:36 -07:00
Simon Willison
d9a5ef1c32
Don't need this, we're not using GitHub pages 2020-07-19 17:49:32 -07:00
Simon Willison
1f6a134369 await request.post_body() method, closes #897 2020-07-17 13:12:35 -07:00
Simon Willison
c5f06bc356
"white-space: pre-wrap" for all table cells, refs #896 2020-07-16 12:06:45 -07:00
Simon Willison
4691228a81
Fix for version color in nav, refs #892 2020-07-12 13:00:16 -07:00
Simon Willison
ee0ef01652 Added new logo to the documentation 2020-07-12 12:53:29 -07:00
Simon Willison
cd231e97cd
Updated example for asgi_wrapper 2020-07-07 19:01:13 -07:00
Simon Willison
ba739b2457
An open source multi-tool for exploring and publishing data 2020-07-07 12:54:54 -07:00
Simon Willison
bcb59ca466
codecov should not be blocking
From https://docs.codecov.io/docs/common-recipe-list
2020-07-02 21:29:32 -07:00
Amjith Ramanujam
ea99a4431c
Only load Python files from plugins-dir
Pull request #890. Thanks, @amjith!

* Load only python files from plugins-dir
* Add a test to verify non-python files are not loaded as plugins
2020-07-02 20:08:32 -07:00
Simon Willison
57879dc8b3 Better titles for canned query pages, closes #887 2020-07-01 17:23:37 -07:00
Simon Willison
f1f581b7ff Release notes for 0.45
Refs #687, #807, #812, #832, #834, #835, #840, #842, #846, #852, #854, #863, #864, #870
2020-07-01 14:43:07 -07:00
Simon Willison
c7e8a4aaac Handle missing request object, refs #884 2020-07-01 14:36:36 -07:00
Simon Willison
1bae24691f Only show 'log out' if ds_cookie present, closes #884 2020-07-01 14:25:59 -07:00
Simon Willison
f7c3fc978c
datasette-auth-tokens improved description
Refs https://github.com/simonw/datasette-auth-tokens/issues/1
2020-07-01 12:26:30 -07:00
Simon Willison
676bb64c87 Release 0.45a5
Refs #840, #832, #835, #812
2020-06-30 21:25:35 -07:00
Simon Willison
549b1c2063 New forbidden() plugin hook, closes #812 2020-06-30 21:17:38 -07:00
Simon Willison
3ec5b1abf6 CSRF tests for canned query POST, closes #835 2020-06-30 20:08:00 -07:00
Simon Willison
08b4928a75 asgi-csrf>=0.6, refs #835 2020-06-30 18:18:19 -07:00
Simon Willison
2b85bbdd45 Added logout button to pattern portfolio, closes #876
Refs #875
2020-06-30 16:47:23 -07:00
Simon Willison
cfd69593f7 Removed hashes from examples on docs/pages - closes #879 2020-06-30 16:45:34 -07:00
Simon Willison
d6e03b0430 Cascading view permissions, closes #832
- If you have table permission but not database permission you can now view the table page
- New BaseView.check_permissions() method
2020-06-30 16:40:50 -07:00
Simon Willison
ab76eddf31 Express no opinion if allow block is missing
Default permission policy was returning True by default for permission
checks - which means that if allow was not defined for a level it would
be treated as a passing check.

This is better: we now return None if the allow block is not defined,
which means 'I have no opinion on this' and allows other code to make
its own decisions.

Added while working on #832
2020-06-30 15:49:06 -07:00
Simon Willison
9ac6292614 _header_x now defaults to empty string
Prior to this a request to e.g. https://latest.datasette.io/fixtures/magic_parameters
which did not include a User-Agent header would trigger a 500 error.
2020-06-30 15:00:17 -07:00
Simon Willison
2115d7e345 Logout link in nav, refs #875 2020-06-29 11:40:40 -07:00
Simon Willison
51427323e6 Add message when user logs out, refs #840 2020-06-29 11:31:35 -07:00
Simon Willison
16f592247a Use explicit lifespan=on for Uvicorn, refs #873 2020-06-29 08:42:50 -07:00
Simon Willison
35aee82c60 Fixed 500 error with /favicon.ico, closes #874 2020-06-28 21:27:11 -07:00
Simon Willison
22d932fafc /-/logout page for logging out of ds_actor cookie
Refs #840
2020-06-28 21:17:58 -07:00
Simon Willison
968ce53689
Added datasette-write to plugins list on Ecosystem 2020-06-28 20:49:45 -07:00
Simon Willison
265483173b Release 0.45a4
Refs #864, #871
2020-06-28 19:31:16 -07:00
Simon Willison
a8a5f81372 Made show_messages available to plugins, closes #864 2020-06-28 17:50:47 -07:00
Simon Willison
7ac4936cec .add_message() now works inside plugins, closes #864
Refs #870
2020-06-28 17:25:35 -07:00
Simon Willison
af350ba457 Use single Request created in DatasetteRouter, refs #870 2020-06-28 17:01:33 -07:00
Simon Willison
4dad028432 BaseView.as_asgi is now .as_view, refs #870 2020-06-28 16:47:40 -07:00
Simon Willison
3bc2461c77 Refactored AsgiView into BaseView, refs #870 2020-06-28 16:06:30 -07:00
Simon Willison
a8bcafc177 Refactored out AsgiRouter, refs #870 2020-06-28 13:45:17 -07:00
Simon Willison
0991ea75cc Renamed _timestamp to _now, refs #842, closes #871 2020-06-28 12:47:28 -07:00
Simon Willison
99fba0fad3 Link to datasette-init plugin hook, refs #834 2020-06-28 12:37:50 -07:00
Simon Willison
8b25b14de1 Added note about unit testing the startup() hook 2020-06-28 09:09:48 -07:00
Simon Willison
b28657672f Added register_magic_plugins hook to changelog, refs #842 2020-06-27 20:29:24 -07:00
Simon Willison
1f55a4a2b6 Release notes for 0.45a3 2020-06-27 20:22:49 -07:00
Simon Willison
335f26a0f7 /fixtures/magic_parameters demo, refs #842 2020-06-27 20:11:01 -07:00
Simon Willison
563f5a2d3a
Magic parameters for canned queries
Closes #842

Includes a new plugin hook, register_magic_parameters()
2020-06-27 19:58:16 -07:00
Simon Willison
4b142862f2 Support non-async view functions, closes #867 2020-06-27 11:30:34 -07:00
dependabot-preview[bot]
1bb33dab49
Update pytest-asyncio requirement from <0.13,>=0.10 to >=0.10,<0.15 (#866)
Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Commits](https://github.com/pytest-dev/pytest-asyncio/compare/v0.10.0...v0.14.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-06-24 11:50:55 -07:00
Simon Willison
1a5b7d318f Fixed test I broke in #863 2020-06-23 21:17:30 -07:00
Simon Willison
c5916cbffb Release notes for 0.45a2 2020-06-23 20:28:50 -07:00
Simon Willison
28bb1c5189 csrftoken() now works with .render_template(), closes #863 2020-06-23 20:23:50 -07:00
Simon Willison
eed116ac05
render_template needs await 2020-06-23 20:06:30 -07:00
Simon Willison
000528192e New 'Testing plugins' page, closes #687 2020-06-21 20:53:42 -07:00
Simon Willison
74889aa92e How to use the datasette-plugin template, refs #687, closes #855 2020-06-21 19:51:26 -07:00
Simon Willison
751e7b4af7 Update tests for new plugin_hooks.rst, refs #687 2020-06-21 19:41:07 -07:00
Simon Willison
c32af6f693 Split out new 'Writing plugins' page, refs #687 2020-06-21 19:37:48 -07:00
Simon Willison
1f42379089 Improved intro on plugin_hooks.rst page, refs #687
https://datasette.readthedocs.io/en/latest/plugin_hooks.html
2020-06-21 17:52:58 -07:00
Simon Willison
36e77e1006 Move plugin hooks docs to plugin_hooks.rst, refs #687 2020-06-21 17:34:10 -07:00
Simon Willison
e4216ff503 Fixed rST warning 2020-06-21 17:34:10 -07:00
Simon Willison
84cbf17660
News: A cookiecutter template for writing Datasette plugins 2020-06-20 10:40:05 -07:00
Simon Willison
d1640ba76b Don't show prereleases on changelog badge 2020-06-20 08:48:39 -07:00
Simon Willison
55a6ffb93c Link to datasette-saved-queries plugin, closes #852 2020-06-19 20:08:30 -07:00
Simon Willison
64cc536b89
Don't include prereleases in changelog badge 2020-06-18 17:03:23 -07:00
Simon Willison
b59b92b1b0 Fix for tests - order was inconsistent, refs #852 2020-06-18 16:52:16 -07:00
Simon Willison
0807c4200f Release notes for 0.45a1, refs #852 2020-06-18 16:40:45 -07:00
Simon Willison
9216127ace Documentation tweak, refs #852 2020-06-18 16:39:43 -07:00
Simon Willison
6c26345836 New plugin hook: canned_queries(), refs #852 2020-06-18 16:35:15 -07:00
Simon Willison
d2f387591b Better rST label for alpha release, refs #807 2020-06-18 14:01:36 -07:00
Simon Willison
dda932d818 Release notes for 0.45a0
Refs #834 #846 #854 #807
2020-06-18 13:58:09 -07:00
Simon Willison
c81f637d86 Documentation for alpha/beta release process, refs #807 2020-06-18 13:49:52 -07:00
Simon Willison
13216cb6bd
Don't push alpha/beta tagged releases to Docker Hub
Refs #807
2020-06-18 13:40:33 -07:00
Simon Willison
6151c25a5a Respect existing scope["actor"] if set, closes #854 2020-06-18 11:37:28 -07:00
Simon Willison
d2aef9f7ef Test illustrating POST against register_routes(), closes #853 2020-06-18 09:21:15 -07:00
Simon Willison
a4ad5a504c Workaround for 'Too many open files' in test runs, refs #846 2020-06-13 17:26:18 -07:00
Simon Willison
0c27f10f9d
Updated plugin examples to include datasette-psutil 2020-06-13 16:41:26 -07:00
Simon Willison
cf7a2bdb40
Action to run tests and upload coverage to codecov.io
Closes #843.
2020-06-13 14:36:49 -07:00
Simon Willison
80c18a18fc Configure code coverage, refs #841, #843 2020-06-13 13:48:23 -07:00
Simon Willison
0e49842e22 datasette/actor_auth_cookie.py coverage to 100%, refs #841 2020-06-13 11:29:14 -07:00
Simon Willison
d60bd6ad13 Update plugin tests, refs #834 2020-06-13 11:15:33 -07:00
Simon Willison
ae99af2536 Fixed rST code formatting, refs #834 2020-06-13 10:59:35 -07:00
Simon Willison
72ae975156 Added test for async startup hook, refs #834 2020-06-13 10:58:32 -07:00
Simon Willison
09a3479a54 New "startup" plugin hook, closes #834 2020-06-13 10:55:41 -07:00
Simon Willison
b906030235 Release Datasette 0.44
Refs #395, #519, #576, #699, #706, #774, #777, #781, #784, #788, #790, #797,
#798, #800, #802, #804, #819, #822, #825, #826, #827, #828, #829, #830,
#833, #836, #837, #839

Closes #806.
2020-06-11 18:19:30 -07:00
Simon Willison
9ae0d483ea Get "$file": "../path" mechanism working again, closes #839 2020-06-11 17:48:20 -07:00
Simon Willison
793a52b317 Link to datasette-auth-tokens and datasette-permissions-sql in docs, refs #806 2020-06-11 17:43:51 -07:00
Simon Willison
1d2e8e09a0 Some last touches to the 0.44 release notes, refs #806 2020-06-11 17:33:16 -07:00
Simon Willison
308bcc8805 Fixed test_permissions_debug 2020-06-11 17:25:12 -07:00
Simon Willison
fba8ff6e76 "$env": "X" mechanism now works with nested lists, closes #837 2020-06-11 17:21:48 -07:00
Simon Willison
f39f111331 Fixed actor_matches_allow bug, closes #836 2020-06-11 15:47:19 -07:00
Simon Willison
29c5ff493a view-instance permission for debug URLs, closes #833 2020-06-11 15:14:51 -07:00
Simon Willison
09bf3c6322 Documentation for publish --secret, refs #787 2020-06-11 09:14:30 -07:00
Simon Willison
fcc7cd6379 rST formatting 2020-06-11 09:05:15 -07:00
Simon Willison
98632f0a87
--secret command for datasette publish
Closes #787
2020-06-11 09:02:03 -07:00
Simon Willison
371170eee8 publish heroku now deploys with Python 3.8.3 2020-06-11 08:44:44 -07:00
Simon Willison
ce4958018e Clarify that view-query also lets you execute writable queries 2020-06-10 17:10:28 -07:00
Simon Willison
198545733b Document that "allow": {} denies all
https://github.com/simonw/datasette/issues/831#issuecomment-642324847
2020-06-10 16:56:53 -07:00
Simon Willison
9f236c4c00 Warn that register_facet_classes may change, refs #830
Also documented policy that plugin hooks should not be shipped without a real example. Refs #818
2020-06-10 13:06:46 -07:00
Simon Willison
57e812d5de ds_actor cookie can now expire, closes #829
Refs https://github.com/simonw/datasette-auth-github/issues/62#issuecomment-642152076
2020-06-10 12:39:54 -07:00
Simon Willison
d828abadde Fix horizontal scrollbar on changelog, refs #828 2020-06-09 21:20:07 -07:00
Simon Willison
f3951539f1 Hopefully fix horizontal scroll with changelog on mobile 2020-06-09 18:19:11 -07:00
Simon Willison
d94fc39e33 Crafty JavaScript trick for generating commit references 2020-06-09 16:43:58 -07:00
Simon Willison
b3919d8059 Mostly complete release notes for 0.44, refs #806 2020-06-09 16:03:42 -07:00
Simon Willison
b5f04f42ab ds_actor cookie documentation, closes #826 2020-06-09 15:32:24 -07:00
Simon Willison
008e2f63c2 response.set_cookie(), closes #795 2020-06-09 15:19:37 -07:00
Simon Willison
f240970b83 Fixed tests/fixtures.py, closes #804 2020-06-09 12:58:12 -07:00
Simon Willison
56eb80a459 Documented CSRF protection, closes #827 2020-06-09 12:32:52 -07:00
Simon Willison
5ef3b7b0c9 Applied Black
Refs #825
2020-06-09 12:25:48 -07:00
Simon Willison
7633b9ab24 unauthenticated: true method plus allow block docs, closes #825 2020-06-09 10:01:03 -07:00
Simon Willison
70dd14876e Improved documentation for permissions, refs #699 2020-06-09 09:04:46 -07:00
Simon Willison
3aa87eeaf2 Documentation no longer suggests that actor["id"] is required, closes #823 2020-06-09 07:58:12 -07:00
Simon Willison
fa87d16612 Clearer docs for actor_matches_allow 2020-06-09 07:10:46 -07:00
Simon Willison
eefeafaa27 Removed unused import 2020-06-09 07:09:39 -07:00
Simon Willison
fec750435d Support anonymous: true in actor_matches_allow, refs #825 2020-06-09 07:01:23 -07:00
Simon Willison
eb3ec279be
Test for anonymous: true, refs #825 2020-06-08 23:33:06 -07:00
Simon Willison
5a6a73e319 Replace os.urandom(32).hex() with secrets.token_hex(32) 2020-06-08 21:37:35 -07:00
Simon Willison
fac8e93815 request.url_vars property, closes #822 2020-06-08 20:40:00 -07:00
Simon Willison
db660db463 Docs + unit tests for Response, closes #821 2020-06-08 20:32:10 -07:00
Simon Willison
f5e79adf26
register_routes() plugin hook (#819)
Fixes #215
2020-06-08 20:12:06 -07:00
Simon Willison
d392dc1cfa Fixed test_table_not_exists_json test 2020-06-08 19:28:25 -07:00
Simon Willison
647c5ff0f3 Fixed broken CSS on 404 page, closes #777 2020-06-08 17:35:23 -07:00
Simon Willison
49d6d2f7b0 allow_sql block to control execute-sql permission in metadata.json, closes #813
Also removed the --config allow_sql:0 mechanism in favour of the new allow_sql block.
2020-06-08 17:05:44 -07:00
Simon Willison
e0a4664fba Better example plugin for permission_allowed
Also fixed it so default permission checks run after plugin permission checks, refs #818
2020-06-08 15:09:57 -07:00
Simon Willison
8205d58316 Corrected documentation for resource in view-query 2020-06-08 13:10:40 -07:00
Simon Willison
5437085382 Documentation for allow blocks on more stuff, closes #811 2020-06-08 12:32:27 -07:00
Simon Willison
c7d145e016 Updated example for extra_template_vars hook, closes #816 2020-06-08 12:06:05 -07:00
Simon Willison
040fc0546f Updated tests, refs #817 2020-06-08 12:02:56 -07:00
Simon Willison
799c5d5357 Renamed resource_identifier to resource, refs #817 2020-06-08 11:59:53 -07:00
Simon Willison
c9f1ec616e Removed resource_type from permissions system, closes #817
Refs #811, #699
2020-06-08 11:51:03 -07:00
Simon Willison
5598c5de01 Database list on index page respects table/view permissions, refs #811 2020-06-08 11:34:14 -07:00
Simon Willison
dcec89270a View list respects view-table permission, refs #811
Also makes a small change to the /fixtures.json JSON:

    "views": ["view_name"]

Is now:

    "views": [{"name": "view_name", "private": true}]
2020-06-08 11:20:59 -07:00
Simon Willison
9ac27f67fe Show padlock on private query page, refs #811 2020-06-08 11:13:32 -07:00
Simon Willison
aa420009c0 Show padlock on private table page, refs #811 2020-06-08 11:07:11 -07:00
Simon Willison
dfff34e198 Applied black, refs #811 2020-06-08 11:03:33 -07:00
Simon Willison
ab14b20b24 Get tests working again 2020-06-08 10:16:24 -07:00
Simon Willison
177059284d New request.actor property, refs #811 2020-06-08 10:05:32 -07:00
Simon Willison
2a8b39800f Updated tests, refs #811 2020-06-08 07:50:06 -07:00
Simon Willison
3ce7f2e7da Show padlock on private database page, refs #811 2020-06-08 07:23:10 -07:00
Simon Willison
1cf86e5ecc Show padlock on private index page, refs #811 2020-06-08 07:18:47 -07:00
Simon Willison
cc218fa9be Move assert_permissions_checked() calls from test_html.py to test_permissions.py, refs #811 2020-06-08 07:02:31 -07:00
Simon Willison
e18f8c3f87 New check_visibility() utility function, refs #811 2020-06-08 06:49:55 -07:00
Simon Willison
9397d71834 Implemented view-table, refs #811 2020-06-07 21:47:22 -07:00
Simon Willison
b26292a458 Test that view-query is respected by query list, refs #811 2020-06-07 20:56:49 -07:00
Simon Willison
9b42e1a4f5 view-database permission
Also now using 🔒 to indicate private resources - resources that
would not be available to the anonymous user. Refs #811
2020-06-07 20:50:37 -07:00
Simon Willison
613fa551a1 Removed view-row permission, for the moment - refs #811
https://github.com/simonw/datasette/issues/811#issuecomment-640338347
2020-06-07 20:14:27 -07:00
Simon Willison
cd92e4fe2a Fixed test name, this executes view-query, not execute-sql - refs #811 2020-06-07 14:33:56 -07:00
Simon Willison
8571ce388a Implemented view-instance permission, refs #811 2020-06-07 14:30:39 -07:00
Simon Willison
ece0ba6f4b Test + default impl for view-query permission, refs #811 2020-06-07 14:23:16 -07:00
Simon Willison
abc7339124 Nicer pattern for make_app_client() in tests, closes #395 2020-06-07 14:14:10 -07:00
Simon Willison
5ed2853cf3 Fix permissions documentation test 2020-06-07 14:01:22 -07:00
Simon Willison
a1e801453a Renamed execute-query permission to execute-sql, refs #811 2020-06-07 13:20:59 -07:00
Simon Willison
4340845754 Nested permission checks for all views, refs #811 2020-06-07 13:03:08 -07:00
Simon Willison
86dec9e8ff Added permission check to every view, closes #808 2020-06-06 22:30:36 -07:00
Simon Willison
bd4de0647d Improved permissions documentation 2020-06-06 19:09:59 -07:00
Simon Willison
7dc23cd71a Whitespace 2020-06-06 13:05:09 -07:00
Simon Willison
f1daf64e72 Link to canned query permissions documentation 2020-06-06 12:46:40 -07:00
Simon Willison
415ccd7cbd
Merge pull request #803 from simonw/canned-query-permissions 2020-06-06 12:40:19 -07:00
Simon Willison
3359d54a4e Use cookies when accessing csrftoken_from 2020-06-06 12:33:08 -07:00
Simon Willison
966eec7f75 Check permissions on canned query page, refs #800 2020-06-06 12:27:00 -07:00
Simon Willison
070838bfa1 Better test for Vary header 2020-06-06 12:26:19 -07:00
Simon Willison
3f83d4632a Respect query permissions on database page, refs #800 2020-06-06 12:05:22 -07:00
Simon Willison
14f6b4d200 actor_matches_allow utility function, refs #800 2020-06-06 11:39:11 -07:00
Simon Willison
d4c7b85f55 Documentation for "id": "*", refs #800 2020-06-06 11:23:54 -07:00
Simon Willison
30a8132d58 Docs for authentication + canned query permissions, refs #800
Closes #786
2020-06-06 11:18:46 -07:00
Simon Willison
9c563d6aed Bump asgi-csrf to 0.5.1 for a bug fix
Refs https://github.com/simonw/asgi-csrf/issues/10
2020-06-05 17:15:52 -07:00
Simon Willison
75c143a84c Fixed /-/plugins?all=1, refs #802 2020-06-05 16:55:08 -07:00
Simon Willison
f786033a5f Fixed 'datasette plugins' command, with tests - closes #802 2020-06-05 16:46:37 -07:00
Simon Willison
033a1bb22c Removed rogue print() from test 2020-06-05 12:06:43 -07:00
Simon Willison
84a9c4ff75
CSRF protection (#798)
Closes #793.

* Rename RequestParameters to MultiParams, refs #799
* Allow tuples as well as lists in MultiParams, refs #799
* Use csrftokens when running tests, refs #799
* Use new csrftoken() function, refs https://github.com/simonw/asgi-csrf/issues/7
* Check for Vary: Cookie header, refs https://github.com/simonw/asgi-csrf/issues/8
2020-06-05 12:05:57 -07:00
Simon Willison
d96ac1d52c Allow tuples as well as lists in MultiParams, refs #799 2020-06-05 11:01:06 -07:00
Simon Willison
0da7f49b24 Rename RequestParameters to MultiParams, refs #799 2020-06-05 10:52:50 -07:00
Simon Willison
0c064c5fe2 More things you can do with plugins 2020-06-04 20:10:40 -07:00
Simon Willison
2074efa5a4 Another actor_from_request example 2020-06-04 18:38:32 -07:00
Simon Willison
8524866fdf Link to authentication docs 2020-06-04 16:58:19 -07:00
Simon Willison
9cb44be42f Docs and tests for "params", closes #797 2020-06-03 14:04:40 -07:00
Simon Willison
aa82d03704
Basic writable canned queries
Refs #698. First working version of this feature.

* request.post_vars() no longer discards empty values
2020-06-03 08:16:50 -07:00
Simon Willison
0934844c0b request.post_vars() no longer discards empty values 2020-06-03 06:48:39 -07:00
Simon Willison
9690ce6068 More efficient modification of scope 2020-06-02 17:05:33 -07:00
Simon Willison
3c5e4f266d Added messages to pattern portfolio, refs #790 2020-06-02 15:34:50 -07:00
Simon Willison
a7137dfe06 /-/plugins now shows details of hooks, closes #794
Also added /-/plugins?all=1 parameter to see default plugins.
2020-06-02 14:49:28 -07:00
Simon Willison
5278c04682 More consistent use of response.text/response.json in tests, closes #792 2020-06-02 14:29:12 -07:00
Simon Willison
4fa7cf6853 Flash messages mechanism, closes #790 2020-06-02 14:12:18 -07:00
Simon Willison
1d0bea157a New request.cookies property 2020-06-02 14:11:41 -07:00
Simon Willison
b4cd8797b8 permission_checks is now _permission_checks 2020-06-02 14:11:32 -07:00
Simon Willison
dfdbdf378a Added /-/permissions debug tool, closes #788
Also started the authentication.rst docs page, refs #786.

Part of authentication work, refs #699.
2020-05-31 22:00:36 -07:00
Simon Willison
57cf5139c5 Default actor_from_request hook supporting ds_actor signed cookie
Refs #784, refs #699
2020-05-31 18:16:42 -07:00
Simon Willison
9f3d4aba31 --root option and /-/auth-token view, refs #784 2020-05-31 18:16:42 -07:00
Simon Willison
7690d5ba40 Docs for --secret/DATASETTE_SECRET - closes #785 2020-05-31 18:16:42 -07:00
Simon Willison
fa27e44fe0 datasette.sign() and datasette.unsign() methods, refs #785 2020-05-31 18:16:42 -07:00
Simon Willison
1fc6ceefb9 Added /-/actor.json - refs #699
Also added JSON highlighting to introspection documentation.
2020-05-31 18:16:42 -07:00
Simon Willison
9315bacf6f Implemented datasette.permission_allowed(), refs #699 2020-05-31 18:16:42 -07:00
Simon Willison
461c82838d Implemented actor_from_request with tests, refs #699
Also added datasette argument to permission_allowed hook
2020-05-31 18:16:42 -07:00
Simon Willison
060a56735c actor_from_request and permission_allowed hookspecs, refs #699 2020-05-31 18:16:42 -07:00
Simon Willison
c4fbe50676 Documentation for Database introspection methods, closes #684
Refs #576
2020-05-30 11:40:30 -07:00
Simon Willison
124acf34a6 Removed db.get_outbound_foreign_keys method
It duplicated the functionality of db.foreign_keys_for_table.
2020-05-30 11:39:46 -07:00
Simon Willison
4d798ca0e3 Added test for db.mtime_ns 2020-05-30 11:17:20 -07:00
Simon Willison
3c5afaeb23 Re-arranged internals documentation
Request is more useful to most people than Database.
2020-05-30 11:06:13 -07:00
Simon Willison
5ae14c9f20 Improved documentation for RequestParameters class 2020-05-30 10:54:22 -07:00
Simon Willison
de1cde65a6 Moved request tests to test_internals_request.py 2020-05-30 10:45:11 -07:00
Simon Willison
012c76901a _ prefix for many private methods of Datasette, refs #576 2020-05-30 07:38:46 -07:00
Simon Willison
ca56c226a9 Renamed test_database.py to test_internals_database.py
Also added a db fixture to remove some boilerplate.
2020-05-30 07:33:02 -07:00
Simon Willison
31fb006a9b Added datasette.get_database() method
Refs #576
2020-05-30 07:29:59 -07:00
Simon Willison
81be31322a New implementation for RequestParams
- no longer subclasses dict
- request.args[key] now returns first item, not all items
- removed request.raw_args entirely

Closes #774
2020-05-29 16:22:22 -07:00
Simon Willison
f272cbc65f Use request.args.getlist instead of request.args[...], refs #774 2020-05-29 15:57:46 -07:00
Simon Willison
84616a2364 request.args.getlist() returns [] if missing, refs #774
Also added some unit tests for request.args
2020-05-29 15:51:30 -07:00
Simon Willison
7ccd55a163 Views do support sorting now, refs #508 2020-05-29 15:44:22 -07:00
Simon Willison
3e8932bf64
Upgrade to actions/cache@v2 2020-05-29 15:12:10 -07:00
Simon Willison
3c1a60589e Consistent capitalization of SpatiaLite in the docs 2020-05-28 11:27:44 -07:00
Simon Willison
21a8ffc82d
Tip about referencing issues in release notes commit 2020-05-28 10:49:58 -07:00
Simon Willison
7bb30c1f11 request.url now respects force_https_urls, closes #781 2020-05-28 10:10:06 -07:00
Simon Willison
40885ef24e
Noted tool for converting release notes to Markdown 2020-05-28 07:41:22 -07:00
Simon Willison
d56f402822 Release notes for 0.43
Refs #581, #770, #729, #706, #751, #706, #744, #771, #773
2020-05-28 07:11:06 -07:00
Simon Willison
5ab411c733 can_render mechanism for register_output_renderer, closes #770 2020-05-27 22:57:05 -07:00
Simon Willison
75cd432e5a Ability to set custom table/view page size in metadata, closes #751 2020-05-27 22:00:04 -07:00
Simon Willison
510c1989d4 Removed xfail, refs #773 2020-05-27 21:11:53 -07:00
Simon Willison
6d95cb4f91 Unit test for register_facet_classes plugin, closes #773
I was a bit lazy with this one. I didn't hook up a test for the facet_results mechanism.
The custom facet hook isn't a great design so I will probably rethink it at some point
in the future anyway.
2020-05-27 21:09:16 -07:00
Simon Willison
defead17a4 Test for publish_subcommand hook, refs #773 2020-05-27 20:30:32 -07:00
Simon Willison
cbeea23d00 Test for prepare_jinja2_environment, refs #773 2020-05-27 20:13:32 -07:00
Simon Willison
57f48b8416 Made register_output_renderer callback optionally awaitable, closes #776 2020-05-27 19:43:30 -07:00
Simon Willison
52c4387c7d Redesigned register_output_renderer plugin hook, closes #581 2020-05-27 19:21:41 -07:00
Simon Willison
446e5de65d Refactored test plugins into tests/plugins, closes #775 2020-05-27 17:57:25 -07:00
Simon Willison
4b96857f17 Link to request object documentation, refs #706 2020-05-27 15:35:25 -07:00
Simon Willison
50652f474b Stop using .raw_args, deprecate and undocument it - refs #706 2020-05-27 15:29:42 -07:00
Simon Willison
6d7cb02f00 Documentation for request object, refs #706 2020-05-27 15:17:53 -07:00
Simon Willison
ad88c9b3f3 Mechanism for adding a default URL fragment to a canned query
Closes #767
2020-05-27 14:52:03 -07:00
Simon Willison
af5702220c Added datasette-media plugin to the docs 2020-05-27 13:34:12 -07:00
Simon Willison
da87e963bf
Test that plugin hooks are unit tested (xfail)
This currently fails using xfail. Closes #771.
2020-05-27 13:16:02 -07:00
Simon Willison
41a0cd7b6a call_with_supported_arguments() util, refs #581 2020-05-27 12:25:52 -07:00
Simon Willison
9e6075d21f rST fixes for register_output_renderer docs 2020-05-27 11:35:31 -07:00
Simon Willison
2d099ad9c6
Backport of Python 3.8 shutil.copytree, refs #744 (#769) 2020-05-27 11:17:43 -07:00
Simon Willison
cee671a58f
Use dirs_exist_ok=True, refs #744 (#768) 2020-05-21 10:53:51 -07:00
Simon Willison
faea5093b8 Column headings now black in mobile view, closes #729 2020-05-15 11:16:47 -07:00
Simon Willison
5ea8c6d1cd type-pk instead of type-link CSS class, closes #729 2020-05-14 22:55:20 -07:00
Simon Willison
504196341c Visually distinguish float/int columns, closes #729 2020-05-14 22:51:39 -07:00
Simon Willison
fc24edc153
Added project_urls, closes #764 2020-05-11 11:28:53 -07:00
Simon Willison
af6c6c5d6f Release 0.42, refs #685 2020-05-08 10:38:27 -07:00
Simon Willison
2694ddcf14 Test for .execute_fn(), refs #685 2020-05-08 10:29:17 -07:00
Simon Willison
5ab848f0b8
RST fix 2020-05-08 10:04:47 -07:00
Simon Willison
545c71b604
Small cleanup 2020-05-08 09:57:01 -07:00
Simon Willison
ec9cdc3ffa
Documentation for .execute_fn(), refs #685 2020-05-08 09:52:53 -07:00
Simon Willison
4433306c18
Improvements + docs for db.execute() and Results class
* Including new results.first() and results.single_value() methods. Closes #685
2020-05-08 09:05:46 -07:00
Simon Willison
69e3a855dd Rename execute_against_connection_in_thread() to execute_fn(), refs #685 2020-05-08 07:16:39 -07:00
Simon Willison
182e5c8745 Release Datasette 0.41
Refs #648 #731 #750 #151 #761 #752 #719 #756 #748
2020-05-06 11:20:58 -07:00
Simon Willison
0784f2ef9d Allow specific pragma functions, closes #761 2020-05-06 10:18:31 -07:00
Simon Willison
9212f0c9c3
Removed note about virtual environments
Simplifies things now that we also talk about pipx.
2020-05-04 12:35:28 -07:00
Simon Willison
0cdf111ae6
Move pip/pipx to top of installation instructions
Less intimidating than Docker, hopefully.
2020-05-04 12:31:13 -07:00
Simon Willison
7e2bb31464
Documented installation using pipx, closes #756 2020-05-04 12:10:31 -07:00
Simon Willison
cc872b1f50 Fixed rogue output in tests, closes #755 2020-05-04 11:42:01 -07:00
Simon Willison
9424687e9e Consistently return charset utf-8, closes #752 2020-05-04 10:42:10 -07:00
Simon Willison
450d2e2896 Fixed pytest warning about TestClient class 2020-05-04 10:42:10 -07:00
dependabot-preview[bot]
b314e088c5
Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.13 (#753)
Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Commits](https://github.com/pytest-dev/pytest-asyncio/compare/v0.10.0...v0.12.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-04 10:40:48 -07:00
dependabot-preview[bot]
707fe03994
Update beautifulsoup4 requirement from ~=4.8.1 to >=4.8.1,<4.10.0 (#720)
Updates the requirements on [beautifulsoup4](http://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version.

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-04 10:14:46 -07:00
Colin Dellow
dbd2d70b38
asgi: check raw_path is not None (#719)
The ASGI spec
(https://asgi.readthedocs.io/en/latest/specs/www.html#http) seems to
imply that `None` is a valid value, so we need to check the value
itself, not just whether the key is present.

In particular, the [mangum](https://github.com/erm/mangum) adapter
passes `None` for this key.
2020-05-04 10:14:25 -07:00
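The fix described above boils down to checking the value of `raw_path`, not just the presence of the key. A minimal standalone sketch (the helper name `path_from_scope` is illustrative, not the exact Datasette code):

```python
# The ASGI spec allows "raw_path" to be present but set to None, so a
# plain `"raw_path" in scope` check is not enough - fall back to "path".

def path_from_scope(scope: dict) -> str:
    # scope.get("raw_path") returns None both when the key is missing
    # and when an adapter (e.g. mangum) explicitly passes None
    raw_path = scope.get("raw_path")
    if raw_path is not None:
        # raw_path is bytes per the ASGI spec
        return raw_path.decode("latin-1")
    return scope["path"]

print(path_from_scope({"path": "/db", "raw_path": None}))  # /db
print(path_from_scope({"path": "/db", "raw_path": b"/db%2Ftable"}))  # /db%2Ftable
```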
dependabot-preview[bot]
c91fb9e3d4
Update pytest requirement from ~=5.2.2 to >=5.2.2,<5.5.0 (#721)
Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.2.2...5.4.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-04 10:13:41 -07:00
dependabot-preview[bot]
aa064de3f4
Update jinja2 requirement from ~=2.10.3 to >=2.10.3,<2.12.0 (#722)
Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/master/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/2.10.3...2.11.1)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-04 10:13:15 -07:00
dependabot-preview[bot]
109c5a430d
Update janus requirement from ~=0.4.0 to >=0.4,<0.6 (#734)
Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
- [Release notes](https://github.com/aio-libs/janus/releases)
- [Changelog](https://github.com/aio-libs/janus/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/janus/compare/v0.4.0...v0.5.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-04 09:48:03 -07:00
dependabot-preview[bot]
e232f77055
Update mergedeep requirement from ~=1.1.1 to >=1.1.1,<1.4.0 (#728)
Updates the requirements on [mergedeep](https://github.com/clarketm/mergedeep) to permit the latest version.
- [Release notes](https://github.com/clarketm/mergedeep/releases)
- [Commits](https://github.com/clarketm/mergedeep/compare/v1.1.1...v1.3.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-04 09:45:49 -07:00
dependabot-preview[bot]
985e59493e
Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6 (#725)
Refs #754

Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
- [Release notes](https://github.com/Tinche/aiofiles/releases)
- [Commits](https://github.com/Tinche/aiofiles/compare/v0.4.0...v0.5.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>

Co-authored-by: dependabot-preview[bot] <27856297+dependabot-preview[bot]@users.noreply.github.com>
2020-05-04 09:17:48 -07:00
Simon Willison
d996d4122b
Add badges to documentation index 2020-05-03 08:46:49 -07:00
Simon Willison
cef23e8861 Started pattern portfolio at /-/patterns, refs #151 2020-05-02 20:05:25 -07:00
Simon Willison
b3aa5f4313 Added 'not like' table filter, refs #750 2020-05-02 12:04:54 -07:00
Simon Willison
4df1b4d8b0 Re-arranged full-text search docs
Also documented ?_searchmode=raw - closes #748
2020-04-30 14:06:00 -07:00
Simon Willison
cf2d547ffc Documentation for #747 2020-04-30 12:02:28 -07:00
Simon Willison
1d91ab71d4 Directory configuration mode supports metadata.yaml, closes #747 2020-04-30 11:47:41 -07:00
Simon Willison
d18086ae01
Changelog badge 2020-04-30 11:31:35 -07:00
Simon Willison
e37f4077c0 Remove 'Serve!' line from serve CLI output
It wasn't adding anything, and it was confusing when run in
conjunction with the new config directory mode from #731
2020-04-27 15:02:28 -07:00
Simon Willison
89c4ddd482 403 for static directory listing, closes #740 2020-04-27 11:29:04 -07:00
Simon Willison
25014ca25e
Configuration directory mode, closes #731 2020-04-27 09:30:24 -07:00
Simon Willison
1b7b66c465 Make request available when rendering custom pages, closes #738 2020-04-26 12:01:46 -07:00
Simon Willison
304e7b1d9f
Mechanism for creating custom pages using templates
Closes #648
2020-04-26 11:46:43 -07:00
Simon Willison
227bb3e91f
Added more example plugins 2020-04-22 06:47:20 -07:00
Simon Willison
8da108193b Fixed a couple of spelling errors 2020-04-21 21:06:39 -07:00
Simon Willison
edb39c91f7 Release Datasette 0.40 2020-04-21 21:00:34 -07:00
Simon Willison
15e2321804 Extra body CSS class for canned queries, closes #727 2020-04-15 14:07:28 -07:00
Simon Willison
d349d57cdf Smarter merging of metadata and extra_metadata, closes #724 2020-04-10 11:34:09 -07:00
Simon Willison
d55fe8cdfc Fixed bug with "Templates considered" comment, closes #689 2020-04-05 12:38:33 -07:00
Simon Willison
e89b0ef2f9 Expose extra_template_vars in _context=1, refs #693 2020-04-05 11:49:15 -07:00
Simon Willison
09253817de Fix for missing view_name bug, closes #716 2020-04-05 11:28:20 -07:00
Simon Willison
e0e7a0facf Removed Zeit Now v1 support, closes #710 2020-04-04 16:04:33 -07:00
Simon Willison
07e208cc6d Refactored .custom_sql() method to new QueryView class
Refs #698
2020-04-02 18:12:13 -07:00
Simon Willison
b07312c2b3 dedent SQL for neighborhood_search fixture
Makes this page a little prettier:
https://latest.datasette.io/fixtures/neighborhood_search
2020-04-02 17:54:27 -07:00
Simon Willison
6717c719dd
--metadata accepts YAML as well as JSON - closes #713 2020-04-02 12:30:53 -07:00
Simon Willison
2aaad72789 Refactor template setup into Datasette constructor
Closes #707
2020-03-26 18:12:43 -07:00
Simon Willison
6aa516d82d Run base_url tests against /fixtures/facetable too, refs #712 2020-03-25 19:31:22 -07:00
Simon Willison
99b1e91965 Fixed RST bug 2020-03-24 21:46:52 -07:00
Simon Willison
dedd775512 Release 0.39 2020-03-24 21:02:37 -07:00
Simon Willison
a351d353bc Fixed typo in GitHub Action configuration, refs #705 2020-03-24 19:30:50 -07:00
Simon Willison
90015b2689 Deploy latest.datasette.io to Cloud Run, refs #705 2020-03-24 19:26:02 -08:00
Simon Willison
cc4445801e Removed deploy to Zeit Now, refs #705 2020-03-24 19:17:27 -07:00
Simon Willison
7656fd64d8
base_url configuration setting, closes #394
* base_url configuration setting
* base_url works for static assets as well
2020-03-24 17:18:43 -07:00
Simon Willison
2a36dfa2a8 Fix for input type=search Webkit styling, closes #701 2020-03-24 15:57:09 -07:00
Simon Willison
5f4aeb1f19 Removed documentation for Zeit Now v1, refs #710 2020-03-24 15:45:24 -07:00
Simon Willison
c0aa929cdd Added datasette-publish-fly plugin to docs, closes #704 2020-03-24 15:38:58 -07:00
Simon Willison
3b3cb3e8df Added example plugins to plugin hooks docs, closes #709 2020-03-24 15:29:34 -07:00
Simon Willison
a498d0fe65 Fix bug with over-riding default sort, closes #702 2020-03-21 19:40:29 -07:00
Simon Willison
236aa065b2 "sort" and "sort_desc" metadata properties, closes #702 2020-03-21 19:28:35 -07:00
Simon Willison
e1a817411a Bump to click 7.1.1 to fix flaky tests 2020-03-21 18:47:51 -07:00
Simon Willison
3d656f4b31 Updated documentation formatting 2020-03-21 18:31:54 -07:00
Simon Willison
2c0e1e09bc Show sort arrow on primary key by default
Closes #677. Refs #702.
2020-03-21 16:57:37 -07:00
Simon Willison
fd2a74dc09
Updated publish_subcommand example 2020-03-18 17:47:53 -07:00
Simon Willison
a000c80d50 await Request(scope, receive).post_vars() method, closes #700
Needed for #698
2020-03-16 19:47:37 -07:00
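The awaitable `post_vars()` method added here reads the ASGI request body via `receive()` and parses it as form-encoded data. A self-contained sketch of that idea (a standalone illustration, not Datasette's actual `Request` class):

```python
import asyncio
from urllib.parse import parse_qsl

class Request:
    def __init__(self, scope, receive):
        self.scope = scope
        self.receive = receive

    async def post_vars(self):
        # Accumulate body chunks until more_body is false
        body = b""
        more = True
        while more:
            message = await self.receive()
            body += message.get("body", b"")
            more = message.get("more_body", False)
        return dict(parse_qsl(body.decode("utf-8")))

async def demo():
    # Fake ASGI receive() that yields a single body message
    messages = [{"type": "http.request", "body": b"name=simon&x=1"}]

    async def receive():
        return messages.pop(0)

    return await Request({"type": "http"}, receive).post_vars()

print(asyncio.run(demo()))  # {'name': 'simon', 'x': '1'}
```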
Simon Willison
7e357abbc3 Release 0.38 2020-03-08 16:26:50 -07:00
Simon Willison
e1b5339fdf Do not look for templates_path in default plugins
Closes #697
2020-03-08 16:11:18 -07:00
Simon Willison
7508477a96
Link to Datasette Writes blog entry 2020-03-08 10:23:51 -07:00
Simon Willison
f7f31a0223 Upgrade Dockerfile to SQLite 3.31.1, closes #695 2020-03-06 00:15:19 -06:00
Simon Willison
af9cd4ca64 Fixes for new --memory option, refs #694 2020-03-05 17:44:15 -06:00
Simon Willison
ddd11b3ddd --memory option for publish cloudrun, refs #694 2020-03-05 17:34:36 -06:00
Simon Willison
be20e6991e Changelog for 0.37.1 2020-03-02 19:43:08 -08:00
Simon Willison
b796519da2 Print exceptions if they occur in the write thread 2020-03-02 17:59:29 -08:00
Simon Willison
dc80e779a2 Handle scope path if it is a string
I ran into this while running a unit test with httpx.AsyncClient
2020-03-02 15:34:04 -08:00
Simon Willison
4933035b75
RST fix 2020-03-02 08:10:16 -08:00
Simon Willison
613f6fad72
Improved extra_template_vars documentation 2020-03-02 07:12:34 -08:00
Simon Willison
7f5a330377
Don't count rows on homepage for DBs > 100MB (#688)
Closes #649.
2020-02-28 17:08:29 -08:00
Simon Willison
0f8e91c68f
Documentation fix 2020-02-25 23:13:39 -08:00
Simon Willison
1a77f30d3c
Fixed typo 2020-02-25 23:11:19 -08:00
Simon Willison
c9e6841482 News and release notes for 0.37 2020-02-25 17:22:02 -08:00
Simon Willison
78198df668 Fixed incorrect target name 2020-02-25 17:10:30 -08:00
Kevin Keogh
3041c6b641
Use inspect-file, if possible, for total row count (#666)
For large tables, counting the number of rows can take a
significant amount of time. Instead, where an inspect-file is provided
for an immutable database, look up the pre-computed row count there
rather than running a plain count(*).

Thanks, @kevindkeogh
2020-02-25 12:19:29 -08:00
Simon Willison
6cb65555f4
?_searchmode=raw option (#686) 2020-02-24 21:56:03 -08:00
Simon Willison
a093c5f79f
.execute_write() and .execute_write_fn() methods on Database (#683)
Closes #682.
2020-02-24 20:45:07 -08:00
Simon Willison
411056c4c4 Only --reload on changes to immutable databases, closes #494 2020-02-24 11:44:59 -08:00
Simon Willison
b031fe9763 Updated README news for 0.36 2020-02-21 19:04:46 -08:00
Simon Willison
962a7e16e5 Release notes for 0.36, refs #679 2020-02-21 19:01:57 -08:00
Simon Willison
d6335f1f31 Added shapefile-to-sqlite, datasette-mask-columns, datasette-auth-existing-cookies
Refs #679
2020-02-21 18:53:35 -08:00
Adrien Di Pasquale
be2265b0e8
Fix db-to-sqlite command in ecosystem doc page (#669)
Thanks, @adipasquale
2020-02-21 18:32:17 -08:00
Simon Willison
7c6a9c3529 Better tests for prepare_connection() plugin hook, refs #678 2020-02-21 18:27:07 -08:00
Simon Willison
6303ea5048 prepare_connection() now takes datasette and database args, refs #678 2020-02-21 17:32:40 -08:00
Simon Willison
d3f2fade88 Refactored run_sanity_checks to check_connection(conn), refs #674 2020-02-15 09:56:48 -08:00
Simon Willison
f1442a8151 Replaced self.ds.execute with db.execute in more places 2020-02-13 18:20:05 -08:00
Simon Willison
efa54b439f Docs for .render_template(), refs #577
Also improved parameter documentation for other methods, refs #576
2020-02-13 17:58:32 -08:00
Simon Willison
3ffb8f3b98 .add_database() and .remove_database() methods, refs #671
Also made a start on the Datasette class documentation, refs #576
2020-02-13 17:27:57 -08:00
Simon Willison
cf5f4386ef Run black against everything, not just tests and datasette dirs 2020-02-13 15:02:10 -08:00
Simon Willison
b38a792ef0 Apply Black, update copyright to be 2017-2020 2020-02-13 15:01:14 -08:00
Simon Willison
0091dfe3e5 More reliable tie-break ordering for facet results
I was seeing a weird bug where the order of results running tests
on my laptop was inconsistent, causing pytest failures even though
the order of tests in Travis CI was fine.

I think the fix is to explicitly state how facet ordering ties on
the count should be resolved.
2020-02-12 22:36:42 -08:00
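The tie-break fix described above amounts to ordering on the facet value as well as the count, so equal counts always come back in the same order. A sketch using an assumed simplified query (not Datasette's exact facet SQL):

```python
import sqlite3

# Two colors with identical counts: without a secondary sort key the
# order of the tied rows is implementation-defined and can differ
# between machines, which is exactly the flaky-test symptom above.
conn = sqlite3.connect(":memory:")
conn.execute("create table t (color text)")
conn.executemany("insert into t values (?)", [("red",), ("blue",), ("red",), ("blue",)])

rows = conn.execute(
    "select color, count(*) as n from t group by color "
    "order by n desc, color"  # explicit tie-break on the value itself
).fetchall()
print(rows)  # [('blue', 2), ('red', 2)] regardless of insertion or scan order
```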
Simon Willison
298a899e79 Reformatted with black 2020-02-12 22:05:46 -08:00
Simon Willison
30b6f71b30 Updated release notes with #653 2020-02-04 18:17:47 -08:00
Jay Graves
33a12c8ae5
Allow leading comments in SQL input field (#653)
Thanks, @jaywgraves!
2020-02-04 18:13:24 -08:00
Simon Willison
ce12244037 Release notes for 0.35 2020-02-04 18:02:32 -08:00
Simon Willison
4d7dae9eb7
Added a bunch more plugins to the Ecosystem page 2020-02-04 12:49:41 -08:00
Simon Willison
70b915fb4b
Datasette.render_template() method, closes #577
Pull request #664.
2020-02-04 12:26:17 -08:00
Simon Willison
286ed286b6
geojson-to-sqlite 2020-01-30 23:09:56 -08:00
Simon Willison
e7f60d2a9b Release notes for Datasette 0.34, plus news updates 2020-01-29 16:09:01 -08:00
Simon Willison
67fc9c5720
--port argument for datasette package, plus tests - closes #661
From pull request #663
2020-01-29 14:46:43 -08:00
Katie McLaughlin
34d77d780f gcloud run is now GA, s/beta// (#660)
Thanks, @glasnt
2020-01-21 15:28:11 -08:00
Simon Willison
3c861f363d _search= queries now correctly escaped, fixes #651
Queries with reserved words or characters according to the SQLite
FTS5 query language could cause errors.

Queries are now escaped like so:

    dog cat => "dog" "cat"
2019-12-29 18:48:30 +00:00
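The `dog cat => "dog" "cat"` escaping above can be sketched as follows. Datasette has an `escape_fts()` utility for this; the implementation below is a simplified assumption, not the actual function body:

```python
# Wrap each whitespace-separated term in double quotes, doubling any
# embedded quotes, so FTS5 reserved words (AND, OR, NOT) and operator
# characters are treated as literal terms instead of query syntax.

def escape_fts(query: str) -> str:
    return " ".join(
        '"{}"'.format(term.replace('"', '""')) for term in query.split()
    )

print(escape_fts("dog cat"))  # "dog" "cat"
print(escape_fts("dog-cat"))  # "dog-cat"
```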
Simon Willison
59e7014c8a Release 0.33 2019-12-22 16:27:04 +00:00
Simon Willison
dc98b0f41d Link to JSK Medium post from news 2019-12-22 16:16:58 +00:00
Simon Willison
d54318fc7f Added template_debug setting, closes #654 2019-12-22 16:04:45 +00:00
Simon Willison
ceef5ce684 Documentation for --port=0 2019-12-22 15:42:30 +00:00
Simon Willison
85c19c4037 Apply black 2019-12-22 15:34:20 +00:00
Simon Willison
9c3f0b73de Bump to uvicorn 0.11 2019-12-22 15:33:04 +00:00
Simon Willison
16665c9ee6 Better handling of corrupted database files 2019-12-22 15:31:40 +00:00
Simon Willison
d6b6c9171f Include asyncio task information in /-/threads debug page 2019-12-04 22:47:17 -08:00
Simon Willison
2039e78e58
Added Niche Museums to News 2019-12-02 22:53:59 -08:00
Simon Willison
a562f29655 Examples of things you can do with plugins 2019-11-27 11:19:11 -08:00
Simon Willison
f9d0ce4233
Added datasette-haversine to plugins list 2019-11-27 06:04:32 -08:00
Simon Willison
df2879ee2a Better documentation for --static, closes #641
https://datasette.readthedocs.io/en/stable/custom_templates.html#serving-static-files
2019-11-25 18:31:42 -08:00
Simon Willison
aca41618f8 index view is also important for plugin hooks 2019-11-25 09:04:39 -08:00
Simon Willison
d3e1c3017e Display 0 results, closes #637 2019-11-22 22:07:01 -08:00
Simon Willison
fd137da7f8 Suggest column facet only if at least one count > 1
Fixes #638
2019-11-21 16:56:55 -08:00
Simon Willison
c16be14517
How to upgrade using Docker 2019-11-20 10:02:07 -08:00
Simon Willison
440a70428c Include rowid in filter select, closes #636 2019-11-19 15:01:10 -08:00
Simon Willison
a9909c29cc Move .execute() from Datasette to Database
Refs #569 - I split this change out from #579
2019-11-15 14:52:03 -08:00
Simon Willison
8fc9a5d877
Datasette 0.32 and datasette-template-sql in news 2019-11-14 15:46:37 -08:00
Simon Willison
a95bedb9c4 Release notes for 0.32 2019-11-14 15:20:21 -08:00
Simon Willison
8c642f04e0
Render templates using Jinja async mode
Closes #628
2019-11-14 15:14:22 -08:00
Simon Willison
b51f258d00 Release notes for 0.31.2 2019-11-13 08:48:36 -08:00
Simon Willison
f524510230 Fix "publish heroku" + upgrade to use Python 3.8.0
Closes #633. Closes #632.
2019-11-13 08:42:47 -08:00
Stanley Zheng
848dec4deb Fix for datasette publish with just --source_url (#631)
Closes #572
2019-11-12 20:28:42 -08:00
Simon Willison
bbd00e903c
Badge linking to datasette on hub.docker.com 2019-11-12 18:38:13 -08:00
Simon Willison
a22c7761b6 Fixed typo in release notes 2019-11-12 18:18:39 -08:00
Simon Willison
16265f6a1a Release notes for 0.31.1 2019-11-12 18:18:04 -08:00
Simon Willison
d977fbadf7
datasette publish uses python:3.8 base Docker image, closes #629 2019-11-11 22:03:09 -08:00
Simon Willison
f554be39fc
ReST fix 2019-11-11 22:00:13 -08:00
Simon Willison
1c518680e9
Final steps: build stable branch of Read The Docs 2019-11-11 21:57:48 -08:00
Simon Willison
7f89928062 Removed code that conditionally installs black
Since we no longer support Python 3.5 we don't need this any more.
2019-11-11 21:33:51 -08:00
Simon Willison
c633c035dc Datasette 0.31 in news section 2019-11-11 21:26:56 -08:00
Simon Willison
76fc6a9c73 Release notes for 0.31 2019-11-11 21:18:17 -08:00
Simon Willison
cf7776d36f
Support Python 3.8, stop supporting Python 3.5 (#627)
* Upgrade to uvicorn 0.10.4
* Drop support for Python 3.5
* Bump all dependencies to latest releases
* Update docs to reflect we no longer support 3.5
* Removed code that skipped black unit test on 3.5

Closes #622
2019-11-11 21:09:11 -08:00
Simon Willison
5bc2570121 Include uvicorn version in /-/versions, refs #622 2019-11-11 20:45:12 -08:00
Simon Willison
42ee3e16a9
Bump pint to 0.9 (#624)
This fixes 2 deprecation warnings in Python 3.8 - refs #623 #622
2019-11-10 20:19:01 -08:00
Simon Willison
1c063fae9d
Test against Python 3.8 in Travis (#623)
* Test against Python 3.8 in Travis
* Avoid current_task warnings in Python 3.8
2019-11-10 19:45:34 -08:00
Simon Willison
28c4a6db5b CREATE INDEX statements on table page, closes #618 2019-11-09 17:29:36 -08:00
Simon Willison
10b9d85eda
datasette-csvs on Glitch now uses sqlite-utils
It previously used csvs-to-sqlite but that had heavy dependencies.

See https://support.glitch.com/t/can-you-upgrade-python-to-latest-version/7980/33
2019-11-08 18:15:13 -08:00
Simon Willison
9f5d19c254
Improved documentation for "publish cloudrun" 2019-11-08 18:12:20 -08:00
Simon Willison
83fc5165ac Improved UI for publish cloudrun, closes #608 2019-11-07 18:48:39 -08:00
Simon Willison
f9c146b893 Removed unused special_args_lists variable 2019-11-06 16:55:44 -08:00
Simon Willison
c30f07c58e Removed _group_count=col feature, closes #504 2019-11-05 21:12:55 -08:00
Tobias Kunze
931bfc6661 Handle spaces in DB names (#590)
Closes #503 - thanks, @rixx
2019-11-04 15:16:30 -08:00
Simon Willison
52fa79c607 Use select colnames, not select * for table view - refs #615 2019-11-04 15:03:48 -08:00
Simon Willison
9db22cdf18 pk__notin= filter, closes #614 2019-11-03 20:11:55 -08:00
Tobias Kunze
ee330222f4 Offer to format readonly SQL (#602)
Following discussion in #601, this PR adds a "Format SQL" button to
read-only SQL (if the SQL actually differs from the formatting result).

It also removes a console error on readonly SQL queries.

Thanks, @rixx!
2019-11-03 18:39:55 -08:00
Simon Willison
2bf7ce5f51 Fix CSV export for nullable foreign keys, closes #612 2019-11-02 16:12:46 -07:00
Simon Willison
c3181d9a84 Release notes for 0.30.2 2019-11-02 15:47:20 -07:00
Simon Willison
14da70525b Don't show 'None' as label for nullable foreign key, closes #406 2019-11-02 15:29:40 -07:00
Simon Willison
ed57e4f990 Plugin static assets support both hyphens and underscores in names
Closes #611
2019-11-01 15:15:10 -07:00
Simon Willison
ffae2f0ecd Better documentation of --host, closes #574 2019-11-01 14:57:49 -07:00
Simon Willison
7152e76eda Don't suggest array facet if column is only [], closes #610 2019-11-01 14:45:59 -07:00
Simon Willison
ba5414f16b Only inspect first 100 records for #562 2019-11-01 12:38:15 -07:00
Simon Willison
50287e7c6b Only suggest array facet for arrays of strings - closes #562 2019-11-01 12:37:46 -07:00
Simon Willison
937828f946 Use distinfo.project_name for plugin name if available, closes #606 2019-10-31 22:39:59 -07:00
Simon Willison
3ca290e0db Fixed dumb error 2019-10-30 12:00:21 -07:00
Simon Willison
f5f6cbe03c Release 0.30.1 2019-10-30 11:56:04 -07:00
Simon Willison
e2c390500e Persist _where= in hidden fields, closes #604 2019-10-30 11:49:26 -07:00
Simon Willison
5dd4d2b2d3
Update to latest black (#609) 2019-10-30 11:49:01 -07:00
chris48s
f4c0830529 Always pop as_format off args dict (#603)
Closes #563. Thanks, @chris48s
2019-10-20 19:03:08 -07:00
Simon Willison
8050f9e1ec Update news in README 2019-10-18 18:08:04 -07:00
Simon Willison
debea4f971 Release 0.30 2019-10-18 18:06:37 -07:00
Simon Willison
e877b1cb12
Don't auto-format SQL on page load (#601)
Closes #600
2019-10-18 16:56:44 -07:00
Simon Willison
b647b5efc2
Fix for /foo v.s. /foo-bar issue, closes #597
Pull request #599
2019-10-18 15:51:07 -07:00
Simon Willison
b6ad1fdc70 Fixed bug returning non-ascii characters in CSV, closes #584 2019-10-17 22:23:01 -07:00
Simon Willison
3e864b1625 Use --platform=managed for publish cloudrun, closes #587 2019-10-17 14:51:45 -07:00
Simon Willison
9366d0bf19
Add Python versions badge 2019-10-14 15:29:16 -07:00
Tobias Kunze
12cec411ca Display metadata footer on custom SQL queries (#589)
Closes #408 - thanks, @rixx!
2019-10-13 20:53:21 -07:00
Tobias Kunze
908fc3999e Sort databases on homepage by argument order - #591
Closes #585 - thanks, @rixx!
2019-10-13 20:52:33 -07:00
Tobias Kunze
af2e6a5cf1 Button to format SQL, closes #136
SQL code will be formatted on page load, and can additionally
be formatted by clicking the "Format SQL" button.

Thanks, @rixx!
2019-10-13 20:46:12 -07:00
Simon Willison
fffd69ec03 Allow EXPLAIN WITH... - closes #583 2019-10-06 10:23:58 -07:00
Simon Willison
a314b76186 Added /-/threads debugging page 2019-10-02 08:35:25 -07:00
Simon Willison
0fc8afde0e Changelog for 0.29.3 release 2019-09-02 17:40:53 -07:00
Simon Willison
2dc5c8dc25
detect_fts now works with alternative table escaping (#571)
Fixes #570. See also https://github.com/simonw/sqlite-utils/pull/57
2019-09-02 17:32:27 -07:00
Simon Willison
f04deebec4 Refactored connection logic to database.connect() 2019-07-26 13:22:57 +03:00
Min ho Kim
27cb29365c Fix numerous typos (#561)
Thanks, @minho42!
2019-07-26 13:25:44 +03:00
Simon Willison
a9453c4dda Fixed CodeMirror on database page, closes #560 2019-07-13 20:38:40 -07:00
Simon Willison
6abe6faff6 Release 0.29.2 2019-07-13 20:04:05 -07:00
Simon Willison
90d4f497f9 Fix plus test for unicode characters in custom query name, closes #558 2019-07-13 19:49:24 -07:00
Simon Willison
5ed450a332 Fixed breadcrumbs on custom query page 2019-07-13 19:05:58 -07:00
Simon Willison
afc2e4260a
News: Single sign-on against GitHub using ASGI middleware 2019-07-13 18:42:35 -07:00
Simon Willison
d224ee2c98
Bump to uvicorn 0.8.4 (#559)
https://github.com/encode/uvicorn/commits/0.8.4

Query strings will now be included in log files: https://github.com/encode/uvicorn/pull/384
2019-07-13 15:34:57 -07:00
Simon Willison
f2006cca80 Updated release notes 2019-07-11 09:27:28 -07:00
Simon Willison
2a94f3719f Release 0.29.1 2019-07-11 09:17:55 -07:00
Simon Willison
cc27857c72 Removed unused variable 2019-07-11 09:14:24 -07:00
Abdus
74ecf8a7cc Fix static mounts using relative paths and prevent traversal exploits (#554)
Thanks, @abdusco! Closes #555
2019-07-11 09:13:19 -07:00
Abdus
9ca860e54f Add support for running datasette as a module (#556)
python -m datasette

Thanks, @abdusco
2019-07-11 09:07:44 -07:00
Simon Willison
81fa8b6cdc
News: Datasette 0.29, datasette-auth-github, datasette-cors 2019-07-07 21:36:27 -07:00
Simon Willison
fb7ee8e0ad Changelog for 0.29 release 2019-07-07 20:14:27 -07:00
Simon Willison
973f8f139d
--plugin-secret option for datasette publish
Closes #543

Also added new --show-files option to publish now and publish cloudrun - handy for debugging.
2019-07-07 19:06:31 -07:00
Simon Willison
2d04986c44 Added datasette-auth-github and datasette-cors plugins to Ecosystem
Closes #548
2019-07-07 19:02:27 -07:00
Simon Willison
aa4cc99c02 Removed facet-by-m2m from docs, refs #550
Will bring this back in #551
2019-07-07 18:22:05 -07:00
Simon Willison
c5542abba5 Removed ManyToManyFacet for the moment, closes #550 2019-07-07 16:21:11 -07:00
Simon Willison
9998f92cc0 Updated custom facet docs, closes #482 2019-07-07 16:19:02 -07:00
Simon Willison
912ce848b9 Fix nav display on 500 page, closes #545 2019-07-07 13:26:45 -07:00
Simon Willison
787dd427de white-space: pre-wrap for table SQL, closes #505 2019-07-07 13:26:38 -07:00
Simon Willison
f80ff9b07b min-height on .hd
Now it should be the same size on the homepage as it is on pages with breadcrumbs
2019-07-07 13:16:48 -07:00
Katie McLaughlin
d95048031e Split pypi and docker travis tasks (#480)
Thanks @glasnt!
2019-07-07 13:03:19 -07:00
Simon Willison
fcfcae21e6
extra_template_vars plugin hook (#542)
* extra_template_vars plugin hook

Closes #541

* Workaround for cwd bug

Based on https://github.com/pytest-dev/pytest/issues/1235#issuecomment-175295691
2019-07-05 17:05:56 -07:00
278 changed files with 56380 additions and 19915 deletions

.coveragerc Normal file

@ -0,0 +1,2 @@
[run]
omit = datasette/_version.py, datasette/utils/shutil_backport.py


@ -3,10 +3,11 @@
 .eggs
 .gitignore
 .ipynb_checkpoints
-.travis.yml
 build
 *.spec
 *.egg-info
 dist
 scratchpad
 venv
+*.db
+*.sqlite

.git-blame-ignore-revs Normal file

@ -0,0 +1,4 @@
# Applying Black
35d6ee2790e41e96f243c1ff58be0c9c0519a8ce
368638555160fb9ac78f462d0f79b1394163fa30
2b344f6a34d2adaa305996a1a580ece06397f6e4

.gitattributes vendored

@ -1,2 +1 @@
-datasette/_version.py export-subst
 datasette/static/codemirror-* linguist-vendored

.github/FUNDING.yml vendored Normal file

@ -0,0 +1 @@
github: [simonw]

.github/dependabot.yml vendored Normal file

@ -0,0 +1,11 @@
version: 2
updates:
  - package-ecosystem: pip
    directory: "/"
    schedule:
      interval: daily
      time: "13:00"
    groups:
      python-packages:
        patterns:
          - "*"


@ -0,0 +1,35 @@
name: Deploy a Datasette branch preview to Vercel
on:
  workflow_dispatch:
    inputs:
      branch:
        description: "Branch to deploy"
        required: true
        type: string
jobs:
  deploy-branch-preview:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.11
        uses: actions/setup-python@v6
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: |
          pip install datasette-publish-vercel
      - name: Deploy the preview
        env:
          VERCEL_TOKEN: ${{ secrets.BRANCH_PREVIEW_VERCEL_TOKEN }}
        run: |
          export BRANCH="${{ github.event.inputs.branch }}"
          wget https://latest.datasette.io/fixtures.db
          datasette publish vercel fixtures.db \
            --branch $BRANCH \
            --project "datasette-preview-$BRANCH" \
            --token $VERCEL_TOKEN \
            --scope datasette \
            --about "Preview of $BRANCH" \
            --about_url "https://github.com/simonw/datasette/tree/$BRANCH"

.github/workflows/deploy-latest.yml vendored Normal file

@ -0,0 +1,132 @@
name: Deploy latest.datasette.io
on:
workflow_dispatch:
push:
branches:
- main
# - 1.0-dev
permissions:
contents: read
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- name: Check out datasette
uses: actions/checkout@v5
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.13"
cache: pip
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip
python -m pip install -e .[test]
python -m pip install -e .[docs]
python -m pip install sphinx-to-sqlite==0.1a1
- name: Run tests
if: ${{ github.ref == 'refs/heads/main' }}
run: |
pytest -n auto -m "not serial"
pytest -m "serial"
- name: Build fixtures.db and other files needed to deploy the demo
run: |-
python tests/fixtures.py \
fixtures.db \
fixtures-config.json \
fixtures-metadata.json \
plugins \
--extra-db-filename extra_database.db
- name: Build docs.db
if: ${{ github.ref == 'refs/heads/main' }}
run: |-
cd docs
DISABLE_SPHINX_INLINE_TABS=1 sphinx-build -b xml . _build
sphinx-to-sqlite ../docs.db _build
cd ..
- name: Set up the alternate-route demo
run: |
echo '
from datasette import hookimpl
@hookimpl
def startup(datasette):
db = datasette.get_database("fixtures2")
db.route = "alternative-route"
' > plugins/alternative_route.py
cp fixtures.db fixtures2.db
- name: And the counters writable canned query demo
run: |
cat > plugins/counters.py <<EOF
from datasette import hookimpl
@hookimpl
def startup(datasette):
db = datasette.add_memory_database("counters")
async def inner():
await db.execute_write("create table if not exists counters (name text primary key, value integer)")
await db.execute_write("insert or ignore into counters (name, value) values ('counter_a', 0)")
await db.execute_write("insert or ignore into counters (name, value) values ('counter_b', 0)")
await db.execute_write("insert or ignore into counters (name, value) values ('counter_c', 0)")
return inner
@hookimpl
def canned_queries(database):
if database == "counters":
queries = {}
for name in ("counter_a", "counter_b", "counter_c"):
queries["increment_{}".format(name)] = {
"sql": "update counters set value = value + 1 where name = '{}'".format(name),
"on_success_message_sql": "select 'Counter {name} incremented to ' || value from counters where name = '{name}'".format(name=name),
"write": True,
}
queries["decrement_{}".format(name)] = {
"sql": "update counters set value = value - 1 where name = '{}'".format(name),
"on_success_message_sql": "select 'Counter {name} decremented to ' || value from counters where name = '{name}'".format(name=name),
"write": True,
}
return queries
EOF
# - name: Make some modifications to metadata.json
# run: |
# cat fixtures.json | \
# jq '.databases |= . + {"ephemeral": {"allow": {"id": "*"}}}' | \
# jq '.plugins |= . + {"datasette-ephemeral-tables": {"table_ttl": 900}}' \
# > metadata.json
# cat metadata.json
- id: auth
name: Authenticate to Google Cloud
uses: google-github-actions/auth@v3
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}
- name: Set up Cloud SDK
uses: google-github-actions/setup-gcloud@v3
- name: Deploy to Cloud Run
env:
LATEST_DATASETTE_SECRET: ${{ secrets.LATEST_DATASETTE_SECRET }}
run: |-
gcloud config set run/region us-central1
gcloud config set project datasette-222320
export SUFFIX="-${GITHUB_REF#refs/heads/}"
export SUFFIX=${SUFFIX#-main}
# Replace 1.0 with one-dot-zero in SUFFIX
export SUFFIX=${SUFFIX//1.0/one-dot-zero}
datasette publish cloudrun fixtures.db fixtures2.db extra_database.db \
-m fixtures-metadata.json \
--plugins-dir=plugins \
--branch=$GITHUB_SHA \
--version-note=$GITHUB_SHA \
--extra-options="--setting template_debug 1 --setting trace_debug 1 --crossdb" \
--install 'datasette-ephemeral-tables>=0.2.2' \
--service "datasette-latest$SUFFIX" \
--secret $LATEST_DATASETTE_SECRET
- name: Deploy to docs as well (only for main)
if: ${{ github.ref == 'refs/heads/main' }}
run: |-
# Deploy docs.db to a different service
datasette publish cloudrun docs.db \
--branch=$GITHUB_SHA \
--version-note=$GITHUB_SHA \
--extra-options="--setting template_debug 1" \
--service=datasette-docs-latest
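The `SUFFIX` logic in the deploy step above can be mirrored in Python to see which Cloud Run service names it produces. This is a sketch of the shell parameter expansions, not part of the workflow itself:

```python
def branch_suffix(github_ref: str) -> str:
    # "-${GITHUB_REF#refs/heads/}" -> prefix the branch name with "-"
    suffix = "-" + github_ref.removeprefix("refs/heads/")
    # ${SUFFIX#-main} -> strip a leading "-main", so main deploys with no suffix
    suffix = suffix.removeprefix("-main")
    # ${SUFFIX//1.0/one-dot-zero} -> service names cannot contain dots
    return suffix.replace("1.0", "one-dot-zero")

# The Cloud Run service targeted by "datasette-latest$SUFFIX":
print("datasette-latest" + branch_suffix("refs/heads/main"))     # datasette-latest
print("datasette-latest" + branch_suffix("refs/heads/1.0-dev"))  # datasette-latest-one-dot-zero-dev
```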


@ -0,0 +1,16 @@
name: Read the Docs Pull Request Preview
on:
pull_request_target:
types:
- opened
permissions:
pull-requests: write
jobs:
documentation-links:
runs-on: ubuntu-latest
steps:
- uses: readthedocs/actions/preview@v1
with:
project-slug: "datasette"

.github/workflows/prettier.yml vendored Normal file

@ -0,0 +1,25 @@
name: Check JavaScript for conformance with Prettier
on: [push]
permissions:
contents: read
jobs:
prettier:
runs-on: ubuntu-latest
steps:
- name: Check out repo
uses: actions/checkout@v4
- uses: actions/cache@v4
name: Configure npm caching
with:
path: ~/.npm
key: ${{ runner.OS }}-npm-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.OS }}-npm-
- name: Install dependencies
run: npm ci
- name: Run prettier
run: |-
npm run prettier -- --check

.github/workflows/publish.yml vendored Normal file

@ -0,0 +1,109 @@
name: Publish Python Package
on:
release:
types: [created]
permissions:
contents: read
jobs:
test:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.10", "3.11", "3.12", "3.13", "3.14"]
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v6
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: pyproject.toml
- name: Install dependencies
run: |
pip install -e '.[test]'
- name: Run tests
run: |
pytest
deploy:
runs-on: ubuntu-latest
needs: [test]
environment: release
permissions:
id-token: write
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: '3.13'
cache: pip
cache-dependency-path: pyproject.toml
- name: Install dependencies
run: |
pip install setuptools wheel build
- name: Build
run: |
python -m build
- name: Publish
uses: pypa/gh-action-pypi-publish@release/v1
deploy_static_docs:
runs-on: ubuntu-latest
needs: [deploy]
if: "!github.event.release.prerelease"
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: '3.10'
cache: pip
cache-dependency-path: pyproject.toml
- name: Install dependencies
run: |
python -m pip install -e .[docs]
python -m pip install sphinx-to-sqlite==0.1a1
- name: Build docs.db
run: |-
cd docs
DISABLE_SPHINX_INLINE_TABS=1 sphinx-build -b xml . _build
sphinx-to-sqlite ../docs.db _build
cd ..
- id: auth
name: Authenticate to Google Cloud
uses: google-github-actions/auth@v2
with:
credentials_json: ${{ secrets.GCP_SA_KEY }}
- name: Set up Cloud SDK
uses: google-github-actions/setup-gcloud@v3
- name: Deploy stable-docs.datasette.io to Cloud Run
run: |-
gcloud config set run/region us-central1
gcloud config set project datasette-222320
datasette publish cloudrun docs.db \
--service=datasette-docs-stable
deploy_docker:
runs-on: ubuntu-latest
needs: [deploy]
if: "!github.event.release.prerelease"
steps:
- uses: actions/checkout@v4
- name: Build and push to Docker Hub
env:
DOCKER_USER: ${{ secrets.DOCKER_USER }}
DOCKER_PASS: ${{ secrets.DOCKER_PASS }}
run: |-
sleep 60 # Give PyPI time to make the new release available
docker login -u $DOCKER_USER -p $DOCKER_PASS
export REPO=datasetteproject/datasette
docker build -f Dockerfile \
-t $REPO:${GITHUB_REF#refs/tags/} \
--build-arg VERSION=${GITHUB_REF#refs/tags/} .
docker tag $REPO:${GITHUB_REF#refs/tags/} $REPO:latest
docker push $REPO:${GITHUB_REF#refs/tags/}
docker push $REPO:latest
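The tag handling in that last step can be expressed compactly; this sketch (not part of the workflow) shows which Docker tags a release ref produces:

```python
def docker_tags(github_ref: str) -> list[str]:
    # ${GITHUB_REF#refs/tags/} strips the ref prefix, leaving the release tag;
    # the image is then pushed under both that version and :latest.
    version = github_ref.removeprefix("refs/tags/")
    repo = "datasetteproject/datasette"
    return [f"{repo}:{version}", f"{repo}:latest"]

print(docker_tags("refs/tags/0.64.8"))
```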

.github/workflows/push_docker_tag.yml vendored Normal file

@ -0,0 +1,28 @@
name: Push specific Docker tag
on:
workflow_dispatch:
inputs:
version_tag:
description: Tag to build and push
permissions:
contents: read
jobs:
deploy_docker:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Build and push to Docker Hub
env:
DOCKER_USER: ${{ secrets.DOCKER_USER }}
DOCKER_PASS: ${{ secrets.DOCKER_PASS }}
VERSION_TAG: ${{ github.event.inputs.version_tag }}
run: |-
docker login -u $DOCKER_USER -p $DOCKER_PASS
export REPO=datasetteproject/datasette
docker build -f Dockerfile \
-t $REPO:${VERSION_TAG} \
--build-arg VERSION=${VERSION_TAG} .
docker push $REPO:${VERSION_TAG}

.github/workflows/spellcheck.yml vendored Normal file

@ -0,0 +1,27 @@
name: Check spelling in documentation
on: [push, pull_request]
permissions:
contents: read
jobs:
spellcheck:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: '3.11'
cache: 'pip'
cache-dependency-path: '**/pyproject.toml'
- name: Install dependencies
run: |
pip install -e '.[docs]'
- name: Check spelling
run: |
codespell README.md --ignore-words docs/codespell-ignore-words.txt
codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
codespell tests --ignore-words docs/codespell-ignore-words.txt

.github/workflows/stable-docs.yml vendored Normal file

@ -0,0 +1,76 @@
name: Update Stable Docs
on:
release:
types: [published]
push:
branches:
- main
permissions:
contents: write
jobs:
update_stable_docs:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v5
with:
fetch-depth: 0 # We need all commits to find docs/ changes
- name: Set up Git user
run: |
git config user.name "Automated"
git config user.email "actions@users.noreply.github.com"
- name: Create stable branch if it does not yet exist
run: |
if ! git ls-remote --heads origin stable | grep -qE '\bstable\b'; then
# Make sure we have all tags locally
git fetch --tags --quiet
# Latest tag that is just numbers and dots (optionally prefixed with 'v')
# e.g., 0.65.2 or v0.65.2 — excludes 1.0a20, 1.0-rc1, etc.
LATEST_RELEASE=$(
git tag -l --sort=-v:refname \
| grep -E '^v?[0-9]+(\.[0-9]+){1,3}$' \
| head -n1
)
git checkout -b stable
# If there are any stable releases, copy docs/ from the most recent
if [ -n "$LATEST_RELEASE" ]; then
rm -rf docs/
git checkout "$LATEST_RELEASE" -- docs/ || true
fi
git commit -m "Populate docs/ from $LATEST_RELEASE" || echo "No changes"
git push -u origin stable
fi
- name: Handle Release
if: github.event_name == 'release' && !github.event.release.prerelease
run: |
git fetch --all
git checkout stable
git reset --hard ${GITHUB_REF#refs/tags/}
git push origin stable --force
- name: Handle Commit to Main
if: contains(github.event.head_commit.message, '!stable-docs')
run: |
git fetch origin
git checkout -b stable origin/stable
# Get the list of modified files in docs/ from the current commit
FILES=$(git diff-tree --no-commit-id --name-only -r ${{ github.sha }} -- docs/)
# Check if the list of files is non-empty
if [[ -n "$FILES" ]]; then
# Checkout those files to the stable branch to over-write with their contents
for FILE in $FILES; do
git checkout ${{ github.sha }} -- $FILE
done
git add docs/
git commit -m "Doc changes from ${{ github.sha }}"
git push origin stable
else
echo "No changes to docs/ in this commit."
exit 0
fi
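The grep pattern used to pick `LATEST_RELEASE` is the interesting part of this workflow: it accepts plain version tags while rejecting pre-releases. Here is a Python sketch of the same filter; the workflow itself relies on git's `--sort=-v:refname`, and the key function below only approximates that ordering by comparing numeric components:

```python
import re

# Matches 0.65.2 or v0.65.2; rejects pre-release tags like 1.0a20 or 1.0-rc1.
STABLE_TAG = re.compile(r"^v?[0-9]+(\.[0-9]+){1,3}$")

def latest_stable(tags):
    stable = [t for t in tags if STABLE_TAG.match(t)]
    # Approximate git's version sort by comparing numeric components.
    return max(stable, key=lambda t: [int(p) for p in t.lstrip("v").split(".")], default=None)

print(latest_stable(["1.0a20", "0.64.6", "v0.65.2", "1.0-rc1"]))  # v0.65.2
```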

.github/workflows/test-coverage.yml vendored Normal file

@ -0,0 +1,40 @@
name: Calculate test coverage
on:
push:
branches:
- main
pull_request:
branches:
- main
permissions:
contents: read
jobs:
test:
runs-on: ubuntu-latest
steps:
- name: Check out datasette
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: '3.12'
cache: 'pip'
cache-dependency-path: '**/pyproject.toml'
- name: Install Python dependencies
run: |
python -m pip install --upgrade pip
python -m pip install -e .[test]
python -m pip install pytest-cov
- name: Run tests
run: |-
ls -lah
cat .coveragerc
pytest -m "not serial" --cov=datasette --cov-config=.coveragerc --cov-report xml:coverage.xml --cov-report term -x
ls -lah
- name: Upload coverage report
uses: codecov/codecov-action@v1
with:
token: ${{ secrets.CODECOV_TOKEN }}
file: coverage.xml

.github/workflows/test-pyodide.yml vendored Normal file

@ -0,0 +1,33 @@
name: Test in Pyodide with shot-scraper
on:
push:
pull_request:
workflow_dispatch:
permissions:
contents: read
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.10
uses: actions/setup-python@v6
with:
python-version: "3.10"
cache: 'pip'
cache-dependency-path: '**/pyproject.toml'
- name: Cache Playwright browsers
uses: actions/cache@v4
with:
path: ~/.cache/ms-playwright/
key: ${{ runner.os }}-browsers
- name: Install Playwright dependencies
run: |
pip install shot-scraper build
shot-scraper install
- name: Run test
run: |
./test-in-pyodide-with-shot-scraper.sh


@ -0,0 +1,53 @@
name: Test SQLite versions
on: [push, pull_request]
permissions:
contents: read
jobs:
test:
runs-on: ${{ matrix.platform }}
continue-on-error: true
strategy:
matrix:
platform: [ubuntu-latest]
python-version: ["3.10", "3.11", "3.12", "3.13", "3.14"]
sqlite-version: [
#"3", # latest version
"3.46",
#"3.45",
#"3.27",
#"3.26",
"3.25",
#"3.25.3", # 2018-09-25, window functions breaks test_upsert for some reason on 3.10, skip for now
#"3.24", # 2018-06-04, added UPSERT support
#"3.23.1" # 2018-04-10, before UPSERT
]
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v6
with:
python-version: ${{ matrix.python-version }}
allow-prereleases: true
cache: pip
cache-dependency-path: pyproject.toml
- name: Set up SQLite ${{ matrix.sqlite-version }}
uses: asg017/sqlite-versions@71ea0de37ae739c33e447af91ba71dda8fcf22e6
with:
version: ${{ matrix.sqlite-version }}
cflags: "-DSQLITE_ENABLE_DESERIALIZE -DSQLITE_ENABLE_FTS5 -DSQLITE_ENABLE_FTS4 -DSQLITE_ENABLE_FTS3_PARENTHESIS -DSQLITE_ENABLE_RTREE -DSQLITE_ENABLE_JSON1"
- run: python3 -c "import sqlite3; print(sqlite3.sqlite_version)"
- run: echo $LD_LIBRARY_PATH
- name: Build extension for --load-extension test
run: |-
(cd tests && gcc ext.c -fPIC -shared -o ext.so)
- name: Install dependencies
run: |
pip install -e '.[test]'
pip freeze
- name: Run tests
run: |
pytest -n auto -m "not serial"
pytest -m "serial"

.github/workflows/test.yml vendored Normal file

@ -0,0 +1,51 @@
name: Test
on: [push, pull_request]
permissions:
contents: read
jobs:
test:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.10", "3.11", "3.12", "3.13", "3.14"]
steps:
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v6
with:
python-version: ${{ matrix.python-version }}
allow-prereleases: true
cache: pip
cache-dependency-path: pyproject.toml
- name: Build extension for --load-extension test
run: |-
(cd tests && gcc ext.c -fPIC -shared -o ext.so)
- name: Install dependencies
run: |
pip install -e '.[test]'
pip freeze
- name: Run tests
run: |
pytest -n auto -m "not serial"
pytest -m "serial"
# And the test that exercises a localhost HTTPS server
tests/test_datasette_https_server.sh
- name: Install docs dependencies
run: |
pip install -e '.[docs]'
- name: Black
run: black --check .
- name: Check if cog needs to be run
run: |
cog --check docs/*.rst
- name: Check if blacken-docs needs to be run
run: |
# This fails on syntax errors, or if a diff was applied
blacken-docs -l 60 docs/*.rst
- name: Test DATASETTE_LOAD_PLUGINS
run: |
pip install datasette-init datasette-json-html
tests/test-datasette-load-plugins.sh

.github/workflows/tmate-mac.yml vendored Normal file

@ -0,0 +1,15 @@
name: tmate session mac
on:
workflow_dispatch:
permissions:
contents: read
jobs:
build:
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- name: Setup tmate session
uses: mxschmitt/action-tmate@v3

.github/workflows/tmate.yml vendored Normal file

@ -0,0 +1,18 @@
name: tmate session
on:
workflow_dispatch:
permissions:
contents: read
models: read
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup tmate session
uses: mxschmitt/action-tmate@v3
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

.gitignore vendored

@ -5,6 +5,9 @@ scratchpad
.vscode
uv.lock
data.db
# We don't use Pipfile, so ignore them
Pipfile
Pipfile.lock
@ -116,3 +119,11 @@ ENV/
# macOS files
.DS_Store
node_modules
.*.swp
# In case someone compiled tests/ext.c for test_load_extensions, don't
# include it in source control.
tests/*.dylib
tests/*.so
tests/*.dll

.prettierrc Normal file

@ -0,0 +1,4 @@
{
"tabWidth": 2,
"useTabs": false
}

.readthedocs.yaml Normal file

@ -0,0 +1,16 @@
version: 2
build:
os: ubuntu-20.04
tools:
python: "3.11"
sphinx:
configuration: docs/conf.py
python:
install:
- method: pip
path: .
extra_requirements:
- docs


@ -1,52 +0,0 @@
language: python
dist: xenial
# 3.6 is listed first so it gets used for the later build stages
python:
- "3.6"
- "3.7"
- "3.5"
# Executed for each Python version as the first "test" stage:
script:
- pip install -U pip wheel
- pip install .[test]
- pytest
cache:
directories:
- $HOME/.cache/pip
# This defines further stages that execute after the tests
jobs:
include:
- stage: deploy latest.datasette.io
if: branch = master AND type = push
script:
- pip install .[test]
- npm install -g now
- python tests/fixtures.py fixtures.db fixtures.json
- export ALIAS=`echo $TRAVIS_COMMIT | cut -c 1-7`
- datasette publish nowv1 fixtures.db -m fixtures.json --token=$NOW_TOKEN --branch=$TRAVIS_COMMIT --version-note=$TRAVIS_COMMIT --name=datasette-latest-$ALIAS --alias=latest.datasette.io --alias=$ALIAS.datasette.io
- stage: release tagged version
if: tag IS present
python: 3.6
script:
- npm install -g now
- export ALIAS=`echo $TRAVIS_COMMIT | cut -c 1-7`
- export TAG=`echo $TRAVIS_TAG | sed 's/\./-/g' | sed 's/.*/v&/'`
- now alias $ALIAS.datasette.io $TAG.datasette.io --token=$NOW_TOKEN
# Build and release to Docker Hub
- docker login -u $DOCKER_USER -p $DOCKER_PASS
- export REPO=datasetteproject/datasette
- docker build -f Dockerfile -t $REPO:$TRAVIS_TAG .
- docker tag $REPO:$TRAVIS_TAG $REPO:latest
- docker push $REPO
deploy:
- provider: pypi
user: simonw
distributions: bdist_wheel
password: ${PYPI_PASSWORD}
on:
branch: master
tags: true

CODE_OF_CONDUCT.md Normal file

@ -0,0 +1,128 @@
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
overall community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or
advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
`swillison+datasette-code-of-conduct@gmail.com`.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series
of actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within
the community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.


@ -1,42 +1,18 @@
-FROM python:3.7.2-slim-stretch as build
+FROM python:3.11.0-slim-bullseye as build
-# Setup build dependencies
-RUN apt update \
-    && apt install -y python3-dev build-essential wget libxml2-dev libproj-dev libgeos-dev libsqlite3-dev zlib1g-dev pkg-config git \
-    && apt clean
-RUN apt-get update && \
-    apt-get install -y --no-install-recommends libsqlite3-mod-spatialite && \
-    apt clean && \
-    rm -rf /var/lib/apt && \
-    rm -rf /var/lib/dpkg/info/*
-RUN wget "https://www.sqlite.org/2018/sqlite-autoconf-3260000.tar.gz" && tar xzf sqlite-autoconf-3260000.tar.gz \
-    && cd sqlite-autoconf-3260000 && ./configure --disable-static --enable-fts5 --enable-json1 CFLAGS="-g -O2 -DSQLITE_ENABLE_FTS3=1 -DSQLITE_ENABLE_FTS4=1 -DSQLITE_ENABLE_RTREE=1 -DSQLITE_ENABLE_JSON1" \
-    && make && make install
-RUN wget "https://www.gaia-gis.it/gaia-sins/freexl-1.0.5.tar.gz" && tar zxf freexl-1.0.5.tar.gz \
-    && cd freexl-1.0.5 && ./configure && make && make install
-RUN wget "https://www.gaia-gis.it/gaia-sins/libspatialite-4.4.0-RC0.tar.gz" && tar zxf libspatialite-4.4.0-RC0.tar.gz \
-    && cd libspatialite-4.4.0-RC0 && ./configure && make && make install
-RUN wget "https://www.gaia-gis.it/gaia-sins/readosm-1.1.0.tar.gz" && tar zxf readosm-1.1.0.tar.gz && cd readosm-1.1.0 && ./configure && make && make install
-RUN wget "https://www.gaia-gis.it/gaia-sins/spatialite-tools-4.4.0-RC0.tar.gz" && tar zxf spatialite-tools-4.4.0-RC0.tar.gz \
-    && cd spatialite-tools-4.4.0-RC0 && ./configure && make && make install
-# Add local code to the image instead of fetching from pypi.
-COPY . /datasette
-RUN pip install /datasette
-FROM python:3.7.2-slim-stretch
-# Copy python dependencies and spatialite libraries
-COPY --from=build /usr/local/lib/ /usr/local/lib/
-# Copy executables
-COPY --from=build /usr/local/bin /usr/local/bin
-# Copy spatial extensions
-COPY --from=build /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu
-ENV LD_LIBRARY_PATH=/usr/local/lib
+# Version of Datasette to install, e.g. 0.55
+# docker build . -t datasette --build-arg VERSION=0.55
+ARG VERSION
+RUN pip install https://github.com/simonw/datasette/archive/refs/tags/${VERSION}.zip && \
+    find /usr/local/lib -name '__pycache__' | xargs rm -r && \
+    rm -rf /root/.cache/pip
 EXPOSE 8001
 CMD ["datasette"]

Justfile Normal file

@ -0,0 +1,56 @@
export DATASETTE_SECRET := "not_a_secret"
# Run tests and linters
@default: test lint
# Setup project
@init:
uv sync --extra test --extra docs
# Run pytest with supplied options
@test *options: init
uv run pytest -n auto {{options}}
@codespell:
uv run codespell README.md --ignore-words docs/codespell-ignore-words.txt
uv run codespell docs/*.rst --ignore-words docs/codespell-ignore-words.txt
uv run codespell datasette -S datasette/static --ignore-words docs/codespell-ignore-words.txt
uv run codespell tests --ignore-words docs/codespell-ignore-words.txt
# Run linters: black, flake8, mypy, cog
@lint: codespell
uv run black . --check
uv run flake8
uv run --extra test cog --check README.md docs/*.rst
# Rebuild docs with cog
@cog:
uv run --extra test cog -r README.md docs/*.rst
# Serve live docs on localhost:8000
@docs: cog blacken-docs
uv run --extra docs make -C docs livehtml
# Build docs as static HTML
@docs-build: cog blacken-docs
rm -rf docs/_build && cd docs && uv run make html
# Apply Black
@black:
uv run black .
# Apply blacken-docs
@blacken-docs:
uv run blacken-docs -l 60 docs/*.rst
# Apply prettier
@prettier:
npm run fix
# Format code with both black and prettier
@format: black prettier blacken-docs
@serve *options:
uv run sqlite-utils create-database data.db
uv run sqlite-utils create-table data.db docs id integer title text --pk id --ignore
uv run python -m datasette data.db --root --reload {{options}}
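The `serve` recipe bootstraps `data.db` with sqlite-utils before starting Datasette. A stdlib-only approximation of that bootstrap step (the recipe itself shells out to `sqlite-utils`; this sketch writes to a temp directory to stay self-contained):

```python
import pathlib
import sqlite3
import tempfile

# Create data.db with a docs table, mirroring the two sqlite-utils calls
# in the serve recipe above.
db_path = pathlib.Path(tempfile.mkdtemp()) / "data.db"
conn = sqlite3.connect(db_path)
conn.execute("create table if not exists docs (id integer primary key, title text)")
conn.commit()

tables = [row[0] for row in conn.execute("select name from sqlite_master where type = 'table'")]
print(tables)  # ['docs']
```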


@ -1,3 +1,5 @@
recursive-include datasette/static *
recursive-include datasette/templates *
include versioneer.py
include datasette/_version.py
include LICENSE

README.md

@ -1,69 +1,42 @@
-# Datasette
+# Datasette <img src="https://datasette.io/static/datasette-logo.svg" alt="Datasette">
 [![PyPI](https://img.shields.io/pypi/v/datasette.svg)](https://pypi.org/project/datasette/)
-[![Travis CI](https://travis-ci.org/simonw/datasette.svg?branch=master)](https://travis-ci.org/simonw/datasette)
-[![Documentation Status](https://readthedocs.org/projects/datasette/badge/?version=latest)](http://datasette.readthedocs.io/en/latest/?badge=latest)
-[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette/blob/master/LICENSE)
-[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://black.readthedocs.io/en/stable/)
+[![Changelog](https://img.shields.io/github/v/release/simonw/datasette?label=changelog)](https://docs.datasette.io/en/latest/changelog.html)
+[![Python 3.x](https://img.shields.io/pypi/pyversions/datasette.svg?logo=python&logoColor=white)](https://pypi.org/project/datasette/)
+[![Tests](https://github.com/simonw/datasette/workflows/Test/badge.svg)](https://github.com/simonw/datasette/actions?query=workflow%3ATest)
+[![Documentation Status](https://readthedocs.org/projects/datasette/badge/?version=latest)](https://docs.datasette.io/en/latest/?badge=latest)
+[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/datasette/blob/main/LICENSE)
+[![docker: datasette](https://img.shields.io/badge/docker-datasette-blue)](https://hub.docker.com/r/datasetteproject/datasette)
+[![discord](https://img.shields.io/discord/823971286308356157?label=discord)](https://datasette.io/discord)
-*A tool for exploring and publishing data*
+*An open source multi-tool for exploring and publishing data*
 Datasette is a tool for exploring and publishing data. It helps people take data of any shape or size and publish that as an interactive, explorable website and accompanying API.
-Datasette is aimed at data journalists, museum curators, archivists, local governments and anyone else who has data that they wish to share with the world.
+Datasette is aimed at data journalists, museum curators, archivists, local governments, scientists, researchers and anyone else who has data that they wish to share with the world.
-[Explore a demo](https://fivethirtyeight.datasettes.com/fivethirtyeight), watch [a video about the project](https://www.youtube.com/watch?v=pTr1uLQTJNE) or try it out by [uploading and publishing your own CSV data](https://simonwillison.net/2019/Apr/23/datasette-glitch/).
+[Explore a demo](https://datasette.io/global-power-plants/global-power-plants), watch [a video about the project](https://simonwillison.net/2021/Feb/7/video/) or try it out [on GitHub Codespaces](https://github.com/datasette/datasette-studio).
-* Comprehensive documentation: http://datasette.readthedocs.io/
-* Examples: https://github.com/simonw/datasette/wiki/Datasettes
-* Live demo of current master: https://latest.datasette.io/
+* [datasette.io](https://datasette.io/) is the official project website
+* Latest [Datasette News](https://datasette.io/news)
+* Comprehensive documentation: https://docs.datasette.io/
+* Examples: https://datasette.io/examples
+* Live demo of current `main` branch: https://latest.datasette.io/
+* Questions, feedback or want to talk about the project? Join our [Discord](https://datasette.io/discord)
-## News
+Want to stay up-to-date with the project? Subscribe to the [Datasette newsletter](https://datasette.substack.com/) for tips, tricks and news on what's new in the Datasette ecosystem.
* 23rd June 2019: [Porting Datasette to ASGI, and Turtles all the way down](https://simonwillison.net/2019/Jun/23/datasette-asgi/)
* 21st May 2019: The anonymized raw data from [the Stack Overflow Developer Survey 2019](https://stackoverflow.blog/2019/05/21/public-data-release-of-stack-overflows-2019-developer-survey/) has been [published in partnership with Glitch](https://glitch.com/culture/discover-insights-explore-developer-survey-results-2019/), powered by Datasette.
* 19th May 2019: [Datasette 0.28](https://datasette.readthedocs.io/en/stable/changelog.html#v0-28) - a salmagundi of new features!
* No longer immutable! Datasette now supports [databases that change](https://datasette.readthedocs.io/en/stable/changelog.html#supporting-databases-that-change).
* [Faceting improvements](https://datasette.readthedocs.io/en/stable/changelog.html#faceting-improvements-and-faceting-plugins) including facet-by-JSON-array and the ability to define custom faceting using plugins.
* [datasette publish cloudrun](https://datasette.readthedocs.io/en/stable/changelog.html#datasette-publish-cloudrun) lets you publish databases to Google's new Cloud Run hosting service.
* New [register_output_renderer](https://datasette.readthedocs.io/en/stable/changelog.html#register-output-renderer-plugins) plugin hook for adding custom output extensions to Datasette in addition to the default `.json` and `.csv`.
* Dozens of other smaller features and tweaks - see [the release notes](https://datasette.readthedocs.io/en/stable/changelog.html#v0-28) for full details.
* Read more about this release here: [Datasette 0.28—and why master should always be releasable](https://simonwillison.net/2019/May/19/datasette-0-28/)
* 24th February 2019: [sqlite-utils: a Python library and CLI tool for building SQLite databases](https://simonwillison.net/2019/Feb/25/sqlite-utils/) - a partner tool for easily creating SQLite databases for use with Datasette.
* 31st January 2019: [Datasette 0.27](https://datasette.readthedocs.io/en/latest/changelog.html#v0-27) - `datasette plugins` command, newline-delimited JSON export option, new documentation on [The Datasette Ecosystem](https://datasette.readthedocs.io/en/latest/ecosystem.html).
* 10th January 2019: [Datasette 0.26.1](http://datasette.readthedocs.io/en/latest/changelog.html#v0-26-1) - SQLite upgrade in Docker image, `/-/versions` now shows SQLite compile options.
* 2nd January 2019: [Datasette 0.26](http://datasette.readthedocs.io/en/latest/changelog.html#v0-26) - minor bug fixes, `datasette publish now --alias` argument.
* 18th December 2018: [Fast Autocomplete Search for Your Website](https://24ways.org/2018/fast-autocomplete-search-for-your-website/) - a new tutorial on using Datasette to build a JavaScript autocomplete search engine.
* 3rd October 2018: [The interesting ideas in Datasette](https://simonwillison.net/2018/Oct/4/datasette-ideas/) - a write-up of some of the less obvious interesting ideas embedded in the Datasette project.
* 19th September 2018: [Datasette 0.25](http://datasette.readthedocs.io/en/latest/changelog.html#v0-25) - New plugin hooks, improved database view support and an easier way to use more recent versions of SQLite.
* 23rd July 2018: [Datasette 0.24](http://datasette.readthedocs.io/en/latest/changelog.html#v0-24) - a number of small new features
* 29th June 2018: [datasette-vega](https://github.com/simonw/datasette-vega), a new plugin for visualizing data as bar, line or scatter charts
* 21st June 2018: [Datasette 0.23.1](http://datasette.readthedocs.io/en/latest/changelog.html#v0-23-1) - minor bug fixes
* 18th June 2018: [Datasette 0.23: CSV, SpatiaLite and more](http://datasette.readthedocs.io/en/latest/changelog.html#v0-23) - CSV export, foreign key expansion in JSON and CSV, new config options, improved support for SpatiaLite and a bunch of other improvements
* 23rd May 2018: [Datasette 0.22.1 bugfix](https://github.com/simonw/datasette/releases/tag/0.22.1) plus we now use [versioneer](https://github.com/warner/python-versioneer)
* 20th May 2018: [Datasette 0.22: Datasette Facets](https://simonwillison.net/2018/May/20/datasette-facets)
* 5th May 2018: [Datasette 0.21: New _shape=, new _size=, search within columns](https://github.com/simonw/datasette/releases/tag/0.21)
* 25th April 2018: [Exploring the UK Register of Members Interests with SQL and Datasette](https://simonwillison.net/2018/Apr/25/register-members-interests/) - a tutorial describing how [register-of-members-interests.datasettes.com](https://register-of-members-interests.datasettes.com/) was built ([source code here](https://github.com/simonw/register-of-members-interests))
* 20th April 2018: [Datasette plugins, and building a clustered map visualization](https://simonwillison.net/2018/Apr/20/datasette-plugins/) - introducing Datasette's new plugin system and [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/), a plugin for visualizing data on a map
* 20th April 2018: [Datasette 0.20: static assets and templates for plugins](https://github.com/simonw/datasette/releases/tag/0.20)
* 16th April 2018: [Datasette 0.19: plugins preview](https://github.com/simonw/datasette/releases/tag/0.19)
* 14th April 2018: [Datasette 0.18: units](https://github.com/simonw/datasette/releases/tag/0.18)
* 9th April 2018: [Datasette 0.15: sort by column](https://github.com/simonw/datasette/releases/tag/0.15)
* 28th March 2018: [Baltimore Sun Public Salary Records](https://simonwillison.net/2018/Mar/28/datasette-in-the-wild/) - a data journalism project from the Baltimore Sun powered by Datasette - source code [is available here](https://github.com/baltimore-sun-data/salaries-datasette)
* 27th March 2018: [Cloud-first: Rapid webapp deployment using containers](https://wwwf.imperial.ac.uk/blog/research-software-engineering/2018/03/27/cloud-first-rapid-webapp-deployment-using-containers/) - a tutorial covering deploying Datasette using Microsoft Azure by the Research Software Engineering team at Imperial College London
* 28th January 2018: [Analyzing my Twitter followers with Datasette](https://simonwillison.net/2018/Jan/28/analyzing-my-twitter-followers/) - a tutorial on using Datasette to analyze follower data pulled from the Twitter API
* 17th January 2018: [Datasette Publish: a web app for publishing CSV files as an online database](https://simonwillison.net/2018/Jan/17/datasette-publish/)
* 12th December 2017: [Building a location to time zone API with SpatiaLite, OpenStreetMap and Datasette](https://simonwillison.net/2017/Dec/12/building-a-location-time-zone-api/)
* 9th December 2017: [Datasette 0.14: customization edition](https://github.com/simonw/datasette/releases/tag/0.14)
* 25th November 2017: [New in Datasette: filters, foreign keys and search](https://simonwillison.net/2017/Nov/25/new-in-datasette/)
* 13th November 2017: [Datasette: instantly create and publish an API for your SQLite databases](https://simonwillison.net/2017/Nov/13/datasette/)
## Installation

If you are on a Mac, [Homebrew](https://brew.sh/) is the easiest way to install Datasette:

    brew install datasette

You can also install it using `pip` or `pipx`:

    pip install datasette

Datasette requires Python 3.8 or higher. We also have [detailed installation instructions](https://docs.datasette.io/en/stable/installation.html) covering other options such as Docker.
## Basic usage

@@ -75,41 +48,12 @@ This will start a web server on port 8001 - visit http://localhost:8001/ to acce

Use Chrome on OS X? You can run datasette against your browser history like so:

    datasette ~/Library/Application\ Support/Google/Chrome/Default/History --nolock

Now visiting http://localhost:8001/History/downloads will show you a web interface to browse your downloads data:

![Downloads table rendered by datasette](https://static.simonwillison.net/static/2017/datasette-downloads.png)
## datasette serve options
$ datasette serve --help
Usage: datasette serve [OPTIONS] [FILES]...
Serve up specified SQLite database files with a web UI
Options:
-i, --immutable PATH Database files to open in immutable mode
-h, --host TEXT host for server, defaults to 127.0.0.1
-p, --port INTEGER port for server, defaults to 8001
--debug Enable debug mode - useful for development
--reload Automatically reload if database or code change detected -
useful for development
--cors Enable CORS by serving Access-Control-Allow-Origin: *
--load-extension PATH Path to a SQLite extension to load
--inspect-file TEXT Path to JSON file created using "datasette inspect"
-m, --metadata FILENAME Path to JSON file containing license/source metadata
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static STATIC MOUNT mountpoint:path-to-directory for serving static files
--memory Make :memory: database available
--config CONFIG Set config option using configname:value
datasette.readthedocs.io/en/latest/config.html
--version-note TEXT Additional note to show on /-/versions
--help-config Show available config options
--help Show this message and exit.
## metadata.json

If you want to include licensing and source information in the generated datasette website you can do so using a JSON file that looks something like this:

@@ -130,7 +74,7 @@ The license and source information will be displayed on the index page and in th
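A metadata file of the kind referenced above might look like this (the keys follow the Datasette metadata documentation; the values are illustrative):

```json
{
    "title": "My museums database",
    "license": "ODbL",
    "license_url": "https://opendatacommons.org/licenses/odbl/",
    "source": "Original data source",
    "source_url": "https://example.com/"
}
```

Pass it to the server with `datasette serve mydatabase.db --metadata metadata.json`.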
## datasette publish

If you have [Heroku](https://heroku.com/) or [Google Cloud Run](https://cloud.google.com/run/) configured, Datasette can deploy one or more SQLite databases to the internet with a single command:

    datasette publish heroku database.db

@@ -140,4 +84,8 @@ Or:

This will create a docker image containing both the datasette application and the specified SQLite database files. It will then deploy that image to Heroku or Cloud Run and give you a URL to access the resulting website and API.

See [Publishing data](https://docs.datasette.io/en/stable/publish.html) in the documentation for more details.
## Datasette Lite
[Datasette Lite](https://lite.datasette.io/) is Datasette packaged using WebAssembly so that it runs entirely in your browser, no Python web application server required. Read more about that in the [Datasette Lite documentation](https://github.com/simonw/datasette-lite/blob/main/README.md).
@@ -1 +0,0 @@
theme: jekyll-theme-architect
codecov.yml Normal file

@@ -0,0 +1,8 @@
coverage:
status:
project:
default:
informational: true
patch:
default:
informational: true
@@ -1,3 +1,8 @@
from datasette.permissions import Permission # noqa
from datasette.version import __version_info__, __version__ # noqa
from datasette.events import Event # noqa
from datasette.utils.asgi import Forbidden, NotFound, Request, Response # noqa
from datasette.utils import actor_matches_allow # noqa
from datasette.views import Context # noqa
from .hookspecs import hookimpl # noqa
from .hookspecs import hookspec # noqa
datasette/__main__.py Normal file

@@ -0,0 +1,4 @@
from datasette.cli import cli
if __name__ == "__main__":
cli()
@@ -1,556 +0,0 @@
# This file helps to compute a version number in source trees obtained from
# git-archive tarball (such as those provided by GitHub's download-from-tag
# feature). Distribution tarballs (built by setup.py sdist) and build
# directories (produced by setup.py build) will contain a much shorter file
# that just contains the computed version number.
# This file is released into the public domain. Generated by
# versioneer-0.18 (https://github.com/warner/python-versioneer)
"""Git implementation of _version.py."""
import errno
import os
import re
import subprocess
import sys
def get_keywords():
"""Get the keywords needed to look up the version information."""
# these strings will be replaced by git during git-archive.
# setup.py/versioneer.py will grep for the variable names, so they must
# each be defined on a line of their own. _version.py will just call
# get_keywords().
git_refnames = "$Format:%d$"
git_full = "$Format:%H$"
git_date = "$Format:%ci$"
keywords = {"refnames": git_refnames, "full": git_full, "date": git_date}
return keywords
class VersioneerConfig:
"""Container for Versioneer configuration parameters."""
def get_config():
"""Create, populate and return the VersioneerConfig() object."""
# these strings are filled in when 'setup.py versioneer' creates
# _version.py
cfg = VersioneerConfig()
cfg.VCS = "git"
cfg.style = "pep440"
cfg.tag_prefix = ""
cfg.parentdir_prefix = "datasette-"
cfg.versionfile_source = "datasette/_version.py"
cfg.verbose = False
return cfg
class NotThisMethod(Exception):
"""Exception raised if a method is not valid for the current scenario."""
LONG_VERSION_PY = {}
HANDLERS = {}
def register_vcs_handler(vcs, method): # decorator
"""Decorator to mark a method as the handler for a particular VCS."""
def decorate(f):
"""Store f in HANDLERS[vcs][method]."""
if vcs not in HANDLERS:
HANDLERS[vcs] = {}
HANDLERS[vcs][method] = f
return f
return decorate
def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, env=None):
"""Call the given command(s)."""
assert isinstance(commands, list)
p = None
for c in commands:
try:
dispcmd = str([c] + args)
# remember shell=False, so use git.cmd on windows, not just git
p = subprocess.Popen(
[c] + args,
cwd=cwd,
env=env,
stdout=subprocess.PIPE,
stderr=(subprocess.PIPE if hide_stderr else None),
)
break
except EnvironmentError:
e = sys.exc_info()[1]
if e.errno == errno.ENOENT:
continue
if verbose:
print("unable to run %s" % dispcmd)
print(e)
return None, None
else:
if verbose:
print("unable to find command, tried %s" % (commands,))
return None, None
stdout = p.communicate()[0].strip()
if sys.version_info[0] >= 3:
stdout = stdout.decode()
if p.returncode != 0:
if verbose:
print("unable to run %s (error)" % dispcmd)
print("stdout was %s" % stdout)
return None, p.returncode
return stdout, p.returncode
def versions_from_parentdir(parentdir_prefix, root, verbose):
"""Try to determine the version from the parent directory name.
Source tarballs conventionally unpack into a directory that includes both
the project name and a version string. We will also support searching up
two directory levels for an appropriately named parent directory
"""
rootdirs = []
for i in range(3):
dirname = os.path.basename(root)
if dirname.startswith(parentdir_prefix):
return {
"version": dirname[len(parentdir_prefix) :],
"full-revisionid": None,
"dirty": False,
"error": None,
"date": None,
}
else:
rootdirs.append(root)
root = os.path.dirname(root) # up a level
if verbose:
print(
"Tried directories %s but none started with prefix %s"
% (str(rootdirs), parentdir_prefix)
)
raise NotThisMethod("rootdir doesn't start with parentdir_prefix")
@register_vcs_handler("git", "get_keywords")
def git_get_keywords(versionfile_abs):
"""Extract version information from the given file."""
# the code embedded in _version.py can just fetch the value of these
# keywords. When used from setup.py, we don't want to import _version.py,
# so we do it with a regexp instead. This function is not used from
# _version.py.
keywords = {}
try:
f = open(versionfile_abs, "r")
for line in f.readlines():
if line.strip().startswith("git_refnames ="):
mo = re.search(r'=\s*"(.*)"', line)
if mo:
keywords["refnames"] = mo.group(1)
if line.strip().startswith("git_full ="):
mo = re.search(r'=\s*"(.*)"', line)
if mo:
keywords["full"] = mo.group(1)
if line.strip().startswith("git_date ="):
mo = re.search(r'=\s*"(.*)"', line)
if mo:
keywords["date"] = mo.group(1)
f.close()
except EnvironmentError:
pass
return keywords
@register_vcs_handler("git", "keywords")
def git_versions_from_keywords(keywords, tag_prefix, verbose):
"""Get version information from git keywords."""
if not keywords:
raise NotThisMethod("no keywords at all, weird")
date = keywords.get("date")
if date is not None:
# git-2.2.0 added "%cI", which expands to an ISO-8601 -compliant
# datestamp. However we prefer "%ci" (which expands to an "ISO-8601
# -like" string, which we must then edit to make compliant), because
# it's been around since git-1.5.3, and it's too difficult to
# discover which version we're using, or to work around using an
# older one.
date = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
refnames = keywords["refnames"].strip()
if refnames.startswith("$Format"):
if verbose:
print("keywords are unexpanded, not using")
raise NotThisMethod("unexpanded keywords, not a git-archive tarball")
refs = set([r.strip() for r in refnames.strip("()").split(",")])
# starting in git-1.8.3, tags are listed as "tag: foo-1.0" instead of
# just "foo-1.0". If we see a "tag: " prefix, prefer those.
TAG = "tag: "
tags = set([r[len(TAG) :] for r in refs if r.startswith(TAG)])
if not tags:
# Either we're using git < 1.8.3, or there really are no tags. We use
# a heuristic: assume all version tags have a digit. The old git %d
# expansion behaves like git log --decorate=short and strips out the
# refs/heads/ and refs/tags/ prefixes that would let us distinguish
# between branches and tags. By ignoring refnames without digits, we
# filter out many common branch names like "release" and
# "stabilization", as well as "HEAD" and "master".
tags = set([r for r in refs if re.search(r"\d", r)])
if verbose:
print("discarding '%s', no digits" % ",".join(refs - tags))
if verbose:
print("likely tags: %s" % ",".join(sorted(tags)))
for ref in sorted(tags):
# sorting will prefer e.g. "2.0" over "2.0rc1"
if ref.startswith(tag_prefix):
r = ref[len(tag_prefix) :]
if verbose:
print("picking %s" % r)
return {
"version": r,
"full-revisionid": keywords["full"].strip(),
"dirty": False,
"error": None,
"date": date,
}
# no suitable tags, so version is "0+unknown", but full hex is still there
if verbose:
print("no suitable tags, using unknown + full revision id")
return {
"version": "0+unknown",
"full-revisionid": keywords["full"].strip(),
"dirty": False,
"error": "no suitable tags",
"date": None,
}
@register_vcs_handler("git", "pieces_from_vcs")
def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
"""Get version from 'git describe' in the root of the source tree.
This only gets called if the git-archive 'subst' keywords were *not*
expanded, and _version.py hasn't already been rewritten with a short
version string, meaning we're inside a checked out source tree.
"""
GITS = ["git"]
if sys.platform == "win32":
GITS = ["git.cmd", "git.exe"]
out, rc = run_command(GITS, ["rev-parse", "--git-dir"], cwd=root, hide_stderr=True)
if rc != 0:
if verbose:
print("Directory %s not under git control" % root)
raise NotThisMethod("'git rev-parse --git-dir' returned error")
# if there is a tag matching tag_prefix, this yields TAG-NUM-gHEX[-dirty]
# if there isn't one, this yields HEX[-dirty] (no NUM)
describe_out, rc = run_command(
GITS,
[
"describe",
"--tags",
"--dirty",
"--always",
"--long",
"--match",
"%s*" % tag_prefix,
],
cwd=root,
)
# --long was added in git-1.5.5
if describe_out is None:
raise NotThisMethod("'git describe' failed")
describe_out = describe_out.strip()
full_out, rc = run_command(GITS, ["rev-parse", "HEAD"], cwd=root)
if full_out is None:
raise NotThisMethod("'git rev-parse' failed")
full_out = full_out.strip()
pieces = {}
pieces["long"] = full_out
pieces["short"] = full_out[:7] # maybe improved later
pieces["error"] = None
# parse describe_out. It will be like TAG-NUM-gHEX[-dirty] or HEX[-dirty]
# TAG might have hyphens.
git_describe = describe_out
# look for -dirty suffix
dirty = git_describe.endswith("-dirty")
pieces["dirty"] = dirty
if dirty:
git_describe = git_describe[: git_describe.rindex("-dirty")]
# now we have TAG-NUM-gHEX or HEX
if "-" in git_describe:
# TAG-NUM-gHEX
mo = re.search(r"^(.+)-(\d+)-g([0-9a-f]+)$", git_describe)
if not mo:
# unparseable. Maybe git-describe is misbehaving?
pieces["error"] = "unable to parse git-describe output: '%s'" % describe_out
return pieces
# tag
full_tag = mo.group(1)
if not full_tag.startswith(tag_prefix):
if verbose:
fmt = "tag '%s' doesn't start with prefix '%s'"
print(fmt % (full_tag, tag_prefix))
pieces["error"] = "tag '%s' doesn't start with prefix '%s'" % (
full_tag,
tag_prefix,
)
return pieces
pieces["closest-tag"] = full_tag[len(tag_prefix) :]
# distance: number of commits since tag
pieces["distance"] = int(mo.group(2))
# commit: short hex revision ID
pieces["short"] = mo.group(3)
else:
# HEX: no tags
pieces["closest-tag"] = None
count_out, rc = run_command(GITS, ["rev-list", "HEAD", "--count"], cwd=root)
pieces["distance"] = int(count_out) # total number of commits
# commit date: see ISO-8601 comment in git_versions_from_keywords()
date = run_command(GITS, ["show", "-s", "--format=%ci", "HEAD"], cwd=root)[
0
].strip()
pieces["date"] = date.strip().replace(" ", "T", 1).replace(" ", "", 1)
return pieces
def plus_or_dot(pieces):
"""Return a + if we don't already have one, else return a ."""
if "+" in pieces.get("closest-tag", ""):
return "."
return "+"
def render_pep440(pieces):
"""Build up version string, with post-release "local version identifier".
Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you
get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty
Exceptions:
1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += plus_or_dot(pieces)
rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
if pieces["dirty"]:
rendered += ".dirty"
else:
# exception #1
rendered = "0+untagged.%d.g%s" % (pieces["distance"], pieces["short"])
if pieces["dirty"]:
rendered += ".dirty"
return rendered
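For a concrete sense of the PEP 440 rendering above, here is a self-contained sketch of `plus_or_dot()` and `render_pep440()` with illustrative `pieces` values (the tag and hash are made up):

```python
def plus_or_dot(pieces):
    """Return a + if we don't already have one, else return a ."""
    return "." if "+" in (pieces.get("closest-tag") or "") else "+"


def render_pep440(pieces):
    """TAG[+DISTANCE.gHEX[.dirty]] - the "local version identifier" style."""
    if pieces["closest-tag"]:
        rendered = pieces["closest-tag"]
        if pieces["distance"] or pieces["dirty"]:
            rendered += plus_or_dot(pieces)
            rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
            if pieces["dirty"]:
                rendered += ".dirty"
    else:
        # No tags: 0+untagged.DISTANCE.gHEX[.dirty]
        rendered = "0+untagged.%d.g%s" % (pieces["distance"], pieces["short"])
        if pieces["dirty"]:
            rendered += ".dirty"
    return rendered


# A dirty working tree two commits past the 0.25 tag:
print(render_pep440({"closest-tag": "0.25", "distance": 2, "short": "1abc234", "dirty": True}))
# 0.25+2.g1abc234.dirty
```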
def render_pep440_pre(pieces):
"""TAG[.post.devDISTANCE] -- No -dirty.
Exceptions:
1: no tags. 0.post.devDISTANCE
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"]:
rendered += ".post.dev%d" % pieces["distance"]
else:
# exception #1
rendered = "0.post.dev%d" % pieces["distance"]
return rendered
def render_pep440_post(pieces):
"""TAG[.postDISTANCE[.dev0]+gHEX] .
The ".dev0" means dirty. Note that .dev0 sorts backwards
(a dirty tree will appear "older" than the corresponding clean one),
but you shouldn't be releasing software with -dirty anyways.
Exceptions:
1: no tags. 0.postDISTANCE[.dev0]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += ".post%d" % pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
rendered += plus_or_dot(pieces)
rendered += "g%s" % pieces["short"]
else:
# exception #1
rendered = "0.post%d" % pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
rendered += "+g%s" % pieces["short"]
return rendered
def render_pep440_old(pieces):
"""TAG[.postDISTANCE[.dev0]] .
The ".dev0" means dirty.
Exceptions:
1: no tags. 0.postDISTANCE[.dev0]
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"] or pieces["dirty"]:
rendered += ".post%d" % pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
else:
# exception #1
rendered = "0.post%d" % pieces["distance"]
if pieces["dirty"]:
rendered += ".dev0"
return rendered
def render_git_describe(pieces):
"""TAG[-DISTANCE-gHEX][-dirty].
Like 'git describe --tags --dirty --always'.
Exceptions:
1: no tags. HEX[-dirty] (note: no 'g' prefix)
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
if pieces["distance"]:
rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
else:
# exception #1
rendered = pieces["short"]
if pieces["dirty"]:
rendered += "-dirty"
return rendered
def render_git_describe_long(pieces):
"""TAG-DISTANCE-gHEX[-dirty].
Like 'git describe --tags --dirty --always --long'.
The distance/hash is unconditional.
Exceptions:
1: no tags. HEX[-dirty] (note: no 'g' prefix)
"""
if pieces["closest-tag"]:
rendered = pieces["closest-tag"]
rendered += "-%d-g%s" % (pieces["distance"], pieces["short"])
else:
# exception #1
rendered = pieces["short"]
if pieces["dirty"]:
rendered += "-dirty"
return rendered
def render(pieces, style):
"""Render the given version pieces into the requested style."""
if pieces["error"]:
return {
"version": "unknown",
"full-revisionid": pieces.get("long"),
"dirty": None,
"error": pieces["error"],
"date": None,
}
if not style or style == "default":
style = "pep440" # the default
if style == "pep440":
rendered = render_pep440(pieces)
elif style == "pep440-pre":
rendered = render_pep440_pre(pieces)
elif style == "pep440-post":
rendered = render_pep440_post(pieces)
elif style == "pep440-old":
rendered = render_pep440_old(pieces)
elif style == "git-describe":
rendered = render_git_describe(pieces)
elif style == "git-describe-long":
rendered = render_git_describe_long(pieces)
else:
raise ValueError("unknown style '%s'" % style)
return {
"version": rendered,
"full-revisionid": pieces["long"],
"dirty": pieces["dirty"],
"error": None,
"date": pieces.get("date"),
}
def get_versions():
"""Get version information or return default if unable to do so."""
# I am in _version.py, which lives at ROOT/VERSIONFILE_SOURCE. If we have
# __file__, we can work backwards from there to the root. Some
# py2exe/bbfreeze/non-CPython implementations don't do __file__, in which
# case we can only use expanded keywords.
cfg = get_config()
verbose = cfg.verbose
try:
return git_versions_from_keywords(get_keywords(), cfg.tag_prefix, verbose)
except NotThisMethod:
pass
try:
root = os.path.realpath(__file__)
# versionfile_source is the relative path from the top of the source
# tree (where the .git directory might live) to this file. Invert
# this to find the root from __file__.
for i in cfg.versionfile_source.split("/"):
root = os.path.dirname(root)
except NameError:
return {
"version": "0+unknown",
"full-revisionid": None,
"dirty": None,
"error": "unable to find root of source tree",
"date": None,
}
try:
pieces = git_pieces_from_vcs(cfg.tag_prefix, root, verbose)
return render(pieces, cfg.style)
except NotThisMethod:
pass
try:
if cfg.parentdir_prefix:
return versions_from_parentdir(cfg.parentdir_prefix, root, verbose)
except NotThisMethod:
pass
return {
"version": "0+unknown",
"full-revisionid": None,
"dirty": None,
"error": "unable to compute version",
"date": None,
}
@@ -0,0 +1,23 @@
from datasette import hookimpl
from itsdangerous import BadSignature
from datasette.utils import baseconv
import time
@hookimpl
def actor_from_request(datasette, request):
if "ds_actor" not in request.cookies:
return None
try:
decoded = datasette.unsign(request.cookies["ds_actor"], "actor")
# If it has "e" and "a" keys process the "e" expiry
if not isinstance(decoded, dict) or "a" not in decoded:
return None
expires_at = decoded.get("e")
if expires_at:
timestamp = int(baseconv.base62.decode(expires_at))
if time.time() > timestamp:
return None
return decoded["a"]
except BadSignature:
return None
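The expiry check above can be exercised without a Datasette instance. This sketch re-implements the `"e"`-field logic; the base62 alphabet here is an assumption for illustration (the real code uses `datasette.utils.baseconv.base62`, and cookie signing/unsigning is omitted):

```python
import time

ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"


def base62_encode(num):
    # Encode a non-negative integer as base62 text
    if num == 0:
        return ALPHABET[0]
    digits = []
    while num:
        num, rem = divmod(num, 62)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))


def base62_decode(s):
    num = 0
    for char in s:
        num = num * 62 + ALPHABET.index(char)
    return num


def actor_from_cookie(decoded, now=None):
    # Mirrors the logic above: "a" holds the actor, optional "e" holds a
    # base62-encoded Unix timestamp after which the cookie is ignored
    if not isinstance(decoded, dict) or "a" not in decoded:
        return None
    expires_at = decoded.get("e")
    if expires_at and (now or time.time()) > base62_decode(expires_at):
        return None
    return decoded["a"]


# A cookie payload valid for one more hour:
cookie = {"a": {"id": "root"}, "e": base62_encode(int(time.time()) + 3600)}
print(actor_from_cookie(cookie))  # {'id': 'root'}
```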
@@ -0,0 +1,61 @@
from datasette import hookimpl
from datasette.utils.asgi import Response, BadRequest
from datasette.utils import to_css_class
import hashlib
_BLOB_COLUMN = "_blob_column"
_BLOB_HASH = "_blob_hash"
async def render_blob(datasette, database, rows, columns, request, table, view_name):
if _BLOB_COLUMN not in request.args:
raise BadRequest(f"?{_BLOB_COLUMN}= is required")
blob_column = request.args[_BLOB_COLUMN]
if blob_column not in columns:
raise BadRequest(f"{blob_column} is not a valid column")
# If ?_blob_hash= provided, use that to select the row - otherwise use first row
blob_hash = None
if _BLOB_HASH in request.args:
blob_hash = request.args[_BLOB_HASH]
for row in rows:
value = row[blob_column]
if hashlib.sha256(value).hexdigest() == blob_hash:
break
else:
# Loop did not break
raise BadRequest(
"Link has expired - the requested binary content has changed or could not be found."
)
else:
row = rows[0]
value = row[blob_column]
filename_bits = []
if table:
filename_bits.append(to_css_class(table))
if "pks" in request.url_vars:
filename_bits.append(request.url_vars["pks"])
filename_bits.append(to_css_class(blob_column))
if blob_hash:
filename_bits.append(blob_hash[:6])
filename = "-".join(filename_bits) + ".blob"
headers = {
"X-Content-Type-Options": "nosniff",
        "Content-Disposition": f'attachment; filename="{filename}"',
}
return Response(
body=value or b"",
status=200,
headers=headers,
content_type="application/binary",
)
@hookimpl
def register_output_renderer():
return {
"extension": "blob",
"render": render_blob,
"can_render": lambda: False,
}
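The `?_blob_hash=` mechanism above pins a download link to specific binary content: the link embeds a SHA-256 digest of the BLOB, and the renderer refuses to serve a row whose value no longer matches. A small sketch (the table and column names are hypothetical):

```python
import hashlib

# Hypothetical BLOB cell contents - the first bytes of a PNG file
value = b"\x89PNG\r\n\x1a\n"

# The digest that would be embedded in the download link
blob_hash = hashlib.sha256(value).hexdigest()
url = f"/db/images/1.blob?_blob_column=thumbnail&_blob_hash={blob_hash}"
print(url)
```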
@@ -2,84 +2,156 @@ import asyncio
import uvicorn
import click
from click import formatting
from click.types import CompositeParamType
from click_default_group import DefaultGroup
import functools
import json
import os
import pathlib
from runpy import run_module
import shutil
from subprocess import call
import sys
import textwrap
import webbrowser
from .app import (
    Datasette,
    DEFAULT_SETTINGS,
    SETTINGS,
    SQLITE_LIMIT_ATTACHED,
    pm,
)
from .utils import (
    LoadExtension,
    StartupError,
    check_connection,
    deep_dict_update,
    find_spatialite,
    parse_metadata,
    ConnectionProblem,
    SpatialiteConnectionProblem,
    initial_path_for_datasette,
    pairs_to_nested_config,
    temporary_docker_directory,
    value_as_boolean,
    SpatialiteNotFound,
    StaticMount,
    ValueAsBooleanError,
)
from .utils.sqlite import sqlite3
from .utils.testing import TestClient
from .version import __version__
def run_sync(coro_func):
    """Run an async callable to completion on a fresh event loop."""
    loop = asyncio.new_event_loop()
    try:
        asyncio.set_event_loop(loop)
        return loop.run_until_complete(coro_func())
    finally:
        asyncio.set_event_loop(None)
        loop.close()
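`run_sync()` replaces the old reliance on `asyncio.get_event_loop()`: each call gets a dedicated loop that is torn down afterwards. A standalone sketch of the same helper:

```python
import asyncio


def run_sync(coro_func):
    """Run an async callable to completion on a dedicated event loop."""
    loop = asyncio.new_event_loop()
    try:
        asyncio.set_event_loop(loop)
        return loop.run_until_complete(coro_func())
    finally:
        asyncio.set_event_loop(None)
        loop.close()


async def fetch_number():
    await asyncio.sleep(0)
    return 42


print(run_sync(fetch_number))  # 42
```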
# Use Rich for tracebacks if it is installed
try:
from rich.traceback import install
install(show_locals=True)
except ImportError:
pass
class Setting(CompositeParamType):
    name = "setting"
    arity = 2

    def convert(self, config, param, ctx):
        name, value = config
        if name in DEFAULT_SETTINGS:
            # For backwards compatibility with how this worked prior to
            # Datasette 1.0, we turn bare setting names into settings.name
            # Type checking for those older settings
            default = DEFAULT_SETTINGS[name]
            name = "settings.{}".format(name)
            if isinstance(default, bool):
                try:
                    return name, "true" if value_as_boolean(value) else "false"
                except ValueAsBooleanError:
                    self.fail(f'"{name}" should be on/off/true/false/1/0', param, ctx)
            elif isinstance(default, int):
                if not value.isdigit():
                    self.fail(f'"{name}" should be an integer', param, ctx)
                return name, value
            elif isinstance(default, str):
                return name, value
            else:
                # Should never happen:
                self.fail("Invalid option")
        return name, value
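To illustrate the coercion above: bare pre-1.0 setting names are rewritten to `settings.<name>` and the value is type-checked against the default's type, while dotted names pass through untouched. A standalone sketch (the `DEFAULT_SETTINGS` subset and truthy values here are assumptions for illustration):

```python
# Assumed subset of defaults, for illustration only
DEFAULT_SETTINGS = {"default_page_size": 100, "allow_facet": True}

TRUTHY = ("on", "true", "1")


def convert(name, value):
    if name in DEFAULT_SETTINGS:
        # Bare pre-1.0 names become settings.<name>, type-checked against
        # the type of their default value
        default = DEFAULT_SETTINGS[name]
        name = "settings.{}".format(name)
        if isinstance(default, bool):
            return name, "true" if value.lower() in TRUTHY else "false"
        if isinstance(default, int) and not value.isdigit():
            raise ValueError(f'"{name}" should be an integer')
    return name, value


print(convert("allow_facet", "off"))  # ('settings.allow_facet', 'false')
```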
def sqlite_extensions(fn):
    fn = click.option(
        "sqlite_extensions",
        "--load-extension",
        type=LoadExtension(),
        envvar="DATASETTE_LOAD_EXTENSION",
        multiple=True,
        help="Path to a SQLite extension to load, and optional entrypoint",
    )(fn)

    # Wrap it in a custom error handler
    @functools.wraps(fn)
    def wrapped(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except AttributeError as e:
            if "enable_load_extension" in str(e):
                raise click.ClickException(
                    textwrap.dedent(
                        """
                        Your Python installation does not have the ability to load SQLite extensions.

                        More information: https://datasette.io/help/extensions
                        """
                    ).strip()
                )
            raise

    return wrapped
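The error-translating decorator pattern used by `sqlite_extensions()` above can be sketched generically: intercept one specific `AttributeError` and turn it into a friendly error, while re-raising everything else untouched (the names here are illustrative, not Datasette's):

```python
import functools


def translate_attribute_error(fragment, friendly_message):
    """Wrap fn so that a matching AttributeError becomes a RuntimeError."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapped(*args, **kwargs):
            try:
                return fn(*args, **kwargs)
            except AttributeError as e:
                if fragment in str(e):
                    raise RuntimeError(friendly_message) from e
                raise  # unrelated AttributeErrors propagate unchanged
        return wrapped
    return decorator


@translate_attribute_error("enable_load_extension", "SQLite extensions unavailable")
def load_extension():
    raise AttributeError("'Connection' object has no attribute 'enable_load_extension'")
```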
@click.group(cls=DefaultGroup, default="serve", default_if_no_args=True)
@click.version_option(version=__version__)
def cli():
    """
    Datasette is an open source multi-tool for exploring and publishing data

    \b
    About Datasette: https://datasette.io/
    Full documentation: https://docs.datasette.io/
    """
@cli.command()
@click.argument("files", type=click.Path(exists=True), nargs=-1)
@click.option("--inspect-file", default="-")
@sqlite_extensions
def inspect(files, inspect_file, sqlite_extensions):
    """
    Generate JSON summary of provided database files

    This can then be passed to "datasette --inspect-file" to speed up count
    operations against immutable database files.
    """
    inspect_data = run_sync(lambda: inspect_(files, sqlite_extensions))
    if inspect_file == "-":
        sys.stdout.write(json.dumps(inspect_data, indent=2))
    else:
        with open(inspect_file, "w") as fp:
            fp.write(json.dumps(inspect_data, indent=2))
async def inspect_(files, sqlite_extensions):
@@ -99,18 +171,9 @@ async def inspect_(files, sqlite_extensions):
    return data
@cli.group()
def publish():
    """Publish specified SQLite database files to the internet along with a Datasette-powered interface and API"""
    pass
@@ -120,15 +183,23 @@ pm.hook.publish_subcommand(publish=publish)
@cli.command()
@click.option("--all", help="Include built-in default plugins", is_flag=True)
@click.option(
    "--requirements", help="Output requirements.txt of installed plugins", is_flag=True
)
@click.option(
    "--plugins-dir",
    type=click.Path(exists=True, file_okay=False, dir_okay=True),
    help="Path to directory containing custom plugins",
)
def plugins(all, requirements, plugins_dir):
    """List currently installed plugins"""
    app = Datasette([], plugins_dir=plugins_dir)
    if requirements:
        for plugin in app._plugins():
            if plugin["version"]:
                click.echo("{}=={}".format(plugin["name"], plugin["version"]))
    else:
        click.echo(json.dumps(app._plugins(all=all), indent=4))
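The `--requirements` branch above only pins plugins that report a version, skipping built-ins that do not. A minimal sketch of that formatting step (`to_requirements` is a hypothetical helper, not part of Datasette):

```python
# Hypothetical sketch: format plugin metadata the way
# "datasette plugins --requirements" emits pip-style pins.
def to_requirements(plugins):
    """Return requirements.txt lines for plugins that report a version."""
    return [
        "{}=={}".format(p["name"], p["version"])
        for p in plugins
        if p["version"]
    ]


lines = to_requirements(
    [
        {"name": "datasette-cluster-map", "version": "0.18.2"},
        {"name": "builtin-plugin", "version": None},
    ]
)
print("\n".join(lines))
```

Plugins without a version (such as built-in default plugins) are silently omitted, so the output is always installable with `pip install -r`.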
@cli.command()
@@ -142,10 +213,10 @@ def plugins(all, plugins_dir):
    "-m",
    "--metadata",
    type=click.File(mode="r"),
    help="Path to JSON/YAML file containing metadata to publish",
)
@click.option("--extra-options", help="Extra options to pass to datasette serve")
@click.option("--branch", help="Install datasette from a GitHub branch e.g. main")
@click.option(
    "--template-dir",
    type=click.Path(exists=True, file_okay=False, dir_okay=True),
@@ -159,7 +230,7 @@ def plugins(all, plugins_dir):
@click.option(
    "--static",
    type=StaticMount(),
    help="Serve static files from this directory at /MOUNT/...",
    multiple=True,
)
@click.option(
@@ -167,6 +238,19 @@ def plugins(all, plugins_dir):
)
@click.option("--spatialite", is_flag=True, help="Enable SpatiaLite extension")
@click.option("--version-note", help="Additional note to show on /-/versions")
@click.option(
    "--secret",
    help="Secret used for signing secure values, such as signed cookies",
    envvar="DATASETTE_PUBLISH_SECRET",
    default=lambda: os.urandom(32).hex(),
)
@click.option(
    "-p",
    "--port",
    default=8001,
    type=click.IntRange(1, 65535),
    help="Port to run the server on, defaults to 8001",
)
@click.option("--title", help="Title for metadata")
@click.option("--license", help="License label for metadata")
@click.option("--license_url", help="License URL for metadata")
@@ -186,9 +270,11 @@ def package(
    install,
    spatialite,
    version_note,
    secret,
    port,
    **extra_metadata,
):
    """Package SQLite files into a Datasette Docker container"""
    if not shutil.which("docker"):
        click.secho(
            ' The package command requires "docker" to be installed and configured ',
@@ -201,16 +287,18 @@ def package(
    with temporary_docker_directory(
        files,
        "datasette",
        metadata=metadata,
        extra_options=extra_options,
        branch=branch,
        template_dir=template_dir,
        plugins_dir=plugins_dir,
        static=static,
        install=install,
        spatialite=spatialite,
        version_note=version_note,
        secret=secret,
        extra_metadata=extra_metadata,
        port=port,
    ):
        args = ["docker", "build"]
        if tag:
@@ -221,7 +309,48 @@ def package(
@cli.command()
@click.argument("packages", nargs=-1)
@click.option(
    "-U", "--upgrade", is_flag=True, help="Upgrade packages to latest version"
)
@click.option(
    "-r",
    "--requirement",
    type=click.Path(exists=True),
    help="Install from requirements file",
)
@click.option(
    "-e",
    "--editable",
    help="Install a project in editable mode from this path",
)
def install(packages, upgrade, requirement, editable):
    """Install plugins and packages from PyPI into the same environment as Datasette"""
    if not packages and not requirement and not editable:
        raise click.UsageError("Please specify at least one package to install")
    args = ["pip", "install"]
    if upgrade:
        args += ["--upgrade"]
    if editable:
        args += ["--editable", editable]
    if requirement:
        args += ["-r", requirement]
    args += list(packages)
    sys.argv = args
    run_module("pip", run_name="__main__")


@cli.command()
@click.argument("packages", nargs=-1, required=True)
@click.option("-y", "--yes", is_flag=True, help="Don't ask for confirmation")
def uninstall(packages, yes):
    """Uninstall plugins and Python packages from the Datasette environment"""
    sys.argv = ["pip", "uninstall"] + list(packages) + (["-y"] if yes else [])
    run_module("pip", run_name="__main__")
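Both commands delegate to pip by rewriting `sys.argv` and re-running the `pip` module in-process, which guarantees packages land in the same environment Datasette itself runs from. A sketch of the argv assembly (`build_pip_args` is a hypothetical helper extracted for illustration):

```python
# Minimal sketch of how "datasette install" assembles a pip argv
# before handing off to run_module("pip", run_name="__main__").
# build_pip_args is a hypothetical helper, not part of Datasette.
def build_pip_args(packages, upgrade=False, requirement=None, editable=None):
    args = ["pip", "install"]
    if upgrade:
        args += ["--upgrade"]
    if editable:
        args += ["--editable", editable]
    if requirement:
        args += ["-r", requirement]
    return args + list(packages)


assert build_pip_args(["datasette-vega"], upgrade=True) == [
    "pip", "install", "--upgrade", "datasette-vega"
]
```

Shelling out to a bare `pip` executable instead could hit a different Python environment; running the module in-process avoids that class of bug.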
@cli.command()
@click.argument("files", type=click.Path(), nargs=-1)
@click.option(
    "-i",
    "--immutable",
@@ -230,28 +359,35 @@ def package(
    multiple=True,
)
@click.option(
    "-h",
    "--host",
    default="127.0.0.1",
    help=(
        "Host for server. Defaults to 127.0.0.1 which means only connections "
        "from the local machine will be allowed. Use 0.0.0.0 to listen to "
        "all IPs and allow access from other machines."
    ),
)
@click.option(
    "-p",
    "--port",
    default=8001,
    type=click.IntRange(0, 65535),
    help="Port for server, defaults to 8001. Use -p 0 to automatically assign an available port.",
)
@click.option(
    "--uds",
    help="Bind to a Unix domain socket",
)
@click.option(
    "--reload",
    is_flag=True,
    help="Automatically reload if code or metadata change detected - useful for development",
)
@click.option(
    "--cors", is_flag=True, help="Enable CORS by serving Access-Control-Allow-Origin: *"
)
@sqlite_extensions
@click.option(
    "--inspect-file", help='Path to JSON file created using "datasette inspect"'
)
@@ -259,7 +395,7 @@ def package(
    "-m",
    "--metadata",
    type=click.File(mode="r"),
    help="Path to JSON/YAML file containing license/source metadata",
)
@click.option(
    "--template-dir",
@@ -274,24 +410,102 @@ def package(
@click.option(
    "--static",
    type=StaticMount(),
    help="Serve static files from this directory at /MOUNT/...",
    multiple=True,
)
@click.option("--memory", is_flag=True, help="Make /_memory database available")
@click.option(
    "-c",
    "--config",
    type=click.File(mode="r"),
    help="Path to JSON/YAML Datasette configuration file",
)
@click.option(
    "-s",
    "--setting",
    "settings",
    type=Setting(),
    help="nested.key, value setting to use in Datasette configuration",
    multiple=True,
)
@click.option(
    "--secret",
    help="Secret used for signing secure values, such as signed cookies",
    envvar="DATASETTE_SECRET",
)
@click.option(
    "--root",
    help="Output URL that sets a cookie authenticating the root user",
    is_flag=True,
)
@click.option(
    "--default-deny",
    help="Deny all permissions by default",
    is_flag=True,
)
@click.option(
    "--get",
    help="Run an HTTP GET request against this path, print results and exit",
)
@click.option(
    "--headers",
    is_flag=True,
    help="Include HTTP headers in --get output",
)
@click.option(
    "--token",
    help="API token to send with --get requests",
)
@click.option(
    "--actor",
    help="Actor to use for --get requests (JSON string)",
)
@click.option("--version-note", help="Additional note to show on /-/versions")
@click.option("--help-settings", is_flag=True, help="Show available settings")
@click.option("--pdb", is_flag=True, help="Launch debugger on any errors")
@click.option(
    "-o",
    "--open",
    "open_browser",
    is_flag=True,
    help="Open Datasette in your web browser",
)
@click.option(
    "--create",
    is_flag=True,
    help="Create database files if they do not exist",
)
@click.option(
    "--crossdb",
    is_flag=True,
    help="Enable cross-database joins using the /_memory database",
)
@click.option(
    "--nolock",
    is_flag=True,
    help="Ignore locking, open locked files in read-only mode",
)
@click.option(
    "--ssl-keyfile",
    help="SSL key file",
    envvar="DATASETTE_SSL_KEYFILE",
)
@click.option(
    "--ssl-certfile",
    help="SSL certificate file",
    envvar="DATASETTE_SSL_CERTFILE",
)
@click.option(
    "--internal",
    type=click.Path(),
    help="Path to a persistent Datasette internal SQLite database",
)
def serve(
    files,
    immutable,
    host,
    port,
    uds,
    reload,
    cors,
    sqlite_extensions,
@@ -302,17 +516,34 @@ def serve(
    static,
    memory,
    config,
    settings,
    secret,
    root,
    default_deny,
    get,
    headers,
    token,
    actor,
    version_note,
    help_settings,
    pdb,
    open_browser,
    create,
    crossdb,
    nolock,
    ssl_keyfile,
    ssl_certfile,
    internal,
    return_instance=False,
):
    """Serve up specified SQLite database files with a web UI"""
    if help_settings:
        formatter = formatting.HelpFormatter()
        with formatter.section("Settings"):
            formatter.write_dl(
                [
                    (option.name, f"{option.help} (default={option.default})")
                    for option in SETTINGS
                ]
            )
        click.echo(formatter.getvalue())
@@ -321,38 +552,342 @@ def serve(
        import hupper

        reloader = hupper.start_reloader("datasette.cli.serve")
        if immutable:
            reloader.watch_files(immutable)
        if config:
            reloader.watch_files([config.name])
        if metadata:
            reloader.watch_files([metadata.name])

    inspect_data = None
    if inspect_file:
        with open(inspect_file) as fp:
            inspect_data = json.load(fp)

    metadata_data = None
    if metadata:
        metadata_data = parse_metadata(metadata.read())

    config_data = None
    if config:
        config_data = parse_metadata(config.read())

    config_data = config_data or {}

    # Merge in settings from -s/--setting
    if settings:
        settings_updates = pairs_to_nested_config(settings)
        # Merge recursively, to avoid over-writing nested values
        # https://github.com/simonw/datasette/issues/2389
        deep_dict_update(config_data, settings_updates)

    kwargs = dict(
        immutables=immutable,
        cache_headers=not reload,
        cors=cors,
        inspect_data=inspect_data,
        config=config_data,
        metadata=metadata_data,
        sqlite_extensions=sqlite_extensions,
        template_dir=template_dir,
        plugins_dir=plugins_dir,
        static_mounts=static,
        settings=None,  # These are passed in config= now
        memory=memory,
        secret=secret,
        version_note=version_note,
        pdb=pdb,
        crossdb=crossdb,
        nolock=nolock,
        internal=internal,
        default_deny=default_deny,
    )

    # Separate directories from files
    directories = [f for f in files if os.path.isdir(f)]
    file_paths = [f for f in files if not os.path.isdir(f)]

    # Handle config_dir - only one directory allowed
    if len(directories) > 1:
        raise click.ClickException(
            "Cannot pass multiple directories. Pass a single directory as config_dir."
        )
    elif len(directories) == 1:
        kwargs["config_dir"] = pathlib.Path(directories[0])

    # Verify list of files, create if needed (and --create)
    for file in file_paths:
        if not pathlib.Path(file).exists():
            if create:
                sqlite3.connect(file).execute("vacuum")
            else:
                raise click.ClickException(
                    "Invalid value for '[FILES]...': Path '{}' does not exist.".format(
                        file
                    )
                )

    # Check for duplicate files by resolving all paths to their absolute forms
    # Collect all database files that will be loaded (explicit files + config_dir files)
    all_db_files = []
    # Add explicit files
    for file in file_paths:
        all_db_files.append((file, pathlib.Path(file).resolve()))
    # Add config_dir databases if config_dir is set
    if "config_dir" in kwargs:
        config_dir = kwargs["config_dir"]
        for ext in ("db", "sqlite", "sqlite3"):
            for db_file in config_dir.glob(f"*.{ext}"):
                all_db_files.append((str(db_file), db_file.resolve()))
    # Check for duplicates
    seen = {}
    for original_path, resolved_path in all_db_files:
        if resolved_path in seen:
            raise click.ClickException(
                f"Duplicate database file: '{original_path}' and '{seen[resolved_path]}' "
                f"both refer to {resolved_path}"
            )
        seen[resolved_path] = original_path

    files = file_paths

    try:
        ds = Datasette(files, **kwargs)
    except SpatialiteNotFound:
        raise click.ClickException("Could not find SpatiaLite extension")
    except StartupError as e:
        raise click.ClickException(e.args[0])

    if return_instance:
        # Private utility mechanism for writing unit tests
        return ds

    # Run the "startup" plugin hooks
    run_sync(ds.invoke_startup)

    # Run async soundness checks - but only if we're not under pytest
    run_sync(lambda: check_databases(ds))

    if headers and not get:
        raise click.ClickException("--headers can only be used with --get")
    if token and not get:
        raise click.ClickException("--token can only be used with --get")

    if get:
        client = TestClient(ds)
        request_headers = {}
        if token:
            request_headers["Authorization"] = "Bearer {}".format(token)
        cookies = {}
        if actor:
            cookies["ds_actor"] = client.actor_cookie(json.loads(actor))
        response = client.get(get, headers=request_headers, cookies=cookies)
        if headers:
            # Output HTTP status code, headers, two newlines, then the response body
            click.echo(f"HTTP/1.1 {response.status}")
            for key, value in response.headers.items():
                click.echo(f"{key}: {value}")
            if response.text:
                click.echo()
                click.echo(response.text)
        else:
            click.echo(response.text)
        exit_code = 0 if response.status == 200 else 1
        sys.exit(exit_code)
        return
    # Start the server
    url = None
    if root:
        ds.root_enabled = True
        url = "http://{}:{}{}?token={}".format(
            host, port, ds.urls.path("-/auth-token"), ds._root_token
        )
        click.echo(url)
    if open_browser:
        if url is None:
            # Figure out most convenient URL - to table, database or homepage
            path = run_sync(lambda: initial_path_for_datasette(ds))
            url = f"http://{host}:{port}{path}"
        webbrowser.open(url)
    uvicorn_kwargs = dict(
        host=host, port=port, log_level="info", lifespan="on", workers=1
    )
    if uds:
        uvicorn_kwargs["uds"] = uds
    if ssl_keyfile:
        uvicorn_kwargs["ssl_keyfile"] = ssl_keyfile
    if ssl_certfile:
        uvicorn_kwargs["ssl_certfile"] = ssl_certfile
    uvicorn.run(ds.app(), **uvicorn_kwargs)
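The Uvicorn keyword arguments are built up conditionally so that optional transports (a Unix domain socket, TLS key and certificate) are only passed when the corresponding options were supplied. A sketch of that assembly, with `build_uvicorn_kwargs` as a hypothetical helper name:

```python
# Sketch of how serve() only forwards optional transport settings
# (--uds, --ssl-keyfile, --ssl-certfile) to uvicorn when they were
# actually given. build_uvicorn_kwargs is a hypothetical helper.
def build_uvicorn_kwargs(host, port, uds=None, ssl_keyfile=None, ssl_certfile=None):
    kwargs = dict(host=host, port=port, log_level="info", lifespan="on", workers=1)
    if uds:
        kwargs["uds"] = uds
    if ssl_keyfile:
        kwargs["ssl_keyfile"] = ssl_keyfile
    if ssl_certfile:
        kwargs["ssl_certfile"] = ssl_certfile
    return kwargs
```

Omitting absent keys entirely (rather than passing `None`) lets Uvicorn apply its own defaults for each transport.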
@cli.command()
@click.argument("id")
@click.option(
    "--secret",
    help="Secret used for signing the API tokens",
    envvar="DATASETTE_SECRET",
    required=True,
)
@click.option(
    "-e",
    "--expires-after",
    help="Token should expire after this many seconds",
    type=int,
)
@click.option(
    "alls",
    "-a",
    "--all",
    type=str,
    metavar="ACTION",
    multiple=True,
    help="Restrict token to this action",
)
@click.option(
    "databases",
    "-d",
    "--database",
    type=(str, str),
    metavar="DB ACTION",
    multiple=True,
    help="Restrict token to this action on this database",
)
@click.option(
    "resources",
    "-r",
    "--resource",
    type=(str, str, str),
    metavar="DB RESOURCE ACTION",
    multiple=True,
    help="Restrict token to this action on this database resource (a table, SQL view or named query)",
)
@click.option(
    "--debug",
    help="Show decoded token",
    is_flag=True,
)
@click.option(
    "--plugins-dir",
    type=click.Path(exists=True, file_okay=False, dir_okay=True),
    help="Path to directory containing custom plugins",
)
def create_token(
    id, secret, expires_after, alls, databases, resources, debug, plugins_dir
):
    """
    Create a signed API token for the specified actor ID

    Example:

        datasette create-token root --secret mysecret

    To allow only "view-database-download" for all databases:

    \b
        datasette create-token root --secret mysecret \\
            --all view-database-download

    To allow "create-table" against a specific database:

    \b
        datasette create-token root --secret mysecret \\
            --database mydb create-table

    To allow "insert-row" against a specific table:

    \b
        datasette create-token root --secret mysecret \\
            --resource mydb mytable insert-row

    Restricted actions can be specified multiple times using
    multiple --all, --database, and --resource options.

    Add --debug to see a decoded version of the token.
    """
    ds = Datasette(secret=secret, plugins_dir=plugins_dir)

    # Run ds.invoke_startup() in an event loop
    run_sync(ds.invoke_startup)

    # Warn about any unknown actions
    actions = []
    actions.extend(alls)
    actions.extend([p[1] for p in databases])
    actions.extend([p[2] for p in resources])
    for action in actions:
        if not ds.actions.get(action):
            click.secho(
                f" Unknown permission: {action} ",
                fg="red",
                err=True,
            )

    restrict_database = {}
    for database, action in databases:
        restrict_database.setdefault(database, []).append(action)
    restrict_resource = {}
    for database, resource, action in resources:
        restrict_resource.setdefault(database, {}).setdefault(resource, []).append(
            action
        )

    token = ds.create_token(
        id,
        expires_after=expires_after,
        restrict_all=alls,
        restrict_database=restrict_database,
        restrict_resource=restrict_resource,
    )
    click.echo(token)
    if debug:
        encoded = token[len("dstok_") :]
        click.echo("\nDecoded:\n")
        click.echo(json.dumps(ds.unsign(encoded, namespace="token"), indent=2))
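The repeated `-d DB ACTION` and `-r DB RESOURCE ACTION` tuples are folded into nested dictionaries before being handed to `ds.create_token()`. A sketch of that folding step (`build_restrictions` is a hypothetical helper name):

```python
# Sketch of how create-token folds repeated -d/--database and
# -r/--resource option tuples into the nested restriction dicts.
# build_restrictions is a hypothetical helper, not part of Datasette.
def build_restrictions(databases, resources):
    restrict_database = {}
    for database, action in databases:
        restrict_database.setdefault(database, []).append(action)
    restrict_resource = {}
    for database, resource, action in resources:
        restrict_resource.setdefault(database, {}).setdefault(resource, []).append(
            action
        )
    return restrict_database, restrict_resource


rd, rr = build_restrictions(
    [("mydb", "view-table"), ("mydb", "insert-row")],
    [("mydb", "mytable", "insert-row")],
)
```

`setdefault` keeps the code short while allowing the same database or resource to be named in several options.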
pm.hook.register_commands(cli=cli)
async def check_databases(ds):
    # Run check_connection against every connected database
    # to confirm they are all usable
    for database in list(ds.databases.values()):
        try:
            await database.execute_fn(check_connection)
        except SpatialiteConnectionProblem:
            suggestion = ""
            try:
                find_spatialite()
                suggestion = "\n\nTry adding the --load-extension=spatialite option."
            except SpatialiteNotFound:
                pass
            raise click.UsageError(
                "It looks like you're trying to load a SpatiaLite"
                + " database without first loading the SpatiaLite module."
                + suggestion
                + "\n\nRead more: https://docs.datasette.io/en/stable/spatialite.html"
            )
        except ConnectionProblem as e:
            raise click.UsageError(
                f"Connection to {database.path} failed check: {str(e.args[0])}"
            )
    # If --crossdb and more than SQLITE_LIMIT_ATTACHED show warning
    if (
        ds.crossdb
        and len([db for db in ds.databases.values() if not db.is_memory])
        > SQLITE_LIMIT_ATTACHED
    ):
        msg = (
            "Warning: --crossdb only works with the first {} attached databases".format(
                SQLITE_LIMIT_ATTACHED
            )
        )
        click.echo(click.style(msg, bold=True, fg="yellow"), err=True)
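The `--crossdb` warning above exists because SQLite can only ATTACH a limited number of databases to a single connection (the `SQLITE_LIMIT_ATTACHED` compile-time limit, 10 by default), so any files beyond that are silently left out of `/_memory` joins. A minimal sketch of the overflow check, assuming the default limit:

```python
# Hedged sketch of the --crossdb warning condition. SQLITE_LIMIT_ATTACHED
# is assumed to be the SQLite default of 10; real builds may differ.
SQLITE_LIMIT_ATTACHED = 10


def crossdb_overflow(crossdb_enabled, database_paths):
    """True when --crossdb cannot attach every non-memory database."""
    return crossdb_enabled and len(database_paths) > SQLITE_LIMIT_ATTACHED
```

Emitting the warning to stderr (rather than failing) keeps the server usable while still flagging the silent truncation.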
@@ -1,46 +1,380 @@
import asyncio
from collections import namedtuple
from pathlib import Path
import janus
import queue
import sqlite_utils
import sys
import threading
import uuid

from .tracer import trace
from .utils import (
    QueryInterrupted,
    detect_fts,
    detect_primary_keys,
    detect_spatialite,
    get_all_foreign_keys,
    get_outbound_foreign_keys,
    md5_not_usedforsecurity,
    sqlite_timelimit,
    sqlite3,
    table_columns,
    table_column_details,
)
from .utils.sqlite import sqlite_version
from .inspect import inspect_hash

connections = threading.local()

AttachedDatabase = namedtuple("AttachedDatabase", ("seq", "name", "file"))
class Database:
    # For table counts stop at this many rows:
    count_limit = 10000
    _thread_local_id_counter = 1

    def __init__(
        self,
        ds,
        path=None,
        is_mutable=True,
        is_memory=False,
        memory_name=None,
        mode=None,
    ):
        self.name = None
        self._thread_local_id = f"x{self._thread_local_id_counter}"
        Database._thread_local_id_counter += 1
        self.route = None
        self.ds = ds
        self.path = path
        self.is_mutable = is_mutable
        self.is_memory = is_memory
        self.memory_name = memory_name
        if memory_name is not None:
            self.is_memory = True
        self.cached_hash = None
        self.cached_size = None
        self._cached_table_counts = None
        self._write_thread = None
        self._write_queue = None
        # These are used when in non-threaded mode:
        self._read_connection = None
        self._write_connection = None
        # This is used to track all file connections so they can be closed
        self._all_file_connections = []
        self.mode = mode
    @property
    def cached_table_counts(self):
        if self._cached_table_counts is not None:
            return self._cached_table_counts
        # Maybe use self.ds.inspect_data to populate cached_table_counts
        if self.ds.inspect_data and self.ds.inspect_data.get(self.name):
            self._cached_table_counts = {
                key: value["count"]
                for key, value in self.ds.inspect_data[self.name]["tables"].items()
            }
        return self._cached_table_counts

    @property
    def color(self):
        if self.hash:
            return self.hash[:6]
        return md5_not_usedforsecurity(self.name)[:6]

    def suggest_name(self):
        if self.path:
            return Path(self.path).stem
        elif self.memory_name:
            return self.memory_name
        else:
            return "db"
    def connect(self, write=False):
        extra_kwargs = {}
        if write:
            extra_kwargs["isolation_level"] = "IMMEDIATE"
        if self.memory_name:
            uri = "file:{}?mode=memory&cache=shared".format(self.memory_name)
            conn = sqlite3.connect(
                uri, uri=True, check_same_thread=False, **extra_kwargs
            )
            if not write:
                conn.execute("PRAGMA query_only=1")
            return conn
        if self.is_memory:
            return sqlite3.connect(":memory:", uri=True)

        # mode=ro or immutable=1?
        if self.is_mutable:
            qs = "?mode=ro"
            if self.ds.nolock:
                qs += "&nolock=1"
        else:
            qs = "?immutable=1"
        assert not (write and not self.is_mutable)
        if write:
            qs = ""
        if self.mode is not None:
            qs = f"?mode={self.mode}"
        conn = sqlite3.connect(
            f"file:{self.path}{qs}", uri=True, check_same_thread=False, **extra_kwargs
        )
        self._all_file_connections.append(conn)
        return conn
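The query-string selection in `connect()` encodes the access policy: mutable files get a read-only connection (optionally with `nolock=1`), immutable files use SQLite's `immutable=1` URI flag, the single write connection uses no flags, and an explicit `mode` override wins over everything. A pure-function sketch of that decision (`uri_query_string` is a hypothetical extraction for illustration):

```python
# Sketch of the SQLite URI query-string logic in Database.connect().
# uri_query_string is a hypothetical helper, not part of Datasette.
def uri_query_string(is_mutable, write=False, nolock=False, mode=None):
    if is_mutable:
        qs = "?mode=ro"  # read-only connection to a mutable file
        if nolock:
            qs += "&nolock=1"
    else:
        qs = "?immutable=1"  # promise SQLite the file will never change
    if write:
        qs = ""  # the single write connection opens with no flags
    if mode is not None:
        qs = f"?mode={mode}"  # explicit override wins
    return qs
```

`immutable=1` is stronger than `mode=ro`: it lets SQLite skip locking and change detection entirely, which is why immutable files can be served faster.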
    def close(self):
        # Close all connections - useful to avoid running out of file handles in tests
        for connection in self._all_file_connections:
            connection.close()

    async def execute_write(self, sql, params=None, block=True):
        def _inner(conn):
            return conn.execute(sql, params or [])

        with trace("sql", database=self.name, sql=sql.strip(), params=params):
            results = await self.execute_write_fn(_inner, block=block)
        return results

    async def execute_write_script(self, sql, block=True):
        def _inner(conn):
            return conn.executescript(sql)

        with trace("sql", database=self.name, sql=sql.strip(), executescript=True):
            results = await self.execute_write_fn(
                _inner, block=block, transaction=False
            )
        return results

    async def execute_write_many(self, sql, params_seq, block=True):
        def _inner(conn):
            count = 0

            def count_params(params):
                nonlocal count
                for param in params:
                    count += 1
                    yield param

            return conn.executemany(sql, count_params(params_seq)), count
        with trace(
            "sql", database=self.name, sql=sql.strip(), executemany=True
        ) as kwargs:
            results, count = await self.execute_write_fn(_inner, block=block)
            kwargs["count"] = count
        return results

    async def execute_isolated_fn(self, fn):
        # Open a new connection just for the duration of this function
        # blocking the write queue to avoid any writes occurring during it
        if self.ds.executor is None:
            # non-threaded mode
            isolated_connection = self.connect(write=True)
            try:
                result = fn(isolated_connection)
            finally:
                isolated_connection.close()
                try:
                    self._all_file_connections.remove(isolated_connection)
                except ValueError:
                    # Was probably a memory connection
                    pass
            return result
        else:
            # Threaded mode - send to write thread
            return await self._send_to_write_thread(fn, isolated_connection=True)

    async def execute_write_fn(self, fn, block=True, transaction=True):
        if self.ds.executor is None:
            # non-threaded mode
            if self._write_connection is None:
                self._write_connection = self.connect(write=True)
                self.ds._prepare_connection(self._write_connection, self.name)
            if transaction:
                with self._write_connection:
                    return fn(self._write_connection)
            else:
                return fn(self._write_connection)
        else:
            return await self._send_to_write_thread(
                fn, block=block, transaction=transaction
            )

    async def _send_to_write_thread(
        self, fn, block=True, isolated_connection=False, transaction=True
    ):
        if self._write_queue is None:
            self._write_queue = queue.Queue()
        if self._write_thread is None:
            self._write_thread = threading.Thread(
                target=self._execute_writes, daemon=True
            )
            self._write_thread.name = "_execute_writes for database {}".format(
                self.name
            )
            self._write_thread.start()
        task_id = uuid.uuid5(uuid.NAMESPACE_DNS, "datasette.io")
        reply_queue = janus.Queue()
        self._write_queue.put(
            WriteTask(fn, task_id, reply_queue, isolated_connection, transaction)
        )
        if block:
            result = await reply_queue.async_q.get()
            if isinstance(result, Exception):
                raise result
            else:
                return result
        else:
            return task_id
    def _execute_writes(self):
        # Infinite looping thread that protects the single write connection
        # to this database
        conn_exception = None
        conn = None
        try:
            conn = self.connect(write=True)
            self.ds._prepare_connection(conn, self.name)
        except Exception as e:
            conn_exception = e
        while True:
            task = self._write_queue.get()
            if conn_exception is not None:
                result = conn_exception
            else:
                if task.isolated_connection:
                    isolated_connection = self.connect(write=True)
                    try:
                        result = task.fn(isolated_connection)
                    except Exception as e:
                        sys.stderr.write("{}\n".format(e))
                        sys.stderr.flush()
                        result = e
                    finally:
                        isolated_connection.close()
                        try:
                            self._all_file_connections.remove(isolated_connection)
                        except ValueError:
                            # Was probably a memory connection
                            pass
                else:
                    try:
                        if task.transaction:
                            with conn:
                                result = task.fn(conn)
                        else:
                            result = task.fn(conn)
                    except Exception as e:
                        sys.stderr.write("{}\n".format(e))
                        sys.stderr.flush()
                        result = e
            task.reply_queue.sync_q.put(result)
    async def execute_fn(self, fn):
        if self.ds.executor is None:
            # non-threaded mode
            if self._read_connection is None:
                self._read_connection = self.connect()
                self.ds._prepare_connection(self._read_connection, self.name)
            return fn(self._read_connection)

        # threaded mode
        def in_thread():
            conn = getattr(connections, self._thread_local_id, None)
            if not conn:
                conn = self.connect()
                self.ds._prepare_connection(conn, self.name)
                setattr(connections, self._thread_local_id, conn)
            return fn(conn)

        return await asyncio.get_event_loop().run_in_executor(
            self.ds.executor, in_thread
        )
    async def execute(
        self,
        sql,
        params=None,
        truncate=False,
        custom_time_limit=None,
        page_size=None,
        log_sql_errors=True,
    ):
        """Executes sql against db_name in a thread"""
        page_size = page_size or self.ds.page_size

        def sql_operation_in_thread(conn):
            time_limit_ms = self.ds.sql_time_limit_ms
            if custom_time_limit and custom_time_limit < time_limit_ms:
                time_limit_ms = custom_time_limit

            with sqlite_timelimit(conn, time_limit_ms):
                try:
                    cursor = conn.cursor()
                    cursor.execute(sql, params if params is not None else {})
                    max_returned_rows = self.ds.max_returned_rows
                    if max_returned_rows == page_size:
                        max_returned_rows += 1
                    if max_returned_rows and truncate:
                        rows = cursor.fetchmany(max_returned_rows + 1)
                        truncated = len(rows) > max_returned_rows
                        rows = rows[:max_returned_rows]
                    else:
                        rows = cursor.fetchall()
                        truncated = False
                except (sqlite3.OperationalError, sqlite3.DatabaseError) as e:
                    if e.args == ("interrupted",):
                        raise QueryInterrupted(e, sql, params)
                    if log_sql_errors:
                        sys.stderr.write(
                            "ERROR: conn={}, sql = {}, params = {}: {}\n".format(
                                conn, repr(sql), params, e
                            )
                        )
                        sys.stderr.flush()
                    raise

            if truncate:
                return Results(rows, truncated, cursor.description)
            else:
                return Results(rows, False, cursor.description)

        with trace("sql", database=self.name, sql=sql.strip(), params=params):
            results = await self.execute_fn(sql_operation_in_thread)
        return results
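The truncation technique in `execute()` fetches one row more than the cap, so the caller can distinguish "exactly the maximum number of rows" from "more rows were available". A self-contained sketch of just that step (`fetch_truncated` is a hypothetical helper extracted for illustration):

```python
# Sketch of the fetchmany(max + 1) truncation trick used by execute().
# fetch_truncated is a hypothetical helper, not part of Datasette.
import sqlite3


def fetch_truncated(cursor, max_returned_rows):
    rows = cursor.fetchmany(max_returned_rows + 1)
    truncated = len(rows) > max_returned_rows
    return rows[:max_returned_rows], truncated


conn = sqlite3.connect(":memory:")
conn.execute("create table t (n integer)")
conn.executemany("insert into t values (?)", [(i,) for i in range(5)])

rows, truncated = fetch_truncated(conn.execute("select n from t"), 3)
```

This costs at most one extra row of transfer, which is far cheaper than running a separate `count(*)` query to detect whether more data exists.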
    @property
    def hash(self):
        if self.cached_hash is not None:
            return self.cached_hash
        elif self.is_mutable or self.is_memory:
            return None
        elif self.ds.inspect_data and self.ds.inspect_data.get(self.name):
            self.cached_hash = self.ds.inspect_data[self.name]["hash"]
            return self.cached_hash
        else:
            p = Path(self.path)
            self.cached_hash = inspect_hash(p)
            return self.cached_hash
    @property
    def size(self):
        if self.cached_size is not None:
            return self.cached_size
        elif self.is_memory:
            return 0
        elif self.is_mutable:
            return Path(self.path).stat().st_size
        elif self.ds.inspect_data and self.ds.inspect_data.get(self.name):
            self.cached_size = self.ds.inspect_data[self.name]["size"]
            return self.cached_size
        else:
            self.cached_size = Path(self.path).stat().st_size
            return self.cached_size
    async def table_counts(self, limit=10):
        if not self.is_mutable and self.cached_table_counts is not None:
@@ -50,107 +384,225 @@ class Database:
        for table in await self.table_names():
            try:
                table_count = (
                    await self.execute(
                        f"select count(*) from (select * from [{table}] limit {self.count_limit + 1})",
                        custom_time_limit=limit,
                    )
                ).rows[0][0]
                counts[table] = table_count
            # In some cases I saw "SQL Logic Error" here in addition to
            # QueryInterrupted - so we catch that too:
            except (QueryInterrupted, sqlite3.OperationalError, sqlite3.DatabaseError):
                counts[table] = None
        if not self.is_mutable:
            self._cached_table_counts = counts
        return counts
    @property
    def mtime_ns(self):
        if self.is_memory:
            return None
        return Path(self.path).stat().st_mtime_ns
    async def attached_databases(self):
        # This used to be:
        # select seq, name, file from pragma_database_list() where seq > 0
        # But SQLite prior to 3.16.0 doesn't support pragma functions
        results = await self.execute("PRAGMA database_list;")
        # {'seq': 0, 'name': 'main', 'file': ''}
        return [
            AttachedDatabase(*row)
            for row in results.rows
            # Filter out the SQLite internal "temp" database, refs #2557
            if row["seq"] > 0 and row["name"] != "temp"
        ]
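As a standalone illustration of what `attached_databases()` builds on, here is the same `PRAGMA database_list` filter using plain `sqlite3` (the names here are hypothetical, and this is a sketch, not Datasette's API):

```python
import sqlite3

# The main database always has seq 0; attached databases get seq >= 2,
# so filtering seq > 0 (and skipping SQLite's internal "temp" database)
# leaves only explicitly attached databases.
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("ATTACH DATABASE ':memory:' AS other")

rows = conn.execute("PRAGMA database_list;").fetchall()
attached = [
    row["name"]
    for row in rows
    if row["seq"] > 0 and row["name"] != "temp"
]
print(attached)  # ['other']
```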
    async def table_exists(self, table):
        results = await self.execute(
            "select 1 from sqlite_master where type='table' and name=?", params=(table,)
        )
        return bool(results.rows)

    async def view_exists(self, table):
        results = await self.execute(
            "select 1 from sqlite_master where type='view' and name=?", params=(table,)
        )
        return bool(results.rows)

    async def table_names(self):
        results = await self.execute(
            "select name from sqlite_master where type='table'"
        )
        return [r[0] for r in results.rows]
    async def table_columns(self, table):
        return await self.execute_fn(lambda conn: table_columns(conn, table))

    async def table_column_details(self, table):
        return await self.execute_fn(lambda conn: table_column_details(conn, table))

    async def primary_keys(self, table):
        return await self.execute_fn(lambda conn: detect_primary_keys(conn, table))

    async def fts_table(self, table):
        return await self.execute_fn(lambda conn: detect_fts(conn, table))

    async def label_column_for_table(self, table):
        explicit_label_column = (await self.ds.table_config(self.name, table)).get(
            "label_column"
        )
        if explicit_label_column:
            return explicit_label_column

        def column_details(conn):
            # Returns {column_name: (type, is_unique)}
            db = sqlite_utils.Database(conn)
            columns = db[table].columns_dict
            indexes = db[table].indexes
            details = {}
            for name in columns:
                is_unique = any(
                    index
                    for index in indexes
                    if index.columns == [name] and index.unique
                )
                details[name] = (columns[name], is_unique)
            return details

        column_details = await self.execute_fn(column_details)
        # Is there just one unique column that's text?
        unique_text_columns = [
            name
            for name, (type_, is_unique) in column_details.items()
            if is_unique and type_ is str
        ]
        if len(unique_text_columns) == 1:
            return unique_text_columns[0]
        column_names = list(column_details.keys())
        # Is there a name or title column?
        name_or_title = [c for c in column_names if c.lower() in ("name", "title")]
        if name_or_title:
            return name_or_title[0]
        # If a table has two columns, one of which is ID, then label_column is the other one
        if (
            column_names
            and len(column_names) == 2
            and ("id" in column_names or "pk" in column_names)
            and not set(column_names) == {"id", "pk"}
        ):
            return [c for c in column_names if c not in ("id", "pk")][0]
        # Couldn't find a label:
        return None

    async def foreign_keys_for_table(self, table):
        return await self.execute_fn(
            lambda conn: get_outbound_foreign_keys(conn, table)
        )
    async def hidden_table_names(self):
        hidden_tables = []
        # Add any tables marked as hidden in config
        db_config = self.ds.config.get("databases", {}).get(self.name, {})
        if "tables" in db_config:
            hidden_tables += [
                t for t in db_config["tables"] if db_config["tables"][t].get("hidden")
            ]
        if sqlite_version()[1] >= 37:
            hidden_tables += [
                x[0]
                for x in await self.execute(
                    """
                    with shadow_tables as (
                      select name
                      from pragma_table_list
                      where [type] = 'shadow'
                      order by name
                    ),
                    core_tables as (
                      select name
                      from sqlite_master
                      WHERE name in ('sqlite_stat1', 'sqlite_stat2', 'sqlite_stat3', 'sqlite_stat4')
                      OR substr(name, 1, 1) == '_'
                    ),
                    combined as (
                      select name from shadow_tables
                      union all
                      select name from core_tables
                    )
                    select name from combined order by 1
                    """
                )
            ]
        else:
            hidden_tables += [
                x[0]
                for x in await self.execute(
                    """
                    WITH base AS (
                      SELECT name
                      FROM sqlite_master
                      WHERE name IN ('sqlite_stat1', 'sqlite_stat2', 'sqlite_stat3', 'sqlite_stat4')
                      OR substr(name, 1, 1) == '_'
                    ),
                    fts_suffixes AS (
                      SELECT column1 AS suffix
                      FROM (VALUES ('_data'), ('_idx'), ('_docsize'), ('_content'), ('_config'))
                    ),
                    fts5_names AS (
                      SELECT name
                      FROM sqlite_master
                      WHERE sql LIKE '%VIRTUAL TABLE%USING FTS%'
                    ),
                    fts5_shadow_tables AS (
                      SELECT
                        printf('%s%s', fts5_names.name, fts_suffixes.suffix) AS name
                      FROM fts5_names
                      JOIN fts_suffixes
                    ),
                    fts3_suffixes AS (
                      SELECT column1 AS suffix
                      FROM (VALUES ('_content'), ('_segdir'), ('_segments'), ('_stat'), ('_docsize'))
                    ),
                    fts3_names AS (
                      SELECT name
                      FROM sqlite_master
                      WHERE sql LIKE '%VIRTUAL TABLE%USING FTS3%'
                      OR sql LIKE '%VIRTUAL TABLE%USING FTS4%'
                    ),
                    fts3_shadow_tables AS (
                      SELECT
                        printf('%s%s', fts3_names.name, fts3_suffixes.suffix) AS name
                      FROM fts3_names
                      JOIN fts3_suffixes
                    ),
                    final AS (
                      SELECT name FROM base
                      UNION ALL
                      SELECT name FROM fts5_shadow_tables
                      UNION ALL
                      SELECT name FROM fts3_shadow_tables
                    )
                    SELECT name FROM final ORDER BY 1
                    """
                )
            ]
        # Also hide any FTS tables that have a content= argument
        hidden_tables += [
            x[0]
            for x in await self.execute(
                """
                SELECT name
                FROM sqlite_master
                WHERE sql LIKE '%VIRTUAL TABLE%'
                AND sql LIKE '%USING FTS%'
                AND sql LIKE '%content=%'
                """
            )
        ]

        has_spatialite = await self.execute_fn(detect_spatialite)
        if has_spatialite:
            # Also hide Spatialite internal tables
            hidden_tables += [
@@ -163,64 +615,51 @@ class Database:
                "sqlite_sequence",
                "views_geometry_columns",
                "virts_geometry_columns",
                "data_licenses",
                "KNN",
                "KNN2",
            ] + [
                r[0]
                for r in (
                    await self.execute(
                        """
                        select name from sqlite_master
                        where name like "idx_%"
                        and type = "table"
                        """
                    )
                ).rows
            ]
        return hidden_tables
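As a standalone sketch of the "core tables" part of the queries above, this is the rule that treats `sqlite_stat*` tables and anything starting with an underscore as hidden, run with plain `sqlite3` (table names here are hypothetical):

```python
import sqlite3

# Tables named sqlite_stat1..4, or whose name begins with "_",
# match the base/core_tables CTE and are marked hidden.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facts (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE _internal_counts (n INTEGER)")

hidden = [
    row[0]
    for row in conn.execute(
        """
        select name
        from sqlite_master
        WHERE name in ('sqlite_stat1', 'sqlite_stat2', 'sqlite_stat3', 'sqlite_stat4')
        OR substr(name, 1, 1) == '_'
        """
    )
]
print(hidden)  # ['_internal_counts']
```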
    async def view_names(self):
        results = await self.execute("select name from sqlite_master where type='view'")
        return [r[0] for r in results.rows]
    async def get_all_foreign_keys(self):
        return await self.execute_fn(get_all_foreign_keys)

    async def get_table_definition(self, table, type_="table"):
        table_definition_rows = list(
            await self.execute(
                "select sql from sqlite_master where name = :n and type=:t",
                {"n": table, "t": type_},
            )
        )
        if not table_definition_rows:
            return None
        bits = [table_definition_rows[0][0] + ";"]
        # Add on any indexes
        index_rows = list(
            await self.execute(
                "select sql from sqlite_master where tbl_name = :n and type='index' and sql is not null",
                {"n": table},
            )
        )
        for index_row in index_rows:
            bits.append(index_row[0] + ";")
        return "\n".join(bits)

    async def get_view_definition(self, view):
        return await self.get_table_definition(view, "view")
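The same approach as `get_table_definition()` can be sketched with plain `sqlite3` (hypothetical table and index names): fetch the `CREATE TABLE` statement from `sqlite_master`, then append any named indexes, skipping auto-indexes whose `sql` is NULL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("CREATE INDEX idx_docs_title ON docs(title)")

# The table's own DDL, terminated with a semicolon
(table_sql,) = conn.execute(
    "select sql from sqlite_master where name = :n and type=:t",
    {"n": "docs", "t": "table"},
).fetchone()
bits = [table_sql + ";"]
# Any explicitly created indexes on that table (sql is null for auto-indexes)
for (index_sql,) in conn.execute(
    "select sql from sqlite_master where tbl_name = :n and type='index' and sql is not null",
    {"n": "docs"},
):
    bits.append(index_sql + ";")
definition = "\n".join(bits)
print(definition)
```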
@@ -232,10 +671,67 @@ class Database:
        if self.is_memory:
            tags.append("memory")
        if self.hash:
            tags.append(f"hash={self.hash}")
        if self.size is not None:
            tags.append(f"size={self.size}")
        tags_str = ""
        if tags:
            tags_str = f" ({', '.join(tags)})"
        return f"<Database: {self.name}{tags_str}>"
class WriteTask:
    __slots__ = ("fn", "task_id", "reply_queue", "isolated_connection", "transaction")

    def __init__(self, fn, task_id, reply_queue, isolated_connection, transaction):
        self.fn = fn
        self.task_id = task_id
        self.reply_queue = reply_queue
        self.isolated_connection = isolated_connection
        self.transaction = transaction


class QueryInterrupted(Exception):
    def __init__(self, e, sql, params):
        self.e = e
        self.sql = sql
        self.params = params

    def __str__(self):
        return "QueryInterrupted: {}".format(self.e)


class MultipleValues(Exception):
    pass


class Results:
    def __init__(self, rows, truncated, description):
        self.rows = rows
        self.truncated = truncated
        self.description = description

    @property
    def columns(self):
        return [d[0] for d in self.description]

    def first(self):
        if self.rows:
            return self.rows[0]
        else:
            return None

    def single_value(self):
        if self.rows and 1 == len(self.rows) and 1 == len(self.rows[0]):
            return self.rows[0][0]
        else:
            raise MultipleValues

    def dicts(self):
        return [dict(row) for row in self.rows]

    def __iter__(self):
        return iter(self.rows)

    def __len__(self):
        return len(self.rows)
@@ -0,0 +1,101 @@
from datasette import hookimpl
from datasette.permissions import Action
from datasette.resources import (
    DatabaseResource,
    TableResource,
    QueryResource,
)


@hookimpl
def register_actions():
    """Register the core Datasette actions."""
    return (
        # Global actions (no resource_class)
        Action(
            name="view-instance",
            abbr="vi",
            description="View Datasette instance",
        ),
        Action(
            name="permissions-debug",
            abbr="pd",
            description="Access permission debug tool",
        ),
        Action(
            name="debug-menu",
            abbr="dm",
            description="View debug menu items",
        ),
        # Database-level actions (parent-level)
        Action(
            name="view-database",
            abbr="vd",
            description="View database",
            resource_class=DatabaseResource,
        ),
        Action(
            name="view-database-download",
            abbr="vdd",
            description="Download database file",
            resource_class=DatabaseResource,
            also_requires="view-database",
        ),
        Action(
            name="execute-sql",
            abbr="es",
            description="Execute read-only SQL queries",
            resource_class=DatabaseResource,
            also_requires="view-database",
        ),
        Action(
            name="create-table",
            abbr="ct",
            description="Create tables",
            resource_class=DatabaseResource,
        ),
        # Table-level actions (child-level)
        Action(
            name="view-table",
            abbr="vt",
            description="View table",
            resource_class=TableResource,
        ),
        Action(
            name="insert-row",
            abbr="ir",
            description="Insert rows",
            resource_class=TableResource,
        ),
        Action(
            name="delete-row",
            abbr="dr",
            description="Delete rows",
            resource_class=TableResource,
        ),
        Action(
            name="update-row",
            abbr="ur",
            description="Update rows",
            resource_class=TableResource,
        ),
        Action(
            name="alter-table",
            abbr="at",
            description="Alter tables",
            resource_class=TableResource,
        ),
        Action(
            name="drop-table",
            abbr="dt",
            description="Drop tables",
            resource_class=TableResource,
        ),
        # Query-level actions (child-level)
        Action(
            name="view-query",
            abbr="vq",
            description="View named query results",
            resource_class=QueryResource,
        ),
    )
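The `(name, abbr)` pairs registered above can be collapsed into a simple lookup, which is how compact abbreviations like `vt` map back to full action names elsewhere (for example in `_r` restriction blocks). A minimal, hypothetical sketch:

```python
# Hypothetical subset of the registered actions: (name, abbr) pairs
actions = [
    ("view-instance", "vi"),
    ("view-table", "vt"),
    ("execute-sql", "es"),
]
abbr_to_name = {abbr: name for name, abbr in actions}
print(abbr_to_name["vt"])  # view-table
```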
@@ -0,0 +1,57 @@
from datasette import hookimpl
import datetime
import os
import time


def header(key, request):
    key = key.replace("_", "-").encode("utf-8")
    headers_dict = dict(request.scope["headers"])
    return headers_dict.get(key, b"").decode("utf-8")


def actor(key, request):
    if request.actor is None:
        raise KeyError
    return request.actor[key]


def cookie(key, request):
    return request.cookies[key]


def now(key, request):
    if key == "epoch":
        return int(time.time())
    elif key == "date_utc":
        return datetime.datetime.now(datetime.timezone.utc).date().isoformat()
    elif key == "datetime_utc":
        return (
            datetime.datetime.now(datetime.timezone.utc).strftime(r"%Y-%m-%dT%H:%M:%S")
            + "Z"
        )
    else:
        raise KeyError


def random(key, request):
    if key.startswith("chars_") and key.split("chars_")[-1].isdigit():
        num_chars = int(key.split("chars_")[-1])
        if num_chars % 2 == 1:
            urandom_len = (num_chars + 1) / 2
        else:
            urandom_len = num_chars / 2
        return os.urandom(int(urandom_len)).hex()[:num_chars]
    else:
        raise KeyError


@hookimpl
def register_magic_parameters():
    return [
        ("header", header),
        ("actor", actor),
        ("cookie", cookie),
        ("now", now),
        ("random", random),
    ]
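The `random` magic parameter above can be sketched standalone: `chars_N` yields N hex characters, and since each `urandom` byte produces two hex digits, an odd N needs one extra byte before slicing. This condensed version (not the plugin's exact function) uses integer round-up division for both cases:

```python
import os

def random_chars(key):
    # "chars_7" -> 7 random hex characters, "chars_8" -> 8, etc.
    if key.startswith("chars_") and key.split("chars_")[-1].isdigit():
        num_chars = int(key.split("chars_")[-1])
        urandom_len = (num_chars + 1) // 2  # round up to whole bytes
        return os.urandom(urandom_len).hex()[:num_chars]
    raise KeyError

print(len(random_chars("chars_7")))  # 7
```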
@@ -0,0 +1,41 @@
from datasette import hookimpl


@hookimpl
def menu_links(datasette, actor):
    async def inner():
        if not await datasette.allowed(action="debug-menu", actor=actor):
            return []
        return [
            {"href": datasette.urls.path("/-/databases"), "label": "Databases"},
            {
                "href": datasette.urls.path("/-/plugins"),
                "label": "Installed plugins",
            },
            {
                "href": datasette.urls.path("/-/versions"),
                "label": "Version info",
            },
            {
                "href": datasette.urls.path("/-/settings"),
                "label": "Settings",
            },
            {
                "href": datasette.urls.path("/-/permissions"),
                "label": "Debug permissions",
            },
            {
                "href": datasette.urls.path("/-/messages"),
                "label": "Debug messages",
            },
            {
                "href": datasette.urls.path("/-/allow-debug"),
                "label": "Debug allow rules",
            },
            {"href": datasette.urls.path("/-/threads"), "label": "Debug threads"},
            {"href": datasette.urls.path("/-/actor"), "label": "Debug actor"},
            {"href": datasette.urls.path("/-/patterns"), "label": "Pattern portfolio"},
        ]

    return inner
@@ -0,0 +1,59 @@
"""
Default permission implementations for Datasette.
This module provides the built-in permission checking logic through implementations
of the permission_resources_sql hook. The hooks are organized by their purpose:
1. Actor Restrictions - Enforces _r allowlists embedded in actor tokens
2. Root User - Grants full access when --root flag is used
3. Config Rules - Applies permissions from datasette.yaml
4. Default Settings - Enforces default_allow_sql and default view permissions
IMPORTANT: These hooks return PermissionSQL objects that are combined using SQL
UNION/INTERSECT operations. The order of evaluation is:
- restriction_sql fields are INTERSECTed (all must match)
- Regular sql fields are UNIONed and evaluated with cascading priority
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Optional
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
# Re-export all hooks and public utilities
from .restrictions import (
actor_restrictions_sql,
restrictions_allow_action,
ActorRestrictions,
)
from .root import root_user_permissions_sql
from .config import config_permissions_sql
from .defaults import (
default_allow_sql_check,
default_action_permissions_sql,
DEFAULT_ALLOW_ACTIONS,
)
from .tokens import actor_from_signed_api_token
@hookimpl
def skip_csrf(scope) -> Optional[bool]:
"""Skip CSRF check for JSON content-type requests."""
if scope["type"] == "http":
headers = scope.get("headers") or {}
if dict(headers).get(b"content-type") == b"application/json":
return True
return None
@hookimpl
def canned_queries(datasette: "Datasette", database: str, actor) -> dict:
"""Return canned queries defined in datasette.yaml configuration."""
queries = (
((datasette.config or {}).get("databases") or {}).get(database) or {}
).get("queries") or {}
return queries
@@ -0,0 +1,442 @@
"""
Config-based permission handling for Datasette.
Applies permission rules from datasette.yaml configuration.
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Any, List, Optional, Set, Tuple
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
from datasette.permissions import PermissionSQL
from datasette.utils import actor_matches_allow
from .helpers import PermissionRowCollector, get_action_name_variants
class ConfigPermissionProcessor:
"""
Processes permission rules from datasette.yaml configuration.
Configuration structure:
permissions: # Root-level permissions block
view-instance:
id: admin
databases:
mydb:
permissions: # Database-level permissions
view-database:
id: admin
allow: # Database-level allow block (for view-*)
id: viewer
allow_sql: # execute-sql allow block
id: analyst
tables:
users:
permissions: # Table-level permissions
view-table:
id: admin
allow: # Table-level allow block
id: viewer
queries:
my_query:
permissions: # Query-level permissions
view-query:
id: admin
allow: # Query-level allow block
id: viewer
"""
def __init__(
self,
datasette: "Datasette",
actor: Optional[dict],
action: str,
):
self.datasette = datasette
self.actor = actor
self.action = action
self.config = datasette.config or {}
self.collector = PermissionRowCollector(prefix="cfg")
# Pre-compute action variants
self.action_checks = get_action_name_variants(datasette, action)
self.action_obj = datasette.actions.get(action)
# Parse restrictions if present
self.has_restrictions = actor and "_r" in actor if actor else False
self.restrictions = actor.get("_r", {}) if actor else {}
# Pre-compute restriction info for efficiency
self.restricted_databases: Set[str] = set()
self.restricted_tables: Set[Tuple[str, str]] = set()
if self.has_restrictions:
self.restricted_databases = {
db_name
for db_name, db_actions in (self.restrictions.get("d") or {}).items()
if self.action_checks.intersection(db_actions)
}
self.restricted_tables = {
(db_name, table_name)
for db_name, tables in (self.restrictions.get("r") or {}).items()
for table_name, table_actions in tables.items()
if self.action_checks.intersection(table_actions)
}
# Tables implicitly reference their parent databases
self.restricted_databases.update(db for db, _ in self.restricted_tables)
def evaluate_allow_block(self, allow_block: Any) -> Optional[bool]:
"""Evaluate an allow block against the current actor."""
if allow_block is None:
return None
return actor_matches_allow(self.actor, allow_block)
def is_in_restriction_allowlist(
self,
parent: Optional[str],
child: Optional[str],
) -> bool:
"""Check if resource is allowed by actor restrictions."""
if not self.has_restrictions:
return True # No restrictions, all resources allowed
# Check global allowlist
if self.action_checks.intersection(self.restrictions.get("a", [])):
return True
# Check database-level allowlist
if parent and self.action_checks.intersection(
self.restrictions.get("d", {}).get(parent, [])
):
return True
# Check table-level allowlist
if parent:
table_restrictions = (self.restrictions.get("r", {}) or {}).get(parent, {})
if child:
table_actions = table_restrictions.get(child, [])
if self.action_checks.intersection(table_actions):
return True
else:
# Parent query should proceed if any child in this database is allowlisted
for table_actions in table_restrictions.values():
if self.action_checks.intersection(table_actions):
return True
# Parent/child both None: include if any restrictions exist for this action
if parent is None and child is None:
if self.action_checks.intersection(self.restrictions.get("a", [])):
return True
if self.restricted_databases:
return True
if self.restricted_tables:
return True
return False
def add_permissions_rule(
self,
parent: Optional[str],
child: Optional[str],
permissions_block: Optional[dict],
scope_desc: str,
) -> None:
"""Add a rule from a permissions:{action} block."""
if permissions_block is None:
return
action_allow_block = permissions_block.get(self.action)
result = self.evaluate_allow_block(action_allow_block)
self.collector.add(
parent=parent,
child=child,
allow=result,
reason=f"config {'allow' if result else 'deny'} {scope_desc}",
if_not_none=True,
)
def add_allow_block_rule(
self,
parent: Optional[str],
child: Optional[str],
allow_block: Any,
scope_desc: str,
) -> None:
"""
Add rules from an allow:{} block.
For allow blocks, if the block exists but doesn't match the actor,
this is treated as a deny. We also handle the restriction-gate logic.
"""
if allow_block is None:
return
# Skip if resource is not in restriction allowlist
if not self.is_in_restriction_allowlist(parent, child):
return
result = self.evaluate_allow_block(allow_block)
bool_result = bool(result)
self.collector.add(
parent,
child,
bool_result,
f"config {'allow' if result else 'deny'} {scope_desc}",
)
# Handle restriction-gate: add explicit denies for restricted resources
self._add_restriction_gate_denies(parent, child, bool_result, scope_desc)
def _add_restriction_gate_denies(
self,
parent: Optional[str],
child: Optional[str],
is_allowed: bool,
scope_desc: str,
) -> None:
"""
When a config rule denies at a higher level, add explicit denies
for restricted resources to prevent child-level allows from
incorrectly granting access.
"""
if is_allowed or child is not None or not self.has_restrictions:
return
if not self.action_obj:
return
reason = f"config deny {scope_desc} (restriction gate)"
if parent is None:
# Root-level deny: add denies for all restricted resources
if self.action_obj.takes_parent:
for db_name in self.restricted_databases:
self.collector.add(db_name, None, False, reason)
if self.action_obj.takes_child:
for db_name, table_name in self.restricted_tables:
self.collector.add(db_name, table_name, False, reason)
else:
# Database-level deny: add denies for tables in that database
if self.action_obj.takes_child:
for db_name, table_name in self.restricted_tables:
if db_name == parent:
self.collector.add(db_name, table_name, False, reason)
def process(self) -> Optional[PermissionSQL]:
"""Process all config rules and return combined PermissionSQL."""
self._process_root_permissions()
self._process_databases()
self._process_root_allow_blocks()
return self.collector.to_permission_sql()
def _process_root_permissions(self) -> None:
"""Process root-level permissions block."""
root_perms = self.config.get("permissions") or {}
self.add_permissions_rule(
None,
None,
root_perms,
f"permissions for {self.action}",
)
def _process_databases(self) -> None:
"""Process database-level and nested configurations."""
databases = self.config.get("databases") or {}
for db_name, db_config in databases.items():
self._process_database(db_name, db_config or {})
def _process_database(self, db_name: str, db_config: dict) -> None:
"""Process a single database's configuration."""
# Database-level permissions block
db_perms = db_config.get("permissions") or {}
self.add_permissions_rule(
db_name,
None,
db_perms,
f"permissions for {self.action} on {db_name}",
)
# Process tables
for table_name, table_config in (db_config.get("tables") or {}).items():
self._process_table(db_name, table_name, table_config or {})
# Process queries
for query_name, query_config in (db_config.get("queries") or {}).items():
self._process_query(db_name, query_name, query_config)
# Database-level allow blocks
self._process_database_allow_blocks(db_name, db_config)
def _process_table(
self,
db_name: str,
table_name: str,
table_config: dict,
) -> None:
"""Process a single table's configuration."""
# Table-level permissions block
table_perms = table_config.get("permissions") or {}
self.add_permissions_rule(
db_name,
table_name,
table_perms,
f"permissions for {self.action} on {db_name}/{table_name}",
)
# Table-level allow block (for view-table)
if self.action == "view-table":
self.add_allow_block_rule(
db_name,
table_name,
table_config.get("allow"),
f"allow for {self.action} on {db_name}/{table_name}",
)
def _process_query(
self,
db_name: str,
query_name: str,
query_config: Any,
) -> None:
"""Process a single query's configuration."""
# Query config can be a string (just SQL) or dict
if not isinstance(query_config, dict):
return
# Query-level permissions block
query_perms = query_config.get("permissions") or {}
self.add_permissions_rule(
db_name,
query_name,
query_perms,
f"permissions for {self.action} on {db_name}/{query_name}",
)
# Query-level allow block (for view-query)
if self.action == "view-query":
self.add_allow_block_rule(
db_name,
query_name,
query_config.get("allow"),
f"allow for {self.action} on {db_name}/{query_name}",
)
def _process_database_allow_blocks(
self,
db_name: str,
db_config: dict,
) -> None:
"""Process database-level allow/allow_sql blocks."""
# view-database allow block
if self.action == "view-database":
self.add_allow_block_rule(
db_name,
None,
db_config.get("allow"),
f"allow for {self.action} on {db_name}",
)
# execute-sql allow_sql block
if self.action == "execute-sql":
self.add_allow_block_rule(
db_name,
None,
db_config.get("allow_sql"),
f"allow_sql for {db_name}",
)
# view-table uses database-level allow for inheritance
if self.action == "view-table":
self.add_allow_block_rule(
db_name,
None,
db_config.get("allow"),
f"allow for {self.action} on {db_name}",
)
# view-query uses database-level allow for inheritance
if self.action == "view-query":
self.add_allow_block_rule(
db_name,
None,
db_config.get("allow"),
f"allow for {self.action} on {db_name}",
)
def _process_root_allow_blocks(self) -> None:
"""Process root-level allow/allow_sql blocks."""
root_allow = self.config.get("allow")
if self.action == "view-instance":
self.add_allow_block_rule(
None,
None,
root_allow,
"allow for view-instance",
)
if self.action == "view-database":
self.add_allow_block_rule(
None,
None,
root_allow,
"allow for view-database",
)
if self.action == "view-table":
self.add_allow_block_rule(
None,
None,
root_allow,
"allow for view-table",
)
if self.action == "view-query":
self.add_allow_block_rule(
None,
None,
root_allow,
"allow for view-query",
)
if self.action == "execute-sql":
self.add_allow_block_rule(
None,
None,
self.config.get("allow_sql"),
"allow_sql",
)
@hookimpl(specname="permission_resources_sql")
async def config_permissions_sql(
datasette: "Datasette",
actor: Optional[dict],
action: str,
) -> Optional[List[PermissionSQL]]:
"""
Apply permission rules from datasette.yaml configuration.
This processes:
- permissions: blocks at root, database, table, and query levels
- allow: blocks for view-* actions
- allow_sql: blocks for execute-sql action
"""
processor = ConfigPermissionProcessor(datasette, actor, action)
result = processor.process()
if result is None:
return []
return [result]
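The allowlist checks above reduce to set intersections between an action's name variants and the actor's `_r` lists. A minimal sketch with hypothetical data (not the full processor): the action matches if either its full name or its abbreviation appears in the relevant list.

```python
# Action name variants: full name plus its abbreviation
action_checks = {"view-table", "vt"}
# Hypothetical _r restrictions block from an actor token
restrictions = {
    "a": [],                                    # no global allowlist entries
    "d": {"mydb": ["vt"]},                      # database-level list uses the abbreviation
    "r": {"mydb": {"users": ["view-table"]}},   # table-level list uses the full name
}

db_allowed = bool(action_checks.intersection(restrictions["d"].get("mydb", [])))
table_allowed = bool(
    action_checks.intersection(restrictions["r"].get("mydb", {}).get("users", []))
)
print(db_allowed, table_allowed)  # True True
```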
@@ -0,0 +1,70 @@
"""
Default permission settings for Datasette.
Provides default allow rules for standard view/execute actions.
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Optional
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
from datasette.permissions import PermissionSQL
# Actions that are allowed by default (unless --default-deny is used)
DEFAULT_ALLOW_ACTIONS = frozenset(
{
"view-instance",
"view-database",
"view-database-download",
"view-table",
"view-query",
"execute-sql",
}
)
@hookimpl(specname="permission_resources_sql")
async def default_allow_sql_check(
datasette: "Datasette",
actor: Optional[dict],
action: str,
) -> Optional[PermissionSQL]:
"""
Enforce the default_allow_sql setting.
When default_allow_sql is false (the default), execute-sql is denied
unless explicitly allowed by config or other rules.
"""
if action == "execute-sql":
if not datasette.setting("default_allow_sql"):
return PermissionSQL.deny(reason="default_allow_sql is false")
return None
@hookimpl(specname="permission_resources_sql")
async def default_action_permissions_sql(
datasette: "Datasette",
actor: Optional[dict],
action: str,
) -> Optional[PermissionSQL]:
"""
Provide default allow rules for standard view/execute actions.
These defaults are skipped when datasette is started with --default-deny.
The restriction_sql mechanism (from actor_restrictions_sql) will still
filter these results if the actor has restrictions.
"""
if datasette.default_deny:
return None
if action in DEFAULT_ALLOW_ACTIONS:
reason = f"default allow for {action}".replace("'", "''")
return PermissionSQL.allow(reason=reason)
return None
@@ -0,0 +1,85 @@
"""
Shared helper utilities for default permission implementations.
"""
from __future__ import annotations
from dataclasses import dataclass
from typing import TYPE_CHECKING, List, Optional, Set
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette.permissions import PermissionSQL
def get_action_name_variants(datasette: "Datasette", action: str) -> Set[str]:
"""
Get all name variants for an action (full name and abbreviation).
Example:
get_action_name_variants(ds, "view-table") -> {"view-table", "vt"}
"""
variants = {action}
action_obj = datasette.actions.get(action)
if action_obj and action_obj.abbr:
variants.add(action_obj.abbr)
return variants
def action_in_list(datasette: "Datasette", action: str, action_list: list) -> bool:
"""Check if an action (or its abbreviation) is in a list."""
return bool(get_action_name_variants(datasette, action).intersection(action_list))
@dataclass
class PermissionRow:
"""A single permission rule row."""
parent: Optional[str]
child: Optional[str]
allow: bool
reason: str
class PermissionRowCollector:
"""Collects permission rows and converts them to PermissionSQL."""
def __init__(self, prefix: str = "row"):
self.rows: List[PermissionRow] = []
self.prefix = prefix
def add(
self,
parent: Optional[str],
child: Optional[str],
allow: Optional[bool],
reason: str,
if_not_none: bool = False,
) -> None:
"""Add a permission row. If if_not_none=True, only add if allow is not None."""
if if_not_none and allow is None:
return
self.rows.append(PermissionRow(parent, child, allow, reason))
def to_permission_sql(self) -> Optional[PermissionSQL]:
"""Convert collected rows to a PermissionSQL object."""
if not self.rows:
return None
parts = []
params = {}
for idx, row in enumerate(self.rows):
key = f"{self.prefix}_{idx}"
parts.append(
f"SELECT :{key}_parent AS parent, :{key}_child AS child, "
f":{key}_allow AS allow, :{key}_reason AS reason"
)
params[f"{key}_parent"] = row.parent
params[f"{key}_child"] = row.child
params[f"{key}_allow"] = 1 if row.allow else 0
params[f"{key}_reason"] = row.reason
sql = "\nUNION ALL\n".join(parts)
return PermissionSQL(sql=sql, params=params)
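The collector's `to_permission_sql()` builds one `SELECT … UNION ALL …` statement with fully parameterized values. A standalone sketch of that construction (plain tuples instead of `PermissionRow`, and a tuple return instead of `PermissionSQL`), executed against SQLite to show the shape of the result set:

```python
import sqlite3


def rows_to_sql(rows, prefix="row"):
    # rows are (parent, child, allow, reason) tuples
    parts, params = [], {}
    for idx, (parent, child, allow, reason) in enumerate(rows):
        key = f"{prefix}_{idx}"
        parts.append(
            f"SELECT :{key}_parent AS parent, :{key}_child AS child, "
            f":{key}_allow AS allow, :{key}_reason AS reason"
        )
        params[f"{key}_parent"] = parent
        params[f"{key}_child"] = child
        params[f"{key}_allow"] = 1 if allow else 0
        params[f"{key}_reason"] = reason
    return "\nUNION ALL\n".join(parts), params


sql, params = rows_to_sql(
    [("mydb", None, True, "db-level allow"), ("mydb", "users", False, "table deny")]
)
rows = sqlite3.connect(":memory:").execute(sql, params).fetchall()
print(rows)
```

Every value travels as a named parameter, so no quoting or escaping of database or table names is needed in the generated SQL.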


@ -0,0 +1,195 @@
"""
Actor restriction handling for Datasette permissions.
This module handles the _r (restrictions) key in actor dictionaries, which
contains allowlists of resources the actor can access.
"""
from __future__ import annotations
from dataclasses import dataclass
from typing import TYPE_CHECKING, List, Optional, Set, Tuple
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
from datasette.permissions import PermissionSQL
from .helpers import action_in_list, get_action_name_variants
@dataclass
class ActorRestrictions:
"""Parsed actor restrictions from the _r key."""
global_actions: List[str] # _r.a - globally allowed actions
database_actions: dict # _r.d - {db_name: [actions]}
table_actions: dict # _r.r - {db_name: {table: [actions]}}
@classmethod
def from_actor(cls, actor: Optional[dict]) -> Optional["ActorRestrictions"]:
"""Parse restrictions from actor dict. Returns None if no restrictions."""
if not actor:
return None
assert isinstance(actor, dict), "actor must be a dictionary"
restrictions = actor.get("_r")
if restrictions is None:
return None
return cls(
global_actions=restrictions.get("a", []),
database_actions=restrictions.get("d", {}),
table_actions=restrictions.get("r", {}),
)
def is_action_globally_allowed(self, datasette: "Datasette", action: str) -> bool:
"""Check if action is in the global allowlist."""
return action_in_list(datasette, action, self.global_actions)
def get_allowed_databases(self, datasette: "Datasette", action: str) -> Set[str]:
"""Get database names where this action is allowed."""
allowed = set()
for db_name, db_actions in self.database_actions.items():
if action_in_list(datasette, action, db_actions):
allowed.add(db_name)
return allowed
def get_allowed_tables(
self, datasette: "Datasette", action: str
) -> Set[Tuple[str, str]]:
"""Get (database, table) pairs where this action is allowed."""
allowed = set()
for db_name, tables in self.table_actions.items():
for table_name, table_actions in tables.items():
if action_in_list(datasette, action, table_actions):
allowed.add((db_name, table_name))
return allowed
@hookimpl(specname="permission_resources_sql")
async def actor_restrictions_sql(
datasette: "Datasette",
actor: Optional[dict],
action: str,
) -> Optional[List[PermissionSQL]]:
"""
Handle actor restriction-based permission rules.
When an actor has an "_r" key, it contains an allowlist of resources they
can access. This function returns restriction_sql that filters the final
results to only include resources in that allowlist.
The _r structure:
{
"a": ["vi", "pd"], # Global actions allowed
"d": {"mydb": ["vt", "es"]}, # Database-level actions
"r": {"mydb": {"users": ["vt"]}} # Table-level actions
}
"""
if not actor:
return None
restrictions = ActorRestrictions.from_actor(actor)
if restrictions is None:
# No restrictions - all resources allowed
return []
# If globally allowed, no filtering needed
if restrictions.is_action_globally_allowed(datasette, action):
return []
# Build restriction SQL
allowed_dbs = restrictions.get_allowed_databases(datasette, action)
allowed_tables = restrictions.get_allowed_tables(datasette, action)
# If nothing is allowed for this action, return empty-set restriction
if not allowed_dbs and not allowed_tables:
return [
PermissionSQL(
params={"deny": f"actor restrictions: {action} not in allowlist"},
restriction_sql="SELECT NULL AS parent, NULL AS child WHERE 0",
)
]
# Build UNION of allowed resources
selects = []
params = {}
counter = 0
# Database-level entries (parent, NULL) - allows all children
for db_name in allowed_dbs:
key = f"restr_{counter}"
counter += 1
selects.append(f"SELECT :{key}_parent AS parent, NULL AS child")
params[f"{key}_parent"] = db_name
# Table-level entries (parent, child)
for db_name, table_name in allowed_tables:
key = f"restr_{counter}"
counter += 1
selects.append(f"SELECT :{key}_parent AS parent, :{key}_child AS child")
params[f"{key}_parent"] = db_name
params[f"{key}_child"] = table_name
restriction_sql = "\nUNION ALL\n".join(selects)
return [PermissionSQL(params=params, restriction_sql=restriction_sql)]
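The `restriction_sql` returned above is a (parent, child) allowlist, where a NULL child means "every child of that parent". One way such an allowlist can filter a resource listing is a correlated `EXISTS` check; this is an illustrative query shape, not necessarily the exact SQL Datasette runs:

```python
import sqlite3

# Allowlist as generated above: all of mydb, plus otherdb/users only
restriction_sql = (
    "SELECT :r0_parent AS parent, NULL AS child "
    "UNION ALL "
    "SELECT :r1_parent AS parent, :r1_child AS child"
)
params = {"r0_parent": "mydb", "r1_parent": "otherdb", "r1_child": "users"}

con = sqlite3.connect(":memory:")
con.execute("create table resources(parent text, child text)")
con.executemany(
    "insert into resources values (?, ?)",
    [("mydb", "t1"), ("mydb", "t2"), ("otherdb", "users"), ("otherdb", "secret")],
)
allowed = con.execute(
    f"""
    select r.parent, r.child from resources r
    where exists (
        select 1 from ({restriction_sql}) x
        where x.parent = r.parent and (x.child is null or x.child = r.child)
    )
    """,
    params,
).fetchall()
print(allowed)
```

Both `mydb` tables pass via the database-level row, `otherdb/users` passes via the table-level row, and `otherdb/secret` is filtered out.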
def restrictions_allow_action(
datasette: "Datasette",
restrictions: dict,
action: str,
resource: Optional[str | Tuple[str, str]],
) -> bool:
"""
Check if restrictions allow the requested action on the requested resource.
This is a synchronous utility function for use by other code that needs
to quickly check restriction allowlists.
Args:
datasette: The Datasette instance
restrictions: The _r dict from an actor
action: The action name to check
resource: None for global, str for database, (db, table) tuple for table
Returns:
True if allowed, False if denied
"""
# Does this action have an abbreviation?
to_check = get_action_name_variants(datasette, action)
# Check global level (any resource)
all_allowed = restrictions.get("a")
if all_allowed is not None:
assert isinstance(all_allowed, list)
if to_check.intersection(all_allowed):
return True
# Check database level
if resource:
if isinstance(resource, str):
database_name = resource
else:
database_name = resource[0]
database_allowed = restrictions.get("d", {}).get(database_name)
if database_allowed is not None:
assert isinstance(database_allowed, list)
if to_check.intersection(database_allowed):
return True
# Check table/resource level
if resource is not None and not isinstance(resource, str) and len(resource) == 2:
database, table = resource
table_allowed = restrictions.get("r", {}).get(database, {}).get(table)
if table_allowed is not None:
assert isinstance(table_allowed, list)
if to_check.intersection(table_allowed):
return True
# This action is not explicitly allowed, so reject it
return False
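The three-level fallthrough above (global `"a"`, database `"d"`, table `"r"`) can be sketched as a standalone function; the `abbreviations` dict here stands in for the `datasette.actions` abbreviation lookup (e.g. `"view-table"` → `"vt"`):

```python
def restrictions_allow(restrictions, action, resource, abbreviations=None):
    # Standalone sketch of restrictions_allow_action()
    to_check = {action}
    if abbreviations and action in abbreviations:
        to_check.add(abbreviations[action])
    # Global level
    if to_check & set(restrictions.get("a") or []):
        return True
    # Database level
    if resource:
        database = resource if isinstance(resource, str) else resource[0]
        if to_check & set((restrictions.get("d") or {}).get(database) or []):
            return True
    # Table level
    if resource is not None and not isinstance(resource, str) and len(resource) == 2:
        database, table = resource
        tables = (restrictions.get("r") or {}).get(database) or {}
        if to_check & set(tables.get(table) or []):
            return True
    return False


abbrs = {"view-instance": "vi", "view-table": "vt", "execute-sql": "es"}
r = {"a": ["vi"], "d": {"mydb": ["vt"]}, "r": {"mydb": {"users": ["es"]}}}
print(restrictions_allow(r, "view-instance", None, abbrs))             # global
print(restrictions_allow(r, "view-table", ("mydb", "users"), abbrs))   # database level
print(restrictions_allow(r, "execute-sql", ("mydb", "users"), abbrs))  # table level
print(restrictions_allow(r, "execute-sql", "mydb", abbrs))             # not granted
```

Note that a database-level grant covers every table in that database, which is why the `("mydb", "users")` check for `view-table` succeeds without a table-level entry.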


@ -0,0 +1,29 @@
"""
Root user permission handling for Datasette.
Grants full permissions to the root user when --root flag is used.
"""
from __future__ import annotations
from typing import TYPE_CHECKING, Optional
if TYPE_CHECKING:
from datasette.app import Datasette
from datasette import hookimpl
from datasette.permissions import PermissionSQL
@hookimpl(specname="permission_resources_sql")
async def root_user_permissions_sql(
datasette: "Datasette",
actor: Optional[dict],
) -> Optional[PermissionSQL]:
"""
Grant root user full permissions when --root flag is used.
"""
if not datasette.root_enabled:
return None
if actor is not None and actor.get("id") == "root":
return PermissionSQL.allow(reason="root user")


@ -0,0 +1,95 @@
"""
Token authentication for Datasette.
Handles signed API tokens (dstok_ prefix).
"""
from __future__ import annotations
import time
from typing import TYPE_CHECKING, Optional
if TYPE_CHECKING:
from datasette.app import Datasette
import itsdangerous
from datasette import hookimpl
@hookimpl(specname="actor_from_request")
def actor_from_signed_api_token(datasette: "Datasette", request) -> Optional[dict]:
"""
Authenticate requests using signed API tokens (dstok_ prefix).
Token structure (signed JSON):
{
"a": "actor_id", # Actor ID
"t": 1234567890, # Timestamp (Unix epoch)
"d": 3600, # Optional: Duration in seconds
"_r": {...} # Optional: Restrictions
}
"""
prefix = "dstok_"
# Check if tokens are enabled
if not datasette.setting("allow_signed_tokens"):
return None
max_signed_tokens_ttl = datasette.setting("max_signed_tokens_ttl")
# Get authorization header
authorization = request.headers.get("authorization")
if not authorization:
return None
if not authorization.startswith("Bearer "):
return None
token = authorization[len("Bearer ") :]
if not token.startswith(prefix):
return None
# Remove prefix and verify signature
token = token[len(prefix) :]
try:
decoded = datasette.unsign(token, namespace="token")
except itsdangerous.BadSignature:
return None
# Validate timestamp
if "t" not in decoded:
return None
created = decoded["t"]
if not isinstance(created, int):
return None
# Handle duration/expiry
duration = decoded.get("d")
if duration is not None and not isinstance(duration, int):
return None
# Apply max TTL if configured
if (duration is None and max_signed_tokens_ttl) or (
duration is not None
and max_signed_tokens_ttl
and duration > max_signed_tokens_ttl
):
duration = max_signed_tokens_ttl
# Check expiry
if duration:
if time.time() - created > duration:
return None
# Build actor dict
actor = {"id": decoded["a"], "token": "dstok"}
# Copy restrictions if present
if "_r" in decoded:
actor["_r"] = decoded["_r"]
# Add expiry timestamp if applicable
if duration:
actor["token_expires"] = created + duration
return actor
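The TTL handling above has two parts: `max_signed_tokens_ttl` both supplies a duration for tokens that lack one and clamps durations that exceed it, and a token with an effective duration is rejected once `time.time() - created` passes it. A pure-function sketch of those two rules:

```python
import time


def effective_duration(duration, max_ttl):
    # Mirrors the max_signed_tokens_ttl clamping: a configured cap
    # supplies a missing duration and shortens an over-long one.
    if (duration is None and max_ttl) or (
        duration is not None and max_ttl and duration > max_ttl
    ):
        return max_ttl
    return duration


def is_expired(created, duration, now=None):
    # No duration means the token never expires
    now = time.time() if now is None else now
    return bool(duration) and now - created > duration


now = 1_700_000_000
print(effective_duration(None, 3600), effective_duration(7200, 3600))
print(is_expired(now - 7200, 3600, now), is_expired(now - 100, 3600, now))
```

So with `max_signed_tokens_ttl: 3600`, even a token minted with `"d": 7200` stops working after an hour.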

datasette/events.py Normal file (235 lines)

@ -0,0 +1,235 @@
from abc import ABC, abstractproperty
from dataclasses import asdict, dataclass, field
from datasette.hookspecs import hookimpl
from datetime import datetime, timezone
@dataclass
class Event(ABC):
@abstractproperty
def name(self):
pass
created: datetime = field(
init=False, default_factory=lambda: datetime.now(timezone.utc)
)
actor: dict | None
def properties(self):
properties = asdict(self)
properties.pop("actor", None)
properties.pop("created", None)
return properties
@dataclass
class LoginEvent(Event):
"""
Event name: ``login``
A user (represented by ``event.actor``) has logged in.
"""
name = "login"
@dataclass
class LogoutEvent(Event):
"""
Event name: ``logout``
A user (represented by ``event.actor``) has logged out.
"""
name = "logout"
@dataclass
class CreateTokenEvent(Event):
"""
Event name: ``create-token``
A user created an API token.
:ivar expires_after: Number of seconds after which this token will expire.
:type expires_after: int or None
:ivar restrict_all: Restricted permissions for this token.
:type restrict_all: list
:ivar restrict_database: Restricted database permissions for this token.
:type restrict_database: dict
:ivar restrict_resource: Restricted resource permissions for this token.
:type restrict_resource: dict
"""
name = "create-token"
expires_after: int | None
restrict_all: list
restrict_database: dict
restrict_resource: dict
@dataclass
class CreateTableEvent(Event):
"""
Event name: ``create-table``
A new table has been created in the database.
:ivar database: The name of the database where the table was created.
:type database: str
:ivar table: The name of the table that was created
:type table: str
:ivar schema: The SQL schema definition for the new table.
:type schema: str
"""
name = "create-table"
database: str
table: str
schema: str
@dataclass
class DropTableEvent(Event):
"""
Event name: ``drop-table``
A table has been dropped from the database.
:ivar database: The name of the database where the table was dropped.
:type database: str
:ivar table: The name of the table that was dropped
:type table: str
"""
name = "drop-table"
database: str
table: str
@dataclass
class AlterTableEvent(Event):
"""
Event name: ``alter-table``
A table has been altered.
:ivar database: The name of the database where the table was altered
:type database: str
:ivar table: The name of the table that was altered
:type table: str
:ivar before_schema: The table's SQL schema before the alteration
:type before_schema: str
:ivar after_schema: The table's SQL schema after the alteration
:type after_schema: str
"""
name = "alter-table"
database: str
table: str
before_schema: str
after_schema: str
@dataclass
class InsertRowsEvent(Event):
"""
Event name: ``insert-rows``
Rows were inserted into a table.
:ivar database: The name of the database where the rows were inserted.
:type database: str
:ivar table: The name of the table where the rows were inserted.
:type table: str
:ivar num_rows: The number of rows that were requested to be inserted.
:type num_rows: int
:ivar ignore: Was ignore set?
:type ignore: bool
:ivar replace: Was replace set?
:type replace: bool
"""
name = "insert-rows"
database: str
table: str
num_rows: int
ignore: bool
replace: bool
@dataclass
class UpsertRowsEvent(Event):
"""
Event name: ``upsert-rows``
Rows were upserted into a table.
:ivar database: The name of the database where the rows were inserted.
:type database: str
:ivar table: The name of the table where the rows were inserted.
:type table: str
:ivar num_rows: The number of rows that were requested to be inserted.
:type num_rows: int
"""
name = "upsert-rows"
database: str
table: str
num_rows: int
@dataclass
class UpdateRowEvent(Event):
"""
Event name: ``update-row``
A row was updated in a table.
:ivar database: The name of the database where the row was updated.
:type database: str
:ivar table: The name of the table where the row was updated.
:type table: str
:ivar pks: The primary key values of the updated row.
"""
name = "update-row"
database: str
table: str
pks: list
@dataclass
class DeleteRowEvent(Event):
"""
Event name: ``delete-row``
A row was deleted from a table.
:ivar database: The name of the database where the row was deleted.
:type database: str
:ivar table: The name of the table where the row was deleted.
:type table: str
:ivar pks: The primary key values of the deleted row.
"""
name = "delete-row"
database: str
table: str
pks: list
@hookimpl
def register_events():
return [
LoginEvent,
LogoutEvent,
CreateTableEvent,
CreateTokenEvent,
AlterTableEvent,
DropTableEvent,
InsertRowsEvent,
UpsertRowsEvent,
UpdateRowEvent,
DeleteRowEvent,
]
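The `Event` base class pattern above relies on two details: `created` uses `init=False` so it never appears in the constructor, and `properties()` strips `actor` and `created` from the `asdict()` output so only event-specific fields remain. A self-contained replica (using `Optional[dict]` in place of `dict | None` so it also runs on Pythons older than 3.10):

```python
from abc import ABC, abstractproperty
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class Event(ABC):
    @abstractproperty
    def name(self):
        pass

    # init=False keeps this out of __init__, so the later
    # no-default field `actor` is still legal
    created: datetime = field(
        init=False, default_factory=lambda: datetime.now(timezone.utc)
    )
    actor: Optional[dict]

    def properties(self):
        properties = asdict(self)
        properties.pop("actor", None)
        properties.pop("created", None)
        return properties


@dataclass
class DeleteRowEvent(Event):
    name = "delete-row"  # plain class attribute satisfies the abstract property
    database: str
    table: str
    pks: list


ev = DeleteRowEvent(actor={"id": "root"}, database="demo", table="users", pks=[5])
print(ev.name, ev.properties())
```

`ev.properties()` here yields just `database`, `table` and `pks`, which is what an analytics plugin subscribed via `track_event` would see alongside the actor and timestamp.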


@ -1,20 +1,18 @@
import json
import urllib

from datasette import hookimpl
from datasette.database import QueryInterrupted
from datasette.utils import (
    escape_sqlite,
    path_with_added_args,
    path_with_removed_args,
    detect_json1,
    sqlite3,
)


def load_facet_configs(request, table_config):
    # Given a request and the configuration for a table, return
    # a dictionary of selected facets, their lists of configs and for each
    # config whether it came from the request or the metadata.
    #
@ -22,21 +20,21 @@ def load_facet_configs(request, table_metadata):
    #   {"source": "metadata", "config": config1},
    #   {"source": "request", "config": config2}]}
    facet_configs = {}
    table_config = table_config or {}
    table_facet_configs = table_config.get("facets", [])
    for facet_config in table_facet_configs:
        if isinstance(facet_config, str):
            type = "column"
            facet_config = {"simple": facet_config}
        else:
            assert (
                len(facet_config.values()) == 1
            ), "Metadata config dicts should be {type: config}"
            type, facet_config = list(facet_config.items())[0]
            if isinstance(facet_config, str):
                facet_config = {"simple": facet_config}
        facet_configs.setdefault(type, []).append(
            {"source": "metadata", "config": facet_config}
        )
    qs_pairs = urllib.parse.parse_qs(request.query_string, keep_blank_values=True)
    for key, values in qs_pairs.items():
@ -47,20 +45,19 @@ def load_facet_configs(request, table_metadata):
        elif key.startswith("_facet_"):
            type = key[len("_facet_") :]
            for value in values:
                # The value is the facet_config - either JSON or not
                facet_config = (
                    json.loads(value) if value.startswith("{") else {"simple": value}
                )
                facet_configs.setdefault(type, []).append(
                    {"source": "request", "config": facet_config}
                )
    return facet_configs


@hookimpl
def register_facet_classes():
    classes = [ColumnFacet, DateFacet]
    if detect_json1():
        classes.append(ArrayFacet)
    return classes
@ -68,6 +65,8 @@ def register_facet_classes():
class Facet:
    type = None
    # How many rows to consider when suggesting facets:
    suggest_consider = 1000

    def __init__(
        self,
@ -77,7 +76,7 @@ class Facet:
        sql=None,
        table=None,
        params=None,
        table_config=None,
        row_count=None,
    ):
        assert table or sql, "Must provide either table= or sql="
@ -86,14 +85,14 @@ class Facet:
        self.database = database
        # For foreign key expansion. Can be None for e.g. canned SQL queries:
        self.table = table
        self.sql = sql or f"select * from [{table}]"
        self.params = params or []
        self.table_config = table_config
        # row_count can be None, in which case we calculate it ourselves:
        self.row_count = row_count

    def get_configs(self):
        configs = load_facet_configs(self.request, self.table_config)
        return configs.get(self.type) or []

    def get_querystring_pairs(self):
@ -101,6 +100,36 @@ class Facet:
        # [('_foo', 'bar'), ('_foo', '2'), ('empty', '')]
        return urllib.parse.parse_qsl(self.request.query_string, keep_blank_values=True)

    def get_facet_size(self):
        facet_size = self.ds.setting("default_facet_size")
        max_returned_rows = self.ds.setting("max_returned_rows")
        table_facet_size = None
        if self.table:
            config_facet_size = (
                self.ds.config.get("databases", {})
                .get(self.database, {})
                .get("tables", {})
                .get(self.table, {})
                .get("facet_size")
            )
            if config_facet_size:
                table_facet_size = config_facet_size
        custom_facet_size = self.request.args.get("_facet_size")
        if custom_facet_size:
            if custom_facet_size == "max":
                facet_size = max_returned_rows
            elif custom_facet_size.isdigit():
                facet_size = int(custom_facet_size)
            else:
                # Invalid value, ignore it
                custom_facet_size = None
        if table_facet_size and not custom_facet_size:
            if table_facet_size == "max":
                facet_size = max_returned_rows
            else:
                facet_size = table_facet_size
        return min(facet_size, max_returned_rows)

    async def suggest(self):
        return []
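The `get_facet_size()` precedence in the diff above reduces to: a `?_facet_size=` querystring value beats the per-table `facet_size` config, which beats the `default_facet_size` setting, with `max_returned_rows` always capping the result. A pure-function sketch of just that precedence, detached from the `Datasette` object:

```python
def resolve_facet_size(default_facet_size, max_returned_rows,
                       table_facet_size=None, custom_facet_size=None):
    # Sketch of Facet.get_facet_size() precedence (illustrative names)
    facet_size = default_facet_size
    if custom_facet_size == "max":
        facet_size = max_returned_rows
    elif custom_facet_size and custom_facet_size.isdigit():
        facet_size = int(custom_facet_size)
    elif table_facet_size:
        # Invalid querystring values fall through to the table config
        facet_size = (
            max_returned_rows if table_facet_size == "max" else table_facet_size
        )
    return min(facet_size, max_returned_rows)


print(resolve_facet_size(30, 1000, custom_facet_size="max"))
print(resolve_facet_size(30, 1000, table_facet_size=200))
print(resolve_facet_size(30, 1000, custom_facet_size="9999"))
print(resolve_facet_size(30, 1000, custom_facet_size="bogus"))
```

The `"9999"` case shows the cap in action: even an explicit numeric request is limited to `max_returned_rows`.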
@ -114,21 +143,10 @@ class Facet:
        # Detect column names using the "limit 0" trick
        return (
            await self.ds.execute(
                self.database, f"select * from ({sql}) limit 0", params or []
            )
        ).columns


class ColumnFacet(Facet):
    type = "column"
@ -136,19 +154,23 @@ class ColumnFacet(Facet):
    async def suggest(self):
        row_count = await self.get_row_count()
        columns = await self.get_columns(self.sql, self.params)
        facet_size = self.get_facet_size()
        suggested_facets = []
        already_enabled = [c["config"]["simple"] for c in self.get_configs()]
        for column in columns:
            if column in already_enabled:
                continue
            suggested_facet_sql = """
                with limited as (select * from ({sql}) limit {suggest_consider})
                select {column} as value, count(*) as n from limited
                where value is not null
                group by value
                limit {limit}
            """.format(
                column=escape_sqlite(column),
                sql=self.sql,
                limit=facet_size + 1,
                suggest_consider=self.suggest_consider,
            )
            distinct_values = None
            try:
@ -157,21 +179,25 @@ class ColumnFacet(Facet):
                    suggested_facet_sql,
                    self.params,
                    truncate=False,
                    custom_time_limit=self.ds.setting("facet_suggest_time_limit_ms"),
                )
                num_distinct_values = len(distinct_values)
                if (
                    1 < num_distinct_values < row_count
                    and num_distinct_values <= facet_size
                    # And at least one has n > 1
                    and any(r["n"] > 1 for r in distinct_values)
                ):
                    suggested_facets.append(
                        {
                            "name": column,
                            "toggle_url": self.ds.absolute_url(
                                self.request,
                                self.ds.urls.path(
                                    path_with_added_args(
                                        self.request, {"_facet": column}
                                    )
                                ),
                            ),
                        }
                    )
@ -179,13 +205,24 @@ class ColumnFacet(Facet):
                continue
        return suggested_facets

    async def get_row_count(self):
        if self.row_count is None:
            self.row_count = (
                await self.ds.execute(
                    self.database,
                    f"select count(*) from (select * from ({self.sql}) limit {self.suggest_consider})",
                    self.params,
                )
            ).rows[0][0]
        return self.row_count

    async def facet_results(self):
        facet_results = []
        facets_timed_out = []
        qs_pairs = self.get_querystring_pairs()
        facet_size = self.get_facet_size()
        for source_and_config in self.get_configs():
            config = source_and_config["config"]
            source = source_and_config["source"]
@ -195,7 +232,7 @@ class ColumnFacet(Facet):
                    {sql}
                )
                where {col} is not null
                group by {col} order by count desc, value limit {limit}
            """.format(
                col=escape_sqlite(column), sql=self.sql, limit=facet_size + 1
            )
@ -205,37 +242,42 @@ class ColumnFacet(Facet):
                    facet_sql,
                    self.params,
                    truncate=False,
                    custom_time_limit=self.ds.setting("facet_time_limit_ms"),
                )
                facet_results_values = []
                facet_results.append(
                    {
                        "name": column,
                        "type": self.type,
                        "hideable": source != "metadata",
                        "toggle_url": self.ds.urls.path(
                            path_with_removed_args(self.request, {"_facet": column})
                        ),
                        "results": facet_results_values,
                        "truncated": len(facet_rows_results) > facet_size,
                    }
                )
                facet_rows = facet_rows_results.rows[:facet_size]
                if self.table:
                    # Attempt to expand foreign keys into labels
                    values = [row["value"] for row in facet_rows]
                    expanded = await self.ds.expand_foreign_keys(
                        self.request.actor, self.database, self.table, column, values
                    )
                else:
                    expanded = {}
                for row in facet_rows:
                    column_qs = column
                    if column.startswith("_"):
                        column_qs = "{}__exact".format(column)
                    selected = (column_qs, str(row["value"])) in qs_pairs
                    if selected:
                        toggle_path = path_with_removed_args(
                            self.request, {column_qs: str(row["value"])}
                        )
                    else:
                        toggle_path = path_with_added_args(
                            self.request, {column_qs: row["value"]}
                        )
                    facet_results_values.append(
                        {
@ -243,7 +285,7 @@ class ColumnFacet(Facet):
                            "label": expanded.get((column, row["value"]), row["value"]),
                            "count": row["count"],
                            "toggle_url": self.ds.absolute_url(
                                self.request, self.ds.urls.path(toggle_path)
                            ),
                            "selected": selected,
                        }
@ -257,6 +299,16 @@ class ColumnFacet(Facet):
class ArrayFacet(Facet):
    type = "array"

    def _is_json_array_of_strings(self, json_string):
        try:
            array = json.loads(json_string)
        except ValueError:
            return False
        for item in array:
            if not isinstance(item, str):
                return False
        return True

    async def suggest(self):
        columns = await self.get_columns(self.sql, self.params)
        suggested_facets = []
@ -266,10 +318,14 @@ class ArrayFacet(Facet):
                continue
            # Is every value in this column either null or a JSON array?
            suggested_facet_sql = """
                with limited as (select * from ({sql}) limit {suggest_consider})
                select distinct json_type({column})
                from limited
                where {column} is not null and {column} != ''
            """.format(
                column=escape_sqlite(column),
                sql=self.sql,
                suggest_consider=self.suggest_consider,
            )
            try:
                results = await self.ds.execute(
@ -277,44 +333,86 @@ class ArrayFacet(Facet):
                    suggested_facet_sql,
                    self.params,
                    truncate=False,
                    custom_time_limit=self.ds.setting("facet_suggest_time_limit_ms"),
                    log_sql_errors=False,
                )
                types = tuple(r[0] for r in results.rows)
                if types in (("array",), ("array", None)):
                    # Now check that first 100 arrays contain only strings
                    first_100 = [
                        v[0]
                        for v in await self.ds.execute(
                            self.database,
                            (
                                "select {column} from ({sql}) "
                                "where {column} is not null "
                                "and {column} != '' "
                                "and json_array_length({column}) > 0 "
                                "limit 100"
                            ).format(column=escape_sqlite(column), sql=self.sql),
                            self.params,
                            truncate=False,
                            custom_time_limit=self.ds.setting(
                                "facet_suggest_time_limit_ms"
                            ),
                            log_sql_errors=False,
                        )
                    ]
                    if first_100 and all(
                        self._is_json_array_of_strings(r) for r in first_100
                    ):
                        suggested_facets.append(
                            {
                                "name": column,
                                "type": "array",
                                "toggle_url": self.ds.absolute_url(
                                    self.request,
                                    self.ds.urls.path(
                                        path_with_added_args(
                                            self.request, {"_facet_array": column}
                                        )
                                    ),
                                ),
                            }
                        )
            except (QueryInterrupted, sqlite3.OperationalError):
                continue
        return suggested_facets

    async def facet_results(self):
        # self.configs should be a plain list of columns
        facet_results = []
        facets_timed_out = []
        facet_size = self.get_facet_size()
        for source_and_config in self.get_configs():
            config = source_and_config["config"]
            source = source_and_config["source"]
            column = config.get("column") or config["simple"]
            # https://github.com/simonw/datasette/issues/448
            facet_sql = """
                with inner as ({sql}),
                deduped_array_items as (
                    select
                        distinct j.value,
                        inner.*
                    from
                        json_each([inner].{col}) j
                        join inner
                )
                select
                    value as value,
                    count(*) as count
                from
                    deduped_array_items
                group by
                    value
                order by
                    count(*) desc, value limit {limit}
            """.format(
                col=escape_sqlite(column),
                sql=self.sql,
                limit=facet_size + 1,
            )
            try:
                facet_rows_results = await self.ds.execute(
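The `deduped_array_items` CTE in that facet SQL counts each tag at most once per row, so a row containing `["a", "a"]` contributes 1 to the count for `a`, not 2. A self-contained demonstration of the same dedup-then-count idea against SQLite's `json_each` (simplified query shape; assumes a SQLite build with the JSON functions, which recent Python builds include):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table t(tags text)")
con.executemany(
    "insert into t values (?)",
    [('["a","b"]',), ('["a","a"]',), (None,)],
)
rows = con.execute(
    """
    with inner as (select rowid as rid, tags from t where tags is not null),
    deduped_array_items as (
        -- distinct (value, rid) collapses duplicates within one row
        select distinct j.value as value, inner.rid
        from inner join json_each(inner.tags) j
    )
    select value, count(*) as count
    from deduped_array_items
    group by value
    order by count(*) desc, value
    """
).fetchall()
print(rows)
```

Without the `distinct` step the second row's repeated `"a"` would inflate that count to 3; with it, `a` counts the two rows it appears in and `b` counts one.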
@@ -322,31 +420,35 @@ class ArrayFacet(Facet):
                    facet_sql,
                    self.params,
                    truncate=False,
-                    custom_time_limit=self.ds.config("facet_time_limit_ms"),
+                    custom_time_limit=self.ds.setting("facet_time_limit_ms"),
                )
                facet_results_values = []
-                facet_results[column] = {
-                    "name": column,
-                    "type": self.type,
-                    "results": facet_results_values,
-                    "hideable": source != "metadata",
-                    "toggle_url": path_with_removed_args(
-                        self.request, {"_facet_array": column}
-                    ),
-                    "truncated": len(facet_rows_results) > facet_size,
-                }
+                facet_results.append(
+                    {
+                        "name": column,
+                        "type": self.type,
+                        "results": facet_results_values,
+                        "hideable": source != "metadata",
+                        "toggle_url": self.ds.urls.path(
+                            path_with_removed_args(
+                                self.request, {"_facet_array": column}
+                            )
+                        ),
+                        "truncated": len(facet_rows_results) > facet_size,
+                    }
+                )
                facet_rows = facet_rows_results.rows[:facet_size]
                pairs = self.get_querystring_pairs()
                for row in facet_rows:
                    value = str(row["value"])
-                    selected = ("{}__arraycontains".format(column), value) in pairs
+                    selected = (f"{column}__arraycontains", value) in pairs
                    if selected:
                        toggle_path = path_with_removed_args(
-                            self.request, {"{}__arraycontains".format(column): value}
+                            self.request, {f"{column}__arraycontains": value}
                        )
                    else:
                        toggle_path = path_with_added_args(
-                            self.request, {"{}__arraycontains".format(column): value}
+                            self.request, {f"{column}__arraycontains": value}
                        )
                    facet_results_values.append(
                        {
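The rewritten array-facet SQL above dedupes JSON array items within each row before counting, so a row whose array repeats a value only contributes it once. A minimal sketch of the same query shape, run against an invented toy table (the CTE is renamed `base` here to sidestep `inner` being a SQL keyword; requires SQLite with the JSON1 extension):

```python
import sqlite3

# Toy table: "places" and its "tags" JSON-array column are invented for this demo
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("create table places (name text, tags text)")
conn.executemany(
    "insert into places values (?, ?)",
    [
        ("a", '["park", "park", "dog-park"]'),  # "park" repeated within one row
        ("b", '["park"]'),
    ],
)
facet_sql = """
    with base as (select * from places),
    deduped_array_items as (
        select distinct j.value, base.*
        from base join json_each(base.tags) j
    )
    select value, count(*) as count
    from deduped_array_items
    group by value
    order by count(*) desc, value limit 10
"""
# Each row contributes each tag at most once: "park" counts 2, not 3
rows = [(r["value"], r["count"]) for r in conn.execute(facet_sql)]
```

The `select distinct j.value, base.*` step is what collapses per-row duplicates: the distinct runs over the value plus the full originating row.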
@@ -378,8 +480,8 @@ class DateFacet(Facet):
            # Does this column contain any dates in the first 100 rows?
            suggested_facet_sql = """
                select date({column}) from (
-                    {sql}
-                ) where {column} glob "????-??-*" limit 100;
+                    select * from ({sql}) limit 100
+                ) where {column} glob "????-??-*"
            """.format(
                column=escape_sqlite(column), sql=self.sql
            )
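The change above moves `limit 100` inside the subquery, so only the first 100 rows are scanned rather than limiting the matches after a full scan. A sketch of the tightened query against an invented `events` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table events (created text)")
conn.executemany(
    "insert into events values (?)",
    [("2025-12-04",), ("not a date",), (None,)],
)
# The inner "limit 100" bounds the scan; the glob keeps only date-shaped values
suggested_facet_sql = """
    select date([created]) from (
        select * from (select * from events) limit 100
    ) where [created] glob "????-??-*"
"""
values = [r[0] for r in conn.execute(suggested_facet_sql)]
```

`"????-??-*"` matches four characters, a dash, two characters, a dash, then anything, which filters out non-date strings and NULLs before `date()` is applied.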
@@ -389,7 +491,7 @@ class DateFacet(Facet):
                    suggested_facet_sql,
                    self.params,
                    truncate=False,
-                    custom_time_limit=self.ds.config("facet_suggest_time_limit_ms"),
+                    custom_time_limit=self.ds.setting("facet_suggest_time_limit_ms"),
                    log_sql_errors=False,
                )
                values = tuple(r[0] for r in results.rows)
@@ -400,8 +502,10 @@ class DateFacet(Facet):
                        "type": "date",
                        "toggle_url": self.ds.absolute_url(
                            self.request,
-                            path_with_added_args(
-                                self.request, {"_facet_date": column}
+                            self.ds.urls.path(
+                                path_with_added_args(
+                                    self.request, {"_facet_date": column}
+                                )
                            ),
                        ),
                    }
@@ -411,10 +515,10 @@ class DateFacet(Facet):
        return suggested_facets

    async def facet_results(self):
-        facet_results = {}
+        facet_results = []
        facets_timed_out = []
        args = dict(self.get_querystring_pairs())
-        facet_size = self.ds.config("default_facet_size")
+        facet_size = self.get_facet_size()
        for source_and_config in self.get_configs():
            config = source_and_config["config"]
            source = source_and_config["source"]
@@ -425,7 +529,7 @@ class DateFacet(Facet):
                {sql}
            )
            where date({col}) is not null
-            group by date({col}) order by count desc limit {limit}
+            group by date({col}) order by count desc, value limit {limit}
        """.format(
            col=escape_sqlite(column), sql=self.sql, limit=facet_size + 1
        )
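The date-facet query above now adds `, value` to the ordering, which makes ties between equal counts deterministic. A sketch against an invented `logs` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table logs (created text)")
conn.executemany(
    "insert into logs values (?)",
    [("2025-01-01 10:00",), ("2025-01-01 11:00",), ("2025-01-02 09:00",)],
)
# date() truncates timestamps to the day; "count desc, value" breaks ties
facet_sql = """
    select date([created]) as value, count(*) as count from (
        select * from logs
    )
    where date([created]) is not null
    group by date([created]) order by count desc, value limit 31
"""
rows = list(conn.execute(facet_sql))
```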
@@ -435,31 +539,31 @@ class DateFacet(Facet):
                    facet_sql,
                    self.params,
                    truncate=False,
-                    custom_time_limit=self.ds.config("facet_time_limit_ms"),
+                    custom_time_limit=self.ds.setting("facet_time_limit_ms"),
                )
                facet_results_values = []
-                facet_results[column] = {
-                    "name": column,
-                    "type": self.type,
-                    "results": facet_results_values,
-                    "hideable": source != "metadata",
-                    "toggle_url": path_with_removed_args(
-                        self.request, {"_facet_date": column}
-                    ),
-                    "truncated": len(facet_rows_results) > facet_size,
-                }
+                facet_results.append(
+                    {
+                        "name": column,
+                        "type": self.type,
+                        "results": facet_results_values,
+                        "hideable": source != "metadata",
+                        "toggle_url": path_with_removed_args(
+                            self.request, {"_facet_date": column}
+                        ),
+                        "truncated": len(facet_rows_results) > facet_size,
+                    }
+                )
                facet_rows = facet_rows_results.rows[:facet_size]
                for row in facet_rows:
-                    selected = str(args.get("{}__date".format(column))) == str(
-                        row["value"]
-                    )
+                    selected = str(args.get(f"{column}__date")) == str(row["value"])
                    if selected:
                        toggle_path = path_with_removed_args(
-                            self.request, {"{}__date".format(column): str(row["value"])}
+                            self.request, {f"{column}__date": str(row["value"])}
                        )
                    else:
                        toggle_path = path_with_added_args(
-                            self.request, {"{}__date".format(column): row["value"]}
+                            self.request, {f"{column}__date": row["value"]}
                        )
                    facet_results_values.append(
                        {
@@ -476,190 +580,3 @@ class DateFacet(Facet):
                facets_timed_out.append(column)
        return facet_results, facets_timed_out
-class ManyToManyFacet(Facet):
-    type = "m2m"
-
-    async def suggest(self):
-        # This is calculated based on foreign key relationships to this table
-        # Are there any many-to-many tables pointing here?
-        suggested_facets = []
-        db = self.ds.databases[self.database]
-        all_foreign_keys = await db.get_all_foreign_keys()
-        if not all_foreign_keys.get(self.table):
-            # It's probably a view
-            return []
-        args = set(self.get_querystring_pairs())
-        incoming = all_foreign_keys[self.table]["incoming"]
-        # Do any of these incoming tables have exactly two outgoing keys?
-        for fk in incoming:
-            other_table = fk["other_table"]
-            other_table_outgoing_foreign_keys = all_foreign_keys[other_table][
-                "outgoing"
-            ]
-            if len(other_table_outgoing_foreign_keys) == 2:
-                destination_table = [
-                    t
-                    for t in other_table_outgoing_foreign_keys
-                    if t["other_table"] != self.table
-                ][0]["other_table"]
-                # Only suggest if it's not selected already
-                if ("_facet_m2m", destination_table) in args:
-                    continue
-                suggested_facets.append(
-                    {
-                        "name": destination_table,
-                        "type": "m2m",
-                        "toggle_url": self.ds.absolute_url(
-                            self.request,
-                            path_with_added_args(
-                                self.request, {"_facet_m2m": destination_table}
-                            ),
-                        ),
-                    }
-                )
-        return suggested_facets
-
-    async def facet_results(self):
-        facet_results = {}
-        facets_timed_out = []
-        args = set(self.get_querystring_pairs())
-        facet_size = self.ds.config("default_facet_size")
-        db = self.ds.databases[self.database]
-        all_foreign_keys = await db.get_all_foreign_keys()
-        if not all_foreign_keys.get(self.table):
-            return [], []
-        # We care about three tables: self.table, middle_table and destination_table
-        incoming = all_foreign_keys[self.table]["incoming"]
-        for source_and_config in self.get_configs():
-            config = source_and_config["config"]
-            source = source_and_config["source"]
-            # The destination_table is specified in the _facet_m2m=xxx parameter
-            destination_table = config.get("column") or config["simple"]
-            # Find middle table - it has fks to self.table AND destination_table
-            fks = None
-            middle_table = None
-            for fk in incoming:
-                other_table = fk["other_table"]
-                other_table_outgoing_foreign_keys = all_foreign_keys[other_table][
-                    "outgoing"
-                ]
-                if (
-                    any(
-                        o
-                        for o in other_table_outgoing_foreign_keys
-                        if o["other_table"] == destination_table
-                    )
-                    and len(other_table_outgoing_foreign_keys) == 2
-                ):
-                    fks = other_table_outgoing_foreign_keys
-                    middle_table = other_table
-                    break
-            if middle_table is None or fks is None:
-                return [], []
-            # Now that we have determined the middle_table, we need to figure out the three
-            # columns on that table which are relevant to us. These are:
-            # column_to_table - the middle_table column with a foreign key to self.table
-            # table_pk - the primary key column on self.table that is referenced
-            # column_to_destination - the column with a foreign key to destination_table
-            #
-            # It turns out we don't actually need the fourth obvious column:
-            # destination_pk = the primary key column on destination_table which is referenced
-            #
-            # These are both in the fks array - which now contains 2 foreign key relationships, e.g:
-            # [
-            #   {'other_table': 'characteristic', 'column': 'characteristic_id', 'other_column': 'pk'},
-            #   {'other_table': 'attractions', 'column': 'attraction_id', 'other_column': 'pk'}
-            # ]
-            column_to_table = None
-            table_pk = None
-            column_to_destination = None
-            for fk in fks:
-                if fk["other_table"] == self.table:
-                    table_pk = fk["other_column"]
-                    column_to_table = fk["column"]
-                elif fk["other_table"] == destination_table:
-                    column_to_destination = fk["column"]
-            assert all((column_to_table, table_pk, column_to_destination))
-            facet_sql = """
-                select
-                    {middle_table}.{column_to_destination} as value,
-                    count(distinct {middle_table}.{column_to_table}) as count
-                from {middle_table}
-                where {middle_table}.{column_to_table} in (
-                    select {table_pk} from ({sql})
-                )
-                group by {middle_table}.{column_to_destination}
-                order by count desc limit {limit}
-            """.format(
-                sql=self.sql,
-                limit=facet_size + 1,
-                middle_table=escape_sqlite(middle_table),
-                column_to_destination=escape_sqlite(column_to_destination),
-                column_to_table=escape_sqlite(column_to_table),
-                table_pk=escape_sqlite(table_pk),
-            )
-            try:
-                facet_rows_results = await self.ds.execute(
-                    self.database,
-                    facet_sql,
-                    self.params,
-                    truncate=False,
-                    custom_time_limit=self.ds.config("facet_time_limit_ms"),
-                )
-                facet_results_values = []
-                facet_results[destination_table] = {
-                    "name": destination_table,
-                    "type": self.type,
-                    "results": facet_results_values,
-                    "hideable": source != "metadata",
-                    "toggle_url": path_with_removed_args(
-                        self.request, {"_facet_m2m": destination_table}
-                    ),
-                    "truncated": len(facet_rows_results) > facet_size,
-                }
-                facet_rows = facet_rows_results.rows[:facet_size]
-                # Attempt to expand foreign keys into labels
-                values = [row["value"] for row in facet_rows]
-                expanded = await self.ds.expand_foreign_keys(
-                    self.database, middle_table, column_to_destination, values
-                )
-                for row in facet_rows:
-                    through = json.dumps(
-                        {
-                            "table": middle_table,
-                            "column": column_to_destination,
-                            "value": str(row["value"]),
-                        },
-                        separators=(",", ":"),
-                        sort_keys=True,
-                    )
-                    selected = ("_through", through) in args
-                    if selected:
-                        toggle_path = path_with_removed_args(
-                            self.request, {"_through": through}
-                        )
-                    else:
-                        toggle_path = path_with_added_args(
-                            self.request, {"_through": through}
-                        )
-                    facet_results_values.append(
-                        {
-                            "value": row["value"],
-                            "label": expanded.get(
-                                (column_to_destination, row["value"]), row["value"]
-                            ),
-                            "count": row["count"],
-                            "toggle_url": self.ds.absolute_url(
-                                self.request, toggle_path
-                            ),
-                            "selected": selected,
-                        }
-                    )
-            except QueryInterrupted:
-                facets_timed_out.append(destination_table)
-        return facet_results, facets_timed_out


@@ -1,7 +1,173 @@
+from datasette import hookimpl
+from datasette.resources import DatabaseResource
+from datasette.views.base import DatasetteError
+from datasette.utils.asgi import BadRequest
import json
-import numbers
-from .utils import detect_json1, escape_sqlite
+from .utils import detect_json1, escape_sqlite, path_with_removed_args
+
+
+@hookimpl(specname="filters_from_request")
+def where_filters(request, database, datasette):
+    # This one deals with ?_where=
+    async def inner():
+        where_clauses = []
+        extra_wheres_for_ui = []
+        if "_where" in request.args:
+            if not await datasette.allowed(
+                action="execute-sql",
+                resource=DatabaseResource(database=database),
+                actor=request.actor,
+            ):
+                raise DatasetteError("_where= is not allowed", status=403)
+            else:
+                where_clauses.extend(request.args.getlist("_where"))
+                extra_wheres_for_ui = [
+                    {
+                        "text": text,
+                        "remove_url": path_with_removed_args(request, {"_where": text}),
+                    }
+                    for text in request.args.getlist("_where")
+                ]
+        return FilterArguments(
+            where_clauses,
+            extra_context={
+                "extra_wheres_for_ui": extra_wheres_for_ui,
+            },
+        )
+
+    return inner
+
+
+@hookimpl(specname="filters_from_request")
+def search_filters(request, database, table, datasette):
+    # ?_search= and ?_search_colname=
+    async def inner():
+        where_clauses = []
+        params = {}
+        human_descriptions = []
+        extra_context = {}
+
+        # Figure out which fts_table to use
+        table_metadata = await datasette.table_config(database, table)
+        db = datasette.get_database(database)
+        fts_table = request.args.get("_fts_table")
+        fts_table = fts_table or table_metadata.get("fts_table")
+        fts_table = fts_table or await db.fts_table(table)
+        fts_pk = request.args.get("_fts_pk", table_metadata.get("fts_pk", "rowid"))
+        search_args = {
+            key: request.args[key]
+            for key in request.args
+            if key.startswith("_search") and key != "_searchmode"
+        }
+        search = ""
+        search_mode_raw = table_metadata.get("searchmode") == "raw"
+        # Or set search mode from the querystring
+        qs_searchmode = request.args.get("_searchmode")
+        if qs_searchmode == "escaped":
+            search_mode_raw = False
+        if qs_searchmode == "raw":
+            search_mode_raw = True
+
+        extra_context["supports_search"] = bool(fts_table)
+
+        if fts_table and search_args:
+            if "_search" in search_args:
+                # Simple ?_search=xxx
+                search = search_args["_search"]
+                where_clauses.append(
+                    "{fts_pk} in (select rowid from {fts_table} where {fts_table} match {match_clause})".format(
+                        fts_table=escape_sqlite(fts_table),
+                        fts_pk=escape_sqlite(fts_pk),
+                        match_clause=(
+                            ":search" if search_mode_raw else "escape_fts(:search)"
+                        ),
+                    )
+                )
+                human_descriptions.append(f'search matches "{search}"')
+                params["search"] = search
+                extra_context["search"] = search
+            else:
+                # More complex: search against specific columns
+                for i, (key, search_text) in enumerate(search_args.items()):
+                    search_col = key.split("_search_", 1)[1]
+                    if search_col not in await db.table_columns(fts_table):
+                        raise BadRequest("Cannot search by that column")
+                    where_clauses.append(
+                        "rowid in (select rowid from {fts_table} where {search_col} match {match_clause})".format(
+                            fts_table=escape_sqlite(fts_table),
+                            search_col=escape_sqlite(search_col),
+                            match_clause=(
+                                ":search_{}".format(i)
+                                if search_mode_raw
+                                else "escape_fts(:search_{})".format(i)
+                            ),
+                        )
+                    )
+                    human_descriptions.append(
+                        f'search column "{search_col}" matches "{search_text}"'
+                    )
+                    params[f"search_{i}"] = search_text
+                    extra_context["search"] = search_text
+
+        return FilterArguments(where_clauses, params, human_descriptions, extra_context)
+
+    return inner
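The simple `?_search=` branch above assembles a subquery clause against the FTS table. A sketch of how that string comes together; the table and column names here are invented, and `escape_fts()` is a SQL function Datasette registers to escape the user's query before it reaches `match`:

```python
# Invented names for illustration; escape_sqlite() is skipped since these
# identifiers need no quoting
fts_table = "docs_fts"
fts_pk = "rowid"
search_mode_raw = False  # "escaped" mode wraps the parameter in escape_fts()

clause = (
    "{fts_pk} in (select rowid from {fts_table} "
    "where {fts_table} match {match_clause})".format(
        fts_table=fts_table,
        fts_pk=fts_pk,
        match_clause=":search" if search_mode_raw else "escape_fts(:search)",
    )
)
```

With `search_mode_raw = True` the `:search` parameter is passed straight through, letting users write raw FTS query syntax.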
+
+
+@hookimpl(specname="filters_from_request")
+def through_filters(request, database, table, datasette):
+    # ?_through={"table": ..., "column": ..., "value": ...}
+    async def inner():
+        where_clauses = []
+        params = {}
+        human_descriptions = []
+        extra_context = {}
+
+        # Support for ?_through={table, column, value}
+        if "_through" in request.args:
+            for through in request.args.getlist("_through"):
+                through_data = json.loads(through)
+                through_table = through_data["table"]
+                other_column = through_data["column"]
+                value = through_data["value"]
+                db = datasette.get_database(database)
+                outgoing_foreign_keys = await db.foreign_keys_for_table(through_table)
+                try:
+                    fk_to_us = [
+                        fk for fk in outgoing_foreign_keys if fk["other_table"] == table
+                    ][0]
+                except IndexError:
+                    raise DatasetteError(
+                        "Invalid _through - could not find corresponding foreign key"
+                    )
+                param = f"p{len(params)}"
+                where_clauses.append(
+                    "{our_pk} in (select {our_column} from {through_table} where {other_column} = :{param})".format(
+                        through_table=escape_sqlite(through_table),
+                        our_pk=escape_sqlite(fk_to_us["other_column"]),
+                        our_column=escape_sqlite(fk_to_us["column"]),
+                        other_column=escape_sqlite(other_column),
+                        param=param,
+                    )
+                )
+                params[param] = value
+                human_descriptions.append(f'{through_table}.{other_column} = "{value}"')
+
+        return FilterArguments(where_clauses, params, human_descriptions, extra_context)
+
+    return inner
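The `?_through=` value consumed above is a small JSON document; the m2m facet code serializes it with sorted keys and compact separators so the string compares stably against querystring pairs. A sketch with invented table and column names:

```python
import json

# Hypothetical join-table filter: rows of the current table that are linked
# (through "attraction_characteristic") to characteristic_id = 1
through = json.dumps(
    {
        "table": "attraction_characteristic",
        "column": "characteristic_id",
        "value": "1",
    },
    separators=(",", ":"),
    sort_keys=True,
)
```

The resulting string would be passed as `?_through=` (URL-encoded), and `through_filters` turns it into a `pk in (select ... from attraction_characteristic where characteristic_id = :p0)` clause.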
+
+
+class FilterArguments:
+    def __init__(
+        self, where_clauses, params=None, human_descriptions=None, extra_context=None
+    ):
+        self.where_clauses = where_clauses
+        self.params = params or {}
+        self.human_descriptions = human_descriptions or []
+        self.extra_context = extra_context or {}
class Filter:

@@ -43,7 +209,7 @@ class TemplatedFilter(Filter):
            kwargs = {"c": column}
            converted = None
        else:
-            kwargs = {"c": column, "p": "p{}".format(param_counter), "t": table}
+            kwargs = {"c": column, "p": f"p{param_counter}", "t": table}
        return self.sql_template.format(**kwargs), converted

    def human_clause(self, column, value):
@@ -69,12 +235,26 @@ class InFilter(Filter):
    def where_clause(self, table, column, value, param_counter):
        values = self.split_value(value)
-        params = [":p{}".format(param_counter + i) for i in range(len(values))]
-        sql = "{} in ({})".format(escape_sqlite(column), ", ".join(params))
+        params = [f":p{param_counter + i}" for i in range(len(values))]
+        sql = f"{escape_sqlite(column)} in ({', '.join(params)})"
        return sql, values

    def human_clause(self, column, value):
-        return "{} in {}".format(column, json.dumps(self.split_value(value)))
+        return f"{column} in {json.dumps(self.split_value(value))}"
+
+
+class NotInFilter(InFilter):
+    key = "notin"
+    display = "not in"
+
+    def where_clause(self, table, column, value, param_counter):
+        values = self.split_value(value)
+        params = [f":p{param_counter + i}" for i in range(len(values))]
+        sql = f"{escape_sqlite(column)} not in ({', '.join(params)})"
+        return sql, values
+
+    def human_clause(self, column, value):
+        return f"{column} not in {json.dumps(self.split_value(value))}"
class Filters:

@@ -100,6 +280,13 @@ class Filters:
            '{c} contains "{v}"',
            format="%{}%",
        ),
+        TemplatedFilter(
+            "notcontains",
+            "does not contain",
+            '"{c}" not like :{p}',
+            '{c} does not contain "{v}"',
+            format="%{}%",
+        ),
        TemplatedFilter(
            "endswith",
            "ends with",
@@ -123,20 +310,27 @@ class Filters:
            "lte", "\u2264", '"{c}" <= :{p}', "{c} \u2264 {v}", numeric=True
        ),
        TemplatedFilter("like", "like", '"{c}" like :{p}', '{c} like "{v}"'),
+        TemplatedFilter(
+            "notlike", "not like", '"{c}" not like :{p}', '{c} not like "{v}"'
+        ),
        TemplatedFilter("glob", "glob", '"{c}" glob :{p}', '{c} glob "{v}"'),
        InFilter(),
+        NotInFilter(),
    ]
    + (
        [
            TemplatedFilter(
                "arraycontains",
                "array contains",
-                """rowid in (
-                    select {t}.rowid from {t}, json_each({t}.{c}) j
-                    where j.value = :{p}
-                )""",
+                """:{p} in (select value from json_each([{t}].[{c}]))""",
                '{c} contains "{v}"',
-            )
+            ),
+            TemplatedFilter(
+                "arraynotcontains",
+                "array does not contain",
+                """:{p} not in (select value from json_each([{t}].[{c}]))""",
+                '{c} does not contain "{v}"',
+            ),
        ]
        if detect_json1()
        else []
@@ -173,13 +367,11 @@ class Filters:
        )

    _filters_by_key = {f.key: f for f in _filters}

-    def __init__(self, pairs, units={}, ureg=None):
+    def __init__(self, pairs):
        self.pairs = pairs
-        self.units = units
-        self.ureg = ureg

    def lookups(self):
-        "Yields (lookup, display, no_argument) pairs"
+        """Yields (lookup, display, no_argument) pairs"""
        for filter in self._filters:
            yield filter.key, filter.display, filter.no_argument
@@ -201,10 +393,10 @@ class Filters:
        s = " and ".join(and_bits)
        if not s:
            return ""
-        return "where {}".format(s)
+        return f"where {s}"

    def selections(self):
-        "Yields (column, lookup, value) tuples"
+        """Yields (column, lookup, value) tuples"""
        for key, value in self.pairs:
            if "__" in key:
                column, lookup = key.rsplit("__", 1)
@@ -216,20 +408,6 @@ class Filters:
    def has_selections(self):
        return bool(self.pairs)

-    def convert_unit(self, column, value):
-        "If the user has provided a unit in the query, convert it into the column unit, if present."
-        if column not in self.units:
-            return value
-        # Try to interpret the value as a unit
-        value = self.ureg(value)
-        if isinstance(value, numbers.Number):
-            # It's just a bare number, assume it's the column unit
-            return value
-        column_unit = self.ureg(self.units[column])
-        return value.to(column_unit).magnitude
-
    def build_where_clauses(self, table):
        sql_bits = []
        params = {}
@@ -237,15 +415,13 @@ class Filters:
        for column, lookup, value in self.selections():
            filter = self._filters_by_key.get(lookup, None)
            if filter:
-                sql_bit, param = filter.where_clause(
-                    table, column, self.convert_unit(column, value), i
-                )
+                sql_bit, param = filter.where_clause(table, column, value, i)
                sql_bits.append(sql_bit)
                if param is not None:
                    if not isinstance(param, list):
                        param = [param]
                    for individual_param in param:
                        param_id = f"p{i}"
                        params[param_id] = individual_param
                        i += 1
        return sql_bits, params
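`build_where_clauses()` above hands out parameter slots `p0, p1, ...` in selection order. A minimal re-creation of that numbering scheme with two invented selections (the templates mirror two of the `TemplatedFilter` definitions; the `numeric=True` conversion the real `gt` filter applies is omitted here):

```python
# Hypothetical querystring selections: ?age__gt=10&name__contains=sam
pairs = [("age__gt", "10"), ("name__contains", "sam")]
templates = {
    "gt": ('"{c}" > :{p}', None),
    "contains": ('"{c}" like :{p}', "%{}%"),  # contains wraps value in %...%
}

sql_bits, params, i = [], {}, 0
for key, value in pairs:
    column, lookup = key.rsplit("__", 1)
    template, fmt = templates[lookup]
    sql_bits.append(template.format(c=column, p=f"p{i}"))
    params[f"p{i}"] = fmt.format(value) if fmt else value
    i += 1
where = "where " + " and ".join(sql_bits)
```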

datasette/forbidden.py (new file, 19 lines)

@@ -0,0 +1,19 @@
+from datasette import hookimpl, Response
+
+
+@hookimpl(trylast=True)
+def forbidden(datasette, request, message):
+    async def inner():
+        return Response.html(
+            await datasette.render_template(
+                "error.html",
+                {
+                    "title": "Forbidden",
+                    "error": message,
+                },
+                request=request,
+            ),
+            status=403,
+        )
+
+    return inner


@@ -0,0 +1,77 @@
+from datasette import hookimpl, Response
+from .utils import add_cors_headers
+from .utils.asgi import (
+    Base400,
+)
+from .views.base import DatasetteError
+from markupsafe import Markup
+import traceback
+
+try:
+    import ipdb as pdb
+except ImportError:
+    import pdb
+
+try:
+    import rich
+except ImportError:
+    rich = None
+
+
+@hookimpl(trylast=True)
+def handle_exception(datasette, request, exception):
+    async def inner():
+        if datasette.pdb:
+            pdb.post_mortem(exception.__traceback__)
+
+        if rich is not None:
+            rich.get_console().print_exception(show_locals=True)
+
+        title = None
+        if isinstance(exception, Base400):
+            status = exception.status
+            info = {}
+            message = exception.args[0]
+        elif isinstance(exception, DatasetteError):
+            status = exception.status
+            info = exception.error_dict
+            message = exception.message
+            if exception.message_is_html:
+                message = Markup(message)
+            title = exception.title
+        else:
+            status = 500
+            info = {}
+            message = str(exception)
+            traceback.print_exc()
+
+        templates = [f"{status}.html", "error.html"]
+        info.update(
+            {
+                "ok": False,
+                "error": message,
+                "status": status,
+                "title": title,
+            }
+        )
+        headers = {}
+        if datasette.cors:
+            add_cors_headers(headers)
+        if request.path.split("?")[0].endswith(".json"):
+            return Response.json(info, status=status, headers=headers)
+        else:
+            environment = datasette.get_jinja_environment(request)
+            template = environment.select_template(templates)
+            return Response.html(
+                await template.render_async(
+                    dict(
+                        info,
+                        urls=datasette.urls,
+                        app_css_hash=datasette.app_css_hash(),
+                        menu_links=lambda: [],
+                    )
+                ),
+                status=status,
+                headers=headers,
+            )
+
+    return inner


@@ -5,51 +5,218 @@ hookspec = HookspecMarker("datasette")
hookimpl = HookimplMarker("datasette")


-@hookspec
-def startup(datasette):
-    """Fires directly after Datasette first starts running"""
-
-
@hookspec
def asgi_wrapper(datasette):
-    "Returns an ASGI middleware callable to wrap our ASGI application with"
+    """Returns an ASGI middleware callable to wrap our ASGI application with"""


@hookspec
-def prepare_connection(conn):
-    "Modify SQLite connection in some way e.g. register custom SQL functions"
+def prepare_connection(conn, database, datasette):
+    """Modify SQLite connection in some way e.g. register custom SQL functions"""


@hookspec
-def prepare_jinja2_environment(env):
-    "Modify Jinja2 template environment e.g. register custom template tags"
+def prepare_jinja2_environment(env, datasette):
+    """Modify Jinja2 template environment e.g. register custom template tags"""


@hookspec
-def extra_css_urls(template, database, table, datasette):
-    "Extra CSS URLs added by this plugin"
+def extra_css_urls(template, database, table, columns, view_name, request, datasette):
+    """Extra CSS URLs added by this plugin"""


@hookspec
-def extra_js_urls(template, database, table, datasette):
-    "Extra JavaScript URLs added by this plugin"
+def extra_js_urls(template, database, table, columns, view_name, request, datasette):
+    """Extra JavaScript URLs added by this plugin"""


@hookspec
-def extra_body_script(template, database, table, view_name, datasette):
-    "Extra JavaScript code to be included in <script> at bottom of body"
+def extra_body_script(
+    template, database, table, columns, view_name, request, datasette
+):
+    """Extra JavaScript code to be included in <script> at bottom of body"""
+
+
+@hookspec
+def extra_template_vars(
+    template, database, table, columns, view_name, request, datasette
+):
+    """Extra template variables to be made available to the template - can return dict or callable or awaitable"""


@hookspec
def publish_subcommand(publish):
-    "Subcommands for 'datasette publish'"
+    """Subcommands for 'datasette publish'"""


-@hookspec(firstresult=True)
-def render_cell(value, column, table, database, datasette):
-    "Customize rendering of HTML table cell values"
+@hookspec
+def render_cell(row, value, column, table, database, datasette, request):
+    """Customize rendering of HTML table cell values"""


@hookspec
def register_output_renderer(datasette):
-    "Register a renderer to output data in a different format"
+    """Register a renderer to output data in a different format"""


@hookspec
def register_facet_classes():
-    "Register Facet subclasses"
+    """Register Facet subclasses"""
+
+
+@hookspec
+def register_actions(datasette):
+    """Register actions: returns a list of datasette.permission.Action objects"""
+
+
+@hookspec
+def register_routes(datasette):
+    """Register URL routes: return a list of (regex, view_function) pairs"""
+
+
+@hookspec
+def register_commands(cli):
+    """Register additional CLI commands, e.g. 'datasette mycommand ...'"""
+
+
+@hookspec
+def actor_from_request(datasette, request):
+    """Return an actor dictionary based on the incoming request"""
+
+
+@hookspec(firstresult=True)
+def actors_from_ids(datasette, actor_ids):
+    """Returns a dictionary mapping those IDs to actor dictionaries"""
+
+
+@hookspec
+def jinja2_environment_from_request(datasette, request, env):
+    """Return a Jinja2 environment based on the incoming request"""
+
+
+@hookspec
+def filters_from_request(request, database, table, datasette):
+    """
+    Return datasette.filters.FilterArguments(
+        where_clauses=[str, str, str],
+        params={},
+        human_descriptions=[str, str, str],
+        extra_context={}
+    ) based on the request"""
+
+
+@hookspec
+def permission_resources_sql(datasette, actor, action):
+    """Return SQL query fragments for permission checks on resources.
+
+    Returns None, a PermissionSQL object, or a list of PermissionSQL objects.
+    Each PermissionSQL contains SQL that should return rows with columns:
+    parent (str|None), child (str|None), allow (int), reason (str).
+
+    Used to efficiently check permissions across multiple resources at once.
+    """
+
+
+@hookspec
+def canned_queries(datasette, database, actor):
+    """Return a dictionary of canned query definitions or an awaitable function that returns them"""
+
+
+@hookspec
+def register_magic_parameters(datasette):
+    """Return a list of (name, function) magic parameter functions"""
+
+
+@hookspec
+def forbidden(datasette, request, message):
+    """Custom response for a 403 forbidden error"""
+
+
+@hookspec
+def menu_links(datasette, actor, request):
+    """Links for the navigation menu"""
+
+
+@hookspec
+def row_actions(datasette, actor, request, database, table, row):
+    """Links for the row actions menu"""
+
+
+@hookspec
+def table_actions(datasette, actor, database, table, request):
+    """Links for the table actions menu"""
+
+
+@hookspec
+def view_actions(datasette, actor, database, view, request):
+    """Links for the view actions menu"""
+
+
+@hookspec
+def query_actions(datasette, actor, database, query_name, request, sql, params):
+    """Links for the query and canned query actions menu"""
+
+
+@hookspec
+def database_actions(datasette, actor, database, request):
+    """Links for the database actions menu"""
+
+
+@hookspec
+def homepage_actions(datasette, actor, request):
+    """Links for the homepage actions menu"""
+
+
+@hookspec
+def skip_csrf(datasette, scope):
+    """Mechanism for skipping CSRF checks for certain requests"""
+
+
+@hookspec
+def handle_exception(datasette, request, exception):
+    """Handle an uncaught exception. Can return a Response or None."""
+
+
+@hookspec
+def track_event(datasette, event):
+    """Respond to an event tracked by Datasette"""
+
+
+@hookspec
+def register_events(datasette):
+    """Return a list of Event subclasses to use with track_event()"""
+
+
+@hookspec
+def top_homepage(datasette, request):
+    """HTML to include at the top of the homepage"""
+
+
+@hookspec
+def top_database(datasette, request, database):
+    """HTML to include at the top of the database page"""
+
+
+@hookspec
+def top_table(datasette, request, database, table):
+    """HTML to include at the top of the table page"""
+
+
+@hookspec
+def top_row(datasette, request, database, table, row):
+    """HTML to include at the top of the row page"""
+
+
+@hookspec
+def top_query(datasette, request, database, sql):
+    """HTML to include at the top of the query results page"""
+
+
+@hookspec
+def top_canned_query(datasette, request, database, query_name):
+    """HTML to include at the top of the canned query page"""


@@ -15,7 +15,7 @@ HASH_BLOCK_SIZE = 1024 * 1024


def inspect_hash(path):
-    " Calculate the hash of a database, efficiently. "
+    """Calculate the hash of a database, efficiently."""
    m = hashlib.sha256()
    with path.open("rb") as fp:
        while True:

@@ -28,14 +28,14 @@ def inspect_hash(path):

def inspect_views(conn):
-    " List views in a database. "
+    """List views in a database."""
    return [
        v[0] for v in conn.execute('select name from sqlite_master where type = "view"')
    ]


def inspect_tables(conn, database_metadata):
-    " List tables and their row counts, excluding uninteresting tables. "
+    """List tables and their row counts, excluding uninteresting tables."""
    tables = {}
    table_names = [
        r["name"]

@@ -47,7 +47,7 @@ def inspect_tables(conn, database_metadata):
        try:
            count = conn.execute(
-                "select count(*) from {}".format(escape_sqlite(table))
+                f"select count(*) from {escape_sqlite(table)}"
            ).fetchone()[0]
        except sqlite3.OperationalError:
            # This can happen when running against a FTS virtual table
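`inspect_hash()` above reads the database in `HASH_BLOCK_SIZE` chunks so large files never have to fit in memory. The same block-at-a-time pattern, generalized to any binary stream (`hash_stream` is a name invented for this sketch):

```python
import hashlib
import io

HASH_BLOCK_SIZE = 1024 * 1024  # 1 MiB, matching the constant above


def hash_stream(fp, block_size=HASH_BLOCK_SIZE):
    # Feed the hash incrementally; memory use is bounded by block_size
    m = hashlib.sha256()
    while True:
        data = fp.read(block_size)
        if not data:
            break
        m.update(data)
    return m.hexdigest()
```

The incremental result is identical to hashing the whole payload at once, which is what makes the chunked read safe.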

datasette/permissions.py (new file, 210 lines)

@ -0,0 +1,210 @@
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Any, NamedTuple
import contextvars
# Context variable to track when permission checks should be skipped
_skip_permission_checks = contextvars.ContextVar(
"skip_permission_checks", default=False
)
class SkipPermissions:
"""Context manager to temporarily skip permission checks.
This is not a stable API and may change in future releases.
Usage:
with SkipPermissions():
# Permission checks are skipped within this block
response = await datasette.client.get("/protected")
"""
def __enter__(self):
self.token = _skip_permission_checks.set(True)
return self
def __exit__(self, exc_type, exc_val, exc_tb):
_skip_permission_checks.reset(self.token)
return False
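The token-based `ContextVar` pattern above can be exercised in isolation. This is a standalone sketch mirroring the class (hypothetical names, not importing Datasette):

```python
import contextvars

# Sketch of the SkipPermissions pattern above (illustrative names only)
_skip = contextvars.ContextVar("skip_permission_checks", default=False)


class SkipChecks:
    def __enter__(self):
        self.token = _skip.set(True)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # reset() restores the value from before this context was entered,
        # so nested uses unwind correctly
        _skip.reset(self.token)
        return False


with SkipChecks():
    inside = _skip.get()  # True while checks are skipped
outside = _skip.get()  # False again after exit
```

Using `reset(token)` rather than `set(False)` on exit is what makes nesting safe: each exit restores exactly the value that was in effect before its matching enter.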
class Resource(ABC):
"""
Base class for all resource types.
Each subclass represents a type of resource (e.g., TableResource, DatabaseResource).
The class itself carries metadata about the resource type.
Instances represent specific resources.
"""
# Class-level metadata (subclasses must define these)
name: str = None # e.g., "table", "database", "model"
parent_class: type["Resource"] | None = None # e.g., DatabaseResource for tables
# Instance-level optional extra attributes
reasons: list[str] | None = None
include_reasons: bool | None = None
def __init__(self, parent: str | None = None, child: str | None = None):
"""
Create a resource instance.
Args:
parent: The parent identifier (meaning depends on resource type)
child: The child identifier (meaning depends on resource type)
"""
self.parent = parent
self.child = child
self._private = None # Sentinel to track if private was set
@property
def private(self) -> bool:
"""
Whether this resource is private (accessible to actor but not anonymous).
This property is only available on Resource objects returned from
allowed_resources() when include_is_private=True is used.
Raises:
AttributeError: If accessed without calling include_is_private=True
"""
if self._private is None:
raise AttributeError(
"The 'private' attribute is only available when using "
"allowed_resources(..., include_is_private=True)"
)
return self._private
@private.setter
def private(self, value: bool):
self._private = value
@classmethod
def __init_subclass__(cls):
"""
Validate resource hierarchy doesn't exceed 2 levels.
Raises:
ValueError: If this resource would create a 3-level hierarchy
"""
super().__init_subclass__()
if cls.parent_class is None:
return # Top of hierarchy, nothing to validate
# Check if our parent has a parent - that would create 3 levels
if cls.parent_class.parent_class is not None:
# We have a parent, and that parent has a parent
# This creates a 3-level hierarchy, which is not allowed
raise ValueError(
f"Resource {cls.__name__} creates a 3-level hierarchy: "
f"{cls.parent_class.parent_class.__name__} -> {cls.parent_class.__name__} -> {cls.__name__}. "
f"Maximum 2 levels allowed (parent -> child)."
)
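The guard above fires at class-definition time. A minimal sketch of the same check, with illustrative class names:

```python
# Sketch of the 2-level hierarchy guard above: a grandchild resource
# class is rejected as soon as it is defined (names are illustrative)
class Res:
    parent_class = None

    def __init_subclass__(cls):
        super().__init_subclass__()
        if cls.parent_class is not None and cls.parent_class.parent_class is not None:
            raise ValueError("Maximum 2 levels allowed (parent -> child)")


class DatabaseRes(Res):
    parent_class = None  # top of hierarchy: fine


class TableRes(Res):
    parent_class = DatabaseRes  # parent -> child: fine


try:
    class ColumnRes(Res):
        parent_class = TableRes  # third level: rejected
    raised = False
except ValueError:
    raised = True
```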
@classmethod
@abstractmethod
def resources_sql(cls) -> str:
"""
Return SQL query that returns all resources of this type.
Must return two columns: parent, child
"""
pass
class AllowedResource(NamedTuple):
"""A resource with the reason it was allowed (for debugging)."""
resource: Resource
reason: str
@dataclass(frozen=True, kw_only=True)
class Action:
name: str
description: str | None
abbr: str | None = None
resource_class: type[Resource] | None = None
also_requires: str | None = None # Optional action name that must also be allowed
@property
def takes_parent(self) -> bool:
"""
Whether this action requires a parent identifier when instantiating its resource.
Returns False for global-only actions (no resource_class).
Returns True for all actions with a resource_class (all resources require a parent identifier).
"""
return self.resource_class is not None
@property
def takes_child(self) -> bool:
"""
Whether this action requires a child identifier when instantiating its resource.
Returns False for global actions (no resource_class).
Returns False for parent-level resources (DatabaseResource - parent_class is None).
Returns True for child-level resources (TableResource, QueryResource - have a parent_class).
"""
if self.resource_class is None:
return False
return self.resource_class.parent_class is not None
_reason_id = 1
@dataclass
class PermissionSQL:
"""
A plugin contributes SQL that yields:
parent TEXT NULL,
child TEXT NULL,
allow INTEGER, -- 1 allow, 0 deny
reason TEXT
For restriction-only plugins, sql can be None and only restriction_sql is provided.
"""
sql: str | None = (
None # SQL that SELECTs the 4 columns above (can be None for restriction-only)
)
params: dict[str, Any] | None = (
None # bound params for the SQL (values only; no ':' prefix)
)
source: str | None = None # System will set this to the plugin name
restriction_sql: str | None = (
None # Optional SQL that returns (parent, child) for restriction filtering
)
@classmethod
def allow(cls, reason: str, _allow: bool = True) -> "PermissionSQL":
global _reason_id
i = _reason_id
_reason_id += 1
return cls(
sql=f"SELECT NULL AS parent, NULL AS child, {1 if _allow else 0} AS allow, :reason_{i} AS reason",
params={f"reason_{i}": reason},
)
@classmethod
def deny(cls, reason: str) -> "PermissionSQL":
return cls.allow(reason=reason, _allow=False)
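The `allow()`/`deny()` factories each emit a single SELECT row with a uniquely numbered `:reason_N` bound parameter. A standalone sketch of that behavior (not the real class):

```python
# Sketch of the allow()/deny() factory pattern above: a module-level
# counter keeps each reason's bound parameter name unique
_reason_id = 1


def allow_sql(reason, _allow=True):
    global _reason_id
    i = _reason_id
    _reason_id += 1
    sql = (
        f"SELECT NULL AS parent, NULL AS child, "
        f"{1 if _allow else 0} AS allow, :reason_{i} AS reason"
    )
    return sql, {f"reason_{i}": reason}


sql1, params1 = allow_sql("actor is root")
sql2, params2 = allow_sql("blocked", _allow=False)  # deny() delegates like this
```

The unique parameter names matter because many of these fragments can be UNIONed into one query; colliding `:reason` parameters would otherwise overwrite each other.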
# This is obsolete, replaced by Action and ResourceType
@dataclass
class Permission:
name: str
abbr: str | None
description: str | None
takes_database: bool
takes_resource: bool
default: bool
# This is deliberately undocumented: it's considered an internal
# implementation detail for view-table/view-database and should
# not be used by plugins as it may change in the future.
implies_can_view: bool = False


@@ -1,23 +1,124 @@
import importlib
import os
import pluggy
from pprint import pprint
import sys
from . import hookspecs
if sys.version_info >= (3, 9):
import importlib.resources as importlib_resources
else:
import importlib_resources
if sys.version_info >= (3, 10):
import importlib.metadata as importlib_metadata
else:
import importlib_metadata
DEFAULT_PLUGINS = (
"datasette.publish.heroku",
"datasette.publish.now",
"datasette.publish.cloudrun",
"datasette.facets",
"datasette.filters",
"datasette.sql_functions",
"datasette.actor_auth_cookie",
"datasette.default_permissions",
"datasette.default_actions",
"datasette.default_magic_parameters",
"datasette.blob_renderer",
"datasette.default_menu_links",
"datasette.handle_exception",
"datasette.forbidden",
"datasette.events",
)
pm = pluggy.PluginManager("datasette")
pm.add_hookspecs(hookspecs)
-if not hasattr(sys, "_called_from_test"):
+DATASETTE_TRACE_PLUGINS = os.environ.get("DATASETTE_TRACE_PLUGINS", None)
def before(hook_name, hook_impls, kwargs):
print(file=sys.stderr)
print(f"{hook_name}:", file=sys.stderr)
pprint(kwargs, width=40, indent=4, stream=sys.stderr)
print("Hook implementations:", file=sys.stderr)
pprint(hook_impls, width=40, indent=4, stream=sys.stderr)
def after(outcome, hook_name, hook_impls, kwargs):
results = outcome.get_result()
if not isinstance(results, list):
results = [results]
print("Results:", file=sys.stderr)
pprint(results, width=40, indent=4, stream=sys.stderr)
if DATASETTE_TRACE_PLUGINS:
pm.add_hookcall_monitoring(before, after)
DATASETTE_LOAD_PLUGINS = os.environ.get("DATASETTE_LOAD_PLUGINS", None)
if not hasattr(sys, "_called_from_test") and DATASETTE_LOAD_PLUGINS is None:
# Only load plugins if not running tests
pm.load_setuptools_entrypoints("datasette")
# Load any plugins specified in DATASETTE_LOAD_PLUGINS
if DATASETTE_LOAD_PLUGINS is not None:
for package_name in [
name for name in DATASETTE_LOAD_PLUGINS.split(",") if name.strip()
]:
try:
distribution = importlib_metadata.distribution(package_name)
entry_points = distribution.entry_points
for entry_point in entry_points:
if entry_point.group == "datasette":
mod = entry_point.load()
pm.register(mod, name=entry_point.name)
# Ensure name can be found in plugin_to_distinfo later:
pm._plugin_distinfo.append((mod, distribution))
except importlib_metadata.PackageNotFoundError:
sys.stderr.write("Plugin {} could not be found\n".format(package_name))
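The comma-splitting above tolerates stray commas and an empty variable. A standalone sketch of that parsing (plugin names are illustrative examples):

```python
# Mirrors the DATASETTE_LOAD_PLUGINS parsing above: split on commas,
# drop blank entries so "" and trailing commas load nothing extra
def parse_load_plugins(value):
    return [name for name in value.split(",") if name.strip()]


names = parse_load_plugins("datasette-vega,,datasette-cluster-map")
```

Setting `DATASETTE_LOAD_PLUGINS=""` is therefore a way to run with no external plugins at all, since an empty string yields an empty list while still being "not None".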
# Load default plugins
for plugin in DEFAULT_PLUGINS:
mod = importlib.import_module(plugin)
pm.register(mod, plugin)
def get_plugins():
plugins = []
plugin_to_distinfo = dict(pm.list_plugin_distinfo())
for plugin in pm.get_plugins():
static_path = None
templates_path = None
plugin_name = (
plugin.__name__
if hasattr(plugin, "__name__")
else plugin.__class__.__name__
)
if plugin_name not in DEFAULT_PLUGINS:
try:
if (importlib_resources.files(plugin_name) / "static").is_dir():
static_path = str(importlib_resources.files(plugin_name) / "static")
if (importlib_resources.files(plugin_name) / "templates").is_dir():
templates_path = str(
importlib_resources.files(plugin_name) / "templates"
)
except (TypeError, ModuleNotFoundError):
# Caused by --plugins_dir= plugins
pass
plugin_info = {
"name": plugin_name,
"static_path": static_path,
"templates_path": templates_path,
"hooks": [h.name for h in pm.get_hookcallers(plugin)],
}
distinfo = plugin_to_distinfo.get(plugin)
if distinfo:
plugin_info["version"] = distinfo.version
plugin_info["name"] = distinfo.name or distinfo.project_name
plugins.append(plugin_info)
return plugins


@@ -1,7 +1,9 @@
from datasette import hookimpl
import click
import json
-from subprocess import check_call, check_output
+import os
+import re
+from subprocess import CalledProcessError, check_call, check_output
from .common import (
add_common_publish_arguments_and_options,
@@ -21,9 +23,66 @@ def publish_subcommand(publish):
help="Application name to use when building",
)
@click.option(
-"--service", default="", help="Cloud Run service to deploy (or over-write)"
+"--service",
+default="",
+help="Cloud Run service to deploy (or over-write)",
)
@click.option("--spatialite", is_flag=True, help="Enable SpatialLite extension")
@click.option(
"--show-files",
is_flag=True,
help="Output the generated Dockerfile and metadata.json",
)
@click.option(
"--memory",
callback=_validate_memory,
help="Memory to allocate in Cloud Run, e.g. 1Gi",
)
@click.option(
"--cpu",
type=click.Choice(["1", "2", "4"]),
help="Number of vCPUs to allocate in Cloud Run",
)
@click.option(
"--timeout",
type=int,
help="Build timeout in seconds",
)
@click.option(
"--apt-get-install",
"apt_get_extras",
multiple=True,
help="Additional packages to apt-get install",
)
@click.option(
"--max-instances",
type=int,
default=1,
show_default=True,
help="Maximum Cloud Run instances (use 0 to remove the limit)",
)
@click.option(
"--min-instances",
type=int,
help="Minimum Cloud Run instances",
)
@click.option(
"--artifact-repository",
default="datasette",
show_default=True,
help="Artifact Registry repository to store the image",
)
@click.option(
"--artifact-region",
default="us",
show_default=True,
help="Artifact Registry location (region or multi-region)",
)
@click.option(
"--artifact-project",
default=None,
help="Project ID for Artifact Registry (defaults to the active project)",
)
def cloudrun(
files,
metadata,
@@ -33,7 +92,9 @@ def publish_subcommand(publish):
plugins_dir,
static,
install,
plugin_secret,
version_note,
secret,
title,
license,
license_url,
@@ -44,7 +105,18 @@
name,
service,
spatialite,
show_files,
memory,
cpu,
timeout,
apt_get_extras,
max_instances,
min_instances,
artifact_repository,
artifact_region,
artifact_project,
):
"Publish databases to Datasette running on Cloud Run"
fail_if_publish_binary_not_installed(
"gcloud", "Google Cloud", "https://cloud.google.com/sdk/"
)
@@ -52,6 +124,72 @@ def publish_subcommand(publish):
"gcloud config get-value project", shell=True, universal_newlines=True
).strip()
artifact_project = artifact_project or project
# Ensure Artifact Registry exists for the target image
_ensure_artifact_registry(
artifact_project=artifact_project,
artifact_region=artifact_region,
artifact_repository=artifact_repository,
)
artifact_host = (
artifact_region
if artifact_region.endswith("-docker.pkg.dev")
else f"{artifact_region}-docker.pkg.dev"
)
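The host computation above accepts either a bare Artifact Registry location or an already-complete registry host. A standalone sketch:

```python
# Mirrors the artifact_host expression above: accept a bare location
# ("us", "us-central1") or a full "<location>-docker.pkg.dev" host
def artifact_host_for(artifact_region):
    return (
        artifact_region
        if artifact_region.endswith("-docker.pkg.dev")
        else f"{artifact_region}-docker.pkg.dev"
    )


host = artifact_host_for("us")
same = artifact_host_for("us-docker.pkg.dev")  # passed through unchanged
```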
if not service:
# Show the user their current services, then prompt for one
click.echo("Please provide a service name for this deployment\n")
click.echo("Using an existing service name will over-write it")
click.echo("")
existing_services = get_existing_services()
if existing_services:
click.echo("Your existing services:\n")
for existing_service in existing_services:
click.echo(
" {name} - created {created} - {url}".format(
**existing_service
)
)
click.echo("")
service = click.prompt("Service name", type=str)
image_id = (
f"{artifact_host}/{artifact_project}/"
f"{artifact_repository}/datasette-{service}"
)
extra_metadata = {
"title": title,
"license": license,
"license_url": license_url,
"source": source,
"source_url": source_url,
"about": about,
"about_url": about_url,
}
if not extra_options:
extra_options = ""
if "force_https_urls" not in extra_options:
if extra_options:
extra_options += " "
extra_options += "--setting force_https_urls on"
environment_variables = {}
if plugin_secret:
extra_metadata["plugins"] = {}
for plugin_name, plugin_setting, setting_value in plugin_secret:
environment_variable = (
f"{plugin_name}_{plugin_setting}".upper().replace("-", "_")
)
environment_variables[environment_variable] = setting_value
extra_metadata["plugins"].setdefault(plugin_name, {})[
plugin_setting
] = {"$env": environment_variable}
with temporary_docker_directory(
files,
name,
@@ -64,21 +202,112 @@
install,
spatialite,
version_note,
-{
-"title": title,
-"license": license,
-"license_url": license_url,
-"source": source,
-"source_url": source_url,
-"about": about,
-"about_url": about_url,
-},
+secret,
+extra_metadata,
+environment_variables,
+apt_get_extras=apt_get_extras,
):
image_id = "gcr.io/{project}/{name}".format(project=project, name=name) if show_files:
check_call("gcloud builds submit --tag {}".format(image_id), shell=True) if os.path.exists("metadata.json"):
print("=== metadata.json ===\n")
with open("metadata.json") as fp:
print(fp.read())
print("\n==== Dockerfile ====\n")
with open("Dockerfile") as fp:
print(fp.read())
print("\n====================\n")
check_call(
"gcloud builds submit --tag {}{}".format(
image_id, " --timeout {}".format(timeout) if timeout else ""
),
shell=True,
)
extra_deploy_options = []
for option, value in (
("--memory", memory),
("--cpu", cpu),
("--max-instances", max_instances),
("--min-instances", min_instances),
):
if value is not None:
extra_deploy_options.append("{} {}".format(option, value))
check_call(
-"gcloud beta run deploy --allow-unauthenticated --image {}{}".format(
-image_id, " {}".format(service) if service else ""
+"gcloud run deploy --allow-unauthenticated --platform=managed --image {} {}{}".format(
+image_id,
+service,
+" " + " ".join(extra_deploy_options) if extra_deploy_options else "",
),
shell=True,
)
def _ensure_artifact_registry(artifact_project, artifact_region, artifact_repository):
"""Ensure Artifact Registry API is enabled and the repository exists."""
enable_cmd = (
"gcloud services enable artifactregistry.googleapis.com "
f"--project {artifact_project} --quiet"
)
try:
check_call(enable_cmd, shell=True)
except CalledProcessError as exc:
raise click.ClickException(
"Failed to enable artifactregistry.googleapis.com. "
"Please ensure you have permissions to manage services."
) from exc
describe_cmd = (
"gcloud artifacts repositories describe {repo} --project {project} "
"--location {location} --quiet"
).format(
repo=artifact_repository,
project=artifact_project,
location=artifact_region,
)
try:
check_call(describe_cmd, shell=True)
return
except CalledProcessError:
create_cmd = (
"gcloud artifacts repositories create {repo} --repository-format=docker "
'--location {location} --project {project} --description "Datasette Cloud Run images" --quiet'
).format(
repo=artifact_repository,
location=artifact_region,
project=artifact_project,
)
try:
check_call(create_cmd, shell=True)
click.echo(f"Created Artifact Registry repository '{artifact_repository}'")
except CalledProcessError as exc:
raise click.ClickException(
"Failed to create Artifact Registry repository. "
"Use --artifact-repository/--artifact-region to point to an existing repo "
"or create one manually."
) from exc
def get_existing_services():
services = json.loads(
check_output(
"gcloud run services list --platform=managed --format json",
shell=True,
universal_newlines=True,
)
)
return [
{
"name": service["metadata"]["name"],
"created": service["metadata"]["creationTimestamp"],
"url": service["status"]["address"]["url"],
}
for service in services
if "url" in service["status"]
]
def _validate_memory(ctx, param, value):
if value and re.match(r"^\d+(Gi|G|Mi|M)$", value) is None:
raise click.BadParameter("--memory should be a number then Gi/G/Mi/M e.g 1Gi")
return value
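The `--memory` callback above accepts an integer followed by a Kubernetes-style unit suffix. A standalone sketch of the same regex check:

```python
import re

# Mirrors the _validate_memory callback above: an integer followed by
# Gi/G/Mi/M; empty values pass through (the option was not supplied)
def valid_memory(value):
    return not value or re.match(r"^\d+(Gi|G|Mi|M)$", value) is not None


ok = valid_memory("1Gi")
also_ok = valid_memory("512Mi")
bad = valid_memory("1.5Gi")  # decimals are rejected by this pattern
```

Note the pattern only allows whole numbers, so a value like `1.5Gi` must be expressed as `1536Mi` instead.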


@@ -1,5 +1,6 @@
from ..utils import StaticMount
import click
import os
import shutil
import sys
@@ -12,13 +13,13 @@ def add_common_publish_arguments_and_options(subcommand):
"-m",
"--metadata",
type=click.File(mode="r"),
-help="Path to JSON file containing metadata to publish",
+help="Path to JSON/YAML file containing metadata to publish",
),
click.option(
"--extra-options", help="Extra options to pass to datasette serve"
),
click.option(
-"--branch", help="Install datasette from a GitHub branch e.g. master"
+"--branch", help="Install datasette from a GitHub branch e.g. main"
),
click.option(
"--template-dir",
@@ -33,7 +34,7 @@ def add_common_publish_arguments_and_options(subcommand):
click.option(
"--static",
type=StaticMount(),
-help="mountpoint:path-to-directory for serving static files",
+help="Serve static files from this directory at /MOUNT/...",
multiple=True,
),
click.option(
@@ -41,9 +42,23 @@
help="Additional packages (e.g. plugins) to install",
multiple=True,
),
click.option(
"--plugin-secret",
nargs=3,
type=(str, str, str),
callback=validate_plugin_secret,
multiple=True,
help="Secrets to pass to plugins, e.g. --plugin-secret datasette-auth-github client_id xxx",
),
click.option(
"--version-note", help="Additional note to show on /-/versions"
),
click.option(
"--secret",
help="Secret used for signing secure values, such as signed cookies",
envvar="DATASETTE_PUBLISH_SECRET",
default=lambda: os.urandom(32).hex(),
),
click.option("--title", help="Title for metadata"), click.option("--title", help="Title for metadata"),
click.option("--license", help="License label for metadata"), click.option("--license", help="License label for metadata"),
click.option("--license_url", help="License URL for metadata"), click.option("--license_url", help="License URL for metadata"),
@@ -70,9 +85,14 @@ def fail_if_publish_binary_not_installed(binary, publish_target, install_link):
err=True,
)
click.echo(
-"Follow the instructions at {install_link}".format(
-install_link=install_link
-),
+f"Follow the instructions at {install_link}",
err=True,
)
sys.exit(1)
def validate_plugin_secret(ctx, param, value):
for plugin_name, plugin_setting, setting_value in value:
if "'" in setting_value:
raise click.BadParameter("--plugin-secret cannot contain single quotes")
return value


@@ -3,7 +3,9 @@ from datasette import hookimpl
import click
import json
import os
+import pathlib
import shlex
+import shutil
from subprocess import call, check_output
import tempfile
@@ -11,7 +13,7 @@ from .common import (
add_common_publish_arguments_and_options,
fail_if_publish_binary_not_installed,
)
-from datasette.utils import link_or_copy, link_or_copy_directory
+from datasette.utils import link_or_copy, link_or_copy_directory, parse_metadata
@hookimpl
@@ -24,6 +26,15 @@ def publish_subcommand(publish):
default="datasette",
help="Application name to use when deploying",
)
@click.option(
"--tar",
help="--tar option to pass to Heroku, e.g. --tar=/usr/local/bin/gtar",
)
@click.option(
"--generate-dir",
type=click.Path(dir_okay=True, file_okay=False),
help="Output generated application files and stop without deploying",
)
def heroku(
files,
metadata,
@@ -33,7 +44,9 @@ def publish_subcommand(publish):
plugins_dir,
static,
install,
plugin_secret,
version_note,
secret,
title,
license,
license_url,
@@ -42,7 +55,10 @@
about,
about_url,
name,
tar,
generate_dir,
):
"Publish databases to Datasette running on Heroku"
fail_if_publish_binary_not_installed(
"heroku", "Heroku", "https://cli.heroku.com"
)
@@ -61,6 +77,28 @@
)
call(["heroku", "plugins:install", "heroku-builds"])
extra_metadata = {
"title": title,
"license": license,
"license_url": license_url,
"source": source,
"source_url": source_url,
"about": about,
"about_url": about_url,
}
environment_variables = {}
if plugin_secret:
extra_metadata["plugins"] = {}
for plugin_name, plugin_setting, setting_value in plugin_secret:
environment_variable = (
f"{plugin_name}_{plugin_setting}".upper().replace("-", "_")
)
environment_variables[environment_variable] = setting_value
extra_metadata["plugins"].setdefault(plugin_name, {})[
plugin_setting
] = {"$env": environment_variable}
with temporary_heroku_directory(
files,
name,
@@ -72,16 +110,19 @@
static,
install,
version_note,
-{
-"title": title,
-"license": license,
-"license_url": license_url,
-"source": source,
-"source_url": source_url,
-"about": about,
-"about_url": about_url,
-},
+secret,
+extra_metadata,
):
if generate_dir:
# Recursively copy files from current working directory to it
if pathlib.Path(generate_dir).exists():
raise click.ClickException("Directory already exists")
shutil.copytree(".", generate_dir)
click.echo(
f"Generated files written to {generate_dir}, stopping without deploying",
err=True,
)
return
app_name = None
if name:
# Check to see if this app already exists
@@ -104,7 +145,15 @@
create_output = check_output(cmd).decode("utf8")
app_name = json.loads(create_output)["name"]
-call(["heroku", "builds:create", "-a", app_name, "--include-vcs-ignore"])
+for key, value in environment_variables.items():
call(["heroku", "config:set", "-a", app_name, f"{key}={value}"])
tar_option = []
if tar:
tar_option = ["--tar", tar]
call(
["heroku", "builds:create", "-a", app_name, "--include-vcs-ignore"]
+ tar_option
)
@contextmanager
@@ -119,6 +168,7 @@ def temporary_heroku_directory(
static,
install,
version_note,
+secret,
extra_metadata=None,
):
extra_metadata = extra_metadata or {}
@@ -129,7 +179,7 @@ def temporary_heroku_directory(
file_names = [os.path.split(f)[-1] for f in files]
if metadata:
-metadata_content = json.load(metadata)
+metadata_content = parse_metadata(metadata.read())
else:
metadata_content = {}
for key, value in extra_metadata.items():
@@ -140,24 +190,24 @@ def temporary_heroku_directory(
os.chdir(tmp.name)
if metadata_content:
-open("metadata.json", "w").write(json.dumps(metadata_content, indent=2))
+with open("metadata.json", "w") as fp:
+fp.write(json.dumps(metadata_content, indent=2))
-open("runtime.txt", "w").write("python-3.6.8")
+with open("runtime.txt", "w") as fp:
+fp.write("python-3.11.0")
if branch:
install = [
-"https://github.com/simonw/datasette/archive/{branch}.zip".format(
-branch=branch
-)
+f"https://github.com/simonw/datasette/archive/{branch}.zip"
] + list(install)
else:
install = ["datasette"] + list(install)
-open("requirements.txt", "w").write("\n".join(install))
+with open("requirements.txt", "w") as fp:
+fp.write("\n".join(install))
os.mkdir("bin")
-open("bin/post_compile", "w").write(
-"datasette inspect --inspect-file inspect-data.json"
-)
+with open("bin/post_compile", "w") as fp:
+fp.write("datasette inspect --inspect-file inspect-data.json")
extras = []
if template_dir:
@@ -181,7 +231,7 @@ def temporary_heroku_directory(
link_or_copy_directory(
os.path.join(saved_cwd, path), os.path.join(tmp.name, mount_point)
)
-extras.extend(["--static", "{}:{}".format(mount_point, mount_point)])
+extras.extend(["--static", f"{mount_point}:{mount_point}"])
quoted_files = " ".join(
["-i {}".format(shlex.quote(file_name)) for file_name in file_names]
@@ -189,7 +239,8 @@ def temporary_heroku_directory(
procfile_cmd = "web: datasette serve --host 0.0.0.0 {quoted_files} --cors --port $PORT --inspect-file inspect-data.json {extras}".format(
quoted_files=quoted_files, extras=" ".join(extras)
)
-open("Procfile", "w").write(procfile_cmd)
+with open("Procfile", "w") as fp:
+fp.write(procfile_cmd)
for path, filename in zip(file_paths, file_names):
link_or_copy(path, os.path.join(tmp.name, filename))


@@ -1,101 +0,0 @@
from datasette import hookimpl
import click
import json
from subprocess import run, PIPE
from .common import (
add_common_publish_arguments_and_options,
fail_if_publish_binary_not_installed,
)
from ..utils import temporary_docker_directory
@hookimpl
def publish_subcommand(publish):
@publish.command()
@add_common_publish_arguments_and_options
@click.option(
"-n",
"--name",
default="datasette",
help="Application name to use when deploying",
)
@click.option("--force", is_flag=True, help="Pass --force option to now")
@click.option("--token", help="Auth token to use for deploy")
@click.option("--alias", multiple=True, help="Desired alias e.g. yoursite.now.sh")
@click.option("--spatialite", is_flag=True, help="Enable SpatialLite extension")
def nowv1(
files,
metadata,
extra_options,
branch,
template_dir,
plugins_dir,
static,
install,
version_note,
title,
license,
license_url,
source,
source_url,
about,
about_url,
name,
force,
token,
alias,
spatialite,
):
fail_if_publish_binary_not_installed("now", "Zeit Now", "https://zeit.co/now")
if extra_options:
extra_options += " "
else:
extra_options = ""
extra_options += "--config force_https_urls:on"
with temporary_docker_directory(
files,
name,
metadata,
extra_options,
branch,
template_dir,
plugins_dir,
static,
install,
spatialite,
version_note,
{
"title": title,
"license": license,
"license_url": license_url,
"source": source,
"source_url": source_url,
"about": about,
"about_url": about_url,
},
):
now_json = {"version": 1}
open("now.json", "w").write(json.dumps(now_json, indent=4))
args = []
if force:
args.append("--force")
if token:
args.append("--token={}".format(token))
if args:
done = run(["now"] + args, stdout=PIPE)
else:
done = run("now", stdout=PIPE)
deployment_url = done.stdout
if alias:
# I couldn't get --target=production working, so I call
# 'now alias' with arguments directly instead - but that
# means I need to figure out what URL it was deployed to.
for single_alias in alias: # --alias can be specified multiple times
args = ["now", "alias", deployment_url, single_alias]
if token:
args.append("--token={}".format(token))
run(args)
else:
print(deployment_url.decode("latin1"))


@@ -4,7 +4,9 @@ from datasette.utils import (
remove_infinites,
CustomJSONEncoder,
path_from_row_pks,
+sqlite3,
)
+from datasette.utils.asgi import Response
def convert_specific_columns_to_json(rows, columns, json_cols):
@@ -18,21 +20,21 @@ def convert_specific_columns_to_json(rows, columns, json_cols):
if column in json_cols:
try:
value = json.loads(value)
-except (TypeError, ValueError) as e:
-print(e)
+except (TypeError, ValueError):
pass
new_row.append(value)
new_rows.append(new_row)
return new_rows
-def json_renderer(args, data, view_name):
-""" Render a response as JSON """
+def json_renderer(request, args, data, error, truncated=None):
+"""Render a response as JSON"""
status_code = 200
# Handle the _json= parameter which may modify data["rows"]
json_cols = []
if "_json" in args:
-json_cols = args["_json"]
+json_cols = args.getlist("_json")
if json_cols and "rows" in data and "columns" in data:
data["rows"] = convert_specific_columns_to_json(
data["rows"], data["columns"], json_cols
@@ -43,22 +45,38 @@ def json_renderer(args, data, view_name):
data["rows"] = [remove_infinites(row) for row in data["rows"]]
# Deal with the _shape option
-shape = args.get("_shape", "arrays")
+shape = args.get("_shape", "objects")
# if there's an error, ignore the shape entirely
data["ok"] = True
if error:
shape = "objects"
status_code = 400
data["error"] = error
data["ok"] = False
if truncated is not None:
data["truncated"] = truncated
if shape == "arrayfirst": if shape == "arrayfirst":
data = [row[0] for row in data["rows"]] if not data["rows"]:
data = []
elif isinstance(data["rows"][0], sqlite3.Row):
data = [row[0] for row in data["rows"]]
else:
assert isinstance(data["rows"][0], dict)
data = [next(iter(row.values())) for row in data["rows"]]
elif shape in ("objects", "object", "array"):
columns = data.get("columns")
rows = data.get("rows")
-if rows and columns:
+if rows and columns and not isinstance(rows[0], dict):
data["rows"] = [dict(zip(columns, row)) for row in rows]
if shape == "object": if shape == "object":
error = None shape_error = None
if "primary_keys" not in data: if "primary_keys" not in data:
error = "_shape=object is only available on tables" shape_error = "_shape=object is only available on tables"
else: else:
pks = data["primary_keys"] pks = data["primary_keys"]
if not pks: if not pks:
error = ( shape_error = (
"_shape=object not available for tables with no primary keys" "_shape=object not available for tables with no primary keys"
) )
else: else:
@ -67,26 +85,41 @@ def json_renderer(args, data, view_name):
pk_string = path_from_row_pks(row, pks, not pks) pk_string = path_from_row_pks(row, pks, not pks)
object_rows[pk_string] = row object_rows[pk_string] = row
data = object_rows data = object_rows
if error: if shape_error:
data = {"ok": False, "error": error} data = {"ok": False, "error": shape_error}
elif shape == "array": elif shape == "array":
data = data["rows"] data = data["rows"]
elif shape == "arrays": elif shape == "arrays":
pass if not data["rows"]:
pass
elif isinstance(data["rows"][0], sqlite3.Row):
data["rows"] = [list(row) for row in data["rows"]]
else:
data["rows"] = [list(row.values()) for row in data["rows"]]
else: else:
status_code = 400 status_code = 400
data = { data = {
"ok": False, "ok": False,
"error": "Invalid _shape: {}".format(shape), "error": f"Invalid _shape: {shape}",
"status": 400, "status": 400,
"title": None, "title": None,
} }
# Don't include "columns" in output
# https://github.com/simonw/datasette/issues/2136
if isinstance(data, dict) and "columns" not in request.args.getlist("_extra"):
data.pop("columns", None)
# Handle _nl option for _shape=array # Handle _nl option for _shape=array
nl = args.get("_nl", "") nl = args.get("_nl", "")
if nl and shape == "array": if nl and shape == "array":
body = "\n".join(json.dumps(item) for item in data) body = "\n".join(json.dumps(item, cls=CustomJSONEncoder) for item in data)
content_type = "text/plain" content_type = "text/plain"
else: else:
body = json.dumps(data, cls=CustomJSONEncoder) body = json.dumps(data, cls=CustomJSONEncoder)
content_type = "application/json; charset=utf-8" content_type = "application/json; charset=utf-8"
return {"body": body, "status_code": status_code, "content_type": content_type} headers = {}
return Response(
body, status=status_code, headers=headers, content_type=content_type
)

datasette/resources.py (new file, 90 lines)

@@ -0,0 +1,90 @@
"""Core resource types for Datasette's permission system."""
from datasette.permissions import Resource


class DatabaseResource(Resource):
    """A database in Datasette."""

    name = "database"
    parent_class = None  # Top of the resource hierarchy

    def __init__(self, database: str):
        super().__init__(parent=database, child=None)

    @classmethod
    async def resources_sql(cls, datasette) -> str:
        return """
            SELECT database_name AS parent, NULL AS child
            FROM catalog_databases
        """


class TableResource(Resource):
    """A table in a database."""

    name = "table"
    parent_class = DatabaseResource

    def __init__(self, database: str, table: str):
        super().__init__(parent=database, child=table)

    @classmethod
    async def resources_sql(cls, datasette) -> str:
        return """
            SELECT database_name AS parent, table_name AS child
            FROM catalog_tables
            UNION ALL
            SELECT database_name AS parent, view_name AS child
            FROM catalog_views
        """


class QueryResource(Resource):
    """A canned query in a database."""

    name = "query"
    parent_class = DatabaseResource

    def __init__(self, database: str, query: str):
        super().__init__(parent=database, child=query)

    @classmethod
    async def resources_sql(cls, datasette) -> str:
        from datasette.plugins import pm
        from datasette.utils import await_me_maybe

        # Get all databases from catalog
        db = datasette.get_internal_database()
        result = await db.execute("SELECT database_name FROM catalog_databases")
        databases = [row[0] for row in result.rows]
        # Gather all canned queries from all databases
        query_pairs = []
        for database_name in databases:
            # Call the hook to get queries (including from config via default plugin)
            for queries_result in pm.hook.canned_queries(
                datasette=datasette,
                database=database_name,
                actor=None,  # Get ALL queries for resource enumeration
            ):
                queries = await await_me_maybe(queries_result)
                if queries:
                    for query_name in queries.keys():
                        query_pairs.append((database_name, query_name))
        # Build SQL
        if not query_pairs:
            return "SELECT NULL AS parent, NULL AS child WHERE 0"
        # Generate UNION ALL query
        selects = []
        for db_name, query_name in query_pairs:
            # Escape single quotes by doubling them
            db_escaped = db_name.replace("'", "''")
            query_escaped = query_name.replace("'", "''")
            selects.append(
                f"SELECT '{db_escaped}' AS parent, '{query_escaped}' AS child"
            )
        return " UNION ALL ".join(selects)
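The string-building tail of `QueryResource.resources_sql` works without any of the surrounding machinery, so it can be tested standalone. `pairs_to_sql` below is a hypothetical extraction of just that logic; doubling single quotes is the standard SQL string-literal escape:

```python
def pairs_to_sql(query_pairs):
    # Standalone sketch of the UNION ALL construction above.
    if not query_pairs:
        # A SELECT that matches zero rows, keeping the column shape
        return "SELECT NULL AS parent, NULL AS child WHERE 0"
    selects = []
    for db_name, query_name in query_pairs:
        # Escape single quotes by doubling them (SQL literal escape)
        db_escaped = db_name.replace("'", "''")
        query_escaped = query_name.replace("'", "''")
        selects.append(
            f"SELECT '{db_escaped}' AS parent, '{query_escaped}' AS child"
        )
    return " UNION ALL ".join(selects)
```

Because the database and query names come from trusted configuration and hooks rather than request input, literal interpolation with quote doubling is sufficient here.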


@@ -0,0 +1,7 @@
from datasette import hookimpl
from datasette.utils import escape_fts


@hookimpl
def prepare_connection(conn):
    conn.create_function("escape_fts", 1, escape_fts)
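The `prepare_connection` hook registers a Python function as a custom SQL function on each SQLite connection. The sketch below shows the same `create_function` mechanism with the standard library alone; the `escape_fts` body here is a stand-in that quotes each token, not Datasette's actual implementation:

```python
import sqlite3

def escape_fts(query):
    # Stand-in for datasette.utils.escape_fts: wrap each token in
    # double quotes so FTS treats it as a literal term.
    return " ".join('"{}"'.format(t.replace('"', '""')) for t in query.split())

conn = sqlite3.connect(":memory:")
# Same registration call the hook performs on each new connection
conn.create_function("escape_fts", 1, escape_fts)
result = conn.execute("SELECT escape_fts(?)", ["hello world"]).fetchone()[0]
```

Once registered, `escape_fts(...)` can be called from any SQL executed on that connection, which is how Datasette makes it available inside user queries.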


@@ -1,3 +1,69 @@
/* Reset and Page Setup ==================================================== */
/* Reset from http://meyerweb.com/eric/tools/css/reset/
v2.0 | 20110126
License: none (public domain)
*/
html, body, div, span, applet, object, iframe,
h1, h2, h3, h4, h5, h6, p, blockquote, pre,
a, abbr, acronym, address, big, cite, code,
del, dfn, em, img, ins, kbd, q, s, samp,
small, strike, strong, sub, sup, tt, var,
b, u, i, center,
dl, dt, dd, ol, ul, li,
fieldset, form, label, legend,
table, caption, tbody, tfoot, thead, tr, th, td,
article, aside, canvas, details, embed,
figure, figcaption, footer, header, hgroup,
menu, nav, output, ruby, section, summary,
time, mark, audio, video {
margin: 0;
padding: 0;
border: 0;
font-size: 100%;
font: inherit;
vertical-align: baseline;
}
/* HTML5 display-role reset for older browsers */
article, aside, details, figcaption, figure,
footer, header, hgroup, menu, nav, section {
display: block;
}
body {
line-height: 1;
}
ol,
ul {
list-style: none;
}
blockquote,
q {
quotes: none;
}
blockquote:before,
blockquote:after,
q:before,
q:after {
content: '';
content: none;
}
table {
border-collapse: collapse;
border-spacing: 0;
}
th {
padding-right: 1em;
white-space: nowrap;
}
strong {
font-weight: bold;
}
em {
font-style: italic;
}
/* end reset */
body {
margin: 0;
padding: 0;
@@ -5,109 +71,411 @@ body {
font-size: 1rem;
font-weight: 400;
line-height: 1.5;
color: #111A35;
text-align: left;
background-color: #F8FAFB;
}
/* Helper Styles ===========================================================*/
.intro {
font-size: 1rem;
}
.metadata-description {
margin-bottom: 1em;
}
p {
margin: 0 0 0.75rem 0;
padding: 0;
}
.meta {
color: rgba(0,0,0,0.3);
font-size: 0.75rem
}
.intro {
font-size: 1.5rem;
margin-bottom: 0.75rem;
}
.context-text {
/* for accessibility and hidden from sight */
text-indent: -999em;
display: block;
width:0;
overflow: hidden;
margin: 0;
padding: 0;
line-height: 0;
}
h1,
h2,
h3,
h4,
h5,
h6,
.header1,
.header2,
.header3,
.header4,
.header5,
.header6 {
font-weight: 700;
font-size: 1rem;
margin: 0;
padding: 0;
word-break: break-word;
}
h1,
.header1 {
font-size: 2rem;
margin-bottom: 0.75rem;
margin-top: 1rem;
}
h2,
.header2 {
font-size: 1.5rem;
margin-bottom: 0.75rem;
margin-top: 1rem;
}
h3,
.header3 {
font-size: 1.25rem;
margin: 1rem 0 0.25rem 0;
}
h4,
.header4 {
margin: 1rem 0 0.25rem 0;
font-weight: 400;
text-decoration: underline;
}
h5,
.header5 {
margin: 1rem 0 0.25rem 0;
font-weight: 700;
text-decoration: underline;
}
h6,
.header6 {
margin: 1rem 0 0.25rem 0;
font-weight: 400;
font-style: italic;
text-decoration: underline;
}
.page-header {
padding-left: 10px;
border-left: 10px solid #666;
margin-bottom: 0.75rem;
margin-top: 1rem;
}
.page-header h1 {
margin: 0;
font-size: 2rem;
padding-right: 0.2em;
}
.page-action-menu details > summary {
list-style: none;
cursor: pointer;
}
.page-action-menu details > summary::-webkit-details-marker {
display: none;
}
div,
section,
article,
header,
nav,
footer,
.wrapper {
display: block;
box-sizing: border-box;
}
a:link {
color: #276890;
text-decoration: underline;
}
a:visited {
color: #54AC8E;
text-decoration: underline;
}
a:hover,
a:focus,
a:active {
color: #67C98D;
text-decoration: underline;
}
button.button-as-link {
background: none;
border: none;
padding: 0;
color: #276890;
text-decoration: underline;
cursor: pointer;
font-size: 1rem;
}
button.button-as-link:hover,
button.button-as-link:focus {
color: #67C98D;
}
code,
pre {
font-family: monospace;
}
ul.bullets,
ul.tight-bullets,
ul.spaced,
ol.spaced {
margin-bottom: 0.8rem;
}
ul.bullets,
ul.tight-bullets {
padding-left: 1.25rem;
}
ul.bullets li,
ul.spaced li,
ol.spaced li {
margin-bottom: 0.4rem;
}
ul.bullets li {
list-style-type: circle;
}
ul.tight-bullets li {
list-style-type: disc;
margin-bottom: 0;
word-break: break-all;
}
a.not-underlined {
text-decoration: none;
}
.not-underlined .underlined {
text-decoration: underline;
}
/* Page Furniture ========================================================= */
/* Header */
header.hd,
footer.ft {
padding: 0.6rem 1rem 0.5rem 1rem;
background-color: #276890;
background: linear-gradient(180deg, rgba(96,144,173,1) 0%, rgba(39,104,144,1) 50%);
color: rgba(255,255,244,0.9);
overflow: hidden;
box-sizing: border-box;
min-height: 2.6rem;
}
footer.ft {
margin-top: 1rem;
}
header.hd p,
footer.ft p {
margin: 0;
padding: 0;
}
header.hd .crumbs {
float: left;
}
header.hd .actor {
float: right;
text-align: right;
padding-left: 1rem;
padding-right: 1rem;
position: relative;
top: -3px;
}
footer.ft a:link,
footer.ft a:visited,
footer.ft a:hover,
footer.ft a:focus,
footer.ft a:active,
footer.ft button.button-as-link {
color: rgba(255,255,244,0.8);
}
header.hd a:link,
header.hd a:visited,
header.hd a:hover,
header.hd a:focus,
header.hd a:active,
header.hd button.button-as-link {
color: rgba(255,255,244,0.8);
text-decoration: none;
}
footer.ft a:hover,
footer.ft a:focus,
footer.ft a:active,
footer.ft .button-as-link:hover,
footer.ft .button-as-link:focus,
header.hd a:hover,
header.hd a:focus,
header.hd a:active,
button.button-as-link:hover,
button.button-as-link:focus {
color: rgba(255,255,244,1);
}
/* Body */
section.content {
margin: 0 1rem;
}
/* Navigation menu */
details.nav-menu > summary {
list-style: none;
display: inline;
float: right;
position: relative;
cursor: pointer;
}
details.nav-menu > summary::-webkit-details-marker {
display: none;
}
details .nav-menu-inner {
position: absolute;
top: 2.6rem;
right: 10px;
width: 180px;
background-color: #276890;
z-index: 1000;
padding: 0;
}
.nav-menu-inner li,
form.nav-menu-logout {
padding: 0.3rem 0.5rem;
border-top: 1px solid #ffffff69;
}
.nav-menu-inner a {
display: block;
}
/* Table/database actions menu */
.page-action-menu {
position: relative;
margin-bottom: 0.5em;
}
.actions-menu-links {
display: inline;
}
.actions-menu-links .dropdown-menu {
position: absolute;
top: calc(100% + 10px);
left: 0;
z-index: 10000;
}
.page-action-menu .icon-text {
display: inline-flex;
align-items: center;
border-radius: .25rem;
padding: 5px 12px 3px 7px;
color: #fff;
font-weight: 400;
font-size: 0.8em;
background: linear-gradient(180deg, #007bff 0%, #4E79C7 100%);
border-color: #007bff;
}
.page-action-menu .icon-text span {
/* Nudge text up a bit */
position: relative;
top: -2px;
}
.page-action-menu .icon-text:hover {
cursor: pointer;
}
.page-action-menu .icon {
width: 18px;
height: 18px;
margin-right: 4px;
}
/* Components ============================================================== */
h2 em {
font-style: normal;
font-weight: lighter;
}
/* Messages */
.message-info,
.message-warning,
.message-error {
padding: 1rem;
margin-bottom: 1rem;
background-color: rgba(103,201,141,0.3);
}
.message-warning {
background-color: rgba(245,166,35,0.3);
}
.message-error {
background-color: rgba(208,2,27,0.3);
}
.pattern-heading {
padding: 1rem;
margin-top: 2rem;
border-top: 1px solid rgba(208,2,27,0.8);
border-bottom: 1px solid rgba(208,2,27,0.8);
background-color: rgba(208,2,27,0.2)
}
/* URL arguments */
.extra-wheres ul,
.extra-wheres li {
list-style-type: none;
padding: 0;
margin: 0;
}
.wrapped-sql {
white-space: pre-wrap;
margin: 1rem 0;
font-family: monospace;
}
/* Tables ================================================================== */
.table-wrapper {
overflow-x: auto;
}
table.rows-and-columns {
border-collapse: collapse;
}
table.rows-and-columns td {
border-top: 1px solid #aaa;
border-right: 1px solid #eee;
padding: 4px;
vertical-align: top;
white-space: pre-wrap;
}
table.rows-and-columns td.type-pk {
font-weight: bold;
}
table.rows-and-columns td em {
font-style: normal;
font-size: 0.8em;
color: #aaa;
}
table.rows-and-columns th {
padding-right: 1em;
}
table.rows-and-columns a:link {
text-decoration: none;
}
.rows-and-columns td ol,
.rows-and-columns td ul {
list-style: initial;
list-style-position: inside;
}
a.blob-download {
display: inline-block;
}
.db-table p {
margin-top: 0;
margin-bottom: 0.3em;
@@ -117,15 +485,8 @@ table a:visited {
margin-bottom: 0;
}
/* Forms =================================================================== */
form.sql textarea {
border: 1px solid #ccc;
width: 70%;
@@ -134,24 +495,30 @@ form.sql textarea {
font-family: monospace;
font-size: 1.3em;
}
form.sql label {
width: 15%;
}
.advanced-export input[type=submit] {
font-size: 0.6em;
margin-left: 1em;
}
label.sort_by_desc {
padding-right: 1em;
}
pre#sql-query {
margin-bottom: 1em;
}
.core label,
label.core {
font-weight: bold;
display: inline-block;
}
.core input[type=text],
input.core[type=text],
.core input[type=search],
input.core[type=search] {
border: 1px solid #ccc;
border-radius: 3px;
width: 60%;
@@ -160,27 +527,54 @@ form input[type=search] {
font-size: 1em;
font-family: Helvetica, sans-serif;
}
.core input[type=search],
input.core[type=search] {
/* Stop Webkit from styling search boxes in an inconsistent way */
/* https://css-tricks.com/webkit-html5-search-inputs/ comments */
-webkit-appearance: textfield;
}
.core input[type="search"]::-webkit-search-decoration,
input.core[type="search"]::-webkit-search-decoration,
.core input[type="search"]::-webkit-search-cancel-button,
input.core[type="search"]::-webkit-search-cancel-button,
.core input[type="search"]::-webkit-search-results-button,
input.core[type="search"]::-webkit-search-results-button,
.core input[type="search"]::-webkit-search-results-decoration,
input.core[type="search"]::-webkit-search-results-decoration {
display: none;
}
.core input[type=submit],
.core button[type=button],
input.core[type=submit],
button.core[type=button] {
font-weight: 400;
cursor: pointer;
text-align: center;
vertical-align: middle;
border-width: 1px;
border-style: solid;
padding: .5em 0.8em;
font-size: 0.9rem;
line-height: 1;
border-radius: .25rem;
}
.core input[type=submit],
input.core[type=submit] {
color: #fff;
background: linear-gradient(180deg, #007bff 0%, #4E79C7 100%);
border-color: #007bff;
-webkit-appearance: button;
}
.core button[type=button],
button.core[type=button] {
color: #007bff;
background-color: #fff;
border-color: #007bff;
}
.filter-row {
margin-bottom: 0.6em;
}
@@ -205,6 +599,9 @@ form input[type=submit] {
display: inline-block;
margin-right: 0.3em;
}
.select-wrapper:focus-within {
border: 1px solid black;
}
.select-wrapper.filter-op {
width: 80px;
}
@@ -253,27 +650,9 @@ form input[type=submit] {
font-size: 1em;
font-family: Helvetica, sans-serif;
}
.facet-results {
display: flex;
@@ -284,6 +663,11 @@ a.not-underlined {
width: 250px;
margin-right: 15px;
}
.facet-info-total {
font-size: 0.8em;
color: #666;
padding-right: 0.25em;
}
.facet-info li,
.facet-info ul {
margin: 0;
@@ -300,15 +684,228 @@ a.not-underlined {
.facet-info a.cross:active {
text-decoration: none;
}
ul li.facet-truncated {
list-style-type: none;
position: relative;
top: -0.35em;
text-indent: 0.85em;
}
.advanced-export {
margin-top: 1em;
padding: 0.01em 2em 0.01em 1em;
width: auto;
display: inline-block;
box-shadow: 1px 2px 8px 2px rgba(0,0,0,0.08);
background-color: white;
}
.download-sqlite em {
font-style: normal;
font-size: 0.8em;
}
p.zero-results {
border: 2px solid #ccc;
background-color: #eee;
padding: 0.5em;
font-style: italic;
}
/* Value types */
.type-float, .type-int {
color: #666;
}
/* Overrides ===============================================================*/
.small-screen-only,
.select-wrapper.small-screen-only {
display: none;
}
@media only screen and (max-width: 576px) {
.small-screen-only {
display: initial;
}
.select-wrapper.small-screen-only {
display: inline-block;
}
form.sql textarea {
width: 95%;
}
/* Force table to not be like tables anymore */
table.rows-and-columns,
.rows-and-columns thead,
.rows-and-columns tbody,
.rows-and-columns th,
.rows-and-columns td,
.rows-and-columns tr {
display: block;
}
/* Hide table headers (but not display: none;, for accessibility) */
.rows-and-columns thead tr {
position: absolute;
top: -9999px;
left: -9999px;
}
table.rows-and-columns tr {
border: 1px solid #ccc;
margin-bottom: 1em;
border-radius: 10px;
background-color: white;
padding: 0.2rem;
}
table.rows-and-columns td {
/* Behave like a "row" */
border: none;
border-bottom: 1px solid #eee;
padding: 0;
padding-left: 10%;
}
table.rows-and-columns td:before {
display: block;
color: black;
margin-left: -10%;
font-size: 0.8em;
}
.select-wrapper {
width: 100px;
}
.select-wrapper.filter-op {
width: 60px;
}
.filters input.filter-value {
width: 140px;
}
}
svg.dropdown-menu-icon {
display: inline-block;
position: relative;
top: 2px;
cursor: pointer;
opacity: 0.8;
}
.dropdown-menu {
border: 1px solid #ccc;
border-radius: 4px;
line-height: 1.4;
font-size: 16px;
box-shadow: 2px 2px 2px #aaa;
background-color: #fff;
z-index: 1000;
}
.dropdown-menu ul,
.dropdown-menu li {
list-style-type: none;
margin: 0;
padding: 0;
}
.dropdown-menu .dropdown-column-type {
font-size: 0.7em;
color: #666;
margin: 0;
padding: 4px 8px 4px 8px;
}
.dropdown-menu .dropdown-column-description {
margin: 0;
color: #666;
padding: 4px 8px 4px 8px;
max-width: 20em;
}
.dropdown-menu li {
border-bottom: 1px solid #ccc;
}
.dropdown-menu li:last-child {
border: none;
}
.dropdown-menu a:link,
.dropdown-menu a:visited,
.dropdown-menu a:hover,
.dropdown-menu a:focus,
.dropdown-menu a:active {
text-decoration: none;
display: block;
padding: 4px 8px 2px 8px;
color: #222;
white-space: nowrap;
}
.dropdown-menu a:hover {
background-color: #eee;
}
.dropdown-menu .dropdown-description {
margin: 0;
color: #666;
font-size: 0.8em;
max-width: 80vw;
white-space: normal;
}
.dropdown-menu .hook {
display: block;
position: absolute;
top: -5px;
left: 6px;
width: 0;
height: 0;
border-left: 5px solid transparent;
border-right: 5px solid transparent;
border-bottom: 5px solid #666;
}
.canned-query-edit-sql {
padding-left: 0.5em;
position: relative;
top: 1px;
}
.blob-download {
display: block;
white-space: nowrap;
padding-right: 20px;
position: relative;
background-image: url("data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHZpZXdCb3g9IjAgMCAxNiAxNiIgd2lkdGg9IjE2IiBoZWlnaHQ9IjE2Ij48cGF0aCBmaWxsLXJ1bGU9ImV2ZW5vZGQiIGQ9Ik03LjQ3IDEwLjc4YS43NS43NSAwIDAwMS4wNiAwbDMuNzUtMy43NWEuNzUuNzUgMCAwMC0xLjA2LTEuMDZMOC43NSA4LjQ0VjEuNzVhLjc1Ljc1IDAgMDAtMS41IDB2Ni42OUw0Ljc4IDUuOTdhLjc1Ljc1IDAgMDAtMS4wNiAxLjA2bDMuNzUgMy43NXpNMy43NSAxM2EuNzUuNzUgMCAwMDAgMS41aDguNWEuNzUuNzUgMCAwMDAtMS41aC04LjV6Ij48L3BhdGg+PC9zdmc+");
background-size: 16px 16px;
background-position: right;
background-repeat: no-repeat;
}
dl.column-descriptions dt {
font-weight: bold;
}
dl.column-descriptions dd {
padding-left: 1.5em;
white-space: pre-wrap;
line-height: 1.1em;
color: #666;
}
.anim-scale-in {
animation-name: scale-in;
animation-duration: 0.15s;
animation-timing-function: cubic-bezier(0.2, 0, 0.13, 1.5);
}
@keyframes scale-in {
0% {
opacity: 0;
transform: scale(0.6);
}
100% {
opacity: 1;
transform: scale(1);
}
}

File diff suppressed because one or more lines are too long


@@ -0,0 +1,74 @@
import { EditorView, basicSetup } from "codemirror";
import { keymap } from "@codemirror/view";
import { sql, SQLDialect } from "@codemirror/lang-sql";
// A variation of SQLite from lang-sql https://github.com/codemirror/lang-sql/blob/ebf115fffdbe07f91465ccbd82868c587f8182bc/src/sql.ts#L231
const SQLite = SQLDialect.define({
// Based on https://www.sqlite.org/lang_keywords.html based on likely keywords to be used in select queries
// https://github.com/simonw/datasette/pull/1893#issuecomment-1316401895:
keywords:
"and as asc between by case cast count current_date current_time current_timestamp desc distinct each else escape except exists explain filter first for from full generated group having if in index inner intersect into isnull join last left like limit not null or order outer over pragma primary query raise range regexp right rollback row select set table then to union unique using values view virtual when where",
// https://www.sqlite.org/datatype3.html
types: "null integer real text blob",
builtin: "",
operatorChars: "*+-%<>!=&|/~",
identifierQuotes: '`"',
specialVar: "@:?$",
});
// Utility function from https://codemirror.net/docs/migration/
export function editorFromTextArea(textarea, conf = {}) {
// This could also be configured with a set of tables and columns for better autocomplete:
// https://github.com/codemirror/lang-sql#user-content-sqlconfig.tables
let view = new EditorView({
doc: textarea.value,
extensions: [
keymap.of([
{
key: "Shift-Enter",
run: function () {
textarea.value = view.state.doc.toString();
textarea.form.submit();
return true;
},
},
{
key: "Meta-Enter",
run: function () {
textarea.value = view.state.doc.toString();
textarea.form.submit();
return true;
},
},
]),
// This has to be after the keymap or else the basicSetup keys will prevent
// Meta-Enter from running
basicSetup,
EditorView.lineWrapping,
sql({
dialect: SQLite,
schema: conf.schema,
tables: conf.tables,
defaultTableName: conf.defaultTableName,
defaultSchemaName: conf.defaultSchemaName,
}),
],
});
// Idea taken from https://discuss.codemirror.net/t/resizing-codemirror-6/3265.
// Using CSS resize: both and scheduling a measurement when the element changes.
let editorDOM = view.contentDOM.closest(".cm-editor");
let observer = new ResizeObserver(function () {
view.requestMeasure();
});
observer.observe(editorDOM, { attributes: true });
textarea.parentNode.insertBefore(view.dom, textarea);
textarea.style.display = "none";
if (textarea.form) {
textarea.form.addEventListener("submit", () => {
textarea.value = view.state.doc.toString();
});
}
return view;
}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because it is too large


@@ -0,0 +1,210 @@
// Custom events for use with the native CustomEvent API
const DATASETTE_EVENTS = {
INIT: "datasette_init", // returns datasette manager instance in evt.detail
};
// Datasette "core" -> Methods/APIs that are foundational
// Plugins will have greater stability if they use the functional hooks- but if they do decide to hook into
// literal DOM selectors, they'll have an easier time using these addresses.
const DOM_SELECTORS = {
/** Should have one match */
jsonExportLink: ".export-links a[href*=json]",
/** Event listeners that go outside of the main table, e.g. existing scroll listener */
tableWrapper: ".table-wrapper",
table: "table.rows-and-columns",
aboveTablePanel: ".above-table-panel",
// These could have multiple matches
/** Used for selecting table headers. Use makeColumnActions if you want to add menu items. */
tableHeaders: `table.rows-and-columns th`,
/** Used to add "where" clauses to query using direct manipulation */
filterRows: ".filter-row",
/** Used to show top available enum values for a column ("facets") */
facetResults: ".facet-results [data-column]",
};
/**
* Monolith class for interacting with Datasette JS API
* Imported with DEFER, runs after main document parsed
* For now, manually synced with datasette/version.py
*/
const datasetteManager = {
VERSION: window.datasetteVersion,
// TODO: Should order of registration matter more?
// Should plugins be allowed to clobber others or is it last-in takes priority?
// Does pluginMetadata need to be serializable, or can we let it be stateful / have functions?
plugins: new Map(),
registerPlugin: (name, pluginMetadata) => {
if (datasetteManager.plugins.has(name)) {
console.warn(`Warning -> plugin ${name} was redefined`);
}
datasetteManager.plugins.set(name, pluginMetadata);
// If the plugin participates in the panel... update the panel.
if (pluginMetadata.makeAboveTablePanelConfigs) {
datasetteManager.renderAboveTablePanel();
}
},
/**
* New DOM elements are created on each click, so the data is not stale.
*
* Items
* - must provide label (text)
* - might provide href (string) or an onclick ((evt) => void)
*
* columnMeta is metadata stored on the column header (TH) as a DOMStringMap
* - column: string
* - columnNotNull: boolean
* - columnType: sqlite datatype enum (text, number, etc)
* - isPk: boolean
*/
makeColumnActions: (columnMeta) => {
let columnActions = [];
// Accept function that returns list of columnActions with keys
// Required: label (text)
// Optional: onClick or href
datasetteManager.plugins.forEach((plugin) => {
if (plugin.makeColumnActions) {
// Plugins can provide multiple columnActions if they want
// If multiple try to create entry with same label, the last one deletes the others
columnActions.push(...plugin.makeColumnActions(columnMeta));
}
});
// TODO: Validate columnAction configs and give informative error message if missing keys.
return columnActions;
},
/**
* In MVP, each plugin can only have 1 instance.
* In future, panels could be repeated. We omit that for now since so many plugins depend on
* shared URL state, so having multiple instances of plugin at same time is problematic.
* Currently, we never destroy any panels, we just hide them.
*
* TODO: nicer panel css, show panel selection state.
* TODO: does this hook need to take any arguments?
*/
renderAboveTablePanel: () => {
const aboveTablePanel = document.querySelector(
DOM_SELECTORS.aboveTablePanel,
);
if (!aboveTablePanel) {
console.warn(
"This page does not have a table, the renderAboveTablePanel cannot be used.",
);
return;
}
let aboveTablePanelWrapper = aboveTablePanel.querySelector(".panels");
// First render: create wrappers. Otherwise, reuse previous.
if (!aboveTablePanelWrapper) {
aboveTablePanelWrapper = document.createElement("div");
aboveTablePanelWrapper.classList.add("tab-contents");
const panelNav = document.createElement("div");
panelNav.classList.add("tab-controls");
// Temporary: css for minimal amount of breathing room.
panelNav.style.display = "flex";
panelNav.style.gap = "8px";
panelNav.style.marginTop = "4px";
panelNav.style.marginBottom = "20px";
aboveTablePanel.appendChild(panelNav);
aboveTablePanel.appendChild(aboveTablePanelWrapper);
}
datasetteManager.plugins.forEach((plugin, pluginName) => {
const { makeAboveTablePanelConfigs } = plugin;
if (makeAboveTablePanelConfigs) {
const controls = aboveTablePanel.querySelector(".tab-controls");
const contents = aboveTablePanel.querySelector(".tab-contents");
// Each plugin can make multiple panels
const configs = makeAboveTablePanelConfigs();
configs.forEach((config, i) => {
const nodeContentId = `${pluginName}_${config.id}_panel-content`;
// quit if we've already registered this plugin
// TODO: look into whether plugins should be allowed to ask
// parent to re-render, or if they should manage that internally.
if (document.getElementById(nodeContentId)) {
return;
}
// Add tab control button
const pluginControl = document.createElement("button");
pluginControl.textContent = config.label;
pluginControl.onclick = () => {
contents.childNodes.forEach((node) => {
if (node.id === nodeContentId) {
node.style.display = "block";
} else {
node.style.display = "none";
}
});
};
controls.appendChild(pluginControl);
// Add plugin content area
const pluginNode = document.createElement("div");
pluginNode.id = nodeContentId;
config.render(pluginNode);
pluginNode.style.display = "none"; // Default to hidden unless it's the first
contents.appendChild(pluginNode);
});
// Let first node be selected by default
if (contents.childNodes.length) {
contents.childNodes[0].style.display = "block";
}
}
});
},
/** Selectors for document (DOM) elements. Store identifier instead of immediate references in case they haven't loaded when Manager starts. */
selectors: DOM_SELECTORS,
// Future API ideas
// Fetch page's data in array, and cache so plugins could reuse it
// Provide knowledge of what datasette JS or server-side via traditional console autocomplete
// State helpers: URL params https://github.com/simonw/datasette/issues/1144 and localstorage
// UI Hooks: command + k, tab manager hook
// Should we notify plugins that have dependencies
// when all dependencies were fulfilled? (leaflet, codemirror, etc)
// https://github.com/simonw/datasette-leaflet -> this way
// multiple plugins can all request the same copy of leaflet.
};
const initializeDatasette = () => {
// Hide the global behind __ prefix. Ideally they should be listening for the
// DATASETTE_EVENTS.INIT event to avoid the habit of reading from the window.
window.__DATASETTE__ = datasetteManager;
console.debug("Datasette Manager Created!");
const initDatasetteEvent = new CustomEvent(DATASETTE_EVENTS.INIT, {
detail: datasetteManager,
});
document.dispatchEvent(initDatasetteEvent);
};
/**
* Main function
* Fires AFTER the document has been parsed
*/
document.addEventListener("DOMContentLoaded", function () {
initializeDatasette();
});
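For context, here is a minimal sketch of what a consumer of the panel hook could look like. The config fields (`id`, `label`, `render`) match what the `makeAboveTablePanelConfigs` loop above reads; the registration call at the end is hypothetical, since the manager's public registration API is not shown in this file.

```javascript
// Sketch of a third-party panel plugin. The config shape mirrors the
// fields consumed by the aboveTablePanel loop: id (unique per plugin),
// label (tab button text) and render (called with the panel's DOM node).
const examplePanelPlugin = {
  makeAboveTablePanelConfigs() {
    return [
      {
        id: "example", // combined with the plugin name into the content node id
        label: "Example panel",
        render(node) {
          node.textContent = "Hello from a plugin panel";
        },
      },
    ];
  },
};

// Only touch the DOM when one exists (keeps this sketch runnable in Node).
// The manager.plugins.set() call is a hypothetical registration path.
if (typeof document !== "undefined") {
  document.addEventListener("datasette_init", (evt) => {
    const manager = evt.detail;
    manager.plugins.set("example-plugin", examplePanelPlugin);
  });
}
```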

(Binary image file added, not shown: 208 B)

@ -0,0 +1,56 @@
/*
https://github.com/luyilin/json-format-highlight
From https://unpkg.com/json-format-highlight@1.0.1/dist/json-format-highlight.js
MIT Licensed
*/
(function (global, factory) {
typeof exports === "object" && typeof module !== "undefined"
? (module.exports = factory())
: typeof define === "function" && define.amd
? define(factory)
: (global.jsonFormatHighlight = factory());
})(this, function () {
"use strict";
var defaultColors = {
keyColor: "dimgray",
numberColor: "lightskyblue",
stringColor: "lightcoral",
trueColor: "lightseagreen",
falseColor: "#f66578",
nullColor: "cornflowerblue",
};
function index(json, colorOptions) {
if (colorOptions === void 0) colorOptions = {};
if (!json) {
return;
}
if (typeof json !== "string") {
json = JSON.stringify(json, null, 2);
}
var colors = Object.assign({}, defaultColors, colorOptions);
json = json.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
return json.replace(
/("(\\u[a-zA-Z0-9]{4}|\\[^u]|[^\\"])*"(\s*:)?|\b(true|false|null)\b|-?\d+(?:\.\d*)?(?:[eE][+\-]?\d+)?)/g,
function (match) {
var color = colors.numberColor;
if (/^"/.test(match)) {
color = /:$/.test(match) ? colors.keyColor : colors.stringColor;
} else {
color = /true/.test(match)
? colors.trueColor
: /false/.test(match)
? colors.falseColor
: /null/.test(match)
? colors.nullColor
: color;
}
return '<span style="color: ' + color + '">' + match + "</span>";
},
);
}
return index;
});
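The highlighter above classifies every JSON token with a single regex and then colors it by kind. This standalone sketch applies the same token regex to show how keys, strings, numbers and literals are told apart (colors omitted); the function name is ours, not part of the library.

```javascript
// Same token regex as json-format-highlight: quoted strings (optionally
// followed by a colon, making them keys), true/false/null literals, and
// numbers with optional fraction and exponent.
const TOKEN_RE =
  /("(\\u[a-zA-Z0-9]{4}|\\[^u]|[^\\"])*"(\s*:)?|\b(true|false|null)\b|-?\d+(?:\.\d*)?(?:[eE][+\-]?\d+)?)/g;

function classifyTokens(json) {
  if (typeof json !== "string") {
    json = JSON.stringify(json, null, 2);
  }
  const tokens = [];
  for (const match of json.match(TOKEN_RE) || []) {
    let kind = "number"; // default, mirroring the library's color fallback
    if (/^"/.test(match)) {
      kind = /:$/.test(match) ? "key" : "string";
    } else if (/true|false/.test(match)) {
      kind = "boolean";
    } else if (/null/.test(match)) {
      kind = "null";
    }
    tokens.push({ match, kind });
  }
  return tokens;
}
```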

View file

@ -0,0 +1,416 @@
class NavigationSearch extends HTMLElement {
constructor() {
super();
this.attachShadow({ mode: "open" });
this.selectedIndex = -1;
this.matches = [];
this.debounceTimer = null;
this.render();
this.setupEventListeners();
}
render() {
this.shadowRoot.innerHTML = `
<style>
:host {
display: contents;
}
dialog {
border: none;
border-radius: 0.75rem;
padding: 0;
max-width: 90vw;
width: 600px;
max-height: 80vh;
box-shadow: 0 20px 25px -5px rgba(0, 0, 0, 0.1), 0 10px 10px -5px rgba(0, 0, 0, 0.04);
animation: slideIn 0.2s ease-out;
}
dialog::backdrop {
background: rgba(0, 0, 0, 0.5);
backdrop-filter: blur(4px);
animation: fadeIn 0.2s ease-out;
}
@keyframes slideIn {
from {
opacity: 0;
transform: translateY(-20px) scale(0.95);
}
to {
opacity: 1;
transform: translateY(0) scale(1);
}
}
@keyframes fadeIn {
from { opacity: 0; }
to { opacity: 1; }
}
.search-container {
display: flex;
flex-direction: column;
height: 100%;
}
.search-input-wrapper {
padding: 1.25rem;
border-bottom: 1px solid #e5e7eb;
}
.search-input {
width: 100%;
padding: 0.75rem 1rem;
font-size: 1rem;
border: 2px solid #e5e7eb;
border-radius: 0.5rem;
outline: none;
transition: border-color 0.2s;
box-sizing: border-box;
}
.search-input:focus {
border-color: #2563eb;
}
.results-container {
overflow-y: auto;
height: calc(80vh - 180px);
padding: 0.5rem;
}
.result-item {
padding: 0.875rem 1rem;
cursor: pointer;
border-radius: 0.5rem;
transition: background-color 0.15s;
display: flex;
align-items: center;
gap: 0.75rem;
}
.result-item:hover {
background-color: #f3f4f6;
}
.result-item.selected {
background-color: #dbeafe;
}
.result-name {
font-weight: 500;
color: #111827;
}
.result-url {
font-size: 0.875rem;
color: #6b7280;
}
.no-results {
padding: 2rem;
text-align: center;
color: #6b7280;
}
.hint-text {
padding: 0.75rem 1.25rem;
font-size: 0.875rem;
color: #6b7280;
border-top: 1px solid #e5e7eb;
display: flex;
gap: 1rem;
flex-wrap: wrap;
}
.hint-text kbd {
background: #f3f4f6;
padding: 0.125rem 0.375rem;
border-radius: 0.25rem;
font-size: 0.75rem;
border: 1px solid #d1d5db;
font-family: monospace;
}
/* Mobile optimizations */
@media (max-width: 640px) {
dialog {
width: 95vw;
max-height: 85vh;
border-radius: 0.5rem;
}
.search-input-wrapper {
padding: 1rem;
}
.search-input {
font-size: 16px; /* Prevents zoom on iOS */
}
.result-item {
padding: 1rem 0.75rem;
}
.hint-text {
font-size: 0.8rem;
padding: 0.5rem 1rem;
}
}
</style>
<dialog>
<div class="search-container">
<div class="search-input-wrapper">
<input
type="text"
class="search-input"
placeholder="Search..."
aria-label="Search navigation"
autocomplete="off"
spellcheck="false"
>
</div>
<div class="results-container" role="listbox"></div>
<div class="hint-text">
<span><kbd>↑</kbd> <kbd>↓</kbd> Navigate</span>
<span><kbd>Enter</kbd> Select</span>
<span><kbd>Esc</kbd> Close</span>
</div>
</div>
</dialog>
`;
}
setupEventListeners() {
const dialog = this.shadowRoot.querySelector("dialog");
const input = this.shadowRoot.querySelector(".search-input");
const resultsContainer =
this.shadowRoot.querySelector(".results-container");
// Global keyboard listener for "/"
document.addEventListener("keydown", (e) => {
if (e.key === "/" && !this.isInputFocused() && !dialog.open) {
e.preventDefault();
this.openMenu();
}
});
// Input event
input.addEventListener("input", (e) => {
this.handleSearch(e.target.value);
});
// Keyboard navigation
input.addEventListener("keydown", (e) => {
if (e.key === "ArrowDown") {
e.preventDefault();
this.moveSelection(1);
} else if (e.key === "ArrowUp") {
e.preventDefault();
this.moveSelection(-1);
} else if (e.key === "Enter") {
e.preventDefault();
this.selectCurrentItem();
} else if (e.key === "Escape") {
this.closeMenu();
}
});
// Click on result item
resultsContainer.addEventListener("click", (e) => {
const item = e.target.closest(".result-item");
if (item) {
const index = parseInt(item.dataset.index);
this.selectItem(index);
}
});
// Close on backdrop click
dialog.addEventListener("click", (e) => {
if (e.target === dialog) {
this.closeMenu();
}
});
// Initial load
this.loadInitialData();
}
isInputFocused() {
const activeElement = document.activeElement;
return (
activeElement &&
(activeElement.tagName === "INPUT" ||
activeElement.tagName === "TEXTAREA" ||
activeElement.isContentEditable)
);
}
loadInitialData() {
const itemsAttr = this.getAttribute("items");
if (itemsAttr) {
try {
this.allItems = JSON.parse(itemsAttr);
this.matches = this.allItems;
} catch (e) {
console.error("Failed to parse items attribute:", e);
this.allItems = [];
this.matches = [];
}
}
}
handleSearch(query) {
clearTimeout(this.debounceTimer);
this.debounceTimer = setTimeout(() => {
const url = this.getAttribute("url");
if (url) {
// Fetch from API
this.fetchResults(url, query);
} else {
// Filter local items
this.filterLocalItems(query);
}
}, 200);
}
async fetchResults(url, query) {
try {
const searchUrl = `${url}?q=${encodeURIComponent(query)}`;
const response = await fetch(searchUrl);
const data = await response.json();
this.matches = data.matches || [];
this.selectedIndex = this.matches.length > 0 ? 0 : -1;
this.renderResults();
} catch (e) {
console.error("Failed to fetch search results:", e);
this.matches = [];
this.renderResults();
}
}
filterLocalItems(query) {
if (!query.trim()) {
this.matches = [];
} else {
const lowerQuery = query.toLowerCase();
this.matches = (this.allItems || []).filter(
(item) =>
item.name.toLowerCase().includes(lowerQuery) ||
item.url.toLowerCase().includes(lowerQuery),
);
}
this.selectedIndex = this.matches.length > 0 ? 0 : -1;
this.renderResults();
}
renderResults() {
const container = this.shadowRoot.querySelector(".results-container");
const input = this.shadowRoot.querySelector(".search-input");
if (this.matches.length === 0) {
const message = input.value.trim()
? "No results found"
: "Start typing to search...";
container.innerHTML = `<div class="no-results">${message}</div>`;
return;
}
container.innerHTML = this.matches
.map(
(match, index) => `
<div
class="result-item ${
index === this.selectedIndex ? "selected" : ""
}"
data-index="${index}"
role="option"
aria-selected="${index === this.selectedIndex}"
>
<div>
<div class="result-name">${this.escapeHtml(
match.name,
)}</div>
<div class="result-url">${this.escapeHtml(match.url)}</div>
</div>
</div>
`,
)
.join("");
// Scroll selected item into view
if (this.selectedIndex >= 0) {
const selectedItem = container.children[this.selectedIndex];
if (selectedItem) {
selectedItem.scrollIntoView({ block: "nearest" });
}
}
}
moveSelection(direction) {
const newIndex = this.selectedIndex + direction;
if (newIndex >= 0 && newIndex < this.matches.length) {
this.selectedIndex = newIndex;
this.renderResults();
}
}
selectCurrentItem() {
if (this.selectedIndex >= 0 && this.selectedIndex < this.matches.length) {
this.selectItem(this.selectedIndex);
}
}
selectItem(index) {
const match = this.matches[index];
if (match) {
// Dispatch custom event
this.dispatchEvent(
new CustomEvent("select", {
detail: match,
bubbles: true,
composed: true,
}),
);
// Navigate to URL
window.location.href = match.url;
this.closeMenu();
}
}
openMenu() {
const dialog = this.shadowRoot.querySelector("dialog");
const input = this.shadowRoot.querySelector(".search-input");
dialog.showModal();
input.value = "";
input.focus();
// Reset state - start with no items shown
this.matches = [];
this.selectedIndex = -1;
this.renderResults();
}
closeMenu() {
const dialog = this.shadowRoot.querySelector("dialog");
dialog.close();
}
escapeHtml(text) {
const div = document.createElement("div");
div.textContent = text;
return div.innerHTML;
}
}
// Register the custom element
customElements.define("navigation-search", NavigationSearch);
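The element reads its data either from an `items` attribute (a JSON array of `{name, url}` objects) or from a `url` attribute pointing at an endpoint returning `{"matches": [...]}`. `filterLocalItems()` above matches case-insensitively on either the name or the URL; the same logic as a standalone, DOM-free helper (the function name is ours):

```javascript
// Mirrors NavigationSearch.filterLocalItems(): blank queries yield no
// results, otherwise match case-insensitively on item name or URL.
function filterNavigationItems(items, query) {
  if (!query.trim()) {
    return [];
  }
  const lowerQuery = query.toLowerCase();
  return items.filter(
    (item) =>
      item.name.toLowerCase().includes(lowerQuery) ||
      item.url.toLowerCase().includes(lowerQuery),
  );
}
```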

File diff suppressed because one or more lines are too long

datasette/static/table.js (new file, 343 lines)

@ -0,0 +1,343 @@
var DROPDOWN_HTML = `<div class="dropdown-menu">
<div class="hook"></div>
<ul>
<li><a class="dropdown-sort-asc" href="#">Sort ascending</a></li>
<li><a class="dropdown-sort-desc" href="#">Sort descending</a></li>
<li><a class="dropdown-facet" href="#">Facet by this</a></li>
<li><a class="dropdown-hide-column" href="#">Hide this column</a></li>
<li><a class="dropdown-show-all-columns" href="#">Show all columns</a></li>
<li><a class="dropdown-not-blank" href="#">Show not-blank rows</a></li>
</ul>
<p class="dropdown-column-type"></p>
<p class="dropdown-column-description"></p>
</div>`;
var DROPDOWN_ICON_SVG = `<svg xmlns="http://www.w3.org/2000/svg" width="14" height="14" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>`;
/** Main initialization function for Datasette Table interactions */
const initDatasetteTable = function (manager) {
// Feature detection
if (!window.URLSearchParams) {
return;
}
function getParams() {
return new URLSearchParams(location.search);
}
function paramsToUrl(params) {
var s = params.toString();
return s ? "?" + s : location.pathname;
}
function sortDescUrl(column) {
var params = getParams();
params.set("_sort_desc", column);
params.delete("_sort");
params.delete("_next");
return paramsToUrl(params);
}
function sortAscUrl(column) {
var params = getParams();
params.set("_sort", column);
params.delete("_sort_desc");
params.delete("_next");
return paramsToUrl(params);
}
function facetUrl(column) {
var params = getParams();
params.append("_facet", column);
return paramsToUrl(params);
}
function hideColumnUrl(column) {
var params = getParams();
params.append("_nocol", column);
return paramsToUrl(params);
}
function showAllColumnsUrl() {
var params = getParams();
params.delete("_nocol");
params.delete("_col");
return paramsToUrl(params);
}
function notBlankUrl(column) {
var params = getParams();
params.set(`${column}__notblank`, "1");
return paramsToUrl(params);
}
function closeMenu() {
menu.style.display = "none";
menu.classList.remove("anim-scale-in");
}
const tableWrapper = document.querySelector(manager.selectors.tableWrapper);
if (tableWrapper) {
tableWrapper.addEventListener("scroll", closeMenu);
}
document.body.addEventListener("click", (ev) => {
/* was this click outside the menu? */
var target = ev.target;
while (target && target != menu) {
target = target.parentNode;
}
if (!target) {
closeMenu();
}
});
function onTableHeaderClick(ev) {
ev.preventDefault();
ev.stopPropagation();
menu.innerHTML = DROPDOWN_HTML;
var th = ev.target;
while (th.nodeName != "TH") {
th = th.parentNode;
}
var rect = th.getBoundingClientRect();
var menuTop = rect.bottom + window.scrollY;
var menuLeft = rect.left + window.scrollX;
var column = th.getAttribute("data-column");
var params = getParams();
var sort = menu.querySelector("a.dropdown-sort-asc");
var sortDesc = menu.querySelector("a.dropdown-sort-desc");
var facetItem = menu.querySelector("a.dropdown-facet");
var notBlank = menu.querySelector("a.dropdown-not-blank");
var hideColumn = menu.querySelector("a.dropdown-hide-column");
var showAllColumns = menu.querySelector("a.dropdown-show-all-columns");
if (params.get("_sort") == column) {
sort.parentNode.style.display = "none";
} else {
sort.parentNode.style.display = "block";
sort.setAttribute("href", sortAscUrl(column));
}
if (params.get("_sort_desc") == column) {
sortDesc.parentNode.style.display = "none";
} else {
sortDesc.parentNode.style.display = "block";
sortDesc.setAttribute("href", sortDescUrl(column));
}
/* Show hide columns options */
if (params.get("_nocol") || params.get("_col")) {
showAllColumns.parentNode.style.display = "block";
showAllColumns.setAttribute("href", showAllColumnsUrl());
} else {
showAllColumns.parentNode.style.display = "none";
}
if (th.getAttribute("data-is-pk") != "1") {
hideColumn.parentNode.style.display = "block";
hideColumn.setAttribute("href", hideColumnUrl(column));
} else {
hideColumn.parentNode.style.display = "none";
}
/* Only show "Facet by this" if it's not the first column, not selected,
not a single PK and the Datasette allow_facet setting is True */
var displayedFacets = Array.from(
document.querySelectorAll(".facet-info"),
).map((el) => el.dataset.column);
var isFirstColumn =
th.parentElement.querySelector("th:first-of-type") == th;
var isSinglePk =
th.getAttribute("data-is-pk") == "1" &&
document.querySelectorAll('th[data-is-pk="1"]').length == 1;
if (
!DATASETTE_ALLOW_FACET ||
isFirstColumn ||
displayedFacets.includes(column) ||
isSinglePk
) {
facetItem.parentNode.style.display = "none";
} else {
facetItem.parentNode.style.display = "block";
facetItem.setAttribute("href", facetUrl(column));
}
/* Show notBlank option if not selected AND at least one visible blank value */
var tdsForThisColumn = Array.from(
th.closest("table").querySelectorAll("td." + th.className),
);
if (
params.get(`${column}__notblank`) != "1" &&
tdsForThisColumn.filter((el) => el.innerText.trim() == "").length
) {
notBlank.parentNode.style.display = "block";
notBlank.setAttribute("href", notBlankUrl(column));
} else {
notBlank.parentNode.style.display = "none";
}
var columnTypeP = menu.querySelector(".dropdown-column-type");
var columnType = th.dataset.columnType;
var notNull = th.dataset.columnNotNull == 1 ? " NOT NULL" : "";
if (columnType) {
columnTypeP.style.display = "block";
columnTypeP.innerText = `Type: ${columnType.toUpperCase()}${notNull}`;
} else {
columnTypeP.style.display = "none";
}
var columnDescriptionP = menu.querySelector(".dropdown-column-description");
if (th.dataset.columnDescription) {
columnDescriptionP.innerText = th.dataset.columnDescription;
columnDescriptionP.style.display = "block";
} else {
columnDescriptionP.style.display = "none";
}
menu.style.position = "absolute";
menu.style.top = menuTop + 6 + "px";
menu.style.left = menuLeft + "px";
menu.style.display = "block";
menu.classList.add("anim-scale-in");
// Custom menu items on each render
// Plugin hook: allow adding JS-based additional menu items
const columnActionsPayload = {
columnName: th.dataset.column,
columnNotNull: th.dataset.columnNotNull === "1",
columnType: th.dataset.columnType,
isPk: th.dataset.isPk === "1",
};
const columnItemConfigs = manager.makeColumnActions(columnActionsPayload);
const menuList = menu.querySelector("ul");
columnItemConfigs.forEach((itemConfig) => {
// Remove items from previous render. We assume entries have unique labels.
const existingItems = menuList.querySelectorAll(`li`);
Array.from(existingItems)
.filter((item) => item.innerText === itemConfig.label)
.forEach((node) => {
node.remove();
});
const newLink = document.createElement("a");
newLink.textContent = itemConfig.label;
newLink.href = itemConfig.href ?? "#";
if (itemConfig.onClick) {
newLink.onclick = itemConfig.onClick;
}
// Attach new elements to DOM
const menuItem = document.createElement("li");
menuItem.appendChild(newLink);
menuList.appendChild(menuItem);
});
// Measure width of menu and adjust position if too far right
const menuWidth = menu.offsetWidth;
const windowWidth = window.innerWidth;
if (menuLeft + menuWidth > windowWidth) {
menu.style.left = windowWidth - menuWidth - 20 + "px";
}
// Align menu .hook arrow with the column cog icon
const hook = menu.querySelector(".hook");
const icon = th.querySelector(".dropdown-menu-icon");
const iconRect = icon.getBoundingClientRect();
const hookLeft = iconRect.left - menuLeft + 1 + "px";
hook.style.left = hookLeft;
// Move the whole menu right if the hook is too far right
const menuRect = menu.getBoundingClientRect();
if (iconRect.right > menuRect.right) {
menu.style.left = iconRect.right - menuWidth + "px";
// And move hook tip as well
hook.style.left = menuWidth - 13 + "px";
}
}
var svg = document.createElement("div");
svg.innerHTML = DROPDOWN_ICON_SVG;
svg = svg.querySelector("*");
svg.classList.add("dropdown-menu-icon");
var menu = document.createElement("div");
menu.innerHTML = DROPDOWN_HTML;
menu = menu.querySelector("*");
menu.style.position = "absolute";
menu.style.display = "none";
document.body.appendChild(menu);
var ths = Array.from(
document.querySelectorAll(manager.selectors.tableHeaders),
);
ths.forEach((th) => {
if (!th.querySelector("a")) {
return;
}
var icon = svg.cloneNode(true);
icon.addEventListener("click", onTableHeaderClick);
th.appendChild(icon);
});
};
/* Add x buttons to the filter rows */
function addButtonsToFilterRows(manager) {
var x = "✖";
var rows = Array.from(
document.querySelectorAll(manager.selectors.filterRow),
).filter((el) => el.querySelector(".filter-op"));
rows.forEach((row) => {
var a = document.createElement("a");
a.setAttribute("href", "#");
a.setAttribute("aria-label", "Remove this filter");
a.style.textDecoration = "none";
a.innerText = x;
a.addEventListener("click", (ev) => {
ev.preventDefault();
let row = ev.target.closest("div");
row.querySelector("select").value = "";
row.querySelector(".filter-op select").value = "exact";
row.querySelector("input.filter-value").value = "";
ev.target.closest("a").style.display = "none";
});
row.appendChild(a);
var column = row.querySelector("select");
if (!column.value) {
a.style.display = "none";
}
});
}
/* Set up datalist autocomplete for filter values */
function initAutocompleteForFilterValues(manager) {
function createDataLists() {
var facetResults = document.querySelectorAll(
manager.selectors.facetResults,
);
Array.from(facetResults).forEach(function (facetResult) {
// Use link text from all links in the facet result
var links = Array.from(
facetResult.querySelectorAll("li:not(.facet-truncated) a"),
);
// Create a datalist element
var datalist = document.createElement("datalist");
datalist.id = "datalist-" + facetResult.dataset.column;
// Create an option element for each link text
links.forEach(function (link) {
var option = document.createElement("option");
option.label = link.innerText;
option.value = link.dataset.facetValue;
datalist.appendChild(option);
});
// Add the datalist to the facet result
facetResult.appendChild(datalist);
});
}
createDataLists();
// When any select with name=_filter_column changes, update the datalist
document.body.addEventListener("change", function (event) {
if (event.target.name === "_filter_column") {
event.target
.closest(manager.selectors.filterRow)
.querySelector(".filter-value")
.setAttribute("list", "datalist-" + event.target.value);
}
});
}
// Ensures Table UI is initialized only after the Manager is ready.
document.addEventListener("datasette_init", function (evt) {
const { detail: manager } = evt;
// Main table
initDatasetteTable(manager);
// Other UI functions with interactive JS needs
addButtonsToFilterRows(manager);
initAutocompleteForFilterValues(manager);
});
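The sort/facet helpers above read `window.location`; this sketch takes the query string as an argument so the same transformations can be exercised without a browser. The parameter names (`_sort`, `_sort_desc`, `_next`, `_facet`) are the ones used by the handlers above; the function names here are ours.

```javascript
// Sort ascending: set _sort, clear any descending sort, and drop the
// _next pagination cursor since it is no longer valid under a new order.
function sortAscQuery(search, column) {
  const params = new URLSearchParams(search);
  params.set("_sort", column);
  params.delete("_sort_desc");
  params.delete("_next");
  const s = params.toString();
  return s ? "?" + s : "";
}

// Facet: append rather than set, because several facets can be active.
function facetQuery(search, column) {
  const params = new URLSearchParams(search);
  params.append("_facet", column);
  const s = params.toString();
  return s ? "?" + s : "";
}
```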


@ -0,0 +1,28 @@
{% if action_links %}
<div class="page-action-menu">
<details class="actions-menu-links details-menu">
<summary>
<div class="icon-text">
<svg class="icon" aria-labelledby="actions-menu-links-title" role="img" style="color: #fff" xmlns="http://www.w3.org/2000/svg" width="28" height="28" viewBox="0 0 28 28" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round">
<title id="actions-menu-links-title">{{ action_title }}</title>
<circle cx="12" cy="12" r="3"></circle>
<path d="M19.4 15a1.65 1.65 0 0 0 .33 1.82l.06.06a2 2 0 0 1 0 2.83 2 2 0 0 1-2.83 0l-.06-.06a1.65 1.65 0 0 0-1.82-.33 1.65 1.65 0 0 0-1 1.51V21a2 2 0 0 1-2 2 2 2 0 0 1-2-2v-.09A1.65 1.65 0 0 0 9 19.4a1.65 1.65 0 0 0-1.82.33l-.06.06a2 2 0 0 1-2.83 0 2 2 0 0 1 0-2.83l.06-.06a1.65 1.65 0 0 0 .33-1.82 1.65 1.65 0 0 0-1.51-1H3a2 2 0 0 1-2-2 2 2 0 0 1 2-2h.09A1.65 1.65 0 0 0 4.6 9a1.65 1.65 0 0 0-.33-1.82l-.06-.06a2 2 0 0 1 0-2.83 2 2 0 0 1 2.83 0l.06.06a1.65 1.65 0 0 0 1.82.33H9a1.65 1.65 0 0 0 1-1.51V3a2 2 0 0 1 2-2 2 2 0 0 1 2 2v.09a1.65 1.65 0 0 0 1 1.51 1.65 1.65 0 0 0 1.82-.33l.06-.06a2 2 0 0 1 2.83 0 2 2 0 0 1 0 2.83l-.06.06a1.65 1.65 0 0 0-.33 1.82V9a1.65 1.65 0 0 0 1.51 1H21a2 2 0 0 1 2 2 2 2 0 0 1-2 2h-.09a1.65 1.65 0 0 0-1.51 1z"></path>
</svg>
<span>{{ action_title }}</span>
</div>
</summary>
<div class="dropdown-menu">
<div class="hook"></div>
<ul>
{% for link in action_links %}
<li><a href="{{ link.href }}">{{ link.label }}
{% if link.description %}
<p class="dropdown-description">{{ link.description }}</p>
{% endif %}</a>
</li>
{% endfor %}
</ul>
</div>
</details>
</div>
{% endif %}


@ -0,0 +1,16 @@
<script>
document.body.addEventListener('click', (ev) => {
/* Close any open details elements that this click is outside of */
var target = ev.target;
var detailsClickedWithin = null;
while (target && target.tagName != 'DETAILS') {
target = target.parentNode;
}
if (target && target.tagName == 'DETAILS') {
detailsClickedWithin = target;
}
Array.from(document.querySelectorAll('details.details-menu')).filter(
(details) => details.open && details != detailsClickedWithin
).forEach(details => details.open = false);
});
</script>
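The handler above walks up from the click target looking for an enclosing `<details>` element, then closes every open menu except that one. The core decision ("did this click land outside a given details element?") can be expressed as a DOM-free predicate over parent-linked objects, sketched here for clarity:

```javascript
// Returns true when clickTarget is NOT inside the given details node,
// i.e. the open menu should be closed. Works on any objects exposing a
// parentNode chain, matching the walk in the handler above.
function shouldClose(details, clickTarget) {
  let node = clickTarget;
  while (node && node !== details) {
    node = node.parentNode || null;
  }
  return node !== details;
}
```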


@ -1,7 +1,16 @@
-<script src="/-/static/codemirror-5.31.0.js"></script>
-<link rel="stylesheet" href="/-/static/codemirror-5.31.0-min.css" />
-<script src="/-/static/codemirror-5.31.0-sql.min.js"></script>
+<script src="{{ base_url }}-/static/sql-formatter-2.3.3.min.js" defer></script>
+<script src="{{ base_url }}-/static/cm-editor-6.0.1.bundle.js"></script>
 <style>
-.CodeMirror { height: auto; min-height: 70px; width: 80%; border: 1px solid #ddd; }
-.CodeMirror-scroll { max-height: 200px; }
+.cm-editor {
+  resize: both;
+  overflow: hidden;
+  width: 80%;
+  border: 1px solid #ddd;
+}
+/* Fix autocomplete icon positioning. The icon element gets border-box sizing set due to
+   the global reset, but this causes overlapping icon and text. Markup:
+   `<div class="cm-completionIcon cm-completionIcon-keyword" aria-hidden="true"></div>` */
+.cm-completionIcon {
+  box-sizing: content-box;
+}
 </style>


@ -1,13 +1,42 @@
 <script>
-var editor = CodeMirror.fromTextArea(document.getElementById("sql-editor"), {
-  lineNumbers: true,
-  mode: "text/x-sql",
-  lineWrapping: true,
-});
-editor.setOption("extraKeys", {
-  "Shift-Enter": function() {
-    document.getElementsByClassName("sql")[0].submit();
-  },
-  Tab: false
-});
+{% if table_columns %}
+const schema = {{ table_columns|tojson(2) }};
+{% else %}
+const schema = {};
+{% endif %}
+
+window.addEventListener("DOMContentLoaded", () => {
+  const sqlFormat = document.querySelector("button#sql-format");
+  const readOnly = document.querySelector("pre#sql-query");
+  const sqlInput = document.querySelector("textarea#sql-editor");
+  if (sqlFormat && !readOnly) {
+    sqlFormat.hidden = false;
+  }
+  if (sqlInput) {
+    var editor = (window.editor = cm.editorFromTextArea(sqlInput, {
+      schema,
+    }));
+    if (sqlFormat) {
+      sqlFormat.addEventListener("click", (ev) => {
+        const formatted = sqlFormatter.format(editor.state.doc.toString());
+        editor.dispatch({
+          changes: {
+            from: 0,
+            to: editor.state.doc.length,
+            insert: formatted,
+          },
+        });
+      });
+    }
+  }
+  if (sqlFormat && readOnly) {
+    const formatted = sqlFormatter.format(readOnly.innerHTML);
+    if (formatted != readOnly.innerHTML) {
+      sqlFormat.hidden = false;
+      sqlFormat.addEventListener("click", (ev) => {
+        readOnly.innerHTML = formatted;
+      });
+    }
+  }
+});
 </script>


@ -0,0 +1,15 @@
{% macro nav(request, database=None, table=None) -%}
{% if crumb_items is defined %}
{% set items=crumb_items(request=request, database=database, table=table) %}
{% if items %}
<p class="crumbs">
{% for item in items %}
<a href="{{ item.href }}">{{ item.label }}</a>
{% if not loop.last %}
/
{% endif %}
{% endfor %}
</p>
{% endif %}
{% endif %}
{%- endmacro %}


@ -0,0 +1,50 @@
<script>
// Common utility functions for debug pages
// Populate form from URL parameters on page load
function populateFormFromURL() {
const params = new URLSearchParams(window.location.search);
const action = params.get('action');
if (action) {
const actionField = document.getElementById('action');
if (actionField) {
actionField.value = action;
}
}
const parent = params.get('parent');
if (parent) {
const parentField = document.getElementById('parent');
if (parentField) {
parentField.value = parent;
}
}
const child = params.get('child');
if (child) {
const childField = document.getElementById('child');
if (childField) {
childField.value = child;
}
}
const pageSize = params.get('page_size');
if (pageSize) {
const pageSizeField = document.getElementById('page_size');
if (pageSizeField) {
pageSizeField.value = pageSize;
}
}
return params;
}
// HTML escape function
function escapeHtml(text) {
if (text === null || text === undefined) return '';
const div = document.createElement('div');
div.textContent = text;
return div.innerHTML;
}
</script>
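`escapeHtml()` above leans on the DOM (`textContent`/`innerHTML` round-trip). An equivalent pure-string version, useful where no `document` exists (workers, tests); note that both variants escape only `&`, `<` and `>`, so the result is safe for element content but not for unquoted attribute values:

```javascript
// DOM-free equivalent of the escapeHtml() helper: null/undefined become
// the empty string, then the three HTML-significant characters are
// replaced (ampersand first, so it does not double-escape the others).
function escapeHtmlString(text) {
  if (text === null || text === undefined) return "";
  return String(text)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}
```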


@ -1,6 +1,6 @@
-{% if metadata.description_html or metadata.description %}
+{% if metadata.get("description_html") or metadata.get("description") %}
 <div class="metadata-description">
-{% if metadata.description_html %}
+{% if metadata.get("description_html") %}
 {{ metadata.description_html|safe }}
 {% else %}
 {{ metadata.description }}
@ -21,7 +21,7 @@
 <a href="{{ metadata.source_url }}">
 {% endif %}{{ metadata.source or metadata.source_url }}{% if metadata.source_url %}</a>{% endif %}
 {% endif %}
-{% if metadata.about or metadata.about_url %}{% if metadata.license or metadata.license_url or metadata.source or metadat.source_url %}&middot;{% endif %}
+{% if metadata.about or metadata.about_url %}{% if metadata.license or metadata.license_url or metadata.source or metadata.source_url %}&middot;{% endif %}
 About: {% if metadata.about_url %}
 <a href="{{ metadata.about_url }}">
 {% endif %}{{ metadata.about or metadata.about_url }}{% if metadata.about_url %}</a>{% endif %}


@ -0,0 +1,28 @@
<div class="facet-results">
{% for facet_info in sorted_facet_results %}
<div class="facet-info facet-{{ database|to_css_class }}-{{ table|to_css_class }}-{{ facet_info.name|to_css_class }}" id="facet-{{ facet_info.name|to_css_class }}" data-column="{{ facet_info.name }}">
<p class="facet-info-name">
<strong>{{ facet_info.name }}{% if facet_info.type != "column" %} ({{ facet_info.type }}){% endif %}
<span class="facet-info-total">{% if facet_info.truncated %}&gt;{% endif %}{{ facet_info.results|length }}</span>
</strong>
{% if facet_info.hideable %}
<a href="{{ facet_info.toggle_url }}" class="cross">&#x2716;</a>
{% endif %}
</p>
<ul class="tight-bullets">
{% for facet_value in facet_info.results %}
{% if not facet_value.selected %}
<li><a href="{{ facet_value.toggle_url }}" data-facet-value="{{ facet_value.value }}">{{ (facet_value.label | string()) or "-" }}</a> {{ "{:,}".format(facet_value.count) }}</li>
{% else %}
<li>{{ facet_value.label or "-" }} &middot; {{ "{:,}".format(facet_value.count) }} <a href="{{ facet_value.toggle_url }}" class="cross">&#x2716;</a></li>
{% endif %}
{% endfor %}
{% if facet_info.truncated %}
<li class="facet-truncated">{% if request.args._facet_size != "max" -%}
<a href="{{ path_with_replaced_args(request, {"_facet_size": "max"}) }}">…</a>{% else -%}…{% endif %}
</li>
{% endif %}
</ul>
</div>
{% endfor %}
</div>


@ -1,5 +1,5 @@
-Powered by <a href="https://github.com/simonw/datasette" title="Datasette v{{ datasette_version }}">Datasette</a>
-{% if query_ms %}&middot; Query took {{ query_ms|round(3) }}ms{% endif %}
+Powered by <a href="https://datasette.io/" title="Datasette v{{ datasette_version }}">Datasette</a>
+{% if query_ms %}&middot; Queries took {{ query_ms|round(3) }}ms{% endif %}
 {% if metadata %}
 {% if metadata.license or metadata.license_url %}&middot; Data license:
 {% if metadata.license_url %}


@ -0,0 +1,145 @@
<style>
.permission-form {
background-color: #f5f5f5;
border: 1px solid #ddd;
border-radius: 5px;
padding: 1.5em;
margin-bottom: 2em;
}
.form-section {
margin-bottom: 1em;
}
.form-section label {
display: block;
margin-bottom: 0.3em;
font-weight: bold;
}
.form-section input[type="text"],
.form-section select {
width: 100%;
max-width: 500px;
padding: 0.5em;
box-sizing: border-box;
border: 1px solid #ccc;
border-radius: 3px;
}
.form-section input[type="text"]:focus,
.form-section select:focus {
outline: 2px solid #0066cc;
border-color: #0066cc;
}
.form-section small {
display: block;
margin-top: 0.3em;
color: #666;
}
.form-actions {
margin-top: 1em;
}
.submit-btn {
padding: 0.6em 1.5em;
font-size: 1em;
background-color: #0066cc;
color: white;
border: none;
border-radius: 3px;
cursor: pointer;
}
.submit-btn:hover {
background-color: #0052a3;
}
.submit-btn:disabled {
background-color: #ccc;
cursor: not-allowed;
}
.results-container {
margin-top: 2em;
}
.results-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 1em;
}
.results-count {
font-size: 0.9em;
color: #666;
}
.results-table {
width: 100%;
border-collapse: collapse;
background-color: white;
box-shadow: 0 1px 3px rgba(0,0,0,0.1);
}
.results-table th {
background-color: #f5f5f5;
padding: 0.75em;
text-align: left;
font-weight: bold;
border-bottom: 2px solid #ddd;
}
.results-table td {
padding: 0.75em;
border-bottom: 1px solid #eee;
}
.results-table tr:hover {
background-color: #f9f9f9;
}
.results-table tr.allow-row {
background-color: #f1f8f4;
}
.results-table tr.allow-row:hover {
background-color: #e8f5e9;
}
.results-table tr.deny-row {
background-color: #fef5f5;
}
.results-table tr.deny-row:hover {
background-color: #ffebee;
}
.resource-path {
font-family: monospace;
background-color: #f5f5f5;
padding: 0.2em 0.4em;
border-radius: 3px;
}
.pagination {
margin-top: 1.5em;
display: flex;
gap: 1em;
align-items: center;
}
.pagination a {
padding: 0.5em 1em;
background-color: #0066cc;
color: white;
text-decoration: none;
border-radius: 3px;
}
.pagination a:hover {
background-color: #0052a3;
}
.pagination span {
color: #666;
}
.no-results {
padding: 2em;
text-align: center;
color: #666;
background-color: #f9f9f9;
border: 1px solid #ddd;
border-radius: 5px;
}
.error-message {
padding: 1em;
background-color: #ffebee;
border: 2px solid #f44336;
border-radius: 5px;
color: #c62828;
}
.loading {
padding: 2em;
text-align: center;
color: #666;
}
</style>

@ -0,0 +1,54 @@
{% if has_debug_permission %}
{% set query_string = '?' + request.query_string if request.query_string else '' %}
<style>
.permissions-debug-tabs {
border-bottom: 2px solid #e0e0e0;
margin-bottom: 2em;
display: flex;
flex-wrap: wrap;
gap: 0.5em;
}
.permissions-debug-tabs a {
padding: 0.75em 1.25em;
text-decoration: none;
color: #333;
border-bottom: 3px solid transparent;
margin-bottom: -2px;
transition: all 0.2s;
font-weight: 500;
}
.permissions-debug-tabs a:hover {
background-color: #f5f5f5;
border-bottom-color: #999;
}
.permissions-debug-tabs a.active {
color: #0066cc;
border-bottom-color: #0066cc;
background-color: #f0f7ff;
}
@media only screen and (max-width: 576px) {
.permissions-debug-tabs {
flex-direction: column;
gap: 0;
}
.permissions-debug-tabs a {
border-bottom: 1px solid #e0e0e0;
margin-bottom: 0;
}
.permissions-debug-tabs a.active {
border-left: 3px solid #0066cc;
border-bottom: 1px solid #e0e0e0;
}
}
</style>
<nav class="permissions-debug-tabs">
<a href="{{ urls.path('-/permissions') }}" {% if current_tab == "permissions" %}class="active"{% endif %}>Playground</a>
<a href="{{ urls.path('-/check') }}{{ query_string }}" {% if current_tab == "check" %}class="active"{% endif %}>Check</a>
<a href="{{ urls.path('-/allowed') }}{{ query_string }}" {% if current_tab == "allowed" %}class="active"{% endif %}>Allowed</a>
<a href="{{ urls.path('-/rules') }}{{ query_string }}" {% if current_tab == "rules" %}class="active"{% endif %}>Rules</a>
<a href="{{ urls.path('-/actions') }}" {% if current_tab == "actions" %}class="active"{% endif %}>Actions</a>
<a href="{{ urls.path('-/allow-debug') }}" {% if current_tab == "allow_debug" %}class="active"{% endif %}>Allow debug</a>
</nav>
{% endif %}

@ -0,0 +1,3 @@
<p class="suggested-facets">
Suggested facets: {% for facet in suggested_facets %}<a href="{{ facet.toggle_url }}#facet-{{ facet.name|to_css_class }}">{{ facet.name }}</a>{% if facet.get("type") %} ({{ facet.type }}){% endif %}{% if not loop.last %}, {% endif %}{% endfor %}
</p>

@ -1,28 +1,36 @@
-<table class="rows-and-columns">
-    <thead>
-        <tr>
-            {% for column in display_columns %}
-                <th class="col-{{ column.name|to_css_class }}" scope="col">
-                    {% if not column.sortable %}
-                        {{ column.name }}
-                    {% else %}
-                        {% if column.name == sort %}
-                            <a href="{{ path_with_replaced_args(request, {'_sort_desc': column.name, '_sort': None, '_next': None}) }}" rel="nofollow">{{ column.name }}&nbsp;▼</a>
-                        {% else %}
-                            <a href="{{ path_with_replaced_args(request, {'_sort': column.name, '_sort_desc': None, '_next': None}) }}" rel="nofollow">{{ column.name }}{% if column.name == sort_desc %}&nbsp;▲{% endif %}</a>
-                        {% endif %}
-                    {% endif %}
-                </th>
-            {% endfor %}
-        </tr>
-    </thead>
-    <tbody>
-    {% for row in display_rows %}
-        <tr>
-            {% for cell in row %}
-                <td class="col-{{ cell.column|to_css_class }}">{{ cell.value }}</td>
-            {% endfor %}
-        </tr>
-    {% endfor %}
-    </tbody>
-</table>
+<!-- above-table-panel is a hook node for plugins to attach to. Displays even if no data available -->
+<div class="above-table-panel"> </div>
+{% if display_rows %}
+<div class="table-wrapper">
+    <table class="rows-and-columns">
+        <thead>
+            <tr>
+                {% for column in display_columns %}
+                    <th {% if column.description %}data-column-description="{{ column.description }}" {% endif %}class="col-{{ column.name|to_css_class }}" scope="col" data-column="{{ column.name }}" data-column-type="{{ column.type.lower() }}" data-column-not-null="{{ column.notnull }}" data-is-pk="{% if column.is_pk %}1{% else %}0{% endif %}">
+                        {% if not column.sortable %}
+                            {{ column.name }}
+                        {% else %}
+                            {% if column.name == sort %}
+                                <a href="{{ fix_path(path_with_replaced_args(request, {'_sort_desc': column.name, '_sort': None, '_next': None})) }}" rel="nofollow">{{ column.name }}&nbsp;▼</a>
+                            {% else %}
+                                <a href="{{ fix_path(path_with_replaced_args(request, {'_sort': column.name, '_sort_desc': None, '_next': None})) }}" rel="nofollow">{{ column.name }}{% if column.name == sort_desc %}&nbsp;▲{% endif %}</a>
+                            {% endif %}
+                        {% endif %}
+                    </th>
+                {% endfor %}
+            </tr>
+        </thead>
+        <tbody>
+        {% for row in display_rows %}
+            <tr>
+                {% for cell in row %}
+                    <td class="col-{{ cell.column|to_css_class }} type-{{ cell.value_type }}">{{ cell.value }}</td>
+                {% endfor %}
+            </tr>
+        {% endfor %}
+        </tbody>
+    </table>
+</div>
+{% else %}
+<p class="zero-results">0 records</p>
+{% endif %}

@ -0,0 +1,61 @@
{% extends "base.html" %}
{% block title %}Debug allow rules{% endblock %}
{% block extra_head %}
<style>
textarea {
height: 10em;
width: 95%;
box-sizing: border-box;
padding: 0.5em;
border: 2px dotted black;
}
.two-col {
display: inline-block;
width: 48%;
}
.two-col label {
width: 48%;
}
p.message-warning {
white-space: pre-wrap;
}
@media only screen and (max-width: 576px) {
.two-col {
width: 100%;
}
}
</style>
{% endblock %}
{% block content %}
<h1>Debug allow rules</h1>
{% set current_tab = "allow_debug" %}
{% include "_permissions_debug_tabs.html" %}
<p>Use this tool to try out different actor and allow combinations. See <a href="https://docs.datasette.io/en/stable/authentication.html#defining-permissions-with-allow-blocks">Defining permissions with "allow" blocks</a> for documentation.</p>
<form class="core" action="{{ urls.path('-/allow-debug') }}" method="get" style="margin-bottom: 1em">
<div class="two-col">
<p><label>Allow block</label></p>
<textarea name="allow">{{ allow_input }}</textarea>
</div>
<div class="two-col">
<p><label>Actor</label></p>
<textarea name="actor">{{ actor_input }}</textarea>
</div>
<div style="margin-top: 1em;">
<input type="submit" value="Apply allow block to actor">
</div>
</form>
{% if error %}<p class="message-warning">{{ error }}</p>{% endif %}
{% if result == "True" %}<p class="message-info">Result: allow</p>{% endif %}
{% if result == "False" %}<p class="message-error">Result: deny</p>{% endif %}
{% endblock %}

@ -0,0 +1,208 @@
{% extends "base.html" %}
{% block title %}API Explorer{% endblock %}
{% block extra_head %}
<script src="{{ base_url }}-/static/json-format-highlight-1.0.1.js"></script>
{% endblock %}
{% block content %}
<h1>API Explorer{% if private %} 🔒{% endif %}</h1>
<p>Use this tool to try out the
{% if datasette_version %}
<a href="https://docs.datasette.io/en/{{ datasette_version }}/json_api.html">Datasette API</a>.
{% else %}
Datasette API.
{% endif %}
</p>
<details open style="border: 2px solid #ccc; border-bottom: none; padding: 0.5em">
<summary style="cursor: pointer;">GET</summary>
<form class="core" method="get" id="api-explorer-get" style="margin-top: 0.7em">
<div>
<label for="path">API path:</label>
<input type="text" id="path" name="path" style="width: 60%">
<input type="submit" value="GET">
</div>
</form>
</details>
<details style="border: 2px solid #ccc; padding: 0.5em">
<summary style="cursor: pointer">POST</summary>
<form class="core" method="post" id="api-explorer-post" style="margin-top: 0.7em">
<div>
<label for="path">API path:</label>
<input type="text" id="path" name="path" style="width: 60%">
</div>
<div style="margin: 0.5em 0">
<label for="apiJson" style="vertical-align: top">JSON:</label>
<textarea id="apiJson" name="json" style="width: 60%; height: 200px; font-family: monospace; font-size: 0.8em;"></textarea>
</div>
<p><button id="json-format" type="button">Format JSON</button> <input type="submit" value="POST"></p>
</form>
</details>
<div id="output" style="display: none">
<h2>API response: HTTP <span id="response-status"></span></h2>
<ul class="errors message-error"></ul>
<pre></pre>
</div>
<script>
document.querySelector('#json-format').addEventListener('click', (ev) => {
ev.preventDefault();
let json = document.querySelector('textarea[name="json"]').value.trim();
if (!json) {
return;
}
try {
const parsed = JSON.parse(json);
document.querySelector('textarea[name="json"]').value = JSON.stringify(parsed, null, 2);
} catch (e) {
alert("Error parsing JSON: " + e);
}
});
var postForm = document.getElementById('api-explorer-post');
var getForm = document.getElementById('api-explorer-get');
var output = document.getElementById('output');
var errorList = output.querySelector('.errors');
// On first load or fragment change populate forms from # in URL, if present
if (window.location.hash) {
onFragmentChange();
}
function onFragmentChange() {
var hash = window.location.hash.slice(1);
// Treat hash as a foo=bar string and parse it:
var params = new URLSearchParams(hash);
var method = params.get('method');
if (method == 'GET') {
getForm.closest('details').open = true;
postForm.closest('details').open = false;
getForm.querySelector('input[name="path"]').value = params.get('path');
} else if (method == 'POST') {
postForm.closest('details').open = true;
getForm.closest('details').open = false;
postForm.querySelector('input[name="path"]').value = params.get('path');
postForm.querySelector('textarea[name="json"]').value = params.get('json');
}
}
window.addEventListener('hashchange', () => {
onFragmentChange();
// Animate scroll to top of page
window.scrollTo({top: 0, behavior: 'smooth'});
});
// Cause GET and POST regions to toggle each other
var getDetails = getForm.closest('details');
var postDetails = postForm.closest('details');
getDetails.addEventListener('toggle', (ev) => {
if (getDetails.open) {
postDetails.open = false;
}
});
postDetails.addEventListener('toggle', (ev) => {
if (postDetails.open) {
getDetails.open = false;
}
});
getForm.addEventListener("submit", (ev) => {
ev.preventDefault();
var formData = new FormData(getForm);
// Update URL fragment hash
var serialized = new URLSearchParams(formData).toString() + '&method=GET';
window.history.pushState({}, "", location.pathname + '#' + serialized);
// Send the request
var path = formData.get('path');
fetch(path, {
method: 'GET',
headers: {
'Accept': 'application/json',
}
}).then((response) => {
output.style.display = 'block';
document.getElementById('response-status').textContent = response.status;
return response.json();
}).then((data) => {
output.querySelector('pre').innerHTML = jsonFormatHighlight(data);
errorList.style.display = 'none';
}).catch((error) => {
alert(error);
});
});
postForm.addEventListener("submit", (ev) => {
ev.preventDefault();
var formData = new FormData(postForm);
// Update URL fragment hash
var serialized = new URLSearchParams(formData).toString() + '&method=POST';
window.history.pushState({}, "", location.pathname + '#' + serialized);
// Send the request
var json = formData.get('json');
var path = formData.get('path');
// Validate JSON
if (!json.length) {
json = '{}';
}
try {
var data = JSON.parse(json);
} catch (err) {
alert("Invalid JSON: " + err);
return;
}
// POST JSON to path with content-type application/json
fetch(path, {
method: 'POST',
body: json,
headers: {
'Content-Type': 'application/json',
}
}).then(r => {
document.getElementById('response-status').textContent = r.status;
return r.json();
}).then(data => {
if (data.errors) {
errorList.style.display = 'block';
errorList.innerHTML = '';
data.errors.forEach(error => {
var li = document.createElement('li');
li.textContent = error;
errorList.appendChild(li);
});
} else {
errorList.style.display = 'none';
}
output.querySelector('pre').innerHTML = jsonFormatHighlight(data);
output.style.display = 'block';
}).catch(err => {
alert("Error: " + err);
});
});
</script>
{% if example_links %}
<h2>API endpoints</h2>
<ul class="bullets">
{% for database in example_links %}
<li>Database: <strong>{{ database.name }}</strong></li>
<ul class="bullets">
{% for link in database.links %}
<li><a href="{{ api_path(link) }}">{{ link.path }}</a> - {{ link.label }} </li>
{% endfor %}
{% for table in database.tables %}
<li><strong>{{ table.name }}</strong>
<ul class="bullets">
{% for link in table.links %}
<li><a href="{{ api_path(link) }}">{{ link.path }}</a> - {{ link.label }} </li>
{% endfor %}
</ul>
</li>
{% endfor %}
</ul>
{% endfor %}
</ul>
{% endif %}
{% endblock %}

@ -1,32 +1,78 @@
-<!DOCTYPE html>
-<html>
+{% import "_crumbs.html" as crumbs with context %}<!DOCTYPE html>
+<html lang="en">
 <head>
     <title>{% block title %}{% endblock %}</title>
-    <link rel="stylesheet" href="/-/static/app.css?{{ app_css_hash }}">
+    <link rel="stylesheet" href="{{ urls.static('app.css') }}?{{ app_css_hash }}">
     <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
 {% for url in extra_css_urls %}
-    <link rel="stylesheet" href="{{ url.url }}"{% if url.sri %} integrity="{{ url.sri }}" crossorigin="anonymous"{% endif %}>
+    <link rel="stylesheet" href="{{ url.url }}"{% if url.get("sri") %} integrity="{{ url.sri }}" crossorigin="anonymous"{% endif %}>
 {% endfor %}
+<script>window.datasetteVersion = '{{ datasette_version }}';</script>
+<script src="{{ urls.static('datasette-manager.js') }}" defer></script>
 {% for url in extra_js_urls %}
-    <script src="{{ url.url }}"{% if url.sri %} integrity="{{ url.sri }}" crossorigin="anonymous"{% endif %}></script>
+    <script {% if url.module %}type="module" {% endif %}src="{{ url.url }}"{% if url.get("sri") %} integrity="{{ url.sri }}" crossorigin="anonymous"{% endif %}></script>
 {% endfor %}
-{% block extra_head %}{% endblock %}
+{%- if alternate_url_json -%}
+<link rel="alternate" type="application/json+datasette" href="{{ alternate_url_json }}">
+{%- endif -%}
+{%- block extra_head %}{% endblock -%}
 </head>
 <body class="{% block body_class %}{% endblock %}">
-<nav class="hd">{% block nav %}{% endblock %}</nav>
+<div class="not-footer">
+<header class="hd"><nav>{% block nav %}{% block crumbs %}{{ crumbs.nav(request=request) }}{% endblock %}
+{% set links = menu_links() %}{% if links or show_logout %}
+    <details class="nav-menu details-menu">
+        <summary><svg aria-labelledby="nav-menu-svg-title" role="img"
+            fill="currentColor" stroke="currentColor" xmlns="http://www.w3.org/2000/svg"
+            viewBox="0 0 16 16" width="16" height="16">
+            <title id="nav-menu-svg-title">Menu</title>
+            <path fill-rule="evenodd" d="M1 2.75A.75.75 0 011.75 2h12.5a.75.75 0 110 1.5H1.75A.75.75 0 011 2.75zm0 5A.75.75 0 011.75 7h12.5a.75.75 0 110 1.5H1.75A.75.75 0 011 7.75zM1.75 12a.75.75 0 100 1.5h12.5a.75.75 0 100-1.5H1.75z"></path>
+        </svg></summary>
+        <div class="nav-menu-inner">
+            {% if links %}
+            <ul>
+                {% for link in links %}
+                <li><a href="{{ link.href }}">{{ link.label }}</a></li>
+                {% endfor %}
+            </ul>
+            {% endif %}
+            {% if show_logout %}
+            <form class="nav-menu-logout" action="{{ urls.logout() }}" method="post">
+                <input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
+                <button class="button-as-link">Log out</button>
+            </form>{% endif %}
+        </div>
+    </details>{% endif %}
+{% if actor %}
+    <div class="actor">
+        <strong>{{ display_actor(actor) }}</strong>
+    </div>
+{% endif %}
+{% endblock %}</nav></header>
+{% block messages %}
+{% if show_messages %}
+    {% for message, message_type in show_messages() %}
+        <p class="message-{% if message_type == 1 %}info{% elif message_type == 2 %}warning{% elif message_type == 3 %}error{% endif %}">{{ message }}</p>
+    {% endfor %}
+{% endif %}
+{% endblock %}
-<div class="bd">
+<section class="content">
 {% block content %}
 {% endblock %}
+</section>
 </div>
-<div class="ft">{% block footer %}{% include "_footer.html" %}{% endblock %}</div>
+<footer class="ft">{% block footer %}{% include "_footer.html" %}{% endblock %}</footer>
+{% include "_close_open_menus.html" %}
 {% for body_script in body_scripts %}
-    <script>{{ body_script }}</script>
+    <script{% if body_script.module %} type="module"{% endif %}>{{ body_script.script }}</script>
 {% endfor %}
 {% if select_templates %}<!-- Templates considered: {{ select_templates|join(", ") }} -->{% endif %}
+<script src="{{ urls.static('navigation-search.js') }}" defer></script>
+<navigation-search url="/-/tables"></navigation-search>
 </body>
 </html>

@ -0,0 +1,124 @@
{% extends "base.html" %}
{% block title %}Create an API token{% endblock %}
{% block extra_head %}
<style type="text/css">
#restrict-permissions label {
display: inline;
width: 90%;
}
</style>
{% endblock %}
{% block content %}
<h1>Create an API token</h1>
<p>This token will allow API access with the same abilities as your current user, <strong>{{ request.actor.id }}</strong></p>
{% if token %}
<div>
<h2>Your API token</h2>
<form>
<input type="text" class="copyable" style="width: 40%" value="{{ token }}">
<span class="copy-link-wrapper"></span>
</form>
<!-- show token in a <details> -->
<details style="margin-top: 1em">
<summary>Token details</summary>
<pre>{{ token_bits|tojson(4) }}</pre>
</details>
</div>
<h2>Create another token</h2>
{% endif %}
{% if errors %}
{% for error in errors %}
<p class="message-error">{{ error }}</p>
{% endfor %}
{% endif %}
<form class="core" action="{{ urls.path('-/create-token') }}" method="post">
<div>
<div class="select-wrapper" style="width: unset">
<select name="expire_type">
<option value="">Token never expires</option>
<option value="minutes">Expires after X minutes</option>
<option value="hours">Expires after X hours</option>
<option value="days">Expires after X days</option>
</select>
</div>
<input type="text" name="expire_duration" style="width: 10%">
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
<input type="submit" value="Create token">
<details style="margin-top: 1em" id="restrict-permissions">
<summary style="cursor: pointer;">Restrict actions that can be performed using this token</summary>
<h2>All databases and tables</h2>
<ul>
{% for permission in all_actions %}
<li><label><input type="checkbox" name="all:{{ permission }}"> {{ permission }}</label></li>
{% endfor %}
</ul>
{% for database in database_with_tables %}
<h2>All tables in "{{ database.name }}"</h2>
<ul>
{% for permission in database_actions %}
<li><label><input type="checkbox" name="database:{{ database.encoded }}:{{ permission }}"> {{ permission }}</label></li>
{% endfor %}
</ul>
{% endfor %}
<h2>Specific tables</h2>
{% for database in database_with_tables %}
{% for table in database.tables %}
<h3>{{ database.name }}: {{ table.name }}</h3>
<ul>
{% for permission in child_actions %}
<li><label><input type="checkbox" name="resource:{{ database.encoded }}:{{ table.encoded }}:{{ permission }}"> {{ permission }}</label></li>
{% endfor %}
</ul>
{% endfor %}
{% endfor %}
</details>
    </div>
</form>
<script>
var expireDuration = document.querySelector('input[name="expire_duration"]');
expireDuration.style.display = 'none';
var expireType = document.querySelector('select[name="expire_type"]');
function showHideExpireDuration() {
if (expireType.value) {
expireDuration.style.display = 'inline';
expireDuration.setAttribute("placeholder", expireType.value.replace("Expires after X ", ""));
} else {
expireDuration.style.display = 'none';
}
}
showHideExpireDuration();
expireType.addEventListener('change', showHideExpireDuration);
var copyInput = document.querySelector(".copyable");
if (copyInput) {
var wrapper = document.querySelector(".copy-link-wrapper");
var button = document.createElement("button");
button.className = "copyable-copy-button";
button.setAttribute("type", "button");
button.innerHTML = "Copy to clipboard";
button.onclick = (ev) => {
ev.preventDefault();
copyInput.select();
document.execCommand("copy");
button.innerHTML = "Copied!";
setTimeout(() => {
button.innerHTML = "Copy to clipboard";
}, 1500);
};
wrapper.appendChild(button);
wrapper.insertAdjacentElement("afterbegin", button);
}
</script>
{% endblock %}

@ -0,0 +1,13 @@
{% extends "base.html" %}
{% block title %}CSRF check failed{% endblock %}
{% block content %}
<h1>Form origin check failed</h1>
<p>Your request's origin could not be validated. Please return to the form and submit it again.</p>
<details><summary>Technical details</summary>
<p>Developers: consult Datasette's <a href="https://docs.datasette.io/en/latest/internals.html#csrf-protection">CSRF protection documentation</a>.</p>
<p>Error code is {{ message_name }}.</p>
</details>
{% endblock %}

@ -3,67 +3,87 @@
 {% block title %}{{ database }}{% endblock %}
 {% block extra_head %}
-{{ super() }}
+{{- super() -}}
 {% include "_codemirror.html" %}
 {% endblock %}
 {% block body_class %}db db-{{ database|to_css_class }}{% endblock %}
-{% block nav %}
-<p class="crumbs">
-    <a href="/">home</a>
-</p>
-{{ super() }}
+{% block crumbs %}
+{{ crumbs.nav(request=request, database=database) }}
 {% endblock %}
 {% block content %}
-<h1 style="padding-left: 10px; border-left: 10px solid #{{ database_color(database) }}">{{ metadata.title or database }}</h1>
+<div class="page-header" style="border-color: #{{ database_color }}">
+    <h1>{{ metadata.title or database }}{% if private %} 🔒{% endif %}</h1>
+</div>
+{% set action_links, action_title = database_actions(), "Database actions" %}
+{% include "_action_menu.html" %}
+{{ top_database() }}
 {% block description_source_license %}{% include "_description_source_license.html" %}{% endblock %}
-{% if config.allow_sql %}
-<form class="sql" action="{{ database_url(database) }}" method="get">
+{% if allow_execute_sql %}
+<form class="sql core" action="{{ urls.database(database) }}/-/query" method="get">
     <h3>Custom SQL query</h3>
-    <p><textarea name="sql">{% if tables %}select * from {{ tables[0].name|escape_sqlite }}{% else %}select sqlite_version(){% endif %}</textarea></p>
-    <p><input type="submit" value="Run SQL"></p>
+    <p><textarea id="sql-editor" name="sql">{% if tables %}select * from {{ tables[0].name|escape_sqlite }}{% else %}select sqlite_version(){% endif %}</textarea></p>
+    <p>
+        <button id="sql-format" type="button" hidden>Format SQL</button>
+        <input type="submit" value="Run SQL">
+    </p>
 </form>
 {% endif %}
+{% if attached_databases %}
+<div class="message-info">
+    <p>The following databases are attached to this connection, and can be used for cross-database joins:</p>
+    <ul class="bullets">
+        {% for db_name in attached_databases %}
+        <li><strong>{{ db_name }}</strong> - <a href="{{ urls.database(db_name) }}/-/query?sql=select+*+from+[{{ db_name }}].sqlite_master+where+type='table'">tables</a></li>
+        {% endfor %}
+    </ul>
+</div>
+{% endif %}
+{% if queries %}
+<h2 id="queries">Queries</h2>
+<ul class="bullets">
+    {% for query in queries %}
+    <li><a href="{{ urls.query(database, query.name) }}{% if query.fragment %}#{{ query.fragment }}{% endif %}" title="{{ query.description or query.sql }}">{{ query.title or query.name }}</a>{% if query.private %} 🔒{% endif %}</li>
+    {% endfor %}
+</ul>
+{% endif %}
+{% if tables %}
+<h2 id="tables">Tables <a style="font-weight: normal; font-size: 0.75em; padding-left: 0.5em;" href="{{ urls.database(database) }}/-/schema">schema</a></h2>
+{% endif %}
 {% for table in tables %}
 {% if show_hidden or not table.hidden %}
     <div class="db-table">
-        <h2><a href="{{ database_url(database) }}/{{ table.name|quote_plus }}">{{ table.name }}</a>{% if table.hidden %}<em> (hidden)</em>{% endif %}</h2>
-        <p><em>{% for column in table.columns[:9] %}{{ column }}{% if not loop.last %}, {% endif %}{% endfor %}{% if table.columns|length > 9 %}...{% endif %}</em></p>
-        <p>{% if table.count is none %}Many rows{% else %}{{ "{:,}".format(table.count) }} row{% if table.count == 1 %}{% else %}s{% endif %}{% endif %}</p>
+        <h3><a href="{{ urls.table(database, table.name) }}">{{ table.name }}</a>{% if table.private %} 🔒{% endif %}{% if table.hidden %}<em> (hidden)</em>{% endif %}</h3>
+        <p><em>{% for column in table.columns %}{{ column }}{% if not loop.last %}, {% endif %}{% endfor %}</em></p>
+        <p>{% if table.count is none %}Many rows{% elif table.count == count_limit + 1 %}&gt;{{ "{:,}".format(count_limit) }} rows{% else %}{{ "{:,}".format(table.count) }} row{% if table.count == 1 %}{% else %}s{% endif %}{% endif %}</p>
     </div>
 {% endif %}
 {% endfor %}
 {% if hidden_count and not show_hidden %}
-    <p>... and <a href="{{ database_url(database) }}?_show_hidden=1">{{ "{:,}".format(hidden_count) }} hidden table{% if hidden_count == 1 %}{% else %}s{% endif %}</a></p>
+    <p>... and <a href="{{ urls.database(database) }}?_show_hidden=1">{{ "{:,}".format(hidden_count) }} hidden table{% if hidden_count == 1 %}{% else %}s{% endif %}</a></p>
 {% endif %}
 {% if views %}
-<h2>Views</h2>
-<ul>
+<h2 id="views">Views</h2>
+<ul class="bullets">
     {% for view in views %}
-    <li><a href="{{ database_url(database) }}/{{ view|urlencode }}">{{ view }}</a></li>
+    <li><a href="{{ urls.database(database) }}/{{ view.name|urlencode }}">{{ view.name }}</a>{% if view.private %} 🔒{% endif %}</li>
     {% endfor %}
 </ul>
 {% endif %}
-{% if queries %}
-<h2>Queries</h2>
-<ul>
-    {% for query in queries %}
-    <li><a href="{{ database_url(database) }}/{{ query.name|urlencode }}" title="{{ query.description or query.sql }}">{{ query.title or query.name }}</a></li>
-    {% endfor %}
-</ul>
-{% endif %}
 {% if allow_download %}
-<p class="download-sqlite">Download SQLite DB: <a href="{{ database_url(database) }}.db">{{ database }}.db</a> <em>{{ format_bytes(size) }}</em></p>
+<p class="download-sqlite">Download SQLite DB: <a href="{{ urls.database(database) }}.db" rel="nofollow">{{ database }}.db</a> <em>{{ format_bytes(size) }}</em></p>
 {% endif %}
 {% include "_codemirror_foot.html" %}

@ -0,0 +1,43 @@
{% extends "base.html" %}
{% block title %}Registered Actions{% endblock %}
{% block content %}
<h1>Registered actions</h1>
{% set current_tab = "actions" %}
{% include "_permissions_debug_tabs.html" %}
<p style="margin-bottom: 2em;">
This Datasette instance has registered {{ data|length }} action{{ data|length != 1 and "s" or "" }}.
Actions are used by the permission system to control access to different features.
</p>
<table class="rows-and-columns">
<thead>
<tr>
<th>Name</th>
<th>Abbr</th>
<th>Description</th>
<th>Resource</th>
<th>Takes Parent</th>
<th>Takes Child</th>
<th>Also Requires</th>
</tr>
</thead>
<tbody>
{% for action in data %}
<tr>
<td><strong>{{ action.name }}</strong></td>
<td>{% if action.abbr %}<code>{{ action.abbr }}</code>{% endif %}</td>
<td>{{ action.description or "" }}</td>
<td>{% if action.resource_class %}<code>{{ action.resource_class }}</code>{% endif %}</td>
<td>{% if action.takes_parent %}✓{% endif %}</td>
<td>{% if action.takes_child %}✓{% endif %}</td>
<td>{% if action.also_requires %}<code>{{ action.also_requires }}</code>{% endif %}</td>
</tr>
{% endfor %}
</tbody>
</table>
{% endblock %}

@ -0,0 +1,229 @@
{% extends "base.html" %}
{% block title %}Allowed Resources{% endblock %}
{% block extra_head %}
<script src="{{ base_url }}-/static/json-format-highlight-1.0.1.js"></script>
{% include "_permission_ui_styles.html" %}
{% include "_debug_common_functions.html" %}
{% endblock %}
{% block content %}
<h1>Allowed resources</h1>
{% set current_tab = "allowed" %}
{% include "_permissions_debug_tabs.html" %}
<p>Use this tool to check which resources the current actor is allowed to access for a given permission action. It queries the <code>/-/allowed.json</code> API endpoint.</p>
{% if request.actor %}
<p>Current actor: <strong>{{ request.actor.get("id", "anonymous") }}</strong></p>
{% else %}
<p>Current actor: <strong>anonymous (not logged in)</strong></p>
{% endif %}
<div class="permission-form">
<form id="allowed-form" method="get" action="{{ urls.path("-/allowed") }}">
<div class="form-section">
<label for="action">Action (permission name):</label>
<select id="action" name="action" required>
<option value="">Select an action...</option>
{% for action_name in supported_actions %}
<option value="{{ action_name }}">{{ action_name }}</option>
{% endfor %}
</select>
<small>Only certain actions are supported by this endpoint</small>
</div>
<div class="form-section">
<label for="parent">Filter by parent (optional):</label>
<input type="text" id="parent" name="parent" placeholder="e.g., database name">
<small>Filter results to a specific parent resource</small>
</div>
<div class="form-section">
<label for="child">Filter by child (optional):</label>
<input type="text" id="child" name="child" placeholder="e.g., table name">
<small>Filter results to a specific child resource (requires parent to be set)</small>
</div>
<div class="form-section">
<label for="page_size">Page size:</label>
<input type="number" id="page_size" name="page_size" value="50" min="1" max="200" style="max-width: 100px;">
<small>Number of results per page (max 200)</small>
</div>
<div class="form-actions">
<button type="submit" class="submit-btn" id="submit-btn">Check Allowed Resources</button>
</div>
</form>
</div>
<div id="results-container" style="display: none;">
<div class="results-header">
<h2>Results</h2>
<div class="results-count" id="results-count"></div>
</div>
<div id="results-content"></div>
<div id="pagination" class="pagination"></div>
<details style="margin-top: 2em;">
<summary style="cursor: pointer; font-weight: bold;">Raw JSON response</summary>
<pre id="raw-json" style="margin-top: 1em; padding: 1em; background-color: #f5f5f5; border: 1px solid #ddd; border-radius: 3px; overflow-x: auto;"></pre>
</details>
</div>
<script>
const form = document.getElementById('allowed-form');
const resultsContainer = document.getElementById('results-container');
const resultsContent = document.getElementById('results-content');
const resultsCount = document.getElementById('results-count');
const pagination = document.getElementById('pagination');
const submitBtn = document.getElementById('submit-btn');
const hasDebugPermission = {{ 'true' if has_debug_permission else 'false' }};
// Populate form on initial load
(function() {
const params = populateFormFromURL();
const action = params.get('action');
const page = params.get('page');
if (action) {
fetchResults(page ? parseInt(page, 10) : 1);
}
})();
async function fetchResults(page = 1) {
submitBtn.disabled = true;
submitBtn.textContent = 'Loading...';
const formData = new FormData(form);
const params = new URLSearchParams();
for (const [key, value] of formData.entries()) {
if (value && key !== 'page_size') {
params.append(key, value);
}
}
const pageSize = document.getElementById('page_size').value || '50';
params.append('page', page.toString());
params.append('page_size', pageSize);
try {
const response = await fetch('{{ urls.path("-/allowed.json") }}?' + params.toString(), {
method: 'GET',
headers: {
'Accept': 'application/json',
}
});
const data = await response.json();
if (response.ok) {
displayResults(data);
} else {
displayError(data);
}
} catch (error) {
displayError({ error: error.message });
} finally {
submitBtn.disabled = false;
submitBtn.textContent = 'Check Allowed Resources';
}
}
function displayResults(data) {
resultsContainer.style.display = 'block';
// Update count
resultsCount.textContent = `Showing ${data.items.length} of ${data.total} total resources (page ${data.page})`;
// Display results table
if (data.items.length === 0) {
resultsContent.innerHTML = '<div class="no-results">No allowed resources found for this action.</div>';
} else {
let html = '<table class="results-table">';
html += '<thead><tr>';
html += '<th>Resource Path</th>';
html += '<th>Parent</th>';
html += '<th>Child</th>';
if (hasDebugPermission) {
html += '<th>Reason</th>';
}
html += '</tr></thead>';
html += '<tbody>';
for (const item of data.items) {
html += '<tr>';
html += `<td><span class="resource-path">${escapeHtml(item.resource || '/')}</span></td>`;
html += `<td>${escapeHtml(item.parent || '—')}</td>`;
html += `<td>${escapeHtml(item.child || '—')}</td>`;
if (hasDebugPermission) {
// Display reason as JSON array
let reasonHtml = '—';
if (item.reason && Array.isArray(item.reason)) {
reasonHtml = `<code>${escapeHtml(JSON.stringify(item.reason))}</code>`;
}
html += `<td>${reasonHtml}</td>`;
}
html += '</tr>';
}
html += '</tbody></table>';
resultsContent.innerHTML = html;
}
// Update pagination
pagination.innerHTML = '';
if (data.previous_url || data.next_url) {
if (data.previous_url) {
const prevLink = document.createElement('a');
prevLink.href = data.previous_url;
prevLink.textContent = '← Previous';
pagination.appendChild(prevLink);
}
const pageInfo = document.createElement('span');
pageInfo.textContent = `Page ${data.page}`;
pagination.appendChild(pageInfo);
if (data.next_url) {
const nextLink = document.createElement('a');
nextLink.href = data.next_url;
nextLink.textContent = 'Next →';
pagination.appendChild(nextLink);
}
}
// Update raw JSON
document.getElementById('raw-json').innerHTML = jsonFormatHighlight(data);
}
function displayError(data) {
resultsContainer.style.display = 'block';
resultsCount.textContent = '';
pagination.innerHTML = '';
resultsContent.innerHTML = `<div class="error-message">Error: ${escapeHtml(data.error || 'Unknown error')}</div>`;
document.getElementById('raw-json').innerHTML = jsonFormatHighlight(data);
}
// Disable child input if parent is empty
const parentInput = document.getElementById('parent');
const childInput = document.getElementById('child');
parentInput.addEventListener('input', () => {
childInput.disabled = !parentInput.value;
if (!parentInput.value) {
childInput.value = '';
}
});
// Initialize disabled state
childInput.disabled = !parentInput.value;
</script>
{% endblock %}
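The `fetchResults()` function above builds its query string by skipping empty form fields and always appending `page` and `page_size`. As a minimal sketch, that assembly logic can be pulled out into a standalone function (the name `buildAllowedQuery` and the plain-object input are illustrative, not part of the template):

```javascript
// Sketch of the query-string construction used by fetchResults():
// empty values are skipped, page_size is handled separately, and
// page/page_size are always appended at the end.
function buildAllowedQuery(fields, page, pageSize) {
  const params = new URLSearchParams();
  for (const [key, value] of Object.entries(fields)) {
    if (value && key !== 'page_size') {
      params.append(key, value);
    }
  }
  params.append('page', String(page));
  params.append('page_size', String(pageSize || 50));
  return params.toString();
}

// Example: only non-empty filters make it into the URL
console.log(buildAllowedQuery({action: 'view-table', parent: 'mydb', child: ''}, 1, 50));
// → action=view-table&parent=mydb&page=1&page_size=50
```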

@@ -0,0 +1,270 @@
{% extends "base.html" %}
{% block title %}Permission Check{% endblock %}
{% block extra_head %}
<script src="{{ base_url }}-/static/json-format-highlight-1.0.1.js"></script>
{% include "_permission_ui_styles.html" %}
{% include "_debug_common_functions.html" %}
<style>
#output {
margin-top: 2em;
padding: 1em;
border-radius: 5px;
}
#output.allowed {
background-color: #e8f5e9;
border: 2px solid #4caf50;
}
#output.denied {
background-color: #ffebee;
border: 2px solid #f44336;
}
#output h2 {
margin-top: 0;
}
#output .result-badge {
display: inline-block;
padding: 0.3em 0.8em;
border-radius: 3px;
font-weight: bold;
font-size: 1.1em;
}
#output .allowed-badge {
background-color: #4caf50;
color: white;
}
#output .denied-badge {
background-color: #f44336;
color: white;
}
.details-section {
margin-top: 1em;
}
.details-section dt {
font-weight: bold;
margin-top: 0.5em;
}
.details-section dd {
margin-left: 1em;
}
</style>
{% endblock %}
{% block content %}
<h1>Permission check</h1>
{% set current_tab = "check" %}
{% include "_permissions_debug_tabs.html" %}
<p>Use this tool to test permission checks for the current actor. It queries the <code>/-/check.json</code> API endpoint.</p>
{% if request.actor %}
<p>Current actor: <strong>{{ request.actor.get("id", "anonymous") }}</strong></p>
{% else %}
<p>Current actor: <strong>anonymous (not logged in)</strong></p>
{% endif %}
<div class="permission-form">
<form id="check-form" method="get" action="{{ urls.path("-/check") }}">
<div class="form-section">
<label for="action">Action (permission name):</label>
<select id="action" name="action" required>
<option value="">Select an action...</option>
{% for action_name in sorted_actions %}
<option value="{{ action_name }}">{{ action_name }}</option>
{% endfor %}
</select>
<small>The permission action to check</small>
</div>
<div class="form-section">
<label for="parent">Parent resource (optional):</label>
<input type="text" id="parent" name="parent" placeholder="e.g., database name">
<small>For database-level permissions, specify the database name</small>
</div>
<div class="form-section">
<label for="child">Child resource (optional):</label>
<input type="text" id="child" name="child" placeholder="e.g., table name">
<small>For table-level permissions, specify the table name (requires parent)</small>
</div>
<div class="form-actions">
<button type="submit" class="submit-btn" id="submit-btn">Check Permission</button>
</div>
</form>
</div>
<div id="output" style="display: none;">
<h2>Result: <span class="result-badge" id="result-badge"></span></h2>
<dl class="details-section">
<dt>Action:</dt>
<dd id="result-action"></dd>
<dt>Resource Path:</dt>
<dd id="result-resource"></dd>
<dt>Actor ID:</dt>
<dd id="result-actor"></dd>
<div id="additional-details"></div>
</dl>
<details style="margin-top: 1em;">
<summary style="cursor: pointer; font-weight: bold;">Raw JSON response</summary>
<pre id="raw-json" style="margin-top: 1em; padding: 1em; background-color: #f5f5f5; border: 1px solid #ddd; border-radius: 3px; overflow-x: auto;"></pre>
</details>
</div>
<script>
const form = document.getElementById('check-form');
const output = document.getElementById('output');
const submitBtn = document.getElementById('submit-btn');
async function performCheck() {
submitBtn.disabled = true;
submitBtn.textContent = 'Checking...';
const formData = new FormData(form);
const params = new URLSearchParams();
for (const [key, value] of formData.entries()) {
if (value) {
params.append(key, value);
}
}
try {
const response = await fetch('{{ urls.path("-/check.json") }}?' + params.toString(), {
method: 'GET',
headers: {
'Accept': 'application/json',
}
});
const data = await response.json();
if (response.ok) {
displayResult(data);
} else {
displayError(data);
}
} catch (error) {
alert('Error: ' + error.message);
} finally {
submitBtn.disabled = false;
submitBtn.textContent = 'Check Permission';
}
}
// Populate form on initial load
(function() {
const params = populateFormFromURL();
const action = params.get('action');
if (action) {
performCheck();
}
})();
function displayResult(data) {
output.style.display = 'block';
// Set badge and styling
const resultBadge = document.getElementById('result-badge');
if (data.allowed) {
output.className = 'allowed';
resultBadge.className = 'result-badge allowed-badge';
resultBadge.textContent = 'ALLOWED ✓';
} else {
output.className = 'denied';
resultBadge.className = 'result-badge denied-badge';
resultBadge.textContent = 'DENIED ✗';
}
// Basic details
document.getElementById('result-action').textContent = data.action || 'N/A';
document.getElementById('result-resource').textContent = data.resource?.path || '/';
document.getElementById('result-actor').textContent = data.actor_id || 'anonymous';
// Additional details
const additionalDetails = document.getElementById('additional-details');
additionalDetails.innerHTML = '';
if (data.reason !== undefined) {
const dt = document.createElement('dt');
dt.textContent = 'Reason:';
const dd = document.createElement('dd');
dd.textContent = data.reason || 'N/A';
additionalDetails.appendChild(dt);
additionalDetails.appendChild(dd);
}
if (data.source_plugin !== undefined) {
const dt = document.createElement('dt');
dt.textContent = 'Source Plugin:';
const dd = document.createElement('dd');
dd.textContent = data.source_plugin || 'N/A';
additionalDetails.appendChild(dt);
additionalDetails.appendChild(dd);
}
if (data.used_default !== undefined) {
const dt = document.createElement('dt');
dt.textContent = 'Used Default:';
const dd = document.createElement('dd');
dd.textContent = data.used_default ? 'Yes' : 'No';
additionalDetails.appendChild(dt);
additionalDetails.appendChild(dd);
}
if (data.depth !== undefined) {
const dt = document.createElement('dt');
dt.textContent = 'Depth:';
const dd = document.createElement('dd');
dd.textContent = data.depth;
additionalDetails.appendChild(dt);
additionalDetails.appendChild(dd);
}
// Raw JSON
document.getElementById('raw-json').innerHTML = jsonFormatHighlight(data);
// Scroll to output
output.scrollIntoView({ behavior: 'smooth', block: 'nearest' });
}
function displayError(data) {
output.style.display = 'block';
output.className = 'denied';
const resultBadge = document.getElementById('result-badge');
resultBadge.className = 'result-badge denied-badge';
resultBadge.textContent = 'ERROR';
document.getElementById('result-action').textContent = 'N/A';
document.getElementById('result-resource').textContent = 'N/A';
document.getElementById('result-actor').textContent = 'N/A';
const additionalDetails = document.getElementById('additional-details');
additionalDetails.innerHTML = '<dt>Error:</dt><dd>' + escapeHtml(data.error || 'Unknown error') + '</dd>';
document.getElementById('raw-json').innerHTML = jsonFormatHighlight(data);
output.scrollIntoView({ behavior: 'smooth', block: 'nearest' });
}
// Disable child input if parent is empty
const parentInput = document.getElementById('parent');
const childInput = document.getElementById('child');
childInput.addEventListener('focus', () => {
if (!parentInput.value) {
alert('Please specify a parent resource first before adding a child resource.');
parentInput.focus();
}
});
</script>
{% endblock %}
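The templates above interpolate server-supplied strings into `innerHTML` via an `escapeHtml()` helper, which is assumed to come from the shared `_debug_common_functions.html` include. A typical implementation of such a helper might look like this (a hypothetical sketch, not the actual include's code):

```javascript
// Hypothetical escapeHtml() helper, matching how the templates above
// use it: escapes the five HTML-significant characters so untrusted
// strings are safe to interpolate into innerHTML.
function escapeHtml(text) {
  return String(text)
    .replace(/&/g, '&amp;')   // must run first, before entities are introduced
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}
```

Note the ampersand replacement must come first; otherwise the `&` in already-produced entities like `&lt;` would be double-escaped.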

@@ -0,0 +1,166 @@
{% extends "base.html" %}
{% block title %}Debug permissions{% endblock %}
{% block extra_head %}
{% include "_permission_ui_styles.html" %}
<style type="text/css">
.check-result-true {
color: green;
}
.check-result-false {
color: red;
}
.check-result-no-opinion {
color: #aaa;
}
.check h2 {
font-size: 1em;
}
.check-action, .check-when, .check-result {
font-size: 1.3em;
}
textarea {
height: 10em;
width: 95%;
box-sizing: border-box;
padding: 0.5em;
border: 2px dotted black;
}
.two-col {
display: inline-block;
width: 48%;
}
.two-col label {
width: 48%;
}
@media only screen and (max-width: 576px) {
.two-col {
width: 100%;
}
}
</style>
{% endblock %}
{% block content %}
<h1>Permission playground</h1>
{% set current_tab = "permissions" %}
{% include "_permissions_debug_tabs.html" %}
<p>This tool lets you simulate an actor and a permission check for that actor.</p>
<div class="permission-form">
<form action="{{ urls.path('-/permissions') }}" id="debug-post" method="post">
<input type="hidden" name="csrftoken" value="{{ csrftoken() }}">
<div class="two-col">
<div class="form-section">
<label>Actor</label>
<textarea name="actor">{% if actor_input %}{{ actor_input }}{% else %}{"id": "root"}{% endif %}</textarea>
</div>
</div>
<div class="two-col" style="vertical-align: top">
<div class="form-section">
<label for="permission">Action</label>
<select name="permission" id="permission">
{% for permission in permissions %}
<option value="{{ permission.name }}">{{ permission.name }}</option>
{% endfor %}
</select>
</div>
<div class="form-section">
<label for="resource_1">Parent</label>
<input type="text" id="resource_1" name="resource_1" placeholder="e.g., database name">
</div>
<div class="form-section">
<label for="resource_2">Child</label>
<input type="text" id="resource_2" name="resource_2" placeholder="e.g., table name">
</div>
</div>
<div class="form-actions">
<button type="submit" class="submit-btn">Simulate permission check</button>
</div>
<pre style="margin-top: 1em" id="debugResult"></pre>
</form>
</div>
<script>
var rawPerms = {{ permissions|tojson }};
var permissions = Object.fromEntries(rawPerms.map(p => [p.name, p]));
var permissionSelect = document.getElementById('permission');
var resource1 = document.getElementById('resource_1');
var resource2 = document.getElementById('resource_2');
var resource1Section = resource1.closest('.form-section');
var resource2Section = resource2.closest('.form-section');
function updateResourceVisibility() {
var permission = permissionSelect.value;
var {takes_parent, takes_child} = permissions[permission];
resource1Section.style.display = takes_parent ? 'block' : 'none';
resource2Section.style.display = takes_child ? 'block' : 'none';
}
permissionSelect.addEventListener('change', updateResourceVisibility);
updateResourceVisibility();
// When #debug-post form is submitted, use fetch() to POST data
var debugPost = document.getElementById('debug-post');
var debugResult = document.getElementById('debugResult');
debugPost.addEventListener('submit', function(ev) {
ev.preventDefault();
var formData = new FormData(debugPost);
fetch(debugPost.action, {
method: 'POST',
body: new URLSearchParams(formData),
headers: {
'Accept': 'application/json'
}
}).then(function(response) {
if (!response.ok) {
throw new Error('Request failed with status ' + response.status);
}
return response.json();
}).then(function(data) {
debugResult.innerText = JSON.stringify(data, null, 4);
}).catch(function(error) {
debugResult.innerText = JSON.stringify({ error: error.message }, null, 4);
});
});
</script>
<h1>Recent permissions checks</h1>
<p>
{% if filter != "all" %}<a href="?filter=all">All</a>{% else %}<strong>All</strong>{% endif %},
{% if filter != "exclude-yours" %}<a href="?filter=exclude-yours">Exclude yours</a>{% else %}<strong>Exclude yours</strong>{% endif %},
{% if filter != "only-yours" %}<a href="?filter=only-yours">Only yours</a>{% else %}<strong>Only yours</strong>{% endif %}
</p>
{% if permission_checks %}
<table class="rows-and-columns permission-checks-table" id="permission-checks-table">
<thead>
<tr>
<th>When</th>
<th>Action</th>
<th>Parent</th>
<th>Child</th>
<th>Actor</th>
<th>Result</th>
</tr>
</thead>
<tbody>
{% for check in permission_checks %}
<tr>
<td><span style="font-size: 0.8em">{{ check.when.split('T', 1)[0] }}</span><br>{{ check.when.split('T', 1)[1].split('+', 1)[0].split('-', 1)[0].split('Z', 1)[0] }}</td>
<td><code>{{ check.action }}</code></td>
<td>{{ check.parent or '—' }}</td>
<td>{{ check.child or '—' }}</td>
<td>{% if check.actor %}<code>{{ check.actor|tojson }}</code>{% else %}<span class="check-actor-anon">anonymous</span>{% endif %}</td>
<td>{% if check.result %}<span class="check-result check-result-true">Allowed</span>{% elif check.result is none %}<span class="check-result check-result-no-opinion">No opinion</span>{% else %}<span class="check-result check-result-false">Denied</span>{% endif %}</td>
</tr>
{% endfor %}
</tbody>
</table>
{% else %}
<p class="no-results">No permission checks have been recorded yet.</p>
{% endif %}
{% endblock %}
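The "When" column in the table above splits each ISO 8601 timestamp into a date line and a time line, stripping any timezone suffix with a chain of `split()` calls in the template. The same transformation, expressed as a standalone function (the name `splitWhen` is illustrative), might look like:

```javascript
// Sketch of the "When" column formatting: split an ISO 8601 string
// into its date part and a time part with any timezone suffix
// (+hh:mm, -hh:mm or Z) stripped, as the Jinja template above does.
function splitWhen(when) {
  const [date, rest] = when.split('T');
  const time = rest.split('+')[0].split('Z')[0].split('-')[0];
  return {date, time};
}

// splitWhen('2025-11-13T10:41:02-08:00')
// → {date: '2025-11-13', time: '10:41:02'}
```

This relies on the time portion never containing `+`, `Z` or `-` except as a timezone marker, which holds for `hh:mm:ss`-style timestamps.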

Some files were not shown because too many files have changed in this diff.