`__.
+Specifying units for a column
+-----------------------------
+
+Datasette supports attaching units to a column, which will be used when displaying
+values from that column. SI prefixes will be used where appropriate.
+
+Column units are configured in the metadata like so:
+
+.. code-block:: json
+
+    {
+        "databases": {
+            "database1": {
+                "tables": {
+                    "example_table": {
+                        "units": {
+                            "column1": "metres",
+                            "column2": "Hz"
+                        }
+                    }
+                }
+            }
+        }
+    }
+
+Units are interpreted using Pint_, and you can see the full list of available units in
+Pint's `unit registry`_. You can also add `custom units`_ to the metadata, which will be
+registered with Pint:
+
+.. code-block:: json
+
+    {
+        "custom_units": [
+            "decibel = [] = dB"
+        ]
+    }
+
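As a rough illustration of the SI-prefix display behaviour described above (Datasette delegates the real work to Pint; this sketch and its threshold table are invented for illustration only):

```python
# Illustrative only: Datasette uses Pint for unit handling, not this code.
# This sketch shows the general idea of picking an SI prefix for display.

SI_PREFIXES = [
    (1e9, "G"),
    (1e6, "M"),
    (1e3, "k"),
    (1.0, ""),
    (1e-3, "m"),
    (1e-6, "µ"),
]


def format_with_si_prefix(value, unit):
    """Choose the largest prefix that keeps the displayed number >= 1."""
    for factor, prefix in SI_PREFIXES:
        if abs(value) >= factor:
            return f"{value / factor:g} {prefix}{unit}"
    return f"{value:g} {unit}"


print(format_with_si_prefix(1500, "Hz"))  # 1.5 kHz
print(format_with_si_prefix(0.002, "m"))  # 2 mm
```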
+.. _Pint: https://pint.readthedocs.io/
+.. _unit registry: https://github.com/hgrecco/pint/blob/master/pint/default_en.txt
+.. _custom units: http://pint.readthedocs.io/en/latest/defining.html
+
.. _metadata_default_sort:
Setting a default sort order
@@ -212,8 +154,9 @@ Setting a default sort order
By default Datasette tables are sorted by primary key. You can override this default for a specific table using the ``"sort"`` or ``"sort_desc"`` metadata properties:
-.. [[[cog
- metadata_example(cog, {
+.. code-block:: json
+
+ {
"databases": {
"mydatabase": {
"tables": {
@@ -223,41 +166,13 @@ By default Datasette tables are sorted by primary key. You can over-ride this de
}
}
}
- })
-.. ]]]
-
-.. tab:: metadata.yaml
-
- .. code-block:: yaml
-
- databases:
- mydatabase:
- tables:
- example_table:
- sort: created
-
-
-.. tab:: metadata.json
-
- .. code-block:: json
-
- {
- "databases": {
- "mydatabase": {
- "tables": {
- "example_table": {
- "sort": "created"
- }
- }
- }
- }
- }
-.. [[[end]]]
+ }
Or use ``"sort_desc"`` to sort in descending order:
-.. [[[cog
- metadata_example(cog, {
+.. code-block:: json
+
+ {
"databases": {
"mydatabase": {
"tables": {
@@ -267,36 +182,7 @@ Or use ``"sort_desc"`` to sort in descending order:
}
}
}
- })
-.. ]]]
-
-.. tab:: metadata.yaml
-
- .. code-block:: yaml
-
- databases:
- mydatabase:
- tables:
- example_table:
- sort_desc: created
-
-
-.. tab:: metadata.json
-
- .. code-block:: json
-
- {
- "databases": {
- "mydatabase": {
- "tables": {
- "example_table": {
- "sort_desc": "created"
- }
- }
- }
- }
- }
-.. [[[end]]]
+ }
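The effect of ``"sort_desc": "created"`` is roughly that of an ``order by created desc`` clause. A quick sketch (illustrative only; Datasette's real query construction also handles pagination and more):

```python
import sqlite3

# Illustrative sketch: "sort_desc": "created" roughly corresponds to the
# ordering produced by this query. Table and data are made up.
conn = sqlite3.connect(":memory:")
conn.execute(
    "create table example_table (id integer primary key, created text)"
)
conn.executemany(
    "insert into example_table (created) values (?)",
    [("2023-01-01",), ("2024-06-15",), ("2022-12-31",)],
)
rows = conn.execute(
    "select created from example_table order by created desc"
).fetchall()
print([r[0] for r in rows])  # newest first
```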
.. _metadata_page_size:
@@ -305,8 +191,9 @@ Setting a custom page size
Datasette defaults to displaying 100 rows per page, for both tables and views. You can change this default page size on a per-table or per-view basis using the ``"size"`` key in ``metadata.json``:
-.. [[[cog
- metadata_example(cog, {
+.. code-block:: json
+
+ {
"databases": {
"mydatabase": {
"tables": {
@@ -316,36 +203,7 @@ Datasette defaults to displaying 100 rows per page, for both tables and views. Y
}
}
}
- })
-.. ]]]
-
-.. tab:: metadata.yaml
-
- .. code-block:: yaml
-
- databases:
- mydatabase:
- tables:
- example_table:
- size: 10
-
-
-.. tab:: metadata.json
-
- .. code-block:: json
-
- {
- "databases": {
- "mydatabase": {
- "tables": {
- "example_table": {
- "size": 10
- }
- }
- }
- }
- }
-.. [[[end]]]
+ }
This size can still be overridden by passing e.g. ``?_size=50`` in the query string.
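One way these settings could interact can be sketched like this (the function name and precedence shown are illustrative; in Datasette the actual upper bound is governed by its ``max_returned_rows`` setting):

```python
def effective_page_size(metadata_size=None, query_size=None,
                        default=100, max_size=1000):
    """Illustrative sketch: ?_size= beats the metadata "size", which
    beats the default; everything is capped at a maximum."""
    if query_size is not None:
        size = query_size
    elif metadata_size is not None:
        size = metadata_size
    else:
        size = default
    return min(size, max_size)


print(effective_page_size())                                 # 100
print(effective_page_size(metadata_size=10))                 # 10
print(effective_page_size(metadata_size=10, query_size=50))  # 50
print(effective_page_size(query_size=100000))                # 1000
```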
@@ -358,8 +216,9 @@ Datasette allows any column to be used for sorting by default. If you need to
control which columns are available for sorting you can do so using the optional
``sortable_columns`` key:
-.. [[[cog
- metadata_example(cog, {
+.. code-block:: json
+
+ {
"databases": {
"database1": {
"tables": {
@@ -372,41 +231,7 @@ control which columns are available for sorting you can do so using the optional
}
}
}
- })
-.. ]]]
-
-.. tab:: metadata.yaml
-
- .. code-block:: yaml
-
- databases:
- database1:
- tables:
- example_table:
- sortable_columns:
- - height
- - weight
-
-
-.. tab:: metadata.json
-
- .. code-block:: json
-
- {
- "databases": {
- "database1": {
- "tables": {
- "example_table": {
- "sortable_columns": [
- "height",
- "weight"
- ]
- }
- }
- }
- }
- }
-.. [[[end]]]
+ }
This will restrict sorting of ``example_table`` to just the ``height`` and
``weight`` columns.
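The check this enables can be sketched as follows (a hypothetical helper, not Datasette's internal code):

```python
def validate_sort(requested, sortable_columns=None, all_columns=()):
    """Sketch: if sortable_columns is configured, only those columns may
    be used for sorting; an empty list disables sorting entirely."""
    allowed = (
        sortable_columns
        if sortable_columns is not None
        else list(all_columns)
    )
    if requested not in allowed:
        raise ValueError(f"Cannot sort table by {requested}")
    return requested


print(validate_sort("height", sortable_columns=["height", "weight"]))
```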
@@ -415,8 +240,9 @@ You can also disable sorting entirely by setting ``"sortable_columns": []``
You can use ``sortable_columns`` to enable specific sort orders for a view called ``name_of_view`` in the database ``my_database`` like so:
-.. [[[cog
- metadata_example(cog, {
+.. code-block:: json
+
+ {
"databases": {
"my_database": {
"tables": {
@@ -429,41 +255,7 @@ You can use ``sortable_columns`` to enable specific sort orders for a view calle
}
}
}
- })
-.. ]]]
-
-.. tab:: metadata.yaml
-
- .. code-block:: yaml
-
- databases:
- my_database:
- tables:
- name_of_view:
- sortable_columns:
- - clicks
- - impressions
-
-
-.. tab:: metadata.json
-
- .. code-block:: json
-
- {
- "databases": {
- "my_database": {
- "tables": {
- "name_of_view": {
- "sortable_columns": [
- "clicks",
- "impressions"
- ]
- }
- }
- }
- }
- }
-.. [[[end]]]
+ }
.. _label_columns:
@@ -478,8 +270,9 @@ column should be used as the link label.
If your table has more than two columns you can specify which column should be
used for the link label with the ``label_column`` property:
-.. [[[cog
- metadata_example(cog, {
+.. code-block:: json
+
+ {
"databases": {
"database1": {
"tables": {
@@ -489,36 +282,7 @@ used for the link label with the ``label_column`` property:
}
}
}
- })
-.. ]]]
-
-.. tab:: metadata.yaml
-
- .. code-block:: yaml
-
- databases:
- database1:
- tables:
- example_table:
- label_column: title
-
-
-.. tab:: metadata.json
-
- .. code-block:: json
-
- {
- "databases": {
- "database1": {
- "tables": {
- "example_table": {
- "label_column": "title"
- }
- }
- }
- }
- }
-.. [[[end]]]
+ }
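The selection logic amounts to something like this simplified sketch (not Datasette's exact algorithm, which also applies heuristics such as preferring columns named ``title`` or ``name``):

```python
def pick_label_column(columns, label_column=None):
    """Simplified sketch: an explicit "label_column" wins; a two-column
    table uses its second (non-primary-key) column; otherwise fall back
    to the first column."""
    if label_column:
        return label_column
    if len(columns) == 2:
        return columns[1]
    return columns[0]


print(pick_label_column(["id", "title", "body"], label_column="title"))
print(pick_label_column(["id", "name"]))
```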
.. _metadata_hiding_tables:
@@ -528,106 +292,54 @@ Hiding tables
You can hide tables from the database listing view (in the same way that FTS and
SpatiaLite tables are automatically hidden) using ``"hidden": true``:
-.. [[[cog
- metadata_example(cog, {
+.. code-block:: json
+
+ {
"databases": {
"database1": {
"tables": {
"example_table": {
- "hidden": True
+ "hidden": true
}
}
}
}
- })
-.. ]]]
+ }
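Hiding is a listing-level filter; hidden tables remain reachable at their own URLs. A minimal sketch of the filtering (names are illustrative):

```python
def visible_tables(all_tables, table_metadata):
    """Drop tables whose metadata sets "hidden": true from the
    database listing page."""
    return [
        name
        for name in all_tables
        if not table_metadata.get(name, {}).get("hidden")
    ]


metadata = {"example_table": {"hidden": True}}
print(visible_tables(["example_table", "other_table"], metadata))
```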
-.. tab:: metadata.yaml
+.. _metadata_yaml:
- .. code-block:: yaml
+Using YAML for metadata
+-----------------------
- databases:
- database1:
- tables:
- example_table:
- hidden: true
+Datasette accepts YAML as an alternative to JSON for your metadata configuration file. YAML is particularly useful for including multiline HTML and SQL strings.
+Here's an example of a ``metadata.yml`` file, reusing an example from :ref:`canned_queries`.
-.. tab:: metadata.json
+.. code-block:: yaml
- .. code-block:: json
+    title: Demonstrating Metadata from YAML
+    description_html: |-
+      <p>This description includes a long HTML string</p>
+      <ul>
+        <li>YAML is better for embedding HTML strings than JSON!</li>
+      </ul>
+    license: ODbL
+    license_url: https://opendatacommons.org/licenses/odbl/
+    databases:
+      fixtures:
+        tables:
+          no_primary_key:
+            hidden: true
+        queries:
+          neighborhood_search:
+            sql: |-
+              select neighborhood, facet_cities.name, state
+              from facetable join facet_cities on facetable.city_id = facet_cities.id
+              where neighborhood like '%' || :text || '%' order by neighborhood;
+            title: Search neighborhoods
+            description_html: |-
+              <b>This demonstrates</b> basic LIKE search
- {
- "databases": {
- "database1": {
- "tables": {
- "example_table": {
- "hidden": true
- }
- }
- }
- }
- }
-.. [[[end]]]
+The ``metadata.yml`` file is passed to Datasette using the same ``--metadata`` option::
-.. _metadata_reference:
-
-Metadata reference
-------------------
-
-
-A full reference of every supported option in a ``metadata.json`` or ``metadata.yaml`` file.
-
-
-Top-level metadata
-~~~~~~~~~~~~~~~~~~
-
-"Top-level" metadata refers to fields that can be specified at the root level of a metadata file. These attributes are meant to describe the entire Datasette instance.
-
-The following are the full list of allowed top-level metadata fields:
-
-- ``title``
-- ``description``
-- ``description_html``
-- ``license``
-- ``license_url``
-- ``source``
-- ``source_url``
-
-Database-level metadata
-~~~~~~~~~~~~~~~~~~~~~~~
-
-"Database-level" metadata refers to fields that can be specified for each database in a Datasette instance. These attributes should be listed under a database inside the `"databases"` field.
-
-The following are the full list of allowed database-level metadata fields:
-
-- ``source``
-- ``source_url``
-- ``license``
-- ``license_url``
-- ``about``
-- ``about_url``
-
-Table-level metadata
-~~~~~~~~~~~~~~~~~~~~
-
-"Table-level" metadata refers to fields that can be specified for each table in a Datasette instance. These attributes should be listed under a specific table using the `"tables"` field.
-
-The following are the full list of allowed table-level metadata fields:
-
-- ``source``
-- ``source_url``
-- ``license``
-- ``license_url``
-- ``about``
-- ``about_url``
-- ``hidden``
-- ``sort/sort_desc``
-- ``size``
-- ``sortable_columns``
-- ``label_column``
-- ``facets``
-- ``fts_table``
-- ``fts_pk``
-- ``searchmode``
-- ``columns``
+ datasette fixtures.db --metadata metadata.yml
diff --git a/docs/pages.rst b/docs/pages.rst
index 3d6530a3..0ae72351 100644
--- a/docs/pages.rst
+++ b/docs/pages.rst
@@ -14,61 +14,31 @@ Top-level index
The root page of any Datasette installation is an index page that lists all of the currently attached databases. Some examples:
* `fivethirtyeight.datasettes.com <https://fivethirtyeight.datasettes.com/>`_
+* `global-power-plants.datasettes.com <https://global-power-plants.datasettes.com/>`_
* `register-of-members-interests.datasettes.com <https://register-of-members-interests.datasettes.com/>`_
Add ``/.json`` to the end of the URL for the JSON version of the underlying data:
* `fivethirtyeight.datasettes.com/.json <https://fivethirtyeight.datasettes.com/.json>`_
+* `global-power-plants.datasettes.com/.json <https://global-power-plants.datasettes.com/.json>`_
* `register-of-members-interests.datasettes.com/.json <https://register-of-members-interests.datasettes.com/.json>`_
-The index page can also be accessed at ``/-/``, useful for if the default index page has been replaced using an :ref:`index.html custom template `. The ``/-/`` page will always render the default Datasette ``index.html`` template.
-
.. _DatabaseView:
Database
========
-Each database has a page listing the tables, views and canned queries available for that database. If the :ref:`actions_execute_sql` permission is enabled (it's on by default) there will also be an interface for executing arbitrary SQL select queries against the data.
+Each database has a page listing the tables, views and canned queries available for that database. If the :ref:`permissions_execute_sql` permission is enabled (it's on by default) there will also be an interface for executing arbitrary SQL select queries against the data.
Examples:
* `fivethirtyeight.datasettes.com/fivethirtyeight <https://fivethirtyeight.datasettes.com/fivethirtyeight>`_
-* `datasette.io/global-power-plants `_
+* `global-power-plants.datasettes.com/global-power-plants <https://global-power-plants.datasettes.com/global-power-plants>`_
The JSON version of this page provides programmatic access to the underlying data:
* `fivethirtyeight.datasettes.com/fivethirtyeight.json <https://fivethirtyeight.datasettes.com/fivethirtyeight.json>`_
-* `datasette.io/global-power-plants.json `_
-
-.. _DatabaseView_hidden:
-
-Hidden tables
--------------
-
-Some tables listed on the database page are treated as hidden. Hidden tables are not completely invisible - they can be accessed through the "hidden tables" link at the bottom of the page. They are hidden because they represent low-level implementation details which are generally not useful to end-users of Datasette.
-
-The following tables are hidden by default:
-
-- Any table with a name that starts with an underscore - this is a Datasette convention to help plugins easily hide their own internal tables.
-- Tables that have been configured as ``"hidden": true`` using :ref:`metadata_hiding_tables`.
-- ``*_fts`` tables that implement SQLite full-text search indexes.
-- Tables relating to the inner workings of the SpatiaLite SQLite extension.
-- ``sqlite_stat`` tables used to store statistics used by the query optimizer.
-
-.. _QueryView:
-
-Queries
-=======
-
-The ``/database-name/-/query`` page can be used to execute an arbitrary SQL query against that database, if the :ref:`actions_execute_sql` permission is enabled. This query is passed as the ``?sql=`` query string parameter.
-
-This means you can link directly to a query by constructing the following URL:
-
-``/database-name/-/query?sql=SELECT+*+FROM+table_name``
-
-Each configured :ref:`canned query ` has its own page, at ``/database-name/query-name``. Viewing this page will execute the query and display the results.
-
-In both cases adding a ``.json`` extension to the URL will return the results as JSON.
+* `global-power-plants.datasettes.com/global-power-plants.json <https://global-power-plants.datasettes.com/global-power-plants.json>`_
.. _TableView:
@@ -87,7 +57,7 @@ Some examples:
* `../items `_ lists all of the line-items registered by UK MPs as potential conflicts of interest. It demonstrates Datasette's support for :ref:`full_text_search`.
* `../antiquities-act%2Factions_under_antiquities_act `_ is an interface for exploring the "actions under the antiquities act" data table published by FiveThirtyEight.
-* `../global-power-plants?country_long=United+Kingdom&primary_fuel=Gas `_ is a filtered table page showing every Gas power plant in the United Kingdom. It includes some default facets (configured using `its metadata.json `_) and uses the `datasette-cluster-map `_ plugin to show a map of the results.
+* `../global-power-plants?country_long=United+Kingdom&primary_fuel=Gas `_ is a filtered table page showing every Gas power plant in the United Kingdom. It includes some default facets (configured using `its metadata.json `_) and uses the `datasette-cluster-map `_ plugin to show a map of the results.
.. _RowView:
@@ -100,10 +70,10 @@ Table cells with extremely long text contents are truncated on the table view ac
Rows which are the targets of foreign key references from other tables will show a link to a filtered search for all records that reference that row. Here's an example from the Registers of Members Interests database:
-`../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001 `_
+`../people/uk.org.publicwhip%2Fperson%2F10001 `_
Note that this URL includes the encoded primary key of the record.
Here's that same page as JSON:
-`../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json `_
+`../people/uk.org.publicwhip%2Fperson%2F10001.json `_
diff --git a/docs/plugin_hooks.rst b/docs/plugin_hooks.rst
index 93f7f476..2c674975 100644
--- a/docs/plugin_hooks.rst
+++ b/docs/plugin_hooks.rst
@@ -57,8 +57,6 @@ arguments and can be called like this::
select random_integer(1, 10);
-``prepare_connection()`` hooks are not called for Datasette's :ref:`internal database `.
-
Examples: `datasette-jellyfish `__, `datasette-jq `__, `datasette-haversine `__, `datasette-rure `__
.. _plugin_hook_prepare_jinja2_environment:
@@ -94,17 +92,10 @@ This function can return an awaitable function if it needs to run any async code
Examples: `datasette-edit-templates `_
-.. _plugin_page_extras:
-
-Page extras
------------
-
-These plugin hooks can be used to affect the way HTML pages for different Datasette interfaces are rendered.
-
.. _plugin_hook_extra_template_vars:
extra_template_vars(template, database, table, columns, view_name, request, datasette)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+--------------------------------------------------------------------------------------
Extra template variables that should be made available in the rendered template context.
@@ -193,7 +184,7 @@ Examples: `datasette-search-all `
@@ -247,7 +238,7 @@ Examples: `datasette-cluster-map `
@@ -279,7 +270,7 @@ you have one:
Note that ``your-plugin`` here should be the hyphenated plugin name - the name that is displayed in the list on the ``/-/plugins`` debug page.
-If your code uses `JavaScript modules `__ you should include the ``"module": True`` key. See :ref:`configuration_reference_css_js` for more details.
+If your code uses `JavaScript modules `__ you should include the ``"module": True`` key. See :ref:`customization_css_and_javascript` for more details.
.. code-block:: python
@@ -297,7 +288,7 @@ Examples: `datasette-cluster-map `` block at the end of the ```` element on the page.
@@ -388,8 +379,8 @@ Examples: `datasette-publish-fly `__ for full details on how to build a CLI command, including how to define arguments and options.
-Note that ``register_commands()`` plugins cannot used with the :ref:`--plugins-dir mechanism ` - they need to be installed into the same virtual environment as Datasette using ``pip install``. Provided it has a ``pyproject.toml`` file (see :ref:`writing_plugins_packaging`) you can run ``pip install`` directly against the directory in which you are developing your plugin like so::
+Note that ``register_commands()`` plugins cannot be used with the :ref:`--plugins-dir mechanism ` - they need to be installed into the same virtual environment as Datasette using ``pip install``. Provided it has a ``setup.py`` file (see :ref:`writing_plugins_packaging`) you can run ``pip install`` directly against the directory in which you are developing your plugin like so::
pip install -e path/to/my/datasette-plugin
@@ -777,129 +759,6 @@ The plugin hook can then be used to register the new facet class like this:
def register_facet_classes():
return [SpecialFacet]
-.. _plugin_register_actions:
-
-register_actions(datasette)
----------------------------
-
-If your plugin needs to register actions that can be checked with Datasette's new resource-based permission system, return a list of those actions from this hook.
-
-Actions define what operations can be performed on resources (like viewing a table, executing SQL, or custom plugin actions).
-
-.. code-block:: python
-
- from datasette import hookimpl
- from datasette.permissions import Action, Resource
-
-
- class DocumentCollectionResource(Resource):
- """A collection of documents."""
-
- name = "document-collection"
- parent_name = None
-
- def __init__(self, collection: str):
- super().__init__(parent=collection, child=None)
-
- @classmethod
- def resources_sql(cls) -> str:
- return """
- SELECT collection_name AS parent, NULL AS child
- FROM document_collections
- """
-
-
- class DocumentResource(Resource):
- """A document in a collection."""
-
- name = "document"
- parent_name = "document-collection"
-
- def __init__(self, collection: str, document: str):
- super().__init__(parent=collection, child=document)
-
- @classmethod
- def resources_sql(cls) -> str:
- return """
- SELECT collection_name AS parent, document_id AS child
- FROM documents
- """
-
-
- @hookimpl
- def register_actions(datasette):
- return [
- Action(
- name="list-documents",
- abbr="ld",
- description="List documents in a collection",
- resource_class=DocumentCollectionResource,
- ),
- Action(
- name="view-document",
- abbr="vdoc",
- description="View document",
- resource_class=DocumentResource,
- ),
- Action(
- name="edit-document",
- abbr="edoc",
- description="Edit document",
- resource_class=DocumentResource,
- ),
- ]
-
-The fields of the ``Action`` dataclass are as follows:
-
-``name`` - string
- The name of the action, e.g. ``view-document``. This should be unique across all plugins.
-
-``abbr`` - string or None
- An abbreviation of the action, e.g. ``vdoc``. This is optional. Since this needs to be unique across all installed plugins it's best to choose carefully or omit it entirely (same as setting it to ``None``.)
-
-``description`` - string or None
- A human-readable description of what the action allows you to do.
-
-``resource_class`` - type[Resource] or None
- The Resource subclass that defines what kind of resource this action applies to. Omit this (or set to ``None``) for global actions that apply only at the instance level with no associated resources (like ``debug-menu`` or ``permissions-debug``). Your Resource subclass must:
-
- - Define a ``name`` class attribute (e.g., ``"document"``)
- - Define a ``parent_class`` class attribute (``None`` for top-level resources like databases, or the parent ``Resource`` subclass for child resources)
- - Implement a ``resources_sql()`` classmethod that returns SQL returning all resources as ``(parent, child)`` columns
- - Have an ``__init__`` method that accepts appropriate parameters and calls ``super().__init__(parent=..., child=...)``
-
-The ``resources_sql()`` method
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-The ``resources_sql()`` classmethod returns a SQL query that lists all resources of that type that exist in the system.
-
-This query is used by Datasette to efficiently check permissions across multiple resources at once. When a user requests a list of resources (like tables, documents, or other entities), Datasette uses this SQL to:
-
-1. Get all resources of this type from your data catalog
-2. Combine it with permission rules from the ``permission_resources_sql`` hook
-3. Use SQL joins and filtering to determine which resources the actor can access
-4. Return only the permitted resources
-
-The SQL query **must** return exactly two columns:
-
-- ``parent`` - The parent identifier (e.g., database name, collection name), or ``NULL`` for top-level resources
-- ``child`` - The child identifier (e.g., table name, document ID), or ``NULL`` for parent-only resources
-
-For example, if you're building a document management plugin with collections and documents stored in a ``documents`` table, your ``resources_sql()`` might look like:
-
-.. code-block:: python
-
- @classmethod
- def resources_sql(cls) -> str:
- return """
- SELECT collection_name AS parent, document_id AS child
- FROM documents
- """
-
-This tells Datasette "here's how to find all documents in the system - look in the documents table and get the collection name and document ID for each one."
-
-The permission system then uses this query along with rules from plugins to determine which documents each user can access, all efficiently in SQL rather than loading everything into Python.
-
.. _plugin_asgi_wrapper:
asgi_wrapper(datasette)
@@ -959,9 +818,7 @@ Examples: `datasette-cors `__, `dat
startup(datasette)
------------------
-This hook fires when the Datasette application server first starts up.
-
-Here is an example that validates required plugin configuration. The server will fail to start and show an error if the validation check fails:
+This hook fires when the Datasette application server first starts up. You can implement a regular function, for example to validate required plugin configuration:
.. code-block:: python
@@ -972,7 +829,7 @@ Here is an example that validates required plugin configuration. The server will
"required-setting" in config
), "my-plugin requires setting required-setting"
-You can also return an async function, which will be awaited on startup. Use this option if you need to execute any database queries, for example this function which creates the ``my_table`` database table if it does not yet exist:
+Or you can return an async function which will be awaited on startup. Use this option if you need to make any database queries:
.. code-block:: python
@@ -993,7 +850,7 @@ Potential use-cases:
* Run some initialization code for the plugin
* Create database tables that a plugin needs on startup
-* Validate the configuration for a plugin on startup, and raise an error if it is invalid
+* Validate the metadata configuration for a plugin on startup, and raise an error if it is invalid
.. note::
@@ -1102,7 +959,7 @@ actor_from_request(datasette, request)
This is part of Datasette's :ref:`authentication and permissions system `. The function should attempt to authenticate an actor (either a user or an API actor of some sort) based on information in the request.
-If it cannot authenticate an actor, it should return ``None``, otherwise it should return a dictionary representing that actor. Once a plugin has returned an actor from this hook other plugins will be ignored.
+If it cannot authenticate an actor, it should return ``None``. Otherwise it should return a dictionary representing that actor.
Here's an example that authenticates the actor based on an incoming API key:
@@ -1126,7 +983,7 @@ Here's an example that authenticates the actor based on an incoming API key:
If you install this in your plugins directory you can test it like this::
- curl -H 'Authorization: Bearer this-is-a-secret' http://localhost:8003/-/actor.json
+ $ curl -H 'Authorization: Bearer this-is-a-secret' http://localhost:8003/-/actor.json
Instead of returning a dictionary, this function can return an awaitable function which itself returns either ``None`` or a dictionary. This is useful for authentication functions that need to make a database query - for example:
@@ -1153,108 +1010,7 @@ Instead of returning a dictionary, this function can return an awaitable functio
return inner
-Examples: `datasette-auth-tokens `_, `datasette-auth-passwords `_
-
-.. _plugin_hook_actors_from_ids:
-
-actors_from_ids(datasette, actor_ids)
--------------------------------------
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
-
-``actor_ids`` - list of strings or integers
- The actor IDs to look up.
-
-The hook must return a dictionary that maps the incoming actor IDs to their full dictionary representation.
-
-Some plugins that implement social features may store the ID of the :ref:`actor ` that performed an action - added a comment, bookmarked a table or similar - and then need a way to resolve those IDs into display-friendly actor dictionaries later on.
-
-The :ref:`await datasette.actors_from_ids(actor_ids) ` internal method can be used to look up actors from their IDs. It will dispatch to the first plugin that implements this hook.
-
-Unlike other plugin hooks, this only uses the first implementation of the hook to return a result. You can expect users to only have a single plugin installed that implements this hook.
-
-If no plugin is installed, Datasette defaults to returning actors that are just ``{"id": actor_id}``.
-
-The hook can return a dictionary or an awaitable function that then returns a dictionary.
-
-This example implementation returns actors from a database table:
-
-.. code-block:: python
-
- from datasette import hookimpl
-
-
- @hookimpl
- def actors_from_ids(datasette, actor_ids):
- db = datasette.get_database("actors")
-
- async def inner():
- sql = "select id, name from actors where id in ({})".format(
- ", ".join("?" for _ in actor_ids)
- )
- actors = {}
- for row in (await db.execute(sql, actor_ids)).rows:
- actor = dict(row)
- actors[actor["id"]] = actor
- return actors
-
- return inner
-
-The returned dictionary from this example looks like this:
-
-.. code-block:: json
-
- {
- "1": {"id": "1", "name": "Tony"},
- "2": {"id": "2", "name": "Tina"},
- }
-
-These IDs could be integers or strings, depending on how the actors used by the Datasette instance are configured.
-
-Example: `datasette-remote-actors `_
-
-.. _plugin_hook_jinja2_environment_from_request:
-
-jinja2_environment_from_request(datasette, request, env)
---------------------------------------------------------
-
-``datasette`` - :ref:`internals_datasette`
- A Datasette instance.
-
-``request`` - :ref:`internals_request` or ``None``
- The current HTTP request, if one is available.
-
-``env`` - ``Environment``
- The Jinja2 environment that will be used to render the current page.
-
-This hook can be used to return a customized `Jinja environment `__ based on the incoming request.
-
-If you want to run a single Datasette instance that serves different content for different domains, you can do so like this:
-
-.. code-block:: python
-
- from datasette import hookimpl
- from jinja2 import ChoiceLoader, FileSystemLoader
-
-
- @hookimpl
- def jinja2_environment_from_request(request, env):
- if request and request.host == "www.niche-museums.com":
- return env.overlay(
- loader=ChoiceLoader(
- [
- FileSystemLoader(
- "/mnt/niche-museums/templates"
- ),
- env.loader,
- ]
- ),
- enable_async=True,
- )
- return env
-
-This uses the Jinja `overlay() method `__ to create a new environment identical to the default environment except for having a different template loader, which first looks in the ``/mnt/niche-museums/templates`` directory before falling back on the default loader.
+Example: `datasette-auth-tokens `_
.. _plugin_hook_filters_from_request:
@@ -1366,206 +1122,19 @@ Here's an example that allows users to view the ``admin_log`` table only if thei
if not actor:
return False
user_id = actor["id"]
- result = await datasette.get_database(
+ return await datasette.get_database(
"staff"
).execute(
"select count(*) from admin_users where user_id = :user_id",
{"user_id": user_id},
)
- return result.first()[0] > 0
return inner
-See :ref:`built-in permissions ` for a full list of permissions that are included in Datasette core.
+See :ref:`built-in permissions ` for a full list of permissions that are included in Datasette core.
Example: `datasette-permissions-sql `_
-.. _plugin_hook_permission_resources_sql:
-
-permission_resources_sql(datasette, actor, action)
---------------------------------------------------
-
-``datasette`` - :ref:`internals_datasette`
- Access to the Datasette instance.
-
-``actor`` - dictionary or None
- The current actor dictionary. ``None`` for anonymous requests.
-
-``action`` - string
- The permission action being evaluated. Examples include ``"view-table"`` or ``"insert-row"``.
-
-Return value
- A :class:`datasette.permissions.PermissionSQL` object, ``None`` or an iterable of ``PermissionSQL`` objects.
-
-Datasette's action-based permission resolver calls this hook to gather SQL rows describing which
-resources an actor may access (``allow = 1``) or should be denied (``allow = 0``) for a specific action.
-Each SQL snippet should return ``parent``, ``child``, ``allow`` and ``reason`` columns.
-
-**Parameter naming convention:** Plugin parameters in ``PermissionSQL.params`` should use unique names
-to avoid conflicts with other plugins. The recommended convention is to prefix parameters with your
-plugin's source name (e.g., ``myplugin_user_id``). The system reserves these parameter names:
-``:actor``, ``:actor_id``, ``:action``, and ``:filter_parent``.
-
-You can also use return ``PermissionSQL.allow(reason="reason goes here")`` or ``PermissionSQL.deny(reason="reason goes here")`` as shortcuts for simple root-level allow or deny rules. These will create SQL snippets that look like this:
-
-.. code-block:: sql
-
- SELECT
- NULL AS parent,
- NULL AS child,
- 1 AS allow,
- 'reason goes here' AS reason
-
-Or ``0 AS allow`` for denies.
-
-Permission plugin examples
-~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-These snippets show how to use the new ``permission_resources_sql`` hook to
-contribute rows to the action-based permission resolver. Each hook receives the
-current actor dictionary (or ``None``) and must return ``None`` or an instance or list of
-``datasette.permissions.PermissionSQL`` (or a coroutine that resolves to that).
-
-Allow Alice to view a specific table
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-This plugin grants the actor with ``id == "alice"`` permission to perform the
-``view-table`` action against the ``sales`` table inside the ``accounting`` database.
-
-.. code-block:: python
-
- from datasette import hookimpl
- from datasette.permissions import PermissionSQL
-
-
- @hookimpl
- def permission_resources_sql(datasette, actor, action):
- if action != "view-table":
- return None
- if not actor or actor.get("id") != "alice":
- return None
-
- return PermissionSQL(
- sql="""
- SELECT
- 'accounting' AS parent,
- 'sales' AS child,
- 1 AS allow,
- 'alice can view accounting/sales' AS reason
- """,
- )
-
-Restrict execute-sql to a database prefix
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Only allow ``execute-sql`` against databases whose name begins with
-``analytics_``. This shows how to use parameters that the permission resolver
-will pass through to the SQL snippet.
-
-.. code-block:: python
-
- from datasette import hookimpl
- from datasette.permissions import PermissionSQL
-
-
- @hookimpl
- def permission_resources_sql(datasette, actor, action):
- if action != "execute-sql":
- return None
-
- return PermissionSQL(
- sql="""
- SELECT
- parent,
- NULL AS child,
- 1 AS allow,
- 'execute-sql allowed for analytics_*' AS reason
- FROM catalog_databases
- WHERE database_name LIKE :analytics_prefix
- """,
- params={
- "analytics_prefix": "analytics_%",
- },
- )
-
-Read permissions from a custom table
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-This example stores grants in an internal table called ``permission_grants``
-with columns ``(actor_id, action, parent, child, allow, reason)``.
-
-.. code-block:: python
-
- from datasette import hookimpl
- from datasette.permissions import PermissionSQL
-
-
- @hookimpl
- def permission_resources_sql(datasette, actor, action):
- if not actor:
- return None
-
- return PermissionSQL(
- sql="""
- SELECT
- parent,
- child,
- allow,
- COALESCE(reason, 'permission_grants table') AS reason
- FROM permission_grants
- WHERE actor_id = :grants_actor_id
- AND action = :grants_action
- """,
- params={
- "grants_actor_id": actor.get("id"),
- "grants_action": action,
- },
- )
-
-Default deny with an exception
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Combine a root-level deny with a specific table allow for trusted users.
-The resolver will automatically apply the most specific rule.
-
-.. code-block:: python
-
- from datasette import hookimpl
- from datasette.permissions import PermissionSQL
-
-
- TRUSTED = {"alice", "bob"}
-
-
- @hookimpl
- def permission_resources_sql(datasette, actor, action):
- if action != "view-table":
- return None
-
- actor_id = (actor or {}).get("id")
-
- if actor_id not in TRUSTED:
- return PermissionSQL(
- sql="""
- SELECT NULL AS parent, NULL AS child, 0 AS allow,
- 'default deny view-table' AS reason
- """,
- )
-
- return PermissionSQL(
- sql="""
- SELECT NULL AS parent, NULL AS child, 0 AS allow,
- 'default deny view-table' AS reason
- UNION ALL
- SELECT 'reports' AS parent, 'daily_metrics' AS child, 1 AS allow,
- 'trusted user access' AS reason
- """,
- )
-
-The ``UNION ALL`` ensures the deny rule is always present, while the second row
-adds the exception for trusted users.
-
.. _plugin_hook_register_magic_parameters:
register_magic_parameters(datasette)
@@ -1580,11 +1149,10 @@ Magic parameters all take this format: ``_prefix_rest_of_parameter``. The prefix
To register a new function, return it as a tuple of ``(string prefix, function)`` from this hook. The function you register should take two arguments: ``key`` and ``request``, where ``key`` is the ``rest_of_parameter`` portion of the parameter and ``request`` is the current :ref:`internals_request`.
-This example registers two new magic parameters: ``:_request_http_version`` returning the HTTP version of the current request, and ``:_uuid_new`` which returns a new UUID. It also registers an ``:_asynclookup_key`` parameter, demonstrating that these functions can be asynchronous:
+This example registers two new magic parameters: ``:_request_http_version`` returning the HTTP version of the current request, and ``:_uuid_new`` which returns a new UUID:
.. code-block:: python
- from datasette import hookimpl
from uuid import uuid4
@@ -1602,16 +1170,11 @@ This example registers two new magic parameters: ``:_request_http_version`` retu
raise KeyError
- async def asynclookup(key, request):
- return await do_something_async(key)
-
-
@hookimpl
def register_magic_parameters(datasette):
return [
("request", request),
("uuid", uuid),
- ("asynclookup", asynclookup),
]
.. _plugin_hook_forbidden:
@@ -1707,32 +1270,6 @@ This example logs an error to `Sentry `__ and then renders a
Example: `datasette-sentry `_
-.. _plugin_hook_skip_csrf:
-
-skip_csrf(datasette, scope)
----------------------------
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
-
-``scope`` - dictionary
- The `ASGI scope `__ for the incoming HTTP request.
-
-This hook can be used to skip :ref:`internals_csrf` for a specific incoming request. For example, you might have a custom path at ``/submit-comment`` which is designed to accept comments from anywhere, whether or not the incoming request originated on the site and has an accompanying CSRF token.
-
-This example will disable CSRF protection for that specific URL path:
-
-.. code-block:: python
-
- from datasette import hookimpl
-
-
- @hookimpl
- def skip_csrf(scope):
- return scope["path"] == "/submit-comment"
-
-If any of the currently active ``skip_csrf()`` plugin hooks return ``True``, CSRF protection will be skipped for the request.
-
.. _plugin_hook_menu_links:
menu_links(datasette, actor, request)
@@ -1776,21 +1313,10 @@ Using :ref:`internals_datasette_urls` here ensures that links in the menu will t
Examples: `datasette-search-all `_, `datasette-graphql `_
-.. _plugin_actions:
-
-Action hooks
-------------
-
-Action hooks can be used to add items to the action menus that appear at the top of different pages within Datasette. Unlike :ref:`menu_links() <plugin_hook_menu_links>`, which adds items to a menu displayed on every page, action hooks should only return items relevant to the page the user is currently viewing.
-
-Each of these hooks should return a list of ``{"href": "...", "label": "..."}`` menu items, with optional ``"description": "..."`` keys describing each action in more detail.
-
-They can alternatively return an ``async def`` awaitable function which, when called, returns a list of those menu items.
-
.. _plugin_hook_table_actions:
table_actions(datasette, actor, database, table, request)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+---------------------------------------------------------
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
@@ -1807,6 +1333,10 @@ table_actions(datasette, actor, database, table, request)
``request`` - :ref:`internals_request` or None
The current HTTP request. This can be ``None`` if the request object is not available.
+This hook allows table actions to be displayed in a menu accessed via an action icon at the top of the table page. It should return a list of ``{"href": "...", "label": "..."}`` menu items.
+
+It can alternatively return an ``async def`` awaitable function which returns a list of menu items.
+
This example adds a new table action if the signed in user is ``"root"``:
.. code-block:: python
@@ -1825,142 +1355,15 @@ This example adds a new table action if the signed in user is ``"root"``:
)
),
"label": "Edit schema for this table",
- "description": "Add, remove, rename or alter columns for this table.",
}
]
Example: `datasette-graphql `_
-.. _plugin_hook_view_actions:
-
-view_actions(datasette, actor, database, view, request)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
-
-``actor`` - dictionary or None
- The currently authenticated :ref:`actor `.
-
-``database`` - string
- The name of the database.
-
-``view`` - string
- The name of the SQL view.
-
-``request`` - :ref:`internals_request` or None
- The current HTTP request. This can be ``None`` if the request object is not available.
-
-Like :ref:`plugin_hook_table_actions` but for SQL views.
-
-.. _plugin_hook_query_actions:
-
-query_actions(datasette, actor, database, query_name, request, sql, params)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
-
-``actor`` - dictionary or None
- The currently authenticated :ref:`actor `.
-
-``database`` - string
- The name of the database.
-
-``query_name`` - string or None
- The name of the canned query, or ``None`` if this is an arbitrary SQL query.
-
-``request`` - :ref:`internals_request`
- The current HTTP request.
-
-``sql`` - string
- The SQL query being executed
-
-``params`` - dictionary
- The parameters passed to the SQL query, if any.
-
-Populates a "Query actions" menu on the canned query and arbitrary SQL query pages.
-
-This example adds a new query action linking to a page for explaining a query:
-
-.. code-block:: python
-
- from datasette import hookimpl
- import urllib
-
-
- @hookimpl
- def query_actions(datasette, database, query_name, sql):
- # Don't explain an explain
- if sql.lower().startswith("explain"):
- return
- return [
- {
- "href": datasette.urls.database(database)
- + "?"
- + urllib.parse.urlencode(
- {
- "sql": "explain " + sql,
- }
- ),
- "label": "Explain this query",
- "description": "Get a summary of how SQLite executes the query",
- },
- ]
-
-Example: `datasette-create-view `_
-
-.. _plugin_hook_row_actions:
-
-row_actions(datasette, actor, request, database, table, row)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
-
-``actor`` - dictionary or None
- The currently authenticated :ref:`actor `.
-
-``request`` - :ref:`internals_request` or None
- The current HTTP request.
-
-``database`` - string
- The name of the database.
-
-``table`` - string
- The name of the table.
-
-``row`` - ``sqlite3.Row``
- The SQLite row object being displayed on the page.
-
-Return links for the "Row actions" menu shown at the top of the row page.
-
-This example displays the row in JSON plus some additional debug information if the user is signed in:
-
-.. code-block:: python
-
- from datasette import hookimpl
- import json
-
-
- @hookimpl
- def row_actions(datasette, database, table, actor, row):
- if actor:
- return [
- {
- "href": datasette.urls.instance(),
- "label": f"Row details for {actor['id']}",
- "description": json.dumps(
- dict(row), default=repr
- ),
- },
- ]
-
-Example: `datasette-enrichments `_
-
.. _plugin_hook_database_actions:
database_actions(datasette, actor, database, request)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+-----------------------------------------------------
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
@@ -1974,339 +1377,76 @@ database_actions(datasette, actor, database, request)
``request`` - :ref:`internals_request`
The current HTTP request.
-Populates an actions menu on the database page.
+This hook is similar to :ref:`plugin_hook_table_actions` but populates an actions menu on the database page.
-This example adds a new database action for creating a table, if the user has the ``edit-schema`` permission:
+Example: `datasette-graphql `_
-.. code-block:: python
+.. _plugin_hook_skip_csrf:
- from datasette import hookimpl
-
-
- @hookimpl
- def database_actions(datasette, actor, database):
- async def inner():
- if not await datasette.permission_allowed(
- actor,
- "edit-schema",
- resource=database,
- default=False,
- ):
- return []
- return [
- {
- "href": datasette.urls.path(
- "/-/edit-schema/{}/-/create".format(
- database
- )
- ),
- "label": "Create a table",
- }
- ]
-
- return inner
-
-Examples: `datasette-graphql `_, `datasette-edit-schema `_
-
-.. _plugin_hook_homepage_actions:
-
-homepage_actions(datasette, actor, request)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+skip_csrf(datasette, scope)
+---------------------------
``datasette`` - :ref:`internals_datasette`
You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``, or to execute SQL queries.
+``scope`` - dictionary
+ The `ASGI scope `__ for the incoming HTTP request.
+
+This hook can be used to skip :ref:`internals_csrf` for a specific incoming request. For example, you might have a custom path at ``/submit-comment`` which is designed to accept comments from anywhere, whether or not the incoming request originated on the site and has an accompanying CSRF token.
+
+This example will disable CSRF protection for that specific URL path:
+
+.. code-block:: python
+
+ from datasette import hookimpl
+
+
+ @hookimpl
+ def skip_csrf(scope):
+ return scope["path"] == "/submit-comment"
+
+If any of the currently active ``skip_csrf()`` plugin hooks return ``True``, CSRF protection will be skipped for the request.
+
+.. _plugin_hook_get_metadata:
+
+get_metadata(datasette, key, database, table)
+---------------------------------------------
+
+``datasette`` - :ref:`internals_datasette`
+ You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
+
``actor`` - dictionary or None
The currently authenticated :ref:`actor `.
-``request`` - :ref:`internals_request`
- The current HTTP request.
+``database`` - string or None
+ The name of the database metadata is being asked for.
-Populates an actions menu on the top-level index homepage of the Datasette instance.
-
-This example adds a link to an imagined tool for editing the homepage, only for signed-in users:
-
-.. code-block:: python
-
- from datasette import hookimpl
-
-
- @hookimpl
- def homepage_actions(datasette, actor):
- if actor:
- return [
- {
- "href": datasette.urls.path(
- "/-/customize-homepage"
- ),
- "label": "Customize homepage",
- }
- ]
-
-.. _plugin_hook_slots:
-
-Template slots
---------------
-
-The following set of plugin hooks can be used to return extra HTML content that will be inserted into the corresponding page, directly below the ``<h1>`` heading.
-
-Multiple plugins can contribute content here. The order in which it is displayed can be controlled using Pluggy's `call time order options `__.
-
-Each of these plugin hooks can return either a string or an awaitable function that returns a string.
-
-.. _plugin_hook_top_homepage:
-
-top_homepage(datasette, request)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
-
-``request`` - :ref:`internals_request`
- The current HTTP request.
-
-Returns HTML to be displayed at the top of the Datasette homepage.
-
-.. _plugin_hook_top_database:
-
-top_database(datasette, request, database)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
-
-``request`` - :ref:`internals_request`
- The current HTTP request.
-
-``database`` - string
- The name of the database.
-
-Returns HTML to be displayed at the top of the database page.
-
-.. _plugin_hook_top_table:
-
-top_table(datasette, request, database, table)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
-
-``request`` - :ref:`internals_request`
- The current HTTP request.
-
-``database`` - string
- The name of the database.
-
-``table`` - string
+``table`` - string or None
The name of the table.
-Returns HTML to be displayed at the top of the table page.
+``key`` - string or None
+ The name of the key for which data is being asked.
-.. _plugin_hook_top_row:
+This hook is responsible for returning a dictionary corresponding to Datasette :ref:`metadata`. The function is passed the ``database``, ``table`` and ``key`` that were passed to the upstream internal request for metadata. Regardless of those values, it is important to return a global metadata object, where ``"databases"`` would be a top-level key. The dictionary returned here will be merged with, and overwritten by, the contents of the physical ``metadata.yaml`` if one is present.
-top_row(datasette, request, database, table, row)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
-
-``request`` - :ref:`internals_request`
- The current HTTP request.
-
-``database`` - string
- The name of the database.
-
-``table`` - string
- The name of the table.
-
-``row`` - ``sqlite3.Row``
- The SQLite row object being displayed.
-
-Returns HTML to be displayed at the top of the row page.
-
-.. _plugin_hook_top_query:
-
-top_query(datasette, request, database, sql)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
-
-``request`` - :ref:`internals_request`
- The current HTTP request.
-
-``database`` - string
- The name of the database.
-
-``sql`` - string
- The SQL query.
-
-Returns HTML to be displayed at the top of the query results page.
-
-.. _plugin_hook_top_canned_query:
-
-top_canned_query(datasette, request, database, query_name)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
-
-``request`` - :ref:`internals_request`
- The current HTTP request.
-
-``database`` - string
- The name of the database.
-
-``query_name`` - string
- The name of the canned query.
-
-Returns HTML to be displayed at the top of the canned query page.
-
-.. _plugin_event_tracking:
-
-Event tracking
---------------
-
-Datasette includes an internal mechanism for tracking notable events. This can be used for analytics, but can also be used by plugins that want to listen out for when key events occur (such as a table being created) and take action in response.
-
-Plugins can register to receive events using the ``track_event`` plugin hook.
-
-They can also define their own events for other plugins to receive using the :ref:`register_events() plugin hook `, combined with calls to the :ref:`datasette.track_event() internal method `.
-
-.. _plugin_hook_track_event:
-
-track_event(datasette, event)
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
-
-``event`` - ``Event``
- Information about the event, represented as an instance of a subclass of the ``Event`` base class.
-
-This hook will be called any time an event is tracked by code that calls the :ref:`datasette.track_event(...) ` internal method.
-
-The ``event`` object will always have the following properties:
-
-- ``name``: a string representing the name of the event, for example ``logout`` or ``create-table``.
-- ``actor``: a dictionary representing the actor that triggered the event, or ``None`` if the event was not triggered by an actor.
-- ``created``: a ``datetime.datetime`` object in the ``timezone.utc`` timezone representing the time the event object was created.
-
-Other properties on the event will be available depending on the type of event. You can also access those as a dictionary using ``event.properties()``.
-
-The events fired by Datasette core are :ref:`documented here `.
-
-This example plugin logs details of all events to standard error:
+.. warning::
+ The design of this plugin hook does not currently provide a mechanism for interacting with async code, and may change in the future. See `issue 1384 `__.
.. code-block:: python
- from datasette import hookimpl
- import json
- import sys
-
-
@hookimpl
- def track_event(event):
- name = event.name
- actor = event.actor
- properties = event.properties()
- msg = json.dumps(
- {
- "name": name,
- "actor": actor,
- "properties": properties,
- }
- )
- print(msg, file=sys.stderr, flush=True)
+ def get_metadata(datasette, key, database, table):
+ metadata = {
+ "title": "This will be the Datasette landing page title!",
+ "description": get_instance_description(datasette),
+ "databases": [],
+ }
+ for db_name, db_data_dict in get_my_database_meta(
+ datasette, database, table, key
+ ):
+ metadata["databases"][db_name] = db_data_dict
+ # whatever we return here will be merged with any other plugins using this hook and
+ # will be overwritten by a local metadata.yaml if one exists!
+ return metadata
-The function can also return an async function which will be awaited. This is useful for writing to a database.
-
-This example logs events to a ``datasette_events`` table in a database called ``events``. It uses the :ref:`plugin_hook_startup` hook to create that table if it does not exist.
-
-.. code-block:: python
-
- from datasette import hookimpl
- import json
-
-
- @hookimpl
- def startup(datasette):
- async def inner():
- db = datasette.get_database("events")
- await db.execute_write(
- """
- create table if not exists datasette_events (
- id integer primary key,
- event_type text,
- created text,
- actor text,
- properties text
- )
- """
- )
-
- return inner
-
-
- @hookimpl
- def track_event(datasette, event):
- async def inner():
- db = datasette.get_database("events")
- properties = event.properties()
- await db.execute_write(
- """
- insert into datasette_events (event_type, created, actor, properties)
- values (?, strftime('%Y-%m-%d %H:%M:%S', 'now'), ?, ?)
- """,
- (
- event.name,
- json.dumps(event.actor),
- json.dumps(properties),
- ),
- )
-
- return inner
-
-Example: `datasette-events-db `_
-
-.. _plugin_hook_register_events:
-
-register_events(datasette)
-~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-``datasette`` - :ref:`internals_datasette`
- You can use this to access plugin configuration options via ``datasette.plugin_config(your_plugin_name)``.
-
-This hook should return a list of ``Event`` subclasses that represent custom events that the plugin might send to the :ref:`datasette.track_event() ` method.
-
-This example registers event subclasses for ``ban-user`` and ``unban-user`` events:
-
-.. code-block:: python
-
- from dataclasses import dataclass
- from datasette import hookimpl, Event
-
-
- @dataclass
- class BanUserEvent(Event):
- name = "ban-user"
- user: dict
-
-
- @dataclass
- class UnbanUserEvent(Event):
- name = "unban-user"
- user: dict
-
-
- @hookimpl
- def register_events():
- return [BanUserEvent, UnbanUserEvent]
-
-The plugin can then call ``datasette.track_event(...)`` to send a ``ban-user`` event:
-
-.. code-block:: python
-
- await datasette.track_event(
- BanUserEvent(user={"id": 1, "username": "cleverbot"})
- )
+Example: `datasette-remote-metadata plugin `__
diff --git a/docs/plugins.rst b/docs/plugins.rst
index d5a98923..29078054 100644
--- a/docs/plugins.rst
+++ b/docs/plugins.rst
@@ -25,7 +25,7 @@ Things you can do with plugins include:
* Customize how database values are rendered in the Datasette interface, for example
`datasette-render-binary `__ and
`datasette-pretty-json `__.
-* Customize how Datasette's authentication and permissions systems work, for example `datasette-auth-passwords `__ and
+* Customize how Datasette's authentication and permissions systems work, for example `datasette-auth-tokens `__ and
`datasette-permissions-sql `__.
.. _plugins_installing:
@@ -51,16 +51,7 @@ This command can also be used to upgrade Datasette itself to the latest released
datasette install -U datasette
-You can install multiple plugins at once by listing them as lines in a ``requirements.txt`` file like this::
-
- datasette-vega
- datasette-cluster-map
-
-Then pass that file to ``datasette install -r``::
-
- datasette install -r requirements.txt
-
-The ``install`` and ``uninstall`` commands are thin wrappers around ``pip install`` and ``pip uninstall``, which ensure that they run ``pip`` in the same virtual environment as Datasette itself.
+These commands are thin wrappers around ``pip install`` and ``pip uninstall``, which ensure they run ``pip`` in the same virtual environment as Datasette itself.
One-off plugins using --plugins-dir
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -81,60 +72,6 @@ You can use the name of a package on PyPI or any of the other valid arguments to
datasette publish cloudrun mydb.db \
--install=https://url-to-my-package.zip
-
-.. _plugins_datasette_load_plugins:
-
-Controlling which plugins are loaded
-------------------------------------
-
-Datasette defaults to loading every plugin that is installed in the same virtual environment as Datasette itself.
-
-You can set the ``DATASETTE_LOAD_PLUGINS`` environment variable to a comma-separated list of plugin names to load a controlled subset of plugins instead.
-
-For example, to load just the ``datasette-vega`` and ``datasette-cluster-map`` plugins, set ``DATASETTE_LOAD_PLUGINS`` to ``datasette-vega,datasette-cluster-map``:
-
-.. code-block:: bash
-
- export DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map'
- datasette mydb.db
-
-Or:
-
-.. code-block:: bash
-
- DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map' \
- datasette mydb.db
-
-To disable the loading of all additional plugins, set ``DATASETTE_LOAD_PLUGINS`` to an empty string:
-
-.. code-block:: bash
-
- export DATASETTE_LOAD_PLUGINS=''
- datasette mydb.db
-
-A quick way to test this setting is to use it with the ``datasette plugins`` command:
-
-.. code-block:: bash
-
- DATASETTE_LOAD_PLUGINS='datasette-vega' datasette plugins
-
-This should output the following:
-
-.. code-block:: json
-
- [
- {
- "name": "datasette-vega",
- "static": true,
- "templates": false,
- "version": "0.6.2",
- "hooks": [
- "extra_css_urls",
- "extra_js_urls"
- ]
- }
- ]
-
.. _plugins_installed:
Seeing what plugins are installed
@@ -144,12 +81,7 @@ You can see a list of installed plugins by navigating to the ``/-/plugins`` page
You can also use the ``datasette plugins`` command::
- datasette plugins
-
-Which outputs:
-
-.. code-block:: json
-
+ $ datasette plugins
[
{
"name": "datasette_json_html",
@@ -166,8 +98,7 @@ Which outputs:
cog.out("\n")
result = CliRunner().invoke(cli.cli, ["plugins", "--all"])
# cog.out() with text containing newlines was unindenting for some reason
- cog.outl("If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:\n")
- cog.outl(".. code-block:: json\n")
+ cog.outl("If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette::\n")
plugins = [p for p in json.loads(result.output) if p["name"].startswith("datasette.")]
indented = textwrap.indent(json.dumps(plugins, indent=4), " ")
for line in indented.split("\n"):
@@ -175,9 +106,7 @@ Which outputs:
cog.out("\n\n")
.. ]]]
-If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:
-
-.. code-block:: json
+If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette::
[
{
@@ -198,15 +127,6 @@ If you run ``datasette plugins --all`` it will include default plugins that ship
"register_output_renderer"
]
},
- {
- "name": "datasette.default_actions",
- "static": false,
- "templates": false,
- "version": null,
- "hooks": [
- "register_actions"
- ]
- },
{
"name": "datasette.default_magic_parameters",
"static": false,
@@ -231,19 +151,7 @@ If you run ``datasette plugins --all`` it will include default plugins that ship
"templates": false,
"version": null,
"hooks": [
- "actor_from_request",
- "canned_queries",
- "permission_resources_sql",
- "skip_csrf"
- ]
- },
- {
- "name": "datasette.events",
- "static": false,
- "templates": false,
- "version": null,
- "hooks": [
- "register_events"
+ "permission_allowed"
]
},
{
@@ -316,34 +224,18 @@ If you run ``datasette plugins --all`` it will include default plugins that ship
You can add the ``--plugins-dir=`` option to include any plugins found in that directory.
-Add ``--requirements`` to output a list of installed plugins that can then be installed in another Datasette instance using ``datasette install -r requirements.txt``::
-
- datasette plugins --requirements
-
-The output will look something like this::
-
- datasette-codespaces==0.1.1
- datasette-graphql==2.2
- datasette-json-html==1.0.1
- datasette-pretty-json==0.2.2
- datasette-x-forwarded-host==0.1
-
-To write that to a ``requirements.txt`` file, run this::
-
- datasette plugins --requirements > requirements.txt
-
.. _plugins_configuration:
Plugin configuration
--------------------
-Plugins can have their own configuration, embedded in a :ref:`configuration file `. Configuration options for plugins live within a ``"plugins"`` key in that file, which can be included at the root, database or table level.
+Plugins can have their own configuration, embedded in a :ref:`metadata` file. Configuration options for plugins live within a ``"plugins"`` key in that file, which can be included at the root, database or table level.
Here is an example of some plugin configuration for a specific table:
-.. [[[cog
- from metadata_doc import config_example
- config_example(cog, {
+.. code-block:: json
+
+ {
"databases": {
"sf-trees": {
"tables": {
@@ -358,44 +250,7 @@ Here is an example of some plugin configuration for a specific table:
}
}
}
- })
-.. ]]]
-
-.. tab:: datasette.yaml
-
- .. code-block:: yaml
-
- databases:
- sf-trees:
- tables:
- Street_Tree_List:
- plugins:
- datasette-cluster-map:
- latitude_column: lat
- longitude_column: lng
-
-
-.. tab:: datasette.json
-
- .. code-block:: json
-
- {
- "databases": {
- "sf-trees": {
- "tables": {
- "Street_Tree_List": {
- "plugins": {
- "datasette-cluster-map": {
- "latitude_column": "lat",
- "longitude_column": "lng"
- }
- }
- }
- }
- }
- }
- }
-.. [[[end]]]
+ }
This tells the ``datasette-cluster-map`` plugin which latitude and longitude columns should be used for a table called ``Street_Tree_List`` inside a database file called ``sf-trees.db``.
@@ -404,12 +259,13 @@ This tells the ``datasette-cluster-map`` column which latitude and longitude col
Secret configuration values
~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Some plugins may need configuration that should stay secret - API keys for example. There are two ways in which you can store secret configuration values.
+Any values embedded in ``metadata.json`` will be visible to anyone who views the ``/-/metadata`` page of your Datasette instance. Some plugins may need configuration that should stay secret - API keys for example. There are two ways in which you can store secret configuration values.
**As environment variables**. If your secret lives in an environment variable that is available to the Datasette process, you can indicate that the configuration value should be read from that environment variable like so:
-.. [[[cog
- config_example(cog, {
+.. code-block:: json
+
+ {
"plugins": {
"datasette-auth-github": {
"client_secret": {
@@ -417,38 +273,13 @@ Some plugins may need configuration that should stay secret - API keys for examp
}
}
}
- })
-.. ]]]
-
-.. tab:: datasette.yaml
-
- .. code-block:: yaml
-
- plugins:
- datasette-auth-github:
- client_secret:
- $env: GITHUB_CLIENT_SECRET
-
-
-.. tab:: datasette.json
-
- .. code-block:: json
-
- {
- "plugins": {
- "datasette-auth-github": {
- "client_secret": {
- "$env": "GITHUB_CLIENT_SECRET"
- }
- }
- }
- }
-.. [[[end]]]
+ }
**As values in separate files**. Your secrets can also live in files on disk. To specify a secret should be read from a file, provide the full file path like this:
-.. [[[cog
- config_example(cog, {
+.. code-block:: json
+
+ {
"plugins": {
"datasette-auth-github": {
"client_secret": {
@@ -456,46 +287,21 @@ Some plugins may need configuration that should stay secret - API keys for examp
}
}
}
- })
-.. ]]]
-
-.. tab:: datasette.yaml
-
- .. code-block:: yaml
-
- plugins:
- datasette-auth-github:
- client_secret:
- $file: /secrets/client-secret
-
-
-.. tab:: datasette.json
-
- .. code-block:: json
-
- {
- "plugins": {
- "datasette-auth-github": {
- "client_secret": {
- "$file": "/secrets/client-secret"
- }
- }
- }
- }
-.. [[[end]]]
+ }
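Taken together, the two substitution patterns above can be sketched in a few lines of Python. This is an illustrative re-implementation of the rule as described, not Datasette's actual code, and the function name ``resolve_secret`` is hypothetical:

```python
import os


def resolve_secret(value):
    # Hypothetical sketch of the "$env" / "$file" substitution rule
    # described above -- not Datasette's actual implementation.
    if isinstance(value, dict):
        if "$env" in value:
            # Read the secret from the named environment variable.
            return os.environ.get(value["$env"])
        if "$file" in value:
            # Read the secret from a file on disk.
            with open(value["$file"]) as fp:
                return fp.read().strip()
    # Plain values pass through unchanged.
    return value
```
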
If you are publishing your data using the :ref:`datasette publish ` family of commands, you can use the ``--plugin-secret`` option to set these secrets at publish time. For example, using Heroku you might run the following command::
- datasette publish heroku my_database.db \
+ $ datasette publish heroku my_database.db \
--name my-heroku-app-demo \
--install=datasette-auth-github \
--plugin-secret datasette-auth-github client_id your_client_id \
--plugin-secret datasette-auth-github client_secret your_client_secret
-This will set the necessary environment variables and add the following to the deployed ``metadata.yaml``:
+This will set the necessary environment variables and add the following to the deployed ``metadata.json``:
-.. [[[cog
- config_example(cog, {
+.. code-block:: json
+
+ {
"plugins": {
"datasette-auth-github": {
"client_id": {
@@ -506,35 +312,4 @@ This will set the necessary environment variables and add the following to the d
}
}
}
- })
-.. ]]]
-
-.. tab:: datasette.yaml
-
- .. code-block:: yaml
-
- plugins:
- datasette-auth-github:
- client_id:
- $env: DATASETTE_AUTH_GITHUB_CLIENT_ID
- client_secret:
- $env: DATASETTE_AUTH_GITHUB_CLIENT_SECRET
-
-
-.. tab:: datasette.json
-
- .. code-block:: json
-
- {
- "plugins": {
- "datasette-auth-github": {
- "client_id": {
- "$env": "DATASETTE_AUTH_GITHUB_CLIENT_ID"
- },
- "client_secret": {
- "$env": "DATASETTE_AUTH_GITHUB_CLIENT_SECRET"
- }
- }
- }
- }
-.. [[[end]]]
+ }
diff --git a/docs/publish.rst b/docs/publish.rst
index 87360c32..7ae0399e 100644
--- a/docs/publish.rst
+++ b/docs/publish.rst
@@ -131,7 +131,7 @@ You can also specify plugins you would like to install. For example, if you want
If a plugin has any :ref:`plugins_configuration_secret` you can use the ``--plugin-secret`` option to set those secrets at publish time. For example, using Heroku with `datasette-auth-github `__ you might run the following command::
- datasette publish heroku my_database.db \
+ $ datasette publish heroku my_database.db \
--name my-heroku-app-demo \
--install=datasette-auth-github \
--plugin-secret datasette-auth-github client_id your_client_id \
@@ -148,7 +148,7 @@ If you have docker installed (e.g. using `Docker for Mac 79e1dc9af1c1
diff --git a/docs/settings.rst b/docs/settings.rst
index 5cd49113..8a83cc2f 100644
--- a/docs/settings.rst
+++ b/docs/settings.rst
@@ -11,11 +11,9 @@ Datasette supports a number of settings. These can be set using the ``--setting
You can set multiple settings at once like this::
datasette mydatabase.db \
- --setting default_page_size 50 \
- --setting sql_time_limit_ms 3500 \
- --setting max_returned_rows 2000
-
-Settings can also be specified :ref:`in the database.yaml configuration file `.
+ --setting default_page_size 50 \
+ --setting sql_time_limit_ms 3500 \
+ --setting max_returned_rows 2000
.. _config_dir:
@@ -24,18 +22,17 @@ Configuration directory mode
Normally you configure Datasette using command-line options. For a Datasette instance with custom templates, custom plugins, a static directory and several databases this can get quite verbose::
- datasette one.db two.db \
- --metadata=metadata.json \
- --template-dir=templates/ \
- --plugins-dir=plugins \
- --static css:css
+ $ datasette one.db two.db \
+ --metadata=metadata.json \
+ --template-dir=templates/ \
+ --plugins-dir=plugins \
+ --static css:css
As an alternative to this, you can run Datasette in *configuration directory* mode. Create a directory with the following structure::
# In a directory called my-app:
my-app/one.db
my-app/two.db
- my-app/datasette.yaml
my-app/metadata.json
my-app/templates/index.html
my-app/plugins/my_plugin.py
@@ -43,16 +40,16 @@ As an alternative to this, you can run Datasette in *configuration directory* mo
Now start Datasette by providing the path to that directory::
- datasette my-app/
+ $ datasette my-app/
Datasette will detect the files in that directory and automatically configure itself using them. It will serve all ``*.db`` files that it finds, will load ``metadata.json`` if it exists, and will load the ``templates``, ``plugins`` and ``static`` folders if they are present.
The files that can be included in this directory are as follows. All are optional.
* ``*.db`` (or ``*.sqlite3`` or ``*.sqlite``) - SQLite database files that will be served by Datasette
-* ``datasette.yaml`` - :ref:`configuration` for the Datasette instance
* ``metadata.json`` - :ref:`metadata` for those databases - ``metadata.yaml`` or ``metadata.yml`` can be used as well
* ``inspect-data.json`` - the result of running ``datasette inspect *.db --inspect-file=inspect-data.json`` from the configuration directory - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running
+* ``settings.json`` - settings that would normally be passed using ``--setting`` - here they should be stored as a JSON object of key/value pairs
* ``templates/`` - a directory containing :ref:`customization_custom_templates`
* ``plugins/`` - a directory containing plugins, see :ref:`writing_plugins_one_off`
* ``static/`` - a directory containing static files - these will be served from ``/static/filename.txt``, see :ref:`customization_static_files`
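For example, the ``settings.json`` file mentioned above might look like this, using the same settings shown with ``--setting`` earlier (the specific values are just an illustration):

.. code-block:: json

    {
        "default_page_size": 50,
        "sql_time_limit_ms": 3500,
        "max_returned_rows": 2000
    }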
@@ -69,13 +66,13 @@ default_allow_sql
Should users be able to execute arbitrary SQL queries by default?
-Setting this to ``off`` causes permission checks for :ref:`actions_execute_sql` to fail by default.
+Setting this to ``off`` causes permission checks for :ref:`permissions_execute_sql` to fail by default.
::
datasette mydatabase.db --setting default_allow_sql off
-Another way to achieve this is to add ``"allow_sql": false`` to your ``datasette.yaml`` file, as described in :ref:`authentication_permissions_execute_sql`. This setting offers a more convenient way to do this.
+There are two ways to achieve this. The other is to add ``"allow_sql": false`` to your ``metadata.json`` file, as described in :ref:`authentication_permissions_execute_sql`; this setting offers the more convenient option.
.. _setting_default_page_size:
@@ -114,17 +111,6 @@ You can increase or decrease this limit like so::
datasette mydatabase.db --setting max_returned_rows 2000
-.. _setting_max_insert_rows:
-
-max_insert_rows
-~~~~~~~~~~~~~~~
-
-Maximum rows that can be inserted at a time using the bulk insert API, see :ref:`TableInsertView`. Defaults to 100.
-
-You can increase or decrease this limit like so::
-
- datasette mydatabase.db --setting max_insert_rows 1000
-
.. _setting_num_sql_threads:
num_sql_threads
@@ -198,34 +184,6 @@ Should users be able to download the original SQLite database using a link on th
datasette mydatabase.db --setting allow_download off
-.. _setting_allow_signed_tokens:
-
-allow_signed_tokens
-~~~~~~~~~~~~~~~~~~~
-
-Should users be able to create signed API tokens to access Datasette?
-
-This is turned on by default. Use the following to turn it off::
-
- datasette mydatabase.db --setting allow_signed_tokens off
-
-Turning this setting off will disable the ``/-/create-token`` page, :ref:`described here `. It will also cause any incoming ``Authorization: Bearer dstok_...`` API tokens to be ignored.
-
-.. _setting_max_signed_tokens_ttl:
-
-max_signed_tokens_ttl
-~~~~~~~~~~~~~~~~~~~~~
-
-Maximum allowed expiry time for signed API tokens created by users.
-
-Defaults to ``0`` which means no limit - tokens can be created that will never expire.
-
-Set this to a value in seconds to limit the maximum expiry time. For example, to set that limit to 24 hours you would use::
-
- datasette mydatabase.db --setting max_signed_tokens_ttl 86400
-
-This setting is enforced when incoming tokens are processed.
-
.. _setting_default_cache_ttl:
default_cache_ttl
@@ -356,25 +314,25 @@ Configuring the secret
Datasette uses a secret string to sign secure values such as cookies.
-If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies and :ref:`API tokens ` will not stay valid between restarts.
+If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies will not stay valid between restarts.
You can pass a secret to Datasette in two ways: with the ``--secret`` command-line option or by setting a ``DATASETTE_SECRET`` environment variable.
::
- datasette mydb.db --secret=SECRET_VALUE_HERE
+ $ datasette mydb.db --secret=SECRET_VALUE_HERE
Or::
- export DATASETTE_SECRET=SECRET_VALUE_HERE
- datasette mydb.db
+ $ export DATASETTE_SECRET=SECRET_VALUE_HERE
+ $ datasette mydb.db
One way to generate a secure random secret is to use Python like this::
- python3 -c 'import secrets; print(secrets.token_hex(32))'
+ $ python3 -c 'import secrets; print(secrets.token_hex(32))'
cdb19e94283a20f9d42cca50c5a4871c0aa07392db308755d60a1a5b9bb0fa52
-Plugin authors can make use of this signing mechanism in their plugins using the :ref:`datasette.sign()