Mirror of https://github.com/simonw/datasette.git, synced 2025-12-10 16:51:24 +01:00

Populate docs/ from 0.65.2
parent ec99bb46f8, commit 264bba5db2
26 changed files with 883 additions and 5641 deletions

docs/_templates/base.html (vendored): 31 changed lines
@@ -4,34 +4,3 @@

     {{ super() }}
     <script defer data-domain="docs.datasette.io" src="https://plausible.io/js/plausible.js"></script>
 {% endblock %}
-
-{% block scripts %}
-{{ super() }}
-<script>
-document.addEventListener("DOMContentLoaded", function () {
-  // Show banner linking to /stable/ if this is a /latest/ page
-  if (!/\/latest\//.test(location.pathname)) {
-    return;
-  }
-  var stableUrl = location.pathname.replace("/latest/", "/stable/");
-  // Check it's not a 404
-  fetch(stableUrl, { method: "HEAD" }).then((response) => {
-    if (response.status === 200) {
-      var warning = document.createElement("div");
-      warning.className = "admonition warning";
-      warning.innerHTML = `
-        <p class="first admonition-title">Note</p>
-        <p class="last">
-          This documentation covers the <strong>development version</strong> of Datasette.
-        </p>
-        <p>
-          See <a href="${stableUrl}">this page</a> for the current stable release.
-        </p>
-      `;
-      var mainArticle = document.querySelector("article[role=main]");
-      mainArticle.insertBefore(warning, mainArticle.firstChild);
-    }
-  });
-});
-</script>
-{% endblock %}

(File diff suppressed because it is too large)

@@ -4,6 +4,7 @@

 Changelog
 =========

+.. _v0_65_2:
+
+0.65.2 (2025-11-05)

@@ -14,102 +15,9 @@ Changelog

* Fixed ``datasette publish cloudrun`` to work with changes to the underlying Cloud Run architecture. (:issue:`2511`)
* Minor upgrades to fix warnings, including the ``pkg_resources`` deprecation.

.. _v1_0_a20:

1.0a20 (2025-11-03)
-------------------

This alpha introduces a major breaking change prior to the 1.0 release of Datasette concerning how Datasette's permission system works.

Permission system redesign
~~~~~~~~~~~~~~~~~~~~~~~~~~

Previously the permission system worked using ``datasette.permission_allowed()`` checks, which consulted all available plugins in turn to determine whether a given actor was allowed to perform a given action on a given resource.

This approach could become prohibitively expensive for large lists of items - for example, to determine the list of tables that a user could view in a large Datasette instance, each plugin implementation of that hook would be fired for every table.

The new design uses SQL queries against Datasette's internal :ref:`catalog tables <internals_internal>` to derive the list of resources for which an actor has permission for a given action. This turns an N x M problem (N resources, M plugins) into a single SQL query.

Plugins can use the new :ref:`plugin_hook_permission_resources_sql` hook to return SQL fragments which will be used as part of that query.
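
A rough sketch of the plugin side, for orientation only - the hook name comes from these release notes, but the parameter names and return shape used here are assumptions rather than the documented signature:

.. code-block:: python

    from datasette import hookimpl


    @hookimpl
    def permission_resources_sql(datasette, actor, action):
        # Hypothetical: contribute a SQL fragment granting any signed-in
        # actor the ability to view tables in the "reports" database.
        # The real arguments and return value are defined by the
        # plugin_hook_permission_resources_sql documentation.
        if action != "view-table" or actor is None:
            return None
        return (
            "select database_name, table_name from catalog_tables "
            "where database_name = :db",
            {"db": "reports"},
        )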

Plugins that use any of the following features will need to be updated to work with this and subsequent alphas (and with Datasette 1.0 stable itself):

- Checking permissions with ``datasette.permission_allowed()`` - this method has been replaced with :ref:`datasette.allowed() <datasette_allowed>`.
- Implementing the ``permission_allowed()`` plugin hook - this hook has been removed in favor of :ref:`permission_resources_sql() <plugin_hook_permission_resources_sql>`.
- Using ``register_permissions()`` to register permissions - this hook has been removed in favor of :ref:`register_actions() <plugin_register_actions>`.

Consult the :ref:`v1.0a20 upgrade guide <upgrade_guide_v1_a20>` for further details on how to upgrade affected plugins.

Plugins can now make use of two new internal methods to help resolve permission checks:

- :ref:`datasette.allowed_resources() <datasette_allowed_resources>` returns a ``PaginatedResources`` object with a ``.resources`` list of ``Resource`` instances that an actor is allowed to access for a given action (and a ``.next`` token for pagination).
- :ref:`datasette.allowed_resources_sql() <datasette_allowed_resources_sql>` returns the SQL and parameters that can be executed against the internal catalog tables to determine which resources an actor is allowed to access for a given action. This can be combined with further SQL to perform advanced custom filtering.
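
A minimal sketch of calling the first of these methods from plugin code, based on the contract described above (the exact keyword arguments are assumptions):

.. code-block:: python

    async def tables_viewable_by(datasette, actor):
        # Page through every table this actor is allowed to view,
        # following the .next pagination token until it is exhausted.
        resources = []
        next_token = None
        while True:
            page = await datasette.allowed_resources(
                "view-table", actor, next=next_token
            )
            resources.extend(page.resources)
            next_token = page.next
            if not next_token:
                break
        return resources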

Related changes:

- The way ``datasette --root`` works has changed. Running Datasette with this flag now causes the root actor to pass *all* permission checks. (:issue:`2521`)

- Permission debugging improvements:

  - The ``/-/allowed`` endpoint shows resources the user is allowed to interact with for different actions.
  - ``/-/rules`` shows the raw allow/deny rules that apply to different permission checks.
  - ``/-/actions`` lists every available action.
  - ``/-/check`` can be used to try out different permission checks for the current actor.

Other changes
~~~~~~~~~~~~~

- The internal ``catalog_views`` table now tracks SQLite views alongside tables in the introspection database. (:issue:`2495`)
- Hitting the ``/`` key brings up a search interface for navigating to tables that the current user can view. A new ``/-/tables`` endpoint supports this functionality. (:issue:`2523`)
- Datasette attempts to detect some configuration errors on startup.
- Datasette now supports Python 3.14 and no longer tests against Python 3.9.

.. _v1_0_a19:

1.0a19 (2025-04-21)
-------------------

- Tiny cosmetic bug fix for mobile display of table rows. (:issue:`2479`)

.. _v1_0_a18:

1.0a18 (2025-04-16)
-------------------

- Fix for incorrect foreign key references in the internal database schema. (:issue:`2466`)
- The ``prepare_connection()`` hook no longer runs for the internal database. (:issue:`2468`)
- Fixed bug where ``link:`` HTTP headers used invalid syntax. (:issue:`2470`)
- No longer tested against Python 3.8. Now tests against Python 3.13.
- FTS tables are now hidden by default if they correspond to a content table. (:issue:`2477`)
- Fixed bug with foreign key links to rows in databases with filenames containing a special character. Thanks, `Jack Stratton <https://github.com/phroa>`__. (`#2476 <https://github.com/simonw/datasette/pull/2476>`__)

.. _v1_0_a17:

1.0a17 (2025-02-06)
-------------------

- ``DATASETTE_SSL_KEYFILE`` and ``DATASETTE_SSL_CERTFILE`` environment variables as alternatives to ``--ssl-keyfile`` and ``--ssl-certfile``. Thanks, Alex Garcia. (:issue:`2422`)
- The ``SQLITE_EXTENSIONS`` environment variable has been renamed to ``DATASETTE_LOAD_EXTENSION``. (:issue:`2424`)
- ``datasette serve`` environment variables are now :ref:`documented here <cli_datasette_serve_env>`.
- The :ref:`plugin_hook_register_magic_parameters` plugin hook can now register async functions. (:issue:`2441`)
- Datasette is now tested against Python 3.13.
- Breadcrumbs on database and table pages now include a consistent self-link for resetting query string parameters. (:issue:`2454`)
- Fixed issue where Datasette could crash on ``metadata.json`` with nested values. (:issue:`2455`)
- New internal methods ``datasette.set_actor_cookie()`` and ``datasette.delete_actor_cookie()``, :ref:`described here <authentication_ds_actor>`. (:issue:`1690`)
- ``/-/permissions`` page now shows a list of all permissions registered by plugins. (:issue:`1943`)
- If a table has a single unique text column, Datasette now detects that as the foreign key label for that table. (:issue:`2458`)
- The ``/-/permissions`` page now includes options for filtering or excluding permission checks recorded against the current user. (:issue:`2460`)
- Fixed a bug where replacing a database with a new one with the same name did not pick up the new database correctly. (:issue:`2465`)

.. _v0_65_1:

-0.65.1 (2024-11-28)
+0.65.1 (2024-12-28)
-------------------

- Fixed bug with upgraded HTTPX 0.28.0 dependency. (:issue:`2443`)

@@ -122,66 +30,6 @@ Other changes

- Upgrade for compatibility with Python 3.13 (by vendoring the Pint dependency). (:issue:`2434`)
- Dropped support for Python 3.8.

.. _v1_0_a16:

1.0a16 (2024-09-05)
-------------------

This release focuses on performance, in particular against large tables, and introduces some minor breaking changes for CSS styling in Datasette plugins.

- Removed the unit conversions feature and its dependency, Pint. This means Datasette is now compatible with the upcoming Python 3.13. (:issue:`2400`, :issue:`2320`)
- The ``datasette --pdb`` option now uses the `ipdb <https://github.com/gotcha/ipdb>`__ debugger if it is installed. You can install it using ``datasette install ipdb``. Thanks, `Tiago Ilieve <https://github.com/myhro>`__. (`#2342 <https://github.com/simonw/datasette/pull/2342>`__)
- Fixed a confusing error that occurred if ``metadata.json`` contained nested objects. (:issue:`2403`)
- Fixed a bug with ``?_trace=1`` where it returned a blank page if the response was larger than 256KB. (:issue:`2404`)
- The tracing mechanism now also displays SQL queries that returned errors or ran out of time. `datasette-pretty-traces 0.5 <https://github.com/simonw/datasette-pretty-traces/releases/tag/0.5>`__ includes support for displaying this new type of trace. (:issue:`2405`)
- Fixed a text spacing issue with table descriptions on the homepage. (:issue:`2399`)
- Performance improvements for large tables:

  - Suggested facets now only consider the first 1000 rows. (:issue:`2406`)
  - Improved performance of date facet suggestion against large tables. (:issue:`2407`)
  - Row counts stop at 10,000 rows when listing tables. (:issue:`2398`)
  - On the table page the count stops at 10,000 rows too, with a "count all" button to execute the full count. (:issue:`2408`)

- New ``.dicts()`` internal method on :ref:`database_results` that returns a list of dictionaries representing the results from a SQL query (see the usage sketch after this list): (:issue:`2414`)

  .. code-block:: python

      rows = (await db.execute("select * from t")).dicts()

- Default Datasette core CSS that styles inputs and buttons now requires a class of ``"core"`` on the element or a containing element, for example ``<form class="core">``. (:issue:`2415`)
- Similarly, default table styles now only apply to ``<table class="rows-and-columns">``. (:issue:`2420`)
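
A short sketch of ``.dicts()`` in action against an in-memory database - the setup here is illustrative, though ``add_memory_database()`` and ``execute_write()`` are existing internal APIs:

.. code-block:: python

    import asyncio

    from datasette.app import Datasette


    async def demo():
        ds = Datasette()
        db = ds.add_memory_database("demo")
        await db.execute_write("create table t (id integer primary key, name text)")
        await db.execute_write("insert into t (name) values ('Banana'), ('Cherry')")
        # .dicts() returns a list of dictionaries, one per row
        rows = (await db.execute("select * from t")).dicts()
        print(rows)  # [{'id': 1, 'name': 'Banana'}, {'id': 2, 'name': 'Cherry'}]


    asyncio.run(demo())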

.. _v1_0_a15:

1.0a15 (2024-08-15)
-------------------

- Datasette now defaults to hiding SQLite "shadow" tables, as seen in extensions such as SQLite FTS and `sqlite-vec <https://github.com/asg017/sqlite-vec>`__. Virtual tables that it makes sense to display, such as FTS core tables, are no longer hidden. Thanks, `Alex Garcia <https://github.com/asg017>`__. (:issue:`2296`)
- Fixed bug where running Datasette with one or more ``-s/--setting`` options could over-ride settings that were present in ``datasette.yml``. (:issue:`2389`)
- The Datasette homepage is now duplicated at ``/-/``, using the default ``index.html`` template. This ensures that the information on that page is still accessible even if the Datasette homepage has been customized using a custom ``index.html`` template, for example on sites like `datasette.io <https://datasette.io/>`__. (:issue:`2393`)
- Failed CSRF checks now display a more user-friendly error page. (:issue:`2390`)
- Fixed a bug where the ``json1`` extension was not correctly detected on the ``/-/versions`` page. Thanks, `Seb Bacon <https://github.com/sebbacon>`__. (:issue:`2326`)
- Fixed a bug where the Datasette write API did not correctly accept ``Content-Type: application/json; charset=utf-8``. (:issue:`2384`)
- Fixed a bug where Datasette would fail to start if ``metadata.yml`` contained a ``queries`` block. (`#2386 <https://github.com/simonw/datasette/pull/2386>`__)

.. _v1_0_a14:

1.0a14 (2024-08-05)
-------------------

This alpha introduces significant changes to Datasette's :ref:`metadata` system, some of which represent breaking changes in advance of the full 1.0 release. The new :ref:`upgrade_guide` document provides detailed coverage of those breaking changes and how they affect plugin authors and Datasette API consumers.

- The ``/databasename?sql=`` interface and JSON API for executing arbitrary SQL queries can now be found at ``/databasename/-/query?sql=``. Requests with a ``?sql=`` parameter to the old endpoints will be redirected. Thanks, `Alex Garcia <https://github.com/asg017>`__. (:issue:`2360`)
- Metadata about tables, databases, instances and columns is now stored in :ref:`internals_internal`. Thanks, Alex Garcia. (:issue:`2341`)
- Database write connections now execute using the ``IMMEDIATE`` isolation level for SQLite. This should help avoid a rare ``SQLITE_BUSY`` error that could occur when a transaction upgraded to a write mid-flight. (:issue:`2358`)
- Fix for a bug where canned queries with named parameters could fail against SQLite 3.46. (:issue:`2353`)
- Datasette now serves ``E-Tag`` headers for static files. Thanks, `Agustin Bacigalup <https://github.com/redraw>`__. (`#2306 <https://github.com/simonw/datasette/pull/2306>`__)
- Dropdown menus now use a ``z-index`` that should avoid them being hidden by plugins. (:issue:`2311`)
- Incorrect table and row names are no longer reflected back on the resulting 404 page. (:issue:`2359`)
- Improved documentation for async usage of the :ref:`plugin_hook_track_event` hook. (:issue:`2319`)
- Fixed some HTTPX deprecation warnings. (:issue:`2307`)
- Datasette now serves a ``<html lang="en">`` attribute. Thanks, `Charles Nepote <https://github.com/CharlesNepote>`__. (:issue:`2348`)
- Datasette's automated tests now run against the maximum and minimum supported versions of SQLite: 3.25 (from September 2018) and 3.46 (from May 2024). Thanks, Alex Garcia. (`#2352 <https://github.com/simonw/datasette/pull/2352>`__)
- Fixed an issue where clicking twice on the URL output by ``datasette --root`` produced a confusing error. (:issue:`2375`)

.. _v0_64_8:

0.64.8 (2024-06-21)

@@ -197,180 +45,6 @@ This alpha introduces significant changes to Datasette's :ref:`metadata` system,

- Fixed a bug where canned queries with named parameters threw an error when run against SQLite 3.46.0. (:issue:`2353`)

.. _v1_0_a13:

1.0a13 (2024-03-12)
-------------------

Each of the key concepts in Datasette now has an :ref:`actions menu <plugin_actions>`, which plugins can use to add additional functionality targeting that entity.

- Plugin hook: :ref:`view_actions() <plugin_hook_view_actions>` for actions that can be applied to a SQL view. (:issue:`2297`)
- Plugin hook: :ref:`homepage_actions() <plugin_hook_homepage_actions>` for actions that apply to the instance homepage. (:issue:`2298`)
- Plugin hook: :ref:`row_actions() <plugin_hook_row_actions>` for actions that apply to the row page. (:issue:`2299`)
- Action menu items for all of the ``*_actions()`` plugin hooks can now return an optional ``"description"`` key, which will be displayed in the menu below the action label. (:issue:`2294`)
- The :ref:`plugin hooks <plugin_hooks>` documentation page is now organized with additional headings. (:issue:`2300`)
- Improved the display of action buttons on pages that also display metadata. (:issue:`2286`)
- The header and footer of the page now use a subtle gradient effect, and options in the navigation menu are better visually defined. (:issue:`2302`)
- Table names that start with an underscore now default to hidden. (:issue:`2104`)
- ``pragma_table_list`` has been added to the allow-list of SQLite pragma functions supported by Datasette. ``select * from pragma_table_list()`` is no longer blocked. (`#2104 <https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475>`__)

.. _v1_0_a12:

1.0a12 (2024-02-29)
-------------------

- New :ref:`query_actions() <plugin_hook_query_actions>` plugin hook, similar to :ref:`table_actions() <plugin_hook_table_actions>` and :ref:`database_actions() <plugin_hook_database_actions>`. Can be used to add a menu of actions to the canned query or arbitrary SQL query page. (:issue:`2283`)
- New design for the button that opens the query, table and database actions menu. (:issue:`2281`)
- New "does not contain" table filter for finding rows that do not contain a string. (:issue:`2287`)
- Fixed a bug in the :ref:`javascript_plugins_makeColumnActions` JavaScript plugin mechanism where the column action menu was not fully reset in between each interaction. (:issue:`2289`)

.. _v1_0_a11:

1.0a11 (2024-02-19)
-------------------

- The ``"replace": true`` argument to the ``/db/table/-/insert`` API now requires the actor to have the ``update-row`` permission. (:issue:`2279`)
- Fixed some UI bugs in the interactive permissions debugging tool. (:issue:`2278`)
- The column action menu now aligns better with the cog icon, and positions itself taking into account the width of the browser window. (:issue:`2263`)

.. _v1_0_a10:

1.0a10 (2024-02-17)
-------------------

The only changes in this alpha correspond to the way Datasette handles database transactions. (:issue:`2277`)

- The :ref:`database.execute_write_fn() <database_execute_write_fn>` method has a new ``transaction=True`` parameter. This defaults to ``True``, which means all functions executed using this method are now automatically wrapped in a transaction - previously the functions needed to roll their own transaction handling, and many did not.
- Pass ``transaction=False`` to ``execute_write_fn()`` if you want to manually handle transactions in your function (see the sketch after this list).
- Several internal Datasette features, including parts of the :ref:`JSON write API <json_api_write>`, had been failing to wrap their operations in a transaction. This has been fixed by the new ``transaction=True`` default.
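
A sketch of the two modes, assuming ``db`` is a database object obtained from ``datasette.get_database()`` and a ``logs`` table already exists - the function passed in receives a raw SQLite connection in both cases:

.. code-block:: python

    async def write_examples(db):
        def bulk_insert(conn):
            # With transaction=True (the new default) everything in this
            # function runs inside one transaction - either both inserts
            # commit or neither does.
            conn.execute("insert into logs (msg) values ('one')")
            conn.execute("insert into logs (msg) values ('two')")

        await db.execute_write_fn(bulk_insert)

        def manual(conn):
            # With transaction=False you manage transactions yourself -
            # here via the sqlite3 connection context manager, which
            # commits on success and rolls back on an exception.
            with conn:
                conn.execute("insert into logs (msg) values ('three')")

        await db.execute_write_fn(manual, transaction=False)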

.. _v1_0_a9:

1.0a9 (2024-02-16)
------------------

This alpha release adds basic alter table support to the Datasette Write API and fixes a permissions bug relating to the ``/upsert`` API endpoint.

Alter table support for create, insert, upsert and update
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The :ref:`JSON write API <json_api_write>` can now be used to apply simple alter table schema changes, provided the actor has the new :ref:`actions_alter_table` permission. (:issue:`2101`)

The only alter operation supported so far is adding new columns to an existing table.

* The :ref:`/db/-/create <TableCreateView>` API now adds new columns during large operations to create a table based on incoming example ``"rows"``, in the case where one of the later rows includes columns that were not present in the earlier batches. This requires the ``create-table`` but not the ``alter-table`` permission.
* When ``/db/-/create`` is called with rows in a situation where the table may have been already created, an ``"alter": true`` key can be included to indicate that any missing columns from the new rows should be added to the table. This requires the ``alter-table`` permission.
* :ref:`/db/table/-/insert <TableInsertView>`, :ref:`/db/table/-/upsert <TableUpsertView>` and :ref:`/db/table/row-pks/-/update <RowUpdateView>` all now also accept ``"alter": true``, depending on the ``alter-table`` permission (see the sketch below).
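
A sketch of an ``"alter": true`` insert, exercised here through Datasette's internal ``datasette.client`` test client - the ``ds`` instance, the ``data/docs`` table and the ``token`` variable are placeholders for illustration:

.. code-block:: python

    # "score" is a column that does not yet exist on the hypothetical
    # data/docs table - with "alter": true it will be added, provided
    # the actor behind the token has the alter-table permission.
    response = await ds.client.post(
        "/data/docs/-/insert",
        json={
            "rows": [{"id": 3, "title": "New doc", "score": 0.5}],
            "alter": True,
        },
        headers={"Authorization": "Bearer {}".format(token)},
    )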

Operations that alter a table now fire the new :ref:`alter-table event <events>`.

Permissions fix for the upsert API
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The :ref:`/database/table/-/upsert API <TableUpsertView>` had a minor permissions bug, only affecting Datasette instances that had configured the ``insert-row`` and ``update-row`` permissions to apply to a specific table rather than the database or instance as a whole. Full details in issue :issue:`2262`.

To avoid similar mistakes in the future, the ``datasette.permission_allowed()`` method now specifies ``default=`` as a keyword-only argument.

Permission checks now consider opinions from every plugin
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The ``datasette.permission_allowed()`` method previously consulted every plugin that implemented the :ref:`permission_allowed() <plugin_hook_permission_allowed>` plugin hook and obeyed the opinion of the last plugin to return a value. (:issue:`2275`)

Datasette now consults every plugin and checks to see if any of them returned ``False`` (the veto rule), and if none of them did, it then checks to see if any of them returned ``True``.
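
The resolution logic is simple enough to sketch directly - given the opinions returned by each ``permission_allowed()`` hook implementation (``True``, ``False`` or ``None`` for "no opinion"):

.. code-block:: python

    def resolve_permission(opinions, default=False):
        # Any explicit False is a veto and wins outright
        if any(opinion is False for opinion in opinions):
            return False
        # Otherwise any explicit True grants the permission
        if any(opinion is True for opinion in opinions):
            return True
        # No plugin expressed an opinion - fall back to the default
        return default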

This is explained at length in the new documentation covering :ref:`authentication_permissions_explained`.

Other changes
~~~~~~~~~~~~~

- The new :ref:`DATASETTE_TRACE_PLUGINS=1 environment variable <writing_plugins_tracing>` turns on detailed trace output for every executed plugin hook, useful for debugging and understanding how the plugin system works at a low level. (:issue:`2274`)
- Datasette on Python 3.9 or above marks its non-cryptographic uses of the MD5 hash function as ``usedforsecurity=False``, for compatibility with FIPS systems. (:issue:`2270`)
- SQL relating to :ref:`internals_internal` now executes inside a transaction, avoiding a potential database locked error. (:issue:`2273`)
- The ``/-/threads`` debug page now identifies the database in the name associated with each dedicated write thread. (:issue:`2265`)
- The ``/db/-/create`` API now fires an ``insert-rows`` event if rows were inserted after the table was created. (:issue:`2260`)

.. _v1_0_a8:

1.0a8 (2024-02-07)
------------------

This alpha release continues the migration of Datasette's configuration from ``metadata.yaml`` to the new ``datasette.yaml`` configuration file, introduces a new system for JavaScript plugins and adds several new plugin hooks.

See `Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml <https://simonwillison.net/2024/Feb/7/datasette-1a8/>`__ for an annotated version of these release notes.

Configuration
~~~~~~~~~~~~~

- Plugin configuration now lives in the :ref:`datasette.yaml configuration file <configuration>`, passed to Datasette using the ``-c/--config`` option. Thanks, Alex Garcia. (:issue:`2093`)

  .. code-block:: bash

      datasette -c datasette.yaml

  Where ``datasette.yaml`` contains configuration that looks like this:

  .. code-block:: yaml

      plugins:
        datasette-cluster-map:
          latitude_column: xlat
          longitude_column: xlon

  Previously plugins were configured in ``metadata.yaml``, which was confusing as plugin settings were unrelated to database and table metadata.

- The ``-s/--setting`` option can now be used to set plugin configuration as well. See :ref:`configuration_cli` for details. (:issue:`2252`)

  The above YAML configuration example using ``-s/--setting`` looks like this:

  .. code-block:: bash

      datasette mydatabase.db \
        -s plugins.datasette-cluster-map.latitude_column xlat \
        -s plugins.datasette-cluster-map.longitude_column xlon

- The new ``/-/config`` page shows the current instance configuration, after redacting keys that could contain sensitive data such as API keys or passwords. (:issue:`2254`)

- Existing Datasette installations may already have configuration set in ``metadata.yaml`` that should be migrated to ``datasette.yaml``. To avoid breaking these installations, Datasette will silently treat table configuration, plugin configuration and allow blocks in metadata as if they had been specified in configuration instead. (:issue:`2247`) (:issue:`2248`) (:issue:`2249`)

  Note that the ``datasette publish`` command has not yet been updated to accept a ``datasette.yaml`` configuration file. This will be addressed in :issue:`2195`, but for the moment you can include those settings in ``metadata.yaml`` instead.

JavaScript plugins
~~~~~~~~~~~~~~~~~~

Datasette now includes a :ref:`JavaScript plugins mechanism <javascript_plugins>`, allowing JavaScript to customize Datasette in a way that can collaborate with other plugins.

This provides two initial hooks, with more to come in the future:

- :ref:`makeAboveTablePanelConfigs() <javascript_plugins_makeAboveTablePanelConfigs>` can add additional panels to the top of the table page.
- :ref:`makeColumnActions() <javascript_plugins_makeColumnActions>` can add additional actions to the column menu.

Thanks `Cameron Yick <https://github.com/hydrosquall>`__ for contributing this feature. (`#2052 <https://github.com/simonw/datasette/pull/2052>`__)

Plugin hooks
~~~~~~~~~~~~

- New :ref:`plugin_hook_jinja2_environment_from_request` plugin hook, which can be used to customize the current Jinja environment based on the incoming request. This can be used to modify the template lookup path based on the incoming request hostname, among other things. (:issue:`2225`)
- New :ref:`family of template slot plugin hooks <plugin_hook_slots>`: ``top_homepage``, ``top_database``, ``top_table``, ``top_row``, ``top_query``, ``top_canned_query``. Plugins can use these to provide additional HTML to be injected at the top of the corresponding pages. (:issue:`1191`)
- New :ref:`track_event() mechanism <plugin_event_tracking>` for plugins to emit and receive events when certain events occur within Datasette. (:issue:`2240`)

  - Plugins can register additional event classes using :ref:`plugin_hook_register_events`.
  - They can then trigger those events with the :ref:`datasette.track_event(event) <datasette_track_event>` internal method.
  - Plugins can subscribe to notifications of events using the :ref:`plugin_hook_track_event` plugin hook (see the sketch after this list).
  - Datasette core now emits ``login``, ``logout``, ``create-token``, ``create-table``, ``drop-table``, ``insert-rows``, ``upsert-rows``, ``update-row`` and ``delete-row`` events, :ref:`documented here <events>`.

- New internal function for plugin authors: :ref:`database_execute_isolated_fn`, for creating a new SQLite connection, executing code and then closing that connection, all while preventing other code from writing to that particular database. This connection will not have the :ref:`prepare_connection() <plugin_hook_prepare_connection>` plugin hook executed against it, allowing plugins to perform actions that might otherwise be blocked by existing connection configuration. (:issue:`2218`)
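
A minimal sketch of a plugin subscribing to events with the new hook - the ``event.name`` and ``event.actor`` attributes shown here are assumptions about the event object's shape:

.. code-block:: python

    from datasette import hookimpl


    @hookimpl
    def track_event(datasette, event):
        # Called each time Datasette core (or another plugin) emits an
        # event - for example on login, create-table or insert-rows.
        print("Event:", event.name, "actor:", event.actor)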

Documentation
~~~~~~~~~~~~~

- Documentation describing :ref:`how to write tests that use signed actor cookies <testing_datasette_client>` using ``datasette.client.actor_cookie()``. (:issue:`1830`)
- Documentation on how to :ref:`register a plugin for the duration of a test <testing_plugins_register_in_test>`. (:issue:`2234`)
- The :ref:`configuration documentation <configuration>` now shows examples of both YAML and JSON for each setting.

Minor fixes
~~~~~~~~~~~

- Datasette no longer attempts to run SQL queries in parallel when rendering a table page, as this was leading to some rare crashing bugs. (:issue:`2189`)
- Fixed warning: ``DeprecationWarning: pkg_resources is deprecated as an API`` (:issue:`2057`)
- Fixed bug where the ``?_extra=columns`` parameter returned an incorrectly shaped response. (:issue:`2230`)

.. _v0_64_6:

0.64.6 (2023-12-22)

@@ -385,13 +59,6 @@ Minor fixes

- Dropped dependency on ``click-default-group-wheel``, which could cause a dependency conflict. (:issue:`2197`)

.. _v1_0_a7:

1.0a7 (2023-09-21)
------------------

- Fix for a crashing bug caused by viewing the table page for a named in-memory database. (:issue:`2189`)

.. _v0_64_4:

0.64.4 (2023-09-21)

@@ -399,96 +66,12 @@ Minor fixes

- Fix for a crashing bug caused by viewing the table page for a named in-memory database. (:issue:`2189`)

-.. _v1_0_a6:
+.. _v0_64_3:

-1.0a6 (2023-09-07)
-------------------
+0.64.3 (2023-04-27)
+-------------------

- New plugin hook: :ref:`plugin_hook_actors_from_ids` and an internal method to accompany it, :ref:`datasette_actors_from_ids`. This mechanism is intended to be used by plugins that may need to display the actor who was responsible for something managed by that plugin: they can now resolve the recorded IDs of actors into the full actor objects. (:issue:`2181`)
- ``DATASETTE_LOAD_PLUGINS`` environment variable for :ref:`controlling which plugins <plugins_datasette_load_plugins>` are loaded by Datasette. (:issue:`2164`)
- Datasette now checks if the user has permission to view a table linked to by a foreign key before turning that foreign key into a clickable link. (:issue:`2178`)
- The ``execute-sql`` permission now implies that the actor can also view the database and instance. (:issue:`2169`)
- Documentation describing a pattern for building plugins that themselves :ref:`define further hooks <writing_plugins_extra_hooks>` for other plugins. (:issue:`1765`)
- Datasette is now tested against the Python 3.12 preview. (`#2175 <https://github.com/simonw/datasette/pull/2175>`__)

.. _v1_0_a5:

1.0a5 (2023-08-29)
------------------

- When restrictions are applied to :ref:`API tokens <CreateTokenView>`, those restrictions now behave slightly differently: applying the ``view-table`` restriction will imply the ability to ``view-database`` for the database containing that table, and both ``view-table`` and ``view-database`` will imply ``view-instance``. Previously you needed to create a token with restrictions that explicitly listed ``view-instance`` and ``view-database`` and ``view-table`` in order to view a table without getting a permission denied error. (:issue:`2102`)
- New ``datasette.yaml`` (or ``.json``) configuration file, which can be specified using ``datasette -c path-to-file``. The goal here is to consolidate settings, plugin configuration, permissions, canned queries and other Datasette configuration into a single file, separate from ``metadata.yaml``. The legacy ``settings.json`` config file used for :ref:`config_dir` has been removed, and ``datasette.yaml`` has a ``"settings"`` section where the same settings key/value pairs can be included. In a future alpha release, more configuration such as plugins/permissions/canned queries will be moved to the ``datasette.yaml`` file. See :issue:`2093` for more details. Thanks, Alex Garcia.
- The ``-s/--setting`` option can now take dotted paths to nested settings. These will then be used to set or over-ride the same options as are present in the new configuration file. (:issue:`2156`)
- New ``--actor '{"id": "json-goes-here"}'`` option for use with ``datasette --get`` to treat the simulated request as being made by a specific actor, see :ref:`cli_datasette_get`. (:issue:`2153`)
- The Datasette ``_internal`` database has had some changes. It no longer shows up in the ``datasette.databases`` list by default, and is now instead available to plugins using the ``datasette.get_internal_database()`` method. Plugins are invited to use this as a private database to store configuration, settings and secrets that should not be made visible through the default Datasette interface. Users can pass the new ``--internal internal.db`` option to persist that internal database to disk. Thanks, Alex Garcia. (:issue:`2157`).

.. _v1_0_a4:

1.0a4 (2023-08-21)
------------------

This alpha fixes a security issue with the ``/-/api`` API explorer. On authenticated Datasette instances (instances protected using plugins such as `datasette-auth-passwords <https://datasette.io/plugins/datasette-auth-passwords>`__) the API explorer interface could reveal the names of databases and tables within the protected instance. The data stored in those tables was not revealed.

For more information and workarounds, read `the security advisory <https://github.com/simonw/datasette/security/advisories/GHSA-7ch3-7pp7-7cpq>`__. The issue has been present in every previous alpha version of Datasette 1.0: versions 1.0a0, 1.0a1, 1.0a2 and 1.0a3.

Also in this alpha:

- The new ``datasette plugins --requirements`` option outputs a list of currently installed plugins in Python ``requirements.txt`` format, useful for duplicating that installation elsewhere. (:issue:`2133`)
- :ref:`canned_queries_writable` can now define an ``on_success_message_sql`` field in their configuration, containing a SQL query that should be executed upon successful completion of the write operation in order to generate a message to be shown to the user. (:issue:`2138`)
- The automatically generated border color for a database is now shown in more places around the application. (:issue:`2119`)
- Every instance of example shell script code in the documentation should now include a working copy button, free from additional syntax. (:issue:`2140`)

.. _v1_0_a3:

1.0a3 (2023-08-09)
------------------

This alpha release previews the updated design for Datasette's default JSON API. (:issue:`782`)

The new :ref:`default JSON representation <json_api_default>` for both table pages (``/dbname/table.json``) and arbitrary SQL queries (``/dbname.json?sql=...``) is now shaped like this:

.. code-block:: json

    {
      "ok": true,
      "rows": [
        {
          "id": 3,
          "name": "Detroit"
        },
        {
          "id": 2,
          "name": "Los Angeles"
        },
        {
          "id": 4,
          "name": "Memnonia"
        },
        {
          "id": 1,
          "name": "San Francisco"
        }
      ],
      "truncated": false
    }

Tables will include an additional ``"next"`` key for pagination, which can be passed to ``?_next=`` to fetch the next page of results.
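
A sketch of following that pagination from Python - the instance and table here are placeholders and error handling is omitted:

.. code-block:: python

    import json
    import urllib.parse
    import urllib.request

    base = "https://latest.datasette.io/fixtures/sortable.json"
    next_token = None
    while True:
        url = base
        if next_token:
            url += "?" + urllib.parse.urlencode({"_next": next_token})
        with urllib.request.urlopen(url) as response:
            data = json.load(response)
        for row in data["rows"]:
            print(row)
        next_token = data.get("next")
        if not next_token:
            break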

The various ``?_shape=`` options continue to work as before - see :ref:`json_api_shapes` for details.

A new ``?_extra=`` mechanism is available for tables, but has not yet been stabilized or documented. Details on that are available in :issue:`262`.

Smaller changes
~~~~~~~~~~~~~~~

- Datasette documentation now shows YAML examples for :ref:`metadata` by default, with a tab interface for switching to JSON. (:issue:`1153`)
- :ref:`plugin_register_output_renderer` plugins now have access to ``error`` and ``truncated`` arguments, allowing them to display error messages and take into account truncated results. (:issue:`2130`)
- The ``render_cell()`` plugin hook now also supports an optional ``request`` argument. (:issue:`2007`)
- New ``Justfile`` to support development workflows for Datasette using `Just <https://github.com/casey/just>`__.
- ``datasette.render_template()`` can now accept a ``datasette.views.Context`` subclass as an alternative to a dictionary. (:issue:`2127`)
- ``datasette install -e path`` option for editable installations, useful while developing plugins. (:issue:`2106`)
- When started with the ``--cors`` option Datasette now serves an ``Access-Control-Max-Age: 3600`` header, ensuring CORS OPTIONS requests are repeated no more than once an hour. (:issue:`2079`)
- Fixed a bug where the ``_internal`` database could display ``None`` instead of ``null`` for in-memory databases. (:issue:`1970`)
- Added ``pip`` and ``setuptools`` as explicit dependencies. This fixes a bug where Datasette could not be installed using `Rye <https://github.com/mitsuhiko/rye>`__. (:issue:`2065`)

.. _v0_64_2:

@@ -521,68 +104,9 @@ Smaller changes

0.63.3 (2022-12-17)
-------------------

-- Fixed a bug where ``datasette --root``, when running in Docker, would only output the URL to sign in root when the server shut down, not when it started up. (:issue:`1958`)
+- Fixed a bug where ``datasette --root``, when running in Docker, would only output the URL to sign in as root when the server shut down, not when it started up. (:issue:`1958`)
- You no longer need to ensure ``await datasette.invoke_startup()`` has been called in order for Datasette to start correctly serving requests - this is now handled automatically the first time the server receives a request. This fixes a bug experienced when Datasette is served directly by an ASGI application server such as Uvicorn or Gunicorn. It also fixes a bug with the `datasette-gunicorn <https://datasette.io/plugins/datasette-gunicorn>`__ plugin. (:issue:`1955`)

.. _v1_0_a2:

1.0a2 (2022-12-14)
------------------

The third Datasette 1.0 alpha release adds upsert support to the JSON API, plus the ability to specify finely grained permissions when creating an API token.

See `Datasette 1.0a2: Upserts and finely grained permissions <https://simonwillison.net/2022/Dec/15/datasette-1a2/>`__ for an extended, annotated version of these release notes.

- New ``/db/table/-/upsert`` API, :ref:`documented here <TableUpsertView>`. An upsert is an update-or-insert: existing rows will have specified keys updated, but if no row matches the incoming primary key a brand new row will be inserted instead (see the sketch after this list). (:issue:`1878`)
- New ``register_permissions()`` plugin hook. Plugins can now register named permissions, which will then be listed in various interfaces that show available permissions. (:issue:`1940`)
- The ``/db/-/create`` API for :ref:`creating a table <TableCreateView>` now accepts ``"ignore": true`` and ``"replace": true`` options when called with the ``"rows"`` property that creates a new table based on an example set of rows. This means the API can be called multiple times with different rows, setting rules for what should happen if a primary key collides with an existing row. (:issue:`1927`)
- Arbitrary permissions can now be configured at the instance, database and resource (table, SQL view or canned query) level in Datasette's :ref:`metadata` JSON and YAML files. The new ``"permissions"`` key can be used to specify which actors should have which permissions. See :ref:`authentication_permissions_other` for details. (:issue:`1636`)
- The ``/-/create-token`` page can now be used to create API tokens which are restricted to just a subset of actions, including against specific databases or resources. See :ref:`CreateTokenView` for details. (:issue:`1947`)
- Likewise, the ``datasette create-token`` CLI command can now create tokens with :ref:`a subset of permissions <authentication_cli_create_token_restrict>`. (:issue:`1855`)
- New :ref:`datasette.create_token() API method <datasette_create_token>` for programmatically creating signed API tokens. (:issue:`1951`)
- ``/db/-/create`` API now requires the actor to have ``insert-row`` permission in order to use the ``"row"`` or ``"rows"`` properties. (:issue:`1937`)
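
A sketch of calling the new upsert endpoint, referenced from the first bullet above - the instance URL, table and token are placeholders:

.. code-block:: python

    import json
    import urllib.request

    body = json.dumps(
        {
            # Rows are matched on their primary key: existing rows have
            # these keys updated, unmatched primary keys become new rows.
            "rows": [
                {"id": 1, "title": "Updated title for row 1"},
                {"id": 3, "title": "Inserted if no row has id=3"},
            ]
        }
    ).encode("utf-8")

    request = urllib.request.Request(
        "https://example.com/data/docs/-/upsert",
        data=body,
        headers={
            "Authorization": "Bearer dstok_placeholder",
            "Content-Type": "application/json",
        },
    )
    print(json.load(urllib.request.urlopen(request)))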

.. _v1_0_a1:

1.0a1 (2022-12-01)
------------------

- Write APIs now serve correct CORS headers if Datasette is started in ``--cors`` mode. See the full list of :ref:`CORS headers <json_api>` in the documentation. (:issue:`1922`)
- Fixed a bug where the ``_memory`` database could be written to even though writes were not persisted. (:issue:`1917`)
- The https://latest.datasette.io/ demo instance now includes an ``ephemeral`` database which can be used to test Datasette's write APIs, using the new `datasette-ephemeral-tables <https://datasette.io/plugins/datasette-ephemeral-tables>`_ plugin to drop any created tables after five minutes. This database is only available if you sign in as the root user using the link on the homepage. (:issue:`1915`)
- Fixed a bug where hitting the write endpoints with a ``GET`` request returned a 500 error. It now returns a 405 (method not allowed) error instead. (:issue:`1916`)
- The list of endpoints in the API explorer now lists mutable databases first. (:issue:`1918`)
- The ``"ignore": true`` and ``"replace": true`` options for the insert API are :ref:`now documented <TableInsertView>`. (:issue:`1924`)

.. _v1_0_a0:

1.0a0 (2022-11-29)
------------------

This first alpha release of Datasette 1.0 introduces a brand new collection of APIs for writing to the database (:issue:`1850`), as well as a new API token mechanism baked into Datasette core. Previously, API tokens have only been supported by installing additional plugins.

This is very much a preview: expect many more backwards incompatible API changes prior to the full 1.0 release.

Feedback enthusiastically welcomed, either through `issue comments <https://github.com/simonw/datasette/issues/1850>`__ or via the `Datasette Discord <https://datasette.io/discord>`__ community.

Signed API tokens
~~~~~~~~~~~~~~~~~

- New ``/-/create-token`` page allowing authenticated users to create signed API tokens that can act on their behalf, see :ref:`CreateTokenView`. (:issue:`1852`)
- New ``datasette create-token`` command for creating tokens from the command line: :ref:`authentication_cli_create_token`.
- New :ref:`setting_allow_signed_tokens` setting which can be used to turn off signed token support. (:issue:`1856`)
- New :ref:`setting_max_signed_tokens_ttl` setting for restricting the maximum allowed duration of a signed token. (:issue:`1858`)

Write API
~~~~~~~~~

- New API explorer at ``/-/api`` for trying out the API. (:issue:`1871`)
- ``/db/-/create`` API for :ref:`TableCreateView`. (:issue:`1882`)
- ``/db/table/-/insert`` API for :ref:`TableInsertView`. (:issue:`1851`)
- ``/db/table/-/drop`` API for :ref:`TableDropView`. (:issue:`1874`)
- ``/db/table/pk/-/update`` API for :ref:`RowUpdateView`. (:issue:`1863`)
- ``/db/table/pk/-/delete`` API for :ref:`RowDeleteView`. (:issue:`1864`)

.. _v0_63_2:

0.63.2 (2022-11-18)

@@ -643,11 +167,11 @@ Documentation

.. _v0_62:

0.62 (2022-08-14)
------------------
+-------------------

Datasette can now run entirely in your browser using WebAssembly. Try out `Datasette Lite <https://lite.datasette.io/>`__, take a look `at the code <https://github.com/simonw/datasette-lite>`__ or read more about it in `Datasette Lite: a server-side Python web application running in a browser <https://simonwillison.net/2022/May/4/datasette-lite/>`__.

-Datasette now has a `Discord community <https://datasette.io/discord>`__ for questions and discussions about Datasette and its ecosystem of projects.
+Datasette now has a `Discord community <https://discord.gg/ktd74dm5mw>`__ for questions and discussions about Datasette and its ecosystem of projects.

Features
~~~~~~~~

@@ -709,7 +233,7 @@ Datasette also now requires Python 3.7 or higher.

- Datasette is now covered by a `Code of Conduct <https://github.com/simonw/datasette/blob/main/CODE_OF_CONDUCT.md>`__. (:issue:`1654`)
- Python 3.6 is no longer supported. (:issue:`1577`)
- Tests now run against Python 3.11-dev. (:issue:`1621`)
-- New ``datasette.ensure_permissions(actor, permissions)`` internal method for checking multiple permissions at once. (:issue:`1675`)
+- New :ref:`datasette.ensure_permissions(actor, permissions) <datasette_ensure_permissions>` internal method for checking multiple permissions at once. (:issue:`1675`)
- New :ref:`datasette.check_visibility(actor, action, resource=None) <datasette_check_visibility>` internal method for checking if a user can see a resource that would otherwise be invisible to unauthenticated users. (:issue:`1678`)
- Table and row HTML pages now include a ``<link rel="alternate" type="application/json+datasette" href="...">`` element and return a ``Link: URL; rel="alternate"; type="application/json+datasette"`` HTTP header pointing to the JSON version of those pages. (:issue:`1533`)
- ``Access-Control-Expose-Headers: Link`` is now added to the CORS headers, allowing remote JavaScript to access that header.

@@ -849,7 +373,7 @@ Other small fixes

- New ``datasette --uds /tmp/datasette.sock`` option for binding Datasette to a Unix domain socket, see the :ref:`proxy documentation <deploying_proxy>`. (:issue:`1388`)
- ``"searchmode": "raw"`` table metadata option for defaulting a table to executing SQLite full-text search syntax without first escaping it, see :ref:`full_text_search_advanced_queries`. (:issue:`1389`)
-- New plugin hook: ``get_metadata()``, for returning custom metadata for an instance, database or table. Thanks, Brandon Roberts! (:issue:`1384`)
+- New plugin hook: :ref:`plugin_hook_get_metadata`, for returning custom metadata for an instance, database or table. Thanks, Brandon Roberts! (:issue:`1384`)
- New plugin hook: :ref:`plugin_hook_skip_csrf`, for opting out of CSRF protection based on the incoming request. (:issue:`1377`)
- The :ref:`menu_links() <plugin_hook_menu_links>`, :ref:`table_actions() <plugin_hook_table_actions>` and :ref:`database_actions() <plugin_hook_database_actions>` plugin hooks all gained a new optional ``request`` argument providing access to the current request. (:issue:`1371`)
- Major performance improvement for Datasette faceting. (:issue:`1394`)

@@ -977,7 +501,7 @@ JavaScript modules

To use modules, JavaScript needs to be included in ``<script>`` tags with a ``type="module"`` attribute.

-Datasette now has the ability to output ``<script type="module">`` in places where you may wish to take advantage of modules. The ``extra_js_urls`` option described in :ref:`configuration_reference_css_js` can now be used with modules, and module support is also available for the :ref:`extra_body_script() <plugin_hook_extra_body_script>` plugin hook. (:issue:`1186`, :issue:`1187`)
+Datasette now has the ability to output ``<script type="module">`` in places where you may wish to take advantage of modules. The ``extra_js_urls`` option described in :ref:`customization_css_and_javascript` can now be used with modules, and module support is also available for the :ref:`extra_body_script() <plugin_hook_extra_body_script>` plugin hook. (:issue:`1186`, :issue:`1187`)
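
For plugins written in Python, module scripts can be requested through the existing ``extra_js_urls()`` hook by returning a dictionary with ``"module": True`` in place of a plain URL string - a minimal sketch:

.. code-block:: python

    from datasette import hookimpl


    @hookimpl
    def extra_js_urls():
        # Each entry can be a plain URL string or a dictionary -
        # "module": True causes Datasette to emit
        # <script type="module" src="...">.
        return [
            {
                "url": "https://example.com/plugin-code.js",
                "module": True,
            }
        ]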

`datasette-leaflet-freedraw <https://datasette.io/plugins/datasette-leaflet-freedraw>`__ is the first example of a Datasette plugin that takes advantage of the new support for JavaScript modules. See `Drawing shapes on a map to query a SpatiaLite database <https://simonwillison.net/2021/Jan/24/drawing-shapes-spatialite/>`__ for more on this plugin.

@@ -1134,7 +658,7 @@ Smaller changes
~~~~~~~~~~~~~~~

- Wide tables shown within Datasette now scroll horizontally (:issue:`998`). This is achieved using a new ``<div class="table-wrapper">`` element which may impact the implementation of some plugins (for example `this change to datasette-cluster-map <https://github.com/simonw/datasette-cluster-map/commit/fcb4abbe7df9071c5ab57defd39147de7145b34e>`__).
-- New :ref:`actions_debug_menu` permission. (:issue:`1068`)
+- New :ref:`permissions_debug_menu` permission. (:issue:`1068`)
- Removed ``--debug`` option, which didn't do anything. (:issue:`814`)
- ``Link:`` HTTP header pagination. (:issue:`1014`)
- ``x`` button for clearing filters. (:issue:`1016`)

@@ -1358,10 +882,7 @@ Prior to this release the Datasette ecosystem has treated authentication as excl

You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new ``--root`` command-line option, which outputs a one-time use URL to :ref:`authenticate as a root actor <authentication_root>` (:issue:`784`)::

-    datasette fixtures.db --root
-
-::
-
+    $ datasette fixtures.db --root
     http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477
     INFO: Started server process [14973]
     INFO: Waiting for application startup.

@@ -1469,7 +990,7 @@ Smaller changes

- New :ref:`datasette.get_database() <datasette_get_database>` method.
- Added ``_`` prefix to many private, undocumented methods of the Datasette class. (:issue:`576`)
- Removed the ``db.get_outbound_foreign_keys()`` method, which duplicated the behaviour of ``db.foreign_keys_for_table()``.
-- New ``await datasette.permission_allowed()`` method.
+- New :ref:`await datasette.permission_allowed() <datasette_permission_allowed>` method.
- ``/-/actor`` debugging endpoint for viewing the currently authenticated actor.
- New ``request.cookies`` property.
- ``/-/plugins`` endpoint now shows a list of hooks implemented by each plugin, e.g. https://latest.datasette.io/-/plugins?all=1

@@ -1532,7 +1053,7 @@ You can now create :ref:`custom pages <custom_pages>` within your Datasette inst

:ref:`config_dir` (:issue:`731`) allows you to define a custom Datasette instance as a directory. So instead of running the following::

-    datasette one.db two.db \
+    $ datasette one.db two.db \
        --metadata=metadata.json \
        --template-dir=templates/ \
        --plugins-dir=plugins \

@@ -1540,7 +1061,7 @@ You can now create :ref:`custom pages <custom_pages>` within your Datasette inst

You can instead arrange your files in a single directory called ``my-project`` and run this::

-    datasette my-project/
+    $ datasette my-project/

Also in this release:

@@ -1557,7 +1078,7 @@ Also in this release:

0.40 (2020-04-21)
-----------------

-* Datasette :ref:`metadata` can now be provided as a YAML file as an optional alternative to JSON. (:issue:`713`)
+* Datasette :ref:`metadata` can now be provided as a YAML file as an optional alternative to JSON. See :ref:`metadata_yaml`. (:issue:`713`)
* Removed support for ``datasette publish now``, which used the now-retired Zeit Now v1 hosting platform. A new plugin, `datasette-publish-now <https://github.com/simonw/datasette-publish-now>`__, can be installed to publish data to Zeit (`now Vercel <https://vercel.com/blog/zeit-is-now-vercel>`__) Now v2. (:issue:`710`)
* Fixed a bug where the ``extra_template_vars(request, view_name)`` plugin hook was not receiving the correct ``view_name``. (:issue:`716`)
* Variables added to the template context by the ``extra_template_vars()`` plugin hook are now shown in the ``?_context=1`` debugging mode (see :ref:`setting_template_debug`). (:issue:`693`)

@@ -2212,10 +1733,7 @@ In addition to the work on facets:

Added new help section::

-    datasette --help-config
-
-::
-
+    $ datasette --help-config
     Config options:
       default_page_size            Default page size for the table view
                                    (default=100)

@@ -47,14 +47,13 @@ Running ``datasette --help`` shows a list of all of the available commands.

     --help  Show this message and exit.

   Commands:
-    serve*        Serve up specified SQLite database files with a web UI
-    create-token  Create a signed API token for the specified actor ID
-    inspect       Generate JSON summary of provided database files
-    install       Install plugins and packages from PyPI into the same...
-    package       Package SQLite files into a Datasette Docker container
-    plugins       List currently installed plugins
-    publish       Publish specified SQLite database files to the internet...
-    uninstall     Uninstall plugins and Python packages from the Datasette...
+    serve*     Serve up specified SQLite database files with a web UI
+    inspect    Generate JSON summary of provided database files
+    install    Install plugins and packages from PyPI into the same...
+    package    Package SQLite files into a Datasette Docker container
+    plugins    List currently installed plugins
+    publish    Publish specified SQLite database files to the internet along...
+    uninstall  Uninstall plugins and Python packages from the Datasette...

.. [[[end]]]

@@ -112,18 +111,16 @@ Once started you can access it at ``http://localhost:8001``

     --static MOUNT:DIRECTORY  Serve static files from this directory at
                               /MOUNT/...
     --memory                  Make /_memory database available
-    -c, --config FILENAME     Path to JSON/YAML Datasette configuration file
-    -s, --setting SETTING...  nested.key, value setting to use in Datasette
-                              configuration
+    --config CONFIG           Deprecated: set config option using
+                              configname:value. Use --setting instead.
+    --setting SETTING...      Setting, see
+                              docs.datasette.io/en/stable/settings.html
     --secret TEXT             Secret used for signing secure values, such as
                               signed cookies
     --root                    Output URL that sets a cookie authenticating
                               the root user
     --get TEXT                Run an HTTP GET request against this path,
                               print results and exit
     --headers                 Include HTTP headers in --get output
-    --token TEXT              API token to send with --get requests
-    --actor TEXT              Actor to use for --get requests (JSON string)
     --version-note TEXT       Additional note to show on /-/versions
     --help-settings           Show available settings
     --pdb                     Launch debugger on any errors
|
@ -135,24 +132,11 @@ Once started you can access it at ``http://localhost:8001``
|
|||
mode
|
||||
--ssl-keyfile TEXT SSL key file
|
||||
--ssl-certfile TEXT SSL certificate file
|
||||
--internal PATH Path to a persistent Datasette internal SQLite
|
||||
database
|
||||
--help Show this message and exit.
|
||||
|
||||
|
||||
.. [[[end]]]
|
||||
|
||||
.. _cli_datasette_serve_env:
|
||||
|
||||
Environment variables
|
||||
---------------------
|
||||
|
||||
Some of the ``datasette serve`` options can be provided by environment variables:
|
||||
|
||||
- ``DATASETTE_SECRET``: Equivalent to the ``--secret`` option.
|
||||
- ``DATASETTE_SSL_KEYFILE``: Equivalent to the ``--ssl-keyfile`` option.
|
||||
- ``DATASETTE_SSL_CERTFILE``: Equivalent to the ``--ssl-certfile`` option.
|
||||
- ``DATASETTE_LOAD_EXTENSION``: Equivalent to the ``--load-extension`` option.
|
||||
|
||||
.. _cli_datasette_get:
|
||||
|
||||
|
|
@@ -163,14 +147,9 @@ The ``--get`` option to ``datasette serve`` (or just ``datasette``) specifies th

This means that all of Datasette's functionality can be accessed directly from the command-line.

For example:

.. code-block:: bash

    datasette --get '/-/versions.json' | jq .

.. code-block:: json

For example::

    $ datasette --get '/-/versions.json' | jq .
    {
        "python": {
            "version": "3.8.5",
@@ -209,15 +188,7 @@ For example:

        }
    }

You can use the ``--token TOKEN`` option to send an :ref:`API token <CreateTokenView>` with the simulated request.

Or you can make a request as a specific actor by passing a JSON representation of that actor to ``--actor``:

.. code-block:: bash

    datasette --memory --actor '{"id": "root"}' --get '/-/actor.json'

The exit code of ``datasette --get`` will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error.
The exit code will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error.

This lets you use ``datasette --get /`` to run tests against a Datasette application in a continuous integration environment such as GitHub Actions.
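A sketch of that continuous integration pattern (assuming a ``fixtures.db`` file
is present in the repository)::

    datasette fixtures.db --get '/fixtures.json' > /dev/null \
        && echo "Datasette returned HTTP 200"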
@@ -241,8 +212,6 @@ These can be passed to ``datasette serve`` using ``datasette serve --setting nam

                               (default=100)
      max_returned_rows        Maximum rows that can be returned from a table or
                               custom query (default=1000)
      max_insert_rows          Maximum rows that can be inserted at a time using
                               the bulk insert API (default=100)
      num_sql_threads          Number of threads in the thread pool for
                               executing SQLite queries (default=3)
      sql_time_limit_ms        Time limit for a SQL query in milliseconds
@@ -255,14 +224,10 @@ These can be passed to ``datasette serve`` using ``datasette serve --setting nam

                               (default=50)
      allow_facet              Allow users to specify columns to facet using
                               ?_facet= parameter (default=True)
      allow_download           Allow users to download the original SQLite
                               database files (default=True)
      allow_signed_tokens      Allow users to create and use signed API tokens
                               (default=True)
      default_allow_sql        Allow anyone to run arbitrary SQL queries
                               (default=True)
      max_signed_tokens_ttl    Maximum allowed expiry time for signed API tokens
                               (default=0)
      allow_download           Allow users to download the original SQLite
                               database files (default=True)
      suggest_facets           Calculate and display suggested facets
                               (default=True)
      default_cache_ttl        Default HTTP cache TTL (used in Cache-Control:
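For example, a sketch combining several of these settings on one command line
(the database name is illustrative)::

    datasette mydatabase.db \
        --setting sql_time_limit_ms 3500 \
        --setting max_returned_rows 2000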
@@ -307,7 +272,6 @@ Output JSON showing all currently installed plugins, their versions, whether the

    Options:
      --all                    Include built-in default plugins
      --requirements           Output requirements.txt of installed plugins
      --plugins-dir DIRECTORY  Path to directory containing custom plugins
      --help                   Show this message and exit.
@@ -371,15 +335,13 @@ Would install the `datasette-cluster-map <https://datasette.io/plugins/datasette

::

    Usage: datasette install [OPTIONS] [PACKAGES]...
    Usage: datasette install [OPTIONS] PACKAGES...

      Install plugins and packages from PyPI into the same environment as Datasette

    Options:
      -U, --upgrade            Upgrade packages to latest version
      -r, --requirement PATH   Install from requirements file
      -e, --editable TEXT      Install a project in editable mode from this path
      --help                   Show this message and exit.
      -U, --upgrade  Upgrade packages to latest version
      --help         Show this message and exit.

.. [[[end]]]
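For example, a sketch of installing and later upgrading a plugin::

    datasette install datasette-cluster-map
    datasette install -U datasette-cluster-map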
@@ -638,61 +600,3 @@ This performance optimization is used automatically by some of the ``datasette p

.. [[[end]]]

.. _cli_help_create_token___help:

datasette create-token
======================

Create a signed API token, see :ref:`authentication_cli_create_token`.

.. [[[cog
    help(["create-token", "--help"])
.. ]]]

::

    Usage: datasette create-token [OPTIONS] ID

      Create a signed API token for the specified actor ID

      Example:

          datasette create-token root --secret mysecret

      To allow only "view-database-download" for all databases:

          datasette create-token root --secret mysecret \
              --all view-database-download

      To allow "create-table" against a specific database:

          datasette create-token root --secret mysecret \
              --database mydb create-table

      To allow "insert-row" against a specific table:

          datasette create-token root --secret mysecret \
              --resource mydb mytable insert-row

      Restricted actions can be specified multiple times using multiple --all,
      --database, and --resource options.

      Add --debug to see a decoded version of the token.

    Options:
      --secret TEXT                Secret used for signing the API tokens
                                   [required]
      -e, --expires-after INTEGER  Token should expire after this many seconds
      -a, --all ACTION             Restrict token to this action
      -d, --database DB ACTION     Restrict token to this action on this database
      -r, --resource DB RESOURCE ACTION
                                   Restrict token to this action on this database
                                   resource (a table, SQL view or named query)
      --debug                      Show decoded token
      --plugins-dir DIRECTORY      Path to directory containing custom plugins
      --help                       Show this message and exit.

.. [[[end]]]
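A sketch of pairing ``create-token`` with the ``--token`` option to ``--get``
(the secret and database file are illustrative)::

    export DATASETTE_SECRET=mysecret
    TOKEN="$(datasette create-token root --secret "$DATASETTE_SECRET")"
    datasette fixtures.db --secret "$DATASETTE_SECRET" \
        --token "$TOKEN" --get '/-/actor.json'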
26 docs/conf.py

@@ -17,8 +17,7 @@

# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os

# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -32,22 +31,7 @@ import os

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    "sphinx.ext.extlinks",
    "sphinx.ext.autodoc",
    "sphinx_copybutton",
    "myst_parser",
    "sphinx_markdown_builder",
]
if not os.environ.get("DISABLE_SPHINX_INLINE_TABS"):
    extensions += ["sphinx_inline_tabs"]

autodoc_member_order = "bysource"

myst_enable_extensions = ["colon_fence"]

markdown_http_base = "https://docs.datasette.io/en/stable"
markdown_uri_doc_suffix = ".html"
extensions = ["sphinx.ext.extlinks", "sphinx.ext.autodoc", "sphinx_copybutton"]

extlinks = {
    "issue": ("https://github.com/simonw/datasette/issues/%s", "#%s"),
@@ -60,10 +44,7 @@ templates_path = ["_templates"]

# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = {
    ".rst": "restructuredtext",
    ".md": "markdown",
}
source_suffix = ".rst"

# The master toctree document.
master_doc = "index"
@@ -124,7 +105,6 @@ html_theme_options = {

html_static_path = ["_static"]

html_logo = "datasette-logo.svg"
html_favicon = "_static/datasette-favicon.png"

html_css_files = [
    "css/custom.css",
docs/contributing.rst

@@ -13,14 +13,13 @@ General guidelines

* **main should always be releasable**. Incomplete features should live in branches. This ensures that any small bug fixes can be quickly released.
* **The ideal commit** should bundle together the implementation, unit tests and associated documentation updates. The commit message should link to an associated issue.
* **New plugin hooks** should only be shipped if accompanied by a separate release of a non-demo plugin that uses them.
* **New user-facing views and documentation** should be added or updated alongside their implementation. The ``/docs`` folder includes pages for plugin hooks and built-in views - please ensure any new hooks or views are reflected there so the documentation tests continue to pass.
.. _devenvironment:

Setting up a development environment
------------------------------------

If you have Python 3.10 or higher installed on your computer (on OS X the quickest way to do this `is using homebrew <https://docs.python-guide.org/starting/install3/osx/>`__) you can install an editable copy of Datasette using the following steps.
If you have Python 3.7 or higher installed on your computer (on OS X the quickest way to do this `is using homebrew <https://docs.python-guide.org/starting/install3/osx/>`__) you can install an editable copy of Datasette using the following steps.

If you want to use GitHub to publish your changes, first `create a fork of datasette <https://github.com/simonw/datasette/fork>`__ under your own GitHub account.

@@ -42,7 +41,7 @@ The next step is to create a virtual environment for your project and use it to

    # Install Datasette and its testing dependencies
    python3 -m pip install -e '.[test]'

That last line does most of the work: ``pip install -e`` means "install this package in a way that allows me to edit the source code in place". The ``.[test]`` option means "install the optional testing dependencies as well".
That last line does most of the work: ``pip install -e`` means "install this package in a way that allows me to edit the source code in place". The ``.[test]`` option means "use the setup.py in this directory and install the optional testing dependencies as well".

.. _contributing_running_tests:
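A sketch of the full sequence, assuming ``git`` and Python 3 are available::

    git clone https://github.com/simonw/datasette
    cd datasette
    python3 -m venv venv
    source venv/bin/activate
    python3 -m pip install -e '.[test]'
    pytest  # run the test suite to confirm the checkout works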
@@ -112,14 +111,10 @@ Debugging

Any errors that occur while Datasette is running will display a stack trace on the console.

You can tell Datasette to open an interactive ``pdb`` (or ``ipdb``, if present) debugger session if an error occurs using the ``--pdb`` option::
You can tell Datasette to open an interactive ``pdb`` debugger session if an error occurs using the ``--pdb`` option::

    datasette --pdb fixtures.db

For `ipdb <https://pypi.org/project/ipdb/>`__, first run this::

    datasette install ipdb

.. _contributing_formatting:

Code formatting
@@ -131,15 +126,6 @@ These formatters are enforced by Datasette's continuous integration: if a commit

When developing locally, you can verify and correct the formatting of your code using these tools.

If you are using `Just <https://github.com/casey/just>`__ the quickest way to run these is like so::

    just black
    just prettier

Or run both at the same time::

    just format

.. _contributing_formatting_black:

Running Black
@@ -147,20 +133,14 @@ Running Black

Black will be installed when you run ``pip install -e '.[test]'``. To test that your code complies with Black, run the following in your root ``datasette`` repository checkout::

    black . --check

::

    $ black . --check
    All done! ✨ 🍰 ✨
    95 files would be left unchanged.

If any of your code does not conform to Black you can run this to automatically fix those problems::

    black .

::

    reformatted ../datasette/app.py
    $ black .
    reformatted ../datasette/setup.py
    All done! ✨ 🍰 ✨
    1 file reformatted, 94 files left unchanged.
@@ -180,14 +160,11 @@ Prettier

To install Prettier, `install Node.js <https://nodejs.org/en/download/package-manager/>`__ and then run the following in the root of your ``datasette`` repository checkout::

    npm install
    $ npm install

This will install Prettier in a ``node_modules`` directory. You can then check that your code matches the coding style like so::

    npm run prettier -- --check

::

    $ npm run prettier -- --check
    > prettier
    > prettier 'datasette/static/*[!.min].js' "--check"

@@ -197,7 +174,7 @@ This will install Prettier in a ``node_modules`` directory. You can then check t

You can fix any problems by running::

    npm run fix
    $ npm run fix

.. _contributing_documentation:
@@ -268,7 +245,6 @@ Datasette releases are performed using tags. When a new release is published on

* Re-point the "latest" tag on Docker Hub to the new image
* Build a wheel bundle of the underlying Python source code
* Push that new wheel up to PyPI: https://pypi.org/project/datasette/
* If the release is an alpha, navigate to https://readthedocs.org/projects/datasette/versions/ and search for the tag name in the "Activate a version" filter, then mark that version as "active" to ensure it will appear on the public ReadTheDocs documentation site.

To deploy new releases you will need to have push access to the main Datasette GitHub repository.

@@ -298,10 +274,6 @@ You can generate the list of issue references for a specific release by copying

To create the tag for the release, create `a new release <https://github.com/simonw/datasette/releases/new>`__ on GitHub matching the new version number. You can convert the release notes to Markdown by copying and pasting the rendered HTML into this `Paste to Markdown tool <https://euangoddard.github.io/clipboard2markdown/>`__.

Don't forget to create the release from the correct branch - usually ``main``, but sometimes ``0.64.x`` or similar for a bugfix release.

While the release is running you can confirm that the correct commits made it into the release using the https://github.com/simonw/datasette/compare/0.64.6...0.64.7 URL.

Finally, post a news item about the release on `datasette.io <https://datasette.io/>`__ by editing the `news.yaml <https://github.com/simonw/datasette.io/blob/main/news.yaml>`__ file in that site's repository.

.. _contributing_alpha_beta:
@@ -350,17 +322,20 @@ Upgrading CodeMirror

Datasette bundles `CodeMirror <https://codemirror.net/>`__ for the SQL editing interface, e.g. on `this page <https://latest.datasette.io/fixtures>`__. Here are the steps for upgrading to a new version of CodeMirror:

* Install the packages with::
* Download and extract latest CodeMirror zip file from https://codemirror.net/codemirror.zip
* Rename ``lib/codemirror.js`` to ``codemirror-5.57.0.js`` (using latest version number)
* Rename ``lib/codemirror.css`` to ``codemirror-5.57.0.css``
* Rename ``mode/sql/sql.js`` to ``codemirror-5.57.0-sql.js``
* Edit both JavaScript files to make the top license comment a ``/* */`` block instead of multiple ``//`` lines
* Minify the JavaScript files like this::

    npm i codemirror @codemirror/lang-sql
    npx uglify-js codemirror-5.57.0.js -o codemirror-5.57.0.min.js --comments '/LICENSE/'
    npx uglify-js codemirror-5.57.0-sql.js -o codemirror-5.57.0-sql.min.js --comments '/LICENSE/'

* Build the bundle using the version number from package.json with::
* Check that the LICENSE comment did indeed survive minification
* Minify the CSS file like this::

    node_modules/.bin/rollup datasette/static/cm-editor-6.0.1.js \
        -f iife \
        -n cm \
        -o datasette/static/cm-editor-6.0.1.bundle.js \
        -p @rollup/plugin-node-resolve \
        -p @rollup/plugin-terser
    npx clean-css-cli codemirror-5.57.0.css -o codemirror-5.57.0.min.css

* Update the version reference in the ``codemirror.html`` template.
* Edit the ``_codemirror.html`` template to reference the new files
* ``git rm`` the old files, ``git add`` the new files
docs/custom_templates.rst

@@ -5,6 +5,87 @@ Custom pages and templates

Datasette provides a number of ways of customizing the way data is displayed.

.. _customization_css_and_javascript:

Custom CSS and JavaScript
-------------------------

When you launch Datasette, you can specify a custom metadata file like this::

    datasette mydb.db --metadata metadata.json

Your ``metadata.json`` file can include links that look like this:

.. code-block:: json

    {
        "extra_css_urls": [
            "https://simonwillison.net/static/css/all.bf8cd891642c.css"
        ],
        "extra_js_urls": [
            "https://code.jquery.com/jquery-3.2.1.slim.min.js"
        ]
    }

The extra CSS and JavaScript files will be linked in the ``<head>`` of every page:

.. code-block:: html

    <link rel="stylesheet" href="https://simonwillison.net/static/css/all.bf8cd891642c.css">
    <script src="https://code.jquery.com/jquery-3.2.1.slim.min.js"></script>

You can also specify a SRI (subresource integrity hash) for these assets:

.. code-block:: json

    {
        "extra_css_urls": [
            {
                "url": "https://simonwillison.net/static/css/all.bf8cd891642c.css",
                "sri": "sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"
            }
        ],
        "extra_js_urls": [
            {
                "url": "https://code.jquery.com/jquery-3.2.1.slim.min.js",
                "sri": "sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="
            }
        ]
    }

This will produce:

.. code-block:: html

    <link rel="stylesheet" href="https://simonwillison.net/static/css/all.bf8cd891642c.css"
        integrity="sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"
        crossorigin="anonymous">
    <script src="https://code.jquery.com/jquery-3.2.1.slim.min.js"
        integrity="sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="
        crossorigin="anonymous"></script>

Modern browsers will only execute the stylesheet or JavaScript if the SRI hash
matches the content served. You can generate hashes using `www.srihash.org <https://www.srihash.org/>`_

Items in ``"extra_js_urls"`` can specify ``"module": true`` if they reference JavaScript that uses `JavaScript modules <https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules>`__. This configuration:

.. code-block:: json

    {
        "extra_js_urls": [
            {
                "url": "https://example.datasette.io/module.js",
                "module": true
            }
        ]
    }

Will produce this HTML:

.. code-block:: html

    <script type="module" src="https://example.datasette.io/module.js"></script>

CSS classes on the <body>
~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -83,15 +164,6 @@ database column they are representing, for example:

        </tbody>
    </table>

.. _customization_css:

Writing custom CSS
~~~~~~~~~~~~~~~~~~

Custom templates need to take Datasette's default CSS into account. The pattern portfolio at ``/-/patterns`` (`example here <https://latest.datasette.io/-/patterns>`__) is a useful reference for understanding the available CSS classes.

The ``core`` class is particularly useful - you can apply this directly to an ``<input>`` or ``<button>`` element to get Datasette's default form styles, or you can apply it to a containing element (such as ``<form>``) to apply those styles to all of the form elements within it.

.. _customization_static_files:

Serving static files
@@ -107,49 +179,25 @@ Consider the following directory structure::

You can start Datasette using ``--static assets:static-files/`` to serve those
files from the ``/assets/`` mount point::

    datasette --config datasette.yaml --static assets:static-files/ --memory
    $ datasette -m metadata.json --static assets:static-files/ --memory

The following URLs will now serve the content from those CSS and JS files::

    http://localhost:8001/assets/styles.css
    http://localhost:8001/assets/app.js

You can reference those files from ``datasette.yaml`` like this, see :ref:`custom CSS and JavaScript <configuration_reference_css_js>` for more details:
You can reference those files from ``metadata.json`` like so:

.. [[[cog
    from metadata_doc import config_example
    config_example(cog, """
    extra_css_urls:
    - /assets/styles.css
    extra_js_urls:
    - /assets/app.js
    """)
.. ]]]

.. tab:: datasette.yaml

    .. code-block:: yaml

        extra_css_urls:
        - /assets/styles.css
        extra_js_urls:
        - /assets/app.js

.. tab:: datasette.json

    .. code-block:: json

        {
            "extra_css_urls": [
                "/assets/styles.css"
            ],
            "extra_js_urls": [
                "/assets/app.js"
            ]
        }

.. [[[end]]]

.. code-block:: json

    {
        "extra_css_urls": [
            "/assets/styles.css"
        ],
        "extra_js_urls": [
            "/assets/app.js"
        ]
    }
Publishing static assets
~~~~~~~~~~~~~~~~~~~~~~~~

@@ -157,7 +205,7 @@ Publishing static assets

The :ref:`cli_publish` command can be used to publish your static assets,
using the same syntax as above::

    datasette publish cloudrun mydb.db --static assets:static-files/
    $ datasette publish cloudrun mydb.db --static assets:static-files/

This will upload the contents of the ``static-files/`` directory as part of the
deployment, and configure Datasette to correctly serve the assets from ``/assets/``.
@@ -290,7 +338,7 @@ You can add templated pages to your Datasette instance by creating HTML files in

For example, to add a custom page that is served at ``http://localhost/about`` you would create a file in ``templates/pages/about.html``, then start Datasette like this::

    datasette mydb.db --template-dir=templates/
    $ datasette mydb.db --template-dir=templates/

You can nest directories within pages to create a nested structure. To create a ``http://localhost:8001/about/map`` page you would create ``templates/pages/about/map.html``.
@@ -345,7 +393,7 @@ To serve a custom HTTP header, add a ``custom_header(name, value)`` function cal

You can verify this is working using ``curl`` like this::

    curl -I 'http://127.0.0.1:8001/teapot'
    $ curl -I 'http://127.0.0.1:8001/teapot'
    HTTP/1.1 418
    date: Sun, 26 Apr 2020 18:38:30 GMT
    server: uvicorn
docs/deploying.rst

@@ -56,7 +56,7 @@ Create a file at ``/etc/systemd/system/datasette.service`` with the following co

Add a random value for the ``DATASETTE_SECRET`` - this will be used to sign Datasette cookies such as the CSRF token cookie. You can generate a suitable value like so::

    python3 -c 'import secrets; print(secrets.token_hex(32))'
    $ python3 -c 'import secrets; print(secrets.token_hex(32))'

This configuration will run Datasette against all database files contained in the ``/home/ubuntu/datasette-root`` directory. If that directory contains a ``metadata.yml`` (or ``.json``) file or a ``templates/`` or ``plugins/`` sub-directory those will automatically be loaded by Datasette - see :ref:`config_dir` for details.
@@ -79,7 +79,7 @@ Datasette will not be accessible from outside the server because it is listening

.. _deploying_openrc:

Running Datasette using OpenRC
==============================
===============================

OpenRC is the service manager on non-systemd Linux distributions like `Alpine Linux <https://www.alpinelinux.org/>`__ and `Gentoo <https://www.gentoo.org/>`__.

Create an init script at ``/etc/init.d/datasette`` with the following contents:
131 docs/facets.rst

@@ -98,16 +98,16 @@ You can increase this on an individual page by adding ``?_facet_size=100`` to th

.. _facets_metadata:

Facets in metadata
------------------
Facets in metadata.json
-----------------------

You can turn facets on by default for specific tables by adding them to a ``"facets"`` key in a Datasette :ref:`metadata` file.

Here's an example that turns on faceting by default for the ``qLegalStatus`` column in the ``Street_Tree_List`` table in the ``sf-trees`` database:

.. [[[cog
    from metadata_doc import metadata_example
    metadata_example(cog, {

.. code-block:: json

    {
        "databases": {
            "sf-trees": {
                "tables": {
@@ -117,82 +117,26 @@ Here's an example that turns on faceting by default for the ``qLegalStatus`` col

                }
            }
        }
    })
.. ]]]

.. tab:: metadata.yaml

    .. code-block:: yaml

        databases:
          sf-trees:
            tables:
              Street_Tree_List:
                facets:
                - qLegalStatus

.. tab:: metadata.json

    .. code-block:: json

        {
            "databases": {
                "sf-trees": {
                    "tables": {
                        "Street_Tree_List": {
                            "facets": [
                                "qLegalStatus"
                            ]
                        }
                    }
                }
            }
        }
.. [[[end]]]
    }

Facets defined in this way will always be shown in the interface and returned in the API, regardless of the ``_facet`` arguments passed to the view.

You can specify :ref:`array <facet_by_json_array>` or :ref:`date <facet_by_date>` facets in metadata using JSON objects with a single key of ``array`` or ``date`` and a value specifying the column, like this:
.. [[[cog
    metadata_example(cog, {
        "facets": [
            {"array": "tags"},
            {"date": "created"}
        ]
    })
.. ]]]

.. tab:: metadata.yaml

    .. code-block:: yaml

        facets:
        - array: tags
        - date: created

.. tab:: metadata.json

    .. code-block:: json

        {
            "facets": [
                {
                    "array": "tags"
                },
                {
                    "date": "created"
                }
            ]
        }
.. [[[end]]]

.. code-block:: json

    {
        "facets": [
            {"array": "tags"},
            {"date": "created"}
        ]
    }

You can change the default facet size (the number of results shown for each facet) for a table using ``facet_size``:

.. [[[cog
    metadata_example(cog, {

.. code-block:: json

    {
        "databases": {
            "sf-trees": {
                "tables": {
@@ -203,41 +147,7 @@ You can change the default facet size (the number of results shown for each face

                }
            }
        }
    })
.. ]]]

.. tab:: metadata.yaml

    .. code-block:: yaml

        databases:
          sf-trees:
            tables:
              Street_Tree_List:
                facets:
                - qLegalStatus
                facet_size: 10

.. tab:: metadata.json

    .. code-block:: json

        {
            "databases": {
                "sf-trees": {
                    "tables": {
                        "Street_Tree_List": {
                            "facets": [
                                "qLegalStatus"
                            ],
                            "facet_size": 10
                        }
                    }
                }
            }
        }
.. [[[end]]]
    }

Suggested facets
----------------
@@ -260,17 +170,14 @@ Speeding up facets with indexes

The performance of facets can be greatly improved by adding indexes on the columns you wish to facet by.
Adding indexes can be performed using the ``sqlite3`` command-line utility. Here's how to add an index on the ``state`` column in a table called ``Food_Trucks``::

    sqlite3 mydatabase.db

::

    $ sqlite3 mydatabase.db
    SQLite version 3.19.3 2017-06-27 16:48:08
    Enter ".help" for usage hints.
    sqlite> CREATE INDEX Food_Trucks_state ON Food_Trucks("state");

Or using the `sqlite-utils <https://sqlite-utils.datasette.io/en/stable/cli.html#creating-indexes>`__ command-line utility::

    sqlite-utils create-index mydatabase.db Food_Trucks state
    $ sqlite-utils create-index mydatabase.db Food_Trucks state

.. _facet_by_json_array:
docs/full_text_search.rst

@@ -64,9 +64,9 @@ The ``"searchmode": "raw"`` property can be used to default the table to accepti

Here is an example which enables full-text search (with SQLite advanced search operators) for a ``display_ads`` view which is defined against the ``ads`` table and hence needs to run FTS against the ``ads_fts`` table, using the ``id`` as the primary key:

.. [[[cog
    from metadata_doc import metadata_example
    metadata_example(cog, {

.. code-block:: json

    {
        "databases": {
            "russian-ads": {
                "tables": {
@@ -78,40 +78,7 @@ Here is an example which enables full-text search (with SQLite advanced search o

                }
            }
        }
    })
.. ]]]

.. tab:: metadata.yaml

    .. code-block:: yaml

        databases:
          russian-ads:
            tables:
              display_ads:
                fts_table: ads_fts
                fts_pk: id
                searchmode: raw

.. tab:: metadata.json

    .. code-block:: json

        {
            "databases": {
                "russian-ads": {
                    "tables": {
                        "display_ads": {
                            "fts_table": "ads_fts",
                            "fts_pk": "id",
                            "searchmode": "raw"
                        }
                    }
                }
            }
        }
.. [[[end]]]
    }

.. _full_text_search_custom_sql:
@@ -177,14 +144,14 @@ Configuring FTS using sqlite-utils

Here's how to use ``sqlite-utils`` to enable full-text search for an ``items`` table across the ``name`` and ``description`` columns::

    sqlite-utils enable-fts mydatabase.db items name description
    $ sqlite-utils enable-fts mydatabase.db items name description

Configuring FTS using csvs-to-sqlite
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If your data starts out in CSV files, you can use Datasette's companion tool `csvs-to-sqlite <https://github.com/simonw/csvs-to-sqlite>`__ to convert that file into a SQLite database and enable full-text search on specific columns. For a file called ``items.csv`` where you want full-text search to operate against the ``name`` and ``description`` columns you would run the following::

    csvs-to-sqlite items.csv items.db -f name -f description
    $ csvs-to-sqlite items.csv items.db -f name -f description

Configuring FTS by hand
~~~~~~~~~~~~~~~~~~~~~~~
docs/getting_started.rst

@@ -8,7 +8,7 @@ Play with a live demo

The best way to experience Datasette for the first time is with a demo:

* `datasette.io/global-power-plants <https://datasette.io/global-power-plants/global-power-plants>`__ provides a searchable database of power plants around the world, using data from the `World Resources Institute <https://www.wri.org/publication/global-power-plant-database>`__ rendered using the `datasette-cluster-map <https://github.com/simonw/datasette-cluster-map>`__ plugin.
* `global-power-plants.datasettes.com <https://global-power-plants.datasettes.com/global-power-plants/global-power-plants>`__ provides a searchable database of power plants around the world, using data from the `World Resources Institute <https://www.wri.org/publication/global-power-plant-database>`__ rendered using the `datasette-cluster-map <https://github.com/simonw/datasette-cluster-map>`__ plugin.
* `fivethirtyeight.datasettes.com <https://fivethirtyeight.datasettes.com/fivethirtyeight>`__ shows Datasette running against over 400 datasets imported from the `FiveThirtyEight GitHub repository <https://github.com/fivethirtyeight/data>`__.

.. _getting_started_tutorial:
@@ -33,18 +33,29 @@ You can pass a URL to a CSV, SQLite or raw SQL file directly to Datasette Lite t

This `example link <https://lite.datasette.io/?url=https%3A%2F%2Fraw.githubusercontent.com%2FNUKnightLab%2Fsql-mysteries%2Fmaster%2Fsql-murder-mystery.db#/sql-murder-mystery>`__ opens Datasette Lite and loads the SQL Murder Mystery example database from `Northwestern University Knight Lab <https://github.com/NUKnightLab/sql-mysteries>`__.

.. _getting_started_codespaces:
.. _getting_started_glitch:

Try Datasette without installing anything with Codespaces
---------------------------------------------------------
Try Datasette without installing anything using Glitch
------------------------------------------------------

`GitHub Codespaces <https://github.com/features/codespaces/>`__ offers a free browser-based development environment that lets you run a development server without installing any local software.
`Glitch <https://glitch.com/>`__ is a free online tool for building web apps directly from your web browser. You can use Glitch to try out Datasette without needing to install any software on your own computer.

Here's a demo project on GitHub which you can use as the basis for your own experiments:
Here's a demo project on Glitch which you can use as the basis for your own experiments:

`github.com/datasette/datasette-studio <https://github.com/datasette/datasette-studio>`__
`glitch.com/~datasette-csvs <https://glitch.com/~datasette-csvs>`__

The README file in that repository has instructions on how to get started.
Glitch allows you to "remix" any project to create your own copy and start editing it in your browser. You can remix the ``datasette-csvs`` project by clicking this button:

.. image:: https://cdn.glitch.com/2703baf2-b643-4da7-ab91-7ee2a2d00b5b%2Fremix-button.svg
    :target: https://glitch.com/edit/#!/remix/datasette-csvs

Find a CSV file and drag it onto the Glitch file explorer panel - ``datasette-csvs`` will automatically convert it to a SQLite database (using `sqlite-utils <https://github.com/simonw/sqlite-utils>`__) and allow you to start exploring it using Datasette.

If your CSV file has a ``latitude`` and ``longitude`` column you can visualize it on a map by uncommenting the ``datasette-cluster-map`` line in the ``requirements.txt`` file using the Glitch file editor.

Need some data? Try this `Public Art Data <https://data.seattle.gov/Community/Public-Art-Data/j7sn-tdzk>`__ for the city of Seattle - hit "Export" and select "CSV" to download it as a CSV file.

For more on how this works, see `Running Datasette on Glitch <https://simonwillison.net/2019/Apr/23/datasette-glitch/>`__.

.. _getting_started_your_computer:
docs/index.rst

@@ -17,7 +17,7 @@ datasette| |discord|

.. |docker: datasette| image:: https://img.shields.io/badge/docker-datasette-blue
    :target: https://hub.docker.com/r/datasetteproject/datasette
.. |discord| image:: https://img.shields.io/discord/823971286308356157?label=discord
    :target: https://datasette.io/discord
    :target: https://discord.gg/ktd74dm5mw

*An open source multi-tool for exploring and publishing data*
@@ -25,11 +25,11 @@ Datasette is a tool for exploring and publishing data. It helps people take data

Datasette is aimed at data journalists, museum curators, archivists, local governments and anyone else who has data that they wish to share with the world. It is part of a :ref:`wider ecosystem of tools and plugins <ecosystem>` dedicated to making working with structured data as productive as possible.

`Explore a demo <https://fivethirtyeight.datasettes.com/fivethirtyeight>`__, watch `a presentation about the project <https://static.simonwillison.net/static/2018/pybay-datasette/>`__.
`Explore a demo <https://fivethirtyeight.datasettes.com/fivethirtyeight>`__, watch `a presentation about the project <https://static.simonwillison.net/static/2018/pybay-datasette/>`__ or :ref:`getting_started_glitch`.

Interested in learning Datasette? Start with `the official tutorials <https://datasette.io/tutorials>`__.

Support questions, feedback? Join the `Datasette Discord <https://datasette.io/discord>`__.
Support questions, feedback? Join our `GitHub Discussions forum <https://github.com/simonw/datasette/discussions>`__.

Contents
--------
@@ -39,7 +39,6 @@ Contents

    getting_started
    installation
    configuration
    ecosystem
    cli-reference
    pages

@@ -60,11 +59,8 @@ Contents

    custom_templates
    plugins
    writing_plugins
    javascript_plugins
    plugin_hooks
    testing_plugins
    internals
    events
    upgrade_guide
    contributing
    changelog
docs/installation.rst

@@ -4,6 +4,9 @@

Installation
==============

.. note::
    If you just want to try Datasette out you don't need to install anything: see :ref:`getting_started_glitch`

There are two main options for installing Datasette. You can install it directly on to your machine, or you can install it using Docker.

If you want to start making contributions to the Datasette project by installing a copy that lets you directly modify the code, take a look at our guide to :ref:`devenvironment`.
@@ -54,7 +57,7 @@ If the latest packaged release of Datasette has not yet been made available thro

Using pip
---------

Datasette requires Python 3.10 or higher. The `Python.org Python For Beginners <https://www.python.org/about/gettingstarted/>`__ page has instructions for getting started.
Datasette requires Python 3.7 or higher. The `Python.org Python For Beginners <https://www.python.org/about/gettingstarted/>`__ page has instructions for getting started.

You can install Datasette and its dependencies using ``pip``::
@@ -99,21 +102,11 @@ Installing plugins using pipx

You can install additional datasette plugins with ``pipx inject`` like so::

    pipx inject datasette datasette-json-html

::

    $ pipx inject datasette datasette-json-html
    injected package datasette-json-html into venv datasette
    done! ✨ 🌟 ✨

Then to confirm the plugin was installed correctly:

::

    datasette plugins

.. code-block:: json

    $ datasette plugins
    [
        {
            "name": "datasette-json-html",
@@ -128,18 +121,12 @@ Upgrading packages using pipx

You can upgrade your pipx installation to the latest release of Datasette using ``pipx upgrade datasette``::

    pipx upgrade datasette

::

    $ pipx upgrade datasette
    upgraded package datasette from 0.39 to 0.40 (location: /Users/simon/.local/pipx/venvs/datasette)

To upgrade a plugin within the pipx environment use ``pipx runpip datasette install -U name-of-plugin`` - like this::

    datasette plugins

.. code-block:: json

    % datasette plugins
    [
        {
            "name": "datasette-vega",
@@ -149,12 +136,7 @@ To upgrade a plugin within the pipx environment use ``pipx runpip datasette inst

        }
    ]

Now upgrade the plugin::

    pipx runpip datasette install -U datasette-vega

::

    $ pipx runpip datasette install -U datasette-vega
    Collecting datasette-vega
    Downloading datasette_vega-0.6.2-py3-none-any.whl (1.8 MB)
        |████████████████████████████████| 1.8 MB 2.0 MB/s
@@ -166,12 +148,7 @@ Now upgrade the plugin::

    Successfully uninstalled datasette-vega-0.6
    Successfully installed datasette-vega-0.6.2

To confirm the upgrade::

    datasette plugins

.. code-block:: json

    $ datasette plugins
    [
        {
            "name": "datasette-vega",
File diff suppressed because it is too large
docs/introspection.rst

@@ -87,7 +87,7 @@ Shows a list of currently installed plugins and their versions. `Plugins example

Add ``?all=1`` to include details of the default plugins baked into Datasette.

.. _JsonDataView_settings:
.. _JsonDataView_config:

/-/settings
-----------
|
|||
"sql_time_limit_ms": 1000
|
||||
}
|
||||
|
||||
.. _JsonDataView_config:
|
||||
|
||||
/-/config
|
||||
---------
|
||||
|
||||
Shows the :ref:`configuration <configuration>` for this instance of Datasette. This is generally the contents of the :ref:`datasette.yaml or datasette.json <configuration_reference>` file, which can include plugin configuration as well. `Config example <https://latest.datasette.io/-/config>`_:
|
||||
|
||||
.. code-block:: json
|
||||
|
||||
{
|
||||
"settings": {
|
||||
"template_debug": true,
|
||||
"trace_debug": true,
|
||||
"force_https_urls": true
|
||||
}
|
||||
}
|
||||
|
||||
Any keys that include the one of the following substrings in their names will be returned as redacted ``***`` output, to help avoid accidentally leaking private configuration information: ``secret``, ``key``, ``password``, ``token``, ``hash``, ``dsn``.
|
||||
|
||||
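A quick way to inspect this endpoint locally is the ``--get`` option::

    datasette --memory --get '/-/config.json'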
.. _JsonDataView_databases:

/-/databases

@@ -144,47 +125,6 @@ Shows currently attached databases. `Databases example <https://latest.datasette

        }
    ]

.. _TablesView:

/-/tables
---------

Returns a JSON list of all tables that the current actor has permission to view. This endpoint uses the resource-based permission system and respects database and table-level access controls.

The endpoint supports a ``?q=`` query parameter for filtering tables by name using case-insensitive regex matching.

`Tables example <https://latest.datasette.io/-/tables>`_:

.. code-block:: json

    {
        "matches": [
            {
                "name": "fixtures/facetable",
                "url": "/fixtures/facetable"
            },
            {
                "name": "fixtures/searchable",
                "url": "/fixtures/searchable"
            }
        ]
    }

Search example with ``?q=facet`` returns only tables matching ``.*facet.*``:

.. code-block:: json

    {
        "matches": [
            {
                "name": "fixtures/facetable",
                "url": "/fixtures/facetable"
            }
        ]
    }

When multiple search terms are provided (e.g. ``?q=user+profile``), tables must match the pattern ``.*user.*profile.*``. Results are ordered by shortest table name first.

.. _JsonDataView_threads:

/-/threads
docs/json_api.rst

@@ -9,99 +9,105 @@ through the Datasette user interface can also be accessed as JSON via the API.

To access the API for a page, either click on the ``.json`` link on that page or
edit the URL and add a ``.json`` extension to it.

.. _json_api_default:

Default representation
----------------------

The default JSON representation of data from a SQLite table or custom query
looks like this:

.. code-block:: json

    {
        "ok": true,
        "rows": [
            {
                "id": 3,
                "name": "Detroit"
            },
            {
                "id": 2,
                "name": "Los Angeles"
            },
            {
                "id": 4,
                "name": "Memnonia"
            },
            {
                "id": 1,
                "name": "San Francisco"
            }
        ],
        "truncated": false
    }

``"ok"`` is always ``true`` if an error did not occur.

The ``"rows"`` key is a list of objects, each one representing a row.

The ``"truncated"`` key lets you know if the query was truncated. This can happen if a SQL query returns more than 1,000 results (or the :ref:`setting_max_returned_rows` setting).

For table pages, an additional key ``"next"`` may be present. This indicates that the next page in the pagination set can be retrieved using ``?_next=VALUE``.

If you started Datasette with the ``--cors`` option, each JSON endpoint will be
served with the following additional HTTP headers::

    Access-Control-Allow-Origin: *
    Access-Control-Allow-Headers: Authorization
    Access-Control-Expose-Headers: Link

This means JavaScript running on any domain will be able to make cross-origin
requests to fetch the data.

If you start Datasette without the ``--cors`` option only JavaScript running on
the same domain as Datasette will be able to access the API.

.. _json_api_shapes:

Different shapes
----------------
The default JSON representation of data from a SQLite table or custom query
looks like this::

    {
        "database": "sf-trees",
        "table": "qSpecies",
        "columns": [
            "id",
            "value"
        ],
        "rows": [
            [
                1,
                "Myoporum laetum :: Myoporum"
            ],
            [
                2,
                "Metrosideros excelsa :: New Zealand Xmas Tree"
            ],
            [
                3,
                "Pinus radiata :: Monterey Pine"
            ]
        ],
        "truncated": false,
        "next": "100",
        "next_url": "http://127.0.0.1:8001/sf-trees-02c8ef1/qSpecies.json?_next=100",
        "query_ms": 1.9571781158447266
    }

The ``columns`` key lists the columns that are being returned, and the ``rows``
key then returns a list of lists, each one representing a row. The order of the
values in each row corresponds to the columns.

The ``_shape`` parameter can be used to access alternative formats for the
``rows`` key which may be more convenient for your application. There are
several options:

* ``?_shape=objects`` - ``"rows"`` is a list of JSON key/value objects - the default
* ``?_shape=arrays`` - ``"rows"`` is a list of lists, where the order of values in each list matches the order of the columns
* ``?_shape=array`` - a JSON array of objects - effectively just the ``"rows"`` key from the default representation
* ``?_shape=arrays`` - ``"rows"`` is the default option, shown above
* ``?_shape=objects`` - ``"rows"`` is a list of JSON key/value objects
* ``?_shape=array`` - a JSON array of objects
* ``?_shape=array&_nl=on`` - a newline-separated list of JSON objects
* ``?_shape=arrayfirst`` - a flat JSON array containing just the first value from each row
* ``?_shape=object`` - a JSON object keyed using the primary keys of the rows
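A sketch of trying these shapes against a local instance (the database and
table names are illustrative)::

    curl 'http://127.0.0.1:8001/mydb/mytable.json?_shape=array'
    curl 'http://127.0.0.1:8001/mydb/mytable.json?_shape=arrayfirst'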
``_shape=arrays`` looks like this:

.. code-block:: json

    {
        "ok": true,
        "next": null,
        "rows": [
            [3, "Detroit"],
            [2, "Los Angeles"],
            [4, "Memnonia"],
            [1, "San Francisco"]
        ]
    }

``_shape=objects`` looks like this::

    {
        "database": "sf-trees",
        ...
        "rows": [
            {
                "id": 1,
                "value": "Myoporum laetum :: Myoporum"
            },
            {
                "id": 2,
                "value": "Metrosideros excelsa :: New Zealand Xmas Tree"
            },
            {
                "id": 3,
                "value": "Pinus radiata :: Monterey Pine"
            }
        ]
    }

``_shape=array`` looks like this:

.. code-block:: json

    [
        {
            "id": 3,
            "name": "Detroit"
        },
        {
            "id": 2,
            "name": "Los Angeles"
        },
        {
            "id": 4,
            "name": "Memnonia"
        },
        {
            "id": 1,
            "name": "San Francisco"
        }
    ]

``_shape=array`` looks like this::

    [
        {
            "id": 1,
            "value": "Myoporum laetum :: Myoporum"
        },
        {
            "id": 2,
            "value": "Metrosideros excelsa :: New Zealand Xmas Tree"
        },
        {
            "id": 3,
            "value": "Pinus radiata :: Monterey Pine"
        }
    ]

``_shape=array&_nl=on`` looks like this::
@@ -110,29 +116,25 @@ options:

    {"id": 2, "value": "Metrosideros excelsa :: New Zealand Xmas Tree"}
    {"id": 3, "value": "Pinus radiata :: Monterey Pine"}

``_shape=arrayfirst`` looks like this:

.. code-block:: json
``_shape=arrayfirst`` looks like this::

    [1, 2, 3]

``_shape=object`` looks like this:

.. code-block:: json
``_shape=object`` looks like this::

    {
        "1": {
            "id": 1,
            "value": "Myoporum laetum :: Myoporum"
        },
        "2": {
            "id": 2,
            "value": "Metrosideros excelsa :: New Zealand Xmas Tree"
        },
        "3": {
            "id": 3,
            "value": "Pinus radiata :: Monterey Pine"
        }
    }

The ``object`` shape is only available for queries against tables - custom SQL
@@ -237,9 +239,6 @@ You can filter the data returned by the table based on column values using a que

``?column__contains=value``
    Rows where the string column contains the specified value (``column like "%value%"`` in SQL).

``?column__notcontains=value``
    Rows where the string column does not contain the specified value (``column not like "%value%"`` in SQL).

``?column__endswith=value``
    Rows where the string column ends with the specified value (``column like "%value"`` in SQL).
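For example, a sketch combining two of these filters in a single URL (the
column name is illustrative)::

    /mydb/mytable.json?name__contains=san&name__endswith=francisco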
@@ -347,7 +346,7 @@ Special table arguments

    though this could potentially result in errors if the wrong syntax is used.

``?_where=SQL-fragment``
    If the :ref:`actions_execute_sql` permission is enabled, this parameter
    If the :ref:`permissions_execute_sql` permission is enabled, this parameter
    can be used to pass one or more additional SQL fragments to be used in the
    `WHERE` clause of the SQL used to query the table.
@@ -416,9 +415,7 @@ column - you can turn that off using ``?_labels=off``.

You can request foreign keys be expanded in JSON using the ``_labels=on`` or
``_label=COLUMN`` special query string parameters. Here's what an expanded row
looks like:

.. code-block:: json
looks like::

    [
        {
@@ -457,523 +454,4 @@ You can find this near the top of the source code of those pages, looking like t

The JSON URL is also made available in a ``Link`` HTTP header for the page::

    Link: <https://latest.datasette.io/fixtures/sortable.json>; rel="alternate"; type="application/json+datasette"

.. _json_api_cors:

Enabling CORS
-------------

If you start Datasette with the ``--cors`` option, each JSON endpoint will be
served with the following additional HTTP headers:

.. [[[cog
    from datasette.utils import add_cors_headers
    import textwrap
    headers = {}
    add_cors_headers(headers)
    output = "\n".join("{}: {}".format(k, v) for k, v in headers.items())
    cog.out("\n::\n\n")
    cog.out(textwrap.indent(output, '    '))
    cog.out("\n\n")
.. ]]]

::

    Access-Control-Allow-Origin: *
    Access-Control-Allow-Headers: Authorization, Content-Type
    Access-Control-Expose-Headers: Link
    Access-Control-Allow-Methods: GET, POST, HEAD, OPTIONS
    Access-Control-Max-Age: 3600

.. [[[end]]]

This allows JavaScript running on any domain to make cross-origin
requests to interact with the Datasette API.

If you start Datasette without the ``--cors`` option only JavaScript running on
the same domain as Datasette will be able to access the API.

Here's how to serve ``data.db`` with CORS enabled::

    datasette data.db --cors
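A sketch of verifying those headers from the command line::

    datasette data.db --cors &
    sleep 2  # give the server a moment to start
    curl -s -I 'http://127.0.0.1:8001/data.json' | grep -i access-control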
.. _json_api_write:

The JSON write API
------------------

Datasette provides a write API for JSON data. This is a POST-only API that requires an authenticated API token, see :ref:`CreateTokenView`. The token will need to have the specified :ref:`authentication_permissions`.

.. _TableInsertView:

Inserting rows
~~~~~~~~~~~~~~

This requires the :ref:`actions_insert_row` permission.

A single row can be inserted using the ``"row"`` key:

::

    POST /<database>/<table>/-/insert
    Content-Type: application/json
    Authorization: Bearer dstok_<rest-of-token>

.. code-block:: json

    {
        "row": {
            "column1": "value1",
            "column2": "value2"
        }
    }

If successful, this will return a ``201`` status code and the newly inserted row, for example:

.. code-block:: json

    {
        "rows": [
            {
                "id": 1,
                "column1": "value1",
                "column2": "value2"
            }
        ]
    }
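A sketch of making that request with ``curl`` (the database, table and token
are illustrative)::

    curl -X POST 'http://127.0.0.1:8001/mydb/mytable/-/insert' \
        -H 'Authorization: Bearer dstok_xxx' \
        -H 'Content-Type: application/json' \
        -d '{"row": {"column1": "value1", "column2": "value2"}}'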

To insert multiple rows at a time, use the same API method but send a list of dictionaries as the ``"rows"`` key:

::

    POST /<database>/<table>/-/insert
    Content-Type: application/json
    Authorization: Bearer dstok_<rest-of-token>

.. code-block:: json

    {
        "rows": [
            {
                "column1": "value1",
                "column2": "value2"
            },
            {
                "column1": "value3",
                "column2": "value4"
            }
        ]
    }

If successful, this will return a ``201`` status code and a ``{"ok": true}`` response body.

The maximum number of rows that can be submitted at once defaults to 100, but this can be changed using the :ref:`setting_max_insert_rows` setting.

To return the newly inserted rows, add the ``"return": true`` key to the request body:

.. code-block:: json

    {
        "rows": [
            {
                "column1": "value1",
                "column2": "value2"
            },
            {
                "column1": "value3",
                "column2": "value4"
            }
        ],
        "return": true
    }

This will return the same ``"rows"`` key as the single row example above. There is a small performance penalty for using this option.

If any of your rows have a primary key that is already in use, you will get an error and none of the rows will be inserted:

.. code-block:: json

    {
        "ok": false,
        "errors": [
            "UNIQUE constraint failed: new_table.id"
        ]
    }

Pass ``"ignore": true`` to ignore these errors and insert the other rows:

.. code-block:: json

    {
        "rows": [
            {
                "id": 1,
                "column1": "value1",
                "column2": "value2"
            },
            {
                "id": 2,
                "column1": "value3",
                "column2": "value4"
            }
        ],
        "ignore": true
    }

Or you can pass ``"replace": true`` to replace any rows with conflicting primary keys with the new values. This requires the :ref:`actions_update_row` permission.

Pass ``"alter": true`` to automatically add any missing columns to the table. This requires the :ref:`actions_alter_table` permission.

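
A bulk insert with ``"ignore": true`` follows the same pattern. This sketch also shows how the ``{"ok": false, "errors": [...]}`` error shape described above surfaces as an HTTP ``400``; the URL, table and token handling are the same assumptions as before:

.. code-block:: python

    import json
    import os
    import urllib.error
    import urllib.request

    BASE_URL = "http://127.0.0.1:8001"  # assumed local instance
    TOKEN = os.environ["DATASETTE_TOKEN"]

    payload = {
        "rows": [
            {"id": 1, "column1": "value1", "column2": "value2"},
            {"id": 2, "column1": "value3", "column2": "value4"},
        ],
        "ignore": True,  # skip rows whose primary key already exists
    }
    request = urllib.request.Request(
        f"{BASE_URL}/data/my_table/-/insert",
        data=json.dumps(payload).encode("utf-8"),
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {TOKEN}",
        },
    )
    try:
        with urllib.request.urlopen(request) as response:
            print(json.load(response))  # {"ok": true}
    except urllib.error.HTTPError as error:
        # 400 responses carry {"ok": false, "errors": [...]} as JSON
        print(error.code, json.load(error))
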

.. _TableUpsertView:

Upserting rows
~~~~~~~~~~~~~~

An upsert is an insert or update operation. If a row with a matching primary key already exists it will be updated - otherwise a new row will be inserted.

The upsert API is mostly the same shape as the :ref:`insert API <TableInsertView>`. It requires both the :ref:`actions_insert_row` and :ref:`actions_update_row` permissions.

::

    POST /<database>/<table>/-/upsert
    Content-Type: application/json
    Authorization: Bearer dstok_<rest-of-token>

.. code-block:: json

    {
        "rows": [
            {
                "id": 1,
                "title": "Updated title for 1",
                "description": "Updated description for 1"
            },
            {
                "id": 2,
                "description": "Updated description for 2"
            },
            {
                "id": 3,
                "title": "Item 3",
                "description": "Description for 3"
            }
        ]
    }

Imagine a table with a primary key of ``id`` which already has rows with ``id`` values of ``1`` and ``2``.

The above example will:

- Update the row with ``id`` of ``1`` to set both ``title`` and ``description`` to the new values
- Update the row with ``id`` of ``2`` to set ``description`` to the new value - ``title`` will be left unchanged
- Insert a new row with ``id`` of ``3`` with both ``title`` and ``description`` set to the new values

Similar to ``/-/insert``, a ``row`` key with an object can be used instead of a ``rows`` array to upsert a single row.

If successful, this will return a ``200`` status code and a ``{"ok": true}`` response body.

Add ``"return": true`` to the request body to return full copies of the affected rows after they have been inserted or updated:

.. code-block:: json

    {
        "rows": [
            {
                "id": 1,
                "title": "Updated title for 1",
                "description": "Updated description for 1"
            },
            {
                "id": 2,
                "description": "Updated description for 2"
            },
            {
                "id": 3,
                "title": "Item 3",
                "description": "Description for 3"
            }
        ],
        "return": true
    }

This will return the following:

.. code-block:: json

    {
        "ok": true,
        "rows": [
            {
                "id": 1,
                "title": "Updated title for 1",
                "description": "Updated description for 1"
            },
            {
                "id": 2,
                "title": "Item 2",
                "description": "Updated description for 2"
            },
            {
                "id": 3,
                "title": "Item 3",
                "description": "Description for 3"
            }
        ]
    }

When using upsert you must provide the primary key column (or columns if the table has a compound primary key) for every row, or you will get a ``400`` error:

.. code-block:: json

    {
        "ok": false,
        "errors": [
            "Row 0 is missing primary key column(s): \"id\""
        ]
    }

If your table does not have an explicit primary key you should pass the SQLite ``rowid`` key instead.

Pass ``"alter": true`` to automatically add any missing columns to the table. This requires the :ref:`actions_alter_table` permission.

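
A matching Python sketch for an upsert with ``"return": true``, under the same assumed URL, table and token conventions as the insert examples:

.. code-block:: python

    import json
    import os
    import urllib.request

    BASE_URL = "http://127.0.0.1:8001"  # assumed local instance
    TOKEN = os.environ["DATASETTE_TOKEN"]

    payload = {
        "rows": [
            {"id": 1, "title": "Updated title for 1"},
            {"id": 3, "title": "Item 3", "description": "Description for 3"},
        ],
        "return": True,  # ask for full copies of the affected rows
    }
    request = urllib.request.Request(
        f"{BASE_URL}/data/my_table/-/upsert",
        data=json.dumps(payload).encode("utf-8"),
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {TOKEN}",
        },
    )
    with urllib.request.urlopen(request) as response:
        for row in json.load(response)["rows"]:
            print(row)  # each inserted or updated row, after the upsert
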

.. _RowUpdateView:

Updating a row
~~~~~~~~~~~~~~

To update a row, make a ``POST`` to ``/<database>/<table>/<row-pks>/-/update``. This requires the :ref:`actions_update_row` permission.

::

    POST /<database>/<table>/<row-pks>/-/update
    Content-Type: application/json
    Authorization: Bearer dstok_<rest-of-token>

.. code-block:: json

    {
        "update": {
            "text_column": "New text string",
            "integer_column": 3,
            "float_column": 3.14
        }
    }

``<row-pks>`` here is the :ref:`tilde-encoded <internals_tilde_encoding>` primary key value of the row to update - or a comma-separated list of primary key values if the table has a composite primary key.

You only need to pass the columns you want to update. Any other columns will be left unchanged.

If successful, this will return a ``200`` status code and a ``{"ok": true}`` response body.

Add ``"return": true`` to the request body to return the updated row:

.. code-block:: json

    {
        "update": {
            "title": "New title"
        },
        "return": true
    }

The returned JSON will look like this:

.. code-block:: json

    {
        "ok": true,
        "row": {
            "id": 1,
            "title": "New title",
            "other_column": "Will be present here too"
        }
    }

Any errors will return ``{"errors": ["... descriptive message ..."], "ok": false}``, and a ``400`` status code for a bad input or a ``403`` status code for an authentication or permission error.

Pass ``"alter": true`` to automatically add any missing columns to the table. This requires the :ref:`actions_alter_table` permission.

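
The tilde-encoding used for ``<row-pks>`` can be pictured with a short sketch. This is an illustration of the scheme rather than Datasette's own implementation, and the exact safe character set here is an assumption - every other character is emitted as ``~`` plus two hex digits per UTF-8 byte:

.. code-block:: python

    def tilde_encode(value: str) -> str:
        # Assumed safe set: ASCII letters, digits, hyphen and underscore
        safe = set(
            "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
            "abcdefghijklmnopqrstuvwxyz"
            "0123456789-_"
        )
        return "".join(
            char
            if char in safe
            else "".join(f"~{byte:02X}" for byte in char.encode("utf-8"))
            for char in value
        )


    pk = tilde_encode("uk.org.publicwhip/person/10001")
    print(pk)  # uk~2Eorg~2Epublicwhip~2Fperson~2F10001
    update_url = f"/regmem/people/{pk}/-/update"
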

.. _RowDeleteView:

Deleting a row
~~~~~~~~~~~~~~

To delete a row, make a ``POST`` to ``/<database>/<table>/<row-pks>/-/delete``. This requires the :ref:`actions_delete_row` permission.

::

    POST /<database>/<table>/<row-pks>/-/delete
    Content-Type: application/json
    Authorization: Bearer dstok_<rest-of-token>

``<row-pks>`` here is the :ref:`tilde-encoded <internals_tilde_encoding>` primary key value of the row to delete - or a comma-separated list of primary key values if the table has a composite primary key.

If successful, this will return a ``200`` status code and a ``{"ok": true}`` response body.

Any errors will return ``{"errors": ["... descriptive message ..."], "ok": false}``, and a ``400`` status code for a bad input or a ``403`` status code for an authentication or permission error.


.. _TableCreateView:

Creating a table
~~~~~~~~~~~~~~~~

To create a table, make a ``POST`` to ``/<database>/-/create``. This requires the :ref:`actions_create_table` permission.

::

    POST /<database>/-/create
    Content-Type: application/json
    Authorization: Bearer dstok_<rest-of-token>

.. code-block:: json

    {
        "table": "name_of_new_table",
        "columns": [
            {
                "name": "id",
                "type": "integer"
            },
            {
                "name": "title",
                "type": "text"
            }
        ],
        "pk": "id"
    }

The JSON here describes the table that will be created:

* ``table`` is the name of the table to create. This field is required.
* ``columns`` is a list of columns to create. Each column is a dictionary with ``name`` and ``type`` keys.

  - ``name`` is the name of the column. This is required.
  - ``type`` is the type of the column. This is optional - if not provided, ``text`` will be assumed. The valid types are ``text``, ``integer``, ``float`` and ``blob``.

* ``pk`` is the primary key for the table. This is optional - if not provided, Datasette will create a SQLite table with a hidden ``rowid`` column.

  If the primary key is an integer column, it will be configured to automatically increment for each new record.

  If you set this to ``id`` without including an ``id`` column in the list of ``columns``, Datasette will create an auto-incrementing integer ID column for you.

* ``pks`` can be used instead of ``pk`` to create a compound primary key. It should be a JSON list of column names to use in that primary key.
* ``ignore`` can be set to ``true`` to ignore existing rows by primary key if the table already exists.
* ``replace`` can be set to ``true`` to replace existing rows by primary key if the table already exists. This requires the :ref:`actions_update_row` permission.
* ``alter`` can be set to ``true`` if you want to automatically add any missing columns to the table. This requires the :ref:`actions_alter_table` permission.

If the table is successfully created this will return a ``201`` status code and the following response:

.. code-block:: json

    {
        "ok": true,
        "database": "data",
        "table": "name_of_new_table",
        "table_url": "http://127.0.0.1:8001/data/name_of_new_table",
        "table_api_url": "http://127.0.0.1:8001/data/name_of_new_table.json",
        "schema": "CREATE TABLE [name_of_new_table] (\n [id] INTEGER PRIMARY KEY,\n [title] TEXT\n)"
    }

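
A sketch of the same request from Python - as with the earlier examples, the instance URL, the ``data`` database and the token handling are assumptions made for the sake of a runnable example:

.. code-block:: python

    import json
    import os
    import urllib.request

    BASE_URL = "http://127.0.0.1:8001"  # assumed local instance
    TOKEN = os.environ["DATASETTE_TOKEN"]

    payload = {
        "table": "name_of_new_table",
        "columns": [
            {"name": "id", "type": "integer"},
            {"name": "title", "type": "text"},
        ],
        "pk": "id",
    }
    request = urllib.request.Request(
        f"{BASE_URL}/data/-/create",
        data=json.dumps(payload).encode("utf-8"),
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {TOKEN}",
        },
    )
    with urllib.request.urlopen(request) as response:
        created = json.load(response)
        print(created["table_url"])
        print(created["schema"])
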

.. _TableCreateView_example:

Creating a table from example data
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Instead of specifying ``columns`` directly you can pass a single example ``row`` or a list of ``rows``.
Datasette will create a table with a schema that matches those rows and insert them for you:

::

    POST /<database>/-/create
    Content-Type: application/json
    Authorization: Bearer dstok_<rest-of-token>

.. code-block:: json

    {
        "table": "creatures",
        "rows": [
            {
                "id": 1,
                "name": "Tarantula"
            },
            {
                "id": 2,
                "name": "Kākāpō"
            }
        ],
        "pk": "id"
    }

Doing this requires both the :ref:`actions_create_table` and :ref:`actions_insert_row` permissions.

The ``201`` response here will be similar to the ``columns`` form, but will also include the number of rows that were inserted as ``row_count``:

.. code-block:: json

    {
        "ok": true,
        "database": "data",
        "table": "creatures",
        "table_url": "http://127.0.0.1:8001/data/creatures",
        "table_api_url": "http://127.0.0.1:8001/data/creatures.json",
        "schema": "CREATE TABLE [creatures] (\n [id] INTEGER PRIMARY KEY,\n [name] TEXT\n)",
        "row_count": 2
    }

You can call the create endpoint multiple times for the same table provided you are specifying the table using the ``rows`` or ``row`` option. New rows will be inserted into the table each time. This means you can use this API if you are unsure if the relevant table has been created yet.

If you pass a row to the create endpoint with a primary key that already exists you will get an error that looks like this:

.. code-block:: json

    {
        "ok": false,
        "errors": [
            "UNIQUE constraint failed: creatures.id"
        ]
    }

You can avoid this error by passing the same ``"ignore": true`` or ``"replace": true`` options to the create endpoint as you can to the :ref:`insert endpoint <TableInsertView>`.

To use the ``"replace": true`` option you will also need the :ref:`actions_update_row` permission.

Pass ``"alter": true`` to automatically add any missing columns to the existing table that are present in the rows you are submitting. This requires the :ref:`actions_alter_table` permission.


.. _TableDropView:

Dropping tables
~~~~~~~~~~~~~~~

To drop a table, make a ``POST`` to ``/<database>/<table>/-/drop``. This requires the :ref:`actions_drop_table` permission.

::

    POST /<database>/<table>/-/drop
    Content-Type: application/json
    Authorization: Bearer dstok_<rest-of-token>

Without a POST body this will return a status ``200`` with a note about how many rows will be deleted:

.. code-block:: json

    {
        "ok": true,
        "database": "<database>",
        "table": "<table>",
        "row_count": 5,
        "message": "Pass \"confirm\": true to confirm"
    }

If you pass the following POST body:

.. code-block:: json

    {
        "confirm": true
    }

Then the table will be dropped and a status ``200`` response of ``{"ok": true}`` will be returned.

Any errors will return ``{"errors": ["... descriptive message ..."], "ok": false}``, and a ``400`` status code for a bad input or a ``403`` status code for an authentication or permission error.

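
The two-step confirmation maps onto two POSTs. A sketch under the same assumed URL and token conventions as the earlier examples - here an empty JSON object stands in for the empty first request body, since it carries no ``"confirm"`` key:

.. code-block:: python

    import json
    import os
    import urllib.request

    BASE_URL = "http://127.0.0.1:8001"  # assumed local instance
    TOKEN = os.environ["DATASETTE_TOKEN"]
    HEADERS = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {TOKEN}",
    }
    drop_url = f"{BASE_URL}/data/my_table/-/drop"

    # First POST: no "confirm" key, so nothing is dropped yet
    preview = urllib.request.urlopen(
        urllib.request.Request(drop_url, data=b"{}", method="POST", headers=HEADERS)
    )
    print(json.load(preview)["row_count"], "rows would be deleted")

    # Second POST: confirm the drop
    confirmed = urllib.request.urlopen(
        urllib.request.Request(
            drop_url,
            data=json.dumps({"confirm": True}).encode("utf-8"),
            method="POST",
            headers=HEADERS,
        )
    )
    print(json.load(confirmed))  # {"ok": true}
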

Metadata
========

Data loves metadata. Any time you run Datasette you can optionally include a
JSON file with metadata about your databases and tables. Datasette will then
display that information in the web UI.

Run Datasette like this::

    datasette database1.db database2.db --metadata metadata.json

Your ``metadata.json`` file can look something like this:


.. code-block:: json

    {
        "title": "Custom title for your index page",
        "description": "Some description text can go here",
        "license": "ODbL",
        "license_url": "https://opendatacommons.org/licenses/odbl/",
        "source": "Original Data Source",
        "source_url": "http://example.com/"
    }

You can optionally use YAML instead of JSON, see :ref:`metadata_yaml`.

The above metadata will be displayed on the index page of your Datasette-powered
site. The source and license information will also be included in the footer of
every page served by Datasette.


Per-database and per-table metadata
-----------------------------------

Metadata at the top level of the JSON will be shown on the index page and in the
footer on every page of the site. The license and source are expected to apply to
all of your data.

You can also provide metadata at the per-database or per-table level, like this:

.. code-block:: json

    {
        "databases": {
            "database1": {
                "source": "Alternative source",
                "source_url": "http://example.com/",
                "tables": {
                    "example_table": {
                        "description_html": "Custom <em>table</em> description",
                        "license": "CC BY 3.0 US",
                        "license_url": "https://creativecommons.org/licenses/by/3.0/us/"
                    }
                }
            }
        }
    }

Each of the top-level metadata fields can be used at the database and table level.


Column descriptions
-------------------

You can include descriptions for your columns by adding a ``"columns": {"name-of-column": "description-of-column"}`` block to your table metadata:

.. code-block:: json

    {
        "databases": {
            "database1": {
                "tables": {
                    "example_table": {
                        "columns": {
                            "column1": "Description of column 1",
                            "column2": "Description of column 2"
                        }
                    }
                }
            }
        }
    }

These will be displayed at the top of the table page, and will also show in the cog menu for each column.

You can see an example of how these look at `latest.datasette.io/fixtures/roadside_attractions <https://latest.datasette.io/fixtures/roadside_attractions>`__.


Specifying units for a column
-----------------------------

Datasette supports attaching units to a column, which will be used when displaying
values from that column. SI prefixes will be used where appropriate.

Column units are configured in the metadata like so:

.. code-block:: json

    {
        "databases": {
            "database1": {
                "tables": {
                    "example_table": {
                        "units": {
                            "column1": "metres",
                            "column2": "Hz"
                        }
                    }
                }
            }
        }
    }

Units are interpreted using Pint_, and you can see the full list of available units in
Pint's `unit registry`_. You can also add `custom units`_ to the metadata, which will be
registered with Pint:

.. code-block:: json

    {
        "custom_units": [
            "decibel = [] = dB"
        ]
    }

.. _Pint: https://pint.readthedocs.io/
.. _unit registry: https://github.com/hgrecco/pint/blob/master/pint/default_en.txt
.. _custom units: http://pint.readthedocs.io/en/latest/defining.html

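
If you want to check a unit string or a custom definition before adding it to your metadata, you can exercise it directly with Pint. A small sketch, assuming ``pip install pint``:

.. code-block:: python

    import pint

    registry = pint.UnitRegistry()
    # The same definition string as the "custom_units" example above
    registry.define("decibel = [] = dB")

    distance = registry.Quantity(3, "metre")
    level = registry.Quantity(70, "dB")
    print(distance)     # 3 metre
    print(level.units)  # decibel
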

.. _metadata_default_sort:

Setting a default sort order
----------------------------

By default Datasette tables are sorted by primary key. You can over-ride this default for a specific table using the ``"sort"`` or ``"sort_desc"`` metadata properties:

.. code-block:: json

    {
        "databases": {
            "mydatabase": {
                "tables": {
                    "example_table": {
                        "sort": "created"
                    }
                }
            }
        }
    }


Or use ``"sort_desc"`` to sort in descending order:

.. code-block:: json

    {
        "databases": {
            "mydatabase": {
                "tables": {
                    "example_table": {
                        "sort_desc": "created"
                    }
                }
            }
        }
    }


.. _metadata_page_size:

Setting a custom page size
--------------------------

Datasette defaults to displaying 100 rows per page, for both tables and views. You can change this default page size on a per-table or per-view basis using the ``"size"`` key in ``metadata.json``:

.. code-block:: json

    {
        "databases": {
            "mydatabase": {
                "tables": {
                    "example_table": {
                        "size": 10
                    }
                }
            }
        }
    }

This size can still be over-ridden by passing e.g. ``?_size=50`` in the query string.


Setting which columns can be used for sorting
---------------------------------------------

Datasette allows any column to be used for sorting by default. If you need to
control which columns are available for sorting you can do so using the optional
``sortable_columns`` key:

.. code-block:: json

    {
        "databases": {
            "database1": {
                "tables": {
                    "example_table": {
                        "sortable_columns": [
                            "height",
                            "weight"
                        ]
                    }
                }
            }
        }
    }

This will restrict sorting of ``example_table`` to just the ``height`` and
``weight`` columns.


You can also disable sorting entirely by setting ``"sortable_columns": []``.

You can use ``sortable_columns`` to enable specific sort orders for a view called ``name_of_view`` in the database ``my_database`` like so:

.. code-block:: json

    {
        "databases": {
            "my_database": {
                "tables": {
                    "name_of_view": {
                        "sortable_columns": [
                            "clicks",
                            "impressions"
                        ]
                    }
                }
            }
        }
    }


.. _label_columns:

Specifying the label column for a table
---------------------------------------

Datasette's HTML interface attempts to display foreign key references as
labelled hyperlinks. By default, it looks for referenced tables with only two
columns: a primary key column and one other. It assumes that the second
column should be used as the link label.

If your table has more than two columns you can specify which column should be
used for the link label with the ``label_column`` property:

.. code-block:: json

    {
        "databases": {
            "database1": {
                "tables": {
                    "example_table": {
                        "label_column": "title"
                    }
                }
            }
        }
    }


.. _metadata_hiding_tables:

Hiding tables
-------------

You can hide tables from the database listing view (in the same way that FTS and
SpatiaLite tables are automatically hidden) using ``"hidden": true``:

.. code-block:: json

    {
        "databases": {
            "database1": {
                "tables": {
                    "example_table": {
                        "hidden": true
                    }
                }
            }
        }
    }

.. _metadata_yaml:

Using YAML for metadata
-----------------------

Datasette accepts YAML as an alternative to JSON for your metadata configuration file. YAML is particularly useful for including multiline HTML and SQL strings.

Here's an example of a ``metadata.yml`` file, reusing an example from :ref:`canned_queries`.

.. code-block:: yaml

    title: Demonstrating Metadata from YAML
    description_html: |-
        <p>This description includes a long HTML string</p>
        <ul>
          <li>YAML is better for embedding HTML strings than JSON!</li>
        </ul>
    license: ODbL
    license_url: https://opendatacommons.org/licenses/odbl/
    databases:
        fixtures:
            tables:
                no_primary_key:
                    hidden: true
            queries:
                neighborhood_search:
                    sql: |-
                        select neighborhood, facet_cities.name, state
                        from facetable join facet_cities on facetable.city_id = facet_cities.id
                        where neighborhood like '%' || :text || '%' order by neighborhood;
                    title: Search neighborhoods
                    description_html: |-
                        <p>This demonstrates <em>basic</em> LIKE search

The ``metadata.yml`` file is passed to Datasette using the same ``--metadata`` option::

    datasette fixtures.db --metadata metadata.yml

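
Because the two formats are equivalent, converting between them is mechanical. A sketch using PyYAML (``pip install pyyaml``) that turns a ``metadata.yml`` file like the one above into ``metadata.json``:

.. code-block:: python

    import json

    import yaml  # provided by the PyYAML package

    with open("metadata.yml") as f:
        metadata = yaml.safe_load(f)

    with open("metadata.json", "w") as f:
        json.dump(metadata, f, indent=4)
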

.. _metadata_reference:

Metadata reference
------------------

A full reference of every supported option in a ``metadata.json`` or ``metadata.yaml`` file.

Top-level metadata
~~~~~~~~~~~~~~~~~~

"Top-level" metadata refers to fields that can be specified at the root level of a metadata file. These attributes are meant to describe the entire Datasette instance.

The following is the full list of allowed top-level metadata fields:

- ``title``
- ``description``
- ``description_html``
- ``license``
- ``license_url``
- ``source``
- ``source_url``

Database-level metadata
~~~~~~~~~~~~~~~~~~~~~~~

"Database-level" metadata refers to fields that can be specified for each database in a Datasette instance. These attributes should be listed under a database inside the ``"databases"`` field.

The following is the full list of allowed database-level metadata fields:

- ``source``
- ``source_url``
- ``license``
- ``license_url``
- ``about``
- ``about_url``

Table-level metadata
~~~~~~~~~~~~~~~~~~~~

"Table-level" metadata refers to fields that can be specified for each table in a Datasette instance. These attributes should be listed under a specific table using the ``"tables"`` field.

The following is the full list of allowed table-level metadata fields:

- ``source``
- ``source_url``
- ``license``
- ``license_url``
- ``about``
- ``about_url``
- ``hidden``
- ``sort/sort_desc``
- ``size``
- ``sortable_columns``
- ``label_column``
- ``facets``
- ``fts_table``
- ``fts_pk``
- ``searchmode``
- ``columns``


Pages and API endpoints
=======================

Top-level index
===============

The root page of any Datasette installation is an index page that lists all of the currently attached databases. Some examples:

* `fivethirtyeight.datasettes.com <https://fivethirtyeight.datasettes.com/>`_
* `global-power-plants.datasettes.com <https://global-power-plants.datasettes.com/>`_
* `register-of-members-interests.datasettes.com <https://register-of-members-interests.datasettes.com/>`_

Add ``/.json`` to the end of the URL for the JSON version of the underlying data:

* `fivethirtyeight.datasettes.com/.json <https://fivethirtyeight.datasettes.com/.json>`_
* `global-power-plants.datasettes.com/.json <https://global-power-plants.datasettes.com/.json>`_
* `register-of-members-interests.datasettes.com/.json <https://register-of-members-interests.datasettes.com/.json>`_

The index page can also be accessed at ``/-/``, useful if the default index page has been replaced using an :ref:`index.html custom template <customization_custom_templates>`. The ``/-/`` page will always render the default Datasette ``index.html`` template.

.. _DatabaseView:

Database
========

Each database has a page listing the tables, views and canned queries available for that database. If the :ref:`actions_execute_sql` permission is enabled (it's on by default) there will also be an interface for executing arbitrary SQL select queries against the data.

Examples:

* `fivethirtyeight.datasettes.com/fivethirtyeight <https://fivethirtyeight.datasettes.com/fivethirtyeight>`_
* `global-power-plants.datasettes.com/global-power-plants <https://global-power-plants.datasettes.com/global-power-plants>`_

The JSON version of this page provides programmatic access to the underlying data:

* `fivethirtyeight.datasettes.com/fivethirtyeight.json <https://fivethirtyeight.datasettes.com/fivethirtyeight.json>`_
* `global-power-plants.datasettes.com/global-power-plants.json <https://global-power-plants.datasettes.com/global-power-plants.json>`_

.. _DatabaseView_hidden:

Hidden tables
-------------

Some tables listed on the database page are treated as hidden. Hidden tables are not completely invisible - they can be accessed through the "hidden tables" link at the bottom of the page. They are hidden because they represent low-level implementation details which are generally not useful to end-users of Datasette.

The following tables are hidden by default:

- Any table with a name that starts with an underscore - this is a Datasette convention to help plugins easily hide their own internal tables.
- Tables that have been configured as ``"hidden": true`` using :ref:`metadata_hiding_tables`.
- ``*_fts`` tables that implement SQLite full-text search indexes.
- Tables relating to the inner workings of the SpatiaLite SQLite extension.
- ``sqlite_stat`` tables used to store statistics used by the query optimizer.

.. _QueryView:

Queries
=======

The ``/database-name/-/query`` page can be used to execute an arbitrary SQL query against that database, if the :ref:`actions_execute_sql` permission is enabled. This query is passed as the ``?sql=`` query string parameter.

This means you can link directly to a query by constructing the following URL:

``/database-name/-/query?sql=SELECT+*+FROM+table_name``

Each configured :ref:`canned query <canned_queries>` has its own page, at ``/database-name/query-name``. Viewing this page will execute the query and display the results.

In both cases adding a ``.json`` extension to the URL will return the results as JSON.

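
Constructing those ``?sql=`` links programmatically is a matter of URL-encoding the query. A sketch using the standard library - the database name and query here are placeholders:

.. code-block:: python

    from urllib.parse import urlencode

    sql = "select neighborhood, name from facetable limit 10"
    url = "/fixtures/-/query?" + urlencode({"sql": sql})
    print(url)
    # /fixtures/-/query?sql=select+neighborhood%2C+name+from+facetable+limit+10

    # Add a .json extension to the path for machine-readable results
    json_url = "/fixtures/-/query.json?" + urlencode({"sql": sql})
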

.. _TableView:

Table
=====

Some examples:

* `../items <https://register-of-members-interests.datasettes.com/regmem/items>`_ lists all of the line-items registered by UK MPs as potential conflicts of interest. It demonstrates Datasette's support for :ref:`full_text_search`.
* `../antiquities-act%2Factions_under_antiquities_act <https://fivethirtyeight.datasettes.com/fivethirtyeight/antiquities-act%2Factions_under_antiquities_act>`_ is an interface for exploring the "actions under the antiquities act" data table published by FiveThirtyEight.
* `../global-power-plants?country_long=United+Kingdom&primary_fuel=Gas <https://datasette.io/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=owner&_facet=country_long&country_long__exact=United+Kingdom&primary_fuel=Gas>`_ is a filtered table page showing every Gas power plant in the United Kingdom. It includes some default facets (configured using `its metadata.json <https://datasette.io/-/metadata>`_) and uses the `datasette-cluster-map <https://github.com/simonw/datasette-cluster-map>`_ plugin to show a map of the results.

.. _RowView:

Row
===

Table cells with extremely long text contents are truncated on the table view according to the :ref:`setting_truncate_cells_html` setting.

Rows which are the targets of foreign key references from other tables will show a link to a filtered search for all records that reference that row. Here's an example from the Registers of Members Interests database:

`../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001 <https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001>`_

Note that this URL includes the encoded primary key of the record.

Here's that same page as JSON:

`../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json <https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json>`_

Plugins
=======


Things you can do with plugins include:

* Customize how database values are rendered in the Datasette interface, for example
  `datasette-render-binary <https://github.com/simonw/datasette-render-binary>`__ and
  `datasette-pretty-json <https://github.com/simonw/datasette-pretty-json>`__.
* Customize how Datasette's authentication and permissions systems work, for example `datasette-auth-passwords <https://github.com/simonw/datasette-auth-passwords>`__ and
  `datasette-permissions-sql <https://github.com/simonw/datasette-permissions-sql>`__.


.. _plugins_installing:

Installing plugins
------------------

The ``datasette install`` command can also be used to upgrade Datasette itself to the latest released version::

    datasette install -U datasette

You can install multiple plugins at once by listing them as lines in a ``requirements.txt`` file like this::

    datasette-vega
    datasette-cluster-map

Then pass that file to ``datasette install -r``::

    datasette install -r requirements.txt

The ``install`` and ``uninstall`` commands are thin wrappers around ``pip install`` and ``pip uninstall``, which ensure that they run ``pip`` in the same virtual environment as Datasette itself.

One-off plugins using --plugins-dir
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


You can use the name of a package on PyPI or any of the other valid arguments to ``pip install``, such as a URL to a zip file::

    datasette publish cloudrun mydb.db \
        --install=https://url-to-my-package.zip

.. _plugins_datasette_load_plugins:

Controlling which plugins are loaded
------------------------------------

Datasette defaults to loading every plugin that is installed in the same virtual environment as Datasette itself.

You can set the ``DATASETTE_LOAD_PLUGINS`` environment variable to a comma-separated list of plugin names to load a controlled subset of plugins instead.

For example, to load just the ``datasette-vega`` and ``datasette-cluster-map`` plugins, set ``DATASETTE_LOAD_PLUGINS`` to ``datasette-vega,datasette-cluster-map``:

.. code-block:: bash

    export DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map'
    datasette mydb.db

Or:

.. code-block:: bash

    DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map' \
        datasette mydb.db

To disable the loading of all additional plugins, set ``DATASETTE_LOAD_PLUGINS`` to an empty string:

.. code-block:: bash

    export DATASETTE_LOAD_PLUGINS=''
    datasette mydb.db

A quick way to test this setting is to use it with the ``datasette plugins`` command:

.. code-block:: bash

    DATASETTE_LOAD_PLUGINS='datasette-vega' datasette plugins

This should output the following:

.. code-block:: json

    [
        {
            "name": "datasette-vega",
            "static": true,
            "templates": false,
            "version": "0.6.2",
            "hooks": [
                "extra_css_urls",
                "extra_js_urls"
            ]
        }
    ]


.. _plugins_installed:

Seeing what plugins are installed
---------------------------------

You can see a list of installed plugins by navigating to the ``/-/plugins`` page of your Datasette instance.

You can also use the ``datasette plugins`` command::

    datasette plugins

Which outputs:

.. code-block:: json

    [
        {
            "name": "datasette_json_html",
            "static": false,
            "templates": false,
            "version": "1.0.1",
            "hooks": [
                "render_cell"
            ]
        }
    ]

If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:

.. code-block:: json

    [
        {
            "name": "datasette.blob_renderer",
            "static": false,
            "templates": false,
            "version": null,
            "hooks": [
                "register_output_renderer"
            ]
        },
        {
            "name": "datasette.default_actions",
            "static": false,
            "templates": false,
            "version": null,
            "hooks": [
                "register_actions"
            ]
        },
        {
            "name": "datasette.default_magic_parameters",
            "static": false,
            "templates": false,
            "version": null,
            "hooks": [
                "register_magic_parameters"
            ]
        },
        {
            "name": "datasette.default_permissions",
            "static": false,
            "templates": false,
            "version": null,
            "hooks": [
                "actor_from_request",
                "canned_queries",
                "permission_resources_sql",
                "skip_csrf"
            ]
        },
        {
            "name": "datasette.events",
            "static": false,
            "templates": false,
            "version": null,
            "hooks": [
                "register_events"
            ]
        }
    ]


You can add the ``--plugins-dir=`` option to include any plugins found in that directory.

Add ``--requirements`` to output a list of installed plugins that can then be installed in another Datasette instance using ``datasette install -r requirements.txt``::

    datasette plugins --requirements

The output will look something like this::

    datasette-codespaces==0.1.1
    datasette-graphql==2.2
    datasette-json-html==1.0.1
    datasette-pretty-json==0.2.2
    datasette-x-forwarded-host==0.1

To write that to a ``requirements.txt`` file, run this::

    datasette plugins --requirements > requirements.txt

.. _plugins_configuration:

Plugin configuration
--------------------

Plugins can have their own configuration, embedded in a :ref:`metadata` file. Configuration options for plugins live within a ``"plugins"`` key in that file, which can be included at the root, database or table level.

Here is an example of some plugin configuration for a specific table:


.. code-block:: json

    {
        "databases": {
            "sf-trees": {
                "tables": {
                    "Street_Tree_List": {
                        "plugins": {
                            "datasette-cluster-map": {
                                "latitude_column": "lat",
                                "longitude_column": "lng"
                            }
                        }
                    }
                }
            }
        }
    }

This tells the ``datasette-cluster-map`` plugin which latitude and longitude columns should be used for a table called ``Street_Tree_List`` inside a database file called ``sf-trees.db``.

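
Inside a plugin, that configuration is read back with the ``datasette.plugin_config()`` method. A sketch of a plugin hook doing so - the hook body is illustrative, and the ``latitude``/``longitude`` fallbacks are assumptions rather than documented defaults:

.. code-block:: python

    from datasette import hookimpl


    @hookimpl
    def extra_js_urls(database, table, datasette):
        # Look up this plugin's configuration for the current table
        config = (
            datasette.plugin_config(
                "datasette-cluster-map", database=database, table=table
            )
            or {}
        )
        latitude_column = config.get("latitude_column", "latitude")
        longitude_column = config.get("longitude_column", "longitude")
        # ... use the column names to decide which scripts to load ...
        return []
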

.. _plugins_configuration_secret:

Secret configuration values
~~~~~~~~~~~~~~~~~~~~~~~~~~~

Any values embedded in ``metadata.json`` will be visible to anyone who views the ``/-/metadata`` page of your Datasette instance. Some plugins may need configuration that should stay secret - API keys for example. There are two ways in which you can store secret configuration values.

**As environment variables**. If your secret lives in an environment variable that is available to the Datasette process, you can indicate that the configuration value should be read from that environment variable like so:

.. code-block:: json

    {
        "plugins": {
            "datasette-auth-github": {
                "client_secret": {
                    "$env": "GITHUB_CLIENT_SECRET"
                }
            }
        }
    }


**As values in separate files**. Your secrets can also live in files on disk. To specify a secret should be read from a file, provide the full file path like this:

.. code-block:: json

    {
        "plugins": {
            "datasette-auth-github": {
                "client_secret": {
                    "$file": "/secrets/client-secret"
                }
            }
        }
    }

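
The indirection itself is easy to picture. This sketch is not Datasette's actual resolution code, just an illustration of what ``$env`` and ``$file`` mean:

.. code-block:: python

    import os


    def resolve_secret(value):
        # Plain values pass through; {"$env": ...} reads an environment
        # variable and {"$file": ...} reads a file from disk
        if isinstance(value, dict):
            if "$env" in value:
                return os.environ.get(value["$env"])
            if "$file" in value:
                with open(value["$file"]) as f:
                    return f.read().strip()  # newline stripped for convenience
        return value


    print(resolve_secret({"$env": "GITHUB_CLIENT_SECRET"}))
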

If you are publishing your data using the :ref:`datasette publish <cli_publish>` family of commands, you can use the ``--plugin-secret`` option to set these secrets at publish time. For example, using Heroku you might run the following command::

    datasette publish heroku my_database.db \
        --name my-heroku-app-demo \
        --install=datasette-auth-github \
        --plugin-secret datasette-auth-github client_id your_client_id \
        --plugin-secret datasette-auth-github client_secret your_client_secret

This will set the necessary environment variables and add the following to the deployed ``metadata.json``:

.. code-block:: json

    {
        "plugins": {
            "datasette-auth-github": {
                "client_id": {
                    "$env": "DATASETTE_AUTH_GITHUB_CLIENT_ID"
                },
                "client_secret": {
                    "$env": "DATASETTE_AUTH_GITHUB_CLIENT_SECRET"
                }
            }
        }
    }


Publishing data
===============

You can also specify plugins you would like to install. If a plugin has any :ref:`plugins_configuration_secret` you can use the ``--plugin-secret`` option to set those secrets at publish time. For example, using Heroku with `datasette-auth-github <https://github.com/simonw/datasette-auth-github>`__ you might run the following command::

    datasette publish heroku my_database.db \
        --name my-heroku-app-demo \
        --install=datasette-auth-github \
        --plugin-secret datasette-auth-github client_id your_client_id \
        --plugin-secret datasette-auth-github client_secret your_client_secret

If you have docker installed (e.g. using `Docker for Mac <https://www.docker.com/docker-mac>`_) you can use ``datasette package`` to create a Docker image that bundles your data. Here's example output for the package command::

    datasette package parlgov.db --extra-options="--setting sql_time_limit_ms 2500"
    Sending build context to Docker daemon  4.459MB
    Step 1/7 : FROM python:3.11.0-slim-bullseye
     ---> 79e1dc9af1c1


Settings
========

Datasette supports a number of settings. These can be set using the ``--setting name value`` option to ``datasette serve``.

You can set multiple settings at once like this::

    datasette mydatabase.db \
        --setting default_page_size 50 \
        --setting sql_time_limit_ms 3500 \
        --setting max_returned_rows 2000

Settings can also be specified :ref:`in the datasette.yaml configuration file <configuration_reference_settings>`.


.. _config_dir:

Configuration directory mode
----------------------------

Normally you configure Datasette using command-line options. For a Datasette instance with custom templates, custom plugins, a static directory and several databases this can get quite verbose::

    datasette one.db two.db \
        --metadata=metadata.json \
        --template-dir=templates/ \
        --plugins-dir=plugins \
        --static css:css

As an alternative to this, you can run Datasette in *configuration directory* mode. Create a directory with the following structure::

    # In a directory called my-app:
    my-app/one.db
    my-app/two.db
    my-app/datasette.yaml
    my-app/metadata.json
    my-app/templates/index.html
    my-app/plugins/my_plugin.py

Now start Datasette by providing the path to that directory::

    datasette my-app/

Datasette will detect the files in that directory and automatically configure itself using them. It will serve all ``*.db`` files that it finds, will load ``metadata.json`` if it exists, and will load the ``templates``, ``plugins`` and ``static`` folders if they are present.

The files that can be included in this directory are as follows. All are optional.

* ``*.db`` (or ``*.sqlite3`` or ``*.sqlite``) - SQLite database files that will be served by Datasette
* ``datasette.yaml`` - :ref:`configuration` for the Datasette instance
* ``metadata.json`` - :ref:`metadata` for those databases - ``metadata.yaml`` or ``metadata.yml`` can be used as well
* ``inspect-data.json`` - the result of running ``datasette inspect *.db --inspect-file=inspect-data.json`` from the configuration directory - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running
* ``settings.json`` - settings that would normally be passed using ``--setting`` - here they should be stored as a JSON object of key/value pairs
* ``templates/`` - a directory containing :ref:`customization_custom_templates`
* ``plugins/`` - a directory containing plugins, see :ref:`writing_plugins_one_off`
* ``static/`` - a directory containing static files - these will be served from ``/static/filename.txt``, see :ref:`customization_static_files`


default_allow_sql
~~~~~~~~~~~~~~~~~

Should users be able to execute arbitrary SQL queries by default?

Setting this to ``off`` causes permission checks for :ref:`actions_execute_sql` to fail by default.

::

    datasette mydatabase.db --setting default_allow_sql off

Another way to achieve this is to add ``"allow_sql": false`` to your ``datasette.yaml`` file, as described in :ref:`authentication_permissions_execute_sql`. This setting offers a more convenient way to do this.

.. _setting_default_page_size:
|
||||
|
||||
|
|
@ -114,17 +111,6 @@ You can increase or decrease this limit like so::
|
|||
|
||||
    datasette mydatabase.db --setting max_returned_rows 2000

-.. _setting_max_insert_rows:
-
-max_insert_rows
-~~~~~~~~~~~~~~~
-
-Maximum rows that can be inserted at a time using the bulk insert API, see :ref:`TableInsertView`. Defaults to 100.
-
-You can increase or decrease this limit like so::
-
-    datasette mydatabase.db --setting max_insert_rows 1000
-
.. _setting_num_sql_threads:

num_sql_threads

@@ -198,34 +184,6 @@ Should users be able to download the original SQLite database using a link on th

    datasette mydatabase.db --setting allow_download off

-.. _setting_allow_signed_tokens:
-
-allow_signed_tokens
-~~~~~~~~~~~~~~~~~~~
-
-Should users be able to create signed API tokens to access Datasette?
-
-This is turned on by default. Use the following to turn it off::
-
-    datasette mydatabase.db --setting allow_signed_tokens off
-
-Turning this setting off will disable the ``/-/create-token`` page, :ref:`described here <CreateTokenView>`. It will also cause any incoming ``Authorization: Bearer dstok_...`` API tokens to be ignored.
-
-.. _setting_max_signed_tokens_ttl:
-
-max_signed_tokens_ttl
-~~~~~~~~~~~~~~~~~~~~~
-
-Maximum allowed expiry time for signed API tokens created by users.
-
-Defaults to ``0`` which means no limit - tokens can be created that will never expire.
-
-Set this to a value in seconds to limit the maximum expiry time. For example, to set that limit to 24 hours you would use::
-
-    datasette mydatabase.db --setting max_signed_tokens_ttl 86400
-
-This setting is enforced when incoming tokens are processed.
-
.. _setting_default_cache_ttl:

default_cache_ttl

@@ -356,25 +314,25 @@ Configuring the secret

Datasette uses a secret string to sign secure values such as cookies.

-If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies and :ref:`API tokens <CreateTokenView>` will not stay valid between restarts.
+If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies will not stay valid between restarts.

You can pass a secret to Datasette in two ways: with the ``--secret`` command-line option or by setting a ``DATASETTE_SECRET`` environment variable.

::

-    datasette mydb.db --secret=SECRET_VALUE_HERE
+    $ datasette mydb.db --secret=SECRET_VALUE_HERE

Or::

-    export DATASETTE_SECRET=SECRET_VALUE_HERE
-    datasette mydb.db
+    $ export DATASETTE_SECRET=SECRET_VALUE_HERE
+    $ datasette mydb.db

One way to generate a secure random secret is to use Python like this::

-    python3 -c 'import secrets; print(secrets.token_hex(32))'
+    $ python3 -c 'import secrets; print(secrets.token_hex(32))'
    cdb19e94283a20f9d42cca50c5a4871c0aa07392db308755d60a1a5b9bb0fa52

-Plugin authors can make use of this signing mechanism in their plugins using the :ref:`datasette.sign() <datasette_sign>` and :ref:`datasette.unsign() <datasette_unsign>` methods.
+Plugin authors can make use of this signing mechanism in their plugins using :ref:`datasette_sign` and :ref:`datasette_unsign`.
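
A minimal sketch of that mechanism, assuming a ``datasette`` instance is already in scope (the namespace value here is arbitrary):

.. code-block:: python

    # Sign a value, then verify it later - the namespace ties a
    # signature to one specific use.
    signed = datasette.sign({"id": "user_1"}, namespace="my-plugin")
    # Returns the original value, or raises itsdangerous.BadSignature
    # if the string was tampered with:
    unsigned = datasette.unsign(signed, namespace="my-plugin")
    assert unsigned == {"id": "user_1"}
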
.. _setting_publish_secrets:

docs/spatialite.rst

@@ -156,10 +156,7 @@ The `shapefile format <https://en.wikipedia.org/wiki/Shapefile>`_ is a common fo

Try it now with the North America shapefile available from the University of North Carolina `Global River Database <http://gaia.geosci.unc.edu/rivers/>`_ project. Download the file and unzip it (this will create files called ``narivs.dbf``, ``narivs.prj``, ``narivs.shp`` and ``narivs.shx`` in the current directory), then run the following::

-    spatialite rivers-database.db
-
-::
-
+    $ spatialite rivers-database.db
    SpatiaLite version ..: 4.3.0a Supported Extensions:
    ...
    spatialite> .loadshp narivs rivers CP1252 23032

docs/sql_queries.rst

@@ -7,7 +7,7 @@ Datasette treats SQLite database files as read-only and immutable. This means it

The easiest way to execute custom SQL against Datasette is through the web UI. The database index page includes a SQL editor that lets you run any SELECT query you like. You can also construct queries using the filter interface on the tables page, then click "View and edit SQL" to open that query in the custom SQL editor.

-Note that this interface is only available if the :ref:`actions_execute_sql` permission is allowed. See :ref:`authentication_permissions_execute_sql`.
+Note that this interface is only available if the :ref:`permissions_execute_sql` permission is allowed.

Any Datasette SQL query is reflected in the URL of the page, allowing you to bookmark them, share them with others and navigate through previous queries using your browser back button.

@@ -53,29 +53,22 @@ If you want to bundle some pre-written SQL queries with your Datasette-hosted da

The quickest way to create views is with the SQLite command-line interface::

-    sqlite3 sf-trees.db
-
-::
-
+    $ sqlite3 sf-trees.db
    SQLite version 3.19.3 2017-06-27 16:48:08
    Enter ".help" for usage hints.
    sqlite> CREATE VIEW demo_view AS select qSpecies from Street_Tree_List;
    <CTRL+D>

-You can also use the `sqlite-utils <https://sqlite-utils.datasette.io/>`__ tool to `create a view <https://sqlite-utils.datasette.io/en/stable/cli.html#creating-views>`__::
-
-    sqlite-utils create-view sf-trees.db demo_view "select qSpecies from Street_Tree_List"
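
The same view can also be created from Python using the sqlite-utils library - a short sketch using its documented ``create_view()`` method:

.. code-block:: python

    # Create the demo_view view programmatically with sqlite-utils.
    import sqlite_utils

    db = sqlite_utils.Database("sf-trees.db")
    db.create_view("demo_view", "select qSpecies from Street_Tree_List")
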

.. _canned_queries:

Canned queries
--------------

-As an alternative to adding views to your database, you can define canned queries inside your ``datasette.yaml`` file. Here's an example:
+As an alternative to adding views to your database, you can define canned queries inside your ``metadata.json`` file. Here's an example:

-.. [[[cog
-    from metadata_doc import config_example
-    config_example(cog, {
+.. code-block:: json
+
+    {
        "databases": {
            "sf-trees": {
                "queries": {

@@ -85,36 +78,7 @@ As an alternative to adding views to your database, you can define canned querie

                }
            }
        }
-    })
-.. ]]]
-
-.. tab:: datasette.yaml
-
-    .. code-block:: yaml
-
-        databases:
-          sf-trees:
-            queries:
-              just_species:
-                sql: select qSpecies from Street_Tree_List
-
-.. tab:: datasette.json
-
-    .. code-block:: json
-
-        {
-            "databases": {
-                "sf-trees": {
-                    "queries": {
-                        "just_species": {
-                            "sql": "select qSpecies from Street_Tree_List"
-                        }
-                    }
-                }
-            }
-        }
-.. [[[end]]]
+    }

Then run Datasette like this::

@@ -147,60 +111,38 @@ Here's an example of a canned query with a named parameter:

    where neighborhood like '%' || :text || '%'
    order by neighborhood;

-In the canned query configuration it looks like this:
+In the canned query metadata (here :ref:`metadata_yaml` as ``metadata.yaml``) it looks like this:

-.. [[[cog
-    config_example(cog, """
-      databases:
-        fixtures:
-          queries:
-            neighborhood_search:
-              title: Search neighborhoods
-              sql: |-
-                select neighborhood, facet_cities.name, state
-                from facetable
-                  join facet_cities on facetable.city_id = facet_cities.id
-                where neighborhood like '%' || :text || '%'
-                order by neighborhood
-    """)
-.. ]]]
-
-.. tab:: datasette.yaml
-
-    .. code-block:: yaml
-
-        databases:
-          fixtures:
-            queries:
-              neighborhood_search:
-                title: Search neighborhoods
-                sql: |-
-                  select neighborhood, facet_cities.name, state
-                  from facetable
-                    join facet_cities on facetable.city_id = facet_cities.id
-                  where neighborhood like '%' || :text || '%'
-                  order by neighborhood
-
-.. tab:: datasette.json
-
-    .. code-block:: json
-
-        {
-            "databases": {
-                "fixtures": {
-                    "queries": {
-                        "neighborhood_search": {
-                            "title": "Search neighborhoods",
-                            "sql": "select neighborhood, facet_cities.name, state\nfrom facetable\n  join facet_cities on facetable.city_id = facet_cities.id\nwhere neighborhood like '%' || :text || '%'\norder by neighborhood"
-                        }
-                    }
-                }
-            }
-        }
-.. [[[end]]]
+.. code-block:: yaml
+
+    databases:
+      fixtures:
+        queries:
+          neighborhood_search:
+            sql: |-
+              select neighborhood, facet_cities.name, state
+              from facetable
+                join facet_cities on facetable.city_id = facet_cities.id
+              where neighborhood like '%' || :text || '%'
+              order by neighborhood
+            title: Search neighborhoods
+
+Here's the equivalent using JSON (as ``metadata.json``):
+
+.. code-block:: json
+
+    {
+        "databases": {
+            "fixtures": {
+                "queries": {
+                    "neighborhood_search": {
+                        "sql": "select neighborhood, facet_cities.name, state\nfrom facetable\n  join facet_cities on facetable.city_id = facet_cities.id\nwhere neighborhood like '%' || :text || '%'\norder by neighborhood",
+                        "title": "Search neighborhoods"
+                    }
+                }
+            }
+        }
+    }

Note that we are using SQLite string concatenation here - the ``||`` operator - to add wildcard ``%`` characters to the string provided by the user.

@@ -211,13 +153,12 @@ In this example the ``:text`` named parameter is automatically extracted from th

You can alternatively provide an explicit list of named parameters using the ``"params"`` key, like this:

-.. [[[cog
-    config_example(cog, """
+.. code-block:: yaml
+
    databases:
      fixtures:
        queries:
          neighborhood_search:
            title: Search neighborhoods
            params:
            - text
            sql: |-

@@ -226,49 +167,7 @@ You can alternatively provide an explicit list of named parameters using the ``"

              join facet_cities on facetable.city_id = facet_cities.id
            where neighborhood like '%' || :text || '%'
            order by neighborhood
-    """)
-.. ]]]
-
-.. tab:: datasette.yaml
-
-    .. code-block:: yaml
-
-        databases:
-          fixtures:
-            queries:
-              neighborhood_search:
-                title: Search neighborhoods
-                params:
-                - text
-                sql: |-
-                  select neighborhood, facet_cities.name, state
-                  from facetable
-                    join facet_cities on facetable.city_id = facet_cities.id
-                  where neighborhood like '%' || :text || '%'
-                  order by neighborhood
-
-.. tab:: datasette.json
-
-    .. code-block:: json
-
-        {
-            "databases": {
-                "fixtures": {
-                    "queries": {
-                        "neighborhood_search": {
-                            "title": "Search neighborhoods",
-                            "params": [
-                                "text"
-                            ],
-                            "sql": "select neighborhood, facet_cities.name, state\nfrom facetable\n  join facet_cities on facetable.city_id = facet_cities.id\nwhere neighborhood like '%' || :text || '%'\norder by neighborhood"
-                        }
-                    }
-                }
-            }
-        }
-.. [[[end]]]
+            title: Search neighborhoods

.. _canned_queries_options:

@@ -293,56 +192,21 @@ You can set a default fragment hash that will be included in the link to the can

This example demonstrates both ``fragment`` and ``hide_sql``:

-.. [[[cog
-    config_example(cog, """
-      databases:
-        fixtures:
-          queries:
-            neighborhood_search:
-              fragment: fragment-goes-here
-              hide_sql: true
-              sql: |-
-                select neighborhood, facet_cities.name, state
-                from facetable join facet_cities on facetable.city_id = facet_cities.id
-                where neighborhood like '%' || :text || '%' order by neighborhood;
-    """)
-.. ]]]
-
-.. tab:: datasette.yaml
-
-    .. code-block:: yaml
-
-        databases:
-          fixtures:
-            queries:
-              neighborhood_search:
-                fragment: fragment-goes-here
-                hide_sql: true
-                sql: |-
-                  select neighborhood, facet_cities.name, state
-                  from facetable join facet_cities on facetable.city_id = facet_cities.id
-                  where neighborhood like '%' || :text || '%' order by neighborhood;
-
-.. tab:: datasette.json
-
-    .. code-block:: json
-
-        {
-            "databases": {
-                "fixtures": {
-                    "queries": {
-                        "neighborhood_search": {
-                            "fragment": "fragment-goes-here",
-                            "hide_sql": true,
-                            "sql": "select neighborhood, facet_cities.name, state\nfrom facetable join facet_cities on facetable.city_id = facet_cities.id\nwhere neighborhood like '%' || :text || '%' order by neighborhood;"
-                        }
-                    }
-                }
-            }
-        }
-.. [[[end]]]
+.. code-block:: json
+
+    {
+        "databases": {
+            "fixtures": {
+                "queries": {
+                    "neighborhood_search": {
+                        "sql": "select neighborhood, facet_cities.name, state\nfrom facetable join facet_cities on facetable.city_id = facet_cities.id\nwhere neighborhood like '%' || :text || '%' order by neighborhood;",
+                        "fragment": "fragment-goes-here",
+                        "hide_sql": true
+                    }
+                }
+            }
+        }
+    }

`See here <https://latest.datasette.io/fixtures#queries>`__ for a demo of this in action.

@@ -355,132 +219,55 @@ Canned queries by default are read-only. You can use the ``"write": true`` key t

See :ref:`authentication_permissions_query` for details on how to add permission checks to canned queries, using the ``"allow"`` key.

-.. [[[cog
-    config_example(cog, {
+.. code-block:: json
+
+    {
        "databases": {
            "mydatabase": {
                "queries": {
                    "add_name": {
                        "sql": "INSERT INTO names (name) VALUES (:name)",
-                        "write": True
+                        "write": true
                    }
                }
            }
        }
-    })
-.. ]]]
-
-.. tab:: datasette.yaml
-
-    .. code-block:: yaml
-
-        databases:
-          mydatabase:
-            queries:
-              add_name:
-                sql: INSERT INTO names (name) VALUES (:name)
-                write: true
-
-.. tab:: datasette.json
-
-    .. code-block:: json
-
-        {
-            "databases": {
-                "mydatabase": {
-                    "queries": {
-                        "add_name": {
-                            "sql": "INSERT INTO names (name) VALUES (:name)",
-                            "write": true
-                        }
-                    }
-                }
-            }
-        }
-.. [[[end]]]
+    }

This configuration will create a page at ``/mydatabase/add_name`` displaying a form with a ``name`` field. Submitting that form will execute the configured ``INSERT`` query.
You can customize how Datasette represents success and errors using the following optional properties:

- ``on_success_message`` - the message shown when a query is successful
-- ``on_success_message_sql`` - alternative to ``on_success_message``: a SQL query that should be executed to generate the message
- ``on_success_redirect`` - the path or URL the user is redirected to on success
- ``on_error_message`` - the message shown when a query throws an error
- ``on_error_redirect`` - the path or URL the user is redirected to on error

For example:

-.. [[[cog
-    config_example(cog, {
+.. code-block:: json
+
+    {
        "databases": {
            "mydatabase": {
                "queries": {
                    "add_name": {
                        "sql": "INSERT INTO names (name) VALUES (:name)",
-                        "params": ["name"],
-                        "write": True,
-                        "on_success_message_sql": "select 'Name inserted: ' || :name",
+                        "write": true,
+                        "on_success_message": "Name inserted",
                        "on_success_redirect": "/mydatabase/names",
                        "on_error_message": "Name insert failed",
-                        "on_error_redirect": "/mydatabase",
+                        "on_error_redirect": "/mydatabase"
                    }
                }
            }
        }
-    })
-.. ]]]
-
-.. tab:: datasette.yaml
-
-    .. code-block:: yaml
-
-        databases:
-          mydatabase:
-            queries:
-              add_name:
-                sql: INSERT INTO names (name) VALUES (:name)
-                params:
-                - name
-                write: true
-                on_success_message_sql: 'select ''Name inserted: '' || :name'
-                on_success_redirect: /mydatabase/names
-                on_error_message: Name insert failed
-                on_error_redirect: /mydatabase
-
-.. tab:: datasette.json
-
-    .. code-block:: json
-
-        {
-            "databases": {
-                "mydatabase": {
-                    "queries": {
-                        "add_name": {
-                            "sql": "INSERT INTO names (name) VALUES (:name)",
-                            "params": [
-                                "name"
-                            ],
-                            "write": true,
-                            "on_success_message_sql": "select 'Name inserted: ' || :name",
-                            "on_success_redirect": "/mydatabase/names",
-                            "on_error_message": "Name insert failed",
-                            "on_error_redirect": "/mydatabase"
-                        }
-                    }
-                }
-            }
-        }
-.. [[[end]]]
+    }
-You can use ``"params"`` to explicitly list the named parameters that should be displayed as form fields - otherwise they will be automatically detected. ``"params"`` is not necessary in the above example, since without it ``"name"`` would be automatically detected from the query.
+You can use ``"params"`` to explicitly list the named parameters that should be displayed as form fields - otherwise they will be automatically detected.

You can pre-populate form fields when the page first loads using a query string, e.g. ``/mydatabase/add_name?name=Prepopulated``. The user will have to submit the form to execute the query.

-If you specify a query in ``"on_success_message_sql"``, that query will be executed after the main query. The first column of the first row returned by that query will be displayed as a success message. Named parameters from the main query will be made available to the success message query as well.

.. _canned_queries_magic_parameters:

Magic parameters

@@ -513,10 +300,10 @@ Available magic parameters are:

``_random_chars_*`` - e.g. ``_random_chars_128``
    A random string of characters of the specified length.

-Here's an example configuration that adds a message from the authenticated user, storing various pieces of additional metadata using magic parameters:
+Here's an example configuration (this time using ``metadata.yaml`` since that provides better support for multi-line SQL queries) that adds a message from the authenticated user, storing various pieces of additional metadata using magic parameters:

-.. [[[cog
-    config_example(cog, """
+.. code-block:: yaml
+
    databases:
      mydatabase:
        queries:

@@ -530,49 +317,6 @@ Here's an example configuration that adds a message from the authenticated user,

              :_actor_id, :message, :_now_datetime_utc
            )
          write: true
-    """)
-.. ]]]
-
-.. tab:: datasette.yaml
-
-    .. code-block:: yaml
-
-        databases:
-          mydatabase:
-            queries:
-              add_message:
-                allow:
-                  id: "*"
-                sql: |-
-                  INSERT INTO messages (
-                    user_id, message, datetime
-                  ) VALUES (
-                    :_actor_id, :message, :_now_datetime_utc
-                  )
-                write: true
-
-.. tab:: datasette.json
-
-    .. code-block:: json
-
-        {
-            "databases": {
-                "mydatabase": {
-                    "queries": {
-                        "add_message": {
-                            "allow": {
-                                "id": "*"
-                            },
-                            "sql": "INSERT INTO messages (\n    user_id, message, datetime\n) VALUES (\n    :_actor_id, :message, :_now_datetime_utc\n)",
-                            "write": true
-                        }
-                    }
-                }
-            }
-        }
-.. [[[end]]]

The form presented at ``/mydatabase/add_message`` will have just a field for ``message`` - the other parameters will be populated by the magic parameter mechanism.

@@ -613,7 +357,7 @@ The JSON response will look like this:

"redirect": "/data/add_name"
|
||||
}
|
||||
|
||||
The ``"message"`` and ``"redirect"`` values here will take into account ``on_success_message``, ``on_success_message_sql``, ``on_success_redirect``, ``on_error_message`` and ``on_error_redirect``, if they have been set.
|
||||
The ``"message"`` and ``"redirect"`` values here will take into account ``on_success_message``, ``on_success_redirect``, ``on_error_message`` and ``on_error_redirect``, if they have been set.
|
||||
|
||||
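
For illustration, a hedged sketch of calling a writable canned query from Python - the ``Accept`` header as the way to request the JSON response is an assumption based on the API described above; host and values are invented:

.. code-block:: python

    # POST the named parameters as form data; requests sent without
    # cookies are exempt from Datasette's CSRF protection.
    import httpx

    response = httpx.post(
        "http://localhost:8001/mydatabase/add_name",
        data={"name": "New name"},
        headers={"Accept": "application/json"},
    )
    print(response.json())  # {"ok": ..., "message": ..., "redirect": ...}
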
.. _pagination:

docs/testing_plugins.rst

@@ -33,16 +33,16 @@ You can install these packages like so::

    pip install pytest pytest-asyncio

-If you are building an installable package you can add them as test dependencies to your ``pyproject.toml`` file like this:
+If you are building an installable package you can add them as test dependencies to your ``setup.py`` module like this:

-.. code-block:: toml
+.. code-block:: python

-    [project]
-    name = "datasette-my-plugin"
-    # ...
-
-    [project.optional-dependencies]
-    test = ["pytest", "pytest-asyncio"]
+    setup(
+        name="datasette-my-plugin",
+        # ...
+        extras_require={"test": ["pytest", "pytest-asyncio"]},
+        tests_require=["datasette-my-plugin[test]"],
+    )

You can then install the test dependencies like so::

@@ -82,34 +82,6 @@ This method registers any :ref:`plugin_hook_startup` or :ref:`plugin_hook_prepar

If you are using ``await datasette.client.get()`` and similar methods then you don't need to worry about this - Datasette automatically calls ``invoke_startup()`` the first time it handles a request.
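
For illustration, a minimal sketch of such a test (the in-memory instance and test name are invented):

.. code-block:: python

    # datasette.client handles invoke_startup() automatically on the
    # first request, so the test needs no extra setup.
    import pytest
    from datasette.app import Datasette


    @pytest.mark.asyncio
    async def test_homepage_responds():
        datasette = Datasette(memory=True)
        response = await datasette.client.get("/")
        assert response.status_code == 200
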
-.. _testing_datasette_client:
-
-Using datasette.client in tests
--------------------------------
-
-The :ref:`internals_datasette_client` mechanism is designed for use in tests. It provides access to a pre-configured `HTTPX async client <https://www.python-httpx.org/async/>`__ instance that can make GET, POST and other HTTP requests against a Datasette instance from inside a test.
-
-A simple test looks like this:
-
-.. literalinclude:: ../tests/test_docs.py
-    :language: python
-    :start-after: # -- start test_homepage --
-    :end-before: # -- end test_homepage --
-
-Or for a JSON API:
-
-.. literalinclude:: ../tests/test_docs.py
-    :language: python
-    :start-after: # -- start test_actor_is_null --
-    :end-before: # -- end test_actor_is_null --
-
-To make requests as an authenticated actor, create a signed ``ds_actor`` cookie using the ``datasette.client.actor_cookie()`` helper function and pass it in ``cookies=`` like this:
-
-.. literalinclude:: ../tests/test_docs.py
-    :language: python
-    :start-after: # -- start test_signed_cookie_actor --
-    :end-before: # -- end test_signed_cookie_actor --
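
The removed ``literalinclude`` directives pull their code from Datasette's own test suite, so it is not visible here; a hedged sketch of the authenticated-actor pattern they demonstrate, building on the test above:

.. code-block:: python

    @pytest.mark.asyncio
    async def test_request_as_actor():
        datasette = Datasette(memory=True)
        # Sign a cookie identifying the actor, then send it with the request:
        cookies = {"ds_actor": datasette.client.actor_cookie({"id": "root"})}
        response = await datasette.client.get("/-/actor.json", cookies=cookies)
        assert response.json() == {"actor": {"id": "root"}}
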
.. _testing_plugins_pdb:

Using pdb for errors thrown inside Datasette

@@ -313,19 +285,3 @@ When writing tests for plugins you may find it useful to register a test plugin

            assert response.status_code == 500
        finally:
            pm.unregister(name="undo")

-To reuse the same temporary plugin in multiple tests, you can register it inside a fixture in your ``conftest.py`` file like this:
-
-.. literalinclude:: ../tests/test_docs_plugins.py
-    :language: python
-    :start-after: # -- start datasette_with_plugin_fixture --
-    :end-before: # -- end datasette_with_plugin_fixture --
-
-Note the ``yield`` statement here - this ensures that the ``finally:`` block that unregisters the plugin is executed only after the test function itself has completed.
-
-Then in a test:
-
-.. literalinclude:: ../tests/test_docs_plugins.py
-    :language: python
-    :start-after: # -- start datasette_with_plugin_test --
-    :end-before: # -- end datasette_with_plugin_test --
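
Again the removed ``literalinclude`` pulls from the test suite; a sketch of what such a ``conftest.py`` fixture can look like, following the ``pm.register()`` / ``pm.unregister()`` pattern shown above (hook choice and names are illustrative):

.. code-block:: python

    import pytest
    from datasette import hookimpl
    from datasette.app import Datasette
    from datasette.plugins import pm


    @pytest.fixture
    def datasette_with_plugin():
        class TestPlugin:
            __name__ = "TestPlugin"

            @hookimpl
            def register_routes(self):
                return [(r"^/error$", lambda: 1 / 0)]

        pm.register(TestPlugin(), name="undo")
        try:
            # yield ensures the finally: block runs after the test completes
            yield Datasette(memory=True)
        finally:
            pm.unregister(name="undo")
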

docs/writing_plugins.rst

@@ -7,30 +7,6 @@ You can write one-off plugins that apply to just one Datasette instance, or you

Want to start by looking at an example? The `Datasette plugins directory <https://datasette.io/plugins>`__ lists more than 90 open source plugins with code you can explore. The :ref:`plugin hooks <plugin_hooks>` page includes links to example plugins for each of the documented hooks.

-.. _writing_plugins_tracing:
-
-Tracing plugin hooks
---------------------
-
-The ``DATASETTE_TRACE_PLUGINS`` environment variable turns on detailed tracing showing exactly which hooks are being run. This can be useful for understanding how Datasette is using your plugin.
-
-.. code-block:: bash
-
-    DATASETTE_TRACE_PLUGINS=1 datasette mydb.db
-
-Example output::
-
-    actor_from_request:
-    {   'datasette': <datasette.app.Datasette object at 0x100bc7220>,
-        'request': <asgi.Request method="GET" url="http://127.0.0.1:4433/">}
-    Hook implementations:
-    [   <HookImpl plugin_name='codespaces', plugin=<module 'datasette_codespaces' from '.../site-packages/datasette_codespaces/__init__.py'>>,
-        <HookImpl plugin_name='datasette.actor_auth_cookie', plugin=<module 'datasette.actor_auth_cookie' from '.../datasette/datasette/actor_auth_cookie.py'>>,
-        <HookImpl plugin_name='datasette.default_permissions', plugin=<module 'datasette.default_permissions' from '.../datasette/default_permissions.py'>>]
-    Results:
-    [{'id': 'root'}]

.. _writing_plugins_one_off:

Writing one-off plugins

@@ -124,6 +100,7 @@ And a Python module file, ``datasette_plugin_demos.py``, that implements the plu

"random_integer", 2, random.randint
|
||||
)
|
||||
|
||||
|
||||
Having built a plugin in this way you can turn it into an installable package using the following command::
|
||||
|
||||
python3 setup.py sdist
|
||||
|
|

@@ -163,8 +140,6 @@ Where ``datasette_plugin_name`` is the name of the plugin package (note that it

`datasette-cluster-map <https://github.com/simonw/datasette-cluster-map>`__ is a useful example of a plugin that includes packaged static assets in this way.

-See :ref:`customization_css` for tips on writing CSS that is compatible with Datasette's default CSS, including details of the ``core`` class for applying Datasette's default form element styles.
-
.. _writing_plugins_custom_templates:

Custom templates

@@ -209,12 +184,10 @@ This will return the ``{"latitude_column": "lat", "longitude_column": "lng"}`` i

If there is no configuration for that plugin, the method will return ``None``.

-If it cannot find the requested configuration at the table layer, it will fall back to the database layer and then the root layer. For example, a user may have set the plugin configuration option inside ``datasette.yaml`` like so:
+If it cannot find the requested configuration at the table layer, it will fall back to the database layer and then the root layer. For example, a user may have set the plugin configuration option like so::

-.. [[[cog
-    from metadata_doc import metadata_example
-    metadata_example(cog, {
-        "databases": {
+    {
+        "databases": {
            "sf-trees": {
                "plugins": {
                    "datasette-cluster-map": {

@@ -224,77 +197,21 @@ If it cannot find the requested configuration at the table layer, it will fall b

                    }
                }
            }
        }
-    })
-.. ]]]
-
-.. tab:: metadata.yaml
-
-    .. code-block:: yaml
-
-        databases:
-          sf-trees:
-            plugins:
-              datasette-cluster-map:
-                latitude_column: xlat
-                longitude_column: xlng
-
-.. tab:: metadata.json
-
-    .. code-block:: json
-
-        {
-            "databases": {
-                "sf-trees": {
-                    "plugins": {
-                        "datasette-cluster-map": {
-                            "latitude_column": "xlat",
-                            "longitude_column": "xlng"
-                        }
-                    }
-                }
-            }
-        }
-.. [[[end]]]
+    }

In this case, the above code would return that configuration for ANY table within the ``sf-trees`` database.

-The plugin configuration could also be set at the top level of ``datasette.yaml``:
+The plugin configuration could also be set at the top level of ``metadata.json``::

-.. [[[cog
-    metadata_example(cog, {
+    {
+        "title": "This is the top-level title in metadata.json",
        "plugins": {
            "datasette-cluster-map": {
                "latitude_column": "xlat",
                "longitude_column": "xlng"
            }
        }
-    })
-.. ]]]
-
-.. tab:: metadata.yaml
-
-    .. code-block:: yaml
-
-        plugins:
-          datasette-cluster-map:
-            latitude_column: xlat
-            longitude_column: xlng
-
-.. tab:: metadata.json
-
-    .. code-block:: json
-
-        {
-            "plugins": {
-                "datasette-cluster-map": {
-                    "latitude_column": "xlat",
-                    "longitude_column": "xlng"
-                }
-            }
-        }
-.. [[[end]]]
+    }

Now that ``datasette-cluster-map`` plugin configuration will apply to every table in every database.
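
A sketch of how a plugin can read that configuration with the fallback behavior described above, assuming a ``datasette`` instance from a plugin hook (the default column name is invented):

.. code-block:: python

    # plugin_config() checks the table layer, then the database layer,
    # then the top level - returning None if nothing is configured.
    config = (
        datasette.plugin_config(
            "datasette-cluster-map",
            database="sf-trees",
            table="Street_Tree_List",
        )
        or {}
    )
    latitude_column = config.get("latitude_column", "latitude")
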

@@ -317,7 +234,7 @@ To avoid accidentally conflicting with a database file that may be loaded into D

- ``/-/upload-excel``

Try to avoid registering URLs that clash with other plugins that your users might have installed. There is no central repository of reserved URL paths (yet) but you can review existing plugins by browsing the `plugins directory <https://datasette.io/plugins>`__.

If your plugin includes functionality that relates to a specific database you could also register a URL route like this:
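
The hunk ends before the original example; as a hedged sketch, a route scoped to a specific database registered through the ``register_routes`` hook can look something like this (regex, view and URL are invented):

.. code-block:: python

    # A register_routes() implementation serving a page at
    # /<database>/-/upload-excel
    from datasette import hookimpl
    from datasette.utils.asgi import Response


    async def upload_excel(request):
        database = request.url_vars["database"]
        return Response.text(f"Upload page for {database}")


    @hookimpl
    def register_routes():
        return [(r"^/(?P<database>[^/]+)/-/upload-excel$", upload_excel)]
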

@@ -347,65 +264,3 @@ This object is exposed in templates as the ``urls`` variable, which can be used

    Back to the <a href="{{ urls.instance() }}">Homepage</a>

-See :ref:`internals_datasette_urls` for full details on this object.
-
-.. _writing_plugins_extra_hooks:
-
-Plugins that define new plugin hooks
-------------------------------------
-
-Plugins can define new plugin hooks that other plugins can use to further extend their functionality.
-
-`datasette-graphql <https://github.com/simonw/datasette-graphql>`__ is one example of a plugin that does this. It defines a new hook called ``graphql_extra_fields``, `described here <https://github.com/simonw/datasette-graphql/blob/main/README.md#adding-custom-fields-with-plugins>`__, which other plugins can use to define additional fields that should be included in the GraphQL schema.
-
-To define additional hooks, add a file to the plugin called ``datasette_your_plugin/hookspecs.py`` with content that looks like this:
-
-.. code-block:: python
-
-    from pluggy import HookspecMarker
-
-    hookspec = HookspecMarker("datasette")
-
-
-    @hookspec
-    def name_of_your_hook_goes_here(datasette):
-        "Description of your hook."
-
-You should define your own hook name and arguments here, following the documentation for `Pluggy specifications <https://pluggy.readthedocs.io/en/stable/#specs>`__. Make sure to pick a name that is unlikely to clash with hooks provided by any other plugins.
-
-Then, to register your plugin hooks, add the following code to your ``datasette_your_plugin/__init__.py`` file:
-
-.. code-block:: python
-
-    from datasette.plugins import pm
-    from . import hookspecs
-
-    pm.add_hookspecs(hookspecs)
-
-This will register your plugin hooks as part of the ``datasette`` plugin hook namespace.
-
-Within your plugin code you can trigger the hook using this pattern:
-
-.. code-block:: python
-
-    from datasette.plugins import pm
-
-    for (
-        plugin_return_value
-    ) in pm.hook.name_of_your_hook_goes_here(
-        datasette=datasette
-    ):
-        # Do something with plugin_return_value
-        pass
-
-Other plugins will then be able to register their own implementations of your hook using this syntax:
-
-.. code-block:: python
-
-    from datasette import hookimpl
-
-
-    @hookimpl
-    def name_of_your_hook_goes_here(datasette):
-        return "Response from this plugin hook"
-
-These plugin implementations can accept 0 or more of the named arguments that you defined in your hook specification.