diff --git a/docs/advanced_export.png b/docs/advanced_export.png
new file mode 100644
index 00000000..cec0c4eb
Binary files /dev/null and b/docs/advanced_export.png differ
diff --git a/docs/config.rst b/docs/config.rst
index e0013bf0..57bb28eb 100644
--- a/docs/config.rst
+++ b/docs/config.rst
@@ -35,6 +35,8 @@ You can optionally set a lower time limit for an individual query using the ``?_
 This would set the time limit to 100ms for that specific query. This feature
 is useful if you are working with databases of unknown size and complexity -
 a query that might make perfect sense for a smaller table could take too long
 to execute on a table with millions of rows. By setting custom time limits
 you can execute queries "optimistically" - e.g. give me an exact count of
 rows matching this query but only if it takes less than 100ms to calculate.
 
+.. _config_max_returned_rows:
+
 max_returned_rows
 -----------------
 
@@ -126,23 +128,27 @@ Sets the amount of memory SQLite uses for its `per-connection cache
 
 allow_csv_stream
 ----------------
 
+Enables :ref:`the CSV export feature <csv_export>` where an entire table
+(potentially hundreds of thousands of rows) can be exported as a single CSV
+file. This is turned on by default - you can turn it off like this:
 
 ::
 
     datasette mydatabase.db --config allow_csv_stream:off
 
+.. _config_max_csv_mb:
+
 max_csv_mb
 ----------
 
 The maximum size of CSV that can be exported, in megabytes. Defaults to 100MB.
-You can disable the limit entirely by settings this to 0::
+You can disable the limit entirely by setting this to 0:
+
+::
 
     datasette mydatabase.db --config max_csv_mb:0
diff --git a/docs/csv_export.rst b/docs/csv_export.rst
new file mode 100644
index 00000000..10f65b76
--- /dev/null
+++ b/docs/csv_export.rst
@@ -0,0 +1,63 @@
+.. _csv_export:
+
+CSV Export
+==========
+
+Any Datasette table, view or custom SQL query can be exported as CSV.
+
+To obtain the CSV representation of the table you are looking at, click the
+"this data as CSV" link.
+
+You can also use the advanced export form for more control over the resulting
+file, which looks like this and has the following options:
+
+.. image:: advanced_export.png
+
+* **download file** - instead of displaying CSV in your browser, this forces
+  your browser to download the CSV to your downloads directory.
+
+* **expand labels** - if your table has any foreign key references this option
+  will cause the CSV to gain additional ``COLUMN_NAME_label`` columns with a
+  label for each foreign key derived from the linked table. `In this example
+  `_ the ``city_id`` column is accompanied by a ``city_id_label`` column.
+
+* **stream all records** - by default CSV files only contain the first
+  :ref:`config_max_returned_rows` records. This option will cause Datasette to
+  loop through every matching record and return them as a single CSV file.
+
+You can try that out on https://latest.datasette.io/fixtures/facetable?_size=4
+
+Streaming all records
+---------------------
+
+The *stream all records* option is designed to be as efficient as possible -
+under the hood it takes advantage of Python 3 asyncio capabilities and
+Datasette's efficient :ref:`pagination <pagination>` to stream back the full
+CSV file.
+
+Since databases can get pretty large, by default this option is capped at
+100MB - if a table returns more than 100MB of data the last line of the CSV
+will be a truncation error message.
+
+You can increase or remove this limit using the :ref:`config_max_csv_mb`
+config setting. You can also disable the CSV export feature entirely using
+:ref:`config_allow_csv_stream`.
+
+A note on URLs
+--------------
+
+The default URL for the CSV representation of a table is that table with
+``.csv`` appended to it:
+
+* https://latest.datasette.io/fixtures/facetable - HTML interface
+* https://latest.datasette.io/fixtures/facetable.csv - CSV export
+* https://latest.datasette.io/fixtures/facetable.json - JSON API
+
+This pattern doesn't work for tables with names that already end in ``.csv``
+or ``.json``. For those tables, you can instead use the ``_format=``
+querystring parameter:
+
+* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv - HTML interface
+* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=csv - CSV export
+* https://latest.datasette.io/fixtures/table%2Fwith%2Fslashes.csv?_format=json - JSON API
diff --git a/docs/index.rst b/docs/index.rst
index 031ff0fa..4fa403fd 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -19,6 +19,7 @@ Contents
     getting_started
     json_api
     sql_queries
+    csv_export
     facets
     full_text_search
     spatialite
diff --git a/docs/json_api.rst b/docs/json_api.rst
index 97029bb3..f010d08c 100644
--- a/docs/json_api.rst
+++ b/docs/json_api.rst
@@ -1,5 +1,5 @@
-The Datasette JSON API
-======================
+JSON API
+========
 
 Datasette provides a JSON API for your SQLite databases. Anything you can do
 through the Datasette user interface can also be accessed as JSON via the API.
diff --git a/docs/sql_queries.rst b/docs/sql_queries.rst
index 0355d9d5..cfff8136 100644
--- a/docs/sql_queries.rst
+++ b/docs/sql_queries.rst
@@ -95,6 +95,8 @@ will then be able to enter them using the form fields on the canned query page
 or by adding them to the URL. This means canned queries can be used to create
 custom JSON APIs based on a carefully designed SQL.
 
+.. _pagination:
+
 Pagination
 ----------
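
The "stream all records" behavior the new docs describe - paging through results and emitting CSV incrementally, with a byte cap that turns into a truncation error message on the last line - can be sketched roughly as follows. This is a minimal illustration, not Datasette's actual implementation: the in-memory ``ROWS`` table stands in for SQLite, and the ``fetch_page``/``stream_csv`` names and the exact error message text are hypothetical.

```python
import csv
import io

# Toy stand-in for a database table; the real feature streams SQLite results.
ROWS = [{"id": i, "name": "row %d" % i} for i in range(1, 11)]

def fetch_page(after_id, page_size):
    """Keyset pagination: rows with id > after_id, plus a cursor for the next page."""
    page = [row for row in ROWS if row["id"] > after_id][:page_size]
    next_cursor = page[-1]["id"] if len(page) == page_size else None
    return page, next_cursor

def stream_csv(page_size=3, max_bytes=None):
    """Yield the CSV one page at a time. If the byte cap would be exceeded,
    emit a truncation error message as the final line instead of more data."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name"])
    writer.writeheader()
    sent = 0
    cursor = 0
    while True:
        page, cursor = fetch_page(cursor, page_size)
        for row in page:
            writer.writerow(row)
        chunk = buf.getvalue()
        buf.seek(0)
        buf.truncate(0)
        if max_bytes is not None and sent + len(chunk.encode("utf-8")) > max_bytes:
            yield "CSV truncated: size limit exceeded\r\n"
            return
        sent += len(chunk.encode("utf-8"))
        yield chunk
        if cursor is None:  # short final page - no more rows
            return
```

Because each iteration fetches and serializes only one page before yielding it, memory use stays proportional to the page size rather than the table size, which is what makes streaming an entire large table feasible.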