mirror of
https://github.com/ai-robots-txt/ai.robots.txt.git
synced 2025-12-29 12:18:33 +01:00
Merge pull request #201 from Anshita-18H/add-requirements-file
Add requirements.txt with project dependencies
This commit is contained in: commit 91959fe791
2 changed files with 13 additions and 1 deletion
11 README.md
@@ -49,9 +49,16 @@ file on-the-fly.
 A note about contributing: updates should be added/made to `robots.json`. A GitHub action will then generate the updated `robots.txt`, `table-of-bot-metrics.md`, `.htaccess` and `nginx-block-ai-bots.conf`.
 
-You can run the tests by [installing](https://www.python.org/about/gettingstarted/) Python 3 and issuing:
+You can run the tests by [installing](https://www.python.org/about/gettingstarted/) Python 3, installing the dependencies, and then issuing:
 
 ```console
 code/tests.py
+
+### Installing Dependencies
+
+Before running the tests, install all required Python packages:
+
+pip install -r requirements.txt
+
 ```
 
 ## Releasing
@@ -97,3 +104,5 @@ But even if you don't use Cloudflare's hard block, their list of [verified bots]
 - [Blockin' bots on Netlify](https://www.jeremiak.com/blog/block-bots-netlify-edge-functions/) by Jeremia Kimelman
 - [Blocking AI web crawlers](https://underlap.org/blocking-ai-web-crawlers) by Glyn Normington
 - [Block AI Bots from Crawling Websites Using Robots.txt](https://originality.ai/ai-bot-blocking) by Jonathan Gillham, Originality.AI
+
+
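The contributing note in the diff above says that updates go into `robots.json` and a GitHub action regenerates `robots.txt` and the other block lists. As a hedged illustration only (the real generator lives in this repository's `code/` directory, and the actual `robots.json` schema may carry more metadata per bot), a minimal sketch of turning a JSON object keyed by bot name into a `robots.txt` blocklist:

```python
import json

def robots_txt_from_json(raw: str) -> str:
    """Build a robots.txt blocklist from a JSON object keyed by bot name.

    Hypothetical schema: {"GPTBot": {...}, "CCBot": {...}} -- one
    User-agent line per bot, then a single Disallow rule for everything.
    """
    bots = json.loads(raw)
    lines = [f"User-agent: {name}" for name in bots]
    lines.append("Disallow: /")
    return "\n".join(lines) + "\n"

# Example with two well-known crawler names:
print(robots_txt_from_json('{"GPTBot": {}, "CCBot": {}}'))
```

The output format (a run of `User-agent:` lines followed by one `Disallow: /`) matches the Robots Exclusion Protocol's grouping rules, where consecutive `User-agent` lines share the directives that follow them.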
3 requirements.txt Normal file
@@ -0,0 +1,3 @@
+beautifulsoup4
+lxml
+requests
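The new `requirements.txt` lists three third-party packages. A small standard-library helper can check that all three are importable before running `code/tests.py` (note the assumption that `beautifulsoup4` installs under the import name `bs4`, while `requests` and `lxml` import under their own names):

```python
import importlib.util

# Distribution name -> import name (beautifulsoup4 installs as "bs4").
REQUIRED = {"requests": "requests", "beautifulsoup4": "bs4", "lxml": "lxml"}

def missing_dependencies() -> list[str]:
    """Return the distribution names whose module cannot be found."""
    return [dist for dist, mod in REQUIRED.items()
            if importlib.util.find_spec(mod) is None]

if __name__ == "__main__":
    missing = missing_dependencies()
    if missing:
        print("Missing packages:", ", ".join(missing))
        print("Run: pip install -r requirements.txt")
    else:
        print("All test dependencies installed.")
```

`importlib.util.find_spec` returns `None` for an absent top-level module rather than raising, so the check works whether or not the packages are installed.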