bskyweb: update robots.txt (#595)

This makes it explicit that crawling is allowed, and communicates
expectations to crawler operators.

If we ever end up with "expensive" routes on this service, we will want
to add a Crawl-Delay directive.
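
For reference, a minimal sketch of what that might look like (the delay value
here is an illustrative assumption; note that Crawl-Delay is a non-standard
extension honored by some crawlers such as Bingbot but ignored by Googlebot):

User-Agent: *
Allow: /
Crawl-Delay: 5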
Author: bnewbold
Date: 2023-05-07 12:19:56 -07:00 (committed by GitHub)
Parent: 0c604ff1c2
Commit: 6d9e23b1be
1 changed file with 9 additions and 1 deletion


@@ -1 +1,9 @@
-# hello friends!
+# Hello Friends!
+# If you are considering bulk or automated crawling, you may want to look in
+# to our protocol (API), including a firehose of updates. See: https://atproto.com/
+
+# By default, may crawl anything on this domain. HTTP 429 ("backoff") status
+# codes are used for rate-limiting. Up to a handful concurrent requests should
+# be ok.
+User-Agent: *
+Allow: /
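
The comments above describe a simple contract for crawlers: keep concurrency
to a handful of requests, and back off when the server returns HTTP 429. As a
rough illustration, here is a minimal Go sketch of a client honoring that
contract. It is not part of this change, and the URL, retry limit, and backoff
schedule are illustrative assumptions:

// politecrawler is a hedged sketch of a client that backs off on HTTP 429,
// as described in the robots.txt comments above. Not from this repo.
package main

import (
	"fmt"
	"net/http"
	"strconv"
	"time"
)

// fetchWithBackoff retries a GET on 429 responses, honoring Retry-After
// when present and falling back to exponential backoff otherwise.
func fetchWithBackoff(url string, maxAttempts int) (*http.Response, error) {
	for attempt := 0; attempt < maxAttempts; attempt++ {
		resp, err := http.Get(url)
		if err != nil {
			return nil, err
		}
		if resp.StatusCode != http.StatusTooManyRequests {
			return resp, nil
		}
		// Server asked us to back off; discard this response body.
		resp.Body.Close()
		// Default to exponential backoff: 1s, 2s, 4s, ...
		delay := time.Duration(1<<attempt) * time.Second
		// Prefer an explicit Retry-After header (in seconds) if provided.
		if ra := resp.Header.Get("Retry-After"); ra != "" {
			if secs, err := strconv.Atoi(ra); err == nil {
				delay = time.Duration(secs) * time.Second
			}
		}
		time.Sleep(delay)
	}
	return nil, fmt.Errorf("still rate-limited after %d attempts: %s", maxAttempts, url)
}

func main() {
	resp, err := fetchWithBackoff("https://bsky.app/robots.txt", 4)
	if err != nil {
		fmt.Println(err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}

Honoring Retry-After when present, rather than a fixed schedule, keeps the
crawler aligned with whatever rate limits the server actually enforces.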