bsky-app/bskyweb/static/robots.txt
bnewbold 6d9e23b1be
bskyweb: update robots.txt (#595)
This makes it more explicit that crawling is allowed, and communicates
our expectations to crawler operators.

If we ever end up with "expensive" routes on this service, we will want
to add a Crawl-Delay directive.
2023-05-07 12:19:56 -07:00
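
For reference, a Crawl-Delay rule would be a short addition to the file
below. This is only a sketch, and the 5-second value is a hypothetical
placeholder, not something this commit adds:

User-Agent: *
Crawl-Delay: 5

Crawl-Delay is a de facto extension honored by some crawlers rather than
part of the original robots.txt specification.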

# Hello Friends!
# If you are considering bulk or automated crawling, you may want to look
# into our protocol (API), which includes a firehose of updates. See:
# https://atproto.com/

# By default, you may crawl anything on this domain. HTTP 429 ("backoff") status
# codes are used for rate-limiting. Up to a handful of concurrent requests should
# be ok.
User-Agent: *
Allow: /
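
As a hedged illustration of the HTTP 429 convention described above (not
code from this repo; the function name, retry limit, and backoff schedule
are all assumptions), a polite crawler might handle rate-limiting like
this Python sketch:

import time
import urllib.error
import urllib.request

def fetch_with_backoff(url: str, max_retries: int = 5) -> bytes:
    """Fetch url, backing off whenever the server answers HTTP 429."""
    for attempt in range(max_retries):
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.read()
        except urllib.error.HTTPError as err:
            if err.code != 429:
                raise
            # Honor Retry-After when it is a plain number of seconds;
            # otherwise fall back to exponential backoff (1s, 2s, 4s, ...).
            retry_after = err.headers.get("Retry-After")
            try:
                delay = float(retry_after)
            except (TypeError, ValueError):
                delay = float(2 ** attempt)
            time.sleep(delay)
    raise RuntimeError(f"still rate-limited after {max_retries} attempts: {url}")

Keeping only a handful of such requests in flight at once, as the file
asks, is the other half of being a well-behaved crawler.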