bskyweb: update robots.txt (#595)
This is to make crawling more explicitly allowed, communicating expectations. If we ever end up with "expensive" routes on this service, we will want to add Crawl-Delay.
parent 0c604ff1c2
commit 6d9e23b1be
@@ -1 +1,9 @@
-# hello friends!
+# Hello Friends!
+# If you are considering bulk or automated crawling, you may want to look in
+# to our protocol (API), including a firehose of updates. See: https://atproto.com/
+
+# By default, may crawl anything on this domain. HTTP 429 ("backoff") status
+# codes are used for rate-limiting. Up to a handful concurrent requests should
+# be ok.
+User-Agent: *
+Allow: /
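For reference, if "expensive" routes are ever added, the Crawl-Delay directive mentioned in the commit message could be appended to this file. Below is a minimal sketch, not part of this commit: the 10-second value is an illustrative assumption, and Crawl-Delay is a non-standard directive that only some crawlers honor.

# Hypothetical future revision (illustrative only): ask crawlers to
# pause between requests once expensive routes exist on this service.
User-Agent: *
Crawl-Delay: 10
Allow: /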
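The "HTTP 429" note in the new robots.txt describes how rate-limiting is signaled to crawlers. As a sketch of the client side (an assumption, not code from this repository; the URL, User-Agent string, and backoff parameters are all illustrative), a polite crawler might retry like this in Python:

import time

import requests

def fetch_with_backoff(url: str, max_retries: int = 5) -> requests.Response:
    # GET a URL, backing off and retrying whenever the server answers 429.
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.get(url, headers={"User-Agent": "example-crawler/0.1"})
        if resp.status_code != 429:
            return resp
        # Honor Retry-After if present (assuming a numeric value); otherwise
        # fall back to doubling the wait between attempts.
        retry_after = resp.headers.get("Retry-After")
        time.sleep(float(retry_after) if retry_after else delay)
        delay *= 2
    raise RuntimeError(f"still rate-limited after {max_retries} attempts: {url}")

print(fetch_with_backoff("https://bsky.app/robots.txt").status_code)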