From 6d9e23b1be92f00304ca267b49a8339e9b505ee7 Mon Sep 17 00:00:00 2001
From: bnewbold
Date: Sun, 7 May 2023 12:19:56 -0700
Subject: [PATCH] bskyweb: update robots.txt (#595)

This makes crawling more explicitly allowed, communicating expectations.
If we ever end up with "expensive" routes on this service, we will want
to add a Crawl-Delay directive.
---
 bskyweb/static/robots.txt | 10 +++++++++-
 1 file changed, 9 insertions(+), 1 deletion(-)

diff --git a/bskyweb/static/robots.txt b/bskyweb/static/robots.txt
index d3475984..4f8510d1 100644
--- a/bskyweb/static/robots.txt
+++ b/bskyweb/static/robots.txt
@@ -1 +1,9 @@
-# hello friends!
+# Hello Friends!
+# If you are considering bulk or automated crawling, you may want to look
+# into our protocol (API), including a firehose of updates. See: https://atproto.com/
+
+# By default, you may crawl anything on this domain. HTTP 429 ("backoff")
+# status codes are used for rate-limiting. Up to a handful of concurrent
+# requests should be ok.
+User-Agent: *
+Allow: /
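
Note: if Crawl-Delay does become necessary for "expensive" routes, a
minimal sketch of the stanza follows. The 5-second value is illustrative
only, not part of this patch; support for the directive varies by crawler
(Googlebot, for example, ignores Crawl-Delay and relies on HTTP 429 and
Search Console settings instead):

  # Ask well-behaved crawlers to wait ~5s between requests (illustrative value)
  User-Agent: *
  Crawl-Delay: 5
  Allow: /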