commit a4b5fb2ffbdffd35f6628739cb6e9c766c838ee1 (patch)
author    Robin H. Johnson <robbat2@gentoo.org>  2019-07-22 13:28:15 -0700
committer Robin H. Johnson <robbat2@gentoo.org>  2019-07-22 13:30:38 -0700
tree      564430c228d101d51e8788fc76226d425f723faf
parent    templates: Move RISC-V to ~arch group (diff)
robots.txt: allow more indexing (tag: gentoo-5.0.4.8)
A long time ago, web crawlers were discouraged from crawling Bugzilla over SSL: SSL imposed real CPU load, the content duplicated what was served on the non-SSL side, and the bots handled that duplication badly. In today's SSL-everywhere world, with forced upgrades to SSL and much cheaper CPU, this was never revisited, which means Gentoo Bugzilla isn't indexed by Google. Re-enable the safer endpoints, and let the bots come in!

Fixes: https://bugs.gentoo.org/show_bug.cgi?id=690438
Signed-off-by: Robin H. Johnson <robbat2@gentoo.org>
 robots-ssl.txt | 13 ++++++-------
 1 file changed, 6 insertions(+), 7 deletions(-)
diff --git a/robots-ssl.txt b/robots-ssl.txt
index 3e832535d..808943e32 100644
--- a/robots-ssl.txt
+++ b/robots-ssl.txt
@@ -1,11 +1,10 @@
User-agent: *
-Disallow: *
-Disallow: /
-Disallow: /index.cgi
-Disallow: /show_bug.cgi
-Disallow: /attachment.cgi
-Disallow: /data/duplicates.rdf
-Disallow: /data/cached/
+Allow: /
+Allow: /index.cgi
+Allow: /show_bug.cgi
+Allow: /attachment.cgi
+Allow: /data/duplicates.rdf
+Allow: /data/cached/
Disallow: /query.cgi
Disallow: /enter_bug.cgi
Disallow: /userprefs.cgi
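
For context on why the trailing Disallow lines still take effect despite the new "Allow: /": Googlebot resolves robots.txt rules by longest-match precedence, where the most specific matching path wins and Allow wins a length tie. Below is a minimal sketch of that resolution in Python; the can_fetch helper is illustrative only, not part of any library (note that Python's stdlib urllib.robotparser applies rules in file order, first match wins, so it would judge these rules differently).

# Minimal sketch of longest-match robots.txt precedence, as documented
# for Googlebot: among all rules whose path prefixes the URL path, the
# longest one wins; Allow wins a length tie. Rules mirror robots-ssl.txt.
RULES = [
    ("Allow", "/"),
    ("Allow", "/index.cgi"),
    ("Allow", "/show_bug.cgi"),
    ("Allow", "/attachment.cgi"),
    ("Allow", "/data/duplicates.rdf"),
    ("Allow", "/data/cached/"),
    ("Disallow", "/query.cgi"),
    ("Disallow", "/enter_bug.cgi"),
    ("Disallow", "/userprefs.cgi"),
]

def can_fetch(path: str) -> bool:
    """Return True if the longest rule matching `path` is an Allow."""
    matches = [(len(p), kind == "Allow")
               for kind, p in RULES if path.startswith(p)]
    if not matches:
        return True  # no rule matched: crawling is permitted by default
    # Longest path wins; on a length tie, Allow (True) sorts above Disallow.
    return max(matches)[1]

# Bug pages become crawlable, while expensive endpoints stay blocked.
assert can_fetch("/show_bug.cgi")      # Allow: /show_bug.cgi
assert can_fetch("/data/cached/x")     # Allow: /data/cached/
assert not can_fetch("/query.cgi")     # Disallow: /query.cgi beats Allow: /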