
Re: [Sks-devel] Request: Install an efficient robots.txt file


From: m
Subject: Re: [Sks-devel] Request: Install an efficient robots.txt file
Date: Tue, 20 Jun 2017 23:03:49 -0400 (EDT)

Hi,

Thanks for pointing this out. I never thought about robots.txt on my keyserver, 
and sure enough, it was missing.

It's easy to add: just drop it into the web/ directory inside the SKS keyserver 
directory. I did have to reload sks to get it to see the new file. It's now in 
place on gpg.n1zyy.com!
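
For reference, the steps on my machine looked roughly like this; the basedir
path and service name are assumptions and will differ between installations:

cd /var/lib/sks              # assumed SKS basedir; check your sksconf
mkdir -p web
cat > web/robots.txt <<'EOF'
User-agent: *
Disallow: /pks/
EOF
sudo systemctl restart sks   # service name varies (e.g. sks or sks-db)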

Beyond the privacy implications, I'm also happy to keep search engines from 
performing a ton of 'searches' on my server by following links between signed 
keys!

(And while it's true that _public_ keys are, well, public, I'm happy to not be 
directly giving Google et al. email addresses linked to real names.)

-----Original Message-----
From: "robots.txt fan" <address@hidden>
Sent: Tuesday, June 20, 2017 4:35am
To: "address@hidden" <address@hidden>
Subject: [Sks-devel] Request: Install an efficient robots.txt file

Dear Sirs and Madams,

I would like to thank all of you for doing this. You are a necessary pillar of 
PGP, and it is awesome that you are there to provide the infrastructure to host 
everyone's keys.

Without attempting to diminish the previous sentence, I have a request to make 
to some of you.

Most of the SKS keyservers serve an efficient robots.txt that prevents 
everyone's un-deletable name and email from showing up on search engines. 
However, there are some exceptions. I like to keep a low profile, but when 
searching for my name, for example on Google, a significant number of results 
are from SKS pages, or to be more specific, these:

keyserver.nausch.org
pgp.net.nz
pgp.circl.lu
keyserver.rayservers.com
sks-keyservers.net
keyserver.mattrude.com (special case: blocks /pks, but not /search, a 
non-standard (?) directory)

I would like to ask the owners of these pages to take the time to install an 
efficient robots.txt file, for example something like this:

User-agent: *
Disallow: /pks/
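
For servers that also expose a non-standard /search path, like the one noted
above, a slightly broader file might look like this (the extra line is an
assumption based on that single example):

User-agent: *
Disallow: /pks/
Disallow: /search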

To all others, I would like to ask you to take the time to check if your server 
serves an efficient robots.txt file, and if it does not, to please install one.
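
A quick way to check is to fetch the file directly, for example (the hostname
is a placeholder; 11371 is the default HKP port, though a reverse proxy on
port 80 or 443 may answer instead):

curl -s http://your.keyserver.example:11371/robots.txt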

If there is any doubt that a robots.txt file is a good idea, I can elaborate on 
that.

Thank you for your time.

RTF



