I used to work there a lifetime ago, and it was actually my job to respond to takedown requests. There's this thing called 'robots.txt' that lets websites tell others (search engines and such) what content should be off limits. For example, the robots.txt file might have a line that says "Disallow: /cgi", and search engines would skip over everything in the /cgi directory. And yes, to the others mentioning DMCA takedown requests: the archive does honor those, but that's a last resort.
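A minimal robots.txt along those lines might look like this (the paths here are just illustrative):

```
# Rules apply to all crawlers
User-agent: *
# Keep crawlers out of the /cgi directory
Disallow: /cgi
# Everything else remains crawlable by default
```

Each `User-agent` line names which crawler the rules below it apply to (`*` means all of them), and each `Disallow` line lists a path prefix that crawler should skip. Note that robots.txt is purely advisory; well-behaved crawlers honor it, but nothing technically enforces it.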