I’m using a robots.txt file, partly generated with Google’s tools, to block direct access to my image folder and to a folder of downloadable files (CVs, desktop designs, PDFs). The contents of these folders (“img” and “dwn”) still show up in Google Images and other search engines. Can anyone tell me what’s wrong?
[code]User-agent: *
Disallow: /img/

User-agent: msnbot-media
Disallow: /

User-agent: Googlebot-Image
Disallow: /

User-agent: Googlebot-Image
Disallow: /dwn/

User-agent: *
Disallow: /Templates/

User-Agent: BDFetch
Disallow: /

User-Agent: BPImageWalker/2.0
Disallow: /
[/code]
(BPImageWalker and BDFetch are malicious crawlers that I want to block entirely.)
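For reference, one likely culprit: a crawler obeys only the single group that best matches its user agent, so the two `User-agent: *` groups and the two `Googlebot-Image` groups will not both be applied; directives for the same agent need to be merged into one group. Also, robots.txt only stops future crawling, so images already indexed can linger until they are recrawled or removed via the search engine’s removal tools. A consolidated sketch (assuming you want every crawler kept out of both /img/ and /dwn/; adjust to taste):

[code]# One group per user agent; each agent reads only its best-matching group.
User-agent: *
Disallow: /img/
Disallow: /dwn/
Disallow: /Templates/

# Googlebot-Image ignores the * group once it has its own group,
# so this group must list everything it should skip. "/" blocks it everywhere.
User-agent: Googlebot-Image
Disallow: /

User-agent: msnbot-media
Disallow: /

User-agent: BDFetch
Disallow: /

User-agent: BPImageWalker/2.0
Disallow: /
[/code]

Note that robots.txt is advisory: the malicious crawlers listed at the end will most likely ignore it, so blocking them reliably takes server-side measures (e.g. user-agent or IP filtering).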