
[RESOLVED] shared ssl / bot blocking

We have a dedicated server (LAMP). For any client on our server that needs SSL, we can use “secure.ourdomain.net/~theirusername/…”. The problem I am trying to solve is demonstrated by doing a Google search like “site:secure.ourdomain.net”, which finds secure pages from many of our clients’ sites.

How do I prevent these pages from being followed and indexed?

SEO

6 Comments

@svidgen (Sep 07, 2010): Refer to http://www.robotstxt.org/robotstxt.html

BUT, be aware that these rules need not be adhered to. And while you'd be hard-pressed to find a major crawler that indexes pages disallowed by robots.txt, there are no legal or "mechanical" reasons a search engine can't simply ignore the file (none that I am aware of, anyway).
@TecBrat (author, Sep 07, 2010): svidgen,

I am familiar with robots.txt. The problem is that I am trying to deal with this at a multiple-domain level. secure.ourdomain.net/robots.txt does not exist, and there is no way for me to create one. I could create secure.ourdomain.net/~username/robots.txt, but anything in that file would actually affect www.usersdomain.com. Do you see the problem? Do you have any other ideas?

I also understand that robots.txt file only applies to polite robots and might actually help impolite ones find sensitive data.
@criterion9 (Sep 07, 2010): Do you have a root-level folder for "secure.ourdomain.net"? If so, place your robots.txt there with a blanket deny-all. That should affect all subfolders (~theirusername/) equally without affecting theiruserdomain.com.
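The "blanket deny all" described here is a two-line robots.txt in the standard syntax documented at robotstxt.org (`User-agent: *` matches every crawler, and `Disallow: /` covers the whole path tree):

```
User-agent: *
Disallow: /
```

Served from the document root of secure.ourdomain.net, this tells polite crawlers not to index anything under that hostname, including every ~username subfolder, while each client's own www domain keeps its own robots.txt untouched.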
@TecBrat (author, Sep 08, 2010): That would have been too easy. Unfortunately, I have not found a way to access such a folder. I have been using SSH a little more lately; maybe I'll poke around on my server and see if I can find it.

Edit: I checked an old ticket with my hosting company, and they said there is no way to write to such a folder.
@TecBrat (author, Sep 08, 2010): I posted another ticket to my hosting company, and they finally gave me the right answer:

For https://secure.ourdomain.net/ the document root is /usr/local/apache/htdocs

So I just had to go there and write that file with pico.
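With the document root known, the whole fix is a quick SSH session. A sketch (pico works too, as in the post above; `printf` just avoids the interactive editor, and the `curl` check at the end is an optional sanity test):

```shell
# On the server, write a deny-all robots.txt at the
# document root of https://secure.ourdomain.net/
cd /usr/local/apache/htdocs
printf 'User-agent: *\nDisallow: /\n' > robots.txt

# Confirm it is actually being served
curl https://secure.ourdomain.net/robots.txt
```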

Hopefully this helps someone else too.
@svidgen (Sep 09, 2010): Glad you got it figured out.