I have read the FAQs and checked for similar issues: YES
My site's URL (web address) is: site in question is not live yet
Description (including timeline of any changes made):
I'm not used to trying to block search engines from anything - I usually try to encourage them!
2010-09-08
Why not place the script(s) in a directory and block that directory (disallow) using robots.txt?
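As a rough sketch of that suggestion (the /menu-scripts/ directory name is just an example), the robots.txt would look like this:

```
User-agent: *
Disallow: /menu-scripts/
```

The robots.txt file has to sit at the root of the site (e.g. example.com/robots.txt). Note that Disallow only blocks crawling; a URL that is linked from elsewhere can still appear in the index without its content.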
That's actually the plan. Like I said, I have no experience blocking spiders - I have spent the last five years trying to get them to index my site, and I'm totally clueless about blocking them.
And I don't want to annoy the Google gods by looking like a link farm, which I think the site might resemble if I don't do this correctly. Which is why I'm asking for help here.
Unless there is a LARGE number (50+) of external links in the menu, you'll be fine. Internal links should never get you flagged as a link farm, so you should be fine there.
To be honest, in most cases it's far better to build the menu in HTML where possible, even if you need JS for the drop-downs, simply so Google can see your link structure more clearly.
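A rough sketch of that approach (class names and URLs here are made up): the links live in plain HTML, and the JS only adds the drop-down behaviour on top, so crawlers and no-JS users still see the full link structure.

```html
<!-- Links are ordinary HTML, so crawlers can follow them
     even with JavaScript disabled -->
<nav>
  <ul class="menu">
    <li>
      <a href="/products/">Products</a>
      <ul class="submenu">
        <li><a href="/products/widgets/">Widgets</a></li>
      </ul>
    </li>
  </ul>
</nav>
<script>
  // JS only toggles a class for show/hide; removing the
  // script leaves the links fully usable
  document.querySelectorAll('.menu > li').forEach(function (item) {
    item.addEventListener('mouseenter', function () {
      item.classList.add('open');
    });
    item.addEventListener('mouseleave', function () {
      item.classList.remove('open');
    });
  });
</script>
```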
What about putting a robots.txt in the webroot of your server with noindex, nofollow?
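One note of caution on that: noindex and nofollow are not robots.txt directives; they belong in a meta tag in each page's head (or in an X-Robots-Tag HTTP header). A minimal example:

```html
<meta name="robots" content="noindex, nofollow">
```

Also worth knowing: a crawler has to be allowed to fetch the page to see that meta tag, so don't Disallow the same page in robots.txt if you're relying on noindex to keep it out of the index.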
IF the links are in JS, then the chances are that G will not associate any value with them, as G tends to ignore JS.
(Yes - it may look, see the URLs and note them, but it won't pass value, so it won't penalise, as no manipulation occurs.)
If instead you have the links in the main code as HTML, and use JS to do fancy things ... then blocking the JS won't achieve anything anyway.
To disallow - just follow the robots.txt info linked from the FAQs - crawling/indexing/ranking - Don't Index section.
You realise that if the links are JS-only, some users may not be able to see or use them (only a small %, but it's worth noting for the sake of accessibility).
cool - thanks for that