Google Product Forums
Re: Think you're affected by the recent algorithm change? Post here.
Robert - NoVa
Mar 15, 2011 1:25 PM
Posted in group: Crawling, indexing & ranking
I don't know if this is related, but... starting Feb 27, Googlebot began finding thousands of bogus URLs on my site, such as
However, this is a composite of several valid URLs.
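For what it's worth, one common way such composite URLs get created is when a page uses relative links (no leading slash) and a crawler resolves them against the current directory instead of the site root. A minimal sketch with Python's stdlib — the domain and paths here are placeholders, not from my site:

```python
from urllib.parse import urljoin

# A page at /blog/2011/ that links to "news/2011/" (relative, no leading
# slash) resolves against the current path, yielding a composite URL that
# stitches two valid paths together:
base = "http://example.com/blog/2011/"
bogus = urljoin(base, "news/2011/")
print(bogus)  # -> http://example.com/blog/2011/news/2011/
```

If a crawler follows that resolved link, every valid path can spawn a family of bogus composite paths.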
This was not a problem previously. I also got the following:
"Restricted by robots.txt (24,468)"
The value should be under 300. Apparently, since Google now thinks that my site is some kind of spammer (or something), my search rankings have tanked, as has my Alexa ranking.
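In case it helps anyone sanity-check the "Restricted by robots.txt" count, you can test which URLs your own robots.txt actually blocks with Python's stdlib robot parser. The rules and URLs below are placeholders, not my actual file:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt given as a list of lines (placeholder rules):
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Check whether Googlebot may fetch specific URLs under those rules:
print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/public/page.html"))   # True
```

If valid URLs come back blocked here, the robots.txt itself is the problem; if they come back allowed, the inflated count is more likely stale data on Google's side.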
This also appears to be associated with my site no longer being "verified". (I have fixed that, but an explanation on the site would have been nice.)
Also, once this is fixed, how do I clear the robots.txt cache so that only valid data is displayed?