|Change frequency||Richard Heyes||7/1/12 3:44 PM|
At the moment I'm constantly (usually weekly, but it can be every few days) updating my site to try and move it up in the rankings. Would the fact that it's constantly changing detract from the rank at all (as it never seems to get anywhere)?
|Re: Change frequency||webado||7/1/12 6:17 PM|
It can only move up the rankings if there are more backlinks from relevant and related sites, and of course if the content is the best match for the searches.
Updating regularly can prompt Googlebot to visit a bit more frequently to recrawl, but you won't usually see any immediate results.
If you update more frequently than Googlebot crawls it will take longer to catch up. That's my opinion at least.
Mind you it depends on the type of website.
For a "normal" site (i.e. not a blog), adding extra pages that are there to stay for the long haul is OK. Eventually the robot gets to them.
For a blog, adding posts more often than the robots crawl can result in the site never being cached quite up-to-date.
|Re: Change frequency||Richard Heyes||7/2/12 1:56 AM|
But the frequency of updates wouldn't cause a drop by itself would it? Sites are changing all the time and I would have expected that fresher sites would do better than stale ones.
|Re: Change frequency||Phil Payne||7/2/12 5:42 AM|
> But the frequency of updates wouldn't cause a drop by itself would it? Sites are changing all the time and I would have expected that fresher sites would do better than stale ones.

Absolutely not - Google does indeed tend to prefer 'fresher' sites. If such updates are a regular feature of your site, I seriously recommend providing Google with an up-to-date XML sitemap whenever you make a change. If your changes are genuine and you submit an updated and honest sitemap each time, you will find that Googlebot will start to 'trust' you and spider each changed page within minutes.
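The sitemap advice above can be sketched as a small script that regenerates a sitemap with fresh `lastmod` dates each time pages change; the URLs and dates below are hypothetical placeholders, and in practice you would write the result to `sitemap.xml` at your site root and resubmit it in Webmaster Tools.

```python
# Minimal sketch: build a sitemaps.org-format XML sitemap so Googlebot
# can see up-to-date <lastmod> values after each site change.
# URLs and dates are made-up examples, not from the thread.
from datetime import date
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: iterable of (url, last_modified_date) tuples.
    Returns the sitemap as an XML string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("http://www.example.com/", date(2012, 7, 2)),
    ("http://www.example.com/about", date(2012, 6, 30)),
])
print(sitemap_xml)
```

The key point from the post is honesty: only bump `lastmod` for pages that genuinely changed, so the dates stay trustworthy.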
|Re: Change frequency||Richard Heyes||7/2/12 9:51 AM|
Is this due to the requests being routed to different datacenters?
|Re: Change frequency||Phil Payne||7/2/12 9:54 AM|
> Is this due to the requests being routed to different datacenters?

Possibly, but it's rarer these days than in the past.