Re: Think you're affected by the recent algorithm change? Post here.
Mar 6, 2011 12:02 PM
Posted in group:
Crawling, indexing & ranking
I spent quite a bit of time looking through some of the comments on the sites that were hit by the farmer algorithm. From the gist of what they are saying, I think Google is separating out sites based on which site it thinks generated the original document. If Google thinks you post a lot of copied content, you end up in their naughty column.
For good sites caught in the algorithm, I think Google needs a better system. Google depends on spiders to find new content. Submitting a sitemap simplifies the process, but it still takes time, and Google does not index everything either. There are scraper sites scraping in real time at this point, so the chance that they might actually beat you in the "precedence" game is very real.
So if Google thinks that half of your content is yours and half is scraped, what does the farmer algorithm do? It appears that it gives the traffic to the scrapers. So what is an honest man to do?
I suggest that we ask Google to set up a system that allows text/movies/pictures to be submitted to their engine along with the domain they will be posted on. This would allow a hash code to be created with an embedded time stamp. If it were encrypted with 256-bit encryption, with Google holding the key, and the encrypted code handed back to the submitter to add to the page as a silent HTML tag, there would never be another question of precedence.
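To make the idea concrete, here is a rough sketch of how such a registration service might work. This is purely hypothetical: it uses HMAC-SHA256 (a keyed hash that only the key holder can produce or verify) in place of the 256-bit encryption described above, which gives the same unforgeability guarantee with simpler machinery. All names, the key, and the token format are my own inventions for illustration.

```python
import hashlib
import hmac
import json
import time

# Hypothetical secret held only by the search engine. Because nobody
# else has it, nobody else can forge a valid token, and the embedded
# timestamp settles any later question of precedence.
SECRET_KEY = b"held-by-the-search-engine-only"

def issue_token(content: bytes, domain: str, timestamp=None) -> str:
    """Submitter side: hash the content, bind it to the domain and a
    timestamp, and sign the result. The returned token is what the
    submitter would embed in the page as a silent HTML tag."""
    ts = int(time.time()) if timestamp is None else timestamp
    digest = hashlib.sha256(content).hexdigest()
    payload = json.dumps({"domain": domain, "hash": digest, "ts": ts},
                         sort_keys=True)
    sig = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{ts}.{digest}.{sig}"

def verify_token(token: str, content: bytes, domain: str) -> bool:
    """Engine side: recompute the signature over the claimed fields,
    then check the page content actually matches the registered hash."""
    ts_str, digest, sig = token.split(".")
    payload = json.dumps({"domain": domain, "hash": digest, "ts": int(ts_str)},
                         sort_keys=True)
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return (hmac.compare_digest(sig, expected)
            and hashlib.sha256(content).hexdigest() == digest)

# The token could then sit in the page as something like:
#   <meta name="content-token" content="1299441720.ab12cd....9f8e...">
```

A scraper who copies the page (and even the tag) gains nothing: the token is bound to the original domain and carries the original timestamp, so the engine can always tell who registered the content first.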
The nicest thing about this idea is that it could be completely automated and be nearly 100% bulletproof. It will not, of course, change the storm that we are in the middle of, but it would certainly take the wind out of the sails of the scrapers.