|Possible panda recovery advice.||_cup||6/3/12 4:57 PM|
Recently (27.05) we were hit by some kind of algorithmic penalty, possibly Panda. Traffic dropped to about 30% of what it was. When we search for our domain without the TLD, the site is not in first place.
The important part of the message in Webmaster Tools was:
"Specifically, we detected low-quality pages on your site which do not provide substantially unique content or added value. Examples could include thin affiliate pages, doorway pages, automatically generated content, or copied content. For more information about unique and compelling content, visit ..."
Because we don't do anything to boost traffic except offer users a lightning-fast and usable site/content, we sent a reconsideration request the same day to ask Google why it penalized our project, and immediately started searching for possible reasons. Keep in mind that the site is young (about 1 month online), but it has been about 5 months in development, and we are very busy with development, so this makes our time even harder. Our pages currently have less than 60% of the planned content, because the missing content is generated by user interaction, and many features are still in development and not online yet.
The site is not an affiliate site and has just two AdSense banners per page, in less important places. We put those there to try to cover at least a small part of very expensive hosting fees.
After two days of research we realized that we have about 9k pages that have no content at all and return HTTP code 200. As soon as we realized that, we put a noindex meta tag on those pages. We have read a lot of articles these days and are still confused about which would help us more, 404 or noindex, because those pages are outdated, and it's not a rule that they will remain outdated in the future. But we want a fast penalty recovery so we can move on. For now they return 404 and have noindex.
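For anyone facing the same 404-vs-noindex choice, the intended behavior can be sketched in a few lines. This is a hypothetical Python model of the decision, not the site's actual code: a 200 with noindex keeps the URL reachable but out of the index, a 404 says the page is missing, and a 410 is the stronger "permanently gone" signal.

```python
# Sketch: deciding how to answer requests for outdated/empty pages.
# Returning HTTP 200 for an empty page invites indexing of thin content;
# a 404/410 status or a noindex directive keeps it out of the index.
# All names here are illustrative, not from the original site.

def response_for(page_content, permanently_gone=False):
    """Return (status, headers, body) for a page, keeping thin pages
    out of the search index."""
    if page_content:
        # real content: serve normally, indexable
        return 200, {}, page_content
    if permanently_gone:
        # content will never come back: strongest removal signal
        return 410, {}, "Gone"
    # content may return later: 404 plus a noindex robots header
    return 404, {"X-Robots-Tag": "noindex"}, "Not Found"
```

Either a 404/410 or noindex will eventually drop the URLs from the index; combining 404 with noindex, as described above, is harmless but redundant, since a 404 page is not indexed anyway.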
Besides that, we realized that we have about 37k pages with thin content. For a similar reason the content was outdated: the description was replaced by a note saying the article is outdated, plus links to related content to point visitors in a good direction. So these pages were nice, and Google loved and indexed them; then we removed the content because it was outdated, and Google penalized us. We realized that and put the descriptions back. An important detail is that when content becomes outdated it is automatically removed from the sitemap. It has now been added back to the sitemap, to try to get Google to re-index it and release the site from the penalty.
Also, some pages (1.3k) had the same title in Webmaster Tools (the same pages with an error in the URL rewrite structure, due to a coding mistake); they now have a 301 redirect to the correct URL.
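For reference, the kind of 301 normalization described here can be sketched as follows. The normalization rules and function names are illustrative assumptions, not the poster's actual implementation:

```python
# Sketch: map each malformed URL (e.g. a bad rewrite of a non-ASCII
# slug) to one canonical form and answer with a permanent redirect,
# so search engines consolidate signals onto a single URL per page.

from urllib.parse import quote, unquote

def canonical_path(path):
    """Normalize a request path: decode, drop duplicate slashes,
    lowercase, then re-encode consistently."""
    decoded = unquote(path)
    cleaned = "/".join(part for part in decoded.split("/") if part)
    return "/" + quote(cleaned.lower())

def handle(path):
    canon = canonical_path(path)
    if path != canon:
        # permanent redirect to the canonical URL
        return 301, {"Location": canon}
    return 200, {}
```

With this pattern every mistyped or duplicated variant collapses onto one indexable URL, which is what clears the duplicate-title reports over time.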
No other problems were found after 5 days and many hours of research by the whole team. Are these problems the reason for this kind of penalty?
I hope I have explained the situation. The question is what is best to do now, because the site doesn't have too many pages yet and had about 700 visitors before the penalty:
- Kill the domain and start from scratch on a new domain? (We read that many sites never recover from Panda and never get 100% of their traffic back.)
- Hope the site will be released from the penalty in a short time, if this was the reason?
- Site online 30 days
- PR 3 within the first 15 days
- No inbound links at all (we checked whether someone from a bad neighborhood links to us to hurt our site)
- New domain (no history, never registered before)
- No SEO, marketing, link building, or keyword stuffing (100% by Google's rules)
- 95% of sitemap pages indexed (~400k)
- Well-structured robots.txt to avoid duplicate content from filtering and sorting
- 10% returning visitors, rising every day (we are sure it will be more when we finish everything planned)
- Page load 0.3-0.7 sec without images (images load in less than +1 second)
- Descriptive and very useful content
- 3 subdomains (languages; 10 more to follow, currently in translation)
- Mobile version
- Multi-currency
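For readers wondering what "well-structured robots" means in practice, a robots.txt along these lines blocks the filtered/sorted permutations of a listing page while leaving the canonical URLs crawlable. The parameter names and domain are made up for illustration, not taken from the poster's site:

```
User-agent: *
# block filter/sort permutations of the same listing page
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?filter=
Disallow: /*&filter=

Sitemap: https://example.com/sitemap.xml
```

Blocking these parameterized variants keeps crawlers on the one canonical version of each listing instead of indexing near-duplicates.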
Why doesn't Google give any info in Webmaster Tools before it penalizes a site with such a nasty penalty?! Why is there no info like there is for 404 pages? For example, it could say: "These pages: url, url, url... are identical... are you sure they are supposed to look like that?" or "Are you sure these pages are supposed to return code 200?"
It seems that when Google tries to fight spam, it plays with people's lives, time, and investments.
We simply did not notice these errors before the penalty, because we work hard on content and user experience to make the site useful for users, not for robots.
I hope someone will have time to read these lines and give us some light, because people who search for our site can find only stupid sites that give stats about our site on the first 3 pages?! :S
|Re: Possible panda recovery advice.||StevieD_Web||6/3/12 6:56 PM|
I doubt that is true. Your follow up statement supports my beliefs.
15 days? What in Hades did you do to buy, borrow, steal, or lie your way into a PR 3 rating with a site only 15 days old?
It is called the Google honeymoon period. Sites receive an artificial boost in rankings when they are young, to see how Google's customers react to the site. After the honeymoon period the site is properly positioned.
Male cow patties. PR reflects LINKS. You can't have a high PR value without many, many links.
Really? I doubt it. Again, your PR value is proof of your creative efforts.
400,000 pages? Oh good grief, that sounds like auto-generated spam. 400,000 quality pages in 30 days of effort = 13,000+ pages generated per day.
Generated. Not created. Big difference.
400,000 pages of descriptive and very useful content? I highly doubt it.
some great (sarcasm) duplicate content being auto-generated
I suspect the whole site is made for robots rather than for users. 13,000+ pages per day is just about impossible... actually it is impossible if there is any editorial control, as it would require superhuman reading ability for the senior editor to read and approve all 13,000 pages.
|Re: Possible panda recovery advice.||_cup||6/4/12 5:17 AM|
I don't understand why you don't trust what I said in my post.
I don't want to waste my time explaining what I already said in the first post. I am trying to get advice about a possible Panda recovery strategy, and you are trying to devalue our project and categorize it as spam.
Please, if you have nothing to say and no correct suggestion to give, don't post.
For other people who want to help, I will give additional info to be clear:
- Why PR is 3 we don't know, and we don't care. Many forums say PR is not important for SERPs; if it were PR 0 we wouldn't care. We were just confused about why it is PR 3, and we started searching for answers (checking whether we have some backlinks). Maybe in the next Google PR update it will be PR 5, or PR 0; that is not important to us, as long as Google sees our site as a valuable source and does not penalize us. Users will come; we are sure about that.
- We work very hard on those 400k pages, and the projected site size is ~500k pages by the end of the year, with small variations. (There are sites with millions of pages that are not hit by any kind of penalty.) Why is the number of pages important?! Our users need access to every one of these pages, and the pages have big value for them. That is what is important to us and to end users. This is a big project, a lot of money and time has been spent on it, and an experienced team is involved. It is normal to have a big number of pages.
- Multi-language on subdomains is done in the same manner as on very authoritative sites. If you suggest adding rel="alternate" hreflang="xx", please tell us. We did not, because every language is treated by Google as a separate site (we don't see any spam-team message for the other subdomains in Webmaster Tools). Every subdomain is assigned to the appropriate country in Webmaster Tools. Soon users will be automatically 301/302 redirected to the appropriate regional subdomain, with an option to get back to the English version if they need it, just like it works now with the mobile site version.
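Should hreflang annotations be added later, the markup pattern is a reciprocal set of <link> elements on every language version of a page. A small generator sketch follows; the domain and language codes are placeholders, not the poster's real site:

```python
# Sketch: generate one rel="alternate" hreflang <link> per language
# version of a page hosted on language subdomains. Every language
# version should carry the full set, including a link to itself.

def hreflang_links(path, langs, domain="example.com"):
    """Return one <link> element per language version of `path`."""
    links = []
    for lang in langs:
        # convention assumed here: English on the bare domain,
        # other languages on lang-code subdomains
        host = domain if lang == "en" else f"{lang}.{domain}"
        links.append(
            f'<link rel="alternate" hreflang="{lang}" '
            f'href="https://{host}{path}" />'
        )
    return links
```

Usage: emit `hreflang_links("/app/123", ["en", "de", "ja"])` in the <head> of all three language versions, so each page cross-references the others.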
- Again, we work on community and returning visitors. Of course we need organic traffic, but the projected amount of organic traffic is supposed to be much lower than returning visitors.
WE DON'T DO SPAM! WE DON'T WANT SPAM! WE DO EVERYTHING TO AVOID SPAM! WE DON'T NEED AN ARTIFICIAL SERP BOOST.
We just want users to find our site and continue using it, and we are simply paranoid about possible penalties, because they ruin our efforts to build a highly authoritative product on the market.
The internet is full of trash and contradictory information, and we realized this is the best place to get a real solution to the problem.
|Re: Possible panda recovery advice.||_cup||6/5/12 12:35 PM|
No other responses?!
|Re: Possible panda recovery advice.||fathom||6/5/12 12:56 PM|
I trust you don't trust the first responder so it seems pointless to rehash anything that you refuse to accept.
|Re: Possible panda recovery advice.||jasjotbains||6/5/12 1:03 PM|
A little help by providing your URL, please?
|Re: Possible panda recovery advice.||joecane||6/5/12 3:46 PM|
Good luck getting any real advice here. Most of the people here just want to throw out insults and act self-righteous while telling themselves over and over that Google will give them a pat on the back. There are a few people that really want to help but there are a lot more people that need help.
I would have to say that the rapid development of content has raised a red flag. I agree with you. I wish Google would provide us with specific information. Not just to fix the problem but so we don't keep repeating the same mistake.
|Re: Possible panda recovery advice.||StevieD_Web||6/5/12 3:59 PM|
>Good luck getting any real advice here. Most of the people here just want to throw out insults and act self-righteous while telling themselves over and over that Google will give them a pat on the back. There are a few people that really want to help but there are a lot more people that need help.
Try again, dude. We helped identify the problems with your doorway site; you just don't like the answer.
|Re: Possible panda recovery advice.||1918 (deprecated)||6/5/12 5:55 PM|
Can you provide us the URL of your site? You can use a URL shortener (like bit.ly) to keep your domain from appearing directly.
If we could take a look at your site I think we would all be impressed by the work you've done.
|Re: Possible panda recovery advice.||_cup||6/5/12 6:06 PM|
OK, I was trying to avoid sharing the URL because the site is not completed, but here it is; please check that everything I said in my posts is true.
Today we went through Google Analytics and realized some things:
- The site has been online for two months (not one, as I said), but in the first month it was pure text without any CSS. That is why I said it is one month old: the designers gave us the CSS example one month ago. It means StevieD_Web was wrong about the Google honeymoon, because in the first month we had very small visit numbers.
- Visits started to rise after 1 month, at the same time PR jumped to 3, and kept rising until 27.5.2012.
- We checked, and we have backlinks from the following domains:
mail.com 4 (?!)
facebook.com 2 (From our fan page)
emucr.com 2 (?!)
twitter.com 1 (Our twitter page)
so-gu.com 1 (?!!!)
onthesamehost.com 1 (?!)
redferret.net 1 (?!)
ssdsandbox.net 1 (?!)
webstatschecker.com 1 (?!)
google.com 1 (Our Google+ page)
- We checked many pages via www.copyscape.com, and it displayed only a couple of competitor sites for some articles.
Now you will understand why we have about 400k pages and why there will be no more than 500k.
If any experienced webmasters/coders/designers want to join the project, feel free to contact us and help us finish it faster.
|Re: Possible panda recovery advice.||1918 (deprecated)||6/5/12 6:18 PM|
|Re: Possible panda recovery advice.||_cup||6/5/12 6:21 PM|
Dear, these links do not appear in Webmaster Tools.
How could we know about them?!
Where did you find them?
|Re: Possible panda recovery advice.||fathom||6/5/12 6:21 PM|
The rapid development of ORIGINAL content is never a problem.
If Google told you that developing content on Friday & links on Monday would get you into hot water... you wouldn't just not do that... you would artificially inflate your content & links on Tues-Thurs, and twice on Saturday & Sunday, BECAUSE YOU WOULDN'T GET DEVALUED! Correct?
Announcing the billion-dollar jackpot numbers in advance (5, 7, 19, 23, 34, 35 and a bonus 49) would get everyone gravitating toward being a winner... everyone would beat the system... just because they can.
|Re: Possible panda recovery advice.||bigasssuperstar||6/5/12 6:33 PM|
Well, I went to your site. I clicked on one of the apps listed on the front page. I highlighted a big, long sentence from one of the pages. I searched Google for it, in quotes.
I only found 3710 other web sites that had the exact same phrase.
I gave up comparing after that. If the first thing I checked on your site was duplicated exactly on 3700 other sites, I'm going to guess there's plenty of other stuff there that's also not original and unique.
Best of luck.
|Re: Possible panda recovery advice.||StevieD_Web||6/5/12 6:36 PM|
> I only found 3710 other web sites that had the exact same phrase.
You be careful, bigasssuperstar, somebody is going to call you out for being rude.
PS: I love the name and the response !!!
|Re: Possible panda recovery advice.||1918 (deprecated)||6/5/12 6:38 PM|
|Re: Possible panda recovery advice.||_cup||6/5/12 6:50 PM|
I just tried to do the same, and I got 2k pages for the first sentence of a description. I visited the sites and realized that the results on the first 3 pages of the SERP, apart from 2 authoritative sites, are just a bunch of junk WordPress robot-scraped content. We could do the same in half an hour if we wished... lolz.
Does that mean those 3k sites are penalized?
|Re: Possible panda recovery advice.||_cup||6/5/12 6:54 PM|
|Re: Possible panda recovery advice.||_cup||6/5/12 6:58 PM|
Dear Stevie with the big D,
it seems you have a lot of fun on this forum, doesn't it?
Besides, the point of our project is to give strict analytical data that is valuable to developers and to help users quickly find what they are looking for, not to write books about known things like descriptions.
|Re: Possible panda recovery advice.||_cup||6/6/12 6:32 AM|
Well, I will try to cut to the chase:
Possible penalty reasons:
If you find another on-site reason, please tell us so we can fix it.
Google says in the message:
We took this as a reference to try to find the problem:
We can't place our project in any of these categories.
We compared other sites in the same niche and found many that break these rules yet have no penalty like we have. (Our domain does not appear when searching for the domain without the .tld on Google.)
If you need it, I will send you a big list of those sites (scraper sites with no added value and a big number of ads all over the page).
That means only the following:
Important: the problem started somewhere during project development and is related to on-site changes. We are unable to track what triggered the penalty, because we can't connect the development timeline with the moment of the penalty.
The only clue is that the penalty was triggered by our removal of pages that have no value for site visitors, and those pages returned code 200 instead of 404. Thanks for reading.
|(unknown)||6/6/12 8:36 AM||<This message has been deleted.>|
|Re: Possible panda recovery advice.||_cup||6/6/12 10:11 AM|
Thanks for sharing. I read your post. From your experience, I guess we should just sit and wait.
Soon I will update all of you, hopefully with good news.
|Re: Possible panda recovery advice.||_cup||6/13/12 1:56 PM|
After almost a week of on-site changes we are still struggling and do not see any improvement.
beussery (from this forum) helped us and told me that this is a manual penalty, not Panda :( So we spent 10 days reading about Panda, and it is not the issue in our case.
So far we have added rel="alternate" hreflang meta tags to all available page translations, and rel="alternate" media="only screen and (max-width: 640px)" pointing to the mobile site version.
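For readers following along, the desktop/mobile annotation mentioned here follows a standard pattern: the desktop page advertises its mobile counterpart via rel="alternate" with a media query, and the mobile page points back with rel="canonical". A small sketch with placeholder host names, not the poster's actual markup:

```python
# Sketch: the two halves of the separate-mobile-URL annotation.
# The desktop page emits the rel="alternate" link with a media query;
# the mobile page emits a rel="canonical" link back to the desktop URL.

def desktop_link(path, mobile_host="m.example.com"):
    """Link for the <head> of the desktop page, pointing to mobile."""
    return (
        '<link rel="alternate" '
        'media="only screen and (max-width: 640px)" '
        f'href="https://{mobile_host}{path}" />'
    )

def mobile_link(path, desktop_host="example.com"):
    """Link for the <head> of the mobile page, pointing back."""
    return f'<link rel="canonical" href="https://{desktop_host}{path}" />'
```

The pairing matters: without the canonical link on the mobile side, the two URLs can be treated as duplicates rather than as two versions of one page.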
One more interesting thing: we were hit by the penalty right after we integrated user registration on the site. Every page points via a nofollow link to this page: http://goo.gl/HBtP4. Does anybody see this as a possible problem?
We are simply out of ideas. Why does Google take manual action against our project, while dozens of sites with the same content, lower quality, and no added value are not penalized?!
|Re: Possible panda recovery advice.||_cup||6/25/12 3:33 PM|
It seems we are out of luck!
We received a response after the second reconsideration letter, with all possible errors fixed, but the response was:
"We've reviewed your site and we believe that some or all of your pages still violate our quality guidelines. "
As I said in previous posts, we fixed all the issues:
- canonicalization problems
- 404 headers where needed
- all duplicate titles/metas fixed
- rel="alternate" hreflang for site translations
- rich meta tag coding errors (minor)
- minor display problems
And the Google spam team still says we have an error, but does not say where the error is so we can fix it.
If we have a problem on some pages, as Google says, why doesn't Google penalize only those pages?! Why do they penalize the entire site?
Please, somebody, spend a couple of minutes to look at the site and tell us what we need to fix to recover from the manual penalty.
|Re: Possible panda recovery advice.||zihara||6/25/12 4:51 PM|
Sounds to me like the "site" has been fixed... in Search. They usually use a dull-and-rusty scout knife for those things, and no anesthetic.
It is good to see that Google is getting more active against these clearly manufactured-from-stolen-content "websites." Anything that merits manual attention from Google must be over-the-top bad...
|Re: Possible panda recovery advice.||_cup||6/25/12 5:20 PM|
I am sad because you categorize our site that way :(
Do you people understand the difference between content scraping and data mining?
"Data mining, on the other hand, is defined in Wikipedia as the “practice of automatically searching large stores of data for patterns.” In other words, you already have the data, and you’re now analyzing it to learn useful things about it. Data mining often involves complex algorithms based on statistical methods. It has nothing to do with how you got the data in the first place. "
I mean come on ...
I am spending time trying to explain the situation in detail and to get some clues, and no one takes the time to review the site before posting a comment.
The thing that makes me mad is that content scrapers who stole content from us and from 5 other competitor sites in the niche are not penalized, even though they posted the articles many days after Google indexed our pages. I already explained that in another post, but no one replied. (?!)
Also, when you look at the competitor sites, they are not penalized, yet we have better and more accurate data, and we have more content that we are still not displaying because of module beta testing.
Did I mention that one of the competitor sites is owned by two ex-Google employees?
Is that the problem... maybe?
|Re: Possible panda recovery advice.||StevieD_Web||6/25/12 6:19 PM|
>Did I mention one of competitor sites are owned two ex Google employe?
One of my competitors is a Nobel Prize-winning scientist who is 2nd cousin to Elizabeth, is married to Bono's sister, plays B-ball with Obama, and received 2 billion in venture capital $ from Steve Jobs.... but what the heck, I ain't worried, simply because Google doesn't care who you know or don't know..... and their ex-employees don't have any greater advantage than you may have...... besides..... if they are ex-employees, maybe it's because they are rip-roaring stupid and Google got rid of them before Google became AOL.
|Re: Possible panda recovery advice.||_cup||6/25/12 6:28 PM|
I must ask you something. Do you really mean that I lied about everything I said, and do you stand by what you said in your first response?
About the Google ex-employees I guess you're right, but I am becoming paranoid, because this looks like ghost hunting to me.
|Re: Possible panda recovery advice.||StevieD_Web||6/25/12 6:38 PM|
There you go..... it is part of the acceptance...... you can only control what you do..... and that is all that Google really cares about.
My suggestion? Don't point your finger at anybody else. Stop with the attitude that your site is good / great / perfect. Take a step back and start being critical of your own work. When you can find AND accept your own faults the site will be on its way to a monster big improvement.
Sorry if I sound like one of those slimy self-help gurus, but being critical of one's self (or website) is one of the hardest things to do. Correcting one's self (or website) is the 2nd hardest thing to do.
|Re: Possible panda recovery advice.||JohnMu||6/26/12 1:56 AM|
As mentioned by others here, our algorithms really love unique, compelling, and high-quality content. Don't just aggregate, scrape, "mine", rewrite, reprocess or otherwise repurpose existing content. Don't look at other sites that might be "just as bad" -- work to create the absolutely best website by far for your market.
|Re: Possible panda recovery advice.||themuttsknutts||6/26/12 2:37 AM|
Did you really redirect 1.3k pages to a single page with 301s? I think this will look dodgy.
I'm not a user of Android apps, but are they already available from lots of places?
|Re: Possible panda recovery advice.||Phil Payne||6/26/12 3:43 AM|
|Re: Possible panda recovery advice.||_cup||6/26/12 4:15 AM|
We are not pointing at anybody else; we just work very hard to fully comply with Google's guidelines and to find the reason for our penalty, so we look at competitor sites to see what they have that we don't, or vice versa, to find the cause. Then we realized they are breaking Google's quality guidelines on purpose to make millions of trash pages, and Google indexes that trash and gives them unreal SERP positions!!! (You think we can't do the same? We can, but we don't want to.)
Steve Jobs (RIP) once said, "First think about the user experience, then develop everything else." (paraphrasing) We work that way.
In many ways you're right, thanks.
You misunderstood: 1.3k pages pointed to another 1.3k pages, because there were 2.6k pages in total. Because of our bad handling of Japanese URL rewriting, we had a duplicate of each of those pages. We have now written a fix: when somebody mistypes a URL or something goes wrong, the URL is automatically 301-redirected to the correct one.
So in GWT we had 1.3k pages with duplicate titles; now we have 342, and we will have none once Google re-indexes the remaining 342.
John, thank you so much for taking the time to read my post and reply.
By now I understand what the algorithms like and don't like. But our penalty is manual; someone from the Google spam team changed one integer. :) I don't say they are wrong, I just say we don't see the reason. When we hit 12k queries per day, the Google spam team got a notification to check the site, and they saw something wrong.
I just don't understand why everyone here thinks the important part of our site is the app description, obviously and purposely taken. There is only one description, and it can't be changed or rewritten, and it isn't allowed to be; it is written by the developer. Our obligation is to display that data as is and update it when the developer updates it. Look at it as a product label/declaration. Our market is not people who are looking for these descriptions.
Our project's value is behind that, and we know our project's value. That is the reason many hard-working people have joined our team and work absolutely for free.
It is sad that Google, with this penalty, is forcing us to do the following: make a paid service for people who need our data and close the site to bots, instead of sharing the data for free with the rest of the world. That way we will become like SEOmoz, Alexa, and many other sites you know that have value and give you what you need only when you pay.
|Re: Possible panda recovery advice.||_cup||6/26/12 4:26 AM|
We can't look at Play as a competitor. The bottom line is that 60% of all our traffic goes there.
That means visitors find what they are looking for, which is our goal!
|Re: Possible panda recovery advice.||Phil Payne||6/26/12 4:42 AM|
> We can't look at Play as a competitor. The bottom line is that 60% of all our traffic goes there.
And that's precisely the reason. Google would much rather send a user to the fulfiller than to an affiliate who's providing no additional value.
|Re: Possible panda recovery advice.||_cup||6/26/12 5:01 AM|
First of all, we are not affiliated with Google Play.
I am speaking about 20k sites that have no added value and no penalty. Those are not our competitors; that's trash.
We had better SERP positions and got a penalty; they remain at the bottom and have no penalty.
Our real competitors (only 5 of them) have no more added value at this moment, and they have no penalty.
When we were hit by the penalty, our index page was devastated in search. Our inner pages appear 30 or more positions lower than before.
We just don't get it. What put us in the bad basket, and not our competitors?!
If I were a bad person, I would already be sending spam reports against competitors to Google, because they do search-engine spam by making 9+ million indexed pages from content that is enough for 400k pages at most.
|Re: Possible panda recovery advice.||JohnMu||6/26/12 7:28 AM|
I think you really need to take a step back and think about what makes your site unique -- and focus on that instead.
Additionally, it looks like you may be scraping the content from Google Play -- are you aware of the terms of service there? For example on http://play.google.com/about/terms.html it mentions "You agree not to access (or attempt to access) Google Play by any means other than through the interface that is provided by Google, unless you have been specifically allowed to do so in a separate agreement with Google. You specifically agree not to access (or attempt to access) Google Play through any automated means (including use of scripts, crawlers, or similar technologies) (...)"
|Re: Possible panda recovery advice.||_cup||6/26/12 8:01 AM|
You're right; after many discussions we finally got close to the issue. I hope in the next month or two we will be able to fully show the importance and value of our project. :)
About the second issue, we were not aware of that. We just saw that some sites try to display app stats, but in a not-so-successful way, and we gathered an experienced team to prove we could do it better, faster, and more accurately. I just read the page you sent me and realized that Play's robots.txt disallows all robots from indexing. (Even Google?!) Now we need to think about what we are going to do next, because we have spent 6 months of development on this project, which is very important to us.
Thanks to all again.
|(unknown)||6/26/12 8:03 AM||<This message has been deleted.>|