Categories: Crawling, indexing & ranking

Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided)

Showing 1-16 of 16 messages
Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) arikuchinsky 1/20/11 2:22 PM
I have read the FAQs and checked for similar issues: YES
My site's URL (web address) is: http://socialnews.biz
I am writing this post to alert Google that the “reconsideration request” process is grossly inadequate in its current form. Below, I will describe the problem and offer a solution.

GOOGLE’S MONOPOLY-LIKE STATUS

Currently, Google is by far the dominant search engine in the United States (and this is generally true globally as well). I would argue that Google has a monopoly on search and that search is a critical part of the Internet. Put another way, Google search arguably has “utility-like” status in the same manner as an electric provider or an airline. Simply put, it is difficult to locate information without Google. To Google’s credit, “Google” has become a generic term for search, just like Coke, Kleenex and Xerox in their markets.

RELIANCE ON GOOGLE

There are a variety of legitimate business models on the Internet. Almost all of these businesses have one risk factor in common: Google. For example, Q&A sites such as eHow, Quora and Mahalo rely greatly on Google sending search queries to their answer database. Blogs rely on search traffic to grow their audience. Commerce sites typically rely on search traffic for a portion of their traffic and indirectly this means commerce sites rely on search traffic for sales. Frankly, there are not too many websites on the Internet that do not rely on Google for traffic (although the degree varies from website to website).

WHAT DOES BEING BLACKLISTED MEAN?

I define a blacklisted website as one that has had a consistent level of Google organic search traffic over a period of months or years and then, instantly, loses 95% or more of that traffic across all search terms.

BLACKLISTED SITES

Because Google is both a monopoly and a utility, if Google blacklists a website that relies on search traffic, the website can go from a thriving site to a site with no traffic (and no revenue) even faster than a search using Google Instant. Google needs to keep in mind that these websites have real human faces behind them. In some instances, a website being blacklisted means fired employees and financial ruin for the owner of the website. In many respects, it is analogous to a utility company cutting power to a business.


THE CURRENT PROCESS

When a site is blacklisted, Google typically (based on my experience and the reported experiences of others) does not notify the site owner, even if the site owner has provided an email address through Google Webmaster Tools. There is no notification process.

After the website owner conducts checks to determine that Google has blacklisted the website (which can take a lot of time), the only recourse available is to fill out a “reconsideration request” via an online form.  

After submitting the form, the website owner is alerted that the request may take several weeks to process and that there is no guarantee they will even receive a response.

For many sites, losing Google traffic means shutting down. This can be particularly true for a site that has server fees, employees, perishable inventory and other costs. It may be hard for many website owners to “wait it out” without a revenue source. Again, it is similar to a utility company turning the power off without notice. Yet, in this analogy, the utility company won’t tell you whether they failed to receive your payment, whether there is a technical issue on their end, or whether it is some other problem.

WEBMASTER FORUMS AND WEBMASTER GUIDELINES

While the website owner waits, the only resource generally available is the Google Webmaster Forums. The Webmaster Forum can be very helpful for the novice webmaster, but most professionally run websites do not want to raise their issues publicly on a Google forum. Additionally, the responses are often not helpful.

A typical response is to “read the Google Webmaster Guidelines.” The problem with the Webmaster Guidelines is that they are so broadly written that it is really an interpretative question as to whether a website complies except in the most egregious cases.

For example, the quality guidelines state “Provide unique and relevant content that gives users a reason to visit your site first.” Many popular news sites syndicate the same content from the Associated Press. In fact, a certain percentage of every newspaper in the United States is syndicated duplicate content. So should newspaper sites be banned from Google? What if the Dallas Morning News and Chicago Tribune publish the same AP article on their website? Is that duplicate content?

BIG BOYS VERSUS THE LITTLE GUY

As a practical matter, only the little guy is relegated to Reconsideration Requests and Webmaster Forums. If a large company had an issue, the CEO would call their contacts at Google and have the problem remedied. Even if it wasn’t remedied, at least they would have the access to find out what caused the blacklist. Additionally, a large company would have the funds to sue Google, likely seeking an injunction. Thus, through networking and legal action, a large company would not be relegated to “waiting and hoping for the best” through a Reconsideration Request.

THE PROBLEM FOR GOOGLE

I understand that Reconsideration Requests are a structural problem for Google. Globally there are millions of websites. Many of these websites are trying to scam the search rankings using blackhat techniques. Some of these websites are of poor quality, and Google simply does not have the resources to have employees review thousands of Reconsideration Requests, especially when many of those requests lack merit.


THE SOLUTION

The solution is for Google to provide website owners an expedited Reconsideration Request (a response within hours, including a telephone call similar to tech support if necessary). However, it is not reasonable to expect Google to hire an army of employees to handle these requests. The answer is to charge an expedite fee. Perhaps $100? This would allow Google to recoup the cost of dealing with these requests. Moreover, people paying $100 for a Reconsideration Request are more likely to be making a legitimate one. Most of the “noisy” requests would be relegated to the current system. After all, if a website owner is not willing to pay $100 to determine the problem, the website owner is likely not losing much as a result of the blacklist and can wait for a response under the current system.

CONCLUSION

Google is a monopoly and a utility. People depend on Google for their business. Small websites must have a live channel to Google if they encounter an issue with how Google is interacting with their site. A legitimate business would be willing to pay a fee to at minimum get an answer as to why their website is blacklisted. People depend on the Google platform and shouldn’t lose everything without any kind of explanation or method to fix the problem.

SOCIALNEWS.BIZ

I’m writing this post because I spent a year building Socialnews.biz. Socialnews.biz uses semantic web technology to categorize business news and provide curated lists of “trending news.” While Socialnews.biz won’t be the next Facebook, the website has a legitimate purpose and many users find it to be a helpful tool. Last week, Google organic traffic went from its normal levels to zero. I sent a reconsideration request. I tried contacting Google engineers through friends of mine (to no avail so far). It has been a week and I have yet to receive a response. Unfortunately, unlike larger players, I have no direct channel to Google. Out of desperation, I’m now stuck raising this issue publicly in the Webmaster Forums.

If Google had an expedite fee, I would have paid it on January 12th and received a response. That response likely would have consisted of either fixing the issue or telling me what caused the blacklist. Either way, I would have had a way to proceed. Instead, I’m stuck with a hope and a prayer.
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) squibble 1/20/11 2:28 PM
> Additionally, the responses are often not helpful.

Hmm. I wonder what you define as unhelpful. Personally, I find that users think I am very unhelpful when I write what they do not want to read, even though the answer may be totally correct.

For your site, however you arrange, organise and present it, it is still scraped, duplicate content.
 
To expedite the reconsideration, try to make your site more compliant with the infamous guidelines you mentioned. A reconsideration request does not entail explaining the site's purpose or usefulness; it means resolving the issues.
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) arikuchinsky 1/20/11 2:44 PM
Squibble, just out of curiosity, do you work at Google? Second, whether something is duplicate content is pretty subjective. Candidly, much of the site can be considered duplicate content. After all, it is an RSS reader. The site uses feeds, and none of the content is scraped. The list of trending topics and curated lists of trending content are all original content. The presentation is original, and the way the content is packaged provides usefulness. Moreover, the way content is tagged and indexed using semantic technology is original.

Whether something is duplicate content is generally not black or white. Certainly a site that is 100% scraped with no value is duplicate content. Certainly a blog with 100% original writing is all original. Like I said above, many other well known sites fall somewhere in between. Any newspaper will have a certain amount of duplicate content because they syndicate from news services. Should they be banned? I'm not sure if duplicate content is even the issue, but if it was, Socialnews.biz falls somewhere in the middle of that spectrum.

Of course, many people say there is no duplicate content penalty. Google merely filters out duplicate results. I don't have an issue with that. But that isn't the problem with Socialnews.biz. No results are showing up.

Additionally, this post is only partially about Socialnews.biz. The main issue is how Google is dealing with website operators. Providing an expedited process with a fee could be a win for all involved. I'm obviously disappointed about how Google is handling Socialnews.biz, but the bigger problem is how they are dealing with website operators generally.
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) Chris Hunt 1/20/11 3:11 PM
> Google is a monopoly and a utility.

It's not either of those things. Sure, it's the biggest player in the market, but that doesn't make it a monopoly. Nothing about it makes it a utility.


>  People depend on Google for their business.

If they're depending on a single free service for their business, more fool them.


> Small websites must have a live channel to Google if they encounter an issue with how Google is interacting with their site.

You're looking at it now.


> A legitimate business would be willing to pay a fee to at minimum get an answer as to why their website is blacklisted.

Spammers would be willing to pay a fee to get more clues on how the algorithm works. People who had paid that fee would complain mightily if it didn't result in them getting the listing they thought they deserved. People who couldn't afford it would complain that Google gave undue favour to those who could. It'd just be a big can of worms that Google don't want to open.

> I spent a year building Socialnews.biz [...] many users find it to be a helpful tool.

Then what are you worried about? The repeat custom of all those users who you've attracted over the year and who "find it a helpful tool" should tide you over while you figure out what's got Google's goat.


> What if the Dallas Morning News and Chicago Tribune publish the same AP article on their website? Is that duplicate content?

Yes it is, and Google is unlikely to list both versions in a query that returns that article. One page will be listed, and the rest will be filtered out. That's not "blacklisting" either site, since most of the pages on each site will be unique to that paper.

Since your site is built almost entirely from other people's content, a lot more of its pages are going to be filtered out. That's going to leave Google wondering whether the bit that's left is worth listing.


Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) squibble 1/20/11 3:12 PM
Discussions surrounding your site are of questionable value as judgement has been cast and your site has been deindexed.

Paying a fee for reconsideration would be a waste in your instance as you have not amended your site.

Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) arikuchinsky 1/20/11 3:29 PM
Gentlemen, I appreciate your comments. But the very fact that the only channel to discuss this issue is with bulletin board posters, versus receiving an answer from a Google employee, is the problem. I'm sure you gentlemen have valuable skills, but your answers are opinions. Whether a poster on this board thinks I'm right or completely wrong is irrelevant. Ultimately, only a response from Google carries any weight.
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) squibble 1/20/11 3:31 PM
Your site is not deindexed anyway. I thought it was. Is your complaint that you don't return high enough for the content of others?
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) arikuchinsky 1/20/11 3:56 PM
@squibble: I have no issue with how they rank pages. In fact, I would agree that in many cases there are low-quality pages that might need to be filtered into Google's supplemental index. But rather than noindexing those pages, I let Google make the call. The site is indexed, as you state, because I see pages when I do "site:socialnews.biz". But if I do a search for "socialnews", the site used to come up on page 1 or 2. It is now on page 5-6 (indicating a penalty, some might say).

Other pages only come up if I do a very targeted search, for example, inserting the URL or searching for "socialnews.biz" and "[term]". If I do other searches with text from the website, I get scraper sites pointing to Socialnews.biz while my results are filtered out. I wouldn't call that ranking poorly; I would call that being filtered out completely.

Moreover, if traffic were slipping gradually, that would seem to point towards a site losing in the rankings. But this traffic loss was like an on/off switch. Literally, all Google traffic stopped at the same moment. That points towards a site-wide issue rather than certain pages not ranking well or falling in the rankings.
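That on/off pattern can be distinguished from a gradual decline directly in analytics data. A minimal sketch in Python, using made-up daily visit counts (the numbers, the function name and the 95% threshold are illustrative only, not real Socialnews.biz data):

```python
# Hypothetical daily Google-organic visit counts; a real check would
# pull these from an analytics export. Values are illustrative only.
daily_visits = [510, 495, 530, 502, 488, 12, 8, 5, 9, 7]

def looks_like_step_drop(series, threshold=0.95):
    """Return True if average traffic falls by threshold (e.g. 95%) or more
    at a single breakpoint -- a step, not a slide."""
    for i in range(1, len(series)):
        before = sum(series[:i]) / i                  # mean before day i
        after = sum(series[i:]) / (len(series) - i)   # mean from day i on
        if before > 0 and after <= before * (1 - threshold):
            return True
    return False
```

A step drop at a single day suggests a site-wide action; a steady slide suggests ordinary ranking losses on individual pages.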
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) seo101 1/20/11 3:57 PM
Google is not a monopoly.

You just whinging becasue your scraping site is not ranking as high as you would like.  Why did you even spam google with a reconsideration request?

The sooner Google ranks even more of your type of sites lower, the better my search results will become.


"People depend on Google for their business."

Any business who has a business model that depends on Google deserves to fail.
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) arikuchinsky 1/20/11 4:13 PM
Dear Google:

Responses such as @seo101's above are exactly why website developers need a better (and private) system for dealing with reconsideration requests. Website developers encountering problems with Google search are looking for answers from Google (and perhaps technical and helpful comments from others). It isn't helpful for anyone to receive opinions and trolling such as "You just whinging becasue your scraping site is not ranking as high as you would like.  Why did you even spam google with a reconsideration request?" It is a waste of the poster's time, and it is a waste of my time to have to read it (especially with all the typos and grammatical errors).
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) leonel80 1/20/11 8:27 PM
Wow. I am just stunned by this discussion. First, I usually do not post here, but I need to clarify something. Google is a private business that is involved in the search market. Google is a for-profit organization, not the government or something you should take for granted. They have their proprietary ranking system that, for good or for bad, ranks the Internet for search queries.

I can tell you that I have lost more than 80% of my Google traffic. I have posted my site here and have been told things that nobody wants to hear, but guess what, that will make you stronger. It is like going to the doctor's office for headaches and finding out that you have cancer. Yes, it hurts, but you have to learn from it. You either implement the advice or not. But Google has no obligation to rank any website better than another. Again, it is a private company, not your city government.

I have filed reconsideration requests and never got a response back, but you have to take responsibility for your own business. Now, I enjoy writing unique content that I know will enrich readers. My traffic is growing, and not because Google is sending more people, but because people are sticking to the site. Maybe in the future Google will reconsider my site and give my rankings back, but in the meantime I will keep building a strong website with lots of unique content, fully compliant with Google's rules.
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) JohnMu 1/22/11 3:17 PM
Hi arikuchinsky and a belated welcome to the forum!

In general, reconsideration requests are processed within a few days, and if you do not see changes, then it's possible that you have not remedied the issues (or, in some cases, it's possible that the site does not have anything specific that can be affected by a reconsideration request). The forums here are often even better than feedback from Google -- the users here are fast, knowledgeable and extremely honest. Taking Leonel80's comparison and modifying it slightly, I'd say it's more like going to the doctor with a headache, hoping for a pill, and finding out that you should stop eating chocolate & go running more often; it's not easy, and maybe you'll decide to ignore it, but the doctors here are fairly experienced :-).

Looking through your site, I agree that you do have some novel ideas there, but overall, users expect to find unique and compelling content in the search results, so indexing content that's already been indexed before, and showing those copies in search results, might not always be what users want. Our algorithms work hard on providing users a variety of content. 

One thing I would try to do with a site like that is to make sure that the crawlable and indexable content is all of high quality, and unique and valuable. It's fine to sometimes reuse content from elsewhere -- as long as that's not the bulk of your site and provided it's blocked from crawling and indexing appropriately. You can find more about our stance on this kind of content in the links below. 
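As a generic illustration (not specific advice for this particular site), a page built mostly from reused feed content can be kept out of the index with a robots meta tag while its links are still followed:

```html
<!-- In the <head> of pages built mostly from syndicated feed content:
     ask search engines not to index the page, but still follow its links -->
<meta name="robots" content="noindex, follow">
```

Alternatively, whole sections of reused content can be kept out of crawling entirely with a Disallow rule in robots.txt.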

Hope it helps!
John
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) arikuchinsky 1/22/11 7:06 PM
@JohnMu,

I appreciate that you have taken the time to respond on behalf of Google. The main goal of my original post was to provide feedback to Google that might help other webmasters. Google can accept the feedback in full or in part, or discard it completely. In my opinion, beefing up the procedures for reconsideration requests is something Google needs to do to protect its "do no evil" motto. At this point, this post is for the benefit of other webmasters. The current process has already failed Socialnews.biz.

Search is a crucial component of Socialnews.biz's article ranking algorithm. Google controls search (funny that most people would agree with the term "control," but certain folks won't accept the term monopoly; both denote "market power" over Internet search). Socialnews.biz cannot run and curate news without search traffic. Because Bing, Yahoo and other engines have such a minority share of search, Google's unilateral decision to remove Socialnews.biz from its listings destroyed its utility and revenue stream in an instant. Without utility, the website serves no purpose (without search traffic, the ranking algorithms will not work). Without revenue, expenses such as paying for dedicated servers cannot be met; moreover, because I have no feedback from Google, there is no reason to believe Google will ever reverse its decision. In fact, I likely will not receive any kind of response to my reconsideration request. It is hard to keep paying the bills when it is doubtful I'll even get a response. Some say to fix the problem. But how can you fix the problem if Google will not even tell you what the problem is?

I reiterate that I appreciate your response. I realize you did not have to reply. With that being said, the generic response is not helpful to me, and the process of providing generic responses is not helpful to other webmasters. The guidelines you reference are written in broad strokes. For example, I consider Socialnews.biz's list of curated articles to be original content. Others on this board consider it to fall into "little or no original content." When you take a snippet of an article and aggregate it with information referenced in the article, such as a description of the people and companies it mentions, is that duplicate content? Or does the unique presentation constitute original content?

Certainly, at the extremes, it is easy to see where certain websites fall. A website with an original blog post is certainly original content. A hacker with a scraper site or "auto blog" is certainly duplicate content and not original. But what about websites that fall in the middle? For example, where would a popular site such as Techmeme or HuffingtonPost fit within these guidelines? They are certainly not wholly original and they are certainly not worthless scraper sites.

You said:

> One thing I would try to do with a site like that is to make sure that the crawlable and indexable content is all of high quality, and unique and valuable. It's fine to sometimes reuse content from elsewhere -- as long as that's not the bulk of your site and provided it's blocked from crawling and indexing appropriately.

Again, this is the problem with the Webmaster Guidelines. What is high quality content? Is this subjective? Is quality an algorithmic determination made by Google? I consider the curated lists and presentation of content on Socialnews.biz to be original. Many people would come to the same conclusion. I'm sure there are others that consider it low quality and don't like the website. So except at the extremes, how do these guidelines really help?

Google enjoys a powerful position controlling search. If Google is going to block a site's pages from its listings, I think the webmaster owning the site in question should at least be entitled to a prompt response letting them know that action has been taken against their website (so the webmaster knows it is a penalty) and what the cause of the penalty is.

Example of a helpful but negative response to reconsideration request:

Dear Webmaster:

Google has recently filtered your website out of our index. Your site was filtered out because Google deemed that your site does not have original content. Our algorithm scores originality on a scale of 1-100. Only sites scoring above 10 out of 100 are displayed within the results. Your site has an originality score below 10.

----

While not great for the webmaster, at least the webmaster is alerted of the problem and knows that the algorithm made an objective analysis of the issue with the website.
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) arikuchinsky 1/22/11 7:12 PM
Put another way, Google should give webmasters due process if a penalty is levied on a website. The current system is a far cry from due process under any objective standard and certainly under a "due no evil" standard.
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) being 1/25/11 9:15 AM
Just a remark here regarding Google Spam report form:

The single most annoying form of spam is low-content websites. These are being created by SEO firms as their regular business. They call it "White SEO" (!). They create a so-called "legitimate content website" and use it as a platform to send links to the websites of their clients. The website itself contains low-quality, worthless "articles."

That is the most frustrating and common way of spamming. I have found, in my experience, that Google does not take any action at all on such websites, despite reports that I have made. In my own niche/keyword, half of the websites on the first page are getting backlinks from such SEO-created content websites. As time moves on, and as Google refuses to recognize them as spam, their position actually becomes stronger (!).

All one has to do is create a website about various construction topics (keywords) and then create links to the client of that particular topic. Google then treats it as a legitimate link, and the result is awful: a low-quality website, from someone who does not even know how to write, becomes second or third in the search results.

Google should state clearly in the Spam Report form that such websites should be reported.

In general, we all know that an SEO can never create a good content website that is useful for the user, since he does not know the topic he is pretending to write about. So why not just state the truth boldly and clearly for what it is: stuff made for search engines instead of for the user?
Re: Google Reconsideration Request Process Is Grossly Inadequate (Solution Provided) weighttrain 2/7/11 10:14 AM
Did you shut your site down?