I have read the FAQs and checked for similar issues: YES
My site's URL is:
THE USER'S EXPERIENCE:
1. A user requests my web page.
2. The web page is served without any affiliate links or content.
3. The user's web browser fires the onload event.
4. A function defined in my external .js file is called.
5. The function adds affiliate links/banners to my web page.
6. The user sees a web page that contains affiliate links and content.
GOOGLEBOT'S EXPERIENCE:
1. Googlebot requests my web page.
2. The web page is served without any affiliate links or content.
3. My external .js file is blocked by robots.txt.
4. Google never sees any affiliate links or content.
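The two experiences above can be sketched as a single external script. This is a hypothetical illustration of the technique being described, not the OP's actual code; the file path, element id, and link data are all made up:

```javascript
// affiliate.js -- the external file blocked by robots.txt, e.g.:
//
//   User-agent: *
//   Disallow: /js/affiliate.js
//
// Users and Googlebot receive identical HTML, but Googlebot never
// fetches this script, so it never sees the injected links.

function buildAffiliateHtml(offers) {
  // Turn a list of {url, label} offers into anchor markup.
  return offers
    .map((o) => '<a href="' + o.url + '" rel="nofollow">' + o.label + '</a>')
    .join('\n');
}

// Steps 3-5 of the user's experience: run only after onload, in a browser.
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    var slot = document.getElementById('affiliate-slot');
    if (slot) {
      slot.innerHTML = buildAffiliateHtml([
        { url: 'https://example.com/aff?id=123', label: 'Partner offer' },
      ]);
    }
  });
}
```

With the script disallowed in robots.txt, the crawler indexes the page exactly as served and the injection step simply never runs for it.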
Important details to consider:
1. The HTML document served to users and Google is exactly the same.
3. There are no script tags or noscript tags in the HTML body.
The question is...
Does this violate Google's Webmaster/Quality Guidelines in any way?
Or in other words...
Is this something that Google's web spam team would penalize in any way?
If possible, I'd really like to hear from John Mueller or another Google employee who can give us an official answer here. Thanks in advance! :)
If you show Googlebot one thing, and users another, you are playing with fire. Google needs to see what a user will see, not what you want them to see or not see if it's different than what a searcher will see if Google sends them to your site. Keep it simple, keep it transparent, no reason to jump thru those hoops.
Here is the relevant quote from the guidelines:
Quality guidelines - basic principles
That is the official position...
QUOTE: "Google needs to see what a user will see, not what you want them to see or not see if it's different than what a searcher will see if Google sends them to your site."
I don't get that impression from this auto-response:
And I seriously covered my behind with;
============= Are you sure you want to do it? =============
After reading the text immediately above (There may be some Downsides),
are you 100% sure you want to do it?
At the end of the day - if you don't want it indexed, then maybe it shouldn't be on the page?
If you are happy that a % of users may not benefit from that content - are you sure that the remaining % really need it?
Does that content really do your Users any favours/benefit?
QUOTE: "Keep it simple, keep it transparent, no reason to jump thru those hoops."
The reasons for jumping through these hoops are:
1. Prevent passing PageRank through affiliate links.
2. Decrease the amount of time it takes to load the page.
Hi SEOmofo. You got a direct quote from the Google guidelines which says don't show users different content than search engines. It can't be more clear. You are misinterpreting the auto response to justify deceiving Google.
Refer to my playing with fire advice above.
Your time would be better spent making a better site and not a site that skirts or breaks good practices, common sense, and the guidelines.
Whatever you decide going forward, good luck.
So - the Big Question is .....
? Why ?
Why do you want to avoid showing your affiliate stuff to Google?
The only conceivable reasons I can .... well ... conceive of (darn sucky sentence!) are;
* you don't want the additional Links to affect PR flow
* you don't want the additional Content to affect Relevancy
In both cases ... that means you are specifically/intentionally attempting to alter how G perceives the page/content for Ranking purposes.
If you disagree - please - I'd love to see a different view of things.
But in fact you can use it; after all, AdSense and other ad server services do that, and it will work to prevent discovery and crawling of the URLs quite OK.
AdSense is actually a semi-valid example.
There are a few others like it too .... and a few feed services that are JS-reliant.
But I think - as with quite a few things for G - it's "intent" that carries the weight.
If G perceives it as an attempt at manipulation ... it may have consequences.
QUOTE: "The reasons for jumping through these hoops are:"
SEOmofo, you are wasting your time. Both of those reasons show an intent to violate Google guidelines and fool and deceive Google.
Great idea... let's load all outgoing links only to users, not to Google so they don't dilute pagerank. Do you think you are the first person that thought of this?
Great idea ... let's fool Google about the actual loading time of our pages by showing Google a fast page while serving the users the actual dog slow one. Brilliant.
Say it out loud a few times to a competitor and see how it goes over. Why comply with the guidelines when you can scheme to violate them and figure you will get away with it. Good luck whatever you decide.
QUOTE: "After reading the text immediately above (There may be some Downsides), are you 100% sure you want to do it?"
No, I'm only 90% sure. The other 10% is waiting to hear what Google has to say. ;)
QUOTE: "At the end of the day - if you don't want it indexed, then maybe it shouldn't be on the page?"
These are affiliate links and advertisements we're talking about here. I don't see any difference between this content and AdSense content. I'm pretty sure no one--Google included--wants AdSense being indexed.
QUOTE: "If you are happy that a % of users may not benefit from that content - are you sure that the remaining % really need it?"
I'm not sure if anyone really NEEDS to see advertisements.
QUOTE: "Does that content really do your Users any favours/benefit?"
Yes. Without the possibility of my own personal financial gain, the site wouldn't exist at all. And my users would miss out on all the great ideas I'd otherwise write about.
Sorry Autocrat and kidcobra, I disagree. The intent is vis-à-vis the robots. As long as you do not show robots MORE content than you show humans, and if you do not show robots links that can pass pagerank which should not, it's OK.
Just keep in mind that some Flash can be crawled and Googlebot has become quite adept at finding links in on-page JavaScript. Since your JS is external, you are well protected.
QUOTE: "SEOmofo, you are wasting your time. Both of those reasons show an intent to violate Google guidelines and fool and deceive Google."
I am indeed wasting my time...but only because I've already had this conversation. With myself.
Disagreeing is fine.
And to be honest - there are times when I think such things are sensible.
But so far - the OP hasn't said anything that suggests a "valid" reason - other than as ...kidcobra... points out,
it looks like efforts for the sake of manipulation.
There's no real user benefit there at all.
That said - I do think your point of the it not showing "more" to the bots (such as spammy stuffing) is valid,
and likely the main issue of cloaking.
In short - though it "sounds" like "cheating" to me,
it's likely fine :D
Yeah, that's why I added that qualifier too :)
Buying or selling links that pass PageRank is in violation of Google's webmaster guidelines and can negatively impact a site's ranking in search results.
Not all paid links violate our guidelines. Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results. Links purchased for advertising should be designated as such. This can be done in several ways, such as:
* Adding a rel="nofollow" attribute to the <a> tag
* Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
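The two options quoted from the guidelines can be sketched as small helpers. This is purely illustrative; the /out/ path and `to` parameter are made-up conventions, not anything Google prescribes:

```javascript
// Option 1: nofollow the anchor directly, so it passes no PageRank.
function nofollowLink(url, label) {
  return '<a href="' + url + '" rel="nofollow">' + label + '</a>';
}

// Option 2: route the click through an intermediate redirect page that
// robots.txt blocks from crawling, e.g.:
//
//   User-agent: *
//   Disallow: /out/
//
function blockedRedirectLink(url, label) {
  return '<a href="/out/?to=' + encodeURIComponent(url) + '">' + label + '</a>';
}
```

Either way the link still works for users; the difference is only in what search engines are told to do with it.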
Seriously...why can't I find information on how to mark up my Help Forum responses? I'd like to make links, blockquotes, code examples, etc. I just tried HTML and it didn't work, but I see other contributors using formatted text...how are they doing this? And I have no idea what to make of this "Add references" form.
To be honest, this forum UI is extremely lacking and nowhere near intuitive. Surprisingly low quality, coming from Google.
You need to go up a few more levels :)
Are you telling me only the top contributors have the "privilege" of formatting their comments?
That seems absolutely absurd to me, but even if that's the case, why isn't that information posted somewhere obvious?
No, but you need to go up just a few more levels.
Mainly it's a deterrent for spamming with links that can be clicked. The other goodies are just perks for sticking around and contributing to the forum.
I just found something on Maile Ohye's blog that pretty clearly states that my technique is not only acceptable...it's actually something Google engineers are working towards implementing.
Question: "What about ads? The slowest thing on my website costing me the last 7 points to the full 100 in Page Speed is Google’s AdSense ads."
Answer: "One factor that makes ads kind of slow is their use of inline DOM elements like document.write(), which doesn’t allow deferred loading (because the document.write may alter the page’s content, the browser has to wait).
The good news is that Steve Souders, Alex Russell, along with several of our co-workers and many outside developers, are looking into improving the speed of external factors like ads, etc. There are some promising things to keep an eye out for: html5 and its iframe attributes (seamless and srcdoc) and the FRAG tag.
Additionally, asynchronous loading would be a terrific improvement in the ads space. In fact, companies like BuySellAds.com are already using this technique to improve performance for their publishers."
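The asynchronous loading mentioned in the quote is typically done by injecting the ad script element dynamically, so the browser fetches it without blocking parsing or rendering. A minimal sketch, with the document object passed in purely to make the helper easy to test (a real page would just use the global `document`):

```javascript
// Inject an ad script asynchronously after the main content has rendered.
// The script URL is illustrative.
function loadAdScriptAsync(doc, src) {
  var s = doc.createElement('script');
  s.src = src;
  s.async = true; // the browser won't block rendering on this fetch
  doc.head.appendChild(s);
  return s;
}
```

A page would call something like `loadAdScriptAsync(document, 'https://ads.example/ad.js')` from its onload handler.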
Does anyone disagree with my interpretation of this quote? @kidcobra? @Autocrat?
What you're doing is unlikely to matter.
Why you're doing it is actually all the more relevant.
If you want to hide this content because you think it might affect Google's perceived quality of your page then you might get into trouble.
you provided this Source & Quote;
The question is ... why?
You have a site.
You have pages.
Due to legal reasons you have to have some boiler plate copy on many pages.
You are worried that G will take that badly.
a) You can leave it in there as normal, inline content.
b) You could create a file for that content, set it as no index, and load it in via an iframe
You have a site.
You have pages.
You decide you want to rank for 672 different variations of the term "red rabbits".
a) You cram in all those words in the footer of the page.
b) You could create a file for that content, set it as no index, and load it in via an iframe
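Option (b) in both scenarios is the same mechanism. A minimal sketch, with illustrative file names:

```html
<!-- boilerplate.html: the separate file, kept out of the index -->
<!DOCTYPE html>
<html>
  <head>
    <meta name="robots" content="noindex">
  </head>
  <body>
    <p>Standard legal boilerplate goes here...</p>
  </body>
</html>
```

And in the main page:

```html
<iframe src="/boilerplate.html"></iframe>
```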
Similar situations ... same methods.
What do you think G will do?
For the first scenario - G should be fine.
It's pretty good at spotting boiler-plate stuff, and generally ignoring it/devaluing it and focusing on the "main content" instead.
For the second scenario - G would ideally drop kick you past mars!
The "intent" is what G focuses on.
So far - your intent really does seem to point in the direction of "fooling Google".
It seems much more along the lines of scenario 2 than scenario 1.
(That said - both scenarios show a proclivity towards focusing on SEO a little too much :D)
Now - I'm not Stating it's spammy.
I'm not stating that I think you are attempting Spammy techniques.
I'm just saying that so far - that's the impression you are giving.
(And that's to several people)
Now - I'd love for someone from G to comment.
But it isn't likely ... as things change, and they hate making hard-line statements etc.
Further - for every inch they give - thousands of SEOs take it for granted and go a mile with it.
So - it's going to boil down to "you".
You are going to have to sit back, look and ask yourself if you honestly think it is 100% legit.
If so - do it.
If not - don't.
And if you are going to be advising others for it - then get them to ask the same question.
As - depending on situation and intent - the answer is likely to change.
Oh, for goodness sake!
>>> THE USER'S EXPERIENCE:
>>> GOOGLEBOT'S EXPERIENCE:
You forgot to mention:
THE GOOGLE TOOLBAR'S EXPERIENCE:
A couple of visitors with the Google-toolbar activated in their browsers suffice to let Google detect your "intelligent" cloaking technique in an hour's time. It just does not work.
Sorry to disappoint you,
I was just starting to wonder when the cavalry would arrive :)
Yup - there's ...luzie... - riding his Camel!
(Or is that one a Horse? Which one has the hump?)
(No bad attitude jokes about "hump" :D)
I mentioned the quote because it shows that Google doesn't want paid links to pass PageRank. My technique prevents paid links from passing PageRank.
I don't understand your examples. In the first example, you're suggesting that Google doesn't mind if you're "hiding" boilerplate legalese. In the second example, you're suggesting that keyword-stuffing an iframe would result in ranking for those keywords. As far as I know, Google doesn't associate iframe content with the parent document, so this wouldn't work.
Sorry, Becky, but no it's not. Even if it were, that would be irrelevant to the question at hand. Furthermore, I would have expected a more intelligent, more meaningful response from a "Bionic Poster." If you're looking for threads to troll, I'm sure MySpace can afford you that opportunity.
That will do. There's no call for insult hurling.
Specious is the first word that came to mind when seeing this argument.
Google wants to know what you are putting on the page for users. It's the only way they can operate a search engine. The policy and reasons for it are crystal clear and those clear policies directly and specifically point to your plan being a bad idea. You can't get around that by ignoring it. Your plan is best summarized as follows:
If Google gives credit for seeing something on my page, and I don't want to show it to my users, I'll only show it to Google so I can get credit AS IF users were getting the content.
If Google doesn't want something on my page, or gives any kind of nick for having something on page, and I want it there, I'll only show it to users and hide it from Google AS IF the content was not there.
The whole point - to game or trick Google into sending me searchers that it would otherwise not send me.
Say that out loud a few times.
>>Specious is the first word that came to mind when seeing this argument.<<
A nice word Greg, and apt, too.
The cavalry were a bit late because we were waiting for fresh horses; I mean - when did you ever see the cavalry ever arrive at less than at a full gallop or on a dusty, sweaty horse - or indeed any earlier than the eleventh hour?
I think you're confused about the definition of cloaking. I'm not advocating the use of cloaking. Apparently you have some misconceptions about the Google Toolbar as well. If you'd like to explain your theories about how my technique interacts with the Google Toolbar, I'd be happy to criticize them.
Regarding your "white hat" remark...
My technique improves the user experience. If you would like to see evidence of this, feel free to check out the waterfall graph at the bottom of this post:
If you feel that reducing the page load time from 12 seconds to 2 seconds is NOT beneficial to the User, then by all means, go ahead and support that argument. Otherwise, explain why a "white hat SEO" would forgo the obvious benefits this technique provides to Users, simply because "Google won't like it" or because "it's suspicious."
I design websites for humans--not Google. My technique improves the user experience. Ultimately, that's what Google's Webmaster Guidelines are trying to promote. If you're trying to tell me that I should avoid web design techniques that knowingly and intentionally exploit the inherent weaknesses of search engine ranking algorithms--TO THE DETRIMENT OF MY USERS...then I'm sorry, but I disagree. If you think "white hat SEO" means optimizing websites to comply with the literal, taken-out-of-context interpretation of every sentence in Google's Guidelines, then perhaps you're missing the entire point.
I already know my technique benefits users. The only thing I'm unsure about...is whether Google's search quality team is intelligent enough to realize it...or are they as near-sighted and self-centered as the Top Contributors in these Help Forums.
Speaking of speciousness...the first paragraph of your "best summary of my plan" is entirely your own fabrication of truth. In anticipation of these kinds of false assumptions and accusations, I included a section in my description, called "Important details to consider," under which bullet point #2 says:
Furthermore, your summary omits one vital piece of information: my plan benefits users by reducing page load times.
Meh. Darren's approach is perfectly legit, IMHO. I'm even serving search engine crawlers a bold 403 when they request the destination of my (usually condomized) aff links put with client sided scripts. Disallowing a script that serves ads, or cross links et cetera, in robots.txt isn't considered fishy. Actually, that's best practice. Crawlers don't pull out the plastic, so there's no reason to bother them with ads. Also, rendering ads after the content is obviously user friendly.
Yes/No to the load times.
It permits "initial" viewing faster.
The rest of the content is then loaded after.
That doesn't mean you are reducing the load times per se, just that the rendering is staggered.
(Which - admittedly - is nicer than straight loading)
Now - you seem to be getting irate.
Somewhat understandable - as you seem to think that no one is "getting it".
Well - you're wrong.
I'm pretty sure everyone so far has "got it".
What isn't successfully being grasped is this.
You are going to great effort to conceive and implement a method that will intentionally alter how your page, and its placement in the web, is perceived by Google.
That "could" be construed as Manipulation.
Now - I can see "why" you want to do it.
I've made similar points myself.
Progmatic Backlinks, Print page, Accessible view, email to friend, add to wish list .... ALL of those are links.
Quite common ones.
Add to that the various/numerous "meMeME" links some folk insist on having ...
and without any form of "navigation" or "contextual" links ... a page can easily clear 10 links.
That's 10 holes for PR to drip away through - nofollow or not.
In fact - the more you improve your page to be "usable" and to offer a "better experience" ... the more damage you may be doing!
The more you try to "market" with the "meMeME!" sites ... the more damage you may be doing!
So - here's the point.
Where do you draw the line?
At what point do you say
" gonna stop right 'there' - else it's gonna get a bit spammy "
Further - where should G stand?
What should they look at?
There is a line.
No idea where G currently has it.
But attempting to alter what G sees in an effort to alter how it ranks the page ... is probably pretty darn close to it,
and I'm not sure which side of it you would be on.
Does that make sense?
Can you see what we've been trying to get at?
Thank you for that. I needed a good laugh. But if we can be serious for just a moment, then I'd still like to give you--and everyone else in this forum who "gets it"--a chance to answer the following questions:
1. What provides a better user experience:
a. Showing a user the top navigation of your page, waiting 15 seconds for an ad server to time out, and then rendering the rest of the page.
b. Showing the user the main content as soon as possible, loading the ad banners separately in the background, and if the ad server doesn't respond...it silently AND UNOBTRUSIVELY times out...without interrupting the user, who is already reading paragraph #2 of your content.
2. If you answered "a" to question #1, then please explain why you believe a good user experience is one that makes users wait for advertisements to download. If you answered "b" to question #1, then please explain how a webmaster can achieve the same page speed benefits this technique provides...but without manipulating Google's Site Performance metric.
3. If you [correctly] answered "b" for question #1 and [inevitably] could NOT answer question #2, then please explain why a webmaster should make a conscious decision to provide a WORSE user experience, solely for the purpose of NOT manipulating search engines.
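The silent timeout described in option (b) can be sketched as follows. This is a hypothetical illustration; `fetchAdHtml` stands in for whatever callback the ad network actually provides:

```javascript
// Option (b): load ad markup in the background and give up silently if
// the ad server is too slow, without ever blocking the main content.
function injectAds(container, fetchAdHtml, timeoutMs) {
  let settled = false;
  // If the ad server never responds, this timer fires and we simply
  // leave the page as-is -- the user keeps reading undisturbed.
  const timer = setTimeout(() => { settled = true; }, timeoutMs);
  fetchAdHtml((html) => {
    if (settled) return; // too late: already timed out, drop the ad
    clearTimeout(timer);
    settled = true;
    container.innerHTML = html;
  });
}
```

In a page this would run from the onload handler, after the main content has already rendered.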
All the "Top Contributors" have done a fine job of avoiding these questions and stating the obvious. If you truly "get it," then you should have no problems answering those questions. Until then, the only logical interpretation of your collective wisdom...is that webmasters should provide a bad user experience for the sake of Google.
Come to think of it...I'm pretty sure the decision you're advocating is the very definition of black hat SEO.
>>> If you'd like to explain your theories about how my technique ...
Alright. The toolbar can monitor user behaviour. It'll detect that your visitors sometimes leave your site towards another one. How? The bot didn't see any links there? Quite a contradiction, isn't it? Something must be wrong with your site, let's send in the next quality rater available. S/he comes with the source code the bot has crawled in hand - and sees the links the bot had not seen: cloaking. That's exactly what cloaking means: showing the user and the search engine different things. Of course that's against the guidelines. It's also why your call for a Googler will remain in vain here; they don't need to answer, as they already did - in the guidelines. It's all too obvious.
Ehm ... you're participating in an affiliate scheme that needs 12 seconds to deliver an ad??? *Shaking head*.
And, come on ... first you talk about optimizing PR-flow, now your reason for using this age-old trick is a better user experience? If any of that were true, you could open your scripts for the bots, why don't you ...?
That's about it ... :D
I think you're mixing two unrelated topics:
1. You're hiding links from Googlebot. Personally, I wouldn't recommend doing that as it makes it harder to properly understand the relevance of your website. I also doubt that you'd see any kind of visible change with regards to PageRank. Ultimately, that's your choice, just as it would be our choice to review those practices as potential attempts to hide content & links. What's the advantage for the user?
2. You're using late loading of content in an attempt to improve the perceived page loading time. To be honest, given your high level of technical expertise, I'm certain that you could have the full content load within a much lower time without doing that. Personally, I'd go for c) showing the user the full content within less than 2 seconds. It's not impossible -- and not only will Googlebot like it, your users will as well. I find slow loading affiliate sites (regardless of whether or not they're showing some of the content already) a big pain and a frequent sign of low-quality content. Make your site run like lightning all the time.
Have you seen the stats by Yahoo & Amazon on slow sites? There's some great stuff at http://developer.yahoo.com/performance/ and http://code.google.com/speed/ . Also, did you see http://blog.mozilla.com/metrics/2010/04/05/firefox-page-load-speed-–-part-ii/ ? They estimate 60 million more clicks by making a single late-loading image show up 2.2 seconds earlier. In comparison, you're considering delaying the loading of your affiliate content for users by up to 12 seconds (ok, looking at the waterfall diagram you included, it's not quite that long, but you get the idea)...
So personally, instead of doing these tricks, I'd work on creating a long-term infrastructure that would allow you to serve your content at a blazing speed, all the time, without an attempt to do a short-term optimization for Googlebot.
Would it be fair to say then John - it is basically a bad idea to ever try and hide something from Google, whether it is a link, form, hidden element, featured content slider etc?
|This message has been hidden because it was flagged for abuse.|
I can still see where the OP is coming from - up to a point.
If they had focused on PageSpeed, improved User Experience, content refinement etc. ... it wouldn't have been so bad.
Yet instead the primary focus is given as altering how G perceives the page and its relationship on the net - blatant SE Manipulation.
Only after being hammered has the OP switched tune.
I can appreciate them getting narky ... it's always difficult when you hold a view and multiple others disagree with it.
That said - they've popped in here often enough to know that most folk here don't advocate anything that comes close to breaching the Guidelines ... so I don't know why they are surprised.
I disagree entirely, Autocrat - he's repeatedly emphasised that the focus of his suggestions is upon the benefits for the end user in terms of page speed, and that the 'manipulation' of Google is purely a means to an end designed to adhere to the intent of the guidelines (better UE, lobotomising paid links etc), if not the letter of them.
Additionally, he's not advocating breaching guidelines, he's posted an interesting theory on a possible approach to improving his website for users, as well as search engines (in line with various well-hammered-home messages about de-juicing affiliate links in addition to the mantra of 'design for the user'), and merely asked for insight and feedback from somebody who might be able to shed some light on whether, as a tricky grey area subject, it could be kosher if done with the right intent and execution - I imagine he's narky because the vast majority of responses have been copy-paste lifted from the guidelines out of context, and without any thought to the actual issue at hand.
Has anybody actually read the original blog post, considered the implications, and subsequently considered whether this is a 'sensible' or 'ok' approach, regardless of the letter of the law? After all, the Google guidelines are just that, 'guidelines', and let's face it, the web's changed quite a lot recently.
This was a valid question, an interesting, fresh approach tackled with exquisite attention to detail and was presented with a heavy dose of 'Hey guys, what do you think?' - and all we've got is a page of flak, and some quite possibly automated comments about 'doing other, more important things first' which is so often the party line from G when anybody asks questions about specific, detailed technical implementations.
I'm eagerly awaiting feedback from somebody who'll approach and consider this from a bigger perspective.
Question in return - did you actually bother reading through the OP's original blog post and all of this topic?
I have to ask, as I'm thinking you may have missed a few bits.
If you did - you would clearly see the primary motivation is naff all to do with Usability/User Experience.
It flaming well says it in the blog post and in the topic starter here.
The OP wants to find ways to stop Google seeing links and reducing the PR flow due to those links.
Let's look at some of the things you put....
So how does using JS to either
a) Alter the existing content presentation/behaviour
b) Shortcutting the Request/Load process
equate to masking/hiding links?
a) would mean the content isn't removed/added - it is merely adapted/shown differently.
It is still normal content with JS off.
It is still there with JS off.
b) would mean that JS is used instead of normal click and load processes.
It defaults to normal click and load with JS off.
The links are still there with JS off.
Now - considering you seemed to have missed the OP's blatant "PR" statements previously,
I think I'll spell this out for you,
as you may have issues spotting the obvious...
None of the outlined stuff equates to masking/hiding things from Google.
None of them Remove Content.
None of them Hide Content.
None of them stop G seeing Links.
At the end of the day - you cannot legitimise such manipulation for such intent.
Just because you don't like what you are hearing/reading doesn't make it wrong.
You can post a flippant response if you want.
Sure - you can say "bigoted" and "short sighted" and whatever else you want.
Not a problem - you wanna say "knee jerk reaction" and "panic" - go for it.
And I'm sure you fancy making a few personal insults too.
But - unless you are actually gonna sit there and strain your brain to find a Real reason that such a practice should be used, you are wasting time,
and just showing yourself up as one of the people that give SEO a bad name.
And ... a quick step and follow...
I take it the way it was intended;
a) You make sure that your content is Accessible,
that without JS/Flash/CSS/Images - the content is there and readable.
b) You don't attempt to manipulate how G perceives a page in an effort to alter rankings.
Sure - plenty of examples out there where either/both of those aren't the case ... and that's part of the problem with the "modern" stuff - it's normally generated by those without a clue, and followed by similar.
Now - on the flip-side ... yes, things should update.
As I mentioned previously - there are times when some content/links may be somewhat detrimental.
Here's the thing though.
They don't have to change.
G should be able to spot a link trend, and internally ignore those links.
Links to all the "meMeME!" sites should be basically ignored.
Links to "print view" etc. should automatically be ignored.
And - you know what .... G does actually update and alter how it sees/treats such things.
Look at how it treats nav/footer/content links.
No need to change how those are seen - is there?
So .... maybe they already handle some types of links differently too?
Not just location in the code .... but the destination?
And if not yet - then maybe soon?
Virtually everything you said in your response is either false or invalid, and I will gladly disprove/discredit all of it in my next blog post.
I will re-post one of my previous comments, since you and your fellow "Top Contributors" still haven't acknowledged it.
Feel free to post the URL of a page that you are having trouble optimizing to load in less than 15 seconds. I'm pretty sure someone here can help improve the loading and rendering time.
The only thing I'm having trouble understanding...is why Google's Help Forum "Top Contributors" are advising webmasters to avoid manipulating search engines at all costs--even if it means providing a worse User experience.
If you would like to explain to me why you are recommending we build websites for search engines, not for users, then I'd greatly appreciate it.
I think you have to live with your decision to change your site and that such a change needs more load time. You can change it back if you really feel that the load time isn't what you expect. I think if SEO were a form of sport, hiding something would be like doping.
@SEOmofo you're moving the goal posts, and then using non-relevant examples (15 seconds waiting for ad servers). Your original post said that you were deferring the load of your own content, not content from a 3rd party. I doubt this was ever about user experience - if it was you'd just get yourself a faster server, and use other performance improvements.
I'll hang my colours - I imagine this is purely about hiding affiliate links from Googlebot. After reading your original blog post (I'd seen it before) it's clear to me this has nothing to do with user experience. That's perfectly fine, but I think you should be honest with people rather than trying to bend the argument with the user experience card. This argument falls down immediately based on the overhead of introducing links to the page after the HTML has loaded.
I think your intent (as I read it) puts this clearly into the "grey" area. (Note: I'm simply stating what I think you're doing, not offering any opinion on the rights or wrongs of doing so.)
I commented over on your blog also, as I was really surprised to see Jill Whalen commenting that this was "perfectly acceptable to Google".
Personally I think the whole purpose has nothing to do with PR Manipulation or Load Times etc.
It's the OP - attempting to create a stink and get a bit of infamy/notoriety.
I've personally ignored the questions as
a) They are loaded
b) Have little relevance to the initial reasoning for the suggested method
c) Are a flailing attempt by the OP to divert attention
Let's try getting back on topic, shall we?
And - as you seem a fan of the whole "direct questioning" method...
1) Do you deny that the priority of your method/blog post is PR manipulation?
a) Yes - you deny it.
In which case you are obviously having thought-based issues, as the initial part of both your blog post and this topic focused primarily on PR.
b) No - you don't deny it.
In which case I take my hat off to you for actually manning up all of a sudden :D
2) Do you deny that you have spent ages squirming and attempting to deflect attention from 1a) above?
a) Yes - you deny it.
In which case you are possibly going to be nominated for this year's "Fantasy and Fiction" award.
b) No - you don't deny it.
Which leaves me wondering why you've been spending so much time/effort on it so far.
3) Do you deny that you had intentions of creating a scene and envisioned yourself coming out smelling of roses?
a) Yes - you deny it.
In which case - I'm nipping to the bookmakers and making myself rich on you winning not only this upcoming "Fantasy and Fiction" award ... but also taking the "surrealism" and "off this planet" awards too.
b) No - you don't deny it.
In which case, I'm not only taking my hat off to you for manning up ... but also so I can scratch my head in utter disbelief and bewilderment.
You see - I can not only fire off loaded questions - but I can do it with humour and flair too ;)
Here's an analogy for you.
You have just written up an ingenious blog post ... on how to knock off the old lady at the end of the street with the noisy dog.
You started the blog post stating you are tired of the dog and the old lady - and that you intend to bump them off.
You go through explaining how you are going to do it.
You then show people that post.
You then sit there and state you are doing it for the good of the neighbours, and to help the old lady's children with their inheritance.
Yup - it has been that far-fetched so far.
You've said "A" so long ... then when people have questioned it ... started shouting "no no! I meant B!".
Doesn't work that way.
Sure - you could go change your blog post.
But this .... this wonderful topic ... it's gonna remain.
People can see you banging on about manipulating PR, then trying to squirm away from it.
The real shame of it is...
... if you hadn't focused on the PR manipulation to begin with,
and hadn't tried being such a smart butt, things would likely have played out differently.
>>> The only thing I'm having trouble understanding...is why
It's nothing but the easiest thing to do:
1. Use that technique of yours if you need it to speed up rendering, nothing against that ...
I have no idea why you're all saying that I'm trying to change my story or my focus. What thread have YOU been reading? I just reread this entire thread, and virtually every single comment I wrote acknowledges that this linking technique manipulates search engines. From my previous comments:
"The reasons for jumping through these hoops are: (1) PREVENT PASSING PAGERANK THROUGH AFFILIATE LINKS. (2) Decrease the amount of time it takes to load the page."
"If you're trying to tell me that I should avoid web design techniques that KNOWINGLY AND INTENTIONALLY EXPLOIT THE INHERENT WEAKNESSES OF SEARCH ENGINE RANKING ALGORITHMS--to the detriment of my users...then I'm sorry, but I disagree."
"Until then, the only logical interpretation of your collective wisdom...is that webmasters should provide a bad user experience FOR THE SAKE OF GOOGLE."
"The only thing I'm having trouble understanding...is why Google's Help Forum 'Top Contributors' are ADVISING WEBMASTERS TO AVOID MANIPULATING SEARCH ENGINES AT ALL COSTS--even if it means providing a worse User experience."
The reason I continue to bring up the user experience side of things...is because none of you will admit that this technique improves the user experience!
You don't need to tell me I'm manipulating search engines, and you don't need to keep mentioning "intent." The former is obvious and the latter is irrelevant.
Yes, this technique manipulates Google.
No, this technique does not have pure intentions.
The question is...and always has been:
If a web design technique improves the user experience...but also manipulates Google, should a webmaster use it?
In other words:
Does the end justify the means?
In other words:
Should I provide the best user experience possible, or should I provide the best user experience that doesn't manipulate Google?
The whole purpose of my first article was to educate webmasters about how to defer the loading of affiliate links and advertisements. The purpose of the second article was to draw attention to the first article and to start a conversation about its "ethical" implications. The purpose of this thread is to draw attention to both articles, to get feedback from a Google employee, and to generate examples to cite in my next blog post, which I've tentatively titled "Google Help Forum: Where Stupid Advice is Labeled as 'White Hat SEO'".
I can't find the humor or the flair, but I'll answer your questions nonetheless.
No, I do not deny it.
I think I made myself perfectly clear when I wrote this:
As an SEO–the best in the world, as a matter of fact–my primary concern is avoiding the potential “evaporation” of PageRank that might occur as the result of nofollowed affiliate links in my page content. Therefore, my super-advanced SEO solution must NOT rely on the use of the nofollow attribute.
Personally, my favorite part of this technique is that it prevents the loss of PageRank, without using the nofollow attribute. I don't believe hosting advertisements should require me to destroy my own PageRank. I also don't believe in relying on the rel="nofollow" attribute. It's clear to me that this attribute is not handled uniformly by all crawlers. And Google hasn't been completely transparent about how they handle it, so in my opinion...it can't be trusted.
Not everyone feels the same way, however. For some, their priority might be the page speed benefits. Still others might use it for its ability to simplify the management of code and ad campaigns (something I'll be discussing in Part 2 of the "Advanced SEO for Affiliate Marketing" series).
Yes, I deny it.
I think perhaps the veterans of this forum are so used to webmasters trying to hide suspicious activity that you don't know what honesty looks like anymore. I haven't squirmed nor deflected anything. The only thing I haven't addressed yet is luzie's misinformation about how the Google Toolbar works. Frankly, I don't think I'm willing to spend the time to explain something that doesn't affect the question at hand. I will still explain it...but not in this thread.
The only people squirming and deflecting are the "Top Contributors" and John Mueller. You all avoided the quote from Maile Ohye's blog, and I have yet to hear anyone acknowledge the fact that lazy-loading nonessential content improves the user experience. John flat out denied it.
My previous comment contains 10 examples of me acknowledging that this technique manipulates Google. There are probably several dozen more examples in my 2 articles.
So not only do I deny this absurdity...but I also think it's so far from reality that it discredits you and makes me second-guess everything you've ever said.
No, I do not deny it.
My second article, several of my Twitter updates, and this thread...were all partially motivated by my desire to ruffle some feathers and to get attention for myself and for my website. However, if you're implying that these were wholly motivated by such things, or that these motivations somehow jeopardized the integrity of my responses...then yes, I deny it. I stand behind everything I've said, and I am genuinely interested in finding an answer to my question(s).
Wow - you lack the ability to make a cogent argument, don't you?
How on earth can this improve user experience? It can only slow the page load. This isn't lazy loading of links. Maile was talking about non-essential content, whereas you're using this to manipulate content with no benefit to the user.
First you tell us this is for user experience, then you tell us this is about PR manipulation, and now you're going back to user experience. Stop trying to paint this as something it's not - you're making a fool of yourself.
Actually there's really no point in having any more discussion with you - you're obviously only willing to believe yourself.
Maile was talking about ads, just as I am. The technique defers the loading of non-essential content until after the essential content is done loading. The user benefits by getting their content first, without having to wait for advertisements to download. I don't know how much simpler I can put this. If you still don't understand how this benefits users at this point, then you're probably never going to. And that's perfectly okay. Not everyone is born with the cognitive wherewithal to grasp these concepts. I'm sure you more than make up for it with your artistic abilities and emotional sensitivity.
I already addressed this laughable notion in my previous comment. Please read the entire thread carefully before you comment. That way you can avoid repeating the same ridiculous arguments your colleagues have unsuccessfully attempted to make.
I'm now pretty much convinced...
1) The OP isn't actually reading what is posted,
but merely looking for potential negatives to whinge about.
2) The OP is still squirming (Being called out on their feeble attempt at it being UX, they now jump back to the PR?).
3) They are attempting to get as much "sympathy" support as possible whilst grabbing attention.
a) if you cannot handle people not agreeing with you
b) if you cannot sustain a reason-based discourse
c) if you cannot actually read what people put
d) if you cannot face reality
then what do you think you are doing here?
You got an answer from a Google Employee.
You complained about it?
Come back when you've grown up.
Do you really need one of us to go through all your posts in this topic?
Would it help if we showed you when/where you got called out for PR Manipulation,
then you switched to claiming it was UX, then went back to PR?
Would that help you see what we see?
Do you want someone to hold your hand whilst that is done for you?
Maybe help wipe your nose too?
Or how about you just call it quits?
You came here - you posted, you got responses, and a response from a GE.
You haven't apparently liked a single one.
You aren't going to like anything that follows either.
So just walk away.
I'm sorry, but I'm not going to waste any more time disproving your outlandish theories. You aren't providing any new or useful information; you're just trying to be offensive and disruptive. This thread is clearly beyond the scope of your experience and technical knowledge, and I can't keep slowing down just so you can catch up. I encourage you to move on to another thread, where perhaps you might be of some use to somebody.
|This message has been hidden because it was flagged for abuse.|
I absolutely agree with you. The thing that troubles me is that John Mueller failed to mention this. Instead, he made false statements and misquoted his only source. Thank you for actually taking the time to understand the question. :)
How did JM make False Statements?
Which ones were false?
What did JM Misquote?
What is the original source of the quote - and what is it meant to be quoted as/pertaining to?
You are sat there all angry and lashing out - and to be honest, it's laughable.
You've completely turned about-face (again).
You've completely ignored the posts where people have given credit, given support, stated understanding.
You've completely disregarded the various Valid issues pointed out.
It's not a discussion.
It's you sat there, with your fingers in your ears, screaming away.
As you aren't liking what you are being told/shown/questioned - you scream all the louder.
Not conducive to making anything close to being a valid discourse.
I've said it before, and I'll say it again,
(and I'll type slower for you) ...
.... I can see where you are coming from.
There are numerous forms of links that may have undesirable negatives on the performance of a page/site.
But we simply have to accept that G handles them how G wants ... and have a bit of faith (and I admit to lacking that in most cases) that G will figure/spot such links, and handle them differently to "normal" links.
It's G that needs to update/change - not us.
|This message has been hidden because it was flagged for abuse.|
>> I already addressed this laughable notion in my previous comment. Please read the entire thread carefully before you comment. That way you can avoid repeating the same ridiculous arguments your colleagues have unsuccessfully attempted to make.
Your blog posts all stated this technique was used for hiding affiliate links. Again you're changing the stated goal to suit your own argument.
If you want to defer 3rd party content fine, and yes it will likely improve UX.
If you just want to hide your aff links from Googlebot that's also fine, but don't paint it as something done to improve UX if you want to be taken seriously.
>>> none of you will admit that this
I readily admitted it (did you miss that?). I just don't understand why a disallow in robots.txt is necessary for the task. As long as you circumvent this question again and again we're discussing in vain ...
>>> ... Hey Matt Cutts I'm hiding links from Google, cool?
*LOL* - you're 5 to 10 years too late. We used the trick (without asking Matt Cutts silly questions) ages ago, that's why I'm so confident in telling you it won't work :-)
My two cents, @SEOmofo:
1. Extremely interesting question: very borderline, and thus -by definition- intriguing for any open-minded SEO. :) It's not the first time I see someone ask it, though: if you're interested, I'll be glad to point you to at least a couple of threads on discussion boards on this side of the Atlantic where Italian SEOs posed the very same question (regarding the "legitimacy" of hiding content from bots) years ago... Your question is a bit more complex, though, because it involves the UX side of things, and ultimately poses an even bigger, more profound question: to what extent can search engine optimization and user-centered design go hand in hand? Chapeau. I'm afraid I've got no easy answer to that, my friend. :]
2. "Will Google frown on this"? It depends. In slippery cases like this, I believe it all boils down to a matter of trust, and inferred intent. Let's try to break it down into simpler, easier-to-answer questions. Can the disallowed external JS technique be used for legitimate purposes (e.g., preventing ads from being crawled)? Absolutely: as Sebastian already pointed out, blocking ad servers via robots.txt is actually best practice. Is that cloaking? Technically speaking, yes. Can cloaking be legit? Yes. Can Google afford to trust anyone hiding content from them, regardless of their intent? No (they would be spammed to death if they did). Can Google read a webmaster's mind? Not yet. :P Do they (Google) try to infer intent from context? I'm sure they do their best - and yes, I also think toolbar data might help with that (and would be interested to know why you think it doesn't)... As JohnMu told you: it's your right to block things from Google's view, sonny; however, it's their right not to trust you completely when you don't let them take a peek at the whole picture. It's the great trade-off of SEO: it seems you'll have to surrender your privacy and sacrifice some of "your" hard-earned PageRank in order to be fully trusted by Google. ;)
I think Autocrat nailed it with this response:
> There is a line.
> No idea where G currently has it.
> But attempting to alter what G sees in an effort to alter how it ranks the page ... is probably pretty darn close to it,
> and I'm not sure which side of it you would be on.
We cannot say with any certainty whether Google would approve this behaviour. There would appear to be a real risk that they won't, with possible dire results for your listing. If you think that's a risk worth taking, fine, but it does answer one of your other questions:
They do so because it is good advice. The potential risks outweigh the benefits of any attempted manipulation, so you're recommended not to do it. You may still decide that it's worth the risk - good luck! I hope you don't need it.
Jolly well put.
But Matt Cutts advocates "Forget the search engines." So why do you care what google thinks? Do what is best for your user... "If you build it they will come."
... but since you are here at Google's door then you do care what they think... Sounds like a personal decision that we will not be able to help you with.
But don't mind me, I am just the pest control guy.
How long does it take the Gbot to crawl a site? If it's more than 15secs, surely it'll notice the ads..
Or am I too green to know what I'm talking about.?.
I get nothing for that Domain ... and you don't exactly give a clue as to what we are meant to find.
Wouldn't be a spam post for a new domain would it?
I have some similar code to Darren, but my code
1. Converts all anchor links on a page that include nofollows to encoded spans
I haven't released the code - I have actually sat on it for 3 or 4 months & the initial work (as in thought process) started maybe 12 months ago
Intent has been discussed here heavily...
Some forums serve Google content that has most of the links, especially user-generated content links, stripped out.
Advertising systems such as Kontera add keyword-based links to the content that search engines can't see - the equivalent on a WordPress blog currently using affiliate links is that the best you can do is use nofollow and block them via robots.txt.
I have helped sites with SEO that have huge amounts of user generated content currently with all the links nofollow, passing through tracking links, maybe using onclick events to direct the user (as an alternative to the tracking links) with most of the links blocked with robots.txt
Sites use AJAX to add content blocks for navigation they don't want Google to see/crawl, though that isn't very useful - that is especially true if they want content served dynamically based upon behavioural factors.
Google (well, Matt Cutts) stated that links using nofollow can evaporate PageRank. Lots of people came up with methods to remove links totally from comments on blogs, or to only have them appear after a certain number of comments - when you have a blog post with 100+ comments and 400 pingbacks/trackbacks that isn't ranking, you start thinking about ways to fix that problem. No information has been forthcoming regarding the reset vector, and strangely, every time I add a question for Matt when he calls for questions via Moderator, my question gets a whole load of negative votes, as if someone doesn't want to open up a can of worms.
A plugin like this does open up the possibility of PageRank manipulation on a scale similar to what some news sites were doing with nofollow - which I have never advocated - but at the same time it allows testing of the "reset vector" and what is happening, if Google chooses not to be forthcoming. The same would apply to the reset vector in regard to robots.txt (e.g. how much of the evaporated PageRank might rain on a very nearby mountain).
In some ways my solution could be looked at as the next step from Darren's but there are issues, the biggest being accessibility.
There are various ways to handle accessibility, the easiest is to give members links that are not coded though it would also be possible to not encode links based on the status of a cookie.
My intent is very clear - for Google not to count links that would otherwise have a nofollow, primarily for user generated content as that is what affects me the most, though I can see it being used for affiliate links as well.
The encryption method ensures nothing breaks as anchors can have all kinds of valid/invalid code within.
I plan to release this code in the next week or so, as it has been burning a hole in my HD for a while wanting to get out.
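To make the description concrete, here is a minimal sketch of the server-side idea (my own illustration, not the unreleased plugin; the span class, data attribute, and base64 encoding scheme are all assumptions):

```python
import base64
import re

# Hypothetical sketch: nofollowed <a> tags are base64-encoded into <span>s,
# so crawlers see no link markup, while client-side script could decode
# them back into real links for human visitors.
NOFOLLOW_ANCHOR = re.compile(
    r'<a\b[^>]*rel=["\'][^"\']*nofollow[^"\']*["\'][^>]*>.*?</a>',
    re.IGNORECASE | re.DOTALL,
)

def encode_nofollow_links(html: str) -> str:
    def _encode(match):
        payload = base64.b64encode(match.group(0).encode("utf-8")).decode("ascii")
        return '<span class="enc-link" data-enc="%s"></span>' % payload
    return NOFOLLOW_ANCHOR.sub(_encode, html)

sample = '<p>See <a href="http://example.com/" rel="nofollow">this</a> for more.</p>'
print(encode_nofollow_links(sample))
```

A real implementation would use an HTML parser rather than a regex, but the principle is the same: the page Google fetches contains no anchor markup for those links.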
Do you have a link to where he said this? It doesn't sound very likely. If PR is passed down links that are "nofollowed", what's the point of nofollow?
I take it your code is running on the server and tweaking your html before it reaches the browser. It all sounds like quite a palaver to do what rel=nofollow does anyway.
I really think that instead of worrying about hiding things from Google, your time is better spent generating good content for them to see.
|This message has been hidden because it was flagged for abuse.|
Sorry it has taken me a while to get back to this, and I didn't get any reply notifications.
Matt's statement was very well publicised through most webmaster channels
rel="nofollow" is not working the way you think it is.
There has been no further clarification from Google, especially in regards to links within UGC such as comments.
Generating UGC that gets tons of comments and links has never been a problem, but when it gets more and more comments, and tons of links, and you see rankings go down... around the time Google makes changes, then you wonder what action should really be taken.
Here is an example:-
There are over 400 links on that page from user generated content - I could just 301 redirect all the links to a new clean page with no comments, then change the slug on the page and link to comments.
But then I would suck as much as most of the large news organizations.
Or I could strip out all links just like people using Vbulletin.
I am serving Google exactly the same content a user sees, just Google can't read all of it
The code I have isn't much of a palaver, I have a WordPress plugin version that does everything for you without even an interface. It adds some time to page generation, but hopefully people cache pages or use APC etc.
1) NoFollow works exactly how it should - it tells GoogleBot to ignore that URL (so it won't add it to its crawling list).
2) What NoFollow doesn't do (but originally did) is exclude the link from the total of links on a page for calculating PR flow.
People originally used it to "sculpt" PR.
Say you had a page with PR5.
The page had 10 links: 5 nofollowed, 5 normal.
It used to be that PR would flow through the 5 normal links, after being divided by the number of normal links
(PR5 divided by 5 = 1 PR flow, etc.).
But for the last year or so, the nofollowed links are counted too
(PR5 divided by 10 = 0.5 PR flow).
[Note: Figures are for example only - not to be taken as accurate/not to be taken as correct]
3) Why do a 301 Redirect on such links?
Why not internally process the links, and force them through a single "redirect" script?
(One of the most common practices for the last X number of years)
(Thus a link to http://www.example.com/ would become http://www.yoursite.com/script.php?url=http://www.example.com/)
4) You shouldn't be permitting anonymous people to post links without vetting/moderating them in the first place.
Stripping links that are valid, related and worthwhile is a negative to the site in some cases,
but may help reduce spam/crummy link drops
5) You could look at "white listing" methods - auto-block/strip links, add "good URLs" to a list/DB table,
and let the script check against that. Over time ... those that contribute and are seen as reliable are automatically permitted; those that are unknown etc. don't get any benefit without your say-so.
(You could even go the route of 3 tiering - auto-strip, permit and pass to redirect, permit and pass straight to URL)
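The three-tier idea could be sketched roughly like this (a hypothetical illustration; the domain lists and the redirect path are invented for the example):

```python
from urllib.parse import urlparse

# Hypothetical three-tier handling of user-submitted links:
# trusted domains get a direct link, known-but-unvetted domains are
# passed through a local redirect script, unknown domains are stripped.
TRUSTED = {"example.org"}   # permit, and pass straight to the URL
KNOWN = {"example.com"}     # permit, but route via the redirect script

def classify_link(url: str) -> str:
    domain = urlparse(url).hostname or ""
    if domain in TRUSTED:
        return url                            # direct, followed link
    if domain in KNOWN:
        return "/redirect.php?url=" + url     # via the redirect script
    return ""                                 # unknown: auto-strip

print(classify_link("http://example.org/page"))
print(classify_link("http://spam.example/page"))
```

In practice the whitelist would live in a database table and be updated as commenters prove themselves reliable.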
Now now - stop trying to justify ... it's lame.
GoogleBot does Not watch videos.
All it sees is the URL to such a file, and hopefully some sort of contextual content to help rank it (as well as links to it etc.).
At the end of the day - the primary principle being suggested here (repeatedly!) is:
Do Not attempt to manipulate your rankings by altering content shown to GoogleBot vs. users.
That simple, really.
And remember - it's not that people don't understand the reasoning/logic/desire etc.,
just that we advise good practice instead of risky.
|This message has been hidden because it was flagged for abuse.|
You really have no idea how Google (Bing, Yahoo, etc.) handles the nofollow attribute. Anybody can say "googlebot ignores nofollowed links," but that's a "For Dummies" explanation that conveniently avoids the technical details of how search engines work. There's a huge difference between "not adding a URL to the crawl queue" and "ignoring a link." Google obviously doesn't ignore any recognizable links.
RE: "Do Not attempt to manipulate your rankings by altering content shown to GoogleBot vs. Users. That simple really."
You don't seem to understand the difference between the raw HTML of a web page vs. the rendered content. Cloaking and "showing different content to users" are both referring to the raw data returned in a server response. The point Andy was making is that Google, web browsers, and users will never UNDERSTAND content in exactly the same way. There's nothing to justify--web technology is outpacing Google's ability to process it intelligently. Google is going to get "tricked" by new web design trends, regardless of anyone's intent. Telling everyone to stop innovating isn't a solution.
Erm ... okay.
So how does Google handle nofollow?
Care to step up and educate?
Of course not.
NoFollow means exactly what it says on the tin.
Google will Not pay any attention to the link's destination URL in regards to using it as a source for URL crawling,
nor for attributing value from the linking page to the destination URL.
G have been explaining that from day one.
Why some people seem to struggle with that concept I have no idea.
No one is suggesting nor stating that you stop "innovating".
What people are suggesting is that people stop attempting to Manipulate.
Do Not do things in an effort to contrive greater rankings by preventing Google from seeing content you show users.
You said it yourself (several times - then attempted to backtrack and cover) that you intended this methodology to prevent loss of PR.
The User aspect came after that primary goal.
So you yourself admitted to a blatant attempt to manipulate your rankings by altering the content you show to GoogleBot vs. users.
Are you going to deny you said that?
Are you going to deny that is what you put on your blog post?
Are you going to deny that is what you posted here?
So - on the one hand ... we have a number of people advocating following Google's Guidelines and not attempting to manipulate rankings with strange tricks,
and on the other we have a number of people advocating blatant attempts at manipulating content in an effort to alter PR flow etc.
The bit that seems to be eluding you is ... many folk can see why/where you are coming from.
Though people (such as myself) appreciate the thoughts and efforts, and understand why you want to do it etc.,
you instead sit there railing at us.
If you paid attention and looked around - you'd see that a fair number of folk that contribute here point out issues/errors/screwups to Google, and argue for improvements etc.
None of us are "Google Lovers".
We all support the net-community and strive to make things better.
What we don't do is go around advising people breach the guidelines etc. though.
We can hardly rant at G to make improvements whilst standing on the wrong side of the fence, can we?
You'll also notice that we are fairly careful with how we present/phrase things - so that we aren't to be seen as being against the guidelines (something you failed at).
If you had focused solely/primarily on the user experience ... on improving perceived load times, altering the presentation for a better experience etc.,
this topic may have gone a different way entirely.
Instead - you breached, then bleated.
Please - don't try blaming others for your mistakes and misunderstandings,
and try not to assume others are as lacking in comprehension as you may be.
Thanks for the link, but I still don't see Matt Cutts stating anywhere that "links using nofollow can evaporate PageRank". You may think that's the implication of what he said, but that makes it your opinion not his.
Matt Cutts: "Instead of the PageRank flowing around naturally on your site, some of it just sort of evaporates or disappears."
Never ceases to amaze me how people manage to take what they want from the G Docs/Vids etc.
1) That was referring to internal links.
2) He was explaining that using NoFollow can result in PR not flowing through links.
3) He was stating that you should avoid using NF on internal links, as it can have an unforeseen/damaging impact on how PR flows through your site.
Yes - he did use the term "evaporate" etc.
But the context of that usage was in regards to faffing around attempting to PR-sculpt, and it having a negative effect rather than the controlled effect people assume.
Or did you hear/see something utterly different in that video?
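For what it's worth, the "evaporation" arithmetic being described can be sketched with the same illustrative figures used earlier in the thread (a PR5 page with 5 normal and 5 nofollowed links; not Google's actual algorithm):

```python
# Illustrative PageRank-flow arithmetic only -- not Google's real algorithm.
page_pr = 5.0
normal_links = 5
nofollowed_links = 5

# Old behaviour: nofollowed links were excluded from the divisor,
# so all of the flowable PR was shared among the normal links.
old_flow_per_link = page_pr / normal_links

# Described current behaviour: nofollowed links still count toward the
# divisor, so their share of PR simply "evaporates".
new_flow_per_link = page_pr / (normal_links + nofollowed_links)
evaporated = new_flow_per_link * nofollowed_links

print(old_flow_per_link, new_flow_per_link, evaporated)  # 1.0 0.5 2.5
```

The point of the example is only that the divisor changed, not that these specific figures reflect real PR values.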
I posted the video in response to Chris. He said he couldn't find an example of Matt Cutts saying that nofollow evaporates PageRank. I provided him with an example. Your comment didn't deserve a response, but if you're feeling left out...you can always recycle one of my previous replies.
"Absolutely: as Sebastian already pointed out, blocking ad servers via robots.txt is actually best-practice."
How would one go about blocking external ad servers with a robots.txt?
1.) Link to an intermediate URL (on a domain you control) instead of directly to the ad server URL.
2.) Configure your server (e.g. via .htaccess) to redirect the intermediate URL to the ad server URL.
3.) Disallow the intermediate URL in robots.txt.
Google is disallowed from the intermediate URL, so it never requests it from your server. Therefore, Google never sees the ad server URL that's being redirected to.
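Put together, the three steps above might look like this (illustrative paths and domains only; the /goto/ prefix and the rewrite rule are my own examples):

```
# robots.txt -- step 3: disallow the intermediate URL
User-agent: *
Disallow: /goto/

# .htaccess -- step 2: redirect the intermediate URL to the ad server
RewriteEngine On
RewriteRule ^goto/sponsor$ http://adserver.example.com/banner [R=302,L]

# in the page -- step 1: link to the intermediate URL
# <a href="/goto/sponsor">Sponsor</a>
```

Compliant crawlers never request /goto/sponsor, so they never see the 302 target.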
1) Nofollow works exactly how Google are interpreting it currently, which might not be the same as Yahoo or Microsoft
2) That isn't the whole calculation - Google haven't stated what happens with the "reset vector" - testing which is what Google assume people in the SEO field do at any kind of scale requires a way to take links out of the link graph without it affecting users.
3) I don't think you quite understood what I was explaining.
If you have an article with 500+ external UGC links on it due to lots of trackbacks, one option is to recreate the page without the comments and trackbacks. There are a number of ways to handle it depending on platform and programming.
On WordPress the easiest way is to change the slug of the original to "old-original-slug", and then create a new page/post with the same content, and potentially link through to the old content as a new page just for comments - even with the original comments removed.
Alternatively the new page could have a new slug, with juice 301 redirected - this is the best option if you are revamping the content and want to promote it in social media as something new.
Also, just because something is common practice doesn't mean it is good for users or search engines - one of the reasons I created the code in the first place was as an example implementation for a site that had a few million redirects indexed... that were blocked with robots.txt. The best way to keep them out of search would be to let Google crawl the redirect if the redirect somehow gained link juice.
4) "Stripping links that are valid, related and worthwhile is a negative to the site in some cases, but may help reduce spam/crummy link drops"
Very rarely with a spammer I just strip the link & email address (to ensure future comments are still counted as first time), normally to deride them.
The majority of links are not relevant to the discussion, but are relevant to the person. Regular community members get followed links (I was an instigator of the "dofollow movement" after all)
The links from occasional commenters are primarily valuable for disclosure purposes to understand more of a person's opinion, and if someone wants to contact them.
5) See above
Lame examples? Those were just more examples, as I had already mentioned comment systems - do you need more, or more specific, things mentioned?
Many review systems used for ecommerce sites
Most social media buttons which include a count - the count data is not available to search engines, and nor are any links, either to the form or to various search results (such as Tweetmeme or Topsy)
Individual voting icons using onclick
Recent visitor widgets such as Blogcatalog, Mybloglog etc
Facebook community and fan page widgets
Recent tweet widgets
Google Friend Connect
A whole load of this "content" appears on a page, includes information and links that humans can see and interact with, but are not available to Google and other search bots.
Problem - you have a page that isn't ranking because it has 500+ links on it that are nofollow from user generated content and trackbacks.
The alternative solutions
For the following options, comments = trackbacks:
1. have no links for comments/trackbacks at all
2. have links to a profile of comment history - I don't really want to rank for people's names just because they leave a comment
3. have comments on a different page - poor user experience
I genuinely think 6. is the best user experience and possibly the best option for Google.
Lots of sites take option 1 - you can't discover the conversation elsewhere and the comments are probably worthless anyway - do the comments on perezhilton.com really offer Google something of value?
Option 4. is becoming predominant on many large sites.
That magical phrase you are quoting is primarily intended for wholesale cloaking
This is the code currently used on the official Google blog for trackback/blogback links
<a class='comment-link' href='' id='Blog1_backlinks-create-link' target='_blank'>
Yet a human visitor can see 3 nofollow links
So the official Google blog is cloaking those links? Right?
"Should I provide the best user experience possible, or should I provide the best user experience that doesn't manipulate Google?"
I'd just like to comment on this... as has been said, you have a technique that slightly improves user experience, and you use that to justify manipulating Googlebot; however, there are better ways to get the improvement to the user experience without deluding Googlebot...
so answer this question: if you had to choose between #1 bad user experience and no deluding of Googlebot, #2 half-decent user experience and manipulating Googlebot, or #3 best user experience and perfect indexation by Googlebot, which would you choose? So far you seem to prefer #2 (logically not #1, but why not go for #3?)
as for the loading times, I think simply using AJAX to load your ads would serve both your users and Googlebot better (replacing all those span tags by a tags is going to be a performance issue, if you're entirely honest...)
My apologies ... I utterly missed your response (which I actually enjoyed :D)
Erm ... well, yup :D Cannot disagree with that at all.
Again - cannot disagree, and wouldn't if I could.
I've argued more than a few times that there seems to be a conflict between UserFocus (usability/User Experience) and SEO.
The number of links like you suggest is one such area (not just things like comments, but large dropdown menus, printer versions, email-a-friend links etc.)
I don't think using JS for anything is "good".
Not on its own.
I honestly believe that JS is for bells/whistles and for enhancing/improving.
There should Always be a default method in place... so No JS only commenting system.
(currently having a bit of an argument with G about something similar at the moment - G seems to think it is fine to tell Us not to cloak etc., but believes it is fine for Them to provide JS-only things (go look at the commenting via Knol!).)
Not 100% sure on this ... we are beginning to talk about 2 slightly different things,
and the differences are important.
(Remember - the origin of this topic boiled down to intentionally manipulating G's perception of a page for SE Gain - Not about usage/UserFocus being detrimental to SEO.)
But in general, again, I agree.
Spam is spam, and should be deterred as well as prevented whenever possible.
DoFollow (which has caused WAY too much confusion!) is a good concept ... so long as it is monitored.
At no point have G (or any of us here?) ever stated you should block all links.
What is stated is that you should only permit links of quality/worth.
That means monitoring for spam, and removing it.
It is Not G's fault that many people don't monitor and act.
5) See above
(Not sure if the bit that follows pertains to (5) or not?)
Now ... here's the bit I struggle with.
Your option 6 .... will result in your still being crawled/indexed/ranked for the content of the comments,
including peoples names etc.
Further ... as with so many commenting systems, the user part is likely to be a link to a profile anyway.
Then there is the issue that some folks (such as myself) may not have JS enabled, so don't get the benefit either.
So you may still face - to a certain degree - some of the issues you list, even with option 6.
Of them all,
I think the best method is to have the majority of comments on a different URL (/comments1 or &commentspg1 etc.)
That leaves you a Pure page for ranking/viewing, without massive load-up times etc.
If people want to read comments, they can click a button/link to do so.
THEN you can opt for JS to load it up on the same page etc. (and remember, there would still be the actual/normal page if JS is disabled).
You could also look at using Noindex and limited Links in the template to focus any PR back, or attempt to use the CLE to pass value back etc.
As for the Google Blog comments ... don't get me started.
At the end of the day, G Corp is rapidly becoming a * hypocrite in regards to such things,
(again, look at the Knol commenting system, then do so with JS disabled).
But as pointed out previously,
this is kind of a different perspective/focus than the original issue of this topic.
This topic is meant to be focusing on using JS to alter content as a means of SE Manipulation.
Whereas most of what we have just been discussing is more towards the SEO vs UF.
(There's a few topics touching on such issues, inc. one with ...Vanessa Fox... and ...JohnMu... posting,
you may want to review that and start commenting .... or generate a new Topic focusing on the SEO vs UF? Could be an interesting one :D)
Why does Google in this very forum cloak its own links???? Is that not deceptive????
For example a simple link to Youtube:
Late to the party here.
I'm with SEOmofo and Andy Beard on this one. There are some very real issues here that need to be addressed; Darren's approach is not "evil" and should not be discounted so swiftly.
Actually, the links are 302 redirected. Check the headers.
1) We are discussing Hiding Links/Content from Googlebot, but showing it to Users,
and focusing on the aspect of doing this to alter how G perceives the site/pages/content.
(I make the additional point of the focus, as it is different than simply saying "hiding some stuff"!)
We are Not discussing redirecting URLs, nor tracking methods etc.
2) How is that a Cloaked link?
Further - the point about the JS was whether the file was being blocked from G so that they could not ascertain the behaviour of the file and possibly identify spammy/questionable behaviour.
3) Please do Make sure you read things through thoroughly before jumping in.
We bring up "intent" as Google brings up intent.
There are Numerous topics around here, as well as a few of the MC vids,
where the word "intent" is used.
If the site is reviewed by G at any point, and is deemed as intending to manipulate,
then it may well get slammed.
So - anything "clever" that is done should only be done for the benefit of the User etc.
Doing stuff and intending it to manipulate rankings is dodgy/dangerous, as well as going against the guidelines.
Does all that make sense,
or should I start typing slower for you?
Tracking, Comments etc. has no bearing on "This" discussion.
"This" discussion is about a highly questionable practice of intentionally altering the content that G sees so as to alter the PageRank of pages/a site.
If you want to have a discussion about SEO vs. UserFocus - then feel free to start a new topic about that.
Webmaster intent isn't machine readable, and individual site reviews aren't scalable. Sooner or later, Google will have to face the reality that static HTML content is becoming the exception in a world of dynamic, asynchronous content and applications.
RE: benefit of the User
RE: Does all that make sense, or should I start typing slower for you?
Yes, please type slower. Your onslaught of unique ideas and technical wizardry is too much for me to digest all at once.
Hi SEOmofo. It's been a while.
About your last answer:
"Webmaster intent isn't machine readable..."
That is completely untrue. Of course intent is machine readable. G may read your intent wrong, just like a person might read it wrong. And G may be better at reading intent in some areas than others, just like a person might be better at being alerted by some cues than others. But intent is read countless times by machines all over the world every day. I don't want to waste your time with a lot of examples, but I'm sure if you think about it, you will realize your statement is incorrect.
And a final note, you sound like a pretty smart person ... and I mean that seriously. I would think you probably have a lot of positive and productive things you could be doing. Good luck going forward.
RE: Of course intent is machine readable.
Yes, intent in general is machine readable--even human emotions are machine readable--but I'm talking specifically about Google's ability to discern whether a page was designed by a webmaster with the intention of deceiving googlebot vs. the intention of lazy-loading irrelevant advertisements to improve the user experience. A perfect example is the Disqus comment management system. If I use it on my site because it improves the UX and helps pages load faster, then it's okay. But as soon as I'm aware of its inherent ability to "hide" all the user-generated comments and author links from Google...suddenly it's NOT okay? Ignorance is righteousness?
That defeats the whole (speed-related) purpose of asynchronously loading the content in the first place. The idea is to remove the advertising-related content from the HTML so that it doesn't delay the rendering of the important content (i.e. the content the visitor wants to see).
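As a rough sketch of the pattern being described - injecting the affiliate banner only after the page has finished rendering, so a slow or dead affiliate server can never delay the main content - something like the following would work. The element ID, file paths, and URLs here are all illustrative, not taken from the actual site:

```html
<!-- Empty placeholder in the served HTML; no ad markup is present at load time -->
<div id="ad-slot"></div>
<script>
// After onload, build the banner link and drop it into the placeholder.
// If the affiliate server never responds, this fails silently: the rest
// of the page has already rendered.
window.onload = function () {
  var link = document.createElement("a");
  link.href = "http://affiliate.example.com/offer";       // placeholder URL
  var banner = document.createElement("img");
  banner.src = "http://affiliate.example.com/banner.gif"; // placeholder URL
  banner.alt = "Advertisement";
  link.appendChild(banner);
  document.getElementById("ad-slot").appendChild(link);
};
</script>
```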
Some people simply cannot handle being told that they are wrong, or that others deem them wrong.
I liked the Intent example, and agree with it too.
No one is saying that G is a mind reader,
and if you bothered to actually read things,
you'd have seen some of us state that G appear to use a "sliding scale" sort of approach.
Though this isn't G actually deciding what your intentions are/were - it does mean they are attempting to guess those intentions.
There are some things you simply don't do, some things you shouldn't, some things that can be done other ways, and some things that appear perfectly safe/fine.
Some things move it along more than others.
If you reach certain points, you may trip some flags.
I assume that certain flags result in certain checks (automated and manual).
As for the comments/loading some content only with JS ...
again - some of that may not pertain to This discussion.
Remember - we are talking about your spammy attempts to cloak links so as to manipulate PR.
The User-Focus aspects were simply ideas you threw in afterwards in an attempt to legitimise things.
(Which suggests that you already knew you were thinking something bad - else why attempt to justify it?)
Now, personally, I'm more than a little bored with your pathetic immaturity.
So unless someone else posts something interesting/worthwhile (or miracles occur and you somehow manage a cognitive/coherent argument without backpedaling/lying),
then I'm simply walking away from you.
I am deeply saddened to see you go. Each second you're gone is like a thousand years of torture and heartache. This world is meaningless without you, my love.
HI SEOmofo. You said this in response to my question shown below with your answer:
My understanding (having never used a noscript tag) is that the information placed there will only load if the browser being used is an older browser that does not support JS, or if JS is turned off; such a browser doesn't execute the script tag and instead renders the content inside the noscript tag. Browsers that support JS and have it on will load the JS and ignore the noscript tag. But of course, G will see what is inside the noscript tag when they crawl your page, and by inference know what is in the JS that it cannot see directly. By using a noscript tag, you are not slowing down users with JS (who are the only ones that can view the stuff in your JS as it stands now), you are enhancing the user experience for those people with older browsers or with JS turned off, and you are allowing G to see what is in the JS. Unless I am missing the boat (known to happen of course), it is a win for the users, a win for G, and puts you in compliance with the first principle of showing G and users the same thing.
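The pattern described above would look roughly like this (the script path and affiliate URL are placeholders):

```html
<!-- Browsers with JS run the external script and ignore the noscript block;
     browsers without JS (and crawlers that don't execute scripts) render
     the noscript fallback instead -->
<script src="/ads.js"></script>
<noscript>
  <a href="http://affiliate.example.com/offer" rel="nofollow">Sponsored link</a>
</noscript>
```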
You make some good points, but IMO the noscript tag is not the ideal solution for affiliate marketing links. Here are a few reasons/scenarios that come to mind:
1. Affiliate links usually = banners
If you are part of an affiliate program/network, there's a good chance you'll be loading remotely-hosted banners, images, flash files, etc. Really, it doesn't matter what the content is; simply embedding an advertiser's resources in your web page creates a single point of failure. If a resource fails to load and it's "synchronously coded," all your content below that resource will refuse to load until the affiliate server sends a response (404, 5**, etc.) or times out. Therefore, even if we disregard the theoretical benefits to SEO (e.g. "PageRank sculpting"), asynchronously loading 3rd-party content is still a best practice, because it severs your site's dependence on web servers you don't control. As my original post demonstrates, asynchronously-loaded content fails silently and unobtrusively.
(BTW, before anyone chimes in with something like...
"Well then don't load any external resources--just put a plain text link in the noscript tag and add the rel="nofollow" attribute to it!"
...let me mention that advertisers usually have a very specific set of ad banners they want you to use, so plain text links aren't always an acceptable format. Plus, the idea of creating crawlable links just to turn around and nofollow them--that seems a bit ridiculous to me.)
2. Noscript tag content is still cloaking/spammy/manipulative/[enter Autocrat-generated accusation here]
To be perfectly honest, I would never attempt to "not manipulate Google" by putting affiliate links/content in the noscript tag. That's just asking for trouble. In other words, if I made a deliberate effort to avoid "hiding links" from Google...by providing a noscript alternative (e.g. a brief description of the original ad banner content + a nofollowed plain-text affiliate link)...I would have to have faith in Google's engineers' abilities to understand my intent. I would have to convince them (and their algorithms) that I'm not manipulating Google to achieve better rankings--I simply unengineered my web page so it doesn't gain any theoretical benefits over my technically-unsavvy competitors.
What I'm trying to say is...I can't think of a safe (noscript) alternative to affiliate banners that wouldn't be criticized just as much as the other techniques mentioned in this thread.
And yes...that last sentence was a shoutout to my man, Autocrat, who is definitely still following this thread and just can't stop thinking about me.
SEOmofo was right all along.
HI SEOmofo. Nice video.
I didn't see the part where all webmaster guidelines go out the window, and that it's OK to show Google one thing and users another. Assumedly you would not think it would be OK to violate 5 other guidelines in conjunction with this technique, so I would think it's a dangerous assumption to think you can violate the one we have been talking about.
No one is disagreeing that it speeds things up. But especially here, where he emphasizes that the speed thing is a nearly non-existent issue for most sites except outlierly slow sites (new word there :), it doesn't seem plausible that it can be used as a reason to violate other longstanding and significant guidelines that are important factors for all sites! And here, there is specific guidance in existence from Google on this very point, which is to let Google know what you are showing users, and that guidance is specifically given in conjunction with the use of JS.
In any event, I'm sure more information will be forthcoming (news from G, news from a site that got hit hard with a penalty, etc.) that will clear it up before too long. By the way, what is your web address where you are using this technique... maybe we can speed things up :) ... just a joke there ..... and a play on words at the same time ........
>>> 1. Googlebot requests my web page.
>>> 2. The web page is served without any affiliate links or content.
>>> 3. My external .js file is blocked by robots.txt.
>>> 4. Google never sees any affiliate links or content.
You've tried to sell that as a brand new trick to fool Google, look at what your intent is:
>>> Second of all, bandwidth costs money, and I’m not obligated to serve Googlebot
Ok. Why do you bother about what Google does with it or does not? In the end it's your JS-file. Hide it "to save bandwidth" (LOL) or don't. Of course you're "not obligated to serve Googlebot anything"; you could as well block everything from being crawled and thereby get rid of any trouble you may have with Google right away. Why not rely on Bing and Yahoo if you don't like the way Google treats your d... JS-file :-(
Open the thing to the bot and this futile discussion is ended. Block it and you're cloaking. There's no arguing in between.
You really should be given the bandwidth-saviour Nobel prize ... :D
why can't you load that content asynchronously without hiding where it is coming from?
for instance i always use this on my links when i use AJAX
<a href="http://www.google.com" onclick="AJAX(this.href, targetDiv); return false;">click!</a>
why can't you do that instead of using the span tags, replacing them by a tags, restoring the href, and so on...
as for 'hiding' the js file using robots.txt: if I were Google I'd have a simple solution - JS is loaded on this page, ok, what's in it? I can't see? ok, the page is now out of my list... now if they did that, would you still use this technique to hide your links and ads?
by all means try to save bandwidth and load slow shit asynchronously, but don't purposely hide links and then discuss whether that is to save users time or to cloak....
Geez, after reading all that there still isn't closure.
I'm like totally new to this whole PageRank thing, but this thread really injected a load of information into my brain. Now it's time to see how much I understand.
Say now I have a blog with loads of useful posts. To generate income I also have hundreds of affiliate links. When people read a page, blog or whatever, anything that's not in the region of the paragraph gets blurred out, be it ads, pictures, random words. Therefore, ads on a blog hardly prevent users from reading posts. They'll notice the ads when they scroll down, but not when they're reading.
Now, from what I've read it appears that Google prohibits you from filtering PageRank through specific links to affiliate sites, since there's the possibility that those sites could be paying you to do that, and that's no different from paying someone to place your website at the top of the SERP.
Apparently webmasters prevented this by placing a nofollow on those links to stop PageRank from being passed on to affiliates, in accordance with Google's Guidelines/Laws/Whatever.
Then little Johnny pops up and realizes, "Hey, nofollow's can be used to boost the PageRank of specific pages of my website. I don't care how irrelevant my web pages are, I simply want my web page to rank higher in the SERP." And thus PageRank Sculpting came to be.
Bear with me, I'm simply arranging my thoughts using my imagination and this forum. If anything I said is wrong, then please correct me.
Now because of PageRank Sculpting, Google changed the way NoFollow works. According to the words of Matt Cutts, if you use NoFollows with internal links, then your PageRank just sort of evaporates/disappears, so that PageRank Sculpting may be hindered. I also noticed, as someone else also mentioned, that Matt didn't say specifically that it also applies to external links, although I'm not sure yet what implications it has. Matt may have neglected to mention that it also applies to external links. I'm sure he was trying to explain the nofollow change in as simple terms as possible, but he may not have realized just how seriously people took his words. Well, whatever.
From what I've gathered it appears that PageRank can pass through your blog and get passed onto another website. If the other website is an affiliate, then doing this is prohibited, so you stop Google from crawling the affiliate's site and giving them PageRank. But passing on PageRank is a bad idea in the first place since it (apparently) reduces the PageRank of...
Alright, my brain has finally entered replay mode to make more sense of what I'm thinking. This basically means I've got too many thoughts being processed and I need to give my brain a Kit Kat. I'll revise my post when my brain can focus again.
So over and out.
well the gist of it is: whether you can or cannot do this is cloudy/gray territory (let's just say it's not exactly white hat..), and whether or not you should do it is basically (imo, and I think most people share that opinion) a No, because there are better ways to do the parts of this which are good for your user, like slow-loading ads etc. (but those lack the desired SEO effects, like links never being shown to Googlebot).
hope that helps ;)
PS. if I'm wrong by all means someone correct me xD
Not wrong at all.
At the end of the day ... attempting to manipulate Google is what SEO is about.
But the more you do, the more blatant it is, the bigger the difference between GoogleBot and User views,
the greater the risk.
you "can" breach the guidelines.
You can run multiple sites and interlink.
You can buy/sell a few links.
You can keyword stuff a little.
You can hide some content from users and display only for bots.
You can have content geared towards SEs rather than Users.
You can use Canonical Link elements to sculpt PR/Relevancy.
You can use 301 redirects to retain and redirect Value when you remove content.
You can do all these things - and more.
The question is ... where do you think G draws the line,
and how do you think they are going to react?
At the end of the day ... G isn't overly concerned about little cheats.
It's the ones that seriously impact the results for many Users they seem to focus on.
(personally I think it sucks - cheating is cheating ... whether you steal £5 or £1,000,000 - it's stealing etc.)
So a little "wandering" in SE endeavours is kind of "safe" (though not advisable!).
But when you cross the line ... you may suffer, and suffer hard/long.
We get them here every so often.
They bought links, stole content, generated 99 different sites for the same thing, linked between their 99 sites and hired 5 shoddy Indian SEO companies to dump 7,500 links in the wilds etc.
And they WHINGE like mad because they got caught and suffered ... and want it all back.
They refuse to accept that G isn't penalising them, just ignoring all the value/benefit from their scummy little tricks.
I see this as little different.
The OP openly stated it was for manipulation of PR for the purpose of rankings.
Personally - I think they could get away with it ... quite easily too.
But I don't condone it at all, and think that suggesting it to others is unfair.
Considering even a Google Employee has passed comment on it, and it's being ignored,
should shed a little light on the value of such a method and its deviser.
I know this is all advanced SEO :), but back to basics is the newly updated Google SEO starter guide, which is more my speed anyway. Here is a quote straight out of the guide:
"To remain within our guidelines, you should serve the same content to Googlebot as a typical desktop user would see, and the same content to Googlebot-Mobile as you would to the browser on a typical mobile device."
Here is a link to the entire thing:
RE: I didn't see the part where all webmaster guidelines go out the window, and that it's OK to show Google one thing and users another.
This is where I think we need some clarification from Google. In your opinion, I'm showing Google something different than I'm showing users--but in my opinion, I'm showing them the exact same thing. This difference of opinions stems from Google's FUD definition of "cloaking." Is cloaking limited to server-side tactics (I say yes), or does it also include client-side tactics? And if we're going to put browser-dependent content under the microscope, then where do we draw the line? Is image text cloaking? Are videos cloaking? Is the canvas tag cloaking? The entire point of this thread is/was to get a definitive answer from Google.
RE: But especially here, where he emphasizes that the speed thing is a nearly non existent issue for most sites. except outlierly slow sites (new word there :), it doesn't seem plausible that it can be used as a reason to violate other longstanding and significant guidelines that are important factors for all sites!
I hate to nitpick, but this point is invalid because you've blurred the definitions of words like "issue" and "factor." Speed is not a significant ranking factor, but it's a very important usability issue.
It is abundantly clear that you have nothing new to add to this thread, so please stop polluting it with your off-topic remarks. You've already made the following points:
* I don't have pure intentions.
* This is not a new tactic.
* I'm not clever.
* I am consciously manipulating Google.
How about this...I agree with all of the above. I hereby acknowledge your points and accept them as truths. Furthermore, I credit you alone as the mastermind responsible for establishing these truths, and they are good, and you are good, and I only wish I were as good as you are.
Done. Moving on.
RE: why can't you load that content asynchronously without hiding where it is coming from?
Apparently you don't understand what we're talking about here. You're suggesting to use a link to load affiliate ads? So instead of loading the affiliate banners when the page loads, I should just provide a link in my content that says something like "click here to load an affiliate advertisement"?
Even if your solution made any sense whatsoever, you'd still be recommending something that violates the well-known best practice of separating content, style, and behavior. Plus, depending on the size of the site and the CMS being used, your solution might be prohibitively expensive.
RE: as for 'hiding' the js file using robots.txt: if I were Google I'd have a simple solution - JS is loaded on this page, ok, what's in it? I can't see? ok, the page is now out of my list...
Quite the contrary, Google's founders intentionally programmed their search engine to be able to rank web documents without ever "seeing" them. This is why blocking a web page in robots.txt does NOT prevent it from showing up in Google's search results. In other words, Google ranks entire web pages without seeing any content, dependent files, or embedded objects. Needless to say, de-indexing web pages simply because they reference a blocked file would be a very bad idea, and we are all fortunate that you are in fact NOT Google.
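For reference, the robots.txt rule under discussion would look something like this (the file path is illustrative); as noted above, it only stops compliant crawlers from fetching the file's contents - it does not remove the referencing pages, or even the blocked URL itself, from the index:

```
# Block all crawlers from fetching the external ads script
User-agent: *
Disallow: /js/ads.js
```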
RE: ...whether or not you should do it is basically ... a No because there are better ways to do the parts of this which are good for your user, like slow-loading ads etc.
I'm still waiting for someone to explain these "better ways." Until then...you are wrong.
RE: Considering even a Google Employee has passed comment on it, and it's being ignored,
First of all...welcome back. I missed you.
Second of all...John's comment was extremely misleading and contained several flaws. I'll be highlighting said flaws in an upcoming blog post, but my site is tied up at the moment. I'm not ignoring his comment; I'm waiting until I can address it in front of a larger audience.
RE: To remain within our guidelines, you should serve the same content to Googlebot as a typical desktop user would see...
Again, what exactly is the difference between serve the same content and show the same content? I'm doing the former; the latter is impossible.
To paraphrase: For true believers, no evidence is needed. For skeptics, no evidence is enough. Time will tell.
>>> In your opinion, I'm showing Google something different than I'm showing users--but in my opinion ... [bla bla]
Yeah! That's exactly what you're boasting about :-)
>>> I'm showing them the exact same thing.
Now what ... ?
... and don't you complain about me "polluting" your d*** thread, as it's you yourself who's getting completely messed up in contradicting statements about what you're showing where to whom and why and when ... show one thing to everybody - and you're done ^^
... a spammer's life is hard, isn't it?!
i wasn't talking about the way you loaded your ads - that's a fine idea. i was talking about you replacing your a tags by span tags and then replacing them by a tags client-side using JS. THAT seems like a ridiculous idea, and in some cases highly annoying for your users, with hiding those links from Google as its only purpose.
so what IS this thread about: the way you load your ads, or the way you display your links? because the discussion has been about both (that, or I'm going crazy - both are quite possible, since the only thing keeping my eyes open for the past month is caffeine...)
ok NVM me going crazy (i turned "affiliate links" into "links" - can't you just call them ads? :P), I'm entirely fine with what you do xD
though if the problem is the speed of the ads, why not cache them on your server?
though if you wanted to be kind, why not put an a tag linking to the ad in the spot where you load it later - you do want your JS-disabled users to at least be able to click relevant ads, right?
Does this violate Google's Webmaster/Quality Guidelines in any way?
well, as you've pointed out, I'm not Google xD but I think they'd prefer the link
great story :P
and i have a nice analogy for you all.
compare websites with Formula One cars: if I were to read in the newspaper about a race or a specific car, would I be interested in knowing there was Marlboro plastered all over the car? no, I would not be. I'd just like to know who won, or how fast the car goes, or what it can do.
rather than asking the reporter to ignore the advertisements, you simply show them the same car - but rather than plastering Marlboro all over it, you release it at the point before you plaster the ads on there (w/o the ads), because in this case the reporter doesn't know (or need to check for) the difference between the Marlboro logo and the small tag with the car's price on it...
just one last thing: on your site you in some cases load an ad in the middle of other content. can you either set width and height on these elements or keep them outside of text? (the text is moving as the ads are loaded in; though this only took a split second and I hadn't even started reading yet, it might decrease user experience for people with a slower connection.)
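The width/height suggestion above amounts to reserving the banner's footprint in the placeholder itself, so the surrounding text doesn't reflow when the ad arrives; a sketch (the element ID and dimensions are illustrative):

```html
<!-- The slot occupies its final 300x250 footprint before the ad loads,
     so late-arriving banners can't shift the text around it -->
<div id="ad-slot" style="width: 300px; height: 250px;"></div>
```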
and the only thing I still find weird is why block it using robots.txt (I don't see any good reason to do so; ok, neither do I see a reason not to, but why do something pointless?)
This has probably been said before, but it's worth summarizing.
Ultimately, it comes down to this: Google engineers have to figure out how to get their crawlers to see and understand page content in the same way humans do. The algorithms ultimately need to determine which content is useful and which content is not useful, and rank accordingly. In order to prepare for the eventualities of the future, if you assume that the system will eventually be perfect, then it only makes sense that the crawler needs to see things in exactly the same way a user sees things. Putting a blindfold on the crawlers today, even partially, prevents this perfect future from ever happening. So, when you say you are only trying to "help"... instead, you are "hurting". It's like the person who tries to "help" the fire fighters by running in and trying to put out the fire by himself, but only "hurts" them by being another potential fire victim to save. Just build your website for users (ignoring the fact that bots and crawlers and algorithms exist), give the crawlers access to the same things users see, and leave the improving of the algorithms to Google.