|Fetch and Render: Partial vs Complete||KGonzo||5/27/14 10:29 AM|
I have a few clarification questions about the new Fetch and Render tool:
Any clarifications would be much appreciated!
|Re: Fetch and Render: Partial vs Complete||StevieD_Web||5/27/14 3:05 PM|
I have played with the new fetch feature on a number of different sites.
Partial means Google was unable to read/decipher all of the components trying to load because they were blocked by robots.txt (or possibly another cause).
Complete means the entire site loaded without error.
Complete is great.
Partial is not automatically bad as what failed to load may be intentional on the part of the site owner.
As to the follow-up question, I don't know if there is a limit on the size/amount of data that can be rendered.
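For anyone who wants to check which resources robots.txt would block before running a fetch, here is a minimal sketch using Python's standard `urllib.robotparser`; the rules and URLs below are made up for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt rules -- substitute your site's real file
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /scripts/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A blocked sub-resource is what turns a render from Complete into Partial
print(rp.can_fetch("Googlebot", "https://example.com/scripts/app.js"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/style.css"))       # True
```

Googlebot falls under the `*` group here, so anything under /scripts/ would show up as "blocked by robots.txt" in the render report.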
|Re: Fetch and Render: Partial vs Complete||black belt||5/27/14 5:12 PM|
|Re: Fetch and Render: Partial vs Complete||KGonzo||5/28/14 9:58 AM|
I read the blog post yesterday and figured as much regarding Partial; I was just looking for clarification on that. Thanks! Luckily all the "blocked by robots.txt" entries I've come across have been from third parties (mostly AdSense).
If the rendered image it displays really is everything it can render, though, then that is a concern. As I mentioned, it seems to cut off the bottom of the page occasionally.
|Re: Fetch and Render: Partial vs Complete||StevieD_Web||5/28/14 12:33 PM|
> As I mentioned, it seems to cut off the bottom of the page occasionally.
Then it would appear there is a cut off (limit) on the size of the page.
Do you have an estimated size (MB) or line count?
|Re: Fetch and Render: Partial vs Complete||KGonzo||5/28/14 3:05 PM|
I just came across this answer page: https://support.google.com/webmasters/answer/158587
It doesn't specifically state if this applies to the rendered image of the page, but considering they are limiting what text is returned it is probably safe to assume they limit the rest as well.
|Re: Fetch and Render: Partial vs Complete||Henrik Erlandsson||5/30/14 1:13 AM|
Just click on the fetched row and you'll see what's blocked by robots.txt. Normally it's CSS, images and fonts (even googlefonts.com fonts!), which is quite in order. No point in indexing fragmentary design elements.
|Re: Fetch and Render: Partial vs Complete||PamS1234||6/23/14 8:55 AM|
What concerns me is that I have read about people who actually tested these partial robots.txt blocks and found a definite connection with some rankings dropping after Panda 4.0.
In my case, it is actually Google serving a JS file for AdWords remarketing that is being blocked via robots.txt.
So I was just curious: why would Google put this tool in place that flat out says do not block any CSS and JS, when Google itself is doing the blocking? And what's worse, it is via a paid service, which we spend a fair amount of money on. Then we get knocked down in organic rankings via Google's own guidelines. If that were truly the case, then Google is saying, 'spend money, but don't worry, we will keep your organic rankings down so you will continue to have to spend more money to be seen.'
It's pretty sad that I think this way. :)
Any ideas on this?
|Re: Fetch and Render: Partial vs Complete||StevieD_Web||6/24/14 9:38 PM|
> I have read about people who actually tested these partial robots.txt blocks and found a definite connection with some rankings dropping after Panda 4.0.
Yep. And I got a date with Scarlett Johansson, so everything one reads on the interwebs must be true.
PS: You might want to read up on small-sample bias, and remember that correlation does not equal causation.
|Re: Fetch and Render: Partial vs Complete||dmb123||7/5/14 2:05 PM|
I have a partial return for the first time and I have dropped in the Google rankings, though I am still indexed.
At the same time I received a 'partial' return, I fell, after nine months on page one of Google, to not being ranked.
I do not know anything about a date with Scarlett Johansson ... but I believe you are on to something definitive here. StevieD sounds misinformed.
|Re: Fetch and Render: Partial vs Complete||webado||7/5/14 4:14 PM|
|Re: Fetch and Render: Partial vs Complete||Lee Hampton||7/9/14 4:48 AM|
Any advice on how to rectify the partial when it is due to Google Fonts? It seems very odd that their robots.txt would block their own fonts.
This is our error: http://fonts.googleapis.com/css?family=Source+Sans+Pro:400,300italic,300,400italic,600,600italic
Thanks in advance
|Re: Fetch and Render: Partial vs Complete||webado||7/9/14 5:05 AM|
There's nothing that needs rectifying for that. It has no impact whatsoever on your site's indexing and ranking. Fancy fonts are just that: fancy. An element of styling. If the font isn't available you should have a fall-back font anyway in your stylesheet.
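For example, with the Source Sans Pro font from the URL above, a stacked declaration keeps the page readable even when Google's font CSS is blocked; the generic fall-backs chosen here are just one reasonable option:

```css
/* If Source Sans Pro fails to load, the browser falls through the stack */
body {
    font-family: "Source Sans Pro", Helvetica, Arial, sans-serif;
}
```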
|Re: Fetch and Render: Partial vs Complete||Lee Hampton||7/9/14 5:45 AM|
thanks for the advice and super super quick response :)
|Re: Fetch and Render: Partial vs Complete||medical-news||7/19/14 5:29 PM|
Thanks a lot for asking this question, and for the answers from all the others.
I don't use Google Fonts, because I think they are not very fast to load.
|Re: Fetch and Render: Partial vs Complete||Andrew713||7/20/14 9:22 AM|
I am always getting endless partials. The only page recently that did come back 'complete' was the only one without Google+ code. All the partial ones were listed with Google's own code denied by robots.txt (for Google+). Recently I also re-added AddThis to my site and that was also listed as blocked by robots.txt. So unless I remove the Google+ code and the AddThis, I assume that all the fetches will end up partial.
The page itself seems to render fine and looks complete; it's just the scripts that are blocked.
|Re: Fetch and Render: Partial vs Complete||webado||7/20/14 9:30 AM|
It's not a problem.
|Re: Fetch and Render: Partial vs Complete||theBirchy||8/4/14 9:44 PM|
It sure as heck needs rectifying when it generates crawl errors, which kill traffic. I lost 80% of my traffic in less than a week because of Google Fonts triggering crawl errors. It took over a day to find out that Fetch and Fetch & Render give different results. Thankfully, the traffic recovered as quickly as it died, after I marked the crawl errors as fixed in WMT.
There also *seems* to be a big lag in the Fetch system, because it took several hours for changes made on my site to show different results in the Fetch and Render results (which makes no sense; the Fetch should be real time. Is it caching my robots.txt file?).
|Re: Fetch and Render: Partial vs Complete||webado||8/5/14 4:52 AM|
The only crawl errors that are reported concern files and urls from your site, not third party content.
You don't get crawl errors from Google fonts unless you have used them incorrectly and they appear to be files from your own site rather than third party external ones.
The robots.txt file that's used in Webmaster Tools is usually cached for 24 hours. For real crawling the current robots.txt file is used.
You can see the version used under Crawl > robots.txt Tester. To the far right you have a link to the current live robots.txt file.
|Re: Fetch and Render: Partial vs Complete||bhavesh desai||9/19/14 11:17 AM|
Webado, you are right: "You don't get crawl errors from Google fonts unless you have used them incorrectly and they appear to be files from your own site rather than third party external ones."
Otherwise the same problem would appear on all the pages, not only on a few.
My website's home page is Complete but a few inner pages are Partial. The reason is the same: Google Fonts blocked by robots.txt.
How do I rectify this? I am not a developer, but if I get the solution, I will have my developer implement it.
|Re: Fetch and Render: Partial vs Complete||agirlandamac||10/7/14 8:20 AM|
See if you can determine what the source of the block is. Are you blocking a lot of CSS and scripts, or is just one thing shown? I can get a complete render until I add Google's script for remarketing. Then I get a partial. So Google is what's blocking Google.
|Re: Fetch and Render: Partial vs Complete||webado||10/7/14 5:58 PM|
>>The reason is the same: Google Fonts blocked by robots.txt. How do I rectify this?
You don't need to rectify anything. It's OK. As long as your own content is not blocked (or if it's blocked, you understand why and it makes sense) you need not do anything.
|Re: Fetch and Render: Partial vs Complete||bhavesh desai||10/12/14 7:39 AM|
Sorry for my late reply.
Let me work it out with my developer!
|Re: Fetch and Render: Partial vs Complete||webado||10/12/14 8:33 AM|
Crawl errors are not the same as partial rendering in Fetch and render.
|Re: Fetch and Render: Partial vs Complete||Chimezie Gabriel Nwatarali||11/6/14 4:27 AM|
FYI, it doesn't read the Google Maps script. This is intentional on the part of Google.
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 1:16 PM|
I currently have the same problem. All the pages I fetch and render are only rendered PARTIAL.
I do not have anything special on my page. I use a standard WordPress theme, just text and images. My articles are usually very long, 7k words plus.
After rendering in Webmaster Tools, I can clearly see that only about 25% of my content is rendered. The rest is not there.
So what do I have to do so that my pages are rendered COMPLETE?
Here is my page:
as you can see, it's super simple. nothing crazy at all.
I also read that you should check your robots.txt file. Ok, I did; mine looks like this.
So that's about it. As you can see, I have a simple page with text and a few images. But Google doesn't fetch and render the page 100%.
What can I do so that the status jumps from PARTIAL to COMPLETE?
And what am I doing wrong?
Thank you so much for any little help, appreciate it !
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 2:23 PM|
It tells you why it's partial.
It tells you what is being blocked.
Your robots.txt file may need to be expanded a bit:
Allowing more than that is probably useless as far as Fetch and Render goes.
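As a rough sketch, an expanded robots.txt for a standard WordPress install that unblocks the theme's CSS and scripts often looks something like the following; the exact paths are assumptions and depend on the site:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/themes/
Allow: /wp-content/uploads/
Allow: /wp-includes/js/
```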
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 2:32 PM|
Web Page Speed Report
Object Size Totals
*Note that these download times are based on the full connection rate for ISDN and T1 connections. Modem connections (56Kbps or less) are corrected by a packet loss factor of 0.7. All download times include delays due to round-trip latency with an average of 0.2 seconds per object. With 56 total objects for this page, that computes to a total lag time due to latency of 11.2 seconds. Note also that this download time calculation does not take into account delays due to XHTML parsing and rendering.
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 2:36 PM|
It shows poor performance both for Desktop and Mobile.
Images need to be optimized as well.
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 7:03 PM|
Puuhh.. ok.. well.. I am not a coder, and I am not a pro web developer. I am simply using WordPress.
However, I checked a few things and was wondering why the report is like that.
For instance: when you go to the page I mentioned, I can only count 8 images.
These images have been compressed with Photoshop to the lowest web-optimized image quality: 0.
The images are about 20-40 KB each, which is already super compressed, but the report still tells me to compress them more.
What are objects? I simply don't know.
Like I mentioned above, you can also check it.. you'll find only 8 images...
As far as I know, I only have one CSS file.. this CSS file comes with WordPress... I am using the standard Twenty Fourteen WordPress theme.
My page is slightly over 1 MB.. which is, in my opinion, pretty good for 7k words... what the heck?
So my page size is 1.088157 MB, to be specific. Can you see what I mean?
The count is 16 for external scripts.. ok.. I am assuming these are all JavaScript files. I read that I should compress them.. ok.. but I don't know how to do so. I read once about the WordPress plugin W3 Total Cache.
Does this plugin compress the scripts? If not, what do I have to do then?
@ACTIVATE BROWSER CACHING:
Uff.. I read the recommendation to activate the browser cache.. but heck, how do I do it? I have no freaking idea.
Maybe there are some video tutorials somewhere?
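On Apache (which most shared WordPress hosts run, though that's an assumption about the server), browser caching is typically activated with `mod_expires` directives in the site's .htaccess file, along these lines:

```apache
# Cache static assets in the visitor's browser (requires mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Caching plugins such as W3 Total Cache can usually write these rules for you.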
THEN, I FOUND THE MAIN REASON WHY THE RENDERING STOPS:
Good.. but the question is: HOW?
And last but not least, I was told I should minify all scripts and CSS files.
Also, I don't know how you can minify something within WordPress.
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 7:13 PM|
ohh and I updated my robots.txt
but still having the same issue:
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 7:18 PM|
Just because you are using an official WordPress theme doesn't mean it's a good, optimized theme.
7K words don't greatly increase the size of the html markup. Images, scripts and css do.
And yes, indeed maybe break up your page content ... 7K words by themselves are not too much as far as indexing goes, but for visitors it's quite a lot.
I'll take a look at the images and let you know what I find. It may be that while their quality is optimized, they are still too large, perhaps because they are not sized to the display.
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 7:24 PM|
The fact is that all your images can be optimized by anywhere from 25% to 50%, and maybe even more, quite easily using an image optimizer.
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 7:25 PM|
hmm.. I see what you mean... well, I just installed W3 Total Cache.. maybe that helps a bit.
However, I guess the most important thing is the fact that I have to
Hmm.. I think that's the key.. if I can fix that issue, the page will not be rendered PARTIAL anymore, but COMPLETE.
But that's my problem. I have no clue how I can do that.
What do I have to do to eliminate render-blocking JavaScript and CSS in above-the-fold content?
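For what it's worth, one common approach (a sketch only; the script path here is illustrative, not taken from this theme) is to load scripts with the `defer` attribute so they stop blocking the first paint:

```html
<!-- Deferred scripts download in parallel and run only after parsing,
     so they no longer block the initial render -->
<script src="/wp-content/themes/twentyfourteen/js/functions.js" defer></script>

<!-- Stylesheet loaded normally; the few rules needed for above-the-fold
     content can instead be inlined in a <style> block in the <head> -->
<link rel="stylesheet" href="/wp-content/themes/twentyfourteen/style.css">
```

In WordPress this attribute is normally added where the theme or a plugin enqueues the script, rather than by editing the HTML directly.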
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 7:25 PM|
Webmaster Tools may not have cached the new robots.txt file yet.
Still, from Fetch and Render you can tell just what is deemed to be blocked, and you can see whether it's being blocked currently and whether it matters.
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 7:26 PM|
>>What do I have to do to eliminate render-blocking JavaScript and CSS in above-the-fold content?
Don't worry about that for now.
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 7:29 PM|
Still only PARTIAL, but better now.. got about 50% of the page rendered.. still missing an important 50% of the content.. dang...
here the result of the PARTIAL rendering:
Googlebot couldn't get all resources for this page. Here's a list:
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 7:39 PM|
Ok.. what's actually the difference between fetching and rendering?
I mean, when I look at my result.. I can clearly see that my page is not fully rendered.. only partial.. and the fetched code looks different..
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 7:46 PM|
Fetching is when Google gets and reads the content.
Rendering is when it also shows you on the screen what it THINKS it looks like on the chosen type of device, i.e. desktop or mobile, the way a standard browser would.
If the page is very large, long or heavy it will not render all of it because the time and memory it allots to that step is limited.
As I told you, a page that's 1 MB is really too big. It could never load in a timely manner on a low-memory/low-speed device, like my old clunker PC at work.
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 7:53 PM|
hmm.. ok.. but what to do then ?
How to fix that technical issue ?
I am really concerned about ranking. I was focusing on high quality content, also with more text.
I just checked another site, which is very good.. gives me about 4k users per month.
And I am shocked now... why ?
I could see that the Google cache has my page from October.. but I have added so much more content to the page since then.. which it looks like wasn't even considered by Google?
Holy molly.. so what do you suggest ?
What should I do then ?
Just make the pages smaller ?
It's weird that Google can't handle a 7k+ word page.. with nothing special on it..
that's the cache info from the page...
and this is the real page today:
of course.. I fetched and rendered it.. and also.. same issue.. just PARTIAL
I don't know what to do anymore.. but it looks like Google cannot index the entire content... right?
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 7:57 PM|
So the cached one has only 30 points, which looks like it is what Google has indexed so far....
but my new one already has 57 points....
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 8:00 PM|
It's a 1 MB page, Bernhard; too, too large. Believe me.
But Google will get around to caching it if you're not preventing it.
Mind you, that damn floating div that pops up is very annoying.
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 8:08 PM|
hmm.. ok.. so I have to make it smaller.. maybe 0.5 MB?
You say.. I should prevent Google from caching it?
Hmm.. I don't know what you mean...
A floating div? What floating div do you mean?
I don't mind at all; any criticism helps me make it better.. please.. that's great...
I just don't know what you mean, or what you are referring to with the floating div.
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 8:17 PM|
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 8:58 PM|
Ohh, this one.. ok.. I see.. yeah.. the conversion rate is not very big on this one.. so I guess it's better to remove it...
But finally.. what should I do about the PARTIAL render?
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 9:09 PM|
Nothing; don't worry about the partial render. It's all explained by the robots.txt file, which is fine.
|Re: Fetch and Render: Partial vs Complete||remozben||11/26/14 9:19 PM|
So that means, even if it's partially rendered, Google still gets the whole content and indexes everything?
|Re: Fetch and Render: Partial vs Complete||webado||11/26/14 9:35 PM|
|Re: Fetch and Render: Partial vs Complete||remozben||11/27/14 4:59 PM|
Yes, but still... look at that issue:
This is my original page, with 57 points.
But Google cached only 30 points:
That makes a big difference. It looks like lots of my content hasn't been recognized by Google, or not even indexed...
Or can't we say that from the example above?
I mean, does Google still recognize and read ALL OF MY CONTENT, all my 57 points?
Or does GOOGLE NOT GET ALL THE CONTENT?
|Re: Fetch and Render: Partial vs Complete||webado||11/27/14 7:24 PM|
That cache is from Oct 20. The cache has not yet been refreshed.
There appears to be a lag right now where many/most/all sites have their page caches stuck around Oct 20/21 . This, I have been told, is not to be considered a real problem. It doesn't mean that Google has not recrawled the site nor that it hasn't kept an internal cache of pages more recent than Oct 20/21. It's just not public yet.
Ok it's sort of a bug ... but not a serious one. Really it's not. Relax.
|Re: Fetch and Render: Partial vs Complete||webado||11/27/14 7:28 PM|
Proof that Google has indeed indexed the content beyond the 30 points we see in the cache.
The query refers to text from point #51 which is not in the cache, yet Google returns the page with it.
I still however think your page is too, too long.
|Re: Fetch and Render: Partial vs Complete||remozben||11/27/14 8:49 PM|
hmm.. interesting.. I see what you mean....
About the length and size of the pages.. hmm.. I think you are right... I mean, what other option would I have?
Splitting the page into smaller subpages... but heck.. I fear losing ranking...
|Re: Fetch and Render: Partial vs Complete||webado||11/27/14 8:59 PM|
I don't see why you'd lose ranking by reorganizing the site a little better.
A book never consists of one page alone, nor one single chapter.
|Re: Fetch and Render: Partial vs Complete||remozben||11/27/14 9:07 PM|
hmm.. that's where I am unsure... I just started having success with traffic.. but I don't know how it will affect the ranking of the one page... if the content is gone... you see what I mean?
So right now, I have lots of content collected on one page... but what happens if I split that content up?
Let's say.. right now.. I have about 57 chapters...
What would happen if I split one page into 57 pages?
Do you think my ranking would still be the same?
And my traffic?