Alan Mosley
2012-06-29T17:43:23-07:00
Good work - glad someone did the tests we have been too lazy to do.
Takeshi Young
2012-06-29T10:29:29-07:00
Hey, thanks for the article, these are some interesting tests.
I was wondering, how long did you run the test for? And how did you figure out whether the content in the external document was indexed?
Also, I'm confused about your implementation of AJAX tabs. I'm surprised that they wouldn't be indexed properly, given that your jQuery tabs were. Can you provide more details as to what kind of code you used?
Matthew Edgar
2012-06-29T12:29:29-07:00
Hi Takeshi - my method was pretty straightforward. Put content on the page via JS, or in an external file loaded via JS, then see if Google found that content (site:domain.com "content here"). I did each test on 3 different domains using distinct content on new pages. As for timing, Google's results typically updated within a week with the new pages and the content.
Regarding the AJAX tabs, I am certain there is a way to make it work. My point, though, was that jQuery tabs are better because the content is directly on the page (in an AJAX scenario, the content would not be on the page). You want to make it easy for Google to index your content, and something like jQuery tabs does that. I used the out-of-the-box approach on the AJAX tabs, which has hyperlinks with an href going to an external page. The AJAX script loads the content into the page that contains the tabs instead of linking to that external page. In that scenario, the external pages were indexed separately.
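For readers unfamiliar with the difference, here is a rough, hypothetical sketch of the two setups (the IDs and file names are illustrative, based on jQuery UI's tabs widget):

```html
<!-- jQuery tabs: the tab content sits in the HTML on page load,
     so a crawler reading the raw source sees it without running
     any script. -->
<div id="tabs">
  <ul>
    <li><a href="#tab-1">Tab 1</a></li>
  </ul>
  <div id="tab-1"><p>Content is right here in the page source.</p></div>
</div>
<script>$("#tabs").tabs();</script>

<!-- Out-of-the-box AJAX tabs: the href points at an external page,
     and the widget fetches that content on demand, so it is not in
     this page's source. In the tests above, the external page ended
     up indexed separately. -->
<div id="ajax-tabs">
  <ul>
    <li><a href="tab1.html">Tab 1</a></li>
  </ul>
</div>
<script>$("#ajax-tabs").tabs();</script>
```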
Sasha Zabelin
2012-07-02T16:51:25-07:00
I think it was about a year ago that Google announced the indexing of JavaScript and AJAX, but no tests were done to confirm it. Since I have seen indexed JS files myself, I can say that your test must be correct.
Thanks for sharing with us.
Dubs
2012-07-02T08:03:49-07:00
Great article!! We had a feeling that JavaScript and jQuery were indexed by Google. The Google Page Speed tool suggests moving JavaScript into external files to improve load times. I agree that keeping the content in the same file would also be useful for site load times, since fewer files are needed to render the website.
Jason Mikula
2012-07-05T17:30:46-07:00
Good follow up reading for this post is Mike King's "Just How Smart Are Search Robots", which goes into additional detail about the concept of "headless browsers" -
http://www.seomoz.org/blog/just-how-smart-are-search-robots
mrefghi
2013-09-06T08:56:02-07:00
I really like that this post explores basic implementations first, and then works its way up. I can think of countless more things to test, though, and look forward to someone doing that. Good job getting the ball rolling! (And maybe now that it's about a year later, they might have evolved their JavaScript support.)
Daniel Crocker
2013-01-29T13:44:21-08:00
I'm a little late to the party here, but I'm glad I found this post. I've found that Google have been crawling certain parts of my JavaScript content in recent months, and I do believe it has been helping with ranking; things like Facebook comments especially. I don't yet think they're following (or at least counting) links within scripts, however.
skifr
2013-05-02T15:30:10-07:00
Hi, not a single word about link juice?
Jay Viks
2012-06-30T00:46:04-07:00
Good start in this area.
We can make this test more comprehensive by covering other areas like JavaScript menus and dropdowns, mega menus, accordion scripts, scrollers, text animations, etc.
Btw, I'm also not too sure how exactly Google sees HTML5 and CSS3 work.
Alan Mosley
2012-06-30T00:52:11-07:00
I think these sorts of links all depend on whether the HTML is there on load or is displayed on demand via AJAX. If it is there on load, then I think it's a pretty safe bet it will be indexed; via AJAX, probably not.
What I would like to know is whether JS links bound to elements from an external file on document.ready will be followed.
Jay Viks
2012-06-30T01:56:13-07:00
Yes, on-load is probably not an issue, but AJAX is. I think outbound links won't be followed by Google if they have been loaded through an external file - though I'm not too sure on this.
Other areas that still need to be evaluated are the content/messages shown to users on specific actions, as well as rotation or scrolling of information on a page. Many of these JS tricks make use of the "display:none" property on page load and are invoked on user action through JS by changing the property to "inline" / "block". I believe this is a practice of hiding information from Google; however, there seems to be no strong empirical data to prove it at the moment.
Matthew Edgar
2012-07-02T04:52:17-07:00
That is a good question about document.ready.
And JayViks, you bring up a good point about events triggered by users instead of events triggered on load. Might need to test those scenarios out. :)
As for the dropdowns and changes to the display attribute, from what I've seen (though not tested widely) I think you are correct that if the HTML exists on the page on load (in a UL later styled and manipulated with CSS, JS, jQuery, etc.), it should be accessible when Google crawls the page. I'm not sure that is an effective means of hiding content from Google.
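A minimal sketch of the pattern being discussed (element IDs and selectors are hypothetical): the menu markup exists in the raw HTML on load and is merely hidden with CSS, so a crawler reading the page source still sees the text and the links.

```html
<!-- The UL is in the source on load, just hidden; a crawler that
     reads the raw HTML still sees the text and hrefs. -->
<ul id="dropdown" style="display:none">
  <li><a href="/page-a">Page A</a></li>
  <li><a href="/page-b">Page B</a></li>
</ul>
<script>
  // Toggled visible on user action, e.g. with jQuery:
  $("#menu-button").on("click", function () {
    $("#dropdown").toggle();
  });
</script>
```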
Kevin_P
2012-11-26T15:22:05-08:00
Thanks for the great article! Since your post went live, I was wondering if you had any further findings about using display:none with jQuery.
I'm creating a webpage that will rely heavily on a slide effect. Most instances of my keyword will not be visible until the user clicks a button that will show the content. No problem if Google can see this content, but a big problem if the content is not accessible to Google.
I appreciate any thoughts you have on this issue!
Modestos Siotos
2012-07-01T07:43:54-07:00
Nice experiment - the results are quite shocking though!
Google's page speed tool suggests placing JavaScript code into external files to boost page loading time. However, your tests showed that Google is unable to read content placed into external files, which certainly opens up a can of worms for manipulation.
Matthew Edgar
2012-07-02T04:45:04-07:00
For speed, you are absolutely correct about JS being on an external file. The key thing is you don't want your content in that external file. Preferably, you use JS for effects on text contained in HTML (like the jQuery tabs) instead of having the content contained in JS.
jlchereau
2014-02-10T00:35:30-08:00
Google's search engine can actually do a lot more than described here.
Compare http://seotests.memba.com
and https://www.google.co.uk/search?q=site%3Aseotests.memba.com
For sure, Google reads the head values (title, meta tags, ...) before executing any JavaScript, so there is no way to set them in JavaScript. Google reads the body after executing some JavaScript. "Some" means "not too complex".
Although Google can load and execute jQuery, which is pretty awesome, there seems to be a limit to the complexity of the scripts it can execute. In the absence of any obvious rule, I guess there might be a timeout.
della12
2013-09-23T22:17:39-07:00
hi
I am using advanced AJAX on my website, and only the # tag is used, not #!.
So should I use #! in order to make the pages crawlable, or is it fine with # only?
Also, some pages do not change their URL even after the user clicks; they run AJAX but the URL remains the same.
So, should I modify them so that the URL changes on every click?
Please reply ASAP.
AlexDee
2012-12-27T10:25:06-08:00
Great experiment. Thanks for sharing the results. Certainly cleared a few questions I had as well. Thanks.
Brahmadas R
2013-01-20T23:41:13-08:00
Really good experiment. I must say you've got the guts to do this test. I think in-house SEOs like me will really fear this kind of thing... I can't even imagine if my company's website lost something because of my experiment.
Ricardo Zea
2014-03-24T06:53:20-07:00
Hello,
This article is from June 2012. Today, in March 2014, is this still the case with Google?
I plan to implement something like this but I haven't found more recent information about this topic around the web.
Thanks.
Keri Morgret
2014-03-24T09:43:00-07:00
Google is better at javascript these days, but I don't know just how good.
Paul Brommer
2013-04-23T10:32:05-07:00
I'm a little late to the party here as well, but noticed that similar questions continue to appear in the Q & A Forum.
I just wanted to mention that Google has provided some direction on making AJAX crawlable should anyone need it -
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=174992
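For reference, the scheme that document describes maps a "#!" URL to an "_escaped_fragment_" query that the crawler can fetch and the server can answer with a static HTML snapshot. Roughly (the function name is mine, for illustration):

```javascript
// Rough sketch of the URL rewriting in Google's AJAX crawling scheme:
// a "#!" URL becomes an "_escaped_fragment_" query the server can see.
function escapedFragmentUrl(url) {
  var i = url.indexOf("#!");
  if (i === -1) return url; // plain "#" fragments are never sent to the server
  var base = url.slice(0, i);
  var sep = base.indexOf("?") === -1 ? "?" : "&";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(url.slice(i + 2));
}

// e.g. http://example.com/#!page=1
//   -> http://example.com/?_escaped_fragment_=page%3D1
```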
djdaniel150
2012-11-26T11:54:08-08:00
I think Google already answered this question for everyone on Webmaster Central. The answer: yes, they can read some scripts, but not all! Keep in mind that, to date, there is NO standard programming or scripting language in existence - never has been, for that matter. All scripting/programming is totally proprietary. On the other hand, HTML, XML, and CSS are all standards, as defined by the W3C. They are universal, like the electrical plug on every household electronic device in your home. We have standards for a reason. What if every appliance in the world used a different size plug/power rating?
Google still says, "use static pages and text links." They've been saying this for years, and everyone and their grandmother is still building websites with PHP, JavaScript, JSP, etc. Scripts are fine, as long as you are not housing your HTML in them. You can house scripts in HTML, but never put your page's structure inside of scripts. This is still an issue with all this "dynamic" garbage that CMSs are dumping all over the internet. There's your answer.
Oh, and what is a static page? One that uses HTML and CSS to structure its content without the use of scripts! The worst thing I see here is that you are showing people how to put textual data inside of scripts and display it on your pages - why? You should be using scripts for apps, not for markup. This is bad practice.
BWIRic
2012-07-11T07:26:42-07:00
Good test and info - good to know the ins and outs of what Google can access. Thanks.
Tom Wilkinson
2012-07-02T03:06:30-07:00
The same-file concept is useful to know. I wonder if Matt Cutts would answer this question directly? It would be of much help if this was confirmed.
Andy_Fletcher
2012-07-02T02:44:03-07:00
"Can Google Really Access Content in JavaScript? Really?"
I'm afraid so, and I found this out to my cost. I thought I could "hide" links from Google in JavaScript. It didn't work: Google spotted the hrefs and continued to spider them. The approach I found that worked was to remove the href completely and serve variables the server recognised, but Google couldn't.
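A rough sketch of what that approach might look like (the attribute, class, and URL names here are mine for illustration, not Andy's actual code):

```html
<!-- No href for the crawler to spider; the server-recognised token
     lives in a data attribute, and the browser navigates via script. -->
<span class="js-link" data-target="abc123">Next page</span>
<script>
  $(".js-link").on("click", function () {
    // The server maps the token back to a real URL.
    window.location = "/go?t=" + $(this).data("target");
  });
</script>
```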
Andy_Fletcher
2012-07-11T04:01:08-07:00
I'm replying to myself. What is wrong with me.
Adding....if:
1. you don't want it crawled and you use javascript to achieve that, Google will crawl it.
2. alternatively if you do want it crawled and you code it in javascript, Google won't crawl it.
The law of sod in action.
Marie Haynes
2012-06-30T14:24:14-07:00
Very cool experiment. I, too, have found that tabs that are hidden/shown by jQuery are indeed indexed by Google. But my Disqus comments, which I believe are powered by JavaScript, are not. (Granted, I haven't looked into the "new" Disqus that is available now.)
Jason Mikula
2012-07-05T17:28:45-07:00
Also, my understanding is that comments powered by the Facebook plugin are typically not indexed. Too bad, as these solutions are great for promoting user-generated content, but if that content is un-indexable, it's of limited utility SEO-wise.
Joel Rivera
2012-06-29T18:33:31-07:00
Thanks for the article. I've always been hesitant to use a lot of javascript on my site because of this very issue. Thanks for showing us this test.
ewwink
2012-11-11T05:08:54-08:00
Google reads all "readable" files, including JavaScript and CSS or any other format; if it finds a URL there, that URL becomes a target for crawling, but the text/URL itself will not be included in search results. It's different if the content type is text/html/doc - that content will be included in the search results.
eververs
2012-07-04T14:06:45-07:00
Good study - we have always been asking this relevant question. Luckily our (and your) hunch was correct. Thank you.
shaam
2012-07-26T23:02:49-07:00
If we use the AJAX format, most of the URLs will not be crawled, which might cause us to lose traffic.
Sam Parker
2012-07-14T02:37:22-07:00
From an SEO point of view, we have to take care of all those things, and here you have described them in a very logical way. I'm always looking for tricky and technical questions, and this was one of those. Thanks.
Thumbs up to you!!!
Sha Menz
2012-07-04T22:15:06-07:00
Thanks for sharing this...nice to know one way or the other.
Just wondering if you happened to also test by using Fetch as Googlebot rather than just waiting for the files to be crawled in the normal scheme of things. There's no logical reason why there would be any difference, but I've always wondered if using the fetch feature has the potential to improve the effectiveness of the crawl.
Sha
Fluid Advertising
2014-07-29T09:30:52-07:00
Being that this article is a few years old, I would be very interested in technical data from 2014 about this subject.
Wonderkidxx
2012-07-02T04:20:49-07:00
Thanks for the willingness to do this "experiment" and then to present your findings so clearly. I think it's fine to say that most of us knew this anyway; you say so yourself in the first line of the post...
"Matt Cutts says Google can access some content within JavaScript and AJAX."
So you basically stated exactly what Matt Cutts says and what all the webmasters think, yet you did an experiment? I guess you wasted your time really, shame...
Funny nonetheless.
Thumbs up and +1