Industry Observations – Technical Insight

JavaScript Rendering and SEO

Over the weekend I spent some time thinking about the fact that Google renders JavaScript. It occurred to me that Google is almost certainly smart enough to (a) cache all JS, so that if it sees the same script multiple times it isn’t going to fetch it multiple times, and (b) check that the JavaScript renderer doesn’t run away and eat up tons of processor time when the JavaScript is poorly written. Therein may lie an opportunity for the malicious marketer (often called an SEO – search engine optimizer) who is trying to get to the top of the Google search results page.

Let’s say Google will only run a loop as long as it believes it is not infinite, and/or as long as the rendering engine doesn’t loop “too many” times. By finding that looping limit N, you can do N+1 loops and put your cloaking code (where Google sees a benign, high quality site and the user sees something quite different) in plain sight: the renderer gives up before it ever reaches the code, while real browsers execute it. You may have to do some testing, and it’s entirely possible that Google’s rendering engine uses the same limits a browser does when deciding whether something is running slowly enough to throw a “slow script” warning (that’s the worst case scenario for messing with it).
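
To make that concrete, here is a rough sketch of the idea. Everything in it is hypothetical – the limit value, the function names, and the content swap are invented, and nothing here is confirmed about how Google’s renderer actually behaves:

    // Purely hypothetical sketch. Assumes the renderer abandons scripts that
    // exceed some iteration/time budget, while a real browser keeps going.
    // N_PLUS_ONE stands in for the probed limit N plus one; the value is made up.
    var N_PLUS_ONE = 50000000;

    function burnCycles(iterations) {
      var x = 0;
      for (var i = 0; i < iterations; i++) {
        x += Math.sqrt(i); // cheap math so the loop does observable work
      }
      return x;
    }

    burnCycles(N_PLUS_ONE); // a budget-limited renderer would give up around here

    // Only a client that survived the loop ever reaches the swap:
    document.body.innerHTML = '<h1>What real visitors actually see</h1>';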

Google’s rendering engine almost certainly has limits, though, and there is some experimentation to be done there. The trick would be to find a mathematical JavaScript function that Google gives up on but the browser wouldn’t. As an aside, if Google is not smart enough to put limits in place, you could use it to mine bitcoin for you by making each URL unique so that Google can’t cheat and cache it. Either way, a malicious marketer wins, assuming there is no penalty attached.
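
If I were running that experiment, I might do something like the following (again a hypothetical sketch – the iteration counts and the token scheme are invented for illustration):

    // Publish a series of test pages, each running a loop of a different size,
    // then writing a unique marker into the page. If the marker later shows up
    // in Google's index for that URL, the renderer survived that many iterations.
    var ITERATIONS = 10000000; // baked in per test page: e.g. 1e6, 1e7, 1e8, ...

    var sink = 0;
    for (var i = 0; i < ITERATIONS; i++) {
      sink += Math.sin(i); // arbitrary math work the engine can't skip
    }

    // Unique, searchable token per page; the naming scheme is made up.
    document.title = 'render-probe-' + ITERATIONS + '-' + Math.round(sink);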

Of course, it is possible that Google uses the “slow script” condition as a signal of a poor quality site; I know I would. It stands to reason, because page load time has always been something Google has professed to care about, and with JavaScript rendering they can get a clearer view of it. That means marketers should be very cautious about allowing third party JavaScript on their sites, for reasons beyond the security implications: if it slows down page rendering time, that could easily be cause to reduce their rankings. There’s definitely some experimentation to do there.
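
If you want a rough idea of what third party scripts are costing you, a starting point might look like this – a minimal sketch using the browser’s standard Resource Timing API, where the 200 ms threshold is an arbitrary choice of mine and only fetch time (not execution time) is captured:

    // Minimal audit sketch using the Resource Timing API.
    // It flags scripts that were slow to fetch; execution cost is not measured,
    // and the 200 ms threshold is arbitrary.
    window.addEventListener('load', function () {
      performance.getEntriesByType('resource')
        .filter(function (entry) { return entry.initiatorType === 'script'; })
        .forEach(function (entry) {
          if (entry.duration > 200) {
            console.warn('Slow script: ' + entry.name +
                         ' took ' + entry.duration.toFixed(0) + ' ms');
          }
        });
    });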

The short of it is: if you run a website, steer clear of third party JavaScript whenever possible, so that this sort of slowness can’t be turned into a negative SEO tool against you.

Tags: Google, JavaScript
  • http://platinummetrics.com Shah Paracha

Hmm, interesting. I’ll have to try to code something up for that, and I’ll get back to this blog with my results. Thanks for the tip!

  • http://www.wireharbor.com TK

While this may work temporarily, Google regularly reviews suspicious sites and linking patterns by hand. When they find it, they will penalize your site permanently – and good luck getting traffic to your site after that happens.

  • http://netzaffin.de Pat

    I understand the cloaking issue – btw, nice idea.

As far as I know, Google only uses time to first byte to gauge page performance. (See: http://moz.com/blog/improving-search-rank-by-optimizing-your-time-to-first-byte) But users will quit (and they will, if you play with their time) and go back to the SERP, so Google will see a very high bounce rate for that site. That is also used as a ranking factor. So endless looping is not the best solution, I think.

    Patrick

  • Eric De-Villiers

    You touch on negative SEO in this piece, and I can see the issues you would have dealing with it as far as malicious code went.

I don’t even need to see bad code to get a bad experience. Sometimes Java and Flash elements on a page can reduce my browser to a crawl, or even a halt. My retrieval time from a website can increase many times over. Embedded YouTube videos are about the worst. The wider point?

I’ve just signed a petition to stop negative SEO, as far as deliberately using link spam is concerned:

    http://www.demondemon.com/2013/10/25/petition-to-halt-the-increase-in-negative-seo/

And I wonder if there is a way to get Google (through Chrome) and the other main browser developers to take this on board as well.

I mean, if you are going to stop negative SEO, you might as well hit it in all its forms.

    Eric