
What does Second Wave Indexing mean in SEO?

Second Wave Indexing is a process used by Google, and probably by other search engines too.

To understand it you first need to understand how your browser shows you a page on a website. So ....
  • You decide to open howtoseo.link2light.com
  • Your browser requests the source code of that web page from my hosting company and reads it line by line
  • Early on in the code it will come across a reference to another file (the CSS file), so it will request this from the server too. The CSS file defines what things should look like: how big the text should be, the background colors of various areas on the page, how wide certain elements should be, and so on. There might be multiple CSS files that it needs to request.
  • Then it will come across references to Javascript files. Each one of these then needs to be requested from my hosting company and downloaded. Javascript files generally tell the browser how to handle certain user interactions. For example, you scroll the page and the red box on the right-hand side which says 'Get Help Ranking' scrolls with you.
  • Your browser puts all these things together and then shows you the page.
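
To make that concrete, here's a minimal sketch of what that source code might look like (the file names are invented for illustration):

    <!DOCTYPE html>
    <html>
    <head>
      <title>How to SEO</title>
      <!-- The browser pauses here to request the stylesheet -->
      <link rel="stylesheet" href="/styles/main.css">
    </head>
    <body>
      <h1>What does Second Wave Indexing mean in SEO?</h1>
      <p>Second Wave Indexing is a process used by Google ...</p>
      <!-- ... and here to request the external Javascript file -->
      <script src="/scripts/sidebar.js"></script>
    </body>
    </html>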

Google's crawler robot wanders around the web searching for new websites it doesn't know about or collecting changes that have occurred on the ones it already has in its database. The crawler doesn't open the website in the same way as you do.

Here's how it works:
  • The robot only reads the source code. It ignores the other files referenced in that code, such as the CSS files which dictate how the page should look (text sizes, background colors and so on) and the Javascript files which drive the interactive parts of the page.
  • Later, and this could mean days, weeks or months, Google looks at the page as a human would - i.e. it renders the page with all the external files taken into account.
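
To see why that gap matters, here's a hedged sketch of a page whose source code contains only an empty placeholder which Javascript fills in later (the ID and link are my own invented examples):

    <body>
      <!-- This is all the first wave sees: an empty box -->
      <div id="articles"></div>

      <!-- Only the second wave, which runs the Javascript,
           sees the link this script injects -->
      <script>
        document.getElementById('articles').innerHTML =
          '<a href="/second-wave-indexing">What is Second Wave Indexing?</a>';
      </script>
    </body>

On the first crawl Google records a near-empty page; only when it renders the page later does it discover the link.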

The first step is often referred to as Headless Browsing because the crawler loads the page without a 'head' - the visual display a normal browser would produce. Despite the similar name it has nothing to do with the Header area of the website code, so don't take the term too literally.

The rendering which happens later is referred to as Second Wave Indexing. It looks to see if the first crawl missed anything which can only be seen once the external files are loaded.

Second Wave Indexing has a fundamental impact on SEO because if you are not showing all your content in the basic source code, Google isn't going to rank you anywhere until it has done its full rendering. If links to other parts of your website only appear after full rendering, Google will only become aware of them once it does its full render, which means it can take months for it to discover all your pages.

In fact sometimes it never does, because at some point it will go back to the first page it discovered and start over (just to check nothing has changed).

Websites that use Javascript heavily suffer most from this. Here's an example:
  • You decide to open howtoseo.link2light.com
  • Your browser requests the source code of that web page from my hosting company and reads it line by line.
  • My external Javascript file tells the browser to display different content depending on your screen size.
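
As a rough illustration, that external file might contain something like this (the file name, breakpoint and content are invented examples):

    // sidebar.js - pick what to show based on the width of the screen
    var box = document.getElementById('content');
    if (window.innerWidth < 600) {
      box.innerHTML = '<p>A short summary for small screens ...</p>';
    } else {
      box.innerHTML = '<p>The full article, menus and related links ...</p>';
    }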

Now the Google crawler never sees any of this on its first visit, so to it my page would look virtually empty ... nothing of interest here. That's why for all crucial content I do one of the following:
  • Avoid using Javascript to load crucial content - this is my approach on howtoseo.link2light.com
  • Make sure your Javascript code is not in an external file - though this is not guaranteed to work. In my example the Javascript chose what content to load depending on screen size, but a crawler has no screen.
  • Have comprehensive content in the basic code which then gets replaced depending on the user's screen size - this way the crawler always sees one full version of the content (see the sketch below).
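
Here's a hedged sketch of that third option, reusing the same invented example: the full content sits in the basic HTML and the Javascript only swaps it out for small screens, so the crawler always finds a complete version:

    <div id="content">
      <!-- The full content lives in the basic source code,
           so the first-wave crawler always sees it -->
      <p>The full article, menus and related links ...</p>
    </div>
    <script>
      // Only replace the default content on small screens;
      // the first-wave crawler runs no Javascript, so it
      // keeps the full version above
      if (window.innerWidth < 600) {
        document.getElementById('content').innerHTML =
          '<p>A short summary for small screens ...</p>';
      }
    </script>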

To see if your website is hiding anything from Google, open the site, find a blank space, right-click and select 'View Source'. The option might be named something slightly different depending on the browser but it should be quite obvious.

You'll usually find hundreds or thousands of lines of code here. Press Ctrl+F to open a search box and then try searching for key elements of your content - menu options, text in the main body, etc. If anything you can see on the website is missing from the source code, it is probably because it is being loaded by something like Javascript. For example, if the rendered page shows a menu option called 'Contact Us' but a search of the source finds no 'Contact Us', that menu is being injected by Javascript.

Web Developers love using Javascript to make sexy websites and who can blame them - you can do all sorts of fun things that make for a better user experience, like showing more relevant content depending on screen size. But Web Developers don't always understand SEO and so can end up making life very difficult for search engine robots. There are ways to do both; any good Web Developer will know how.

I'm Tim Hill, a Search Engine Optimisation and Online Marketing specialist. I created this site to help others understand that SEO is not a mysterious black art!

If you're a newbie try the Getting Started in SEO page, otherwise feel free to dig around and learn more.

Find me on Facebook or get in touch if you need help.
