
How To: Implement Robots Noindex Tag With Google Tag Manager

By Dave Ashworth on 18th July 2019

Reading Time: 5 minutes


The Story Behind This Case Study

As a website optimisation consultant, one of the most important things for me is having full control over the content that a site allows Google to crawl and index.

Whilst the modern-day website CMS generally offers a flexible approach to technical SEO on both a site and page level, you invariably face issues that prevent you from implementing various page elements to assist with your onsite SEO.

For example, there are a number of instances where thin, low quality content is present on a site due to the inability to add the robots noindex tag, such as:

  • Search filter & URL parameterised pages
  • Paginated content
  • Blog taxonomy and archive pages

Whilst, in theory, the canonical tag can assist with the consolidation of such pages (and it, too, can be added with Google Tag Manager), it doesn’t always have the desired effect. In my experience, it’s best to go ahead and noindex duplicate, near-duplicate and thin content to enhance site quality and organic visibility, particularly if those pages target similar terms to key landing pages on the site.
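As an aside, a canonical tag can be injected with a near-identical Custom HTML script to the noindex one shown later in this post. A minimal sketch, assuming jQuery is loaded on the page (the URL here is a placeholder, not a real recommendation):

<script>
// Hypothetical example: inject a canonical tag via a GTM Custom HTML tag
// Remove any existing canonical tag first
jQuery('link[rel="canonical"]').remove();
// Create a link element pointing at the preferred URL (placeholder shown)
var canonical = document.createElement('link');
canonical.rel = 'canonical';
canonical.href = 'https://www.example.com/preferred-page/';
// Insert it into the head of the page
jQuery('head').append(canonical);
</script>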

Some may see noindexing over canonicalising as a maverick move, and it should be noted that Google have stated that if you add noindex to a page, its links will eventually be treated as nofollow. So, if key pages are linked to from pages you are setting to noindex, ensure those pages are also linked to from elsewhere within your site.

Real Life Case Study

To demonstrate how this works, I’ll use my contact page as a guinea pig. At the time of writing, it is in Google’s index, which can be checked using the “site:” search command:

site:organicdigital.co/contact/

site search command

You can also use Google Search Console to check a page’s indexed status:

GSC Status

When viewing the source code, you can see the robots index tag in place:

Inspect Code

How To Add Custom Scripts Using Google Tag Manager

This article assumes Google Tag Manager is already in place on your site; if not, I would recommend Google’s Setup and Installation guide as your starting point.
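For reference, Google’s standard container snippet (as per that guide) goes as high in the <head> as possible, with GTM-XXXXXXX swapped for your own container ID; the guide also includes a <noscript> iframe fallback to place just after the opening <body> tag:

<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXXX');</script>
<!-- End Google Tag Manager -->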

The robots tag can be added using a “Custom HTML” tag which implements the following script:

<script>
// Removes any existing meta robots tag
jQuery('meta[name="robots"]').remove();
// Create an empty meta element, called 'meta'
var meta = document.createElement('meta');
// Add a name attribute to the meta, with the value 'robots'
meta.name = 'robots';
// Add a content attribute to the meta element, with the value 'noindex, follow'
meta.content = 'noindex, follow';
// Insert this meta element into the head of the page, using jQuery
jQuery('head').append(meta);
</script>

As you can see from the comments, this script will remove any existing robots tag, create a new one, add the desired directive and then append it to the head section of your page:

Custom HTML
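One caveat: the script above relies on jQuery already being loaded on the page. If your site doesn’t use jQuery, a plain JavaScript sketch of the same tag (my adaptation, not part of the original setup) would be:

<script>
// Plain JavaScript version - no jQuery dependency
// Remove any existing meta robots tag
var existing = document.querySelector('meta[name="robots"]');
if (existing) {
  existing.parentNode.removeChild(existing);
}
// Create the new meta element with the noindex directive
var meta = document.createElement('meta');
meta.name = 'robots';
meta.content = 'noindex, follow';
// Append it to the head of the page
document.head.appendChild(meta);
</script>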

Adding Triggers To Fire NoIndex Tags

You then set up triggers which determine the pages on which the tag is fired. In this case, select a trigger type of “DOM Ready”. You can also use “Page View”, but I prefer “DOM Ready” as, in simple terms, tags can sometimes fail to fire correctly on page view because the page hasn’t been fully constructed yet.

The next part is key: you must specify the conditions (in this case the URLs) on which you want the tag to be implemented. You do this by setting “This trigger fires on” to “Some DOM Ready Events”.

Do not, repeat, do not set this on “All DOM Ready Events” as this will add the tag to EVERY page on your site.

Once selected, set the conditions to “Page URL” and then determine the URL structure that matches which pages you want this to fire on. 

For example, if you wanted all pages within /blog/tags/ to fire this tag, you would set “contains” and “/blog/tags/”.

If you want to set this on specific pages, you can set “equals” and the full URL (and add multiple filters for multiple pages). So for this example, we set “equals” to “https://organicdigital.co/contact/”:

Trigger Configuration
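As a belt-and-braces measure (my own suggestion rather than a GTM requirement), you can also guard the script itself with a URL whitelist, so it never writes noindex on an unintended page even if a trigger is misconfigured. A sketch, with illustrative paths:

<script>
// Safety net: only inject noindex when the current path starts with a whitelisted prefix
var noindexPaths = ['/contact/', '/blog/tags/'];
var shouldNoindex = noindexPaths.some(function (path) {
  return window.location.pathname.indexOf(path) === 0;
});
if (shouldNoindex) {
  var meta = document.createElement('meta');
  meta.name = 'robots';
  meta.content = 'noindex, follow';
  document.head.appendChild(meta);
}
</script>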

Testing Your Tags and Triggers

I cannot stress this enough: you must test your tags and triggers to ensure they only fire on the desired pages. To do this, enter “Preview” mode within GTM, where you will then see the following message:

Preview Workspace

You can then navigate your site in a separate tab to see which tags are, and aren’t, being fired.

If I now visit my home page, I can see the only tag being fired is a GA Page View, and the noindex tag lies dormant:

NoIndex Tag Not Firing

If I then navigate to the contact page, I can see that the tag is fired:

NoIndex Tag Firing

Once you have tested the arse off this across various pages and are happy that the tag only fires on the desired pages, publish your changes.

How To Check The Tag Is Live

It is important to note that, when reviewing source code to check your changes, you will not see them if you simply “view source”. By default, this shows you the unparsed code, so in order to see your changes you must use the “Inspect” tool, which will show you the fully parsed source code:

Inspecting Source Code
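A quick alternative check (a habit of mine rather than an official method) is to run a couple of lines in the browser’s DevTools console on the page in question:

// Paste into the DevTools console - logs the live robots directive, if any
var robots = document.querySelector('meta[name="robots"]');
console.log(robots ? robots.content : 'no meta robots tag found');
// Expected output on a noindexed page: noindex, follow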

In time, your page will be crawled, parsed and de-indexed. To speed this process up, you can request a page be crawled via Google Search Console.

It should also be noted that this approach only works for search engines which render JavaScript, and as there can be a gap between crawling and rendering, it is not a perfect replacement for a server-side noindex tag. But, you know, it works for Google, so crack on and worry about Bing, Yahoo, Yandex and DuckDuckGo later.
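If you want to see exactly what a non-rendering crawler sees, you can fetch the raw HTML yourself. A minimal sketch using the fetch API (available in modern browsers and Node 18+); a GTM-injected noindex will not appear here, only in the rendered DOM:

// Fetch the raw, unrendered HTML and report any server-side robots meta tag
fetch('https://organicdigital.co/contact/')
  .then(function (response) { return response.text(); })
  .then(function (html) {
    var match = html.match(/<meta[^>]*name=["']robots["'][^>]*>/i);
    console.log(match ? match[0] : 'no robots meta in raw HTML');
  });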

What Next?

That’s it. Adding tags via GTM is a relatively straightforward approach which gives you a great deal of technical SEO control over your site’s structure, page elements and content.

There are no real limits to what you can do; more to the point, I’ve been able to implement a wide range of tags and elements in order to enhance pages, functionality and tracking across numerous sites.

I will share more examples of these moving forward. 

Just be careful: it’s just as easy to do great damage if you don’t properly configure and thoroughly test your triggers.

Dave Ashworth

About The Author

I would describe myself as an SEO Expert and a specialist in technical optimisation with a professional approach to making websites better for people and better for search engines.

When I'm not blogging, I deliver website optimisation consultancy and organic SEO solutions by addressing a website's technical issues and identifying opportunities for growth.

