
How To: De-Index Time-Sensitive Content Using Google Tag Manager

By Dave Ashworth on 4th March 2022

Reading Time: 3 minutes


Whilst doing some Google Tag Manager consultancy recently, I was working on a client’s website that listed a large number of events spanning several years.  The vast majority of these had since passed, so the content was effectively redundant.

The client wanted to keep the events listed to show that their venue was a vibrant hub of activity (and because no one could be bothered taking the time to remove them all and redirect the old URLs).

From my point of view, there was no reason to keep these events indexed: they were no longer relevant, and de-indexing them would keep the site’s indexed content “fresh” and minimal, rather than having search engines crawl and index unnecessary pages.

It was therefore decided to noindex any event that had now passed.  Going through each event in the CMS and noindexing them individually would be an arduous task, though, and no doubt one which would soon be forgotten about, leaving more redundant content indexed.

So, I came up with a solution to use Google Tag Manager to detect the date of the event, and if it had passed, then add a noindex tag to the page.

The first step was to grab the event date and time from each page.  This was done by reading the datetime attribute from a particular div on the page – to do this, we set up a variable which pulled in the DOM element using a CSS selector, as follows:

[Screenshot: “Event Date” DOM Element variable configuration]
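For illustration – and the class name and attribute placement here are assumptions, as your CMS’s markup will differ – suppose the event page contains an element such as <div class="event-date" datetime="2022-06-18T19:30">.  The GTM variable would then be a DOM Element variable using the CSS Selector selection method, with a selector targeting that element and the Attribute Name set to datetime.  In plain JavaScript, the variable resolves to roughly this:

// Roughly what the "Event Date" DOM Element variable returns
// (the '.event-date' selector is a hypothetical example)
var el = document.querySelector('.event-date');
var eventDate = el ? el.getAttribute('datetime') : undefined;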

I then created a trigger that detected when an event was being viewed as follows:

[Screenshot: “Event URL” trigger configuration]
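This part is UI configuration rather than code, but the condition amounts to checking the page URL.  As a rough sketch (the /events/ path is an assumption – use whatever pattern your event URLs share), the equivalent logic would be:

// Hypothetical equivalent of the "Event URL" trigger condition:
// only fire on pages whose path contains the events section
var isEventPage = window.location.pathname.indexOf('/events/') !== -1;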

And that in turn fired the following custom HTML tag:

[Screenshot: “NoIndex Old Event” custom HTML tag]

The code used is as follows:

<script>
// Returns true if firstDate falls on or before secondDate (time of day ignored)
var dateInPast = function(firstDate, secondDate) {
  return firstDate.setHours(0, 0, 0, 0) <= secondDate.setHours(0, 0, 0, 0);
};

// {{Event Date}} is the GTM DOM Element variable holding the datetime attribute
var event_date = new Date({{Event Date}});
var today_date = new Date();
var in_the_past = dateInPast(event_date, today_date);

if (in_the_past) {
  // Remove any existing robots meta tag, then inject a noindex directive
  jQuery('meta[name="robots"]').remove();
  var meta = document.createElement('meta');
  meta.name = 'robots';
  meta.content = 'noindex, follow, noarchive';
  jQuery('head').append(meta);
}
</script>

Here, a function is created which takes the event date and compares it to today’s date.  If the event date is in the past, it returns true, and the script then removes any existing robots tag and inserts a new one.
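To sanity-check that the tag has fired, you can inspect an old event page in the browser console – a quick verification sketch, not part of the GTM setup itself:

// Run in the browser console on a past event page after the tag fires
var robots = document.querySelector('meta[name="robots"]');
console.log(robots ? robots.content : 'no robots meta tag found');
// Expected on a past event: "noindex, follow, noarchive"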

If your site doesn’t use jQuery, you can use this code instead to remove the existing robots tag and add the new one:

// Remove any existing robots meta tag first so the directives don't conflict
var existing = document.querySelector('meta[name="robots"]');
if (existing) { existing.parentNode.removeChild(existing); }

var metatag = document.createElement('meta');
metatag.setAttribute('name', 'robots');
metatag.content = 'noindex, follow, noarchive';
document.getElementsByTagName('head')[0].appendChild(metatag);

And that’s it.  For us, this was a good all-round solution that meant minimal housekeeping for the client, and kept its index fresh and up to date.

Dave Ashworth

About The Author

I would describe myself as an SEO Expert and a specialist in technical optimisation with a professional approach to making websites better for people and better for search engines.

When I'm not blogging, I deliver website optimisation consultancy and organic SEO solutions by addressing a website's technical issues and identifying opportunities for growth.

