The Story Behind This Case Study
I recently did some WordPress SEO Consultancy for a website that was configured to pull in landing page content across a number of locations from a 3rd party source using AJAX. There were two issues:
- The URLs included a hashbang
- The content was injected into the page template, which meant the title tag, meta description and canonical tag all pointed to the primary URL without the hash
What is a Hashbang URL?
A hashbang URL effectively contains #! – everything to the left of it is a standard URL, everything to the right is used to load in dynamic content.
For example:
https://domain.com/services-near-you/#!/location-a
https://domain.com/services-near-you/#!/location-b
https://domain.com/services-near-you/#!/location-c
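The browser never sends the fragment to the server – everything from the # onwards only exists client-side, in window.location.hash. As a rough sketch (the splitHashbang helper and URLs here are illustrative, not from the site in question), splitting such a URL looks like this:

```javascript
// Illustrative helper: split a hashbang URL into the server-side base URL
// and the client-side route used to load in the dynamic content.
function splitHashbang(url) {
  var idx = url.indexOf('#!');
  if (idx === -1) {
    return { base: url, route: null };
  }
  return { base: url.slice(0, idx), route: url.slice(idx + 2) };
}

var parts = splitHashbang('https://domain.com/services-near-you/#!/location-a');
// parts.base is 'https://domain.com/services-near-you/'
// parts.route is '/location-a'
```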
How Do Hashbang URLs Affect SEO?
For a few years now, Google has been able to crawl and index such URLs without issue, recognising them as unique pages.
The issue in this instance was that, whilst Google was crawling and indexing each location's unique in-page content generated by AJAX, the way the website was configured meant the meta data – the page title, canonical and meta description – all resolved to that of the parent URL:
https://domain.com/services-near-you/
As there were over 30 of these location pages, there were 30+ pages targeting different locations with the same non-targeted meta data.
E.g. every page had the following:
Title: Our Services Near You | brand
Meta Description: Find out more about our services at each of our locations
Canonical Tag: https://domain.com/services-near-you/
Despite this, the location-specific in-page content meant these pages generated reasonable levels of traffic – but their performance then dropped following the March 12th Google Update:
These pages weren’t the only ones affected by the drop, and there is work to be done in general around site content and E-A-T (this is a site that’s actually within the Medic niche), but it was clear that the services-near-you pages were affected by this specific issue:
It became apparent that there was no development resource to make the fundamental changes to site functionality so that each page could generate unique meta data, so, as has been the case so many times before, I turned to Google Tag Manager to resolve the issues.
How To Implement Meta Data Using Google Tag Manager
The method I chose was some JavaScript in a custom HTML tag, as follows:
<script>
  // remove the existing elements
  jQuery('meta[name="description"]').remove();
  jQuery('title').remove();
  jQuery('link[rel="canonical"]').remove();

  // create and append a new title tag
  var new_tt = document.createElement('title');
  new_tt.text = '"Location Services" | "Brand"';
  jQuery('head').append(new_tt);

  // create and append a new canonical tag
  var new_can = document.createElement('link');
  new_can.rel = 'canonical';
  new_can.href = 'https://domain.com/services-near-you/#!/location-a';
  jQuery('head').append(new_can);

  // create and append a new meta description
  var new_md = document.createElement('meta');
  new_md.name = 'description';
  new_md.content = 'Click through for contact details for our "location" branch and to book appointments.';
  jQuery('head').append(new_md);
</script>
It’s fairly straightforward – you remove the existing elements, then create and populate the new ones.
The stumbling block came when trying to implement this at URL level. For example, you would normally set up a trigger to fire on a URL such as:
https://domain.com/services-near-you/#!/location-a
However, hashbang URLs don’t work the way normal URLs do when setting up a trigger: anything after the # is effectively ignored, as it isn’t treated as part of the full URL. Instead, you need to create a custom JavaScript variable that returns the full URL with everything to the right of the # included. This is done with the following script:
function() {
  return window.location.pathname + window.location.search + window.location.hash;
}
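To illustrate what this returns, here is the same concatenation pulled into a testable function that takes a stand-in object in place of window.location (whose real values only exist in the browser):

```javascript
// The same concatenation as the GTM variable, written as a function that
// accepts a location-like object instead of reading window.location directly.
function fullRelativeUrl(loc) {
  return loc.pathname + loc.search + loc.hash;
}

// Stand-in for window.location on one of the hashbang pages:
var loc = { pathname: '/services-near-you/', search: '', hash: '#!/location-a' };
var result = fullRelativeUrl(loc);
// result is '/services-near-you/#!/location-a' – unlike the built-in URL
// variables, this includes the fragment, so a trigger can match on it.
```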
You then create a custom variable of type “Custom JavaScript” (not “JavaScript Variable”, which only reads a named global):
Then set that up as a trigger to fire when the variable contains your desired string:
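With this setup, each location needs its own custom HTML tag with its own hard-coded strings. A possible refinement – a sketch, not what was implemented on the site – is to derive the location name from the hash itself, so a single tag could serve every location (the slug format below is assumed):

```javascript
// Hypothetical helper: turn a hashbang route like '#!/location-a' into a
// display name for use in the generated title and meta description.
function locationFromHash(hash) {
  var slug = hash.replace(/^#!\//, ''); // '#!/location-a' -> 'location-a'
  return slug
    .split('-')
    .map(function (word) {
      return word.charAt(0).toUpperCase() + word.slice(1);
    })
    .join(' '); // 'location-a' -> 'Location A'
}

var name = locationFromHash('#!/location-a');
// name is 'Location A'
```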
The good news for us was that this all worked, and Google will crawl, parse and index the modified content that is generated and added to each page in this way.
The even better news was that, with each page now carrying its own unique meta data, the location pages started to see an improvement in visibility following the latest Google Update on September 24th:
Which has led to a 20% increase in traffic to these pages since:
Which was nice.
About The Author - Dave Ashworth
I would describe myself as an SEO Expert and a specialist in technical optimisation with a professional approach to making websites better for people and better for search engines.
When I'm not blogging, I deliver website optimisation consultancy and organic SEO solutions by addressing a website's technical issues and identifying opportunities for growth.