How We Increased Organic Traffic by Over 50% Using Technical SEO Updates

Matthew Howells-Barby

We've got a confession to make: We've been making a few rookie SEO mistakes within our own website.

For a company that is supposed to be at the forefront of inbound marketing and SEO, this isn't something that we're entirely proud of. But even experts make mistakes, right?

Instead of shying away from these oversights, we thought we'd show you where we went wrong so that you don't make the same mistakes. More importantly, we'll show you what we did to fix them and how these solutions resulted in us growing our organic traffic by more than 50% in just one month.

How strong is your website? Grade it using HubSpot's free Website Grader.

The Importance of Technical SEO

Technical SEO is often something that's thought about when you first put your website together. It's typically based on best practices that include things like:

  • Only having one H1 on each page
  • Having your highest value pages linked to from the main navigation
  • Adding alt attributes to all of your images
  • Creating clean URLs without dynamic characters
  • Ensuring page load times are kept to a minimum

While these factors are all fairly simple, you'd be surprised how many websites get this stuff wrong. What's more, making ongoing technical tweaks can do wonders for your organic traffic growth, yet a lot of companies look at technical issues once and then never look at them again.

When I joined HubSpot last year, I had the chance to take a fresh look at everything that we'd been doing to date and to start asking more and more questions about our strategy. Were we making any of these little mistakes? I set out to find out. 

One of the first things that I started to look into was the factors contributing to the success of some of our best content. This led me to run a detailed audit of our website to identify potential problems and areas where there were opportunities for growth. I soon started to realise that we weren't as perfect as we thought ...

7 Technical SEO Problems (And Solutions) We Learned the Hard Way

Problem 1: Broken Links, Redirects & the 404 Page

Whenever someone tried to visit a URL that didn't exist on our website, it redirected them through to our 404 page. For example, when you visited http://www.hubspot.com/science-of-social-media/, it would 301 redirect you to this page.

This is usually a good thing because it says to Google, "This page doesn't exist, so don't crawl it." If Google were to crawl all of these incorrect URLs, it would waste a huge amount of time and ultimately spend less time crawling the pages we want it to.

Unfortunately, our 404 page wasn't actually returning a 404 server response. Instead, it returned a 200 response. In other words, it said to Google, "Hey, I'm a real page so come and crawl and index me!"

This was an enormous problem because Google was wasting a ton of its time crawling and indexing non-existent content on our site. But that was just one of the issues ...

The other (bigger) issue was the fact that a lot of these incorrect URLs had come about because people had linked to us incorrectly from other websites. Take this URL, for example: http://www.hubspot.com/products/inbound-marketing/. It has 370 links from 84 domains pointing to it, including a .gov link -- and it doesn't exist.

Links to our non-existent inbound marketing page

By redirecting all of these URLs to our "/not-found" page, which returned a 200 response, we were passing all of that PageRank through to the "/not-found" page instead of directing it to the correct URL or a relevant page. Yikes.

Here's a crazy stat: The "/not-found" page had over 8,000 backlinks from over 2,000 different domains. If only we were trying to rank that page ...

Links to Not Found page
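
Side note: you can check what your own dead URLs actually return in a few lines. Here's a quick sketch using Python's requests library -- the URLs are just the examples from above, so swap in paths from your own site:

import requests

# Example dead URLs -- swap in paths from your own site.
urls = [
    "http://www.hubspot.com/science-of-social-media/",
    "http://www.hubspot.com/products/inbound-marketing/",
]

for url in urls:
    # allow_redirects=False exposes each hop (301/302/404) instead of
    # silently following the chain to whatever page it ends up on.
    response = requests.get(url, allow_redirects=False)
    print(url, "->", response.status_code)
    # A dead URL should return a 404 (or a 301 to a real page) --
    # never a 200 from a catch-all "/not-found" page.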

What We've Done About It

The first thing that we've done is remove the 301 redirect that pointed every missing page on the offers.hubspot.com and www.hubspot.com subdomains through to the "/not-found" page. This means that if an incorrect URL doesn't have a 301 redirect set up to the correct page, it will now return a 404 and tell Google not to include it in the index.

The second thing that we've done is remove the "/not-found" page itself, so that its URL returns a 404 and isn't treated as a site page on our website.

The third and final thing that we've done is set up 301 redirects for all of the incorrect URLs that had been linked to, pointing them to a relevant or correct URL.
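
To make the logic of the first and third fixes concrete, here's a minimal sketch written with Flask for illustration -- our CMS obviously handles this in its own way, and the paths and targets below are hypothetical:

from flask import Flask, redirect, abort

app = Flask(__name__)

# Incorrect inbound URLs mapped to the pages they should resolve to.
# Both the keys and the targets here are made-up examples.
REDIRECT_MAP = {
    "products/inbound-marketing": "/products",
    "science-of-social-media": "/marketing/science-of-social-media",
}

@app.route("/<path:path>")
def catch_all(path):
    target = REDIRECT_MAP.get(path.strip("/"))
    if target:
        # An incorrect URL with real links pointing at it gets a 301
        # to the correct page, so the PageRank follows.
        return redirect(target, code=301)
    # Everything else returns a genuine 404 so Google drops it.
    abort(404)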

Why Is This Good?

The number of pages indexed by Google should drop dramatically, and Googlebot will focus on crawling the pages most important to us more frequently, rather than crawling a larger volume of URLs less often.

On top of this, all that lovely PageRank will be pushed into the content we actually want to rank, giving it a huge boost from the influx of links now pointing to the right places.

Problem 2: Blog Pagination

One of the things that was directly affecting the blog content on our website was the way that pagination was handled within the listing pages, i.e. blog.hubspot.com/marketing/page/1, blog.hubspot.com/marketing/page/2, etc.

All we had was a 'Next' and a 'Previous' button, and these hadn't even been styled properly. It wasn't so much a UX issue -- our visitors rarely clicked on them -- but it was a real problem for the search engines.

Old blog navigation

When Google crawls our site to find content, it has to follow links on our webpages until it can find the page it's looking for. To find a page that was written, say, a year ago, it had to navigate to the blog and then follow each 'Next' link until it reached the blog listing page with a link to that article.

Every time Googlebot (and any other search bot for that matter) follows a link, it's digging one level deeper in the website's architecture. The deeper it goes, the less authoritative the webpage is in the eyes of the search engines and the less it is crawled. In some cases, if a page is very deep in the architecture, it may not be crawled at all. 

What We've Done About It

We wanted to design the pagination navigation on the blog in a way that enabled Google to jump multiple pages at a time in its crawl and raise a large portion of our blog posts significantly higher in the website architecture.

To do this, we implemented the following navigation:

Our new blog pagination navigation
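
The screenshot above shows the design itself; as a sketch of the underlying idea, here's roughly how such a set of page links might be computed (the window size here is an arbitrary choice for illustration, not our actual rule):

def pagination_links(current, total, window=2):
    """Page numbers to link to from a given blog listing page."""
    # Always link the first and last pages...
    pages = {1, total}
    # ...plus a window of pages either side of the current one.
    pages.update(
        p for p in range(current - window, current + window + 1)
        if 1 <= p <= total
    )
    return sorted(pages)

# From page 50 of a 100-page listing, a crawler can now jump straight
# to page 1 or page 100 instead of following 'Next' 50 times.
print(pagination_links(50, 100))  # [1, 48, 49, 50, 51, 52, 100]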

Why Is This Good?

The inspiration for this solution came from my talented colleague, Pam Vaughan. Pam is heading up another project around republishing old content on the blog to push it higher in the architecture and ultimately rank better. (Click here to learn more about her historical optimization project.)

If this works well, we could see a number of blog posts receive a ranking boost as a result. It's a small, simple change, but it could add a ton of value. In any case, it's a big improvement to our blog content architecture and to the general UX of the blog.

Problem 3: Blog Schema Markup

To date, we hadn't used any Schema.org markup across our blog content (or any of our content, for that matter), which meant Google couldn't break down and understand the individual elements within our webpages. Whoops.

What's that?

In layman's terms, Schema.org markup is used so that the search engines can understand what type of content is on your webpage(s).

What We've Done About It

In the case of our blog, we marked up the code on all of our blog posts to tell Google the following things:

1. This is a blog post.
2. This is the featured image of the blog post.
3. This is the date and time it was published.
4. This is the headline of the article.
5. This is the body content of the article.
6. This is the category which the article falls under.
7. This was published by HubSpot.
8. This is the name of the author of the post.
9. This is the URL of the author's page.
10. This is an image of the author.
11. Here's a brief description of the article.

You can actually check this out by using Google's Structured Data Testing Tool. Simply click on Fetch URL, enter the URL of one of our blog posts, and then click Fetch and Validate. Once you do that, the tool will show all of the data under the BlogPosting dropdown.
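
Our actual markup isn't reproduced here, but to make the list above concrete, here's a sketch of the same fields expressed as JSON-LD, built in Python -- every value below is a placeholder:

import json

blog_posting = {
    "@context": "http://schema.org",
    "@type": "BlogPosting",                          # 1. a blog post
    "image": "http://example.com/featured.jpg",      # 2. featured image
    "datePublished": "2015-07-15T09:00:00Z",         # 3. publish date/time
    "headline": "Example Headline",                  # 4. headline
    "articleBody": "Body content goes here ...",     # 5. body content
    "articleSection": "Technical SEO",               # 6. category
    "publisher": {"@type": "Organization",
                  "name": "HubSpot"},                # 7. publisher
    "author": {
        "@type": "Person",
        "name": "Matthew Howells-Barby",             # 8. author name
        "url": "http://example.com/author-page",     # 9. author page URL
        "image": "http://example.com/author.jpg",    # 10. author image
    },
    "description": "A brief description ...",        # 11. description
}

# Embed the result in the page inside a script tag:
print('<script type="application/ld+json">\n'
      + json.dumps(blog_posting, indent=2)
      + "\n</script>")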

Why Is This Good?

This is a good thing because Google will be able to better understand our content and make more informed decisions on how to display it. Google often uses this data to help customise search result snippets and results within its Knowledge Graph.

It's not going to make a revolutionary impact but it's well worth doing.

Problem 4: Custom H1 & Intro on Topic Pages

Within our blog, we recently set up a new and improved Topics page. When it launched, I started looking at some of the individual topic pages and noticed that there was nothing to really differentiate them from one another. With that being the case, it was difficult to make a case to Google for why these pages should rank in the search engines.

What We've Done About It

For each of our blog topic pages, we had a generic heading that didn't explain what the page was about -- it would just say it was the HubSpot blog and show a feed of posts. We've now added a custom H1 relevant to each topic, such as "CRM Blog Posts," as well as a short custom description for the topic.

Here's an example:

Blog Topic Pages

Alongside this, we've added a "/topics" page to each of our international websites to improve the blog architecture.

Why Is This Good?

Previously, each of these blog topic pages looked the same, and there wasn't any unique content that could help them rank in the search engines at all. We've now added that, from both an SEO and a UX point of view.

Problem 5: HREFLANG Tags

Turns out, we had a pretty big HREFLANG tag mess on our hands. 

What's a HREFLANG tag? Well, HREFLANG tags are used to let Google know what alternative versions of a page exist in different countries/languages. For example, if we want to tell Google what the Spanish equivalent of our .com homepage is then we can use one of these tags to do this.

They work in a similar way to a canonical tag, which shows when a duplicate version of a page exists -- but in this case, they help Google index local content more easily within local search engines. (Click here to learn more about how they work.)

What We've Done About It

Previously, our HREFLANG tags were all over the place, and in most cases, implemented incorrectly. This wasn't doing our international SEO efforts any favours.

After a lot of hard work, we've managed to set these up for all of the core pages that have country-specific variations -- all the product pages, homepage, etc. An example is on the /products page:

<link rel="alternate" href="http://www.hubspot.com/products" hreflang="x-default">
<link rel="alternate" hreflang="en" href="http://www.hubspot.com/products">
<link rel="alternate" hreflang="de-DE" href="http://www.hubspot.de/products">
<link rel="alternate" hreflang="es" href="http://www.hubspot.es/products">
<link rel="alternate" hreflang="fr-FR" href="http://www.hubspot.fr/products">
<link rel="alternate" hreflang="ja-JP" href="http://www.hubspot.jp/products">
<link rel="alternate" hreflang="pt-BR" href="http://br.hubspot.com/produtos">

Why Is This Good?

We're now creating a solid link between our main site page content on the .com site and the content on our international domains. This will pass trust across our sites and improve the way Google crawls these pages.

Problem 6: Language Meta Tags

Ever heard of Language meta tags? Well, it turns out we were missing those, too. 

Language meta tags are similar to HREFLANG tags in the sense that they tell search engines what language a piece of content is written in. The tags should be present on all webpages so that search engines can easily understand which country's version of their search engine to index them into. Bing, in particular, uses these tags a lot. (Yes, Bing is still a thing.)

What We've Done About It

Up until now, we'd never had any language meta tags set up across any of our web properties, including our international sites. As an example, within our German site we now have the following code implemented on each page:

<meta http-equiv="Content-Language" content="de-DE" />

Why Is This Good?

Whilst this isn't going to give us massive spikes in traffic, it will help search engines crawl, index, and rank our local content more efficiently.

Problem 7: XML Sitemap

On our offers.hubspot.com subdomain, we house all of our offer content. In a nutshell, this is all of the content that we use to generate leads -- our ebooks, templates, webinars, etc. This is the content that we really want to rank well in the search engines.

Guess what? We didn't even have an XML sitemap set up for this subdomain.

What We've Done About It

We went through and created a brand new XML sitemap for all of our offers content and have submitted this to Google. (Want to learn more about sitemaps? Read this.)
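
For anyone curious what that involves, a bare-bones sitemap boils down to very little. Here's a sketch that generates one with Python's standard library -- the offer URLs below are made up:

import xml.etree.ElementTree as ET

# Hypothetical offer pages to include in the sitemap.
urls = [
    "http://offers.hubspot.com/example-ebook",
    "http://offers.hubspot.com/example-template",
]

urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

# Write sitemap.xml, ready to submit through Google Webmaster Tools.
ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8",
                             xml_declaration=True)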

Why Is This Good?

The architecture across this subdomain still needs quite a bit of work, but this will do a good job of helping Google discover any new content that we publish and get it ranking more quickly.

The Results

technical SEO results

The above graph speaks for itself, really. Moral of the story: Don't underestimate the power of technical SEO changes.

What technical SEO issues have you run into? Share your experiences in the comment section below.

Improve your website with effective technical SEO. Start by conducting this audit.
