
HubSpot SEO Strategy

SEO Strategy 2020

How would you describe what SEO is today? This is a nice, broad one to kick us off with. We’ll try and get through as many of these questions as possible and do our best to follow up as well.

What we thought would be useful to talk about here is something a lot of people struggle with when they start getting to grips with SEO: the worry that SEO moves quicker than you can keep up with and learn about. That's always a big stumbling block for people.


SEO Today

The thing we hear probably most frequently when talking to people about SEO is, “How do you stay up to date with things? Google is constantly pushing out updates, things are changing, etc.”

Describe SEO Today

The thing we always come back to here is that in the past 10 years of doing SEO, the foundations on which SEO is built – the core underlying principles – really have not changed that much. What has changed a whole lot is the people searching within Google. That's us, and all of you here, as consumers: the way we search has dramatically changed the way SEO is done and, more importantly, how the web works.

As a result, a lot of the technological development at Google and the other search engines – there are others like Bing, but primarily Google – has gone into natural language processing and natural language understanding, and those advancements are why Google has run away with so much market share in English-speaking markets.

1. How We Search: A good example of this is the way we search for a place to grab food – lunch or dinner – now versus how we did it even five or six years ago. If someone's in London right now, they're not from London, don't know the area and want to grab some lunch, they may just search Google on their mobile device, or speak into Siri or Google Assistant, and say, “Where are the best places to eat right now?”

That's a very conversational query – something that, five or six years ago, would have seemed very strange to see typed into Google, because the interpretation is that you are speaking to a robot as if it were a human, and that's not the way Google worked. Back then, people would probably have searched “lunch restaurant London”, speaking explicitly on a keyword basis.

2. What We Use to Search: Our habits have changed. Voice has shaped this, but so have mobile devices. 2014 was the first time that global mobile device usage eclipsed that of desktop, and it's worth not underestimating the impact that a simple change of device has on the way we search and the way we think about searching as individuals. That has a big knock-on effect on SEO: ultimately, the interface through which we are searching is the thing that has changed so dramatically, and it will continue to.
Voice search is an incredibly fast-growing area of search, and it's one of the reasons why, on any given day, something like 14% of all of the queries searched for in Google are net new queries, i.e. things that have never been searched for before. That gives you a sense of the scale of what Google has to keep up with: how do you show results for something you've never had to show results for before? This is a big part of why Google has introduced more intelligent technology.

Going back to 2015, that was when Google made probably the most significant change to their search algorithm. They overhauled it and introduced what was called RankBrain, which many people called an algorithm update, but it was much more than that. Ultimately, it was Google's biggest improvement in understanding and processing natural language – cutting-edge technology that enabled them to serve more relevant results by understanding the intent behind what people are searching for.

3. How Implicit Information Influences Results: A good example of this nuance is the restaurant query mentioned previously, which revolves around Google being explicitly told what to return results for. In the example of “restaurants London,” it's very clear: you have explicitly stated, “I want to see restaurants in the geographic region, London.” That's a relatively simple thing for Google to understand. Now, “Where's the best place to eat right now?” contains some explicit parts, but probably 90% of it is implicit.

What we mean by that is: “where” explicitly defines that you're looking for the location of something – an entity. “Eat” implies that you are looking for somewhere that sells food or that you can eat in. “Best” implies a high rating. But more importantly, you haven't said the location. So, within this query, there is a whole lot more that Google is gathering – for example, your IP address and your location based on that. They would also be looking at whether you are on a mobile device or on desktop.

If you're on desktop, you're probably looking at a much longer timeframe – if you're someone looking to eat lunch, you may be doing this search in the morning. If you're on a mobile device and on mobile data instead of Wi-Fi (all things Google gets as part of this query, right through to the exact device type you're using), then Google interprets it as, “Okay, this probably needs to be walking distance for this person, because they're on the move using mobile data.” They're going to narrow the radius in which the restaurants they show are located, and those restaurants need to be open right now. You haven't explicitly stated any of these things, but Google understands them, can process the query and can give you completely tailored results.

This is a big part of why, in SEO, the focus on ranking for individual keywords has become a whole lot tougher: you don't rank number one for “where's the best place to eat right now” because there is no universal number one ranking. Whether we search from our current location or one mile away, the results will be different. This is a big part of SEO today: understanding just how far Google's natural language processing has advanced.

For any of you that follow news within SEO, you've probably heard about Google introducing BERT, which is probably one of the most complex and impressive advancements in machine-learning-driven natural language processing. It's basically RankBrain – we would say 2.0, but it's more like 10.0 at this point – and it means that Google can really understand what the user wants without them having to say it, even at a much more granular level. That allows them to do things like show featured snippets or product carousels, so there's a whole lot going into SEO nowadays and much more of a focus on getting granular with the content you're serving up and tailoring it to a much smaller group of users than before.

EAT-ing Well

What is EAT and how do we increase it? What impact will it have on SEO from 2020 onwards?

Expertise Authority Trust Rating and Impact on SEO

Similar to some of the other things we previously discussed, this is where probably the most misinformation in SEO is right now. Going back to RankBrain and BERT being huge algorithm shifts: the first thing we always notice when things like that come up is that articles appear on how to optimise for RankBrain or BERT, or consultants all of a sudden start selling BERT optimisation services. The reality is you cannot optimise for those things. They are natural language processing algorithms – that's like saying “optimise for Google.” You are doing that anyway. It's very different from trying to optimise for a specific part of search or a specific element of an algorithm.

EAT – an acronym for Expertise, Authority and Trust – comes from Google's Search Quality Rater Guidelines. What those guidelines are for – and again, this is where a lot of misinformation happens – is Google's search raters: individuals contracted by Google to manually evaluate search results pages. Alongside the algorithms, Google has a manual layer of individual people who do QA and sense-check certain things. They'll be served up random results, go in, look at the site and ask, “Should this be ranking here?” They'll take those findings back, and they become a very small part of the process through which Google looks at adjusting the results pages.

More often than not, it's not just for those individual searches; they often use this on a sample of thousands and thousands of pages. It all goes into the guidelines document given to each of these search raters to use as a guide (which eventually leaked, because it's just a PDF file), and a big part of it is telling raters how to determine whether a site actually contains legitimate information – how to spot scam websites, how to work out whether something is fake news, and so on. They do this with a rating system that assesses whether the site demonstrates expertise – do the people or the brand that created this content have expertise in this area?

They'll give some guidelines on how you might determine that. Are they authoritative in that area as well? Are they trustworthy? Again, they give little playbooks, because a rater is manually going through a site. Now what happens is this goes out into the SEO community and all of a sudden there are EAT experts, articles on how to increase your EAT rating or EAT score and why it matters, and warnings that if you have bad EAT, you're going to tank in the rankings.

Just to clear it up for everyone here: this is not a thing to worry about. It is also not a ranking factor, regardless of what's shared online. Even though we don't always take official statements from Google at face value, there are official statements from Google that back this up. It also goes against all the fundamentals of search.

If this were a major ranking factor, Google would literally have to have hundreds of thousands, maybe millions, of people operating on this model, and it would defy the whole point of their search engine, which is a machine-learning-driven algorithm for understanding language and providing results. Having people manually review sites one by one doesn't scale. Where this does come in is for YMYL categories of sites – “your money, your life” – basically the categories of sites in areas like health or life coaching, where misinformation can be really harmful. If you just Google “your money, your life”, you'll see a list of those kinds of categories.

These are areas where, if misinformation is spread at scale and ranks really well in the search engines, it can be very dangerous. Getting bad health and medical advice is not good for your general health, and Google surfacing that information is not going to help any of the searchers. Now, where this comes into play: if your site has a manual penalty placed on it by Google and you are in the YMYL category, this can sometimes have a bearing when you're being manually reviewed and going through a reconsideration request – it is useful to still follow all of the things that Google says in terms of EAT. The thing we would say is: you don't need to take these things literally.

To take this a step back, there is very rarely a case where you would build out content and not demonstrate the things Google is describing around EAT. You're never going to actively avoid demonstrating that you are an expert or an authority in your area, nor are you going to actively work against building trust with the people reading it. There is some minimum viable stuff you can do on this level, but honestly your energy is much better spent elsewhere, and thinking about EAT as a ranking factor is just the wrong way to approach this. It's an important thing to clarify, because this is definitely one of the areas where there's a ton of misinformation.

Benchmarking and Expectation Management

What are the benchmark expectations to set for clients after 3, 6 and 12 months of SEO activity?

Benchmark Expectations to Set for Clients

Benchmarking and, more importantly, setting expectations are important, especially if you're selling SEO, but also if you're operating within a company where you have to answer to a leadership team, exec teams or your direct reports and have to build out a model for what can be expected from the inputs you're putting in. One thing I would always say is that there is an element of unpredictability within SEO, but there is always a logical process you can take to build out some kind of forecast. Forecasts, by their nature, are guides, and the more data you have over time, the better those forecasts become.

When you are starting an SEO project, in terms of expectations for a client within the first three months, it's pretty rare – especially if you're starting from scratch – that they're going to see much in terms of tangible organic traffic gains, unless you're taking over an existing site with a bunch of quick wins that you can start tapping into. The most important thing within benchmarking is that every individual website you focus on will be completely different from the next. There are so many variables to take into account to correctly build out expectations like this, but there are a few things we tend to do at the start of a project, or midway through, to start understanding it.

The first thing you want to do is build out a keyword opportunity model, or a universe of keywords. This is a bit like a TAM analysis – a total addressable market analysis – and the goal is to understand, over a timescale of, say, 12 months, the potential addressable search volume that we could go after. This requires a lot of upfront, pretty deep keyword research, and you probably want to bucket it into varying groups. So, what are some of the more transactional terms – the kinds of things people are going to be searching for that could turn directly into conversions?

Let's use the e-commerce example, which is much easier to do this for: we're going to look at product-related queries – people searching for our product types and the categories they fall within – and get an extensive list of all of those keywords. The main head keywords, really; and when we say head keywords, we mean the broadest variation of a term, “red dresses,” for example. Then you start adding modifiers to that, such as a brand name at the start if you sell multiple brands of red dresses, and so on, until you've built out a big list of those transactional keywords. You're not going much deeper than that to begin with.
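
Before moving on to informational queries, here is a minimal Python sketch of that head-keyword-plus-modifier expansion. The head terms, brand names and intent modifiers below are purely hypothetical placeholders; your own keyword research would supply the real lists.

```python
from itertools import product

# Hypothetical head keywords and modifiers for the red-dresses example;
# in practice these come from your own keyword research.
head_keywords = ["red dresses", "red maxi dresses", "red cocktail dresses"]
intent_modifiers = ["", "buy", "cheap", "sale"]   # "" keeps the unmodified term
brand_modifiers = ["", "acme", "styleco"]         # hypothetical brand names

transactional_keywords = set()
for intent, brand, head in product(intent_modifiers, brand_modifiers, head_keywords):
    # Join only the non-empty parts into a single query string.
    query = " ".join(part for part in (intent, brand, head) if part)
    transactional_keywords.add(query)

for keyword in sorted(transactional_keywords):
    print(keyword)
```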

Then you want to bring in the more informational queries, like “autumn/winter 2019 fashion trends” – content for people who are not looking to make a purchase but are in the related topic area, perhaps the kind of thing you would class as blog content. You build out a big list of the core topics you would assign to this brand and the queries being searched for within them.

There are lots of great tools you can use to do keyword research. A few examples we would pull out: (1) Ahrefs – a great SEO tool; (2) SEMrush. You probably only need one of those two competitors; they both do a great job and have great keyword research features, and both are paid-for tools. For free tools, you've got AnswerThePublic.com, a free keyword research tool you can start with, especially for informational searches. A newer tool that's really interesting is AlsoAsked.com: you can type a broad topic in and it'll give you a load of related questions and subtopics – great for throwing into HubSpot as well and then expanding further. There's also KeywordKeg.com and other lower-level keyword research tools, and that should be a good starting point.


When you've got all of these keywords and you've mapped them out across the different buyer journeys, you're probably going to have a pretty large list. It's not that these are all things you will rank for; it's that these are the things you could go after that are hyper-relevant to your business. What you want to do next is get the monthly search volume for each of these keywords. If you spend a decent amount on Google AdWords, you can get it through Google's Keyword Planner; otherwise, most keyword research tools will spit out the monthly search volume for you. Then sum it all up. If you've got, for instance, a total addressable search volume of 1 million monthly searches, you're able to say, “Okay, we believe that by the 12-month mark we'll be able to create enough content to go after 10% of these search queries, and that 10% equates to roughly a hundred thousand searches per month.”

Now, off the back of that, if we get page-one rankings across a decent portion of those – say 50% – and we start bringing through traffic, we can do some rough click-through rate analysis and model it out: “Okay, we believe we're going to go from zero traffic up to, say, 20,000 monthly visits on a mid-range estimate. Higher end, we may get closer to 30,000 to 40,000; lower end, it may be the 10K mark.” What you've started to do then is build a growth model with a lower quartile, an upper quartile and the total addressable market on top – a great visual to show clients and internal stakeholders, and it means you can assign key milestones that need to be hit along the way. For instance: we need to publish content around all these keywords; we need product pages with in-depth descriptions across all of these; we want a review scheme set up within each of them; we want consistent backlink building across this whole period to build our authority up so that we can start ranking.
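
As a rough illustration of that model, here is a short Python sketch that turns the numbers from the example above (1 million addressable monthly searches, a 10% target by month 12, 50% of targeted queries reaching page one) into a low/mid/high visit forecast. The click-through rates are illustrative assumptions, not measured figures.

```python
# A minimal sketch of the forecast described above, using the worked numbers
# from the text. None of these figures come from a real dataset.
total_addressable_volume = 1_000_000   # summed monthly searches across the keyword universe
targeted_share = 0.10                  # content created to cover ~10% of queries by month 12
page_one_share = 0.50                  # portion of targeted queries reaching page one

# Rough blended click-through rate assumptions for the low / mid / high scenarios.
ctr_scenarios = {"low": 0.20, "mid": 0.40, "high": 0.70}

targeted_volume = total_addressable_volume * targeted_share   # ~100,000 searches/month
ranking_volume = targeted_volume * page_one_share             # ~50,000 searches/month

for scenario, ctr in ctr_scenarios.items():
    monthly_visits = ranking_volume * ctr
    print(f"{scenario:>4}: ~{monthly_visits:,.0f} organic visits/month by month 12")
```

With these assumed rates the mid scenario lands at roughly 20,000 monthly visits, which is the figure used in the example above; adjusting any of the inputs shifts the whole range.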

Always come at it from a data point of view. Go in with caveats – you're not going to be able to promise exact numbers – but giving that broad range and the possible addressable market makes it so much easier for a client to come back and say, “Well, what if we doubled what we're doing – can we go after a bigger slice of that addressable market?” or, “This isn't quite enough for us in this region.” You can come back with, “Okay, this is the input required for us to get to that stage.” There's a lot you can do with that. That's the way we would always approach it.

Link Strategy Building Relevance

How is building a link strategy still relevant these days?

Building Link Strategy Relevance

It is incredibly relevant. It's the foundation of SEO. Link-building is not only the foundation of SEO, but it is the foundation of the web – the web works by pages interlinking between one another.

Ultimately, this comes back to one of the things we mentioned at the beginning: some of the foundations – the core underlying principles of SEO – have not changed over the years. One of those principles is that Google determines the authoritativeness of a web page by the number of backlinks it has from other web pages. The thing that has changed quite a bit is Google's ability to understand quality, not just quantity. When we started in SEO, it was a very different time; it was all about volume and pumping as many backlinks as possible into pages. Search results pages were much easier to manipulate as a result, because the relevance and quality of the pages you got backlinks from were not as much of a core focus. This led to the proliferation of things like blog comment spam, but nowadays Google is very good at this, and it's not just about volume.

Your page could have 10 backlinks and your competitor's page could have 100, yet you could rank higher than them. The reason is that Google looks at this – if we had to simplify it – in three buckets. You've got the volume of links you have, which is probably the least important of the three. The most important is the authority of the links you get from other websites, which, simply put, reflects how many high-quality backlinks the page linking to you has, and so on up the chain. Third is the semantic relevance of the pages linking to you, relative to your page and what it's talking about. For example, if you run a pet e-commerce store that sells dog treats, chew toys, beds, etc., getting a backlink from an authoritative financial services website is probably less valuable than getting a medium-authority link from a pets blog.

The semantic relevance of the pages you're getting links from is so much more important. We talk a lot about this, and we published an article way back in 2015/2016 when we first started talking publicly about the topic cluster methodology, which I'm sure many of you have read or at least follow as a framework. It all comes down to the same core principle: the topic cluster methodology is much more focused on interlinking within your own website, but it's built on the foundations of the broader web. Google still looks at how much authority is passed through internal links within your website in the same way it does for links passed from external web pages into yours. Just as you would build out content on your site interlinked around the same core broad topics, for links from external websites you want to ask, “Do they also fit within this broad topic?” Sometimes having a ton of backlinks from completely irrelevant topics can be more detrimental than beneficial.

We wouldn't worry about that at this stage unless you're getting serious volumes of backlinks, but when you're building out a link-building strategy, getting links from relevant websites that meet a minimum criterion of authority is really important. Determining the authority of websites manually is always tough. There are tons of tools that at least give you a rough idea: Moz, Ahrefs, SEMrush. Pretty much any backlink analysis tool will have its own version of authority scoring. Moz's is ‘Domain Authority’, a hundred-point scale. With Ahrefs, it's ‘Domain Rating’. Majestic has ‘Trust Flow’ and ‘Citation Flow’. SEMrush has its own, etc.

All of them are still estimates, but they’re good guidelines to go on. That would be our broad advice. In short, it's incredibly relevant and in our opinion always will be relevant.


The Future of SEO in the Age of Paid Ads

Is SEO going to become more irrelevant with more paid ads in SERP and now also Google Gallery Ads?

Future of SEO in Age of Paid Ads

No, it's not going to become irrelevant. The crux of what's being asked here is whether the increase in paid ads is going to negatively impact the amount of traffic we're able to generate from SEO – from organic listings – and the answer to that question is, yes.

We're already seeing that. In fact, there was some recent Jumpshot data released – they have clickstream data that can look into this sort of thing. Only – and this may be a couple of percent off, so cut us some slack – something like 41.5% of all searches made in Google result in a visit to an organic listing that isn't owned by Google. Many people assume that every search made results in a click – it doesn't. In fact, nearly half – about 49% – of all searches do not result in a click-through to a website at all, and that share has grown dramatically, especially on mobile devices, because of the likes of featured snippets, which some of you may know as answer boxes.

There are Google Gallery Ads, but there's also a ton of SERP features that give you the answer without you having to click. Then you've got the roughly 41.5% of searches going to organic clicks not owned by Google; the rest go either to paid ad listings or to organic listings on properties like Google Flights or Google Jobs – organic listings, but on web properties that Google owns.
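
As a back-of-the-envelope illustration of that split, here is a small Python sketch using the approximate figures quoted above (about 41.5% of searches ending in a non-Google organic click and roughly 49% ending in no click at all); the next paragraph walks through the same arithmetic in prose. The position-three click share is a hypothetical number added purely for illustration.

```python
# Rough sketch of the click distribution described above. The shares are the
# approximate figures quoted in the text; the per-position share is hypothetical.
monthly_searches = 1_000

organic_non_google_share = 0.415
zero_click_share = 0.49
paid_or_google_owned_share = 1 - organic_non_google_share - zero_click_share

organic_clicks = monthly_searches * organic_non_google_share
print(f"Organic clicks available per {monthly_searches} searches: ~{organic_clicks:.0f}")
print(f"Share going to ads or Google-owned listings: ~{paid_or_google_owned_share:.0%}")

# If you rank, say, third, you only capture a slice of that organic pool.
assumed_share_at_position_3 = 0.15   # illustrative only, not a measured CTR
print(f"Clicks at position 3: ~{organic_clicks * assumed_share_at_position_3:.0f}")
```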

So, out of 1,000 searches made for a query every month, only around 400 or so result in a click to an organic listing – and if you're not ranking number one, you're getting an even smaller share of those. Within all of this, it has a big effect, and mobile is where things are getting eaten up the most. As many of you have probably noticed, ads in Google Search used to carry big green labels that said ‘Ad’ and you could clearly see they were ads; but slowly and surely Google introduced more ads, and more green, and then – since around March this year, when they did it for the first time – they turned into really small black ‘Ad’ labels that you can barely see without squinting.

All of a sudden, they're blended and look exactly like organic search results. We're seeing over time, especially in the past 12 to 18 months, the number of clicks per search going down and down, and mobile is driving this heavily. So, the answer is: it's absolutely not going to become more irrelevant, but you are going to be squeezed a lot more. And because you're being squeezed by ads, this is a time to think about whether you should also be pushing out ads alongside your organic search listings. Paid acquisition and organic search are complementary channels; they're not enemies.

That's one of the most important things to think about, and it's a trap that people who have been involved in SEO for the long haul sometimes fall into: thinking of paid ads as the opposite of organic. It's very much a complementary channel to think about.

Evaluating Competition

How do you evaluate the competition and the possibilities you have to reach the highest position within the SERP?

Evaluating Competition Highest Position in SERP

This follows on from the questions around setting expectations. You've set the expectation with the client, you've shown them the total addressable search market for their business and what they can potentially go after, and you've said, “If we rank really well for this group of keywords, we may get X amount of traffic by X amount of time.” The next question you need to ask – and the one the client may well ask – is, “How feasible is it for us to rank for X amount of these queries within that time?”

It often comes down to your ability to compete within this market, and every keyword has a differing level of difficulty to rank for, but this is where you can start getting a little more granular. Usually, the more searches a keyword has per month, the tougher it is to rank for. There are always exceptions to the rule, and if you can find them, they're the real gold mines within SEO, but largely this rule applies. The reason it applies is that more search volume means higher traffic potential, higher traffic potential means higher potential profit, and higher potential profit means more money other companies are willing to invest into ranking for it – exactly the same dynamic as bidding on a CPM-like basis in paid acquisition.

Now, the best way to start evaluating this – and this model applies whether your site has existing traffic or is brand new – comes down to your authority within the topic. The first thing is topical relevancy, and the next thing to layer in is authority. If you are, as in our previous example, an online store that sells pet products, specifically for dogs, and you have a ton of content already built out around that core topic and backlinks from other pet blogs, suppliers of pet goods, maybe pet health and veterinary care websites, then you have authority within that topic. The further you stray outside that topic, the tougher it gets to rank for those keywords.

The best way to figure out your ability to compete within your broad topic – the dog and pet space in this example – is not necessarily some expensive, sophisticated tool; it's doing the thing most people in SEO neglect because it's so obvious: Googling it. By that we mean Googling the keyword, seeing which competitors rank right now and looking at a couple of things: the general level of authority those websites have – again, you can use metrics from free and paid SEO tools to get a rough benchmark – and how relevant those websites are to this core topic.

The sweet spot is a keyword where the page-one results are full of websites that are not overly relevant and that have a relatively level amount of authority. Those are the results pages you can feel pretty good about going after and doing some damage in, as opposed to the ones made up of the big players in the space that have been there for a long period of time. The other big thing: if you track a search results page and see that the listings don't change much and are made up of big names – Amazon product pages, the top products that have been there, continue to stay there and have thousands of links pointing at them – those are much tougher to move off.

What we tend to do here is say, “Right, if it's relevant and we're going after ranking one of our core product pages: how many backlinks does our core product page have right now?” Then we look at the top 10 listings on page one (or the top five, whatever you want to do) and check how many backlinks those individual pages – not the whole websites – have. Run an average; let's say on average those pages have between 20 and 30 backlinks. Then you can say, “My benchmark is: how long will it take me to get this page to, say, 25 backlinks?” That's when you can start putting some broad, rough timelines on things.

“It takes my team roughly a month to build eight backlinks, and we're doing that through guest posting, partnerships, etc.” As a result, you reckon it's going to take roughly four months to get close to the level where you can rank really well. There you go: you've got a baseline target and a baseline timeline, and you can start modelling that out for a bunch of different key terms. Paired with your total addressable market analysis, you've now got both a potential opportunity and a competition-based timeline. What you want to do is keep revising this over time. How well did our forecast match the actual outcome? What was the delta? Start adjusting your forecasting accordingly; it gets easier over time, because you'll develop a general sense of how well that site, or your client's site, ranks out of the gate for some of these keywords, and you can build off that.
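
Here is a minimal Python sketch of that backlink-gap timeline, using illustrative numbers in line with the example above (competing pages averaging somewhere between 20 and 30 backlinks, and a team building roughly eight links per month). The individual page counts are hypothetical.

```python
import math

# Hypothetical backlink counts for the top five competing pages; in practice
# you'd pull these per-page counts from a backlink analysis tool.
competitor_page_backlinks = [22, 31, 18, 27, 26]
our_page_backlinks = 0
links_built_per_month = 8

target = sum(competitor_page_backlinks) / len(competitor_page_backlinks)   # average ≈ 25
gap = max(target - our_page_backlinks, 0)
months_to_parity = math.ceil(gap / links_built_per_month)

print(f"Benchmark: ~{target:.0f} backlinks on the competing pages")
print(f"Rough timeline to close the gap: ~{months_to_parity} months")
```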

Cluster Topics

How granular should cluster topics be?

How Granular Cluster Topics Should be

There's no one-size-fits-all approach here. In all honesty, you should not answer this question by focusing on all of the potential content you could create. It should be driven either by potential search demand or by potential conversion opportunities. What we mean is: say we're going after the broad topic of pet products – or let's go one layer deeper than that: dog toys. You could get incredibly granular with the type of content you create just by blasting out a bunch of different ideas; we could probably list a hundred different things we could create. Does that mean you should create content all the way down to that level? No.

What you need to do is start by building out the keyword universe around this one topic. Then you find out how many times each of these terms is searched for every month, and decide the minimum viable amount of search volume you're going to look to capture for this whole topic to make it actually worth your time. For some subtopics that convert really well for you, that search volume benchmark may be much lower, because you know you're going to get a higher conversion rate and can justify it at a revenue or conversion level. For broad, awareness-level search volume, you need a cut-off point where you say, “This is just not viable for us to go this granular, because no one is searching for it and no one is going to discover it.”
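
A minimal sketch of that cut-off logic, assuming hypothetical subtopics, search volumes and thresholds: keep a subtopic in the cluster only if its monthly search volume clears the bar, with a lower bar where you expect a higher conversion rate.

```python
# Hypothetical keyword-universe data for the dog toys example; the volumes,
# conversion flags and thresholds are illustrative only.
subtopics = {
    "best dog toys": {"monthly_volume": 6_500, "high_converting": False},
    "dog toys for small dogs": {"monthly_volume": 900, "high_converting": True},
    "diy rope dog toys": {"monthly_volume": 40, "high_converting": False},
}

awareness_threshold = 500      # minimum viable volume for broad, awareness-level content
conversion_threshold = 100     # lower bar where we expect a higher conversion rate

for name, data in subtopics.items():
    threshold = conversion_threshold if data["high_converting"] else awareness_threshold
    decision = "create content" if data["monthly_volume"] >= threshold else "skip for now"
    print(f"{name}: {decision} ({data['monthly_volume']} searches/month)")
```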

The only exception to these rules is where search is not the primary reason you're building out the content. Maybe that's customer support content. Maybe it's sales enablement content you can share with the sales team to help close deals better. Maybe it's an emerging topic that you anticipate will get more interest and demand over time, and you want to be an early adopter, so you start building content now in order to rank well later. That's the framework we would give for building out content within topics, versus the very granular question of “How many pieces of content should be in a topic cluster?” and “Should it be four or eight?” That's not the way to approach it.

How big is the potential universe of keywords being talked about in this topic? What's the potential search volume you can go after? There was a big discussion of this inside Traffic Thinktank not long ago, and everything that came out of it kept coming back to search volume. It's also where the tool AlsoAsked.com was first shared with us, which we highly recommend. It's in beta right now and it's free – go and check it out, because it's super useful for answering this question and building out from a core topic.

Would you rather start with the pillar page or with the subtopics?

SEO Advantage Pillar Pages

To take a step back, the reason we initially talked a lot about topic clusters with pillar pages and subtopic pages is that the idea of the pillar page, more than anything, is to be the central part of a topic that covers its broadest aspect. Often, in the informational query domain, this is the overarching guide focused on that core topic, and it can be a great conversion point. Sometimes the pillar of a topic cluster could be a product page, if that's actually where you want people to convert and you're building subtopics around it. Usually, we would say start from that point. That said, this isn't black and white: you could start by building out more informational content, and it comes back down to where the most demand for information is. The beauty of building things in topic clusters, as we refer to them, is that you're creating content around individual topics that all interlink with one another. As you build more and interlink between them, it creates a good user experience, you're building real depth of semantic relevance around that topic on your website, and it gives you a great way of passing authority between all of the pages involved.


Is there a perfect length for blog posts? Is it enough for it to be 600 words or is it better for it to be 2000 or is it not important at all?

Length of Blog Post

The length and format of a piece of content are very important. What is absolutely not the case is that there is a one-size-fits-all or perfect length for content. This is yet another thing that is shared frequently in SEO, and over the years we have probably shared something similar ourselves when talking about individual search studies. If you pull a list of a million URLs that all rank on page one, primarily for informational queries – and we did this a couple of years ago – and plot the word count of all of those posts, you can easily draw a positive correlation between long-form content and page-one rankings.

Now, that does not mean all of your content should be long-form in order to rank on page one. In fact, one of the things we talked about right at the start of this session was Google being able to understand the intent of searches and serve much more granular content as a result. One of the biggest shifts we've seen in the past 18 to 24 months is that long-form content is driving a lot less traffic. Previously you'd have really long-form content – three-, four-, five-thousand-word pieces – focused on a broad topic; with pillar page content, for instance, the advice a lot of the time was to keep it very long-form. That's not always the case now, because what used to happen is that you would rank for the really broad term and then also rank for loads of variations of that search.

For example, someone searching for “dog toys” or “best dog toys” or “kid-friendly dog toys” or “dog toys for small dogs” – one big guide on the best dog toys used to rank for all of those. Now it might rank for, say, one-fifth of those queries, because Google is able to pick up the slight nuances in what is being searched for and serve much more granular results tailored to the very specific nuance of each query. This is why you may hear that there has been a real increase in the number of unique search results pages: for that set of ten similar queries, the same search results page once showed for all of them; now we see different results for every single one of those very similar queries, and it means longer-form content ranks for a lot fewer of them, because it's less of a one-size-fits-all.

Similarly, sometimes short-form content is not enough – and sometimes long-form content is terrible, as many of you here can probably relate to. Even the idea that “you have to have content that's at least four or five hundred words, otherwise you can't rank” is wrong. For example: you're sitting in the cinema watching some superhero movie, the end credits start rolling, and you sit there with your partner or a friend wondering, “Is there an end-credit scene? Do I sit through all of these credits to find out, or do I leave the cinema and potentially miss something?” So you frantically pick up your phone and Google, “Is there an end credit scene for Avengers: Endgame?” You get the results, you click on some of them, and they're 5,000-word memoirs covering the entire plot of the film, with reviews and things about the characters – and then, right at the very bottom of this thing that's covered in ads, is a “No.” By the time you've read through the whole thing, the end credits are finished.

The thing is, for queries like that, you're now starting to see that something as short as just the word “no” is able to rank. The key thing here – to tie this together into our advice for this situation – is that it goes back to Googling it. If you're trying to rank for a query and you're deciding whether to write something a bit shorter or a bit longer, and how many words, have a look at what's ranking there right now. The tendency across many things in marketing is to say, “Let's look at what's happening right now and do something different; let's make ours unique.” Do not do that in SEO! It's often very counterintuitive, but the worst thing you can do in SEO is try to reinvent the wheel.

What Google is telling you by showing you the first page of results is, “These are the exact things our users want, this is how they want them, and this is how we're going to rank things.” Follow the patterns of what is already ranking. If the average word count on the top page of Google is a thousand words, shoot for that. It's not going to guarantee you rank, but it lets you match the kind of result the searcher wants – and that's Google's goal; everything else follows from it. This will be different for every query you build around, and simply applying a blanket “we're going to do 2,000 words for all of these” could be a massive waste of your time; it has no logic other than simplifying something for scale, and you'll only end up reworking it in the future.
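
As a simple illustration of that benchmarking step, here is a short Python sketch that averages the word counts of the current page-one results and suggests a target range around that average. The word counts listed are hypothetical; in practice you would gather them by reviewing the ranking pages yourself or with an SEO tool.

```python
# Hypothetical word counts for the ten page-one results of a query.
page_one_word_counts = [950, 1_100, 870, 1_300, 1_020, 980, 1_150, 900, 1_050, 1_000]

average = sum(page_one_word_counts) / len(page_one_word_counts)

# Aim roughly at what already ranks rather than at an arbitrary fixed length.
print(f"Average word count on page one: ~{average:.0f} words")
print(f"Suggested target range: {average * 0.8:.0f}-{average * 1.2:.0f} words")
```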

It's also, as a side note, a really good way to do a bit of a content audit. We've seen countless case studies where people had tons of really long-form content, looked at the other competitors in the SERP and saw that they all had a word count a fifth of theirs, cut out four-fifths of their content and actually ended up ranking way better. That's not to say you should just go shedding half of your content across everything you have, but if you start to see that pattern in the search results, it's probably a good thing to test.

Tools for SEO Strategy

The biggest takeaway from everything we've discussed is that there really isn't a one-size-fits-all approach you can apply to content in search, and you can learn so much from simply Googling the thing you're trying to rank for, versus taking the approach of “Okay, we're going to put this into some expensive tool that's going to tell us a one-size-fits-all strategy.”

Use common sense here – and that common sense is delivered to you by Google more often than not.