How to Survive a Rebranding (The SEO Works, Wed, 14 May 2025)
https://www.seoworks.co.uk/how-to-survive-a-rebranding/

Image showing two theatre masks encompassed by a large green circular arrow

The post How to Survive a Rebranding appeared first on The SEO Works.


“Rebranding” can feel like a scary word in the world of SEO and marketing: all the work put into building trust around your current brand seems to change overnight.

The process of rebranding also requires a lot of work as there are many moving parts, and it is vital that everyone who has a hand in the campaign is kept in the loop throughout to ensure no information is lost and that the transition happens smoothly. 

A rebranding doesn’t have to be scary; you just need to be well-prepared and have the data to support you in any eventuality. So, if you’ve decided it’s time for a spring clean, it might be good to figure out where to begin.

Types of Rebranding

A rebrand doesn’t always mean a change of brand name. Although that is one aspect of it, rebranding could also mean changing the name of a product, or changing the theming or structure of your website.

Ensuring that a site rebranding runs as smoothly as possible and that the site as a whole is successfully adapted and rebranded without losing its SEO value is of utmost importance.

The site has built up organic visibility and equity over time which has resulted in obtaining rankings and visibility increases. It is important to protect this foundation throughout the rebranding process.

The main idea is to keep the old brand and its products/services relevant whilst also building relevance for the new changes.

Lots of client reassurance, guidance and support is needed during this time, and it is good to share some best practice to make sure everyone is on the same page.

Goals of a Rebrand

The goal of any rebranding is to complete the change while mitigating traffic and ranking losses as much as possible; a thorough SEO rebranding strategy ensures major losses are prevented wherever possible. 

When you change your URLs, content, and the website overall, you change what search engines know about your website and the metrics they’ve applied to it to give you your rankings.

It is important to make changes slowly to allow the search landscape to adapt to your new changes.

Building a Good SEO Strategy Before Rebranding

When overseeing a website rebranding or change of any kind, it is important to develop a strategy that focuses on elements of SEO that we can begin monitoring to aid with the process.

Having a strategy in place will allow us to provide a framework for supporting the rebranding process.  

Not only this but it will help us to retain current keyword rankings as best as possible, prevent the occurrence of broken links across the site and ensure that content changes incorporate tracked keywords.

There are lots of types of rebranding changes that could be made, so the strategy should ultimately reflect your client’s needs.

Setting Appropriate Redirects

The most important step in a site rebranding process is to set up appropriate redirects across the site where changes have been made.

A 301 redirect, if implemented and managed correctly, will pass the current SEO equity from the original version of the site to the new version and lead users to relevant target pages.

The complexity of the redirect set-up depends on how many changes are made to the overall structure of the site. 

As the rebranding is an opportunity to improve the overall site structure and user experience, the set-up of redirects can be a little more complex as redirects will likely have to be added on a per-page basis. Although time-consuming, this will prevent redirect errors from being made and also stop redirect chains and loops from being created.
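Before launch, the redirect map itself can be sanity-checked for chains and loops. Here's a minimal sketch over a hypothetical old-URL-to-new-URL mapping (in practice you'd export this from your CMS or a crawler):

```python
def find_chains_and_loops(redirects):
    """Return (chains, loops): chains are paths longer than one hop,
    loops are paths that revisit a URL."""
    chains, loops = [], []
    for start in redirects:
        path = [start]
        seen = {start}
        current = start
        while current in redirects:
            current = redirects[current]
            if current in seen:          # revisited a URL: redirect loop
                loops.append(path + [current])
                break
            path.append(current)
            seen.add(current)
        else:
            if len(path) > 2:            # more than one hop: redirect chain
                chains.append(path)
    return chains, loops

# Hypothetical redirect map: old URL -> new URL
redirect_map = {
    "/old-services": "/services-temp",   # chain: two hops to the final URL
    "/services-temp": "/services",
    "/a": "/b",                          # loop: /a -> /b -> /a
    "/b": "/a",
    "/old-about": "/about",              # clean single-hop redirect
}
chains, loops = find_chains_and_loops(redirect_map)
```

Any chain found should be flattened so the old URL redirects straight to its final destination, and any loop must be broken before the redirects go live.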

A nice thing to do would be to set up a staging site for any content changes or new pages. This will enable you to make changes and implement redirects without officially publishing the content, allowing room for error and room for fixes or improvements too.

If you (or your client) are managing the rebrand internally, it is important to be aware of these points:

If there are pages that you do not intend to keep on the new site and there is no other page with similar or relevant content to redirect to, then it is best not to redirect at all; instead, noindex them or serve a 410 status code to tell Google the page has permanently gone.

There is often a temptation to redirect these types of pages to the site’s homepage, however, search engines often see this as spam, and it generally results in both pages being dropped from indexing.

Ensure that you have plans to optimise your new 404 page to reflect your rebranding and provide users with other pages to access.
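Those points boil down to a simple decision rule per retired page, sketched below. The relevance scores and threshold here are hypothetical stand-ins for editorial judgement, not a real metric:

```python
def plan_removed_page(old_url, best_match, relevance, threshold=0.5):
    """Return the action for a page not kept on the new site:
    301 to a genuinely relevant page, otherwise mark it gone (410)."""
    if best_match and relevance >= threshold:
        return ("301", best_match)   # redirect to the relevant target page
    return ("410", None)             # no relevant target: tell Google it's gone

# Pages being retired, with their closest candidate on the new site
action1 = plan_removed_page("/old-guide", "/new-guide", 0.9)
action2 = plan_removed_page("/discontinued-promo", "/", 0.1)
```

Note that the low-relevance page falls through to a 410 rather than being pointed at the homepage, in line with the advice above.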

What to do in Case of a Site Name Rebrand

If you’re dealing with the most typical rebranding scenario, then you may wish to change your brand name or URL.

If you move from ‘brand X’ to ‘brand Y’ people will still search for the old brand name.

You will likely require a migration to a new domain, which is accompanied by a new site design that reflects the rebranding.

In this scenario, it is important that you have exported all current site data for benchmarking at a later date once the new site is up and running. 

The first step would be to 301 redirect every page of the old domain to its new destination. If possible, keep URL structures the same or similar, as this will make redirecting simpler.
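If URL structures are preserved, the redirect map can be generated mechanically, with explicit overrides for any paths that changed. The domains and paths below are hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST, NEW_HOST = "www.brand-x.co.uk", "www.brand-y.co.uk"

def map_to_new_domain(old_url, overrides=None):
    """Swap the host, applying per-path overrides where the structure changed."""
    parts = urlsplit(old_url)
    path = (overrides or {}).get(parts.path, parts.path)
    return urlunsplit((parts.scheme, NEW_HOST, path, parts.query, parts.fragment))

# Most URLs keep their path; a few restructured pages need explicit targets
overrides = {"/our-story": "/about-us"}
mapped = map_to_new_domain("https://www.brand-x.co.uk/services", overrides)
remapped = map_to_new_domain("https://www.brand-x.co.uk/our-story", overrides)
```

The output of a script like this becomes the per-page redirect rules you implement on the old domain.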

Creating a new page in the new domain that is about the old brand can serve as a bridge page to help target and rank for the old branded queries. This will also help to explain the reasons for the rebranding to your customers and help with the transition to the new brand name.

You could also add explanatory text to each page of the site for a short while if you feel it is necessary, through the form of a pop-up or banner.

Next, you should audit the URLs of the old site via Google Search Console or SEMrush/Ahrefs to see which of the old site pages are currently performing for branded terms. It is likely that these pages will need to be dealt with in a similar way to the homepage and can link to and from the bridge page on the new site too.

You should also ensure Google My Business information, and all other NAP (name, address, phone number) details across the web, including social media, are updated as this may impact local search.

Lastly, complete a thorough backlink audit so you can ensure all outreach links have been updated and correctly point to the new site.
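A backlink export (from a tool such as Ahrefs or Semrush) can be filtered to find links that still point at the old domain and therefore need outreach. The rows below are hypothetical:

```python
def links_needing_outreach(backlinks, old_host):
    """backlinks: list of (referring page, target URL) tuples.
    Return the rows whose target still points at the old domain."""
    return [(src, target) for src, target in backlinks
            if old_host in target]

backlinks = [
    ("https://news.example.com/review", "https://www.brand-x.co.uk/product"),
    ("https://blog.example.org/roundup", "https://www.brand-y.co.uk/product"),
]
stale = links_needing_outreach(backlinks, "brand-x.co.uk")
```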

What To Do in Case of a Product Rebranding

Although a product name rebrand doesn’t require a full domain migration and only affects a number of pages across the site, it often involves the most important pages on a site in terms of conversions, and therefore needs full attention and support.

There are 3 ways this could go:

  • The names overlap (e.g. Yellow → Yellow B)
  • The names somewhat overlap (e.g. Yellow → Red Yellow)
  • The names are completely different (e.g. Yellow → Blue)

When Product Names Overlap

If the product names overlap, then it may be possible to keep the old product URL and simply update the content to target the new product name.

If you have to have a new page entirely, you should move the old product URL to the new product URL via 301 redirect. 

It is also good to make sure you have updated any structured data/schema if it’s implemented, and most importantly, the page content. You could also use explanatory text via a banner or a pop-up on the new product page if you feel it is necessary.
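If a Product schema block is present, the rename might be applied like this. The JSON-LD fields here are a hypothetical example, not taken from any particular site:

```python
import json

# Hypothetical Product JSON-LD. After a product rename, the structured
# data must match the on-page content, or rich results may show stale names.
old_schema = json.dumps({
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Yellow",
    "sku": "YLW-001",
})

def rename_product(schema_json, new_name):
    """Update the name field of a Product schema, leaving other fields intact."""
    data = json.loads(schema_json)
    if data.get("@type") == "Product":
        data["name"] = new_name
    return json.dumps(data)

updated = rename_product(old_schema, "Yellow B")
```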

When Product Names Somewhat Overlap

If the product names somewhat overlap, you can leverage the old product page to help establish the new one more quickly.

Firstly, you should 301 redirect the old product page to the new one to maintain brand relevance. You may also choose to create a bridge page to target old brand queries if necessary. There is also the option to include explanatory text on the new product page in the form of a banner or a pop-up.

You can then link to the bridge page from the new product page and to the new product page from the bridge page. It is also vital to update any existing internal links and place in any new internal links that may be deemed necessary now the product name has changed.

When Product Names Are Completely Different

If the product names are completely different, you can still leverage the old product page to serve as a bridge page that will explain the rebrand, keep the brand relevance (and rankings) for old brand queries whilst also linking to the new product page. 

A new product page for the new brand should then be created to target new brand queries. This new page should be linked internally, and all links should be updated. It is also important to update your outreach links if necessary.

What To Do in The Case of a Site Restructure

Site restructures are tricky: although they aren’t technically a rebrand, users will be landing on a different version of your site, which can impact user experience and conversion funnels. Click-through rate or site engagement may suffer if users don’t feel familiar with the new set-up.

There are lots of moving parts to a site restructure, and it takes a lot of time, care and attention to successfully implement and follow through with a full restructure to ensure the process runs smoothly.

URL Auditing

To enable you to identify what SEO value a site structure currently has and what is worth maintaining during the restructure, an audit of all current URLs will need to be conducted to look for high-value content, key backlinks, and to assess the site’s current ranking positions within SERPs. 

All current URLs need to be taken through this process, as it will also allow you to set up efficient redirects later on in the process. 

You can also use this opportunity to see which URLs have lower SEO value (i.e. pages that are no longer relevant to the site, or that have little high-value content on them). These can then be optimised to a greater extent during the rebranding process or redirected appropriately. 

Conducting a URL audit will also allow you to identify duplicated content across the site. It is important to highlight where these areas exist and decide how to best proceed in the context of the rebrand.
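As a starting point, exact duplicates can be surfaced by hashing each page's extracted content. The URLs and content below are hypothetical, and near-duplicates would need fuzzier matching than this:

```python
import hashlib

def duplicate_groups(pages):
    """pages: dict of URL -> extracted main content.
    Group URLs whose content hashes to the same value."""
    groups = {}
    for url, content in pages.items():
        key = hashlib.sha256(content.strip().lower().encode()).hexdigest()
        groups.setdefault(key, []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

pages = {
    "/red-widgets": "Our best-selling red widgets.",
    "/widgets-red": "Our best-selling red widgets.",
    "/blue-widgets": "Our premium blue widgets.",
}
dupes = duplicate_groups(pages)
```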

Identify High-Performing Pages and Keywords

Another important step is to identify which of the tracked keywords are bringing in the most traffic to the site, as well as the top performing pages for those keywords.

You should look at pages that are sitting within positions 1-20 (or 1-30 if the site is still in its infancy) and use this as an opportunity to identify keywords that the client may want to target more heavily in the rebrand than on the current site.
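A rough sketch of that filtering step, using a hypothetical rank-tracking export:

```python
# Hypothetical export rows: (keyword, ranking URL, position, monthly searches)
rankings = [
    ("blue widgets", "/blue-widgets", 3, 1900),
    ("widget repair", "/repairs", 24, 480),
    ("red widgets uk", "/red-widgets", 11, 720),
]

def priority_pages(rows, max_position=20):
    """Keep rows ranking within the window, highest search volume first."""
    keep = [r for r in rows if r[2] <= max_position]
    return sorted(keep, key=lambda r: r[3], reverse=True)

shortlist = priority_pages(rankings)        # positions 1-20
wider_net = priority_pages(rankings, 30)    # looser window for young sites
```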

Identifying the top-performing pages with the highest SEO value will ensure that we are preserving content that is useful to the site during the rebranding process. 

You should ensure that if structural updates are made to these pages, such as layout or sentence changes, the overall topics and high-volume keywords remain the same where possible.

Preserving content is also important to consider during the redirect process, especially when making alterations to the structure or layout of pages across the site. If you are combining pages, it is important to incorporate relevant keywords to each page.

While many SEO elements will transfer via redirects, keywords can only be maintained if they are appropriately incorporated throughout the rebranded content and metadata.

Backlinks are just as important as internal links and content when it comes to rebranding, especially if the brand name has changed or a product name has changed. 

With the help of a backlink audit, you can track any changes that may occur during the rebranding process and make key changes and fixes if necessary. 

Once the rebranded site is published, you will then need to look over the backlink audit once more and complete some link update outreach. This involves contacting the companies, journalists and bloggers who have included a backlink to your site and asking them to change the old link to reflect any URL changes or updates that have occurred during the rebranding process.

Post Publication

Once the rebranding is completed and launched, it is vitally important to submit a new version of your sitemap.xml file to ensure that search engines are aware of the changes made to the site structure. 
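A minimal sitemap for the rebranded URL set could be generated along these lines (the URLs are placeholders); the resulting file is then submitted via Google Search Console:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

sitemap = build_sitemap([
    "https://www.brand-y.co.uk/",
    "https://www.brand-y.co.uk/services",
])
```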

After the launch date, closely monitor site analytics to ensure that everything is still running smoothly and as you want it to. Also, look out for any technical issues and implement immediate fixes if necessary to keep URLs updated.

Expect to see some ranking and traffic drops in the period directly after the rebranding while Google updates its index. However, rankings should steadily improve again once the changes have been processed.

So there you have it, you’ve just survived a rebranding, and it doesn’t need to be as scary as it’s first made out to be! With a few simple steps to ensure a smooth and reliable process, you can achieve rebranding success.

Want to be sure you don’t miss anything during your migration? We have a FREE downloadable checklist, which you can find here. This checklist includes everything you need to do from an SEO perspective before, during and after a web migration. We’ve also included information on what each step involves and why.

Interested in outsourcing your migration? We offer assistance with the whole process as part of our SEO campaign package. Contact us today to speak to one of our experts about our award-winning SEO services.

Ghost Jobs: The Dark Side of Branded Search (The SEO Works, Wed, 23 Apr 2025)
https://www.seoworks.co.uk/ghost-jobs-and-branded-search/

The post Ghost Jobs: The Dark Side of Branded Search appeared first on The SEO Works.


When brands post ghost jobs for SEO gains, who you gonna call?

In March 2025, a high-profile industry newsletter went out to thousands of SEOs and digital marketers with what we think is a fairly controversial tip. It concerns branded search, something of a hot topic after a few insightful Google leaks pointed to its importance in the search rankings.

We won’t call out the newsletter because we’re not about naming and shaming (and it’s actually one of our favourite publications, usually). But it made a couple of rogue suggestions about how to level up your branded search traffic, namely, advertising fake or ‘ghost’ jobs in order to get jobseekers to Google you. Here’s the tip they sent out:

“Run job ads (even if you’re not actively hiring) to generate brand-name searches from jobseekers. Don’t include the link to your company website in the job ad – make candidates Google you!”

Shudder.

In this article, we’ll cover what ghost jobs actually are, why anyone would want to post one, and provide what we think are some less sinister ways to improve your branded search performance.

What is a Ghost Job?

We know that sometimes, companies will have an open call for CVs on their website, or keep an old vacancy open if the right applicant hasn’t been found, but this is something else altogether. 

Ghost jobs are adverts for jobs that literally don’t exist. The company isn’t hiring – and doesn’t plan to – but for any number of reasons decides to post a vacancy online. They’re usually found on job boards like LinkedIn Jobs or Indeed, lurking in the shadows.

Who is Posting Ghost Jobs? And Why?

A survey of 1,641 hiring managers last year revealed that a shocking 40% of companies admit to posting ghost jobs, with 13% of those companies posting 75 or more fake roles in the space of a year.

And if that makes your blood run cold, then you might want to be sitting down for this.

The Chilling Reasons Companies Post Fake Jobs:

  • To create the illusion of growth
  • To mislead employees into believing their workload will be alleviated
  • To make employees feel replaceable (yikes)
  • To collect resumes or data

According to the survey, those are the main reasons companies have historically posted ghost jobs. But jobseekers are now contending with a new shady school of thought: that there are SEO gains to be had from advertising ghost jobs.

The name of the game? Branded search.

Branded search refers to when users are searching specifically for your brand name, product or service, or when their search queries include your brand.

“Royal Mail tracking”

“IKEA roman blinds”

“Papa Johns vs Dominos”

Branded search will signal strong credibility to search engines, and if the Google leaks are to be believed, it’s a ranking factor with some weight behind it.

Search engines know that there’s strong intent and trust behind a branded search, which go hand-in-hand with high clicks and low bounces.

In a nutshell, if users are searching for your keywords and your brand, it’s a ringing endorsement when search engines are deciding who to prioritise in the SERPs, and what to show in related searches.
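If you want to measure this from your own data, one rough approach is to split a Search Console query export into branded and non-branded buckets by matching common brand variants. The brand names and queries here are hypothetical:

```python
# Hypothetical brand variants; in practice include misspellings too
BRAND_VARIANTS = ("acme", "acme widgets", "acmewidgets")

def is_branded(query):
    """Treat a query as branded if it contains any known brand variant."""
    q = query.lower()
    return any(variant in q for variant in BRAND_VARIANTS)

queries = ["acme widgets reviews", "best garden widgets", "AcmeWidgets login"]
branded = [q for q in queries if is_branded(q)]
```

Tracking the branded share of queries over time gives a simple baseline for whether brand-building efforts are moving the needle.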

“When many users are searching for a specific term, such as “3M chemical goggles,” instead of just “chemical goggles,” Google starts to show that particular term in its suggestions over time. If branded keyphrases become more popular, they move higher in the suggestions list.”

Greg Heilers, Moz

While we accept that ‘black hat SEO’ (unethical/manipulative SEO tactics) can get quick results that might impress clients initially, gaming the system is rarely future-proof.

Google will get wise to any loophole eventually, and a history of hacky, spammy practices is likely to do more harm than good to your brand. Reputation takes years to build, and seconds to destroy.

When it comes to ghost jobs, any shortsighted wins (if they even work) will have been at the expense of real people going through tough times. So there’s a significant human cost attached.

We spoke to Paul, a father of two and a content marketer with a decade of experience who’s been looking for a job for about five months.

“Real job applications and interviews are hard enough to get genuine responses and feedback from. To know some brands are willingly wasting applicants’ time, people who are often desperate to find work to provide for their families in a highly competitive job market, is frustrating but ultimately unsurprising.”

Hunting through poorly targeted job ads for hours on end, paying to access paywalled opportunities, jumping through endless hoops, and being rejected or ghosted by roles you’re overqualified for, all take a huge mental toll on beleaguered jobseekers. 

Meanwhile, household outgoings continue to soar to record levels. Just last week, the charity Citizens Advice said the finances of millions in the UK were “stretched beyond breaking point” even before April’s rise in energy and council tax bills.

It’s a jungle out there, and you only need to spend five minutes on the subreddit r/UKJobs to understand how cruel it can be. At the time of writing, these are some of the top posts of the day:

  • ‘I’ve applied for over 1,500 IT jobs in the last 6 months in the UK with zero progress’
  • ‘A 3 day test for an entry-level social media role. I give up.’
  • ‘Finally got a job after almost 3 years of active searching’
  • ‘Today I applied for my 400th job in 2 years’
  • ‘Got asked to do an AI interview…’

Hannah Ellis, Founder of Sheffield Young Professionals, spends a lot of time talking careers with the region’s most promising talent, and had this to say about ghost jobs:

“Applying for jobs is a time-consuming and mentally taxing task at the best of times. And if you’re out of work when applying for roles, the pressure is tenfold.”

“Spending time filling out applications and proceeding to hear nothing back can be soul destroying – these fake listings are exploiting people’s ambition and self-esteem purely for their own gain.”

We’re inclined to agree.

Posting ghost jobs is unethical, and unethical SEO practices have no place in your marketing strategy.

So, with the exorcism complete, let’s talk instead about what you should do.

5 Better Ways to Improve Branded Search Volume

Let’s explore some practical steps you can take to improve your branded search metrics that won’t leave exhausted job hunters crying into their cornflakes.

1) Say More, Help More, Give Stuff Away

You do stuff, so you probably know stuff. Just like writing these blog posts, not only is it something we enjoy doing at The SEO Works, but it’s something we’re good at, because we solve SEO challenges for our clients all day long. If you’ve got some experience, you can help people, and helpful content will drive interest and get people Googling you.

Take Moz for example, quoted earlier in this article. Moz made its name in the SEO space by creating and sharing a ton of high-quality educational content for free, engaging with their target audience.

When I first started out in SEO, I soaked up everything I could from the Moz Academy, and up until recently (when it became less free), I’d send folks to their site to learn more about SEO.

Google’s own guides recommend “creating helpful, reliable, people-first content” if you want to impress their automated ranking systems. To perform well in search, we should all be posting content that demonstrates “first-hand expertise and a depth of knowledge”. It’s all right there in the user manual, people!

And what’s that old customer service adage, “serve before you sell”? 

You could write blogs, appear on industry podcasts, build cool little tools like Whitespark’s Freshness Calculator or our handy Title Generator, and give them away for free. Or put a bunch of useful resources up on your site like our webinars, whitepapers and helpsheets.

If you really have a lot to say, you could even write a book. Gil Gildner from Discosloth (a digital agency who publish marketing titles like The Beginner’s Guide to Google Ads) writes,

“Books that we wrote back in 2017 are still paying dividends. In the past couple of years, we have gotten to the point where we have so many brand searches & inbound inquiries that we essentially don’t have to do sales anymore.”

Lucky for you, Gil.

2) Engage Your Audience Where They Hang Out

You probably have a good idea who your target audience are, but we suggest creating some customer personas (using data to build profiles of your ideal customers) and starting to engage with them in the spaces where they spend time, online and off. You could:

  • Sponsor their favourite podcast or industry event
  • Contribute thoughtfully to popular threads on social media
  • Run ads on their favourite YouTube channels
  • Run display ads on relevant online communities
  • Be super active and helpful in their favourite subreddits
  • Pull back the curtain with a practical, behind-the-scenes ‘how I did this’ type case study that helps them achieve a common goal

3) Leverage Digital PR

To encourage organic brand mentions, you can utilise Digital PR tactics by offering insights or data that authoritative publications will want to reference or quote.

Mikaila Storey heads up our Digital PR Team, and has this to say:

“The more visible your brand is, the more familiar people will become with it, searching for your company to get additional info or purchase products.”

Mikaila suggests having a few Digital PR tactics up your sleeve that can generate brand mentions to boost visibility:

  • Sharing company news and case studies
  • Providing expert commentary and insights on relevant topics 
  • Creating and sharing data-led campaigns with unique data
  • Reacting to trending relevant news stories with different angles

Increasing brand awareness and mentions can be tricky, especially in very niche or competitive industries, but trialling different approaches and tactics can pay off and help to increase branded searches.

4) Reviews: When Stars Align

Both as marketers and consumers, we all recognise the value and importance of customer reviews. These days, savvy shoppers will append ‘reddit’, ‘trustpilot’ or even ‘legit?’ to the end of a branded search before even entertaining the idea of engaging with you. This is all about credibility and trust.

Now, we’re not saying that having great online reviews is going to directly send you huge droves of branded search traffic, but there are less obvious ways that it can help.

Genuine product reviews often appear on sites like Yelp, G2, Capterra, Quora, and Reddit. When people see these reviews but want more info, they take to Googling you. I’ve personally taken screenshots of Reddit comments recommending a product, earmarking it to look up later.

Here’s one I found on my phone. I later Googled the brand and bought their earplugs after seeing several people vouch for them online.

Reddit thread showing a discussion and recommendation about earplugs

This can be really powerful for smaller brands, where having a bit of a cult following amongst your core audience can help you compete with the big incumbents.

If your brand is well-reviewed, you also stand a better chance of being featured in rich results. This is a bit like an unofficial stamp of approval from Google, which could lead to people searching for your brand later on when they’re further down the funnel.

So whatever you can do to encourage your customers to review you online (and not just on the big review sites) will contribute to your branded search performance, as well as your conversions.

5) Post Real Jobs!

Here’s the big plot twist. We do think posting job vacancies increases branded search volume, and we do recommend you do it. When you’re actually hiring.

There’s no harm in posting your genuine job ads across multiple job boards and all over social media; in fact, it’s a good idea. You’re casting a wider net and, in theory, stand a better chance of reaching the right candidates.

Posting about a vacancy on social media means you can also attract passive candidates – those who aren’t actively looking for a new role, but might be open to one.

So there you have it – what not to do, and what to do instead. Take what you read in newsletters with a pinch of salt (and a healthy glug of integrity), and trust your gut. If it feels shady, it probably is.

To end on a quote (shoutout to GCSE English):

“Don’t be evil.” – Google, before adopting a more scalable approach to morality.

– Update: Since being interviewed for this article, Paul has landed a marketing role with a well-known travel brand. Nice one, Paul! 💪

What is Crawlability and Indexability in SEO? How to Get Your Pages Seen (The SEO Works, Tue, 25 Mar 2025)
https://www.seoworks.co.uk/crawlability-and-indexability/

The post What is Crawlability and Indexability in SEO? How to Get Your Pages Seen appeared first on The SEO Works.


You’ve done the hard part: you’ve poured your heart and soul into creating a stunning website that aligns with your core offerings and values. The UX is fully functional, the content is high-quality, and you’ve spent countless hours perfecting every little detail.

But how do you ensure that this content gets seen by organic users? To not let your hard work go unnoticed, the key is to create a website that is optimised for crawlability and indexability. 

Here, we’ll break down exactly what crawlability and indexability mean, why they’re crucial for your SEO success, and how you can make sure your pages are getting the spotlight they deserve on Google.

What is Crawlability?

In short, crawlability refers to how easy it is for search engine bots, such as Googlebot, to access and navigate the pages of your website. These bots, often called ‘spiders’, follow links on your site to discover and analyse its content. 

These bots fetch information from billions of web pages, discovering new URLs either through internal links or via your submitted sitemap. If your site is crawlable, it means these bots can explore your pages and understand their structure in an efficient manner.
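That discovery process can be modelled as a simple breadth-first traversal of internal links. This toy mini-site (entirely hypothetical) shows why orphan pages, with no internal links pointing to them, go undiscovered:

```python
from collections import deque

# Toy internal-link graph: page -> pages it links to
site_links = {
    "/": ["/services", "/blog"],
    "/services": ["/contact"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": [],
    "/contact": [],
    "/orphan-page": [],          # no internal links point here
}

def discover(start, links):
    """Breadth-first crawl: follow links from the start page outwards."""
    found, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in found:
                found.add(target)
                queue.append(target)
    return found

crawled = discover("/", site_links)
orphans = set(site_links) - crawled
```

A real crawler also relies on your sitemap, which is one reason orphaned but sitemap-listed pages can still be found, albeit less reliably.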

If your site can’t be crawled, search engines do not have access to the information they need to evaluate it and its content. Consequently, they have no reason to rank your site on their search results pages.  

This means that ensuring your site is crawlable is vital for your chances of gaining visibility in search.

However, having a fully functional site is only part of the equation. In order to ensure that all of your valuable pages are crawled (not just some), you need to make sure any potential barriers to crawlability, like broken links, blocked resources and poor site architecture, are removed. 

While often confused, crawlability is a separate concept from indexability. 

What is Indexability?

Indexability is the process that allows a search engine to analyse, process, and store a webpage in its database. This allows the page to appear in the search results.

Once a page is crawled by search engine bots, it must meet certain criteria to be indexed and made available to users searching for relevant terms.

Having an indexable site means search engines can not only access your content but also include it in their search index, making it visible to potential visitors. 

However, like with crawlability, there are some potential blockers to indexability to watch out for. For instance, in most cases, if pages are blocked by settings like ‘noindex’ meta tags or HTTP headers, Google will not index them, even if they were crawled. Therefore, ensuring your pages are indexable is just as crucial as making them crawlable.
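As a rough illustration of those blockers, here's a small sketch that checks a page for a robots meta tag or an X-Robots-Tag response header. The sample HTML and headers are illustrative, and a real audit should parse HTML properly rather than rely on a regular expression:

```python
import re

def is_indexable(html, headers):
    """Return False if the page carries a noindex directive in either
    an X-Robots-Tag header or a robots meta tag."""
    header_val = headers.get("X-Robots-Tag", "").lower()
    if "noindex" in header_val:
        return False
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return not (meta and "noindex" in meta.group(1).lower())

page_with_meta = is_indexable('<meta name="robots" content="noindex, follow">', {})
plain_page = is_indexable("<html><body>Hello</body></html>", {})
header_blocked_page = is_indexable("<html></html>", {"X-Robots-Tag": "noindex"})
```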

5 Common Crawling and Indexability Issues (& How to Fix Them)

Various issues can hinder search engines from fully accessing and ranking your content. With this in mind, here are five commonly found crawling and indexability problems and how to fix them.

1) Thin content

Issue:

Search engines may not rank pages with very little original, or ‘thin’, content. This may include pages with just a few sentences, or pages that primarily consist of product descriptions copied from manufacturers.

Solution:

Boost your content with original research, expert opinions, or high-quality images and videos. This will not only help to increase engagement rate, but it will also make the content appear more authoritative. 

There’s also a chance that you may have a number of pages that focus on a similar topic. If you have multiple thin pages that have some overlap, consider consolidating them into a more comprehensive one.

Also, ensure that thin pages offer a valuable user experience even if they don’t have a lot of text.

2) Mobile-first indexing

Issue:

Google primarily uses the mobile version of your site for indexing and ranking. If your mobile site has speed or usability issues, it can negatively impact your search rankings.

Solution:

Ensure your website uses responsive design to adapt seamlessly to different screen sizes. In particular, make an attempt to boost mobile speed by minimising page load times, optimising images, reducing HTTP requests, and enabling browser caching.

3) Duplicate content

Issue:

In a similar vein, search engines may view your content unfavourably if they find significant amounts of duplicate content. This could be either on your own site or on another site elsewhere on the web. 

If it’s between two pages on your own site, this can confuse search engines about which version of the page is the original and authoritative one.

Solution:

There are several potential solutions for duplicate content. Depending on your situation, you may want to employ:

  • Canonical tags: Use the rel=”canonical” tag to specify the preferred URL for a page if it has multiple versions. This may involve different URLs for mobile and desktop.
  • URL parameters: If you use URL parameters for tracking or filtering, canonicalise the parameterised versions to the clean URL so they aren’t indexed as duplicates. 
  • Content consolidation: If you have very similar content on different pages, consider consolidating it into a single page.
  • Noindex directives: Use the ‘noindex’ meta tag to keep duplicate pages out of the index. Bear in mind that robots.txt blocks crawling rather than indexing – a page blocked in robots.txt can’t be crawled, so search engines will never see its ‘noindex’ tag.
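As an illustration of the first bullet, a canonical tag sits in the page’s `<head>` and points every duplicate or parameterised version at the preferred URL (the domain here reuses one of the example shops from later in this guide):

```html
<head>
  <!-- On each duplicate or parameterised version of the page,
       point search engines at the preferred URL -->
  <link rel="canonical" href="https://www.bestonlineshop.com/electronics/wireless-earphones">
</head>
```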

4) Crawl errors

Issue:

For a variety of reasons, search engine bots may encounter problems while trying to access and crawl your website. This could be due to HTTP errors, such as 404 Not Found or 500 Internal Server Error, as well as incorrect robots.txt directives or slow server response times.

Solution:

To diagnose and address crawl errors on your site, you can use tools like Google Search Console and website crawlers like Sitebulb or Screaming Frog:

  • Google Search Console provides valuable insights into crawl errors, such as specific URLs affected, indexing issues, and coverage reports. It also offers tailored advice on how to resolve these issues.
  • Website crawlers like Sitebulb or Screaming Frog enable real-time troubleshooting by discovering and fixing broken links and missing resources, and analysing your robots.txt file to ensure it isn’t unintentionally blocking important pages.

You can also review your server logs to identify and fix broken links. 

Meanwhile, to further reduce the risk of crawl errors, you should explore improving server response times by optimising large resources, using a content delivery network (CDN), and minimising HTTP requests.

5) JavaScript rendering issues

Issue:

Search engines like Google can render and understand JavaScript, but this happens as a secondary step in the indexing process. For JavaScript-heavy websites, especially large ones, challenges can arise if JavaScript is slow to load or inefficient. 

In such cases, Google may exhaust its crawl budget before fully rendering and indexing dynamically loaded content.

Solution:

If your site relies heavily on JavaScript, you can tackle rendering issues in some of the following ways: 

  • Server-Side Rendering (SSR): Render JavaScript content on the server and provide a fully rendered HTML version to search engines.
  • Use a JavaScript rendering service: Services like Prerender.io can render JavaScript content on your behalf so search engines can index dynamic content.
  • Improve JavaScript performance: Optimise your JavaScript code to ensure it loads quickly and efficiently.
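Alongside these options, ‘dynamic rendering’ is a related workaround: serving pre-rendered HTML to known crawlers while regular visitors get the JavaScript app. Google now points to server-side rendering as the longer-term solution, but the core routing decision can be sketched like this (the bot list and return values are illustrative assumptions, not a complete implementation):

```python
# A minimal sketch of the routing decision behind "dynamic rendering":
# known crawlers receive pre-rendered HTML while regular visitors get the
# JavaScript app shell.

BOT_SUBSTRINGS = ("googlebot", "bingbot", "duckduckbot", "yandexbot")

def is_search_bot(user_agent: str) -> bool:
    """Crude check: does the User-Agent string mention a known crawler?"""
    ua = user_agent.lower()
    return any(bot in ua for bot in BOT_SUBSTRINGS)

def choose_response(user_agent: str) -> str:
    """Decide which version of the page to serve."""
    return "prerendered-html" if is_search_bot(user_agent) else "js-app-shell"
```

In a real deployment this check would live in your web server or CDN configuration rather than application code.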

How to Know if Your Site has Been Crawled and Indexed

So you’ve taken all the necessary steps to make your website crawlable and indexable. But how can you know definitively if your pages are being crawled and indexed?

One of the best ways to do this is by using Google Search Console. By logging into Search Console and accessing the URL Inspection Tool, you can quickly check if Google has crawled specific pages on your site and when it last did so. 

This tool provides valuable insights into whether your pages are indexed and up-to-date with Google’s crawling activity. It also allows you to check when your XML sitemap was last read. 

It’s also important to remember that sometimes pages are ‘discovered but not yet crawled’, as shown in the ‘Discovered – currently not indexed’ section of Google Search Console. This means Google is aware of your page but hasn’t had the chance to crawl it yet.

This could happen if Google considers the page non-critical or if there are crawl budget issues, often due to the page being similar to other low-value pages or lacking strong signals like internal links. In such cases, you can request indexing to help ensure the page gets crawled and indexed.

Discovered - currently not indexed issues on Search Console

Likewise, the ‘Crawled – Currently Not Indexed’ status indicates that Googlebot has visited the page, but it hasn’t yet been added to the search index, possibly due to the various factors mentioned above.

Crawled - currently not indexed issues on Search Console

Alternatively, you can also analyse your website’s log files. These files capture every request made to your site, including those from Googlebot. By reviewing the logs, you can find specific timestamps of when Googlebot last visited your site. 

Bear in mind, though, that this process requires access to your website’s logs and might need the help of a hosting provider or technical team. 

Another quick method to check if your site has been crawled is by using the ‘site:’ command in Google’s search bar. Simply type “site:yourwebsite.com”, and if Google returns a list of indexed pages from your site, it means Google has crawled and indexed your content.

Although this method doesn’t show the exact date or time of the crawl, it’s an easy way to confirm that Google has at least discovered your pages.

An example of a ‘site’ command in Google’s search bar:

Example of site command on Google

The number of results seen under the ‘Tools’ dropdown should reflect the number of pages on your site that you expect to be indexed:

Example of collated results on Google listings

Together, these methods give you a good understanding of whether your site is being crawled by Google and how often.

Crawlability and Indexability in SEO: The Verdict

It almost goes without saying that crawlability and indexability are the foundation of any successful SEO strategy. Naturally, if search engines can’t discover, access, and index your content, your website’s visibility in search results will be significantly limited. 

By addressing common issues like duplicate content, JavaScript rendering, and crawl errors, and by consistently monitoring your site’s performance using tools like Google Search Console, you can pave the way for better rankings and more organic traffic.

Ready to take your website’s performance to the next level? Get in touch with our expert team at SEO Works to see how we can help!

FAQs

How often does Google crawl websites?

The frequency depends on factors like the site’s authority, update frequency, and overall quality. High-authority or frequently updated sites are crawled more often.

What is a crawl budget, and does it matter for small websites?

A crawl budget refers to the number of pages Googlebot crawls on your site within a specific timeframe. While it’s more critical for large sites with thousands of pages, smaller sites can still benefit by ensuring their most important pages are easily crawlable.

Does crawl budget matter for large sites, and how can it be managed?

For large websites with extensive content, crawl budget is important as Googlebot may not crawl every page if the site is too large or has limited resources. To manage crawl budget effectively, you can use robots.txt to block non-essential pages from being crawled. By doing this, you’ll be directing Googlebot to focus on the most important content. 

Ensuring your site has fast load speeds and bolstering horizontal and vertical internal links to key pages will also help to maximise the crawl budget.
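A hypothetical robots.txt illustrating the idea – the disallowed paths are examples only, so audit your own low-value URLs before blocking anything:

```txt
# Steer crawl budget away from low-value URLs (example paths)
User-agent: *
Disallow: /search/
Disallow: /cart/
Disallow: /*?sort=

Sitemap: https://www.example.com/sitemap.xml
```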

Can internal linking improve crawlability?

Yes, a well-structured internal linking strategy is crucial for helping search engine bots navigate your site efficiently. It ensures they can discover deeper pages that might not be linked externally. 

Additionally, XML sitemaps play an important role in guiding search engines to important content on your site by providing a clear list of all URLs that should be crawled and indexed.
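A minimal XML sitemap, following the sitemaps.org protocol, might look like this (the URLs and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- URLs and dates below are examples only -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/on-page-seo-guide</loc>
  </url>
</urlset>
```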

The post What is Crawlability and Indexability in SEO? How to Get Your Pages Seen appeared first on The SEO Works.

On-Page SEO Guide: How To Optimise A Web Page in 2025 https://www.seoworks.co.uk/on-page-seo-guide/ Tue, 25 Feb 2025 12:08:06 +0000 https://www.seoworks.co.uk/?p=17111

SEO and digital marketing are undergoing a major shift at the moment. With the rise of AI, frequent algorithm updates, and innovative new technologies massively influencing user behaviour, it can feel more challenging than ever to stay on top of the best practices for online success.

Thankfully, amidst the chaos, there’s one area of search marketing that has consistently remained the foundation of organic search performance: on-page SEO.

In this guide, we’re going to break down everything you need to master the art of on-page SEO in 2025; whether you’re a beginner just finding your feet or an experienced SEO veteran with the results to prove it.

Packed with all the fundamentals as well as the latest tips, tricks and strategies, this guide is the only on-page SEO resource you’ll need to dominate your competitors in the SERPs.

Presented in this clear, easy-to-follow on-page SEO guide, you’ll learn how to:

  • Craft high-value, impactful content that satisfies search intent.
  • Write keyword-optimised metadata to reach your target audience and drive clicks.
  • Structure your content optimally for both the user and search engines.
  • Place keywords correctly to boost rankings.
  • Leverage your experience and expertise within your content.

Let’s get started!

On-Page SEO: Key Takeaways

  • Google prioritises content that is original, helpful, and satisfies search intent. To excel, your content should be people-first, authoritative, and well-structured for an optimal user experience.
  • Optimising meta titles is one of the most impactful on-page SEO tactics you can use. Make sure they are keyword targeted and a suitable length (50-60 characters).
  • Structure your website using descriptive URLs organised into subfolders. This improves clarity for both users and search engines, helping them understand your content and its place within the site hierarchy.
  • As a rule, target keyword mentions should be strategically placed within the H1, the introductory section of your content, and within supplementary headings (H2s, H3s, H4s…).

  • Page speed and Core Web Vitals reports aren’t everything. If your site feels fast, is responsive, easy to navigate, and free of content shift issues, it’s better to focus your efforts elsewhere – even if there’s still room for improvement in your PageSpeed Insights report.

What is On-Page SEO?

Definition of on-page SEO

On-page SEO is the process of optimising ‘on-site’ elements of your website for both search engines and users.

By focussing on key ranking signals, the practice of on-page SEO makes it easier for Google to recognise how great your website is at helping users reach their end goal. This goal could be to make a purchase, learn about a new topic, or to simply scroll through an endless archive of cute cat photos.

Ultimately, webpages that help users achieve their goals are what search engines prioritise when assessing content quality.

The more straightforward it is for search engines to recognise how relevant and useful your content is for users based on their search query, the better your pages will rank and the more traffic they will drive. 

We achieve this by carrying out on-page SEO.

On-page SEO elements include:

  • Meeting search intent
  • Content quality and relevance
  • Meta titles and descriptions
  • Header tags
  • URL structure
  • Internal/external links

And much more.

Why is On-Page SEO Important?

By considering on-page SEO factors, you can create website content that is compliant and follows Google’s guidelines to the letter, whilst also offering a great user experience.

This way, your pages meet the expectations of both search engines and the user, creating the perfect recipe for organic success.

On-Page SEO vs. Off-Page SEO

On-page SEO contrasts with off-page SEO, which focusses on ‘off-site’ factors such as backlinks and other influences from external sources, with the aim of improving keyword rankings. 

Both are integral to any comprehensive SEO strategy. However, on-page SEO is often the starting point given that it’s easier to control and generally delivers more immediate, tangible results.

11 Effective On-Page SEO Techniques and Best Practices for your Website in 2025

Now that we know what on-page SEO is and why it’s so important, we’re ready to jump into the good stuff.

Here are 11 of the most important on-page SEO techniques to remember and follow in your pursuit of position #1 rankings.

1) Helpful Content is Still King

If you’re in any way familiar with SEO, chances are you’ve heard the term ‘helpful content’ thrown around quite a lot over the past few years – and for good reason.

It might seem obvious, but every piece of content you create should focus on delivering genuine value to the person on the other side of the screen. The goal is to satisfy their search intent while guiding them toward achieving their end goal as quickly and as efficiently as possible.

This has been the focal point of a number of Google’s recent core algorithm updates, starting with the aptly named Helpful Content Update released in 2022.
The core idea is that all content should be helpful, reliable, and people-first, designed to serve the reader and not just to gain search engine rankings. Pages should also offer a positive user experience, taking navigation, page speed, and Core Web Vitals metrics into account.

How can you assess how helpful your content is?

Before looking at any of the more complex aspects of on-page SEO, you need to decide if your content is strong enough to rank in the first place. 

Here are a few questions you can ask yourself to judge if your content is likely to meet Google’s expectations on helpfulness:

Originality and Value
  • Does your content provide original insights, research, or substantial additional value beyond existing sources?
  • Does your content cover the topic thoroughly with insightful analysis and comprehensive detail?
Expertise, Trust, and Accuracy
  • Is your content written or reviewed by a knowledgeable expert, with clear factual accuracy sourcing where appropriate?
  • Does your content clearly demonstrate first-hand expertise and a depth of knowledge?
  • If someone researched your site, would they find it well-trusted or recognised as an authority on the topic discussed?
Quality
  • Is your content well-written, free of errors, and produced with care rather than mass-produced or automated?
User Experience and Satisfaction
  • Does your content leave the reader feeling informed and satisfied without needing to search further?
  • Would the reader want to bookmark, share, or recommend this content to others?
  • Is your content responsive and easy to read across all different browsers and devices?
Relevance and Purpose
  • Is your content aligned with the site’s audience and purpose, rather than created mainly for search rankings?
  • Are you writing about trending topics just for traffic rather than genuine audience value?
Transparency and Ethical Practices
  • Does your content avoid misleading tactics, such as clickbait headlines or artificial freshness updates?
  • Are your sources and authorship clearly disclosed, supporting your content’s credibility?

For more information on helpful content criteria and guidelines, check out Google’s documentation on helpful, people-first content.

2) Keyword-Optimised Metadata is the META

An example of a keyword-optimised meta title served in search results.

Optimised metadata is one of the most fundamental elements of any successful SEO strategy.

Think of metadata as the ‘shop window’ to your site. It’s often the user’s first impression of your page, its content, and sometimes even your brand as a whole. 

Your metadata can make or break whether a user decides to click-through to your site or not, so it’s vital to write it manually with your target audience in mind.

Metadata isn’t only important for users and boosting click-through rates (CTR), it’s also one of the first on-page elements search engines look at to understand and rank your page. This makes target keyword mentions super important.

Meta Titles

Meta titles are a direct ranking factor, meaning search engines use them to determine where your page should rank. Therefore, the primary keyword for your page should be included within the meta title to help improve organic visibility for that term.

For the best results, meta titles should be between 50-60 characters in length. This length maximises your piece of SERP real estate, whilst also avoiding truncation which can make titles appear cut off and less engaging.

Meta Descriptions

Meta descriptions are not a direct ranking factor so they don’t impact page ranking. However, they give readers a preview into what your page is about and what you can offer them. Make sure they’re still high-quality!

Meta descriptions should be between 155-160 characters in length, again to make the most of the space available whilst avoiding any truncation.
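A minimal sketch of a length check based on the guideline ranges above (50-60 characters for titles, 155-160 for descriptions). Note that Google actually truncates by pixel width rather than character count, so treat this as a rough guide only:

```python
# Classify a meta title or description against a recommended length range.
TITLE_RANGE = (50, 60)
DESCRIPTION_RANGE = (155, 160)

def check_length(text: str, lo: int, hi: int) -> str:
    """Return a human-readable verdict on a meta tag's length."""
    n = len(text)
    if n < lo:
        return f"too short ({n} chars; aim for {lo}-{hi})"
    if n > hi:
        return f"risk of truncation ({n} chars; aim for {lo}-{hi})"
    return f"ok ({n} chars)"

# This guide's own title sits comfortably within the range:
print(check_length("On-Page SEO Guide: How To Optimise A Web Page in 2025", *TITLE_RANGE))
```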

Using Google SERP preview tools can help with writing meta titles and descriptions.

To summarise, meta titles and descriptions should be written to entice the user to click-through to your page by clearly conveying value. To improve rankings, target keywords should be integrated into meta titles to tell search engines what your content is about. 
For more detailed guidance and tips for optimising meta tags for SEO, read our guide on the topic.

3) Clean, Descriptive and User-Friendly URLs

Two examples of both descriptive and non-descriptive URLs.

Google makes a point of emphasising the importance of using descriptive URLs on your site. But, what is exactly meant by the term ‘descriptive’?

What is a Descriptive URL?

A descriptive URL is a link that provides users with the proper information and context they need to understand where the link will take them on the website.

This means they will be free of random characters and identifiers that have no meaning to a user.

Descriptive URL Examples

“https://www.bestonlineshop.com/electronics/wireless-earphones”

“https://www.jbsgolfemporium.com/blog/expert-golf-putting-guide”

Non-descriptive URL Examples

“https://www.bestonlineshop.com/electronics/?p=98765&ref=cat1”

“https://www.jbsgolfemporium.com/blog/category1/page5678”

As a rule, your website’s pages should be logically categorised into subfolders, with URLs that create a hierarchical structure.

This signals to users and search engines how your pages are related to each other and the rest of the site, making it easier for Google to match your content with related searches and for users to navigate your pages.
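A rough heuristic for flagging non-descriptive URLs like the examples above. Real audits need human judgement; this sketch only catches two common red flags – query-string identifiers and long runs of digits in the path:

```python
import re

def looks_descriptive(url: str) -> bool:
    """Return False if the URL shows obvious signs of being non-descriptive."""
    if "?" in url or "=" in url:
        return False  # parameterised identifiers like ?p=98765&ref=cat1
    path = url.split("://", 1)[-1]
    if re.search(r"\d{3,}", path):
        return False  # opaque numeric ids like /page5678
    return True

print(looks_descriptive("https://www.bestonlineshop.com/electronics/wireless-earphones"))
```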

4) Strategic Keyword Placement Within Content

So, you’ve chosen target keywords, now what? How do you make sure you’ve mentioned them enough in your content?

There are 3 main areas within your content where you should place target keywords for maximum impact. These are:

  1. The H1
  2. The introduction (usually your first paragraph)
  3. Supplementary headers (H2s, H3s, H4s etc.) and copy

By doing this, you’re clearly signalling to Google what your content is about while making it easy for users to scan and decide if it matches their search intent. 

But that’s not all.

How does Google Assess Keyword Density Within Content?

Okay, things are about to get a little nerdy.

Search engines don’t read your content like humans do. In fact, they can’t read at all. They’re robots!

Instead, Google uses what is known as the TF*IDF algorithm as part of its wider ranking systems. TF*IDF sounds complicated and the technical details aren’t super important here, but think of it like this…

Google is trying to understand what your page is really about by looking at the words you’ve used and how often you’ve used them compared to the rest of the internet. It does this using maths, not language.

TF (Term Frequency)

As the phrase suggests, this part of the formula checks how many times you’ve used a certain keyword or phrase.

As an example, imagine you’re at a party and someone says the word “pizza” 20 times in a single conversation. You’d probably assume that pizza is pretty important to whatever they’re talking about, right? 

That’s essentially what Google does with TF – it checks how many times each word appears on your page to understand what’s being discussed.

IDF (Inverse Document Frequency)

The IDF side of the formula is there to calculate if the keyword Google has recognised is special, or whether it’s just like any other word.

Going back to the previous example, if everyone else at the party is also saying “pizza” 20 times in their conversations, it’s not as unique or special anymore – it’s just a common topic of the night.

IDF helps Google figure out if a word is rare and meaningful, or just something everyone else is using, like “the”, “and”, or any other filler word.

So, what does this mean?

To put TF*IDF together, Google combines these two ideas to find words that:

  • Show up a lot on your page (TF).
  • Aren’t super common compared to the rest of the internet (IDF).

So, if your page talks a lot about “sourdough pizza crust” and that phrase isn’t everywhere else, Google can assume that your page is likely about sourdough pizza crust!
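To make that intuition concrete, here’s a toy TF*IDF calculation over three tiny “pages”. This is the classic textbook formula, not Google’s actual implementation, and it assumes the term appears in at least one document:

```python
import math

docs = [
    "sourdough pizza crust recipe with sourdough starter",
    "pizza delivery near me",
    "best pizza toppings for a party",
]

def tf_idf(term: str, doc: str, corpus: list[str]) -> float:
    words = doc.split()
    tf = words.count(term) / len(words)                       # term frequency
    containing = sum(1 for d in corpus if term in d.split())  # docs containing the term
    idf = math.log(len(corpus) / containing)                  # rarer term -> higher weight
    return tf * idf

# "pizza" appears in every document, so its IDF (and score) is zero,
# while the rarer "sourdough" scores highly on the first page.
print(tf_idf("pizza", docs[0], docs), tf_idf("sourdough", docs[0], docs))
```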

How do you Optimise for Keyword Density?

If there’s a formula for working out keyword density, then there must be a way to get the perfect ratio of keyword mentions across your content, right?

Well, maybe. SEO content writing tools like Surfer SEO use the TF*IDF algorithm and Natural Language Processing (NLP) principles when assessing how ‘optimised’ a piece of content is within their software. 

A lot of content writers believe in using tools like this to ensure their content is as keyword-optimised as possible for SEO rankings because the software calculates how many times other well-ranked pieces of content use certain key phrases.


Don’t knock it until you’ve tried it – tools like this might work well for you, especially in competitive niches where any advantage you can get is valuable. However, when it comes to on-page SEO, it’s never essential to make everything ‘go green’ within a content optimisation tool. If anything, this can encourage unnatural keyword stuffing.

In reality, TF*IDF is just a small part of Google’s algorithm for deciding page rankings.

Whilst it’s cool to know about TF*IDF and pull back the curtain, it’s best to go back to the golden rule: prioritise that your content is people-first, offers value to the reader and satisfies their search intent.

Focus on placing targeted keyword mentions naturally throughout your content e.g. your H1, introduction, as well as other supplementary headings/copy and you’ll be just fine.

5) Hierarchical Heading Structures

An example of a hierarchical, optimised heading structure.

A well-optimised heading structure can transform reading a piece of content from a long and tedious slog into a smooth and engaging experience. 

The majority of web content nowadays is written to be easily scanned by the reader. This way, the content is accessible to a wide range of audiences. 

Headings (H1, H2, H3, etc.) allow you to split your content up into digestible sections, helping the user to find the information they need in as little effort as possible.

Headings also give search engines context and clues into what your content is about. Google uses this information to decide whether your page should be ranked for relevant search queries.

Heading Hierarchy tips for On-Page SEO

Your heading structure should have a clear hierarchy to it, representing content importance and flow. Here’s how it works:

  • H1: Serves as your page title and should summarise what your content is about. The H1 is one of the most important on-page ranking factors, should only be used once per page, and must include your target keyword.
  • H2s: Used to break your content up into key sections to improve readability and provide additional content to search engines. These headers should contain primary and secondary keyword mentions to build relevance with the key topic.
  • H3s: Used for breaking sections up into subtopics to go into more detail.
  • H4s (and deeper headings): Used for splitting subtopics up into sub-subtopics.

It’s also good practice to use headings to set up jumplinks, especially with long-form content. Jumplinks help users to easily ‘leap’ to specific sections of your content and get to their end goal faster. 

Jumplinks are often included as built-in features of blog post templates, like a ‘Table of Contents’. However, in some cases, you’ll need to create them manually by adding ID attributes to each heading in the HTML and setting up anchor links.
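Putting the two ideas together, a simplified heading hierarchy with a manual jumplink might look like this (the ids, headings, and text are illustrative):

```html
<h1>Expert Golf Putting Guide</h1>

<!-- A manual "Table of Contents" jumplink pointing at a heading's id -->
<nav>
  <a href="#grip">Jump to: Choosing your grip</a>
</nav>

<h2 id="grip">Choosing Your Grip</h2>
<h3>Interlocking vs Overlapping Grips</h3>
```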

6) Internal Linking

Internal links are hyperlinks on your website that point from one page to another. 

Effective internal linking is essential for helping users to navigate through your pages, as well as ensuring search engine bots can easily crawl and discover your content.

Not only this, internal links are also needed to pass link equity from the most important, high authority pages throughout your site structure to deeper pages, making your entire site appear stronger in the eyes of search engines.

  • Acting as signposts, they help users to navigate your website. This could be via the menu, buttons, or in-text anchor links.
  • Similarly, they also help search engine crawlers to move throughout your site, find content and assess it for rankings. Good internal linking improves crawlability and indexability.
  • Descriptive anchor text provides context to search engines about what the linked page is about and what keyword it should rank for.
  • Internally linking from one page to another signals to Google that these pages are related and the linked page is valuable. This is integral to the process of building SEO topic clusters.
  • Your homepage and top-performing pages often hold the most page rank. By internally linking from these high-authority pages to deeper content, you share that authority, helping those pages to rank better too.

For more information on internal linking for SEO, check out our helpful guide on the topic.

7) External Linking

External links, also known as outbound links, are hyperlinks on your website that point to pages on a different site. 

External linking can sometimes seem counterintuitive to SEO efforts. After all, why would you want to divert traffic away from your site?

Well, strategic external linking is actually good for your SEO efforts, particularly when it comes to informational content. When done right, external links add credibility, provide deeper context, and improve how search engines view your content. Yet, they’re often underutilised when it comes to on-page SEO. Here’s why they deserve more attention:

  • Linking to authoritative, trustworthy sources helps back up your claims, making your content more informative and reliable. This is especially important if you operate within a Your Money or Your Life (YMYL) niche.
  • External links can provide additional resources for users looking for more in-depth information. When used thoughtfully, they create a richer, more valuable experience for your audience.
  • Quality external linking is a hallmark of well-researched content. Google values pages that offer useful resources, especially for in-depth articles where users expect references and further reading.

When it comes to implementing external links, prioritise well-established, credible websites over lower-quality sources. Anchor text should clearly describe what users can expect from the linked page to offer the best user experience. 

It’s also a good idea to set external links to open in a new tab – this way if users do decide to check out the linked content, your site remains open on their device and they don’t forget about you!

8) Flex your Expertise & Experience

If you have a website that sells products, offers services, and produces content within a particular niche or industry, search engines bank on you being an expert within your space to deliver the best possible experience to their users.

Google’s algorithm is designed to rank pages that demonstrate a high level of knowledge and competence. This keeps search results up to scratch, offering accurate and reliable content that users need to maintain trust in the platform.

Optimising for “E-E-A-T”, or Experience, Expertise, Authoritativeness, and Trustworthiness, is how we work to meet these expectations. Google’s quality rater guidelines outline criteria for assessing informational quality – by following these guidelines you can position your site as a trusted resource. Here are some tips:

On-Page SEO Tactics for Strengthening E-E-A-T:

  • Use Trust Signals: Do you or your team members have a bunch of certifications and qualifications within the industry or field you’re in? Display them proudly across your key pages! The same goes for customer reviews of your products/services.
  • Write a Detailed ‘About Us’ Page: Make it easy for users to learn more about you and your business. An ‘About Us’ page is the perfect place for you to tell your story and build trust with potential customers.
  • Link to Useful Sources of Information: If you’re writing detailed informational guides about in-depth topics, externally linking to sources and other relevant sites can help to show credibility, authority and trustworthiness.
  • Use Expert Quotes: Another great way to strengthen informational content is to directly quote experts within your niche – it’s even better if these are quotes from primary sources within your team. Be sure to outline your expert’s experience within the industry and list any relevant qualifications they hold.

  • Demonstrate First-Hand Experience: Don’t just tell users about a topic, show them you know what you’re talking about. Include images, videos, screenshots and other forms of media that prove your expertise and experience to users and search engines.

E-E-A-T Resources:

If you’re looking to learn more about E-E-A-T and how to strengthen your website in this area, check out the following resources we’ve put together to help you:

9) Don’t Forget to Optimise Images

Images can help to improve the search visibility of your site in a few ways, from ranking in Google Images to appearing in search features like image carousels, featured snippets and more.

To make the most of your images, it’s a good idea to follow a few on-page SEO tactics to help search engines understand your images in the context of your content, meaning they work a little harder for the performance of your site.

Alt Text

Alt text, or alternative text, is a short description of an image for both users and search engines that describes exactly what the image is. It’s especially important for accessibility, allowing visually impaired users to understand images through screen readers.

Best Practices for Alt Text
  • Be as descriptive as possible but don’t waffle on! Be nice and concise. For example, “A large pepperoni pizza with a stuffed crust” over “pizza”.

  • Include relevant keyword mentions wherever you can, naturally.
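To apply these practices at scale, a short script can flag images that are missing alt text. Here’s a minimal sketch using Python’s built-in HTML parser – the sample markup and file names are purely illustrative:

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects <img> tags that have no alt attribute, or an empty one."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # alt missing entirely, or empty
                self.missing_alt.append(attrs.get("src", "(no src)"))

# Illustrative markup: one image with descriptive alt text, one without
sample_html = """
<img src="pepperoni-pizza.webp" alt="A large pepperoni pizza with a stuffed crust">
<img src="decoration.webp">
"""

auditor = AltTextAuditor()
auditor.feed(sample_html)
print(auditor.missing_alt)  # images that still need alt text
```

In practice you would feed this the rendered HTML of each page from a crawl, rather than a hard-coded string.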

Image File Names

Search engines use image file names to better understand images, so remember to rename your files before uploading them.

Best Practices for Image File Names
  • Include a target keyword mention where you can.
  • Be concise and descriptive. For example, “pepperoni-pizza-stuffed-crust.webp”.
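Renaming can be automated as part of your upload process. A minimal sketch of turning a raw upload name into a concise, hyphenated, lowercase file name (the helper name and example input are illustrative):

```python
import re

def seo_image_filename(original_name: str, ext: str = "webp") -> str:
    """Turn a raw upload name into a concise, hyphenated, lowercase file name."""
    stem = original_name.rsplit(".", 1)[0]       # drop the old extension
    stem = re.sub(r"[^a-zA-Z0-9]+", "-", stem)   # non-alphanumerics -> hyphens
    stem = stem.strip("-").lower()               # tidy edges, lowercase
    return f"{stem}.{ext}"

print(seo_image_filename("Pepperoni Pizza (Stuffed Crust).JPG"))
# -> pepperoni-pizza-stuffed-crust.webp
```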

Next-Gen Formats & Image Compression

Out with ‘pngs’ and ‘jpgs’ and in with next-gen image formats! The most common are WebP and AVIF, which compress image sizes massively whilst retaining the same high quality (SVG, a vector format, is the ideal choice for logos and icons).

This makes next-gen formats ideal for use on websites as they help to improve page speed by requiring the browser to load less data in comparison to large traditional formats.

To convert images from traditional formats to next-gen formats you can use a wide selection of tools, ranging from online conversion platforms to CMS plugins and content delivery networks (CDNs). 

It’s always best to consult a web developer on the best way for your site to implement next-gen image formats – you never know what dev magic they’ll be able to conjure up for you.

10) Page Speed & Core Web Vitals

Optimising for page speed is a key component of on-page SEO and helps to guarantee a great user experience for visitors to your site, improving engagement and retention. Page speed is also a direct ranking factor, meaning it could make the difference when it comes to outranking the competition. 

Along with optimising images and serving them in next-gen formats, there are plenty of other ways you can speed up your page load times and improve responsiveness by optimising other on-page elements.

PageSpeed Insights is a free tool provided by Google which you can use to audit the speed and performance of each page on your site on both desktop and mobile devices.
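PageSpeed Insights also has a public API (v5), which is handy if you want to audit many pages programmatically. A minimal sketch of building a request URL in Python – fetching the response and supplying an API key (needed for higher quotas) are left out:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights API (v5) request URL for one page.
    Append a `key` parameter for authenticated, higher-quota use."""
    return f"{PSI_ENDPOINT}?{urlencode({'url': page_url, 'strategy': strategy})}"

print(psi_request_url("https://www.example.com/", "mobile"))
```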

On-Page SEO: Speed Considerations

  • Reduce unused/unnecessary code: PageSpeed Insights will flag any unused JavaScript and CSS that is bloating your page and increasing load times. Removing unnecessary code like this, especially if it is loaded via separate HTTP requests, is a great way to improve page load speed.
  • JavaScript/CSS Minification: Web developers like their code to look pretty and be easy to read, but sometimes this isn’t ideal for page speed. JS/CSS minification is the process of removing unnecessary characters and spaces from code to reduce file sizes.
  • Gzip compression: Gzip is a type of data compression that reduces file sizes before they are sent to the user, improving the speed of content delivery.
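To make minification and compression concrete, here’s a deliberately crude sketch in Python – real build tools (e.g. cssnano or esbuild) minify far more safely, and Gzip is normally applied by the web server, not by hand:

```python
import gzip
import re

def minify_css(css: str) -> str:
    """Crude CSS minification: strip comments, collapse whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove comments
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten around punctuation
    return css.strip()

css = """
/* hero banner */
.hero {
    color: #333;
    margin: 0 auto;
}
"""

minified = minify_css(css)
print(minified)  # .hero{color:#333;margin:0 auto;}

# Gzip then shrinks the minified bytes further before they cross the network
print(len(css), len(minified), len(gzip.compress(minified.encode())))
```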

Core Web Vitals

Core Web Vitals (CWVs) are user experience metrics used by Google to judge how well your site performs in terms of loading, interactivity and visual stability. 

Using PageSpeed Insights, you can assess your pages, and Google will provide feedback on whether you’ve passed or failed across each of these metrics, along with recommendations on how to improve.

Whilst CWV scores are ranking factors, Google has emphasised that they are not as important as others.

The general advice is to optimise for page speed and CWVs only if you notice your site is feeling slow or if there are clear issues that could harm the user experience – not just to simply improve PageSpeed Insights scores. 

Put yourself in the shoes of your user and browse your site. If you think that pages are responsive, load quickly and there are no issues with visual stability, then great! The chances are that your performance is more than acceptable and you should focus your attention elsewhere.

Page Speed Resources

Improving page speed and Core Web Vitals can sometimes feel like a jargon-filled nightmare. If you’re struggling to get your head around some of the key concepts, check out the following resources from our expert team:

11) Structured Data

Structured data, also known as Schema, is a type of code that provides search engines with additional context about your content. 

Schema helps search engines better understand specific details and information on your pages, allowing them to improve the appearance of your content in results pages by generating rich snippets.

Common Types of Structured Data

  • Product Schema
  • Article Schema
  • Breadcrumb Schema
  • FAQ Schema
  • Local Business Schema
  • Organisation Schema
  • Review Schema
  • Event Schema
  • Video Schema
  • Person Schema
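Structured data is usually added to a page as a JSON-LD script in the `<head>`. A minimal sketch of generating Product schema – all of the product values below are illustrative:

```python
import json

# Illustrative product details; real values would come from your product data
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Pepperoni Pizza (Stuffed Crust)",
    "image": "https://www.example.com/images/pepperoni-pizza-stuffed-crust.webp",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "GBP",
        "price": "9.99",
        "availability": "https://schema.org/InStock",
    },
}

# Wrap the JSON-LD in the script tag that goes into the page <head>
snippet = ('<script type="application/ld+json">'
           + json.dumps(product_schema)
           + "</script>")
print(snippet)
```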

Here’s an example of recipe schema at work, enhancing the appearance of these sites’ pages in the SERPs:

An example of recipe schema enhancing search appearance.

To quickly assess the structured data on your site, you can use tools like Google Search Console, or website crawlers such as Screaming Frog and Sitebulb. These will give you a snapshot of how well your structured data is set up.

If you’re adding or updating structured data, using a schema markup generator can be a huge time saver. Depending on your CMS, there may even be a plugin that can automatically generate types of schema for you.

Finally, once your structured data is generated and live, be sure to use schema.org’s Schema Validator to see if everything is working properly and compliant.

Your Leading On-Page SEO Service Provider

Now we’ve covered all the most important on-page SEO techniques and tactics for online success, you should feel more confident applying them to your own website.

However, if you’d prefer some help from the experts, look no further than our award-winning team at The SEO Works. With over 15 years of proven success in the industry, we’ve delivered outstanding results for clients across various sectors.

Get in touch with our team today to get a free SEO audit and learn more about how our on-page SEO services can take your website to the very top of page 1!

The post On-Page SEO Guide: How To Optimise A Web Page in 2025 appeared first on The SEO Works.

]]>
Orphan Pages Guide: Impact on SEO and How to Find & Fix https://www.seoworks.co.uk/orphan-pages/ Mon, 16 Dec 2024 08:00:00 +0000 https://www.seoworks.co.uk/?p=16633 Orphan Pages Guide blog header imageInternal linking is invaluable for both users and search engines to discover other pages of your website.  When pages have no incoming links to them, orphan pages are created. In this guide, we’ll define what an orphan page is, how they affect SEO, and how you can find, fix and monitor orphan pages to prevent...

The post Orphan Pages Guide: Impact on SEO and How to Find & Fix appeared first on The SEO Works.

]]>

Internal linking is invaluable for both users and search engines to discover other pages of your website. 

When a page has no incoming internal links, it becomes an orphan page.

In this guide, we’ll define what an orphan page is, how they affect SEO, and how you can find, fix and monitor orphan pages to prevent them from harming your SEO performance. We’ll also answer some frequently asked questions.

What is an orphan page?

Orphan pages are web pages that don’t have any internal links from other pages of your site. You may sometimes see them called isolated pages.

Think of an orphan page like a place you’d want to visit – only there are no roads connecting you to it, so you can’t get there. This presents a major problem.

How do orphan pages impact SEO?

Just like the analogy of the place with no connecting roads, orphan pages stand alone and are not visited by users. They can impact SEO in various ways.

The page is not indexed

Internal links act as a signpost for search engines to find, crawl and index other pages. Without incoming internal links, orphan pages are hard to find. 

There’s a much higher chance of orphan pages not being indexed, which means they will not be shown to users in search results. 

Orphan pages can, however, be discovered, crawled and indexed if they are in your sitemap.

You should do all you can to make it easy for search engines to find and index your content.

The page struggles to rank for keywords

Even in the case that orphan pages are added to Google’s index, it’s going to be a real struggle to get those pages to rank for any keywords. 

Internal links pass link equity – value passed from one page to another. So without that link equity, you’re not signalling to search engines that those pages are important. 

Link equity is just one of the many factors that can help your content rank for keywords.

Users do not discover the content

You could have really useful information on the orphaned pages, but without links to them, users don’t know they exist. 

Orphan web pages create a bad user experience as helpful content is essentially invisible to users if it can’t be found.

Causes of orphan pages

We know that orphan pages have no incoming links. But what are some of the common causes that create orphan pages?

  • New pages are not added to the menu navigation or site structure
  • Site hierarchy has undergone a restructure but internal links haven’t been considered
  • Site migration hasn’t been properly planned, so pages are left orphaned. Migrations should always be carefully considered. Check out our site migration checklist 
  • Pages are not included in the sitemap
  • Products are not categorised properly, so can’t be found
  • Outdated pages are still live, such as past events or old campaigns
  • Internal links and site as a whole are not regularly audited or updated (Google favours fresh, up-to-date content)

How to find orphan pages on a website

Orphan pages can be identified using auditing tools.

Orphan page checker tools

Sitebulb

Screenshot of Sitebulb crawler flagging orphan pages as isolated URLs
Sitebulb refers to orphan pages as ‘isolated URLs’ and will give you a list of affected pages

You can crawl your site with Sitebulb and it will provide SEO hints, with the most urgent being flagged as ‘critical’ or ‘high’. Sitebulb calls orphan pages ‘isolated URLs’ and these can be found in the Indexability report.

Sitebulb also integrates with Google Analytics and Search Console data for further insights.

Screaming Frog

Screenshot of Screaming Frog spider flagging orphan page issues
The Screaming Frog spider detects orphan pages

Screaming Frog is another tool that allows you to crawl your site. Screaming Frog provides lots of data, including orphan pages on your site.

Screaming Frog also has the option of integrating with your Google Analytics and Search Console data.

Ahrefs

Screenshot of Ahrefs' Site Audit tool that has highlighted orphan pages
Ahrefs’ Site Audit will highlight if there are orphan pages

Ahrefs’ Site Audit crawl will highlight if there are orphan pages present on your website. Make sure you enable crawling of your sitemap.

Semrush

Screenshot of Semrush's Site Audit tool
Semrush’s Site Audit tool will flag any orphan pages

Semrush’s Site Audit tool will crawl your site, highlighting any issues including orphan pages.

How to fix orphan pages

Now we’ve used a tool to identify them, it’s time to fix the orphan pages of your site.

Orphan pages are easy to fix. You just need to add links from relevant pages to the orphan page. 

Consider linking in your navigational menu, sitemap, from the homepage and other relevant pages.

How else to address orphan pages

If you don’t want to build links to your orphan pages, there are other ways to approach them.

Add a redirect

If a page is orphaned because it’s been replaced by a new version of the page, implement a 301 redirect.

301 redirects keep any link equity, and direct both search engines and users to the newest version of the page.
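How a redirect is implemented depends on your server or CMS, but the underlying logic is a simple lookup from old path to new. A minimal, framework-free sketch (the paths are illustrative; in practice this mapping would live in your server config or a redirect plugin):

```python
# Map of retired URLs to their replacements (illustrative)
REDIRECT_MAP = {
    "/black-blue-shoes/": "/black-shoes/",
}

def handle_request(path: str):
    """Return an HTTP status and headers for a requested path."""
    if path in REDIRECT_MAP:
        # 301 = moved permanently; passes link equity to the new URL
        return 301, {"Location": REDIRECT_MAP[path]}
    return 200, {}

print(handle_request("/black-blue-shoes/"))  # (301, {'Location': '/black-shoes/'})
```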

Add a noindex tag

Snippet of code showing noindex tag on a website instructing all search engines to noindex the page

Perhaps the page was intentionally left out of the navigational menu and site structure. In which case, you should add a noindex tag that instructs search engines not to add that page to their index. Users will then not be able to find the page from search results. 

Because there’s still a chance that crawlers can find orphan pages via the sitemap or external backlinks, a noindex tag is the best way of ensuring that orphan pages don’t harm your site performance.
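The tag itself is a robots meta element in the page’s `<head>`. Here’s a small sketch that checks whether a page’s HTML declares noindex – the sample markup is illustrative:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Detects <meta name="robots" content="...noindex..."> in page HTML."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if (tag == "meta"
                and attrs.get("name", "").lower() == "robots"
                and "noindex" in attrs.get("content", "").lower()):
            self.noindex = True

page_html = '<head><meta name="robots" content="noindex, nofollow"></head>'
checker = NoindexChecker()
checker.feed(page_html)
print(checker.noindex)  # True
```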

Delete the orphan page

In cases where a 301 redirect or noindex tag are not viable options, you may consider deleting the orphan page. 

Before you delete it, check that the page is not providing any value, such as external backlinks.

Preventing & monitoring orphan pages

Regularly monitoring orphan pages is important to prevent them from harming your SEO efforts.

We recommend running a crawl with an auditing tool on a regular basis. You should also check for any errors in Google Search Console. 

Be especially vigilant around any big site changes, such as migrations or site restructuring.

Final thoughts

You may create the best optimised content, but orphan pages can hinder your SEO if not dealt with properly. Orphan web pages are much less likely to be indexed, so users can’t find the content and it will struggle to rank for keywords.

For expert advice, get in touch with us for a free website review.

Orphan pages FAQs

Are orphaned pages bad?

Orphan pages can be bad for your SEO efforts. Because they have no links to them, orphan pages are less likely to be found, crawled and added to Google’s index. This can harm your ability to rank for keywords, and prevent users from finding your pages.

Are empty pages bad for SEO?

Having empty pages with no content or having thin content can be bad for SEO. Your site is much less likely to rank high for keywords with thin or no content. It can especially harm your SEO efforts if you have a lot of empty pages or thin content.

How to fix ‘orphan page has no incoming internal links’?

It’s easy to solve ‘orphan page has no incoming internal links’. Use an auditing tool and run a crawl to identify the orphan pages. Once you know which pages are orphaned, link from other relevant pages of your site to the orphan pages. You may also add them to your menu structure.

Can Google crawl orphan pages?

Google is still able to crawl and index orphan pages if they are linked from your XML sitemap, have any incoming redirects or canonicals that reference the orphan page, or are found via external backlinks. There is, however, a much smaller chance that Google’s bots can find, crawl and index orphan pages because of the lack of internal links.

Can orphan pages be indexed?

Orphan pages are much less likely to be indexed than pages that have internal links pointing to them. There’s a small chance they may still be found, crawled and indexed if they are linked from your XML sitemap or have external backlinks.

How to find orphaned pages in Google Search Console?

Site auditing tools such as Sitebulb and Screaming Frog allow integration with Google Search Console data. 

  1. Go to the ‘Pages’ Performance report in Search Console. Select the largest date range possible and ensure Impressions are included in the data. 
  2. Export the URLs and upload the list to a crawling tool like Sitebulb or Screaming Frog. 
  3. Run a crawl with Search Console data and one without, and compare the two. If a URL is missing from the crawl data without Search Console integrated, you have an orphan page.
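The comparison in step 3 boils down to a set difference: URLs Google knows about, minus URLs reachable by internal links. A minimal sketch with illustrative URL lists:

```python
# URLs found by following internal links in a crawl (illustrative)
crawled_urls = {
    "https://www.example.com/",
    "https://www.example.com/services/",
}

# URLs known to Google, e.g. exported from the Search Console Pages report
search_console_urls = {
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/old-campaign/",  # gets impressions, has no inlinks
}

# Anything Google knows about that the crawl can't reach is an orphan candidate
orphan_candidates = search_console_urls - crawled_urls
print(sorted(orphan_candidates))
```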

The post Orphan Pages Guide: Impact on SEO and How to Find & Fix appeared first on The SEO Works.

]]>
How Google Decides Which Brands It Likes: 50 Site Case Study https://www.seoworks.co.uk/how-brand-impacts-seo/ Mon, 02 Dec 2024 08:00:00 +0000 https://www.seoworks.co.uk/?p=16555 How-Google-Decides-Which-Brands-It-Likes-50-Site-Case-Study blogIn the world of SEO, there are always two sides to the story. The first is what Google says is happening. The second is what the SEO community sees happening in practice. If you asked Google what’s happened over the last couple of years, they will tell you that they have been focusing on rewarding...

The post How Google Decides Which Brands It Likes: 50 Site Case Study appeared first on The SEO Works.

]]>

In the world of SEO, there are always two sides to the story. The first is what Google says is happening. The second is what the SEO community sees happening in practice.

If you asked Google what’s happened over the last couple of years, they will tell you that they have been focusing on rewarding “helpful, reliable, people-first content” that demonstrates E-E-A-T.

It’s clear that Google have been making significant adjustments to the search algorithm. But what that looks like in practice is somewhat different to how they’ve described it.

In practice, what the SEO community has witnessed includes: the introduction of AI overviews on a huge percentage of informational queries; dramatic increases in visibility for large forum sites and select legacy media sites; and drastic decreases in visibility for independent media sites and many small businesses.

At the same time, few in the SEO community would report any improvement in the “helpfulness” of the content that Google is serving in the search results. On the contrary, there have been thousands of cases of Google punishing top-quality expert-written content, and replacing it with content written by anonymous accounts on forums (like Reddit), or with generic low-value content on non-expert sites (like Forbes).

So what’s really going on with the algorithm? I went in search of an explanation.

Like many in the SEO community, I had the hunch that the strength of a brand had a lot to do with whether Google likes a website or not.

But how does an algorithm evaluate the strength of brand? This study provides a partial answer to that question.

I sifted through a tonne of data on 50 brands, and ended up focusing on these key metrics:

  • Year-on-year organic user growth. This is the metric by which I sort the “winners” and “losers”
  • Organic traffic as a % of total of all site traffic
  • Non-branded organic traffic as a % of total of all site traffic
  • % of branded vs non-branded organic traffic
  • % of paid Google traffic as a % of total of all site traffic (including search, cross-network, display, shopping)
  • Domain Rating (Ahrefs)
  • Domain age
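For clarity, the traffic-mix percentages above can be computed like this – all of the figures are made up for illustration:

```python
# Illustrative monthly traffic figures for one brand
total_traffic = 10_000
organic_traffic = 3_000
branded_organic = 2_400       # organic visits from searches naming the brand
paid_google_traffic = 4_000   # search, cross-network, display, shopping

non_branded_organic = organic_traffic - branded_organic

organic_pct = 100 * organic_traffic / total_traffic          # organic share
non_branded_pct = 100 * non_branded_organic / total_traffic  # generic share
paid_pct = 100 * paid_google_traffic / total_traffic         # paid share

print(organic_pct, non_branded_pct, paid_pct)
```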

Key Findings

As I mentioned, my hunch was that the algorithm has a way of understanding which brands it likes, and which it doesn’t like. But what data does the algorithm use to decide which brands to reward?

My key three findings were:

  • The algorithm likes sites that have an overall low % of non-branded organic traffic
  • The algorithm dislikes sites that have a higher % of non-branded organic traffic
  • The algorithm likes sites that use Google ads for a large % of their overall traffic

I will break down the data on each of these below, and then expand on the less conclusive data.

Naturally, I’ll save the big takeaway for last. The final section highlights what I think is the most important finding – what I’ll call the “golden ratio” of overall traffic make-up.

About the sample

The study is based on 50 brands, so any conclusions are, of course, tentative. A larger sample could potentially confirm the findings. On the other hand, it’s possible that a larger sample could reveal that the findings were inaccurate.

The brands in the sample cover a very wide range of sectors. However, it is perhaps significant that the sample is slightly skewed towards small and medium businesses (fewer than 250 employees), although it does include ten large businesses (greater than 250 employees).

Estimated employee range graph

The majority of the businesses in the study (70%) have annual revenues below £10m, with the other 30% of the businesses exceeding £10m in revenue.

Estimated revenue range graph

This table demonstrates how I have categorised the different brands in the sample. The findings focus chiefly on the “very bad performers” and the “excellent performers,” which is where trends were most noticeable.

Performance is determined by percentage change in organic users year-on-year. At the bad end of the scale, brands have experienced a dip in organic traffic whereas at the other end of the scale, brands have experienced significant organic growth year-on-year.

Table 1

Trend 1: the algorithm likes sites with overall low % of non-branded organic traffic

A strong trend amongst the excellent performers was a low percentage of non-branded organic traffic in relation to the site’s total traffic. To calculate this figure, I looked at the site’s total organic traffic, and used Semrush to get an estimate of how much of that traffic was non-branded (i.e. generic).

8 out of 10 of the excellent performers received under 20% of their traffic from the non-branded organic channel. In reality, I think that 9 out of 10 would actually be the more accurate figure, as there are some inaccuracies in the data. Either way, 8 of 10 is enough to demonstrate the trend.

Table 2

Trend 2: the algorithm likes sites that pay Google for a large % of overall traffic

The second trend is that the excellent performers tend to pay Google (through Google Ads) for a large percentage of their overall traffic. The data showed that 7 out of 10 excellent performers were using Google ads for at least a third of their traffic. In fact, 3 out of these 10 were paying Google for over two-thirds of their total traffic. On the other hand, the worst performers were generally only paying Google for a very small amount of traffic, or none at all.

Table 3

The algorithm dislikes sites that have a higher overall % of non-branded organic traffic

Based on the above assertion that Google likes sites with a low percentage of non-branded organic traffic, you might suspect that the inverse is true, and, indeed, it is. Aside from two outliers amongst the “good performers,”² high non-branded organic traffic correlated closely with very poor performance.

Table 4

Inconclusive findings

Naturally, I came upon many dead-ends in this study before patterns started to emerge. But a couple of these dead-ends are worth sharing, so here they are.

Old domains

Having an old domain didn’t seem to have a big impact on performance. Although there was a slight tendency for good performers and excellent performers to have an old domain, quite a few bad performers also had old domains. I think this is because the bad performers group contained a lot of big, old companies in competitive niches where change happens very slowly.

Table 5

High DR

High Domain Rating (according to Ahrefs) did not seem to make much difference to overall performance. I think this is most likely because the Domain Rating required to compete varies greatly between niches. It is notable that only one of the excellent performers had a Domain Rating of over 40.

Table 6

Low DR

Remarkably, I found that the high performers tended to have a fairly low Domain Rating. This was quite unexpected, but potentially significant. My explanation would be that many of the top performers are companies who had historically not done much SEO work, such as link-building. Rather than investing in SEO and link-building, they have focused on other channels, such as paid Google traffic, which Google prefers, and tends to reward.

Table 7

The Golden Ratio

Now here’s the interesting bit. Considering my hunch that “brand” is important, I was expecting to find that branded vs non-branded (generic) traffic data would reveal a trend. 

Namely, I was expecting that brands that received a high percentage of branded searches relative to non-branded searches would perform better, and that brands with a low percentage of branded searches would perform worse. It turns out it’s a little more complex, and a lot more interesting than that.

The table below looks at three of the worst performing brands, who all had a very low percentage of branded traffic. The table also includes three of the top performing brands who happened to also have a very low percentage of branded searches.

What this comparison reveals is that it doesn’t matter if not many people are searching for a brand. As long as organic traffic makes up an overall low proportion of total traffic, and as long as a brand is paying Google for a significant proportion of their traffic, they can outperform in organic search.

Table 8

Google’s Reasons For This Approach

Don’t get me wrong, I’m not in a hurry to defend Google here, but I do think that all of this makes some sense.

You probably will have noticed the implication that paid traffic is a ranking factor. On the one hand, of course, there is the cynical reason to think that Google likes sites that pay them for ads.

But on the other hand, being able to pay Google tells Google that that particular brand has resources. If they have resources, it’s an indication of sorts that we are dealing with a legitimate brand, rather than just a well-SEOed shell that basically doesn’t exist outside of Google Search.

However, Google liking sites that pay them is only half of the picture. The other essential part of the picture is that Google doesn’t like sites that have high non-branded (generic) organic traffic. Google likes sites with low generic organic traffic.

Why might this be? A great study from Cyrus Shepard suggests that Google seems to be punishing “good SEO.” And I think this makes a lot of sense. Google no longer wants to rank the brands that have the best SEO. It wants to find more reliable, less “gameable” ways of deciding who should rank.

That’s why the non-branded organic traffic metric is so important. This metric needs to be in the correct proportion to other traffic sources, as this demonstrates a balanced and natural-looking make-up of traffic. A healthily low proportion of non-branded organic traffic seems to suggest to Google that it is dealing with a “holistically” successful brand, as opposed to a brand that just happens to have good SEO.

Because, in fairness, users aren’t looking for “good SEO.” And Google is, arguably, trying to serve its users. At least, it was until recently.

Conclusion

This study was based on the hypothesis that Google has a way of identifying which brands it likes, and rewarding them with greater visibility in organic search.

The hypothesis was based on my own hunches, and the hunches of many others, from what we have seen, anecdotally, on the ground in the world of SEO. I set out to see what the data might reveal–unsure what I would find–and, personally, I think it has revealed a couple of real insights.

Like many in the SEO industry, I don’t think that the Helpful Content Update made Google Search better. And like many, I am doubtful about how serious Google is about serving the best content to its users.

But the point of this study is not to ask whether Google Search is getting better or worse. The point is to understand the mechanisms by which the algorithm decides to reward one site and punish another.

I think this study does point to a certain logic, whether you agree with it or not, by which the algorithm attempts to identify and reward certain brands in organic search.

The findings point to a future of SEO in which SEO professionals must be increasingly mindful of the part that SEO plays within the wider marketing mix. It highlights how increasingly challenging it is for brands to beat their competition at SEO alone. In fact, the findings suggest that the algorithm could be preventing brands from winning with organic search if they are not investing sufficiently in other channels. And whilst that will pose a challenge to laggards in the field of the SEO, in the end, that may not be such a bad thing.

Footnotes

¹ I attribute the relatively large numbers of bad performers with low generic traffic to the fact that many of these are quite big, incumbent businesses in competitive niches. As we will see later, many of these sites have old domains. They also tend not to pay much for Google ads. Many of these sites could be described as entrenched and slow moving, and overall not very proactive with SEO or search marketing generally.

² I looked into these two outliers. Notably, they were both in niches where there was fierce SEO competition, and where no one was paying Google for traffic.

The post How Google Decides Which Brands It Likes: 50 Site Case Study appeared first on The SEO Works.

]]>
A Guide to Broken Links: How to Find & Fix Dead Links https://www.seoworks.co.uk/broken-links-guide/ Mon, 04 Nov 2024 08:00:00 +0000 https://www.seoworks.co.uk/?p=16387 A Guide to Broken Links: How to Find & Fix Dead Links header imageYou’re browsing a website and discover a link to some content you’d like to read – it’s just what you’ve been searching for. You click the link, and you’re met by ‘404 – page not found’.  Broken links on a website can make for a very frustrating search experience. They are bad for user experience...

The post A Guide to Broken Links: How to Find & Fix Dead Links appeared first on The SEO Works.

]]>

You’re browsing a website and discover a link to some content you’d like to read – it’s just what you’ve been searching for. You click the link, and you’re met by ‘404 – page not found’. 

Broken links on a website can make for a very frustrating search experience. They are bad for user experience and can harm your SEO efforts too. 

Broken links are also more common than you’d think. Rather aptly, when researching this blog, I found a 404 page on one website! 

In this guide, we define what a broken link is, explore common causes of broken links and their impact on SEO, and explain how to find, fix, and prevent broken links from occurring. We also answer some frequently asked questions about dead links.

What is a broken link?

A broken link – or dead link – on a website is a hyperlink to a page or resource that no longer exists. Once it’s clicked, you are met with the ‘404 – Page Not Found’ message. 

Broken links can be:

  • Internal links (linking to another page of your website)
  • External links (linking out to another website)
  • Broken backlinks (another site links to a broken page on your site, so the backlink no longer exists)
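Checking for dead links boils down to requesting each link target and flagging error statuses. A sketch with the HTTP fetch stubbed out so the logic is clear – in practice you would issue real HEAD requests with `urllib.request` or a crawler, and the URLs below are illustrative:

```python
def find_broken_links(links, fetch_status):
    """Return links whose HTTP status indicates a dead end (4xx/5xx)."""
    return [url for url in links if fetch_status(url) >= 400]

# Stubbed statuses for illustration; a real checker would hit the network
statuses = {
    "https://www.example.com/contact-us/": 200,
    "https://www.example.com/cntact-s/": 404,
}

broken = find_broken_links(statuses, lambda url: statuses[url])
print(broken)  # ['https://www.example.com/cntact-s/']
```

Injecting the fetch function like this also makes the checker easy to test without network access.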

Dead links are a common issue that can occur for several reasons.

Page has moved or been deleted

404 Page not found error message
Broken links will display a ‘404 – Page not found’ error message

If a page has been deleted or has moved location from the original URL, any links to the original URL will become broken. Users will be met by the 404 status code if a redirect is not implemented, or if the existing links are not updated. 

For example, the bikes category page has been deleted from an online store as these are no longer sold. However, existing links to that page across the site are not removed, resulting in broken links. Any backlinks pointing to that page will also become broken.

Change in URL or site structure (no redirects)

Where site structure or URL structure changes have happened across the site, 301 redirects should be implemented. If this hasn’t happened, links will become broken.

For example, a recent site restructure has taken place to better target keywords.
Old URL: /black-blue-shoes/
New URLs: /black-shoes/ and /blue-shoes/ 

A redirect should be put in place and existing links to this page should be updated.

Contact us URL is misspelled
Links that are inputted incorrectly will also result in a broken link

Dead links may be caused by a simple error, such as a typo in the URL or incorrect formatting. 

For example: https://www.seoworks.co.uk/cntact-s/ should be https://www.seoworks.co.uk/contact-us/

Restricted access

Password protected page
Password-protected pages restrict access

Links may not lead users to the correct destination if they have restricted access.

For example, a password-protected page or a page behind a firewall.

File is removed

Broken image icon
Links may be broken by files being removed, such as this broken image icon

Just as a 404 is returned when a page no longer exists, the same applies when a linked file is removed.

For example, a restaurant menu PDF no longer exists, or an image has been deleted.

Plugin issues

Issues with plugins malfunctioning can cause links to break, or HTML or JavaScript errors may cause elements of the page to break. 

For example, a plugin causes dead links from social sharing buttons.

Domain name change (no redirects)

When a site changes domain name without redirects in place, links to pages on the old domain will no longer reach their destination – users may hit a 404 or a server error instead of the content they expected.

For example, a company rebrands from Sheffield Office Supplies to Yorkshire Office Supplies. 

Redirects should always be mapped out in advance of a domain change. Check out our site migration checklist.

Broken links can negatively impact your SEO efforts in several ways:

  • Dead links frustrate users, who are then less likely to convert (e.g. make a purchase or an enquiry) or to return to your site in future 
  • When Google’s crawlers hit broken links, their ability to crawl and index your site is hindered. All of your SEO efforts, like creating great content, count for nothing if pages can’t be crawled and indexed – leaving them no chance of ranking 
  • Keyword rankings are determined by many factors, including how users engage with your site. If a lot of users ‘bounce’ back to the search results after hitting dead links, this sends a bad signal to Google, and your content is unlikely to rank highly
  • Google favours fresh, updated content. Whether you have a lot of broken links on your site (internal and external), or inbound links to your site hit a dead end, Google may assume your content is outdated and your site poorly maintained. This again has a knock-on effect on rankings and performance

Now we’ve identified some common causes of broken links and know the negative impact they have on SEO, let’s explore how you can find broken links.

Various broken link checker tools will provide you with the data to find broken links.

Auditing tools such as Sitebulb, Screaming Frog, Semrush and Ahrefs provide data on broken links.

Sitebulb auditing tool identifies broken internal and external links on your site
Auditing tools such as Sitebulb crawl your site and identify broken internal and external links

You can also add extensions to your Chrome browser that will flag any dead links on the page. Add-ons such as Check My Links are free from the Chrome web store.

Google Search Console

Google Search Console Indexing Report - Pages not found 404
Search Console’s Indexing report flags 404 page not found errors and other indexing issues

Search Console’s Indexing report will provide you with an overview of which pages aren’t indexed, including any 404 (not found) errors. We recommend checking Search Console regularly.

You can also review all internal and external links using the Links report.

Google Analytics

Google Analytics Pages and Screens Report to find 404 pages
Google Analytics Pages and Screens Report to find 404 pages

Google Analytics also enables you to find broken links on your website. Using the Pages and Screens Report (which can be found under Engagement), select ‘Page title and screen class’ in the drop-down menu. Add ‘page path and screen class’ as a secondary dimension. Search for ‘404’ or ‘page not found’ in the search box.

Alternatively, you can also check through all the pages of your site, clicking each link. This can, however, be time-consuming – especially if you have a lot of content. 

Manual checks will only identify broken links on your website, and not inbound backlinks.
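If you’re comfortable with a little scripting, these checks can be partly automated. Below is a minimal Python sketch, not a production crawler: the URL list is assumed to come from elsewhere (for example a crawl export), and the status-fetching function is injectable so the logic can be tested without network access. All function names here are our own.

```python
# Minimal broken-link checker sketch. Assumes you already have a list of
# URLs to test; swap in your own list or crawl export.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def default_fetch_status(url, timeout=10):
    """Return the HTTP status code for a URL, or None if the host is unreachable."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-checker"})
        return urlopen(req, timeout=timeout).status
    except HTTPError as e:
        return e.code   # e.g. 404 for a dead page
    except URLError:
        return None     # DNS failure, refused connection, etc.


def find_broken_links(urls, fetch_status=default_fetch_status):
    """Return (url, status) pairs for links that did not return a 2xx/3xx status."""
    broken = []
    for url in urls:
        status = fetch_status(url)
        if status is None or status >= 400:
            broken.append((url, status))
    return broken
```

A dedicated auditing tool will still be more thorough (it discovers the URLs for you, follows JavaScript links and so on), but a script like this is a quick way to re-test a known list of previously broken URLs.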

How you fix a broken link will depend on its type.

You have a few options for fixing the dead links on your site – whether they are internal or external links. 

  • Update the old link to the new URL. Not only will this fix the link, but it will also avoid the extra ‘hop’ that occurs when a 301 redirect is in place (if the link is internal). The new URL will take the user directly to the linked page 
  • Remove the old link. You may want to remove the old link entirely if you cannot find the new URL or a suitable alternative
  • If it’s an internal link, you could add a 301 redirect. You might choose this option if the URL has moved location, or to redirect users to a similar product or service. It is always better to input the new URL rather than implement a redirect if it’s the same product. Do not redirect users to irrelevant pages – this will be bad for rankings and user experience

While the internal and external links on your site are easier to fix, recovering broken backlinks to your site can prove to be an effective link-building method. 

Recovering your lost backlinks will involve reaching out to the website owner and asking them to update the link. You can read more in our blog about broken link-building & other methods.

With broken links found and fixed, it’s important to prevent them from harming your SEO efforts in future.

When undergoing any major site changes like a site or URL restructure, make sure to update your links.

We recommend regularly checking for dead links across your site using auditing tools such as Sitebulb or Semrush. 

Google Search Console’s Links report is a great tool for reviewing internal and external links as a whole. You can also identify pages that could benefit from more internal links. Find out more about the benefits of internal linking.

Final thoughts

Broken links can negatively impact your site’s performance. They are bad for users and search engines, so it’s important to regularly review your links.

Want to see how your site is performing? Contact us for a FREE website review.

Broken link FAQs

What is the difference between a broken link and a 404?

A broken link leads to content that no longer exists, whereas 404 (page not found) is the status code you usually see that tells you the page could not be found. The 404 status code is served after you click the broken link.

What is another name for a broken link?

A broken link is also known as a dead link. It’s a hyperlink on a website that links to content that no longer exists. It may also be referred to as link rot.

Should I fix dead links?

Dead – or broken – links should be fixed so they don’t impact your SEO. With dead links on your website, you can either update or remove the link, or add a 301 redirect if the link is internal. Website owners will need to be contacted to update broken backlinks to your site.

How do I find broken backlinks?

Auditing tools like Semrush or Ahrefs will help you to identify lost links to your site, as well as broken competitor links. Broken links can be an effective link-building method. Read our blog to learn more about broken link-building & other methods.

Will Google remove broken links from its index?

Google’s crawlers will eventually remove dead links from their index if they are not dealt with. It may take a few weeks to a few months for Google to remove dead links from its index. This will depend on many factors, including how often Google crawls your site. If you don’t want your content de-indexing, it’s best to regularly audit your links.

The post A Guide to Broken Links: How to Find & Fix Dead Links appeared first on The SEO Works.

]]>
How We Transformed the Organic SEO of a Skin Surgery Brand https://www.seoworks.co.uk/seo-for-skin-surgery-brand/ Mon, 23 Sep 2024 08:00:00 +0000 https://www.seoworks.co.uk/?p=16277 image of a graph with a scalpelWe recently had the privilege of leading a transformative SEO campaign for a chain of private skin surgery clinics – this is the story of how we did it. The company we partnered with was a well-known and respected brand. With a strong reputation and a loyal customer base, they had successfully established clinics across...

The post How We Transformed the Organic SEO of a Skin Surgery Brand appeared first on The SEO Works.

]]>

We recently had the privilege of leading a transformative SEO campaign for a chain of private skin surgery clinics – this is the story of how we did it.

The company we partnered with was a well-known and respected brand. With a strong reputation and a loyal customer base, they had successfully established clinics across the UK.

Despite its existing success, the brand wanted to further solidify its dominance in two of its core locations: Birmingham and London. These cities were home to their most established and profitable clinics, with the most potential for further expansion.

In both Birmingham and London, the market for private skin surgery is highly competitive, with many other clinics vying for the attention of potential clients. Our client understood that to maintain and grow their leadership position, they needed to continuously attract new clients whilst retaining existing ones.

Your money or your life

In SEO, few challenges are as exciting – and daunting – as helping a client in the highly competitive YMYL (Your Money or Your Life) sector.

YMYL is a term used by Google to classify websites that can potentially impact a person’s future happiness, health, financial stability, or safety. The YMYL sector includes topics that are key to individuals’ well-being and life decisions – including health and medical information.

Because of the potential impact these topics can have on a person’s life, Google applies stricter scrutiny to YMYL pages. The search engine emphasises the importance of E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) for these types of content, meaning that websites in the YMYL sector must provide highly accurate, reliable, and credible information to rank well in search results.

This is done to protect users from misleading, incorrect, or harmful information.

With bags of experience in healthcare marketing, and a rigorous focus on SEO best practices, we knew we were capable of steering the client’s website to growth despite the challenges their sector poses. 

Understanding the client’s needs

Understanding client needs is the first requirement in any digital marketing or SEO campaign, because it forms the foundation for developing an effective and tailored strategy.

SEO isn’t just about applying a set of best practices; it’s about understanding the context of the website and its market. No two SEO campaigns are the same.

Factors such as the client’s industry, the competitive landscape, the current state of their website, and their business objectives all influence the development of an SEO strategy. By deeply understanding the client’s specific needs, we can tailor our approach to address their unique situation rather than applying a generic, one-size-fits-all solution.

We also believe that SEO should not operate in a vacuum. The ultimate goal is to support the client’s broader business objectives, whether that’s increasing leads, driving sales, building brand awareness, or reducing reliance on paid search. Understanding these goals helps in designing an SEO strategy that directly contributes to the client’s success. 

This detailed understanding also helps when creating content. Content that is highly relevant to the audience’s needs and interests is more likely to rank well in search engines and convert visitors into customers.

This means understanding not just who the target audience is, but also what they are searching for, the problems they need solutions to, and the language they use to describe those problems.

Our client came to us with a clear objective: grow and consolidate their market share in Birmingham and London. With significant monthly spend on PPC campaigns in those areas, they were keen to also see if they could reduce costs by boosting their organic search presence. 

Developing a multi-faceted strategy

To achieve this goal, we needed to increase their visibility in search results for location-specific and also selected short-tail keywords, as well as improving the overall quality of their website.

We devised a comprehensive strategy that focused on three key areas: location silos, broadening website quality, and conversion rate optimisation.

1. Building location silos for targeted visibility

We knew that targeting local search terms was critical for driving organic traffic to the client’s clinics in Birmingham and London. To do this, we created dedicated content silos for these locations.

This involved developing “Skin Clinic Birmingham” and “Skin Clinic London” hubs, each designed to capture location-specific search queries.

From these main pages, we linked out to individual service pages for key procedures such as “Mole Removal.” Each page was meticulously crafted with bespoke content, including FAQs developed internally and also from the “People Also Ask” sections in search engine results.

This ensured that our content was not only relevant but also directly answered the questions potential clients were asking.

FAQs panel

Additionally, we worked closely with the client to optimise their Google Business profiles for both Birmingham and London, encouraging them to also develop a proactive review approach. This helped bolster the local SEO efforts and ensured that their location pages ranked as strongly as possible.

2. Enhancing website quality across the board

Improving the site’s overall technical and content quality was our next priority. Because we had introduced location-based silos, this meant we could pivot the top-level navigation pages to target short-tail keywords that would attract a broader audience.

Short-tail keywords are typically more general and have higher search volumes than long-tail keywords. By optimising top-level navigation pages for these broad terms, you increase the likelihood of these pages appearing in search results for a wider audience.

To achieve this we went through a thorough process of content and metadata optimisation, refining the URL and site structure, and addressing technical issues like page speed and internal linking.

One of the more challenging aspects of this process was dealing with PPC-related pages that were cannibalising organic traffic. When both PPC and organic pages target the same keywords, they can end up competing for the same search queries.

This can cause search engines to split the relevance signals between these pages. As a result, neither page may rank as well as a single, well-optimised page would, leading to lower overall visibility in organic search.

To mitigate this, we no-indexed a significant number of these pages, preventing them from competing with our organic content. This detailed approach ensured that our SEO efforts had the maximum possible impact.

3. Improving conversions with conversion rate optimisation

As an agency, we want to help people improve their organic rankings and website traffic – but our main priority is to help them get more customers online and grow. 

So this means improving traffic is only part of the battle; converting that traffic into leads and consultations is equally important. Whilst not part of the initial brief, we convinced the client to look at CRO.

We implemented a rigorous conversion rate optimisation (CRO) process, starting with the installation of Microsoft Clarity to gain insights from heatmaps and user recordings.

Based on this data, we made several strategic changes to the core landing pages:

  • We introduced a carousel for “Before & After” photos to reduce scroll depth and keep visitors engaged.
  • We repositioned calls-to-action (CTAs) higher on the page, ensuring they were visible to the large proportion of users who scrolled less.
  • We added short biographies of the lead surgeon to build trust with potential customers.
  • We made customer testimonials more prominent to enhance credibility further.

These changes were also implemented on the PPC landing pages, leading to a significant increase in conversion rates for key services like Gynecomastia, which jumped from 4% to 10%.

client review panel

Overcoming challenges

Every campaign faces hurdles, and ours was no different. One of the major challenges we encountered was keyword cannibalisation.

Initially, many top-level navigation pages ranked for Birmingham-related terms, which conflicted with our new Birmingham silo. 

We resolved this by:

  • De-optimising the pages within the top-level navigation for terms related to Birmingham;
  • Removing the address of the head clinic in Birmingham from pages within the top-level navigation;
  • Interlinking to the Birmingham silo from the footer & Contact Us page;
  • Assisting the brand in configuring their Google Business profiles to make it clear the business was a chain of clinics based around the country;
  • Manually submitting all pages in the Birmingham silo within Google Search Console.

Another challenge was operating within a YMYL industry, where accuracy and credibility are paramount. We tackled this by ensuring all medical content was rigorously reviewed and approved by the client’s internal team.

We also emphasised the expertise of the clinic’s principal surgeons through detailed biographies and focused on building E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) signals throughout the site.

Delivering growth

So did we achieve our mission of delivering SEO growth for the client? Definitely. Organic traffic increased by 321.7% year-on-year, with a significant surge as the campaign progressed and our strategies took full effect.

graph showing traffic

Keyword visibility also saw a massive boost, with visibility for tracked keywords increasing from 3.9% to 12.9% – an overall increase of 230.8%.

graph showing keyword visibility

Our location-based silos performed exceptionally well. The Birmingham silo alone attracted over 800 visits in August and the London silo achieved similar success.

In terms of conversions, organic search goal completions more than doubled year on year! What’s more, this strategy is scalable. As the clinic’s presence is developed in other major cities, we have continued to work with them to build out other location hubs targeting more local searches.

Perhaps most importantly, we achieved our client’s goal of reducing their reliance on PPC. Organic traffic’s share of total website traffic increased from 18.4% to 45.1%, while PPC traffic’s share dropped from 61.9% to 45.7%.

This shift not only lowered their monthly healthcare PPC spend but also enhanced the overall efficiency and ROI of their digital marketing efforts.

Driving results within the health and medical sector

This campaign stands out as a great example of how targeted healthcare SEO strategies can drive significant results in a highly competitive market. Despite the challenges of working within the YMYL (Your Money or Your Life) space and the ambitious goals set by our client, we exceeded expectations across the board. 

By combining strong technical SEO, precise content strategies, and data-led conversion rate optimisation (CRO), we were able to transform the client’s digital presence, drive quality traffic, and significantly boost conversions.

For any business facing similar challenges, this case study demonstrates the power of a well-rounded, data-driven approach to SEO. It’s not just about driving traffic—it’s about driving the right traffic, and then turning those visitors into valuable leads and customers.

The post How We Transformed the Organic SEO of a Skin Surgery Brand appeared first on The SEO Works.

]]>
What Is A Sitemap? The Different Types, Uses & SEO Best-Practice https://www.seoworks.co.uk/what-is-a-sitemap/ Mon, 09 Sep 2024 08:00:00 +0000 https://www.seoworks.co.uk/?p=16159 What Is A Sitemap? Blog headerMaps have been an important tool used by humans for years. They provide us with a sense of direction and help us navigate unfamiliar surroundings. As much as maps are useful in the real world, they are just as essential within the digital landscape. Sitemaps are a website’s blueprint, ensuring that users and search engine...

The post What Is A Sitemap? The Different Types, Uses & SEO Best-Practice appeared first on The SEO Works.

]]>

Maps have been an important tool used by humans for years. They provide us with a sense of direction and help us navigate unfamiliar surroundings. As much as maps are useful in the real world, they are just as essential within the digital landscape. Sitemaps are a website’s blueprint, ensuring that users and search engine crawlers can find their way around.

In this article, we explore what a sitemap is, find out more about the different types of sitemap, and dig deeper into their benefits and importance within the search landscape.

Table of Contents

What is a sitemap?

A sitemap is a file that shows the structure of a website, listing out important URLs. The purpose of a sitemap is to help search engines find, crawl and index the content on a website, and some sitemaps help users navigate a website.

There are different types of sitemaps, which are influential for SEO and improving your online visibility.

The different types of sitemaps

Standard XML sitemap: The most common type of sitemap that is often read by search engine crawlers to understand a website.

Image sitemap: A type of XML sitemap that helps search engines find all the images on a website.

Video sitemap: Another type of XML sitemap that helps search engines find all the videos on your website. Video schema has largely replaced the need for a video sitemap.

News sitemap: This helps Google find the content hosted on your website that is suitable for Google News.

HTML sitemap: A sitemap that is a regular page designed to help users navigate a website. This also helps ensure all important website pages are internally linked by at least one other page.

The two most common types of sitemap you will find on a website are the XML and the HTML sitemap.

XML sitemaps

An XML sitemap is a file that lists the important pages of a website to help search engines like Google discover, crawl, and index them more efficiently. XML stands for ‘Extensible Markup Language,’ which is a format designed to store and transport data.

The pages sitemap generated by Yoast on The SEO Works website.

Above, you can see an example of the ‘Pages XML Sitemap’ on our website.

There are several components that make up a sitemap:

  1. A list of all the important URLs on the website that you want to be indexed on search engine results pages (SERPs).
  2. The <lastmod> attribute, which shows search engine crawlers when content was last updated.
  3. Hreflang attributes, which can also be set up in the sitemap to tell Google more about your website if you host content for different languages and locations.
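Putting those components together, a minimal standard sitemap might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourwebsite.com/</loc>
    <lastmod>2024-09-01</lastmod>
  </url>
  <url>
    <loc>https://www.yourwebsite.com/blog/</loc>
    <lastmod>2024-08-15</lastmod>
  </url>
</urlset>
```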

HTML sitemaps

An HTML sitemap is a web page that lists and links to the important pages of a website. This provides a clear and organised overview of its content. Unlike XML sitemaps, HTML sitemaps are designed for human visitors.

The HTML sitemap on The SEO Works' website.

When ‘sitemaps’ are discussed in SEO, people are commonly referring to an XML sitemap as opposed to an HTML sitemap. For the remainder of this article, when we refer to a ‘sitemap’ we mean an XML sitemap.

Do I need a sitemap?

According to Google Search Central, Google can usually discover the pages on your website without a sitemap, provided they’re properly linked. 

They define proper linking as every page you consider important being reachable via your navigation, or via links on other pages of your website.

Google states that you may need a sitemap if your website fits any of the following criteria: 

  1. You have a large website.
  2. Your website is new and doesn’t have many links from other websites.
  3. Your site has a lot of images and videos, or you’d like your articles to appear in Google News.

Google also says that some sites may not need a sitemap:

  1. Your site is small (500 pages or less).
  2. Your website has been correctly internally linked.
  3. You don’t have many videos, images, or news pages on your website.

For the above examples, you probably don’t need a sitemap – but both we and Google still recommend that you have one.

What are the benefits of having a sitemap?

Whilst not all websites need a sitemap, we highly recommend that you have one due to the potential benefits towards your SEO performance. Here are some of the key benefits from having a sitemap:

Improved crawlability and indexing

Sitemaps help search engine crawlers, such as ‘Googlebot’ find and index pages on a website. This is important for both large and new websites, as you’re helping Google find the pages you’d like them to index. Sitemaps also include information about each URL, such as when it was most recently updated. This helps search engines prioritise and understand your content better.

Inform search engines about new content

Sitemaps can inform search engines such as Google about new or updated content. This can lead to faster indexing and appearance within the search results.

Help technical tools locate orphan pages

If your sitemap contains every important URL on your website, it can be a helpful tool to identify orphan pages.

Technical SEO tools such as Screaming Frog and Sitebulb work by following links on a URL (such as your homepage) until they’ve reviewed all the URLs they can find. If a page is not included within the website’s linking structure, it’s known as an ‘orphan page’ and these tools will not be able to find them without knowledge of your sitemap.

If you submit a sitemap alongside the website’s root URL, technical SEO tools can cross-reference the URLs on the sitemap against the URLs they can find on your website. When a tool finds a URL in the sitemap, but not the website’s linking structure, an orphan URL has been identified.
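To illustrate the cross-referencing idea, here is a minimal Python sketch (the function names are our own, and the logic is a simplified version of what auditing tools do): it extracts the URLs listed in a standard XML sitemap and compares them against a list of URLs found by following links.

```python
# Sketch of orphan-page detection: sitemap URLs minus crawled URLs.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def urls_from_sitemap(xml_text):
    """Extract the <loc> values from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]


def find_orphan_pages(sitemap_urls, crawled_urls):
    """Pages listed in the sitemap but never reached by following links."""
    return sorted(set(sitemap_urls) - set(crawled_urls))
```

Any URL returned by `find_orphan_pages` is a candidate orphan page: it exists in the sitemap, but nothing on the site links to it.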

How do I locate my sitemap?

Finding a sitemap on your website involves checking a few common locations where sitemaps are placed. Here are the steps to find a sitemap:

Check standard URL locations

  • Many websites place their sitemap at common locations. Try adding /sitemap.xml or /sitemap_index.xml to your root URL. For example:
    • https://www.yourwebsite.com/sitemap.xml
    • https://www.yourwebsite.com/sitemap_index.xml
  • Check your website’s robots.txt file, which often contains a link to the sitemap. You can find this file by going to ‘https://www.yourwebsite.com/robots.txt’. Look for a line starting with ‘Sitemap:’ which will point to the sitemap’s URL.
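For reference, the relevant line in a robots.txt file is typically just the Sitemap directive (the domain below is a placeholder):

```
# robots.txt – the Sitemap directive points crawlers at your sitemap
Sitemap: https://www.yourwebsite.com/sitemap.xml
```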

Use Google Search Console

If you have access to your website’s Google Search Console account, you can find submitted sitemaps under the ‘Sitemaps’ section. This shows the sitemaps that Google knows about.

Talk to your developer

If the above methods don’t work, you may need to talk with your developer to help you locate the sitemap.

It’s possible that the sitemap hasn’t been created or made accessible. You may need to create and submit one yourself using a CMS tool or an online sitemap generator.

Sitemap best practices

A sitemap is a great tool for improving your website’s visibility and ensuring efficient indexing by search engines. Following these best practices will help maximise its effectiveness.

Create a sitemap

Simply creating a sitemap is essential for ensuring that search engines can effectively crawl and index the pages on your website. Here are several options for how you can create one:

Use a CMS plugin

If you use a CMS such as WordPress, we recommend using a plugin such as Yoast to create a sitemap. Yoast will automatically create an up-to-date sitemap of your website when you add new content, such as a new blog or product.

Use an online tool

If you have a hard-coded website without a CMS, an online tool is the easiest option to generate a sitemap. This XML Sitemap Generator tool will create a sitemap for you based on your site structure. As this will not automatically update, you will need to create a new sitemap when new content is added to your website.

Custom scripts

For hard-coded websites, consider working with a developer to create a custom script that generates and updates your sitemap based on the live URLs on your website.

Use smaller sitemaps

If you have a larger website, it’s worth splitting your sitemap into several smaller sitemaps. This involves creating a sitemap index, breaking groups of pages down based on sections.

Sitemap index on The SEO Works' website

The Yoast plugin on WordPress does this automatically.

Smaller sitemaps can lead to more efficient crawling by search engines. Using a sitemap index can be beneficial for websites with thousands of pages.
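A sitemap index is itself a small XML file that points at the child sitemaps. A minimal illustrative example (the file names and domain are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.yourwebsite.com/post-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.yourwebsite.com/page-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```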

Submit and check your sitemap with Google Search Console

To ensure your sitemap is recognised and used by Google, follow these actions:

Submit your sitemap

  • Log in to your Google Search Console account.
  • Navigate to the ‘Sitemaps’ section and enter the URL of your sitemap (e.g., https://www.yourwebsite.com/sitemap.xml), and submit it.
  • Ensure Google processes your sitemap and identifies all included URLs.
Google Search Console Sitemap report

Check for errors

  • Use the Page indexing (formerly Coverage) report in Google Search Console to check for errors related to URLs in your sitemap.
  • Address any errors such as URLs that couldn’t be indexed, crawl issues, or other warnings.
  • Check your sitemap as your site grows to maintain the best indexing and crawling efficiency.

Using a sitemap for international SEO

Does your website target people from around the world? If yes, it should serve appropriate content to different users in different regions and languages.

Your sitemap can play a vital role in optimising your website for international SEO. The process involves adding hreflang tags to your sitemap, which is the method of international targeting preferred by many webmasters.

Hreflang tags are crucial for indicating to search engines which version of a web page is intended for which language and region.
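As an illustration, hreflang annotations sit inside each URL entry of the sitemap and require the xhtml namespace; the URLs and language codes below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.yourwebsite.com/en-gb/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://www.yourwebsite.com/en-gb/"/>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.yourwebsite.com/en-us/"/>
  </url>
</urlset>
```

Each language version should list itself and every alternate, and the alternates should reciprocate.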

Explore more about managing multi-regional and multilingual sites with Google’s guidelines.

Sitemap FAQs

Here are some frequently asked questions (FAQs) we receive about sitemaps:

What should I include in my sitemap?

Include all important URLs that you want search engines to index. This typically includes:

  • Main pages (home, about, contact)
  • Blog posts
  • Product and category pages
  • Media content (videos, images)
  • Any other significant content

Are there any size limits for sitemaps?

Yes, sitemaps have size limits:

  • A single sitemap can contain up to 50,000 URLs.
  • The uncompressed file size of a sitemap should not exceed 50MB.

If your sitemap exceeds these limits, divide it into multiple sitemaps and use a sitemap index file.
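If you generate sitemaps yourself, the 50,000-URL limit is easy to respect by batching. A minimal Python sketch (the function and constant names are our own):

```python
# Split a long URL list into sitemap-sized batches of up to 50,000 URLs each,
# ready to be written out as child sitemaps referenced by a sitemap index.
SITEMAP_URL_LIMIT = 50_000


def chunk_urls(urls, limit=SITEMAP_URL_LIMIT):
    """Yield successive batches of at most `limit` URLs, one per sitemap file."""
    for start in range(0, len(urls), limit):
        yield urls[start:start + limit]
```

In practice you would also need to check the 50MB uncompressed size of each generated file, since a batch of very long URLs could hit the size limit before the URL-count limit.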

What is the difference between an HTML sitemap and an XML sitemap?

An HTML sitemap is designed for website visitors to use, whereas an XML sitemap is designed for search engines.

What common errors should I avoid with my sitemap?

  • Ensure all URLs in the sitemap are working and lead to live pages instead of broken links.
  • Avoid including URLs with duplicate content from elsewhere on your website in your sitemap.
  • Make sure to include only canonical URLs to prevent cannibalisation.
  • Ensure that hreflang tags are correctly implemented if you’re targeting international audiences.

What is the difference between a sitemap and a robots.txt file?

Both the sitemap and the robots.txt file are used by search engines to understand your website better. 

A sitemap provides a list of URLs that you want a search engine to index, and the robots.txt file provides instructions about which pages or areas of your website should not be crawled and indexed.

Need a little help with your website?

Struggling to get your head around sitemaps? Don’t have the time on your hands to optimise your website yourself? Require some additional support to increase your rankings on Google? 

Request a free SEO audit for your website today. We’ll help you understand the performance of your site and how you can get more customers online.

The post What Is A Sitemap? The Different Types, Uses & SEO Best-Practice appeared first on The SEO Works.

Creative & Often Underutilised Link Building Methods In SEO

https://www.seoworks.co.uk/creative-and-underutilised-seo-link-building-methods/ (published Mon, 12 Aug 2024)

Despite SEO changing over time, link building has remained a key strategy for boosting website authority and ultimately search engine rankings. However, when most people think of link building they think of digital PR and expensive outreach campaigns. Whilst link acquisition methods such as these are effective, they are not the only techniques for success.

From utilising existing content to reclaiming lost links, in this article we will explore creative and often under-utilised link acquisition methods that can generate high-quality backlinks. 

Whether you’re looking to diversify your link building strategy or simply avoid the challenges of outreach, our guide offers fresh insights and actionable tips to help you achieve your SEO goals.

In today’s digital landscape it is critical to stay ahead of the competition. Continue reading for innovative SEO link building techniques that will enhance your brand and keep you ahead in the market.

Brand Mentions

Every brand mention is an opportunity to gain a backlink, so it pays to make sure each one links back to your site. This link building approach involves actively monitoring online platforms for brand mentions and contacting the website owners to request a link.

By converting brand mentions into backlinks, you can greatly improve your SEO performance and increase traffic to your site. This link acquisition strategy is particularly beneficial if the referring site has a strong domain rating.

Tools such as Ahrefs or SEMRush can identify cases where a brand has been mentioned without a link. Once identified, we recommend contacting the site owners or authors, thanking them for the mention and requesting that they convert it into a clickable link. This approach involves finding existing content that already acknowledges your brand, making it easier to obtain a backlink.

Using Ahrefs as an example, digital marketers can use the platform’s Content Explorer feature to identify unlinked brand mentions. Searching for your brand name (and excluding your own domain) will reveal instances of your brand being mentioned without a referring link. This SEO link building method is often overlooked but can be hugely beneficial for a business.

ahrefs example screenshot
This image from Buzzstream highlights how the Ahrefs Content Explorer feature can be used to identify cases where a brand has been mentioned without a link.
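For a rough sense of what such a tool checks, here is a minimal Python sketch (our own illustration, not how Ahrefs works internally) that flags a page as containing an unlinked mention when the brand name appears in text outside any anchor linking to the brand’s domain:

```python
import re
from html.parser import HTMLParser


class MentionScanner(HTMLParser):
    """Collects text inside links pointing at `domain` separately from all other text."""

    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self._in_brand_link = 0
        self.linked_text = []
        self.other_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if self.domain in href:
                self._in_brand_link += 1

    def handle_endtag(self, tag):
        if tag == "a" and self._in_brand_link:
            self._in_brand_link -= 1

    def handle_data(self, data):
        # Text inside a brand link is a linked mention; everything else is plain text
        (self.linked_text if self._in_brand_link else self.other_text).append(data)


def has_unlinked_mention(html, brand, domain):
    """True if `brand` appears in the page text outside any link to `domain`."""
    scanner = MentionScanner(domain)
    scanner.feed(html)
    pattern = re.compile(re.escape(brand), re.IGNORECASE)
    return bool(pattern.search(" ".join(scanner.other_text)))
```

In practice you would run a check like this over pages surfaced by a search or crawl, then prioritise the strongest referring sites for outreach.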

Broken Link Building

Although there are many link acquisition techniques, broken link building stands out as a simple but under-utilised method. By winning back links that competitors have lost to dead pages and reinstating lost links to your own website, you can significantly enhance your site’s authority and visibility.

Despite its simplicity and effectiveness, many digital marketers often overlook these tactics in favour of more complex strategies. This oversight presents a unique opportunity for those willing to invest in the straightforward but lucrative method of broken link building.

Broken link building involves locating links from related websites that point to broken pages, whether on your own site or a competitor’s. The aim of this SEO link building method is to reinstate lost backlinks or to win backlinks that competitors have lost. Using tools such as SiteBulb or Screaming Frog, marketers can pinpoint broken links on high-authority websites in their industry.

Once a suitable replacement is found on your site, it is recommended to contact the webmaster or content manager, notifying them of the broken link and suggesting the new link as a suitable alternative. 

This approach helps in acquiring a new backlink and supports webmasters in upholding the standard of their website. Below are some key Ahrefs steps for finding broken competitor links:

  1. Go to Site Explorer
  2. Enter the domain of a competitor
  3. Navigate to the Best By Links report
  4. Filter for “404” pages
  5. Sort the report by Referring domains from highest to lowest

Once broken competitor links have been identified, it is important to assess the value of each opportunity and explore the possibility of creating a page that can serve as a replacement.
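The filtering and sorting in steps 4 and 5 can also be reproduced on an exported report. The sketch below assumes a hypothetical CSV export with url, status and referring_domains columns (not an official Ahrefs format):

```python
import csv
import io


def broken_link_targets(csv_text, min_referring_domains=1):
    """Return 404 pages sorted by referring-domain count (highest first),
    mirroring the workflow: filter for 404s, then sort by links."""
    rows = csv.DictReader(io.StringIO(csv_text))
    broken = [
        (row["url"], int(row["referring_domains"]))
        for row in rows
        if row["status"] == "404"
        and int(row["referring_domains"]) >= min_referring_domains
    ]
    return sorted(broken, key=lambda item: item[1], reverse=True)
```

Sorting by referring domains first means outreach effort goes to the dead pages with the most link equity to reclaim.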

Reclaiming Lost Links

Over time, some backlinks may be removed or lost due to changes in website content or structure. For example, a tour operator may remove travel destinations they no longer serve from their website without considering whether those pages have any valuable backlinks. Use backlink tracking tools to identify these lost links.

Once identified, reach out to the person in charge of managing your website and ask them to reinstate the links or provide updated content that can replace the lost links. This proactive approach ensures that valuable backlinks are maintained, ultimately supporting your SEO strategy.

Nofollow Links

Nofollow links are backlinks marked with a rel="nofollow" attribute, which asks search engines not to pass link equity through them. Although their direct SEO value is therefore limited, they still have the potential to increase brand awareness and drive traffic.

Pursuing nofollow links from high-traffic sites can indirectly benefit SEO through increased visibility and potential future dofollow links. This SEO link building method helps to ensure a diverse backlink profile which contributes to overall online presence.
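As a quick illustration of how nofollow is expressed in markup, this small Python sketch (the class name is our own) splits a page’s links into nofollow and dofollow buckets based on each anchor’s rel attribute:

```python
from html.parser import HTMLParser


class LinkAuditor(HTMLParser):
    """Sorts outbound links into nofollow and dofollow lists
    by inspecting the rel attribute of each anchor tag."""

    def __init__(self):
        super().__init__()
        self.nofollow = []
        self.dofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "nofollow ugc"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel_tokens else self.dofollow).append(href)
```

Running a check like this across a backlink export gives a quick picture of how balanced your link profile is.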

Creative Content Assets

Creative link building techniques, such as infographics, quizzes, calculators, interactive maps, and original research, provide engaging ways to earn backlinks. Infographics distil complex information into visually appealing images, which encourages sharing.

Quizzes and calculators provide interactive experiences that engage users and increase engaged sessions. Interactive maps offer unique insights and localised data, attracting interest and backlinks. Original research boosts authority, prompting other sites to reference your findings.

These content assets boost user engagement, enhance shareability, and naturally attract high-quality backlinks. Below is an example of an SEMRush infographic which outlines key stats about social media networks. The image simplifies the information and presents it elegantly.

SEMrush infographic about social media networks
An infographic created by SEMRush which presents information about social media networks in a visually appealing format.

Conclusion

Whilst traditional SEO link building methods like outreach campaigns and digital PR remain effective, exploring alternative link acquisition techniques can significantly enhance your SEO strategy.

The link building methods discussed throughout this blog can help you diversify your backlink profile and boost your website’s authority and search engine rankings. Additionally, balancing nofollow and dofollow backlinks ensures a healthy link profile.

Adopting these innovative approaches to link building in SEO not only streamlines your link acquisition process but also keeps you ahead of the competition. As SEO continues to evolve, staying adaptable and creative with your link building strategies will be key to achieving sustained digital growth.

However, it must be noted that although link building in SEO is important, it is only one part of several SEO tactics that require attention to maximise performance. 

At The SEO Works, we have helped companies across the UK and beyond boost their SEO efforts with our award-winning link building services and SEO strategies. From small startups to large international companies, our team of SEO experts can help boost your online presence.

Want to know how your website is performing? Contact us today for a FREE website review.

The post Creative & Often Underutilised Link Building Methods In SEO appeared first on The SEO Works.
