Mastering the Language of SEO/SEO Audit

With over 100 SEO checkpoints combined into a phase-by-phase programme, this course can help your business become fluent in SEO, whether you want us to perform an audit, implement an effective strategy, or you’re just here for some tips.

Inspired by the modern methods of language learning, we decided to take a new approach to boring ‘SEO checklists’. Focusing on building SEO fluency in easy, digestible chunks, our SEO audit provides a comprehensive and holistic review of your SEO performance and keeps your wider business goals in mind.

Module 1: An Introduction

Website Health Check and Site Level Overview

In this module, we look at the top level metrics that are important for SEO, to give you an idea of how healthy your website looks in terms of visitors, retention, and the top keywords bringing them here.

  • Traffic: the number of web users visiting your website, reported on a monthly timeline.

  • Bounce rate: the percentage of visitors to a particular website who navigate away from the site after viewing only one page.

  • Authority Score is a compound domain score that grades the overall quality of a website or a webpage. The higher the score, the more assumed weight a domain's or webpage's backlinks could have. We triangulate this score across Semrush, Seobility, and Ahrefs.

  • Average rank is the average of the rankings for the keywords you track for a day, week, or month.

  • The position of a singular page or keyword in SERPs (Google’s Search Engine Results Pages). For example, a page about ‘the best chocolates in the UK’ could rank in position 3 on the first page of Google for the keyword “best chocolate uk”.

  • A report of your top 5 ranking keywords and the pages on your site they deliver traffic to.

Module 2: Getting the Basics Right

Website Structure & Architecture, Security & Usability, Indexing and Crawlability

This module is all about making sure your website has the right structure to succeed, troubleshooting any security or UX issues and making sure search engines are given the right information to properly crawl your site and help you show up at the top of SERPs.

  • A 10-point heuristic evaluation of your website following Jakob Nielsen’s usability heuristics, followed by a site heatmap to identify patterns within user interaction and pinpoint user friction.

  • Making sure URLs are simple, easy to read and include keywords related to the content.

    Making sure URLs don’t include any characters that are not recommended such as underscores and capital letters.

  • Multiple redirects in a chain can negatively impact how pages are indexed by crawlers like Googlebot. A high number of redirects may even result in the crawler giving up. Generally, a page loses about 10% of its strength with each redirect. In turn, this negatively impacts SEO.
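As a rough illustration of how chained redirects compound, the sketch below follows a hypothetical redirect map and applies the 10%-per-hop rule of thumb described above (the URLs and the loss figure are illustrative, not a Google specification):

```python
# Hypothetical redirect map: each URL points at the URL it redirects to.
REDIRECTS = {
    "/old-page": "/interim-page",
    "/interim-page": "/new-page",
}

def redirect_chain(url, redirects, max_hops=10):
    """Follow a redirect map, returning every URL visited in order."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        chain.append(url)
    return chain

def retained_strength(hops, loss_per_hop=0.10):
    """Estimate link equity kept after a number of redirect hops."""
    return (1 - loss_per_hop) ** hops

chain = redirect_chain("/old-page", REDIRECTS)
hops = len(chain) - 1
print(chain)                              # ['/old-page', '/interim-page', '/new-page']
print(round(retained_strength(hops), 2))  # 0.81
```

On this rule of thumb, collapsing the chain so that /old-page redirects directly to /new-page would preserve 90% rather than 81% of the original strength.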

  • 404 is a status code that tells the user that a requested page is not available. 404 and other response status codes are part of the web's Hypertext Transfer Protocol response codes. The 404 code means that a server could not find a client-requested webpage. This is bad for visibility because it tells search engines your content is outdated and your site is hard to navigate.

  • A page without any links to it is called an orphan page. Search engines like Google usually find new pages by following links from one page to another, or by finding the URL listed in your XML sitemap. Because orphan pages are harder for bots to discover, their visibility and potential to be indexed on SERPs are reduced.

  • Search engines build a ‘picture’ of your content from the information on the page. Schema markup helps them understand the meaning of your content by providing structured information that makes this picture clearer.

    Schema markup can help crawlers understand your site more quickly and can increase click-through rates by enabling rich snippets, which display additional relevant information on SERPs. This is especially valuable for recipes, events, and priced products.
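As a minimal sketch of what this looks like in practice, the snippet below builds schema.org Recipe markup as JSON-LD; the recipe details are invented for illustration:

```python
import json

# Invented recipe details, structured per schema.org's Recipe type.
recipe_schema = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Classic Chocolate Brownies",
    "author": {"@type": "Person", "name": "Example Author"},
    "prepTime": "PT15M",   # ISO 8601 duration: 15 minutes
    "cookTime": "PT25M",
    "recipeIngredient": ["200g dark chocolate", "175g butter", "3 eggs"],
}

# This JSON would be embedded in the page's <head> inside a
# <script type="application/ld+json"> element.
json_ld = json.dumps(recipe_schema, indent=2)
print(json_ld)
```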

  • Noindex means that a web page shouldn't be indexed by search engines and therefore shouldn't be shown on the search engine's result pages. Pages with this tag won’t show up in search. This can be useful for pages that are not valuable for the search engine to index but are necessary for functions within the website e.g. ‘thank you’ pages, member login pages, author archives etc.

  • A 302 redirect lets search engines know that a website or page has been moved temporarily. This type of redirect should only be used if users need to be sent to a different site or page for a short period of time, such as when a page is being redesigned or when a website is being updated. 302 redirects don’t pass the SEO value of a page to their destination, and thus should never be used for permanent redirects.

  • From an SEO point of view, a meta refresh redirect is not the best way of redirecting because, as the name suggests, it's actually a page refresh rather than a redirect. Redirecting using a 301 redirect is always recommended.

  • An XML sitemap is a file that lists a website's important pages, making sure Google can find and crawl them all. It also helps search engines understand your website structure. You want Google to crawl every essential page of your website. Having a sitemap in itself is not a ranking factor, but not having one may cause some pages not to be crawled and thus not indexed by Google.

  • A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, block indexing with noindex or password-protect the page. The robots.txt file contains directives for search engines on how best to crawl the website.
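Python's standard library includes a robots.txt parser, which is handy for checking how crawlers will interpret your directives. The rules below are invented for illustration:

```python
from urllib import robotparser

# Sample robots.txt: block the members area, but allow the signup page.
# The more specific Allow line is placed first, since this parser applies
# the first rule that matches.
rules = """\
User-agent: *
Allow: /members/signup
Disallow: /members/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "/blog/seo-tips"))    # True
print(rp.can_fetch("Googlebot", "/members/account"))  # False
print(rp.can_fetch("Googlebot", "/members/signup"))   # True
```

Remember that this only discourages crawling: a page blocked here can still end up indexed if it is linked from elsewhere, which is why noindex is the right tool for keeping pages out of search.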

  • Problems with the server could be the source of many search engine optimisation issues. These have to be checked when we see clear red flags. If your site’s server is returning status codes 5XX, then something is definitely off, and, yes, you may be experiencing a traffic loss as a result. If server errors continue for more than 2-3 days, Google may start to deindex your site starting with the pages that are crawled the most (the most popular web pages usually drop first). 

  • Indexing pages that require users to log in to view the content will make it difficult to rank in Google with those pages. This doesn’t always need to be fixed, but if so, we will list affected pages case by case.

  • Mixed content occurs when a page served over https includes resources loaded over insecure http. When a site uses http rather than https, the connection is vulnerable to eavesdropping and attacks; with https, the connection is secured by SSL encryption. Mixed content is an issue because the unsecured http content undermines the security of the https page.

  • Google predominantly uses the mobile version of a site’s content for indexing and ranking. Historically, the index primarily used the desktop version of a page’s content when evaluating its relevance to a user’s query. Mobile SEO is therefore an integral part of growing your business online: if you want your business to grow through SEO, you must account for users who search for it on mobile devices. A robust mobile SEO plan will help you reach leads interested in your business.

  • A tap target is any element on a web page that a user interacts with. These include action buttons, links, ads, etc. that a user taps on when accessing a web page using a touchscreen. When not sized appropriately, this is bad for mobile users and thus the website would score lower on the mobile-first index, which is undeniably important to move up in Google’s search rankings.

  • You require a minimum of 16px or 12pt font for your site to be considered mobile-friendly. This is crucial in the mobile-first index.

  • Search engines can sometimes index multiple versions of your homepage, which is a duplicate content issue. A canonical tag should be used to tell them which version is the official one.

  • Feedback on websites can include thank you notices when filling out applications, green ticks to signal a transaction has gone through, and even noises to increase the connection between your decision making and your brain.

    Haptic feedback is the physical sense of touch, which can be triggered by textures, vibrations etc.

    Feedback and haptic marketing create a better user experience by providing extra information and emotional cues which reassure and inspire confidence in the user. Think of it like body language for your website!

Module 3: Building the Foundations

Site Speed & Performance and Core Web Vitals

In this module, we dig into site speed and performance, including Google’s Core Web Vitals, to make sure your pages load quickly and offer a smooth experience for both users and search engines.

  • Broken JavaScript and CSS can cause problems for the user. Quality coding and scripting on your site help improve its ranking.

  • AMP is a web development framework for creating faster-loading, static-content web pages. While it was once promoted for improving SEO through speed, AMP is no longer a Google ranking factor as of 2021. However, if your website uses AMP, there is no need to change this.

  • Identifying your PSI Score gives you a metric from which you can improve. The PageSpeed Insights Score ranges from 0 to 100 points. A higher score is better and a score of 85 or above indicates that the page is performing well.

  • First Contentful Paint (FCP) is a metric used to define when the first bit of content is rendered from the DOM (Document Object Model). In essence, it’s the first indication to the user that the site is loading.

    To provide a good user experience, sites should strive to have a First Contentful Paint of 1.8 seconds or less. 1.8-3 seconds is considered in the ‘needs improvement’ category and anything over 3 seconds is poor.
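The thresholds above can be expressed as a small helper; this is a sketch assuming the 1.8-second and 3-second bands described here:

```python
def classify_fcp(seconds):
    """Bucket a First Contentful Paint time into the reported bands."""
    if seconds <= 1.8:
        return "good"
    if seconds <= 3.0:
        return "needs improvement"
    return "poor"

print(classify_fcp(1.2))  # good
print(classify_fcp(2.4))  # needs improvement
print(classify_fcp(3.5))  # poor
```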

  • Largest Contentful Paint (LCP) is an important Core Web Vitals metric which measures when the largest content element in the viewport becomes visible. It can be used to determine when the main content of the page has finished rendering on the screen.

    The most common causes of a poor LCP are slow server response times, render blocking resources and improper use of lazy loading requests. To provide a good user experience, sites should strive to have Largest Contentful Paint of 2.5 seconds or less.

  • Lazy loading is a performance optimisation technique that delays the loading of non-critical assets, such as images, videos, or other media, until they are needed.

    However, lazy loading is often implemented incorrectly, especially by CMSs that apply a site-wide lazy-loading policy, meaning that critical elements are lazy loaded, leading to a poor LCP score.

  • The Total Blocking Time (TBT) measures the total amount of time a web page is blocked from responding to user input during its load process by measuring the time between First Contentful Paint and Time to Interactive (how long the site takes to become fully interactive).

    A good TBT should be less than 300 ms on an average device and network connection.

  • Speed Index (SI) is a page load performance metric that shows you how quickly the contents of a page are visibly populated. It is the average time at which visible parts of the page are displayed.

  • INP (Interaction to Next Paint) is a metric that assesses a page's overall responsiveness to user interactions by observing the latency of all click, tap, and keyboard interactions that occur throughout the lifespan of a user's visit to a page. The final INP value is the longest interaction observed, ignoring outliers.

  • Cumulative Layout Shift (CLS) is a measure of a website's instability. This measure determines whether a website behaves as the user expects it to behave. One of the most frustrating aspects of an unstable website is that the page's content shifts as the user views it.

    The Cumulative Layout Shift score is calculated by multiplying the Impact Fraction by the Distance Fraction: CLS = Impact Fraction × Distance Fraction. For example, CLS = 0.645 × 0.179 ≈ 0.115. The CLS score keeps rising as the impact and distance fractions increase.

    CLS can be affected by images missing explicit width and height attributes and by dynamically injected content.
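The calculation works out as follows, using the example figures from this section (a good CLS score is 0.1 or less):

```python
# Example figures: the shift touches 64.5% of the viewport and the
# unstable element moves 17.9% of the viewport height.
impact_fraction = 0.645
distance_fraction = 0.179

cls = impact_fraction * distance_fraction
print(round(cls, 3))  # 0.115, just above the 0.1 'good' threshold
```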

  • Render-blocking resources are scripts, stylesheets, and HTML imports that block or delay the browser from rendering page content to the screen. When the browser encounters a render blocking resource, it stops downloading the rest of the resources until these critical files are processed. Removing these can significantly improve page speed.

  • Reducing unused JavaScript can reduce render-blocking behaviour to speed up your page load and improve your visitors' page experience. By default, JavaScript files are render-blocking because they block the browser from dealing with other page load tasks, thus delaying your page's First Paint. This can be helped through various methods such as removing unused JavaScript from specific pages, deferring or delaying JavaScript.

  • Server response time is the time that passes between a client requesting a page in a browser and a server responding to that request. It is measured by TTFB (Time to First Byte). TTFB is how many milliseconds it takes to receive the first byte of the page after sending an HTTP request.

  • Preloading a resource is a browser instruction to fetch a critical asset, like a script, font, or image, sooner than its natural discovery order.

    This ensures the resource is available early in the page's lifecycle, leading to faster rendering and a better user experience. 

    For example, you can use preloading to anticipate user navigation. By fetching the next page's resources when a user hovers over a menu link, you can eliminate or significantly reduce the waiting time after they click.

  • Static caching means that when a browser requests a resource, the server providing it can tell the browser how long to temporarily store, or cache, that resource. For any subsequent request, the browser uses its local copy rather than going to the network to fetch it. Efficient caching helps improve website speed.

  • A common mistake but one that is easy to fix is using inefficient media formats.

    WebP is a next-gen image format that provides lossless and lossy compression as well as animation and alpha transparency for web images. This means that smaller—but significantly richer—images can be used on your site to make your pages load faster.

  • When a script is asynchronous, it will load simultaneously with other scripts, meaning that script A and script B can now load at the same time, thereby speeding up the overall loading of a page. This can be applied to both CSS and JS. Deferring non-critical elements means that parts of the code on a page will be deferred until the necessary elements have already loaded. This way, chunks of code that are not needed from the beginning won’t be render-blocking.

  • A critical request chain is a sequence of interdependent requests that are crucial for rendering content on your web page.

    Since they must be loaded sequentially, long chains of critical requests can significantly slow down a web page's rendering.

    These dependencies are processed in the order dictated by the browser's critical rendering path, which converts code into the visible page. To minimise performance impact and reduce the time it takes for a page to display content, these render-blocking chains should be minimised or broken up.

Module 4: Connecting Ideas

Backlink Analysis

Links are crucial for SEO as they signal to search engines that your content is trustworthy, valuable, and relevant. Here, we look at your strategy for gaining high-authority referring links, your use of outbound links, and establishing a hierarchy of content within your site.

  • A broken link is a link which no longer takes users to its intended destination because the page has either been moved, no longer exists, or because the URL has been entered incorrectly by the owner.

  • As the name suggests, authority links are backlinks from websites that have established a level of trustworthiness and value. By obtaining a link from one of these sources, some of this authority is passed on to your site, almost like a job reference from a respectable employer.

  • A look at the linking strategy for landing pages aimed at converting visitors, the relevance of those pages to your content strategy, and their keyword usage.

  • A look at the domains referring to your site, assessing their authority, relevance, value etc. The more high-quality referring domains, the better.

  • Remember we said that links from high-value, trustworthy sources transfer some of their authority to your page. One caveat is when links are marked as nofollow, an instruction to search engines not to transfer this authority to you. However, it is recommended to have a healthy mix of these links to show your site has acquired them naturally and has not artificially manufactured backlinks.

  • Trust Flow represents the quality of backlinks to URLs and websites. A web page with a higher Trust Flow than Citation Flow is typically associated with high-quality links.

    Imagine you’re an author: this score is an indicator of the quality of the references you have received, perhaps from top critics and publishers.

  • Citation Flow reflects the quantity of links pointing to a given website. This metric does not distinguish between high-quality and low-quality links.

    This is an indicator of your perceived popularity, as you are receiving references from a wide variety of sources (good or bad).

    Imagine you’re an author again: this score would reflect a wide number of reviews of your work, from top critics to low-level bloggers.

  • When it comes to your internal linking strategy, anchor text (the clickable text that takes you to a new page) provides additional context for the reader and crawlers about what the linked page is about. Clear and diverse usage of anchor text provides a better user experience, helps crawlers understand the site architecture, and prevents confusion about which pages are most relevant to a specific topic.

    Considered usage of anchor text can have positive effects on both user experience and SEO.

  • A timeline of won and lost backlinks over time is helpful to identify changes in SEO performance and identify ways in which you can retrieve lost backlinks.

  • Toxic backlinks are links which show signs of low quality or an attempt to manipulate rankings. For example, spammy, paid-for linking sites, links coming from sites and content which aren't topically relevant, or even sites that aren't indexed by Google.

    Backlinks are grouped as ‘toxic’, ‘OK’, or ‘disavowed’. While toxic backlinks are a problem, they can be disavowed. However, it is preferable to first get them removed by requesting action from the referring website. Generally, you don’t want your website to be found on directories, press release distribution sites, blog networks, social bookmarking sites, and non-niche reciprocal link sites.

  • Toxic pages are pages that are poisoning your website. These could be pages that have received a penalty because of their toxic backlinks or because of harmful content such as hate speech or misinformation. Similarly, toxic pages can be pages with very low-quality, spammy content that is negatively affecting your SEO.

  • Identifying pages which have issues of the following severity: critical errors, warnings, and notices.

  • Outbound links are important for SEO as they provide the user with extra context, increase credibility, and improve user experience.

    While outbound links take users away from your site, they act like sources in an article, backing up what you’re talking about and allowing the reader to access additional information that will assist them in understanding your content.

Module 5: Cross-cultural Communication

International SEO

Proper usage of hreflang attributes ensures that pages are served to the correct international audiences and prevents canonical pages from being ignored.

  • Correct hreflang attributes are used to specify a page’s language and/or regional version. The correct tags of rel=canonical and rel=alternate must be used alongside the URLs for localised pages. An x-default tag should also be used for users who do not belong to any relevant localisation.

  • If you have an international presence with multiple versions of the same page translated into different languages, then you need to use hreflang tags. Wherever hreflang tags are used, they require a self-reference (the URL of the page itself), just like the canonical tag.

    While used for websites which provide content in different languages, this also prevents users who speak the same language in different regions from encountering issues such as incorrect currencies.

    For example, a page for a UK English audience would use the ‘en-gb’ tag, while a page for an English-speaking Australian audience would use ‘en-au’. The former would receive pricing in pounds and the latter in Australian dollars.
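As a minimal sketch of the resulting annotations, generated here in Python with an invented domain, every localised version (plus x-default) should appear on each page, including a self-reference:

```python
# Invented localised versions of a pricing page.
versions = {
    "en-gb": "https://example.com/uk/pricing",
    "en-au": "https://example.com/au/pricing",
    "x-default": "https://example.com/pricing",
}

def hreflang_tags(versions):
    """Build the <link> annotations for a page's <head>."""
    return [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in versions.items()
    ]

for tag in hreflang_tags(versions):
    print(tag)
```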

  • If an English page has hreflang pointing at its Spanish alternate, the Spanish page must also have hreflang pointing at the English page. If this return tag is missing, search engines will ignore the hreflang instruction.

  • A page linking to a non-canonical URL from its hreflang annotations sends contradictory signals to search engines. This can result in both versions being ignored, harming the ranking of your canonical page.

  • For sites using hreflang, especially those that are utilising 301 and 302 redirects, hreflang URLs should not be linked to redirected URLs.

  • Hreflang attributes must be formatted using ISO 639-1 language codes, optionally combined with ISO 3166-1 Alpha 2 region codes.

    For example, ‘en-us’ specifies English-language content for users in the United States, while ‘en-gb’ indicates content for English speakers in the United Kingdom.

Module 6: Making a Good First Impression

Meta titles, descriptions, h1 tags, alt attributes etc.

This module is all about SERPs (search engine results pages), specifically how effective use of title tags, meta descriptions, heading tags, and alt attributes can help you rank at the top of search results and drive more clicks.

  • A title tag or meta title is a page’s title as it appears in search results. Title tags have long been vital for SEO as they are the most immediate indicators of a page’s content. A well-written title tag is essential for SEO and for getting visitors to your website who will stay there.

  • While a title tag is the first thing a user usually sees, and a good one is often enough to get a user to click, a meta description gives extra context and can provide key information that will make a user visit your page. If you don’t have a compelling meta description that is optimised for your target keyword and search intent, then prepare to lose out to competitors and watch your rankings struggle.

  • Title tags are technically measured in pixels, but it’s typically recommended that they run from 50 to about 70 characters. If a title is too long, it will be truncated in the display, hiding the full message. To be on the safe side, we always advise a maximum of 60 characters.
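A quick sketch of the kind of check involved, using the 60-character safety margin recommended above; the 30-character lower bound is our illustrative assumption, not a fixed rule:

```python
def check_title(title, max_chars=60, min_chars=30):
    """Flag title tags that risk truncation or under-communication."""
    length = len(title)
    if length > max_chars:
        return f"too long ({length} chars): may be truncated in SERPs"
    if length < min_chars:
        return f"short ({length} chars): may under-communicate the page"
    return f"ok ({length} chars)"

print(check_title("Best Chocolate in the UK | Example Brand"))  # ok (40 chars)
print(check_title("Home"))                                      # flagged as short
```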

  • The title tag is considered important to help both users and search engines to quickly understand what content they can expect to find on the page. If the title uses too few characters, it may not be sufficient to effectively communicate the desired message.

  • Duplicate title tags should be avoided because they confuse the search engine about which page to rank for a specific topic, and they are also a missed opportunity to signal the diversity of your content.

  • If you don’t provide a title tag, Google will write one for you, meaning you have missed out on an opportunity to maximise your page’s potential to rank and compete on SERPs.

  • Despite claims that too many H1 tags are bad, Google has stated that it does not actually punish sites with more than one H1 tag. However, it is best practice to only use one H1 tag per page in order to establish hierarchy, clarity, and flow of content.

  • Using alt text on images makes websites more accessible for visually impaired users and earns both explicit and implicit SEO benefits. Along with image title and file-naming best practices, alt text contributes to image SEO: it makes images findable in Google’s image search, gives them the potential to be displayed in SERPs (helping you claim more positions), and gives the search engine more information and context about the site’s content.

Module 7: Developing fluency and expressing yourself

Content and Search Intent

Content is king. This module assesses the quality and strategy of your content. Today, SEO is much more than ranking for keywords, it’s about delivering value and optimising for search intent. Once you’ve optimised the technical side, content is how you realise your ranking potential.

  • E.E.A.T. stands for Experience, Expertise, Authoritativeness, and Trustworthiness. This is a framework used by Google, assessed and updated by thousands of human quality raters, to evaluate content quality. The 2018 ‘Medic Update’ marked a huge shift in how Google’s rankings have matured to favour search intent and content quality over a reliance on keywords. This is particularly important for ‘YMYL’ (Your Money, Your Life) content. The Quality Rater Guidelines do not have a direct impact on search engine rankings, but meeting them helps establish your website as a trustworthy, valuable source for the reader, which can affect how Google perceives your content.

  • Keywords help pages rank. A page can be optimised for and rank for multiple keywords. It is important to avoid keyword stuffing, thus as a rule of thumb, a given keyword shouldn’t appear more than once per 200 words. The use of synonyms and semantically related keywords helps Google understand the context of the page. It also shows that the writer is knowledgeable about the topic.
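The rule of thumb above can be checked programmatically. This sketch counts exact single-word matches only, so it undercounts phrases, synonyms, and variants:

```python
def per_200_words(text, keyword):
    """How many times a keyword appears per 200 words of copy."""
    words = text.lower().split()
    hits = sum(1 for w in words if w.strip(".,!?;:") == keyword.lower())
    return hits * 200 / max(len(words), 1)

copy = "chocolate " + "word " * 199      # 200 words, one keyword use
print(per_200_words(copy, "chocolate"))  # 1.0, within the guideline
```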

  • Readability is the practice of making your writing understandable and easy to digest for your target audience. It is a major factor in online content and can greatly improve your site’s search engine optimisation (SEO) performance. Writing with your audience in mind is critically important.

  • A ratio of anywhere between 25 and 70 percent is considered a good text-to-HTML ratio. The percentage measures visible text content relative to HTML elements and other non-visible parts of the page. Most websites with high search engine rankings have a large amount of visible text. It is OK for the text-to-HTML ratio to vary across a website; however, for important pages that are supposed to rank, we will flag this as important when the ratio is 10% or below.
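A rough sketch of how such a ratio can be computed with Python's standard-library HTML parser; real audit tools are more sophisticated about what counts as 'visible':

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)

def text_to_html_ratio(html):
    """Visible text length as a percentage of total page source length."""
    parser = TextExtractor()
    parser.feed(html)
    visible = "".join(parser.parts).strip()
    return len(visible) / max(len(html), 1) * 100

page = ("<html><head><style>p{color:red}</style></head>"
        "<body><p>Hello world</p></body></html>")
print(round(text_to_html_ratio(page), 1))
```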

  • Keyword cannibalisation can cause confusion among search engines. For example, if you have two pages in keyword competition, Google will need to decide which page is best.

    When multiple pages on a website are competing for the same keywords, this actually lowers the rankings for all pages and can negatively affect click rates.

  • Zombie pages are the pages that generate little to no traffic and affect the website's search rankings. Businesses should regularly conduct SEO audits to improve the ROI of their SEO efforts and enhance their site's search rankings.

  • Featured snippets are the holy grail of search results. These are highlighted excerpts from a web page that appear at the top of Google search results in a larger font to provide quick, to-the-point answers to a user’s query.

    Optimising for these includes excellent, well-structured, and concise content that is produced for specific search results. It also requires optimising for mobiles and incorporating structured data like schema markup to help Google understand your content’s format.

    However, AI Overviews have quickly replaced a huge number of these, making them harder to achieve.

  • Two of the most important metrics for SEO are the time users spend on your page or site and the number of backlinks referring back to your domain, and video almost always improves both of these figures. It's more likely for websites displaying video content to rank on the top of Google's search results.

  • While social signals are not a direct ranking factor for major search engines like Google, social media helps SEO by increasing brand awareness, driving traffic to your website, and generating backlinks. A strong social media presence can also lead to more brand searches and shares, which can improve your site's credibility and search performance over time. It is therefore important that your social media strategy is aligned with your SEO and wider marketing strategy.

  • Google maps out the intent for each keyword and optimises their SERPs by linking to pages that users are most likely to find useful. These keywords will fall under four main categories in relation to search intent:

    • informational

    • navigational

    • transactional

    • commercial investigation.

    This means that the content you can serve for a specific keyword is limited by its intent. When there is a clear informational intent, for example, there is little point trying to rank with a shop or product page, which is transactional; instead, optimise with FAQ-style questions, h2 and h3 tags, etc.

  • Google’s algorithms have increasingly come to favour high-quality, human-led content that satisfies the user’s search intent in as comprehensive and efficient a manner as possible.

    What this means for SEO is that thin, AI-generated, and generally low-quality content is being down-ranked. Even as AI and search engine developments become more sophisticated, the landscape is actually moving away from lazy content designed to catch a few keywords; more than ever, it takes a well thought-out, skilled content writer who knows what they’re talking about to rank well in SERPs.

  • Synonyms and LSI (latent semantic indexing) keywords are important for SEO because they trigger related topics and related phrases for your top-ranking keywords, and help provide extra content for both the user and search engine.

  • Tangential content is content that is not directly related to your service or the product you’re offering. The benefit of this strategy is to attract a wider audience who may find your content useful even if it is not the primary purpose of your content. While not directly related, tangential themes can provide additional information and may help provide context and extra value to your content. And while the user may not have initially sought out your product, maybe they’ll interact with your service or buy your product while stopping by.

    For instance, a budget supermarket might want to make content about recipes that are cheap and easy to make, or write about ways to prevent wasting food at home. By doing so, they are appealing to internet users who may be searching for this content and find themselves interacting with the brand, creating a stronger connection, and possibly leading to interacting with the supermarket’s actual products. Another such example could be a highly stylised brand that focuses on aesthetics and emotional marketing. A clothes brand may choose to create content that ostensibly is irrelevant but evokes similar feelings and connotations (but be careful with this when it comes to SEO).

  • Google doesn’t want people ‘pogo-sticking’ between the search results. Instead, it wants people to click on a result and find their answer on that page. This is why it’s important to optimise a page for the user’s intent: a page should give users the information they need and shouldn’t obstruct their journey. Obstructions could include pop-ups, long runs of text without subheadings, or a lack of bullet points.

    Provide value with high-quality, well-structured content, and don’t obstruct the user’s journey to their objective with annoying pop-ups, bad formatting, and so on.

  • Navigation is an important part of any SEO strategy, since UX design directly affects a site’s SEO rankings through user engagement metrics. By taking the time to improve UX design, sites will also see improved SEO rankings, increased brand credibility, and better audience retention. The menu structure, and the pages the main menu links to, matter as well: a well-organised menu makes the website easy to navigate and channels users to the most important pages on the site. Google follows users, so it is important to make their journey, and Google’s, as easy as possible.

    Breadcrumbs are a simple but very useful tool for improving website navigation, helping users find their way back to topics when browsing blogs, information pages, product pages, and so on.
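    Breadcrumbs can also be exposed to search engines as structured data, which lets Google show the trail in search results. A minimal sketch of `BreadcrumbList` markup in JSON-LD; the site URL and page names here are invented placeholders:

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://www.example.com/blog/" },
        { "@type": "ListItem", "position": 3, "name": "Best Chocolates in the UK" }
      ]
    }
    </script>
    ```

    The last item deliberately omits an `item` URL, as it represents the page the visitor is already on.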

  • You can have the best content on the internet for a specific topic, but if you don’t use CTAs properly, you will struggle to convert visitors to your page. Proper usage of CTAs is essential to moving your audience through whichever stage of the sales funnel they are at. They are also key to reducing bounce rates.

    It’s important to consider the number, variety, and positioning of your CTAs to ensure you’re not confusing your audience.

Module 8: Preparing for the Future

AIO/AI SEO - adapting to AI

AI is transforming the way we obtain information online, and thus is drastically changing the landscape of SEO. This module looks at your preparedness for these changes and assesses your AIO strategy.

  • Good or bad, AI is transforming search completely. By failing to prepare for AI’s changes to SEO, you run the risk of being left behind. But fear not: optimising for AI actually requires content that is better for humans, and once again points to value, content, and authority. There are various ways to optimise for AI overviews, such as using short, simple sentence structures with clear headings and introductions, and organising content for quick scanning. Your content must reflect the purpose of AI: to get the user the answers they need as soon as possible. You can also benefit from using schema markup, tailoring content to long-tail keywords, providing concise introductory summaries, and writing descriptive meta descriptions that differentiate your pages from AI summaries.
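    As one illustration of the schema markup mentioned above, a frequently-asked-questions block can be expressed in JSON-LD using the `FAQPage` type; the question and answer text below are invented placeholders:

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [{
        "@type": "Question",
        "name": "What is an SEO audit?",
        "acceptedAnswer": {
          "@type": "Answer",
          "text": "An SEO audit is a structured review of a website's technical health, content, and authority signals."
        }
      }]
    }
    </script>
    ```

    Concise question-and-answer pairs like this mirror the scannable structure that AI-driven results favour.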

  • With AI able to trawl the internet for thousands of results, creating ‘hidden gem’ content, i.e. unique and distinct insights, helps you get picked out of the crowd. High-quality main content that takes effort, originality, skill, and talent to create is also the most likely to be picked up. First-hand experience and expertise (E-E-A-T) are often favoured, which is why you will often see sources such as Reddit and YouTube cited.

  • Using language and syntax that reflects the lexicon of the user making the search will help you show up in AI tools such as ChatGPT and Perplexity. Direct, conversational phrasing that mirrors these interactions is easiest for AI to interpret and therefore pick up.

  • AI crawlers are blocked to protect a website’s content from being used to train AI models without permission. Should you choose to do this, you will also lose visibility in AI search functions, potentially leading to a loss of traffic to your site. For now, we don’t recommend blocking AI crawlers wholesale, because blocking Google’s AI completely means blocking Google’s search bot. The rapidly changing and advanced nature of this technology also means that websites will generally struggle to keep AI bots out entirely.
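    For sites that do decide to limit AI training crawlers, the usual mechanism is robots.txt. A hedged sketch using real crawler tokens (OpenAI’s GPTBot, Common Crawl’s CCBot, and Google’s Google-Extended token, which opts out of AI training use without affecting Googlebot’s normal search crawling); which bots you list is a policy choice, not a recommendation:

    ```txt
    # Block OpenAI's training crawler
    User-agent: GPTBot
    Disallow: /

    # Block Common Crawl (its dataset is widely used for AI training)
    User-agent: CCBot
    Disallow: /

    # Opt out of Google AI training without affecting Google Search
    User-agent: Google-Extended
    Disallow: /

    # Googlebot itself remains unaffected
    User-agent: Googlebot
    Allow: /
    ```

    Note that robots.txt is a polite request rather than an enforcement mechanism: compliant crawlers honour it, but it cannot technically keep a bot out.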

  • If you can, create conversations around your brand and leverage its reputation to optimise for AI. PR strategies can be effective for this by getting your site on to other relevant sites with high authority.

Bonus Module: Conversing with Locals

Local SEO

This module is for businesses with a local presence. We take a look at how you can arrive at the top of search results, reach your audience and get ahead of your competitors.

  • Identifying which keywords are ranking highest for local search results will inform your local SEO strategy. There are different tools you can use to see where your local business is ranking in any given area, down to the specific postcode.

  • To increase your local visibility, ensure your Google business profile is verified, up-to-date and has accurate information. Consider high-quality photos and videos which demonstrate your local presence.

  • Name, address, and phone number (NAP) citations should be consistent across online directories. NAP citations confirm your legitimacy to Google.

  • It’s important to garner positive reviews and reply to them. This tells both Google and users that you are a respected, reliable organisation within your local community. By engaging frequently with customers, you show Google that you are interacting with your local audience in a positive way, which can boost your local SEO visibility.