A Step-by-Step Guide to Building a Winning Local SEO Strategy

Remember the last time you found yourself craving a good meal and turned to Google to find nearby restaurants? Chances are, Google presented a Map Pack (also known as Local Pack), showcasing three options, complete with their contact details, ratings, and sometimes even their hours of operation. 

Like this:


Then, the regular organic results followed below: 

But have you ever wondered how Google decides which restaurants to feature in this Map Pack? Well, it boils down to three main factors:

  • Relevance: Google assesses how relevant the listed businesses are to the terms you’ve searched for. The closer the match between the search query and the business’s profile, the higher its chances of appearing in the Map Pack.
  • Distance: Google considers how close the restaurant is to your current location or the location you mentioned in your search. The closer, the better the chances of making it into the Map Pack.
  • Prominence: This is where online and offline presence comes into play. Online factors include the restaurant’s website, reviews, backlinks, and the completeness of its Google Business Profile. Offline prominence, on the other hand, entails the business’s reputation and popularity in the local community. 

Source: Google Help Center

Mastering these elements is where local SEO comes in. By optimising your online presence and reputation, you can boost your chances of snagging a spot in the Map Pack and even the organic search results for location-specific queries.

Ready to master local SEO? This post has got you covered with everything you need to know to build and execute a winning strategy. 

Why is Local SEO Important?

Local SEO involves techniques and tactics designed to help businesses rank higher on search engine result pages (SERPs) when users search for products or services within a specific geographical area. Below are three reasons why it is essential. 

1. Targeted Traffic

Nearly half of all Google searches reportedly have local intent, meaning users are actively seeking nearby businesses or services. Additionally, the Map Pack, which prominently displays local businesses, appears in as many as 93% of these searches. Local SEO offers a direct pathway to capitalise on these search trends, driving targeted traffic and guiding potential customers straight to your doorstep. 

2. Increased Visibility

Local SEO helps businesses appear in local search results when users search for products or services within a specific geographic area. For example, when someone searches for “coffee shops near me,” local SEO helps coffee shops optimise their online presence to show up in those search results.

3. Revenue Generation 

Local SEO isn’t just about visibility; it enables conversions and purchases, too. According to Google’s 2022 Retail Marketing Guide, there has been a notable surge in searches such as “shops near me” and “where to buy,” indicating heightened consumer interest in local businesses. These trends reveal the tangible revenue opportunities businesses can tap into with a well-crafted local SEO strategy. For instance, businesses that optimise their local SEO effectively can experience a surge in foot traffic to physical stores and an uptick in online purchases. 

But before you jump headfirst into local SEO, it’s essential to pause and ask yourself: Is this the right move for my business?

Is Local SEO right for your business?

Just because a marketing tactic works wonders for some doesn’t necessarily mean it’s the perfect fit for you. So, before you roll up your sleeves and start crafting a Local SEO strategy, consider these key questions:

  1. What is Your Business Type? 

Consider whether your business has a physical presence or relies heavily on local customers. Local SEO could be a game-changer if you’re running a brick-and-mortar store, a service-based business, a healthcare facility, or a restaurant. But if your business doesn’t fit into these categories, Local SEO might not be your best bet.

  2. Who is Your Target Market?

Take a deep dive into your target audience’s behaviours, preferences, and how they conduct location-specific searches. If you notice a trend of active local searches and a preference for proximity and convenience, then Local SEO should be on your radar.

  3. Where are Your Customers Located?

Analyse where your customers are based, assessing whether they are primarily local or dispersed across different regions of the country or the world. Doing this helps you target them effectively when executing Local SEO. 

  4. Are Your Competitors Leveraging Local SEO?

Keep an eye on what your competitors are up to. Are they successfully attracting local customers through Local SEO strategies? Analysing their approach can reveal insights and opportunities for improving your own strategy.

  5. What are Your Current Local Search Rankings? 

Assess your online presence and local search rankings for relevant keywords and phrases. This assessment will serve as your starting point and guide your Local SEO efforts moving forward.

  6. Do you have a Google Business Profile listing? 

Before you kickstart your Local SEO endeavours, ensure your business has claimed and optimised its Google Business Profile. An optimised profile streamlines your presence in local search results and provides valuable information to potential customers.

  7. Are You Ready to Invest in Local SEO Efforts?

Are you prepared to invest time, effort, and resources into Local SEO? It’s not a quick fix, but the payoff can be massive. So, no worries if now isn’t a suitable time due to resource and time constraints—you can always revisit it later.

Answering these questions thoughtfully will help you determine if local SEO aligns with your business goals and if it’s the right path for you to pursue. 

Ready to master Local SEO? Let’s dive in!

10 Strategies for Executing a Winning Local SEO Strategy

Once you’ve determined that local SEO is the right fit for your business, it’s time to get to work. 

Here are ten strategies to help you execute a winning local SEO strategy:

1. Audit Your Website 

Before you start digging into keyword research and optimisation, it’s crucial to ensure that your website is in tip-top shape. 

Use web audit tools to conduct a thorough examination, identifying any technical or on-page SEO issues that could hinder your local SEO efforts. Look for common culprits like slow site speed, broken links, poor mobile optimisation, and missing XML sitemaps.

Read this content audit guide to learn more about auditing content for SEO.

2. Conduct Keyword Research

Once your website is primed for action, define your target keywords for local ranking. Focus on geo-specific keywords that align with your business’s location and target audience’s intent. For instance, if you operate a bakery in London, prioritise keywords like “London bakery” or “bakery near Covent Garden” to maximise local visibility.

Tools like InLinks can streamline your keyword research process, providing valuable insights into local search trends and user behaviour.

3. Perform Competitor Research

Take a deep dive into your local competitors’ SEO strategies, analysing their strengths, weaknesses, and opportunities. Use this information to refine your own approach and gain a competitive edge in the local market.

4. Optimize Your Website

With your target keywords in hand, it’s time to put them to work by optimising your website. 

Here’s how:

  • Keyword Mapping: Assign each geo-specific keyword to relevant website pages, ensuring a seamless fit.
  • Page Titles and Meta Descriptions: Incorporate target keywords into compelling titles and descriptions to boost click-through rates.
  • Rich, Keyword-Optimized Content: Create engaging content that speaks to your audience’s local needs and pain points.
  • Internal Linking: Expand your internal linking structure to improve navigation and user experience.
  • Image Optimization: Enhance visuals with optimised alt text and compressed file sizes for faster load times.

This is simple to implement with InLinks. All you have to do is sign up for a free account on Inlinks.com and then select “Content Briefs” > “Create a Brief”.

5. Create Localized Pages 

Develop dedicated landing pages highlighting your local services to attract targeted traffic and drive conversions. For example, a law firm in Manchester might create separate pages for services like “Accident and Injury claims in Manchester” and “Civil dispute resolution in Manchester.” 

An excellent example of a localised page

Once you’ve created localised pages, amplify their reach through paid advertising channels, ensuring maximum exposure in your target market. This post by Search Engine Land provides a valuable guide on how to create well-optimized localised landing pages. 

6. Monitor and Manage Local Listings Across Platforms 

NAP citations are mentions of your business name, address, and phone number across the web, popping up on business directories and social media profiles. 

The more consistent and accurate your NAP citations are across different platforms, the more confident Google becomes in your business’s trustworthiness. That confidence often translates into higher rankings and increased visibility.

So, it’s important to maintain consistency in your local listings across platforms. Pay close attention to details like your business’s address, name, and phone number – even the slightest inconsistency could throw a wrench in your local SEO efforts.

This visual by Smart Insights clearly distinguishes the difference between a good and bad NAP citation.
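The consistency check described above can be sketched in a few lines: normalise each listing’s name, address, and phone number before comparing, so cosmetic differences (capitalisation, “St.” vs “Street”, phone punctuation) don’t register as mismatches. The normalisation rules and sample listings below are illustrative assumptions, not a complete solution.

```python
import re

def normalise_nap(name, address, phone):
    """Reduce a NAP citation to a canonical form for comparison."""
    name = name.lower().strip()
    # Expand one common abbreviation; a real normaliser handles many more
    address = re.sub(r"\bst\b\.?", "street", address.lower()).strip()
    phone = re.sub(r"\D", "", phone)  # keep digits only
    return (name, address, phone)

# Hypothetical listings pulled from two different directories
listing_a = normalise_nap("Joe's Bakery", "12 High St.", "020 7946 0018")
listing_b = normalise_nap("joe's bakery", "12 High Street", "(020) 7946-0018")
consistent = listing_a == listing_b  # cosmetic differences no longer count
```

Running a check like this across every directory that lists your business surfaces the inconsistencies worth fixing first.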

7. Refine Your Google Business Profile Listing 

Your Google Business Profile showcases key information about your business, from its location to your services and products, along with captivating photos.

Example of a Google Business Profile

Optimising this profile is non-negotiable if you want to maximise your presence in local searches. This means filling out all the relevant fields, including business hours, categories, and high-quality photos that truly represent your business.

But it doesn’t stop there. Actively engage in Q&A sessions to address potential customers’ queries—it’s a great way to demonstrate your reliability and commitment to customer satisfaction.

A good way to answer questions on Google Business Profile

If you haven’t already claimed your Google Business Profile, simply create an account, claim your business, and start adding crucial details like your address, phone number, website URL, images, and business hours. 

8. Encourage Positive Online Reviews 

Online reviews influence potential customers and play a crucial role in improving your visibility and rankings on search engines.

That’s why it’s crucial to actively encourage and manage online reviews across all your directory platforms. However, it’s not just about racking up positive reviews. You must also address any negative feedback promptly and professionally, 

For expert tips on navigating positive and negative reviews effectively, check out this handy guide by Google for Small Business.

9. Implement Structured Data and Schema Markup 

Structured data and schema markup provide search engines with invaluable information about your business, from reviews and business hours to contact details. This extra layer of data helps search engines understand your content better, improving your visibility in search results.

Interestingly, implementing structured data and schema markup is easier than you might think. With tools like InLinks, you can automatically add schema to your website and give your website an instant SEO boost without knowing a line of code. 

To get started, check out this in-depth guide distilling how the InLinks schema markup tool works.
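For those curious about what such markup actually looks like, here is a minimal sketch of LocalBusiness structured data built as a Python dict and serialised to JSON-LD. The business details are hypothetical placeholders, not a complete or required set of properties.

```python
import json

# Hypothetical business details; replace with your own NAP data
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Bakery",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "London",
        "postalCode": "EC1A 1AA",
        "addressCountry": "GB",
    },
    "telephone": "+442079460018",
    "openingHours": "Mo-Sa 08:00-18:00",
    "url": "https://www.example.com",
}

# Embed the output in a <script type="application/ld+json"> tag on the page
jsonld = json.dumps(local_business, indent=2)
```

Keeping the details here in sync with your Google Business Profile and directory listings reinforces the NAP consistency discussed earlier.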

10. Invest in Local Link Building 

Once your local SEO efforts gain momentum, you can kick things up a notch with strategic link-building.

However, link-building for local SEO isn’t quite the same as the traditional approach. Instead, it entails forging relationships with local businesses and organisations to attract links within your target location.

So, how do you go about it? Here are some specific steps you can take:

  • Collaborate on local events and sponsorships
  • Write guest posts on local websites
  • Host local workshops
  • Create localised content
  • Regularly monitor and analyse your backlink profile

Dominate Local Search 

Local SEO isn’t a quick fix. It takes consistent effort and patience to clinch that top spot ahead of your local rivals.

But by putting the strategies highlighted in this post into action and sticking to what works, you’ll gradually rise to the top of the local search rankings. It’s all about staying the course, adapting as needed, and watching your dominance in local search grow over time. 

(Thank you to Juliet John for her assistance in drafting this content).


6 Ways to Optimise Existing Content for Featured Snippet Success

During my recent talk about featured snippets at Search London, an attendee asked: “Is it worth optimising existing content for featured snippets?” The short answer is yes. I have done this for clients, and I have seen our number of featured snippets for existing content double.

So, if you have an SEO client or project with a library of existing content that generally works well but has yet to achieve featured snippets, here is a tried and tested set of on-page optimisations that you can carry out to improve featured snippet visibility.

1. Add and optimise images

It is not uncommon for featured snippets to include a single image (or a collection of carousel images) alongside the main text. When images are shown as part of a featured snippet, they may be sourced from the same URL as the text or, as is often the case with carousels, will include images from other relevant search results.

In this result for the query “where was fast and furious 9 filmed”, you can see that there are four different images showing above the paragraph snippet, but only one of the images is from the same URL as the featured snippet paragraph. 

This means that web pages with images have more opportunities to attract clicks and show more prominently in featured snippets. 

To optimise for this, review existing blogs and written content to identify opportunities to:

  • Add relevant images to content with potential to rank for featured snippets
  • Update existing images to include alt text that is relevant to the featured snippet topic
  • Add image structured data to the post to make it easier for crawlers to identify the most important images on the page
  • Update the image file name to include relevant terms

2. Add infographics and charts

There are some verticals where it’s fairly straightforward to add an image into an article or blog. Content around travel, fashion, food, and the arts will have no shortage of images to include with copy. 

But, for professional services like law, consulting, and even recruitment, finding relevant images can be a challenge.

For these verticals, I recommend reviewing content to find opportunities to add the following visualisations:

  • Charts for statistics
  • Diagrams for processes
  • Infographics for explainer posts

I like to use these tactics for lower competition queries, because it’s more likely that both my content and my images will appear as part of a Featured Snippet paragraph and image carousel. 

Tools like Canva and even PowerPoint can make it simple to create these types of visuals while keeping the impact on your time and budget relatively low. And, the visuals will add more user value to your content overall. 

3. Include relevant headers

Content with clear formatting tends to earn more featured snippets. So, if you have valuable content that does not have logical and relevant H2s, H3s, or other header tags, adding these should help your content perform better for featured snippets and for users.

4. (Re)format content to include numbered or ordered lists

Lists are one of the most commonly seen types of Featured Snippets. Content that includes ordered lists (<ol>) with numbers or unordered lists with bullet points (<ul>) will perform better for this featured snippet type.

To optimise existing content for featured snippet lists, read through your content to see where you have lists that can be reformatted from paragraphs to bullets or numbers. 

For instance, if you have content that is originally written out as a list in paragraph form, like this: 

The Corvid family of birds includes crows, jays, choughs and magpies.

You will better optimise your page for featured snippets by listing the same information as a bulleted list, like this:

The Corvid family of birds includes:

  • Crows
  • Jays
  • Choughs
  • Magpies

And, since this is an on-page optimization that updates the format rather than the content, you can carry this out quickly with minimal delays from any client content approval process.

5. (Re)format content to include tables

Tables are another of the most common featured snippet types, with studies suggesting that 29% of all FS are tables.

These snippets are extracted from HTML tables and displayed in the SERP to satisfy the query, sometimes by focusing the visible table on the most relevant sections or tabs. 

For existing content, look for blogs and pages which include banks of data like prices, statistics, dates, or entity comparisons. 

As an example, let’s say you are creating content around legendary girl group Destiny’s Child. You could write a paragraph explaining how many solo albums each member released after the group disbanded and/or you could create a table that shows the information in an easy-to-read format that is optimised for featured snippets.

Solo Studio Albums Released by Destiny’s Child Members

  Member              Solo Studio Albums
  Beyoncé Knowles     6
  Kelly Rowland       4
  Michelle Williams   4

Example of table formatting

In addition to being great for featured snippets, content written this way is also genuinely helpful to users and can make the information more accessible.

6. Add supporting schema markup 

Though 66% of URLs with featured snippets include some schema markup, to be clear: you do not need to add schema markup to a post in order to be eligible for a featured snippet. 

That said, it is the case that Google sometimes replaces featured snippets with schema-dependent rich results. The best example of this is featured snippets for recipes, which were extremely common in the early days of featured snippets and are now almost exclusively shown on the SERP as rich results.

This means that content that is optimised for featured snippets and rich results is more resilient to these changes. Thus, taking a layered approach to your featured snippet strategy can pay dividends.

Which schema markup pairs best with FS content optimization?

Schema markup that provides additional information for multimedia and supports transparent E-A-T can help your featured snippet content maintain traffic as the SERP changes. It can also help new content gain the kind of authority that Google expects to see from pages it might highlight as a featured snippet. Relevant schema types and properties include:

  1. Author property 
  2. Review type
  3. Image property
  4. VideoObject type
  5. FAQPage type

Get more ROI by optimising existing content for featured snippets

Taking each of these elements into consideration, you should be able to review and update your existing content to add more value and improve your featured snippet performance.

Featured Snippets Content Across the Wider SERP in 2022

In 2022, Featured Snippets are part of the journey, not the final destination. The continued integration of topic filtering elements across the SERP means that content from ‘position 0’ is showing in many more parts of the search results. 

Where is Featured Snippet Content Displayed in 2022?

Content that has been extracted by Google for Featured Snippets can now be seen across the SERP as component parts of a multilayered topic landscape. This means that rather than being a single element of the SERP, today FS content excerpts can be seen in:

  • The classic Featured Snippet 
  • People Also Ask
  • Voice Assistant Results
  • Knowledge Panels
  • Topic Accordions
  • Topic Verticals
  • Featured Snippet Dropdowns

So content that ranks for Featured Snippets has become more widely distributed across the SERP. In this environment, sites using a topic-led approach to content will have multiple opportunities to gain visibility via content from Featured Snippet paragraphs.


The Secrets of Log Monitoring for the Curious SEO

When log monitoring and SEO come up, we hear a lot about the value of crawl budget optimization, monitoring Googlebot behavior, and tracking organic visitors. But here are a few of the gritty secrets that don’t get shared as often. 

What is log monitoring?

In SEO, server records of requests for URLs can be used to learn more about how bots and people use a website. Web servers record every request, its timestamp, the URL that was requested, who requested it, and the response that the server provided.

Requests are logged in various formats, but most look something like this:

Bot visit, identified by the Googlebot user-agent and IP address:

www.oncrawl.com:80 - - [07/Feb/2018:17:06:04 +0000] "GET /blog/ HTTP/1.1" 200 14486 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "-"


Organic visit, identified by the Google address as the referer:

www.oncrawl.com:80 - - [07/Feb/2018:17:06:04 +0000] "GET /blog/ HTTP/1.1" 200 37073 "https://www.google.es/" "Mozilla/5.0 (Linux; Android 7.0; SM-G920F Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.137 Mobile Safari/537.36" "-"
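Log lines like the two above can be parsed programmatically. The sketch below uses a regular expression matching the combined-style format shown; real log formats vary by server configuration, so treat the pattern as an assumption to adapt.

```python
import re

# Pattern for an Apache-style combined log line (host, timestamp, request,
# status, bytes, referer, user-agent). Adjust to your server's log format.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_log_line(line):
    """Return a dict of fields, or None if the line does not match."""
    match = LOG_PATTERN.search(line)
    return match.groupdict() if match else None

def is_googlebot(entry):
    """Rough user-agent check; always confirm with a reverse DNS lookup."""
    return "Googlebot" in entry["user_agent"]

line = ('www.oncrawl.com:80 - - [07/Feb/2018:17:06:04 +0000] '
        '"GET /blog/ HTTP/1.1" 200 14486 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; '
        '+http://www.google.com/bot.html)" "-"')
entry = parse_log_line(line)
```

Once lines are parsed into fields, the questions in the next section become simple counting and filtering operations.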


Why logs are the key to SEO success

Log data provides concrete and un-aggregated figures to answer questions that are at the heart of SEO:

  • How many of my pages have been explored by Google? By users coming from the SERPs?
  • How often does Google (or organic visitors) visit my pages?
  • Does Google (or organic visitors) visit some pages more than others?
  • What URLs produce errors on my site?


Following patterns in Googlebot visits can also provide information about your site’s ranking and indexing. Drops in crawl frequency often precede ranking drops by two to five days; a peak in crawl activity followed by an increase in mobile bot activity and a decline in desktop bot activity is a good sign you’ve moved to Mobile-First Indexing, even if you haven’t received the official email.



Example of a website that has switched to the mobile-first index


And if you’re having a hard time ranking pages that have never had a single “hit” from a googlebot, you know it’s not worth investing in on-page SEO yet: Google doesn’t know what’s on the page.


Log data is also essential when analyzing crawl budget on groups of pages, when examining the relationship between crawl and organic traffic, or when understanding the relationship between on-page factors and behavior by bots and visitors.


Because this data comes directly from the server, the gatekeeper of access to a website, it is complete, definitive, always up-to-date, and 100% reliable.

Log lines come in all shapes and sizes

Not all server logs are presented in the same format. Their fields might be in a different order, and they might not contain all of the same information.

Some information that is often not included in log files by default includes:

  • The host. If you’re using your log files for SEO, this is extremely useful. It helps differentiate between requests for HTTP and HTTPS pages, and between subdomains of the same domain.
  • The port. The port used to transfer data can provide additional information on the protocol used.
  • The transfer time. If you don’t have other means of determining page speed, the time required to transfer all of the page content to the requester can be very useful.
  • The number of bytes transferred. The number of bytes helps you spot unusually large or small pages, as well as unwieldy media resources.


Identifying bots is not always easy: the good, the bad, and the missing

Bad bots

Sometimes it’s hard to tell what’s a bot and what’s not. To identify bots, it’s best to start by looking at the user-agent, which contains the name of the visitor, such as “google” or “googlebot”.

But because Google uses legitimate bots to crawl websites, scammers and scrapers (who steal content from your website) often name their bots after Google’s in hopes that they won’t be caught.

Google recommends using a reverse DNS lookup to check the IP addresses of its bots, and publishes the ranges of IP addresses its bots use. Bots whose user-agent and IP address do not both check out should be discounted and, most likely, blocked.

This isn’t a rare case. Bad bots can account for over a quarter of the total website traffic on a typical site, according to this study from 2016.

Good bots

On the other hand, bot monitoring also produces some surprises: at OnCrawl, SEO experts often uncover new googlebots before they’re announced. Based on the type of pages they request, we’re sometimes able to guess their role at Google. Among the bots we identified before Google announced them are:

  • [Ajax]: JavaScript crawler
  • Google-AMPHTML: AMP exploration
  • Google-speakr: crawls pages for Google’s page-reading service. It gained a lot of attention in early February 2019 as industry leaders tweeted about having noticed it.


Knowing what type of bot your site attracts gives you the keys to understanding how Google sees and treats your pages.

Missing bots

We’ve also discovered that, although Googlebot-News is still listed in the official list of bots, this bot is not used for crawling news articles. In fact, we’ve never spotted it in the wild.

Server errors disguised as valid pages

Sometimes server errors produce blank pages, but the server doesn’t realize this is an error. It reports the page as status 200 (“everything’s ok!”) and no one’s the wiser, but neither bots nor visitors seeing blank pages get to see the URL’s actual content.

Monitoring the number of bytes transferred in server logs per URL will reveal this sort of error, if one occurs.
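As a sketch, flagging such pages from parsed log entries comes down to filtering on status and transfer size. The byte threshold and sample entries below are illustrative assumptions; tune the threshold to your site’s typical page weight.

```python
def suspicious_blank_pages(entries, min_bytes=1024):
    """URLs returning 200 but transferring suspiciously few bytes."""
    return [e["url"] for e in entries
            if e["status"] == 200 and e["bytes"] < min_bytes]

# Hypothetical parsed log entries (status and bytes already cast to int)
entries = [
    {"url": "/blog/", "status": 200, "bytes": 14486},
    {"url": "/pricing/", "status": 200, "bytes": 312},   # likely blank
    {"url": "/old-page/", "status": 404, "bytes": 209},  # a real 404, not this issue
]
flagged = suspicious_blank_pages(entries)
```

Any URL this flags consistently is worth fetching by hand to see whether the server is serving an empty page under a 200.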


Hiding spots for orphan pages

Orphan pages, or pages that are not linked to from any other pages in your site’s structure, can be a major SEO issue. They underperform because links confer popularity and because they’re difficult for browsing users (and bots) to discover naturally on a website.


Any list of known pages, when examined with crawl data, can be useful for finding orphan pages. But few lists of pages are as complete as the URLs extracted from log data: logs contain every page that Google crawls or has tried to crawl, as well as every page visitors have visited or tried to visit.
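In practice, orphan detection is a set difference: URLs seen in the logs (crawled by Google or visited by users) minus URLs reachable through the site’s own link structure. The URL sets below are hypothetical.

```python
def find_orphans(urls_in_logs, urls_in_site_structure):
    """URLs with log traffic that the site's link graph never reaches."""
    return sorted(set(urls_in_logs) - set(urls_in_site_structure))

# Hypothetical data: logs on the left, a crawl of the site on the right
log_urls = {"/", "/blog/", "/promo-2017/", "/blog/post-1/"}
site_urls = {"/", "/blog/", "/blog/post-1/"}
orphans = find_orphans(log_urls, site_urls)
```

Each orphan found this way is a page Google knows about but your internal linking has abandoned: link to it, redirect it, or retire it.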

Sharing with SEA


SEA (paid search engine advertising) also profits from log monitoring. Google Ads verifies URLs associated with paid results using the following bots:

  • AdsBot-Google
  • AdsBot-Google-Mobile

The presence of these bots on your site can correspond with increases in spending and new campaigns.

What’s really behind Google’s crawl stats

When we talk about crawl budget, there are two sources for establishing your crawl budget:

  1. Google Search Console: in the old Google Search Console, a graph of pages crawled per day and an average daily crawl rate are provided under Crawl > Crawl Stats.
  2. Log data: the count of googlebot hits over a period of time, divided by the number of days in the period, gives another daily crawl rate.

Often, these two rates, which purportedly measure the same thing, are different values.

Here’s why:

  • SEOs often only count hits from SEO-related bots (“googlebot” in its desktop and mobile versions)
  • Google Search Console seems to provide a total for all of Google’s bots, whether or not their role is associated with SEO. This is the case for bots like AdSense (“Mediapartners-Google”), which crawls monetized pages on which Google places ads.
  • Google doesn’t list all of its bots, nor all of the bots included in its crawl budget graph.

This poses two main problems:

  1. The inclusion of non-SEO bots can disguise SEO crawl trends that are subsumed in activity by other bots. Drops in activity and unexpected peaks may look alarming, but have nothing to do with SEO; conversely, important SEO indicators may go unnoticed.
  2. As features and reports are phased out of the old Google Search Console, it can be nerve-wracking to rely on Google Search Console for such essential information. It’s difficult to say whether this report will remain available in the long term.

Basing crawl analysis on log data is a good way around these uncertainties.

Log data and the GDPR

Under the European Union’s GDPR, the collection, storage, and processing of personal data are subject to additional safeguards and protocols. Log data may fall in a gray zone under this legislation: in many cases, the European Commission considers the IP addresses of people (not bots) to be personal information.

Some log analysis solutions, including OnCrawl, offer solutions for this issue. For example, OnCrawl has developed tools that strip IP addresses from log lines that do not belong to bots in order to avoid storing and processing this information unnecessarily.
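A simplified sketch of that approach: blank out the client IP on any log line whose user-agent does not identify a known bot. The bot list, replacement value, and sample lines below are assumptions for illustration; production tooling would verify bots by IP as well, not just by user-agent string.

```python
import re

IPV4_RE = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b")
KNOWN_BOTS = ("Googlebot", "bingbot", "AdsBot-Google")  # simplified list

def anonymise(line):
    """Replace the IP on human-visitor lines; keep bot lines untouched."""
    if any(bot in line for bot in KNOWN_BOTS):
        return line  # bot traffic: the IP is not personal data
    return IPV4_RE.sub("0.0.0.0", line)

human = ('203.0.113.7 - - [07/Feb/2018:17:06:04 +0000] '
         '"GET /blog/ HTTP/1.1" 200 37073 "-" "Mozilla/5.0"')
bot = ('66.249.66.1 - - [07/Feb/2018:17:06:04 +0000] '
       '"GET /blog/ HTTP/1.1" 200 14486 "-" "Googlebot/2.1"')
```

Stripping IPs at ingestion time, before storage, keeps the analysis useful for SEO while avoiding unnecessary retention of personal data.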


TL;DR Log data isn’t just about crawl budget

There are plenty of secrets you don’t often hear mentioned in discussions about log files.

Here are the top ten takeaways:

  1. Log data is the only 100% reliable source for all site traffic information.
  2. Make sure your server logs the data you’re interested in–not all data is required in logs.
  3. Verify that bots that look like Google really are Google.
  4. Monitoring the different Google bots that visit your site allows you to discover how Google crawls.
  5. Not all official Googlebots are active bots.
  6. In addition to 4xx and 5xx HTTP status codes, keep an eye out for errors that serve empty pages in 200.
  7. Use log data to uncover orphan pages.
  8. Use log data to track SEA campaign effects.
  9. Crawl budget and crawl rate are best monitored using log data.
  10. Be aware of privacy concerns under the GDPR.


Rebecca works at OnCrawl, who were the headline sponsors at Search London’s 8th birthday party. They are still offering an exclusive 30-day free trial; visit OnCrawl at www.oncrawl.com to find out more.

Investing in SEO Crawl Budget to Increase the Value of SEO Actions

Discussions about crawl budget often either spark debates or sound too technical. But making sure your crawl budget fits your website’s needs is one of the most important boosts you can give your SEO.

Why invest in a healthier crawl budget?

SEO functions on one basic principle: if you can provide a web page that best fulfills Google’s criteria for answers for a given query, your page will appear before others in the results and be visited more often by searchers. More visits mean more brand awareness and more marketing leads for sales and pre-sales to process.

This principle assumes that Google is able to find and examine your page in order to evaluate it as a potential match for search queries. This happens when Google crawls and indexes your page. A perfectly optimized page that is never crawled by Google will never be presented in the search results.


The search engine process for finding pages and displaying them in search results.


In short: Google’s page crawls are a requirement for SEO to work.


A healthy crawl budget ensures that the important pages on your site are crawled in a timely fashion. An investment in crawl budget, therefore, is an essential investment in an SEO strategy.

What is crawl budget?

“Crawl budget” refers to the number of pages on a website that a search engine discovers or explores within a given time period.


Crawl budget is SEO’s best attempt to measure abstract and complex concepts:

  • How much attention does a search engine give your website?
  • What is your website’s ability to get pages indexed?



Graphical representation of daily googlebot hits on a website.

How much budget do I have?

The term “budget” is controversial, as it suggests that search engines like Google set a number for each site, and that you as an SEO ought to be able to petition for more budget for your site. This isn’t the case.

From Google’s point of view, crawling is expensive, and the number of pages that can be crawled in a day is limited. Google attempts to crawl as many pages as possible on the web, taking into account popularity, update frequency, information about new pages, and the web server’s ability to handle crawl traffic, among other criteria.

Since we have little direct influence on the amount of budget we get, the game becomes one of how to direct Google’s bots to the right pages at the right time.


No, really. How much crawl budget do I have?


The best way to determine how many times Google crawls your website’s URLs per day is to monitor googlebot hits in your server logs. Most SEOs take into account all hits by Google bots related to SEO and exclude bots like AdsBot-Google (which verifies the quality and pertinence of a page used in a paid campaign).



Visits by Google’s AdsBot that should be removed from an SEO crawl budget.

Because spammers often spoof Google bots to gain access to a site, make sure you validate the IPs of bots that present themselves as googlebots. If you use the log analyzer available in OnCrawl, it does this step for you.
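If you want to perform that validation yourself, Google’s documented method is a reverse DNS lookup followed by a forward confirmation. A minimal sketch in Python, treating it as an illustration rather than a complete implementation:

```python
import socket

def hostname_is_google(host):
    """Pure check: does a reverse-DNS hostname belong to Google's crawlers?"""
    return host.endswith((".googlebot.com", ".google.com"))

def is_real_googlebot(ip):
    """Validate a claimed googlebot IP as Google recommends:
    1. reverse DNS: resolve the IP to a hostname,
    2. check the hostname is on a Google crawler domain,
    3. forward DNS: resolve that hostname and confirm it maps back to the IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]             # reverse lookup
        if not hostname_is_google(host):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward confirmation
    except (socket.herror, socket.gaierror):
        return False
```

The user-agent string alone proves nothing, since anyone can send `Googlebot` as a user agent; only the DNS round trip confirms the bot’s identity.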


Take the sum of the hits over a period of time and divide it by the number of days in that period. The result is your daily crawl budget.
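As a sketch of that calculation, with made-up daily hit counts standing in for your own log data:

```python
from collections import Counter

# Hypothetical counts of verified googlebot hits per day, extracted from logs.
hits_per_day = Counter({
    "2019-01-28": 1480,
    "2019-01-29": 1520,
    "2019-01-30": 1500,
})

# Daily crawl budget = total hits over the period / number of days.
daily_crawl_budget = sum(hits_per_day.values()) / len(hits_per_day)
print(daily_crawl_budget)  # 1500.0
```

Use a period long enough to smooth out day-to-day spikes; a few weeks is usually more representative than a few days.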


If you can’t obtain access to your server logs, you can currently still use the old Google Search Console to get an estimate. Its crawl stats report provides a single “daily average” figure for pages crawled per day. Treat this as your crawl budget, keeping in mind that it is inflated because it includes all Google bots, not just the SEO-related ones.


Managing crawl budget by prioritizing quality URLs

Since you can’t control the amount of budget you get, making sure your budget is spent on valuable URLs is very important. And if you’re going to spend your crawl budget on optimal URLs, the first step is to know which URLs are worth the most on your site.

As obvious as it sounds, you will want to use your budget on the pages that can earn the most visits, conversions and revenue. Don’t forget that this list of pages may evolve over time or with seasonality. Adapt these pages to make them more accessible and attractive to bots.

Bots are most likely to visit pages with a number of qualities:

  • General site health: pages on a website that is functional, able to support crawl requests without going down, reasonably rapid, and reliable; it is not spam and has not been hacked
  • Crawlability: pages receive internal links, respond when requested, and aren’t forbidden to bots
  • Site architecture: pages are linked to from topic-level pages and thematic content pages link to one another, using pertinent anchor text
  • Web authority: pages are referenced (linked to) from qualitative outside sources
  • Freshness: pages are added or updated when necessary, and pages are linked to from new pages or pages with fresh content
  • Sitemaps: pages are found in XML sitemaps submitted to the search engine with an appropriate <lastmod> date
  • Quality content: content is readable and responds to search query intent
  • Ranking performance: pages rank well but are not in the first position


If this list looks a lot like your general SEO strategy, there’s a reason for that: quality URLs for bots and quality URLs for users have nearly identical requirements, with an extra focus on crawlability for bots.

Stretching your crawl budget to cover essentials

You can stretch your crawl budget to cover more pages, just like you can stretch a financial budget.

Cut unnecessary spending

A first level of unnecessary spending concerns any googlebot hits on pages you don’t want to show up in search results. Duplicate content, pages that should be redirected, and pages that have been removed all fall into this category. You may also want to include, for example, confirmation pages when a form is successfully sent, or pages in your sign-up tunnel, as well as test pages, archived pages and low-quality pages.

If you have prioritized your pages, you can also include pages with no or very low priority in this group.


Viewing the number of googlebot hits per day for different page categories.


To avoid spending crawl budget on these pages, keep bots away from them. You can use redirections as well as directives aimed at bots to herd bots in a different direction.
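One such directive is a Disallow rule in robots.txt, which keeps compliant bots off pages you never want crawled. For example (the paths below are hypothetical; use the sections identified on your own site):

```
User-agent: *
Disallow: /order-confirmation/
Disallow: /signup/
Disallow: /test/
```

Note that Disallow prevents crawling, which is what saves budget; it does not by itself remove already-indexed pages from search results.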

Limit budget drains

Sometimes unexpected configurations can become a drain on crawl budget.


Google will spend twice as much budget when two similar pages point to different canonical URLs. In particular, if your site uses facets or query string parameters, going over your canonical strategy can help you save on crawl budget. Tools like the canonical evaluations in OnCrawl can help make this task easier.


Tracking hits by googlebots on pages with similar content that do not declare a single canonical URL.
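A consistent canonical strategy comes down to every variant of a page declaring the same canonical URL in its head. A sketch, with a hypothetical faceted URL:

```html
<!-- On https://www.example.com/shoes/?color=red&sort=price -->
<!-- Hypothetical example: every filtered variant points to one canonical page -->
<link rel="canonical" href="https://www.example.com/shoes/" />
```

When all variants agree, Google can consolidate its crawling on the canonical URL instead of repeatedly exploring each parameter combination.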


Using 302 redirects, which tell search engines that the content has been temporarily moved to a new URL, can also spend more budget than expected. Google will often return frequently to re-crawl pages with a 302 status in order to find out whether the redirect is still in place, or whether the temporary period is over.
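If a move is actually permanent, a 301 redirect tells Google it no longer needs to keep re-checking the old URL. A hypothetical sketch in nginx configuration (paths are illustrative):

```
# Use a permanent (301) redirect once a move is final, so Google
# stops returning to re-check whether a temporary 302 has expired.
location = /old-page {
    return 301 /new-page;
}
```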


Reduce investments with few returns

User traffic data, either from analytics sources such as Google Analytics or from server log data, can help pinpoint areas where you’ve been investing crawl budget with little return for your efforts in user traffic.

Examples of pages you might be over-investing in include URLs that rank in the first few pages of the SERPs but have never had organic traffic, newly crawled pages that take much longer than average to receive their first SEO visit, and frequently crawled pages that do not rank.



Crawl rate for pages that receive no organic visits in strategic page groups: before and after implementing improvements.

Returns on crawl budget investments

When you improve how crawl budget is spent on your site, you can see valuable returns:

  • Reduced crawling of pages you don’t want to rank
  • Increased crawling of pages that are being crawled for the first time
  • Reduced time between publishing and ranking a page
  • Improved crawl frequency for certain groups of pages
  • More effective impact of SEO optimizations
  • Improved rankings

Some of these are direct effects of your crawl budget management, such as reduced crawling of pages you’ve told Google not to crawl. Others are indirect: for example, as your SEO work has more of an effect, your site’s authority and popularity increase, increasing your rankings.

In both cases, though, a healthy crawl budget is at the core of an effective SEO strategy.


OnCrawl is a technical SEO platform that uses real data to help you make better SEO decisions. Interested in monitoring or improving your crawl budget, as well as other technical SEO elements, and in using a powerful platform with friendly support provided by experienced SEO experts? Ask them about their free trial at the Search LDN event on Monday, February 4th, where they are headline sponsors. If you cannot make it for whatever reason, visit OnCrawl at www.oncrawl.com.


Media Sponsors with Search London

We are really pleased to be a media partner with Search Elite, and they have two tickets to give away, worth £600, for their Search Elite and Conversion Elite event on June 6th.

Search Elite is for the technically minded who need to get their geek on. It is bringing you lots of actionable takeaways for an enormous ROI, covering: new interfaces (voice, visual, apps, video) plus next-generation technical SEO (structured markup, performance optimisation).

Slido have been supporting Search London for years and we are very happy to be a media partner for the birthday party. Sli.do is an audience interaction platform for meetings and events. It allows event organisers to crowd-source top questions for Q&A sessions and get instant feedback via live polls, which will be great for our presentations on Tuesday!

We look forward to seeing you next week.


Search London’s 7th Birthday Party

We are excited to announce the 7th birthday party at Urban Golf in Smithfield, on Tuesday, January 30th from 6:00pm.

Come and celebrate Search London turning 7 years old with food and drink and of course cake!

Plus as we are at Urban Golf we have exclusive use of the whole place including 7 golf simulators. Practice your golf and network, all while being indoors.

Our Headline Sponsor is Searchmetrics with Malte Landwehr presenting “In China, 2018 is the Year of the Dog.”

Malte Landwehr is the Director for Product Marketing and Product Solutions at Searchmetrics. He has more than 10 years of experience in search marketing and is a regular speaker at conferences across Europe. Before joining Searchmetrics, Malte worked as a project manager and consultant for digital strategy at a management consultancy.

Searchmetrics logo

Since 2018 is the year of the dog, Malte will talk about the Mobileman Pinscher, the Content Terrier and the Shetland Speechdog. After the talk, you will know how ranking factors are different on desktop and mobile devices, and how to create better content that resonates with your readers. And of course, there will be some nice statistics about voice search and digital assistants that you can drop in your next meetings.

You cannot have a birthday party without food, and thank you to ConsiderableInfluence (CI) for sponsoring the food. CI is an influencer marketplace, working primarily with bloggers and their social reach, with more than 700 influencers already active on the site and ready to work on your campaign. Influencers cover Travel, Food, Lifestyle, Fashion, Tech & Gadgets in many countries. Sign up for free today.

Considerable Influence

Pierre Far will be presenting “It’s 2018, can we actually make SEO be about the user for real?”

He will cover the following topics to illustrate how SEO is about users:
– Mobile-first indexing, and technical SEO in general
– Page speed
– IA, tying it back to SEO

Pierre is a digital product management consultant, with a specialty in SEO. Prior to founding Deliberate Digital and Blockmetry, Pierre held several roles at Google and in the UK technology sector, including product management, community management, innovation consulting, and online marketing.

Pete Reis-Campbell will be sharing his thoughts on “Voice Search and AI in 2018”. Pete is the owner and founder of the award-winning agency Kaizen, which he set up just 4 years ago and which now has a team of 15. Kaizen is a Content Marketing agency based in the Tech Hub of London.

We could not put on such a big event without our sponsors, and Ricemedia will be one of our drinks sponsors for the evening. Ricemedia are a Birmingham-based Digital Marketing Agency, specialising in SEO, PPC and Social Media. Founded in 2001, we have grown to a team of 24 and have worked with clients including Superdry, Hardly Ever Worn it and Aston Villa. You may have seen us at BrightonSEO, Digital Olympus, Internet Retailing and here: we’re slowly muddying our Brummie expertise into the Search world.

Laura Hogan from Rice Media will be speaking about “5 Minutes, 5 Tips: Go!”

Thank you to Distilled for being our drinks sponsor as well. Distilled is an award-winning online marketing agency that believes in sharing knowledge through its blog, industry-leading conferences and interactive training resources. With offices in London, New York and Seattle, the team works to build exceptional online businesses. Tim Allen will be speaking about “SEO Split Testing 2018”.

Spaces for the party are limited, only 100 tickets available. Purchase your ticket on Eventbrite to reserve your place.

If you would like to be involved in our birthday party, please contact @SearchLDN.