As an e-commerce professional, you likely look at dozens of metrics each week to assess your site’s success. From visits to traffic value to conversion rate, there are plenty of metrics you can use to understand how your site’s current setup is performing.
An often-overlooked metric for e-commerce websites, though, is crawl depth. This metric can tell you a lot about user behaviors like bounce rate and exit rate, and it may also explain gaps you’ve seen in your search engine optimization efforts.
In this article, we’ll introduce crawl depth and its importance in e-commerce.
We’ll then analyze five major e-commerce websites to help you understand best practices surrounding crawl depth for your website.
So, if you’re looking to improve user experience and enhance SEO, then read on.
What is Crawl Depth and Why Is It Important?
Before we get into the best practices, we first must define crawl depth.
Crawl depth is the number of links that a crawlbot has to crawl through to reach a particular page on your website. This is also sometimes referred to as click depth, which is the number of clicks a user must perform to get to that same page.
What does this look like?
Think of the homepage as position zero. You click a top-level category in the navigation. That’s one click. Then you click into a lower-level category from the category page. That’s two clicks away from the homepage. You finally click into a product page. That means your product pages are three clicks from the homepage.
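If it helps to see that counting logic spelled out, here’s a minimal sketch in Python (using a made-up link graph, not any real site) that treats click depth as the fewest clicks needed to reach a page from the homepage, found with a breadth-first search, which is essentially the shortest-path idea crawl tools rely on.

```python
from collections import deque

# A toy link graph: each page maps to the pages it links to.
# The structure mirrors the example above: homepage -> category ->
# subcategory -> product. All page names here are hypothetical.
links = {
    "/": ["/mens", "/womens"],
    "/mens": ["/mens/shoes"],
    "/womens": ["/womens/jackets"],
    "/mens/shoes": ["/product/trail-runner"],
    "/womens/jackets": ["/product/rain-shell"],
    "/product/trail-runner": [],
    "/product/rain-shell": [],
}

def click_depths(start="/"):
    """Breadth-first search from the homepage; depth = fewest clicks to reach a page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # only keep the shortest path to each page
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

print(click_depths())
# e.g. {'/': 0, '/mens': 1, ..., '/product/trail-runner': 3}
```

In this toy structure, the product pages land at a depth of three, which matches the three-click example above.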
So, why is crawl depth so important?
When thinking about crawl depth, or click depth, you need to think of user experience and content indexing.
As it relates to user experience, the more clicks from the homepage to an important page on the site (like a product page), the poorer the experience is for the user.
A high number of required clicks usually signals that the site architecture is illogical or poorly designed. If users spend a lot of time clicking around, they may get confused or frustrated. And the more clicks it takes, the greater the odds they get distracted along the way.
All of that to say, a lower click depth is important for user retention.
Crawl depth isn’t just important for real-life users, though. It’s also important to reduce crawl depth so that more of your content is indexed by search engines.
Search engines crawl your website using crawl bots. Contrary to what you might think, not all of your site’s URLs get indexed. It simply comes down to a matter of bandwidth, often referred to as crawl budget.
You can increase the percentage of your site that is indexed by removing old pages and reducing average crawl depth. The fewer URLs to cover, and the lower your crawl depth, the more of your important content can be indexed.
The more content indexed by Google, the more content will show on Search Engine Results Pages (SERPs). So it’s really a win-win.
Crawl Depth From 5 E-Commerce Giants
If you’ve been in e-commerce for any length of time, you know that best practices are often derived from the top names in the industry. So, what are the biggest and most profitable websites doing?
To answer this question, we’ve analyzed the average crawl depth of five e-commerce giants:
- Nike
- Patagonia
- REI
- Athleta
- Aerie
In particular, we used Screaming Frog to find and analyze the crawl depth as it pertains to category pages and product pages.
What did we find?
Nike: Streamlined Navigation
When analyzing crawl depth, you have to have an idea of the site’s structure. That is, what are the most important pages and how do you get to them?
With Nike, the most “important” URLs include “/w/” and “/t/”. The pages that include “/w/” are the category pages. These are where the bulk of the products live. The pages that include “/t/” are the product pages. These are the single-item pages where items can be added to the cart.
The start URL is the homepage. This is true for Nike as well as the other four e-commerce sites we’re analyzing. The crawl depth, then, will tell us how far away from the homepage these high-priority pages live.
So, what’s the average?
More than half of the URLs (53.6%, to be exact) are two clicks or less away from the homepage. This is ideal. The bulk of the remainder of the URLs (40.92%) are three clicks away. This means that almost 95% of the site’s URLs are three clicks or less away from the homepage.
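If you want to run the same kind of check on your own crawl data, here’s a rough sketch of how a Screaming Frog export can be summarized with pandas. The file name and the “Address” and “Crawl Depth” column names are assumptions based on a typical “Internal: All” export, so adjust them to match whatever your version of the tool produces; Nike’s “/w/” and “/t/” patterns are shown, but you would swap in your own.

```python
import pandas as pd

# Hypothetical file name; in practice, export your internal URL report
# from Screaming Frog. We assume it contains "Address" and "Crawl Depth" columns.
crawl = pd.read_csv("internal_all.csv")

# URL fragments that identify the page types we care about (Nike shown here).
patterns = {"category": "/w/", "product": "/t/"}

for page_type, fragment in patterns.items():
    subset = crawl[crawl["Address"].str.contains(fragment, regex=False)]
    share_within_3 = (subset["Crawl Depth"] <= 3).mean() * 100
    print(f"{page_type}: {len(subset)} URLs, "
          f"{share_within_3:.1f}% within three clicks of the homepage")
```

Running a summary like this for each of your key URL patterns gives you the same “how much of my important content is within three clicks” view we use throughout this analysis.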
Patagonia: A Breadth of Categories
Patagonia has a large number of category pages and product pages. While this could be problematic for other reasons (site load time, product cannibalization, etc.), it’s not necessarily problematic for crawl depth. That is, as long as the bulk of these pages are a reasonable distance (i.e., three clicks or less) from the homepage.
On Patagonia, category pages are denoted by URLs containing “/shop/” and product pages are denoted by URLs containing “/product/”. A significant number of Patagonia’s pages (36.58%) were three clicks away from the homepage. A small share – 1.95%, to be exact – were only two clicks away from the homepage.
REI: Logical Subcategorization Breakdown
Due to the constraints of our scraping tool, we could only crawl 91% of REI. However, the data is still a good indicator of how REI organizes its pages.
Similar to Patagonia, REI has a large number of category pages. These include “/c/” in the URL. Product pages are denoted by “/product/” in the URL.
So, what is REI’s average crawl depth?
According to our data, less than half of the category pages were within two clicks of the homepage. That leaves a whopping 65.7% of category pages sitting three or more clicks away.
This is likely due to REI’s site structure, which relies on numerous subcategories under each category. As a result, many product pages sit four or more clicks from the homepage.
While a crawl depth of four or more may not be ideal, that doesn’t mean you should never sacrifice crawl depth when there are other benefits. For example, if you have a large number of products, it may make sense to use more subcategories to make navigation more logical for your customers.
Ultimately, whether or not this is the right decision largely comes down to consumer behavior, so you need to be ready to test and monitor regularly. For example, if you notice a drop-off in engagement after a certain point in the customer journey, it may mean customers are losing interest or getting frustrated at that point.
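One simple way to spot that kind of drop-off is to compare visitor counts at each step of the journey. The sketch below uses entirely made-up numbers and an arbitrary 50% threshold; plug in figures from your own analytics and pick a threshold that fits your baseline.

```python
# Hypothetical step counts from your analytics tool, one entry per click
# deeper into the site (homepage -> category -> subcategory -> product -> cart).
funnel = [
    ("homepage", 100_000),
    ("category", 62_000),
    ("subcategory", 38_000),
    ("product", 21_000),
    ("add to cart", 3_500),
]

# Flag any step where more than half of the remaining visitors drop off.
for (prev_name, prev_count), (name, count) in zip(funnel, funnel[1:]):
    drop_off = 1 - count / prev_count
    flag = "  <-- investigate" if drop_off > 0.5 else ""
    print(f"{prev_name} -> {name}: {drop_off:.0%} drop-off{flag}")
```

If the flagged step consistently sits at the same click depth, that’s a strong hint the extra layer of navigation is costing you customers.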
Athleta: Subcategories Upon Subcategories
Athleta has the widest range of crawl depths, with fewer than 0.50% of URLs within two clicks of the homepage.
Instead, URLs containing “/browse/”, which include both Athleta’s category pages and product pages, are spread across a wide range of depths, with 86.39% of URLs sitting seven or more clicks from the homepage.
This is far from an ideal experience, both for users and crawl bots.
The excessive crawl depth indicates that Athleta relies heavily on subcategories within subcategories. This is a poor organizational experience for users who are likely not used to such a specific subcategory breakdown.
With most of these pages seven or more clicks deep, many of Athleta’s category and product pages likely aren’t being indexed by Google. That means fewer rankings on SERPs, which can directly contribute to lower traffic and revenue.
Aerie: Quick and Clean Categorization
When analyzing Aerie, we look specifically at URLs containing “/c/”. These are indicative of category pages.
Aerie has a similar average crawl depth to Nike. A good portion of Aerie’s URLs (86.94%) are three clicks or less away from the homepage, with more than half of those URLs (51.31% exactly) being just two clicks or less away.
This means the bulk of product pages are just three or four clicks away from the homepage. This is ideal for the user experience but also for search engine indexing purposes.
What We Learned From Our Data
Four out of the five sites we analyzed had their most important pages – category and product pages – within two or three clicks of the homepage.
Athleta was the only site with the majority of its most important pages seven or more clicks away from the homepage.
This is a sub-optimal experience, both from a customer and SEO perspective.
For customers, having to dig so deep into the website before finally reaching product pages will likely lead to a lower conversion rate. This could be because customers get lost, become frustrated with the funnel, or simply lose interest.
From an SEO perspective, Google has crawl budget constraints that may make crawling these deeply nested pages impossible. This means less representation on SERPs and an incomplete picture of your site being reported back to search engines.
Does that mean that fewer clicks are always ideal?
No, not at all.
It’s important to balance click depth with ease of utility. Athleta’s deep nesting strategy is obviously working for them as they’re a top brand in the fashionable sportswear industry. This is likely because their offering is so varied that more categories and subcategories – and therefore, greater click depth – makes more sense from a customer navigation standpoint.
FAQs
Do you have more questions about crawl depth? Here are answers to some of the most frequently asked questions.
What is crawl depth?
Crawl depth is the number of URLs a search engine bot must “crawl” through to get to a specific page. This is also referred to as click depth, which is the number of clicks it takes for a user (as opposed to a search engine bot) to reach that page.
What is a good crawl depth?
While variations can occur, a good rule of thumb is to keep priority pages as close to the homepage as possible. Three or four clicks seems to strike a good balance between low click depth and ease of consumer navigation.
How does crawl depth affect my e-commerce site?
The further your users need to travel on your website to add products to the cart, the lower your conversion rate is likely to be. Also consider that crawl bots will have trouble indexing pages that are deep within your website, which results in fewer rankings on SERPs.
Is a large product catalog a problem for crawl depth?
Not necessarily, but it does bring its own unique challenges. For example, if you have a very large product selection, having a robust sub-navigation with many accessible categories is a UX benefit. However, it introduces a greater risk of cannibalization from an SEO perspective. So it’s really important to make sure your UX meets your customers’ needs, and then optimize SEO based on that.
Conclusion
If user experience and conversion rate optimization are important for your e-commerce website, then reducing crawl depth is a must.
Don’t worry, that doesn’t mean you need to completely restructure your site. There are a few adjustments you can make to help both users and crawl bots more effectively navigate your site. These include:
- Creating a robust internal linking strategy
- Considering your customers’ needs and deciding whether you need more or fewer subcategories, then tailoring your SEO to that approach
- Changing your category page pagination (see the sketch below for how pagination affects depth)
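On that last point, pagination can quietly add depth: if page 12 of a category is only reachable by clicking “next” eleven times, every product on it sits that many extra clicks from the homepage. Here’s a rough back-of-the-envelope sketch with hypothetical numbers showing the difference between “next-only” pagination and a full pagination bar (or a “view all” page).

```python
import math

products = 600          # hypothetical category size
per_page = 24           # products shown per paginated page
pages = math.ceil(products / per_page)   # 25 paginated pages in this example
category_depth = 2      # clicks from the homepage to the category page

# "Next-only" pagination: page N sits N-1 clicks past page 1,
# so products on the last page end up deep in the site.
deepest_next_only = category_depth + (pages - 1) + 1   # +1 to click the product itself

# A full pagination bar (or "view all") links every page from page 1,
# so no product is more than one page-click past the category.
deepest_linked = category_depth + 1 + 1

print(f"{pages} pages, next-only pagination: deepest product at {deepest_next_only} clicks")
print(f"{pages} pages, full pagination bar:  deepest product at {deepest_linked} clicks")
```

With these made-up numbers, the same catalog goes from a worst-case depth of 27 clicks to 4, which is exactly the kind of quick win pagination changes can deliver.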
As e-commerce professionals, we don’t have control over every aspect of our website and its performance. So, when we come across something that we can control – like crawl depth – we should do our best to optimize it well.
What steps will you take to begin to reduce your site’s crawl depth?