Google Search Console (GSC) offers a wealth of data on how visible your website is in search and on any problems Google finds with it, but turning that information into insight – and understanding what it means for your business – isn’t always as simple as we’d hope.
This guide isn’t about how to set up Google Search Console on your website – you can read a great step-by-step guide on that by clicking here.
GSC currently holds up to 16 months of data (up from the 90 days it used to hold), which allows you to compare your current three months of data directly with the same period last year, or to look at any range within that window, in whole or in part, as needed.
By default, reports generally show you the last three months of data at once, but you can easily edit this view to one of the set options or a custom date range.
The essential reports for retailers that we’re going to dig into are:
To get an overview of how your site has performed over a given period, choose the ‘Performance’ report from the main left-hand menu and set the filters as you wish.
The search type refers to the type of traffic, so in addition to ‘web’ searches, you can also look at the data for image searches or video searches, if that is a significant source of traffic for your site. For the purposes of this guide, we’ll stick with ‘web’, as this will usually account for the majority of traffic for retail websites.
You can filter the timescale of data that you view to see anything from the last 7 days, to the full 16 months of stats.
Clicking on the date filter will also bring up the option to compare data. Choose one of the standard options or input a custom date if that better suits your needs e.g. if you published SEO-related updates to the site on a specific date and want to see the impact this has had on organic performance.
You can also compare different types of traffic e.g. the web and image categories.
In GSC, the term ‘queries’ refers to the search phrases, or keywords, people used when searching via Google, for which your site appeared as an organic result. This information used to be available in Google Analytics (GA), but from around 2011 onwards Google began encrypting it, so the vast majority of organic traffic in GA is now attributed to (not provided). In GSC, you can access many of these queries again – and, vitally, see the number of SERP impressions your site gained for each search term, as well as the number of clicks the result received, within the period you’re filtering by.
For many well-known brands, the biggest keyword drivers will usually be brand name related; however, finding the non-brand keywords used by people to find you can give great insight into how well your SEO activity is working and the growth or progression of those terms in bringing organic traffic to the site.
You can filter the queries viewed in this report to show only those not containing your brand name by adding a new filter alongside the search type and date.
Choose query and you can set the filter to include only keywords that don’t contain a reference to your brand to quickly review the non-brand visibility of your site in Google search results.
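If you prefer to pull this data programmatically, the Search Analytics API supports the same ‘query notContains’ filter via its dimensionFilterGroups. A minimal sketch of the request body and the equivalent local filter – the brand term ‘acme’ and the sample rows are invented for illustration:

```python
# Sketch: replicate GSC's "query doesn't contain brand" filter.
# The request body mirrors the Search Analytics API's dimensionFilterGroups;
# the brand term "acme" and the sample rows are hypothetical.

BRAND = "acme"  # hypothetical brand term

request_body = {
    "startDate": "2024-01-01",
    "endDate": "2024-03-31",
    "dimensions": ["query"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "query",
            "operator": "notContains",
            "expression": BRAND,
        }]
    }],
}

def non_brand_rows(rows, brand=BRAND):
    """Keep only rows whose query does not mention the brand."""
    return [r for r in rows if brand not in r["query"].lower()]

sample = [
    {"query": "acme suitcases", "clicks": 120},
    {"query": "lightweight cabin suitcase", "clicks": 45},
    {"query": "acme curtains sale", "clicks": 80},
    {"query": "blackout curtains", "clicks": 30},
]

print([r["query"] for r in non_brand_rows(sample)])
# ['lightweight cabin suitcase', 'blackout curtains']
```

The same result as applying the filter in the GSC interface, but scriptable, so you can schedule a regular non-brand visibility report.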
Understanding which keywords your website appears for in SERPs can help you see whether your current SEO strategy is effectively targeting the terms most relevant to what you sell. You can chart the progress of keywords you are actively focusing on by comparing performance over time, and use this data, combined with info from your analytics platform, to help shape your strategy and focus going forward.
If you sell luxury suitcases and curtains, you wouldn’t want people looking specifically for cheap products to see your website in their search results. This could explain a very low click-through rate (CTR) from the SERPs, or a high bounce rate for those who do click through. An example that extreme is unlikely unless your products have recently changed from one to the other; but for retailers selling products whose names can mean more than one thing – e.g. tablet cases for consumers vs tablet cases for retail point-of-sale solutions – the data in GSC can help you decide whether to focus your efforts on long-tail queries that have less volume but are far more relevant to what you offer, and therefore more likely to reach a qualified audience.
Average position for keywords isn’t a particularly useful indicator of performance at this level. As your site becomes better optimised and visible for more new keywords, the average position figure tends to rise (worsen), when you’d naturally want it to fall. The figure can be misleading because it is often far more profitable for retailers to rank for a wider selection of keywords that result in sales than to hold higher positions for a smaller number of keywords that don’t generate as much revenue, because they’re less relevant or super-competitive.
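The arithmetic behind this is easy to demonstrate. In the sketch below, all positions are invented: a site ranking well for five keywords starts appearing for five new long-tail terms at lower positions, and the headline average worsens even though overall visibility has improved.

```python
# Illustration: becoming visible for NEW long-tail keywords can worsen the
# headline "average position" even though visibility improved.
# All positions below are invented for the arithmetic.

established = [2, 3, 4, 5, 6]        # positions for existing keywords
new_longtail = [25, 30, 35, 40, 45]  # newly ranking long-tail terms

before = sum(established) / len(established)
after = sum(established + new_longtail) / len(established + new_longtail)

print(before)  # 4.0
print(after)   # 19.5
```

The site now appears for twice as many queries, yet the average position has jumped from 4.0 to 19.5 – which is why impressions, clicks and revenue per keyword are better yardsticks.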
GSC also allows you to look at the same impressions and clicks data for individual pages of your website, helping you quickly see which pages appear most in the SERPs and which achieve the most click-throughs. You can also filter by metrics such as CTR to discover pages that are getting a high ratio of impressions but few clicks (or the reverse).
This data can highlight pages that require some further optimisation; if people are seeing the page in results but not clicking through the result as often as other pages of yours, what is different about this page? Does the meta description include a compelling CTA? Is the page title a good description of the content of the page?
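A simple way to spot these candidates in exported data is to compare each page’s CTR against the site-wide figure. A sketch, using hypothetical page data and an arbitrary ‘half the site average’ threshold:

```python
# Sketch: flag pages whose CTR is well below the site average, suggesting
# a title/meta-description review. The page data is hypothetical.

pages = [
    {"page": "/suitcases", "impressions": 10000, "clicks": 600},
    {"page": "/curtains", "impressions": 8000, "clicks": 480},
    {"page": "/sale", "impressions": 12000, "clicks": 120},
]

total_impr = sum(p["impressions"] for p in pages)
total_clicks = sum(p["clicks"] for p in pages)
site_ctr = total_clicks / total_impr

# Flag anything under half the site-wide CTR (threshold is a judgement call)
flagged = [p["page"] for p in pages
           if p["clicks"] / p["impressions"] < site_ctr / 2]

print(f"site CTR: {site_ctr:.1%}")  # site CTR: 4.0%
print(flagged)                      # ['/sale']
```

Here ‘/sale’ earns plenty of impressions but converts far fewer of them to clicks than its siblings, making it the obvious first page to review.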
As with queries, benchmarking and charting the progress of these metrics over time will help you to measure SEO success, and show you areas that are underperforming, so that you can take action and use your resources in the areas where most needed.
Being able to analyse how your site is performing in SERPs across devices can play an important role in highlighting potential issues. Using the data, either as a whole or in comparison, can help to flag issues such as a disparity in CTR between devices. This kind of data can help you discover areas that you need to focus on and gives you a benchmark to use to measure progress in reports to stakeholders.
Crawl errors can be fairly common on retail websites because many sites are set up to make product pages disappear from view when the product goes out of stock or is discontinued, although there are a number of other reasons why a retail brand website might have this type of problem. A crawl error essentially means that Google’s bots cannot crawl a specific page or part of the site, and therefore the page(s) won’t be indexed and won’t show up in people’s search results when they Google relevant terms.
On the GSC overview dashboard, you can scroll down to ‘Coverage’ to view at a glance the number of pages that Google can index on your site, and the number that it can’t.
Clicking into this report will show more details about the type of crawl errors found (and when they were discovered) which can help you track down the issue and give you a starting place to try and resolve the problem. The screen may look similar to the below:
From the above, you can see that there are 704 pages that Google has no problem crawling, plus another 4.17k pages that Google believes have been excluded on purpose (often via robots.txt). There may also be pages showing as valid with warnings, which means Google can crawl them but has found conflicts or other issues that suggest you may not want them crawled and shown to searchers. If your site has pages that cannot be crawled at all, they will show at the top left, and you can scroll down and click on each type of error for more details, e.g. when it was spotted by GSC and which pages or types of page share the issue.
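For context, those intentionally excluded pages are most often blocked in the site’s robots.txt file. A typical retail example – the paths here are invented, so check against your own site before copying anything:

```
# Hypothetical robots.txt for a retail site; the paths are placeholders.
# Pages matching these rules appear in Coverage as excluded by robots.txt,
# not as crawl errors.
User-agent: *
Disallow: /checkout/
Disallow: /basket/
Disallow: /search
Sitemap: https://www.example.com/sitemap.xml
```

Checkout flows and internal search results are commonly blocked like this on purpose; the thing to verify in the Coverage report is that no revenue-driving category or product paths have been caught by an over-broad Disallow rule.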
Clicking on each of the errors in the list will give you a link to GSC information on what could be causing that specific type of issue and how it can be resolved. Once you think the issue has been fixed, you can click the ‘Validate Fix’ button for that error to test whether it has been resolved. However, it can often take 2–4 weeks for Google to run this process, so it’s not a quick job to check!
If crawl errors aren’t something that has been looked at before, it’s important to check whether important pages or areas of the website are unintentionally inaccessible to Google, as this could mean you are simply not as visible as you should be in SERPs. Resolving these errors and issues is often one of the first tasks when approaching overall website optimisation from a technical SEO point of view.
Keeping an ongoing eye on the number and nature of crawl errors (and the number of indexed/valid pages) is an important part of any SEO maintenance and site health-check activity, and if this is being managed well, the percentage of crawl errors should go down over time. More errors may appear as new things or sections are implemented on the website, but these can be resolved as and when required, and keeping the page or site crawl errors to a minimum will benefit your overall Google visibility.
As with the crawl errors, you can click on each error line in the list to see more details, a list of the affected URLs and some help information to enable you to resolve the problem.
Mobile traffic is rising all of the time for most retailers, so it’s vital to make the user experience as straightforward and intuitive as possible to maximise the number of transactions and increase your revenue. If someone has a poor or frustrating experience whilst trying to shop for something when on their mobile, they are highly unlikely to try again with another device – it’s much easier to simply go to another retailer that makes shopping via mobile really easy. Prioritising mobile usability is a good way to increase your mobile conversion rate, and it’s a great idea to closely monitor conversion rate in GA alongside any changes you make to the mobile UX, so you can correlate the changes with the result and report on this to stakeholders too.
XML site maps are designed to increase the ‘crawlability’ of your website for search engines. They are essential for large retail sites with a high number of pages and for brand-new retail sites, but are useful for sites of any size or age from an SEO point of view.
Some larger sites can have multiple site maps that each cover a different section of the website, but for many retailers, unless they sell many tens of thousands of different products, a single site map (that may be segmented internally) is usually all that is required.
You can submit your XML site map through GSC and it will flag up any problems that occur, showing when Google last read it and how many URLs it has indexed in each section.
An XML site map isn’t simply a list of URLs for search engines to crawl; it also gives Google indications about site structure, showing which pages you think are most important, which can ultimately play a part in whether and how your website is shown in SERPs.
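To make that concrete, here is a minimal site map fragment following the sitemaps.org format – the URLs, dates and priority values are placeholders, with priority being the optional field that hints at a page’s relative importance within your own site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-03-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/suitcases/</loc>
    <lastmod>2024-02-20</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

Most e-commerce platforms generate this file automatically; the point of seeing the raw structure is knowing what GSC is reading when it reports how many of your submitted URLs have been indexed.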
Keeping an eye on the site map(s) submitted through GSC is an important task that only takes a few seconds each time, but it means that if any issues are flagged with an existing site map, you can react quickly to resolve the problem and ensure your site remains as crawlable as possible for maximum search engine visibility.
Take advantage of our FREE website audit offer by clicking the button below.
The free audit can help you build your retail digital marketing strategy by highlighting some of the weaker areas of your site.