Outdoors Sector Ecommerce Audit Report for May
Over the past five months, I’ve used these audit reports not just to give people in the outdoors sector visibility of data and comparisons that are difficult to find elsewhere, but to convey some of the detailed underpinnings of highly effective brand and ecommerce websites. In some cases, I’ve been very explicit about the reasons for success and failure.
Some industry players accept that they, and the sector, are falling behind and have engaged with this data and analysis very positively, but they’re still in a minority. It strikes me that, in a relatively niche and inward-looking sector (I’ve worked in very similar and very dissimilar sectors so feel qualified to make this judgement), the confidence with which people approach product development or traditional forms of selling or marketing largely evaporates when it comes to digital transformation, communications and commerce. That’s a major problem in a world where everything from supply chains to customer behaviour is being altered by digital developments.
OK, that’s the rant over, so back to the audit reports. Instead of picking out key snippets this month, I’m going to run through a synopsis of what gets covered in the DigitalCrux audits to illustrate that the remedial actions are often numerous, but entirely achievable on short timescales (you might want to compare it with any audits that you already get).
The way that the DigitalCrux audits are constructed ensures that pretty much all the technical elements that might affect user experience or search engine performance are measured in both absolute terms and month-on-month comparisons (the latter being an important measure of internal actions and likely future performance). The overall scores in the charts below combine all this data and weight different aspects according to their likely importance - for example, high numbers of missing pages (404s) are more heavily penalised than high numbers of missing image alt tags, as they impact more heavily on user trust and search engine performance.
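As a rough illustration of that weighting idea, here’s a minimal Python sketch. The factor names and weights are my own placeholder assumptions, purely to show the mechanics - they’re not the actual DigitalCrux weightings:

```python
# Illustrative weighted audit score. The issues and weights below are
# invented placeholders, not the real DigitalCrux factors or weightings.
ISSUE_WEIGHTS = {
    "missing_pages_404": 5.0,   # heavily penalised: hurts trust and SEO
    "duplicate_titles": 2.0,
    "missing_alt_tags": 0.5,    # lightly penalised by comparison
}

def audit_score(issue_counts, max_score=100.0):
    """Start from a perfect score and subtract weighted penalties."""
    penalty = sum(ISSUE_WEIGHTS.get(issue, 1.0) * count
                  for issue, count in issue_counts.items())
    return max(0.0, max_score - penalty)

# Four 404s cost far more than twenty missing alt tags:
print(audit_score({"missing_pages_404": 4, "missing_alt_tags": 20}))  # → 70.0
```

The point of the weighting is exactly what the prose says: the same raw count of issues produces a very different score depending on how damaging each issue type is.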
The summarised elements of the audit are:
A general health check looking at whole-site issues, such as the presence of a robots.txt file, whether HTTP requests redirect to HTTPS, the presence of legacy technologies, etc.
An analysis of up to 5000 pages per website across 18 factors, including missing pages, blocked pages, overlarge page sizes, canonicalisation, etc.
Meta data analysis (as in the image above) across 8 factors that carries a high weighting for search engine effectiveness, but not usability, including missing or duplicated meta data (e.g. page titles and meta descriptions).
Analysis of the visible page content across 9 elements that carries a high weighting for both user experience and search engines. Again, this focuses on missing and duplicated content, but also pages with not enough (or too much) content.
A 12-factor analysis of internal and external links with two very different purposes. The internal and outbound linking analysis focuses on poor implementation, whilst the inbound links analysis looks at the number and quality of the links - as most of you will know, this is a critical driver of search engine visibility.
A separate analysis of images, with the key metric being missing alt tags (EVERY image should have one, if only for SEO purposes).
Finally, a detailed optimisation analysis, split by mobile and laptop emulations, that identifies problems in the speed and manner of page loading in browsers - a definite usability and search engine issue.
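To give a concrete flavour of the checks listed above, here’s a minimal sketch of the missing-alt-tag check from the image analysis, using only the Python standard library. The real audits will of course use a full crawler across thousands of pages; this is purely illustrative:

```python
from html.parser import HTMLParser

class AltTagAudit(HTMLParser):
    """Collect the src of every <img> that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # absent or empty alt both flagged
                self.missing.append(attrs.get("src", "(no src)"))

def images_missing_alt(html):
    auditor = AltTagAudit()
    auditor.feed(html)
    return auditor.missing

# Invented example markup: one compliant image, one missing alt, one empty alt.
page = '<img src="a.jpg" alt="Tent"><img src="b.jpg"><img src="c.jpg" alt="">'
print(images_missing_alt(page))  # → ['b.jpg', 'c.jpg']
```

Run across a full crawl, a check like this produces exactly the kind of per-page issue list the audits are built from.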
If you managed to stay with me through all the above, well done! The point I’ll now make is that, for all the factors listed above, the audits also tell me exactly where the problems are - that’s why I made the earlier comment that there are potentially numerous things to fix, but they’re not usually that difficult. For example, my full report for any of the companies in the charts tells me which pages can’t be found (so I can 301 redirect them, at least), which pages have duplicate titles (very common) and exactly what is slowing down page load times (maybe the outdoors retailer whose megamenu, present on every page, contains non-optimised images - those pages could load 40% faster!).
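To show how mechanical a fix like duplicate titles is to locate, a hypothetical crawl-to-duplicates step might look like this (the URLs and titles are invented for illustration):

```python
from collections import Counter

def duplicate_titles(pages):
    """Map each duplicated <title> to the URLs that share it.

    `pages` is {url: title}, e.g. as gathered by a site crawler.
    """
    counts = Counter(pages.values())
    return {title: [url for url, t in pages.items() if t == title]
            for title, n in counts.items() if n > 1}

# Invented crawl data - pagination is a classic source of duplicate titles.
crawl = {
    "/tents": "Tents | Example Outdoors",
    "/tents?page=2": "Tents | Example Outdoors",
    "/sleeping-bags": "Sleeping Bags | Example Outdoors",
}
print(duplicate_titles(crawl))
# → {'Tents | Example Outdoors': ['/tents', '/tents?page=2']}
```

Once you have the offending URLs grouped like this, the remediation (unique titles, or canonical tags for paginated variants) is routine work rather than a project.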
It may not be sexy, but it is the stuff that you’ll find on the regular ‘to-do’ lists of the digital marketers in really effective retailers and brands.
So, without further ado, here are the scores for retailers (hover over the columns for the actual scores):
Outdoors Retail Website Audit Scores - March 2019
The first thing to note is that, for the moment, Blacks and Go Outdoors have been removed from the index - Blacks because of the ongoing issues that I’ve commented on before and Go Outdoors because they appear to now have the same issue, although that’s maybe coincidental, as their tech stacks are very different.
Last month, things weren’t looking so rosy for AlpineTrek (see previous comments), but they’re now the only website to challenge Gaynors’ score since the index started. Their return to form this month is mostly a result of some focused cleaning-up, with the heavily-weighted ‘Important Fixes Required’ dropping from 58 to 38 - mostly through tackling issues with key pages.
Cotswold deserve mention for their steady progress, although their score could easily be improved with some concentrated tidying-up work - they have 296 ‘Important Fixes Required’ on the Pages analysis alone.
Conversely, Ellis Brigham continues to struggle, and one has to assume that they’re happy to focus on strong visual content to mitigate the impact of a weak site, as their performance continues its steady decline.
Outdoors Brands Website Audit Scores - February 2019
On the brand website front, and at the risk of appearing lazy, I’m going to refer straight back to last month’s commentary on Rab’s platform change. It’s surprising to see a migration, especially to Magento 2, have such a negative impact these days (good web teams know the pitfalls), but even more surprising to see many of the problems still evident a month later. Rab’s score has returned to nearer its past average, but it’s still below par, whether measured against similar ecommerce operations or good-quality Magento 2 implementations - and that migration won’t have been cheap or resource-light.
There’s clearly been some work done in several areas, including improving the internal linking, meta data and Google indexing, but significant areas remain untouched or incorrect (e.g. XML sitemap) and the audit currently shows over 1000 ‘important fixes required’.
Moving on, the continued improvements for Arcteryx are impressive, especially given the multi-territory nature of their website structure. This structure does have one crucial benefit, however, in that the rankings and domain authority that result from multiple powerful backlinks are concentrated into a single domain and filter down and across multiple territories. Hence their stellar Domain Authority of 67/100 (for comparison, Berghaus is doing well at 54).
The comparison with Berghaus is appropriate as, in many ways, their sector-leading audit scores both result from having sites with small technical and content footprints - Berghaus typically staying around 300 front-end pages and Arcteryx around 200 - making optimisation and site management much easier.
As a final note on the audits this month, TNF must retain incredible strength in their brand equity, paid search, visuals and product ranging in order to have a commercially successful website (which their financials suggest it is) on the back of an under-par technical and content implementation (and they’re the only site in the audit to see a fall in Domain Authority in the past 3 months). It’s surprising, because significant improvements would not be expensive or soak up very much resource.
Please note that next month’s audit report will complete a full 6 months of continuous retail and brand audit data that will stay on the site. Having had various people ask detailed questions about the audits this year, in future I’ll be using the DigitalCrux datasets to illuminate more focused articles that deal with specific issues. If you have any comments, feel free to contact me.
How Does the Website Score Work?
Some readers will be familiar with companies that offer or even send unsolicited ‘audits’ in an attempt to sell you website upgrades or SEO services. These scores are very different and use a very deep and broad set of indicators that combine the good, the bad and the ugly of a website’s platform, technical implementation, content and user experience. They also take into account the changing requirements placed on websites over time, such as smartphone performance.
It’s the same system that Resonant has used for a number of years for large ecommerce clients (although they obviously get the benefit of very detailed breakdowns of their strengths and weaknesses).
Fundamentally, improvements in audit scores, especially relative to close competitors, nearly always lead to measurable improvements in performance - the trick is to improve those areas that will most contribute to your business objectives, which might be product engagement and mobile-focused stockist signposting for brand owners or intuitive navigation and fast page-loads for retailers.