
Companies Losing More Than £300m In Sales Through Poorly Performing Websites

23 October 2007 – UK e-commerce sites are losing more than £300m* in sales each year and risk driving customers to competitors because of ‘invisible errors’ that cannot be detected by web analytics.

One third of the consumer online journeys tested by SciVisum experienced error rates of more than three per cent, while more than ten per cent showed extreme inconsistencies in delivery speed.

“This is a worrying trend for eCommerce and IT directors and for consumers with Christmas just around the corner. Poor performance and web errors will mean lost sales,” said Deri Jones, CEO, SciVisum. “Companies wanting to maximise their online sales this Christmas need to check the performance of their sites now. Those that fail to do so might as well include a link to their competitors’ site.”

These are the key findings of the SciVisum Lost Online Sales Study, which investigated the performance of 40 online sites from the retail, finance, insurance and travel sectors over a period of six months. The study confirmed that customers of e-tailers are being exposed to a significant number and range of online problems which prevent them from completing their desired journeys, but which are invisible to existing tools and web analytics.

Invisible errors

“Invisible errors are not outages affecting 100% of users, but are problems that impact a percentage of users at any point in time. A problem that impacts, say, 1 in 100 random users on a particular journey is not reproducible by IT teams, and so frequently remains unresolved,” said Jones.

SciVisum's testing adopts a mystery-shopper approach that actually visits the site and attempts to make a user journey every five minutes throughout the day. This allows the company to see what customers see, and makes it possible to identify a range of intermittent problems that impact real users but are invisible to any other analysis. These problems include the following (a minimal sketch of such a journey check appears after the list):

- Session swap: where two users see each other's online sessions. Nowhere is this category of problem detected in server or analytic logs.

- Page not delivered errors: because the page is not delivered, there is no log of the error in web analytics.

- Jump back: the user is erroneously forced back several pages. Because the new page is itself a valid page, no error is logged in analytics or technical logs.

- Page content incomplete: web analytics logs only that a page was delivered, not whether it showed the user what they expected.

- Shopping Basket errors: e.g. basket is empty after adding items. Nowhere is this category of problem detected in server or analytic logs.
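To make the mystery-shopper approach concrete, here is a minimal sketch of a multi-page journey check in Python. The shop URL, step names and expected-content strings are hypothetical placeholders for illustration; this is not SciVisum's actual tooling, simply the general pattern of walking a journey, timing each page and checking that the expected content was delivered.

# Minimal sketch of a "mystery shopper" style journey check. The shop at
# https://shop.example.com, the URLs and the expected-content strings are
# illustrative assumptions, not SciVisum's actual product.
import time
import requests

JOURNEY = [
    # (step name, URL, text the delivered page is expected to contain)
    ("home",          "https://shop.example.com/",                     "Welcome"),
    ("search",        "https://shop.example.com/search?q=mug",         "results"),
    ("add_to_basket", "https://shop.example.com/basket/add?sku=123",   "added"),
    ("basket",        "https://shop.example.com/basket",               "1 item"),
]

def run_journey():
    """Walk the journey with one session, flagging the 'invisible' error types:
    pages not delivered, incomplete content, and empty-basket style failures."""
    session = requests.Session()           # one session per simulated shopper
    results = []
    for name, url, expected in JOURNEY:
        started = time.time()
        try:
            response = session.get(url, timeout=30)
            ok = response.status_code == 200 and expected in response.text
            reason = None if ok else f"status {response.status_code} or missing '{expected}'"
        except requests.RequestException as exc:
            ok, reason = False, f"page not delivered: {exc}"
        results.append({"step": name, "ok": ok,
                        "seconds": time.time() - started, "reason": reason})
        if not ok:
            break                           # journey abandoned, as a real user would
    return results

if __name__ == "__main__":
    for step in run_journey():
        print(step)

In production a script of this kind would be scheduled every five minutes around the clock, so that a problem hitting, say, 1 in 100 journeys shows up as failing samples rather than disappearing into averaged statistics.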

Need for speed

The research also highlighted massive inconsistencies in the delivery speeds of the journeys that users undertake. More than thirty per cent of journeys experienced performance varying by more than 200 per cent, with one in ten varying by more than 300 per cent, based on data averaged over a seven-day period.

Variations in performance mean that returning visitors will be frustrated on the occasions when the website runs more slowly, while first-time visitors are likely to be driven to competitors. Because different technology blocks are used to deliver the different routes that customers follow, those journeys provide significantly different experiences, even though they run on the same website.

Frustratingly for consumers, website performance was shown to be at its worst most commonly in the evenings between 8pm and 10pm. This often coincides with peak traffic levels, meaning that the site performs worst when it will inconvenience the most visitors. This is often invisible to the eCommerce manager, because simplistic measures of overall page-speed averages per day hide the fact that just one or two of the core user journeys perform very poorly for a couple of hours each day.
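As a rough illustration of how a daily average can mask a two-hour evening slowdown, the following worked example uses invented hourly figures (not data from the study) and assumes the variation percentage is calculated as the gap between the slowest and fastest hours relative to the fastest.

# Illustrative numbers only (not from the study): average journey time in
# seconds for one checkout journey, per hour of the day.
hourly_seconds = [4.0] * 20 + [12.0, 12.0] + [4.0] * 2   # slow between 8pm and 10pm

daily_average = sum(hourly_seconds) / len(hourly_seconds)
variation_pct = (max(hourly_seconds) - min(hourly_seconds)) / min(hourly_seconds) * 100

print(f"daily average:    {daily_average:.1f}s")        # about 4.7s -- looks healthy
print(f"worst hours:      {max(hourly_seconds):.1f}s")  # 12.0s during the evening peak
print(f"hourly variation: {variation_pct:.0f}%")        # 200%, hidden by the daily average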

Unacceptable behaviour

“As a specialist web tester, we’ve been coming into contact with invisible errors for some time now, but it was only when conducting this research that the extent of the problem became so apparent. The UK’s online landscape is plagued by these errors, and as users continue to become web savvy and increasing numbers of people encounter them, they won’t remain invisible for long,” said Jones.

“Interestingly, we often find that our clients’ Call Centre staff are aware that there are problems impacting users that are invisible to their own colleagues in other departments. It seems that a deaf ear is turned to their feedback: they are repeatedly told that 'no problem was found' or that the problem 'couldn't be reproduced by the tech team'. That last phrase is a typical internal response to the kind of sporadic, invisible errors we found, and it means that the underlying problems can never be addressed,” added Jones.

Recommendations

Based on the findings, SciVisum made a number of broad recommendations for e-tailers to improve their performance:

1. Adopt a mystery-shopper approach and test the paths that real users take when making use of your company’s website. Simple uptime/downtime monitoring of your home page and/or a few main pages won't reveal invisible errors; what is required is 24/7 functional monitoring that runs multi-page user journeys mimicking real users' product-finding and purchasing transactions online.

2. Focus on relevant performance data. There is a wealth of website performance data available to firms, and this has in part contributed to the continued prevalence of invisible errors. To aid detection, companies must focus on data gathered by simulating real user journeys and experiences, not data from internal servers or monitors.

3. Business people must take ownership of the issue. During the study we found that it is business and marketing personnel who are most aware that there is an issue with their website. Yet when the issue is raised with the IT department, they are fobbed off with an avalanche of data indicating that the site is performing well from a metrics perspective. Business people must push back when this occurs and accept only data that is relevant to the user’s experience.

Methodology

SciVisum tested 52 multi-page user journeys for 40 different online organisations over the six-month period April to September 2007. SciVisum’s multi-page journeys were measured at five-minute intervals and were defined with each client so as to follow the same core path that real users take when they make use of an online portal. These journeys measure the vital money-making or customer-servicing features of a website, including Add-To-Basket: browse and select; Add-To-Basket: text-search and select; and Checkout and Pay. Five-minute sampling equates to approximately 300 measurements (288 at exact five-minute intervals) over each 24-hour period.

Each page of a journey is checked to ensure that the correct and expected content has been served. Speed timings are also measured for the total journey, and for each page that makes up the journey.
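As a sketch of how such five-minute samples might roll up into per-journey figures, the following assumes one simple record per measurement run; the field names and sample values are invented for illustration, not taken from SciVisum's reports.

# Sketch of aggregating five-minute journey samples into daily figures.
from statistics import mean

# One entry per measurement run: total journey time in seconds, and whether
# every page of the journey delivered the correct and expected content.
samples = [
    {"journey_seconds": 3.8, "all_pages_correct": True},
    {"journey_seconds": 4.1, "all_pages_correct": True},
    {"journey_seconds": 9.7, "all_pages_correct": False},  # e.g. basket emptied itself
    # ...in practice roughly 288 samples accumulate per 24-hour period
]

error_rate = sum(1 for s in samples if not s["all_pages_correct"]) / len(samples)
print(f"error rate:        {error_rate:.1%}")
print(f"mean journey time: {mean(s['journey_seconds'] for s in samples):.1f}s")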

A management report detailing SciVisum’s Lost Online Sales Study is available to download at: http://www.scivisum.co.uk

-ends-

Notes to editors:

* Total online sales for 2006 reached £30.2bn according to IMRG. SciVisum’s Lost Online Sales Study revealed that 1 per cent of journeys experienced significant numbers of errors; 1 per cent of £30.2bn is roughly £302m, the basis of the £300m figure.

About SciVisum

SciVisum is a UK based web site testing specialist, helping clients to reduce lost sales online by identifying where and when user experience suffers.

The services provide vital data not available from web analytics or other web monitoring:

- when errors impact users but remain invisible to the in-house teams

- when wrong or missing page content forces users to abandon their purchase journeys

- what % of marketing campaign traffic is lost due to under-capacity in one or more vital steps such as 'add to basket' or 'checkout' pages.

The company's services measure the performance and functionality of clients' business-critical online systems. Using the multi-page user journeys approach to measurement, SciVisum’s metrics provide real-time KPIs and act as a common language between the business and marketing teams, who work daily with journey concepts such as Add-to-Basket, Checkout, Register, pay-online and login, and the web technical teams, who need precise input as to which step of which journey is under-performing, when and how, in order to apply technical resources most effectively and close the problem gaps.

Through SciVisum's testing and recommendations, clients are able to substantially increase visitor rates and customer satisfaction levels by achieving gains in key journey delivery times, increasing the ability to handle peak load levels, and reducing the sporadic but user-numbing error rates of 1 to 5% that most sites unwittingly force on their users.

Clients come from a wide range of sectors and include Tesco, Boden, T-Mobile, Virgin Retail, Shell, Jessops, Gold Medal Travel, Ann Summers, Hertfordshire Council, Premium Bonds, Scottish & Southern Energy, BDO Stoy Hayward, National Savings and Investment Bank and uSwitch.

Test deliverables include:

SV-Monitor: 24/7 measurement of customer experience by User Journey

SV-Load: Load testing /Stress testing of User Journeys

SV-Access: Accessibility testing to the WAI guidelines

SV-Function: Functionality & troubleshooting audits and consultancy on web performance

Media contact:

Sam Grace/ Stephen Waddington
Rainier PR
Telephone: +44 (0) 20 7494 6570
Email: sgrace@rainierpr.co.uk/swaddington@rainierpr.co.uk
