Our first foray into user testing with live participants

dug · Jun 10, 2010

In the spirit of Rob’s earlier post, I thought you guys might find a test report an interesting read. This morning, Debbie, Louis, Mohammed and Dug piled into Mohammed’s car and headed down to Hemel High Street. Debbie recruited, Louis kept the Mac level and stable, Mohammed tracked the tests and Dug interviewed and moderated.

Introduction

The DSGi Customer Experience team ran a goal-oriented test on a small sample of NRS social grade C1C2 users in Hemel Hempstead on 10 June 2010. The purpose of the test was to identify whether the “was now” user interface was behaving like a ‘honeypot’, trapping clicks that would otherwise have taken the visitor to the product page and on to the checkout. While the data sample was very small (ten participants, one invalidated during the test), the recruitment and test protocol was kept constant for all participants, and the hope was that some significant pattern might emerge from the test results.

Participant recruitment

Participants were recruited on the public footpath in the centre of town. They were offered a store gift token in exchange for their participation. As part of the recruitment process, two filter questions were asked:

  1. Do you shop online? (required response: "yes")
  2. Have you looked at the Currys website in the last 30 days? (required response: "no")

Once participants had been recruited, the test was conducted in an adjacent car park. All participants used the sample computer in the same position and under the same lighting. Participants used Firefox and were connected to the internet via a 3G ‘mobile broadband’ dongle.

Test protocol

The test was conducted using the Think Aloud Protocol. Where participants had difficulty using the computer’s trackpad, they were asked to put their finger on the screen at the position where they wished to click. This happened for participant Jeane, who is 65: once she had indicated where she wished to click, the test moderator clicked on her behalf.

Test script

Participants were shown a laptop computer with a web browser open to the Currys home page. They were given the following instruction:

This is the homepage of the Currys website. Please take a minute to review the content of this home page, and when you are ready, please find a "Beko" washing machine. If you can find this machine, please try to buy it.

Conclusions

While the test sample was very small, the recruitment and test moderation attempted to follow approved protocols. Overall, the test was a positive experience: it gave useful results and acted as an impromptu training session for team members who had not taken part in goal-oriented user testing before. The main findings are:

  • Dug's hypothesis that the honeypot was trapping 50% or more of the add-to-basket clicks was wrong. In the test, only one participant tried to click directly on the "save" area; a second clicked on it, but only as a result of being confused by the popup box.
  • However, as 15% of participants clicked on the honeypot, the test does indicate that the interface should be modified to remove the large red square (a rough sense of how much uncertainty sits behind a figure like that is sketched below).
  • Dug and Mohammed's assumption that best practice is for a click on a product image to link to the singleton view was confirmed by the observation that almost all participants clicked on the image of the product when asked to buy it.
  • The test also clearly indicates that both the product title and the product image should link to the singleton view. However, it might make sense in a future test to establish whether a link straight to the basket would be more appropriate on an ecommerce website.
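A quick back-of-the-envelope check of how wide the error bars are on a sample this small: the sketch below computes a Wilson score interval assuming one or two honeypot clicks out of nine valid sessions (illustrative counts for the sake of the calculation, not the recorded data).

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return centre - half_width, centre + half_width

# Illustrative counts only: roughly one or two honeypot clicks out of nine valid sessions.
for clicks in (1, 2):
    low, high = wilson_interval(clicks, 9)
    print(f"{clicks}/9 clicked the honeypot: 95% CI roughly {low:.0%} to {high:.0%}")
```

Either way, the 95% interval stretches from a few percent to roughly half, so the honeypot percentage should be read as directional rather than precise.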

Mohammed has attached the test findings as an Excel spreadsheet: utest_10june2010.xls

Interesting stuff and fun, to boot :-)


Written by dug
Hiya, life goes like this. Step 1: Get out of bed. Step 2: Make things better :-)