Online user testing solutions have become incredibly popular in the last couple of years. By providing a fast, easy-to-set-up test platform, it's no surprise they've become a regular tool for CRO agencies, international web enterprises and thousands of small companies.

With plenty of reviews available online there's no real need for me to go through the list of advantages. However, as a strong believer in the low-cost usability testing proposed by Jakob Nielsen and brilliantly promoted by Steve Krug, I think it might be interesting to take a more sceptical look at these solutions. Most importantly, I want to compare the costs against the amount and quality of data obtained.

So why might online user testing not be right for you?

Reason 1 – Less time = fewer tasks = less data = fewer insights

The average user session on these platforms lasts only about 15 minutes, severely limiting the number and length of the tasks in the scenario. Compare this to a DIY solution, where a user is more than comfortable spending an entire hour undertaking 2–3 lengthy tasks in addition to a homepage ‘5 second test’.

Reason 2 – Is it really the cheapest solution?

A couple of hours of your time and incentives for users are the only costs of doing user testing in person. Additionally, if your target audience is pretty general, you can always find some colleagues or friends who would do it for free or for a pint of beer.

On a typical platform, three users will cost you $117 (or $195 for five), and bear in mind you're likely to require twice the number of participants compared to DIY (see Reason 1).
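To make the comparison concrete, here is a back-of-envelope sketch of cost per hour of observed user behaviour. The platform prices ($117 for three roughly 15-minute sessions) come from the figures above; the DIY incentive ($10 per one-hour session) is a hypothetical example figure, not a quoted rate.

```python
# Back-of-envelope comparison: cost per hour of observed user behaviour.
# Platform figures ($117 for three ~15-minute sessions) are from the post;
# the $10 DIY incentive per one-hour session is a hypothetical assumption.

def cost_per_hour(total_cost, sessions, minutes_per_session):
    hours = sessions * minutes_per_session / 60
    return total_cost / hours

platform = cost_per_hour(117, 3, 15)  # 3 sessions x 15 min = 0.75 h of data
diy = cost_per_hour(3 * 10, 3, 60)    # 3 sessions x 60 min = 3 h of data

print(f"platform: ${platform:.0f}/h, DIY: ${diy:.0f}/h")
```

Under these assumptions the platform works out at roughly $156 per hour of user data versus $10 for DIY, before even accounting for the extra participants you may need (Reason 1).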

Reason 3 – Lack of recruiting control

With only demographic targeting available, online user testing solutions are essentially limited to generic sites. Any niche website or serious B2B product requiring domain knowledge from the user will struggle to find a suitable representative.

There is always the option to specify additional requirements, but when the decision as to whether a user meets those requirements is made by the users themselves, I would really question the level of accuracy.

Reason 4 – Professional test participants

Generally, the likelihood of a participant being recruited for a particular test (e.g. yours) depends upon their ‘rating’, which is given by test creators. Logically, the system picks users with higher ratings far more often than the rest, unintentionally creating a small set of ‘professional users’ who participate in plenty of tests on a regular basis. This creates at least two major problems:

1. The experience these users gain from a large number of test sessions dramatically shifts their behaviour towards an ‘expert review’ style rather than a ‘user thinking out loud’ one. They tend to focus on flagging lots of tiny general-practice roadblocks, when all we need is a user following the scenario and giving us behavioural insights, not opinions. There is nothing worse than an amateur giving expert advice.

2. Although they may not feel the fear of hurting the test creator’s feelings that occurs in DIY tests (especially when there is an incentive), online test participants want a higher rating from the test creator, and will often do almost anything to supply you with a set of minor, easy-to-tweak insights rather than honest, severe problems that could call your marketer’s/designer’s/developer’s professionalism into question.

Reason 5 – Lack of session control

This is without doubt the biggest reason to dislike remote user testing, and it still pushes me to advise offline one-to-one user testing for most clients. In my experience, at least 25% of the remote sessions I have run were ruined by this issue. You can spend days writing a clear scenario and intuitive tasks, but there will always be some users who miss the point of a task, simplify it, or believe they have completed it when they are only halfway through.

It becomes even more of a problem when we consider ‘quiet thinkers’ – users who continually forget to think out loud and describe only what they are doing. In an offline session there is always the opportunity to ask “Why are you doing this?”, “What are you thinking about?” and “What are you planning to do next?”.

And to finish…

Even though the reasons above may sound critical, I still would not recommend excluding online user testing from your toolbox. What I wanted to do with this post is set out some basic factors that should influence your decision on which method to use.

I’ve no argument that, if you are short on time or want to test a small part of your conversion process, an online platform should be your BFF. On the other hand, if the quality of insights is your main priority, consider running the tests yourself. It might sound complicated, but the pay-off will be worth it.

Finally, if this all sounds way too complicated, simply get in touch with us and we will take care of the entire user testing process for you, including analysis and hypothesis preparation.

Happy testing!
