Websites are using hidden tricks to make you click or buy without realising, and the way they do it can be both baffling and controversial, finds Chris Baraniuk.
The internet is one big experiment, and you’re part of it. Every day, millions of trials are manipulating what you see when you browse online, to find out how to keep your attention, make you click more links – and spend more money. And these experiments are often secret. You’ll probably never know you were part of them.
This is all thanks to a technique now well known in the tech industry, called A/B testing. It means that the web pages served to you are not necessarily the same as those shown to the next person – they might have slightly different colours, an alternative headline or, on social networks, you could be shown different personal information about your friends and family.
What started as a way to tweak website design is becoming increasingly controversial – in the most divisive cases, A/B testing can help companies sway people’s mood or even their love life. This summer, it emerged that Facebook used the technique to experiment on users, without their knowledge, in an effort to influence their emotions. And more recently came the revelation that dating network OkCupid lied to some of its users about their suitability to be “matched” romantically with another member of the site. The company was hooking people up with unsuitable potential partners, and then tracking their interactions.
So, at what point does all this experimentation become outright manipulation?
The phenomenon of A/B testing began as a relatively benign, even mundane, way of improving websites. It’s largely used for something called “Conversion Rate Optimisation” (CRO) – the practice of nudging up the proportion of visitors who do whatever a site wants them to, whether that’s clicking a link, signing up or buying something. What has made it so powerful, however, is that sometimes it throws up results that nobody would have predicted.
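The mechanics behind most of these experiments are surprisingly simple. The sketch below (in Python, with entirely hypothetical names – it isn’t any real site’s code) shows the basic shape: visitors are split at random between two versions of a page, and the site simply counts which version converts more of them.

```python
# A minimal sketch of an A/B test, with hypothetical names; real systems are
# far more elaborate, but the shape is the same.
import hashlib
from collections import defaultdict

impressions = defaultdict(int)   # how many visitors saw each variant
conversions = defaultdict(int)   # how many of them went on to click, sign up or buy

def assign_variant(user_id: str) -> str:
    """Deterministic 50/50 split: the same visitor always sees the same variant."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def record_visit(user_id: str, converted: bool) -> None:
    variant = assign_variant(user_id)
    impressions[variant] += 1
    if converted:
        conversions[variant] += 1

def conversion_rate(variant: str) -> float:
    seen = impressions[variant]
    return conversions[variant] / seen if seen else 0.0
```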
Earlier this year, for example, a Google executive revealed that using a different shade of blue for advertising links on search result pages caused more people to click on those links, boosting the company’s revenue by $200m.
Similarly, travel site TripAdvisor has used A/B testing to discover that certain colours draw some people in more than others. If people have arrived on a TripAdvisor page from a Google advert, for example, they’re more likely to click on a blue button. Other users navigating from within the TripAdvisor site, however, prefer yellow.
Some results are even more baffling. The dental referral service 1-800-DENTIST, for example, recently trialled a variety of photos on its website to see which drew visitors in most. The winner showed a dentist with his hand on a female patient’s shoulder – unexpected, because it’s something a dentist would never actually do with a patient, says Dan Siroker, CEO of A/B testing company Optimizely, which advised 1-800-DENTIST.
As Stewart Ulm, director of engineering at travel search engine Kayak, puts it, sometimes you don’t know why an A/B result works – it just does. “We try to come up with theories to explain the results that we get, but when we’re doing our experiments with just pure statistical analysis we never really know for sure.”
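That “pure statistical analysis” usually boils down to a standard significance test. As a rough illustration – with made-up numbers, not anyone’s actual data – a two-proportion z-test asks how likely the gap between two variants’ conversion rates would be if it were down to chance alone:

```python
# A two-proportion z-test on two variants' conversion counts. The figures in
# the example call are invented purely for illustration.
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided, normal approximation
    return z, p_value

z, p = two_proportion_z_test(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p-value suggests the gap isn't just luck
```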
Most examples of A/B testing are relatively harmless. However, in recent months, use of the technique has generated something of a backlash.
For starters, A/B testing has begun to affect the news you read – and not necessarily for the better. Sites like Slate and Upworthy, for example, often test up to 25 headlines using specially designed software to see which performs best. This leads to headlines such as: “They Had A Brilliant Idea To Give Cameras To Homeless People. And Then The Cameras Got Turned On Us”. These can receive huge attention online and be widely shared, but are frequently derided as misleading “clickbait” because the articles or videos they point to often turn out to be a disappointment.
Now that companies like Facebook are experimenting with our emotions, and using falsehood to encourage engagement, many feel that A/B testing has entered a troubling new era. The OkCupid experiments exposed how simple A/B tests can directly impact your life. In addition to hooking up mismatched people on dates, OkCupid recently carried out another test in which the subject lines of notification emails were changed for a portion of the site’s members.
False match
As OkCupid co-founder Christian Rudder explains, one group received a subject line in this format: “You have a new message from [sender’s name]”, while the other group received this: “New message from [sender’s name]”, plus a tiny envelope symbol. The second version performed 4% better. On a website with over 30 million users, 4% is not insignificant.
Imagine if that 4% related to a test on one million users. “That’s 40,000 clicks that never would have happened,” comments Rudder. “And in that 40,000, maybe 10,000 people will start a conversation who never would have spoken. And maybe 1,000 of them will actually go on a date – just because of this random experiment.”
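Rudder’s arithmetic is easy to reproduce. The snippet below simply restates the figures from his quote – the 4% lift and the one-in-four, one-in-ten drop-offs are his illustrative estimates, not measured results:

```python
# Rudder's back-of-envelope funnel, using only the figures from his quote above.
users = 1_000_000
extra_clicks = int(users * 0.04)           # "40,000 clicks that never would have happened"
conversations = int(extra_clicks * 0.25)   # "maybe 10,000 people will start a conversation"
dates = int(conversations * 0.10)          # "maybe 1,000 of them will actually go on a date"
print(extra_clicks, conversations, dates)  # 40000 10000 1000
```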
As the precision of these tests becomes greater, there are those who worry that ethics may be thrown out of the window in the pursuit of short-term engagement and creeping profits. Harry Brignull is a design consultant who keeps an online catalogue of “dark patterns” – design tricks which, he says, unethically manipulate users into doing things they wouldn’t otherwise do. This could be anything from signing up to subscription services to agreeing that their personal data can be shared with third parties.
Many of these tricks take the form of confusingly worded forms during sign-up procedures, or checkboxes that look innocuous but in fact sign away your privacy in a single click. How does a company know these methods really work? A/B tests, of course.
“If a designer says design A makes this much money, but design B makes even more, when you look at it with those sorts of financials against it, it can be quite hard not to use all these little tricks – especially when your competitors use them anyway,” explains Brignull.
A/B aficionados like Optimizely’s Siroker insist that companies will, in the long-term, only damage their reputation and business success by acting unethically. But as Brignull points out, techniques for tricking people into buying things have been around for millennia, and plenty of people have profited from them through the ages.
To some extent, we have been here before: in the 20th Century, fears grew over the manipulative effects of advertising, leading to regulation of the industry. Perhaps such oversight will face A/B testers in the future.
Unfortunately, the line between a “dark pattern” and a design change that makes a website better for both users and the business is sometimes difficult to draw. But for Christian Rudder of OkCupid, the accusation that “lies” had been fed to unsuspecting users misses the fact that the site’s experiments are ultimately all to the greater good.
“We are doing something that is purely additive to the happiness in the world,” he insists. “We’re taking somebody who is single and we’re finding them a relationship, or a casual relationship, or a marriage. There’s no downside to anyone involved there. This isn’t a zero-sum game for us.”
Still, we’re all now becoming increasingly aware of the internet’s secret experiments to find out what makes us click, and they will continue whether we like it or not.