item 5472392

Do bigger images mean improved conversion rates? Three case studies

84 points | scholia | 13 years ago | econsultancy.com

40 comments

orangethirty | 13 years ago
1. Bigger images usually improve click-through rates. Why? There are a number of reasons. An image is used to tell part of the sales message. People relate ideas to images and understand the message you are trying to send better. Also, a lot of people love reading what is under images. As a result, your copy gets read more often. A bigger image amplifies this by turning the landing page into a sort of short story.

2. Big buttons work. They are the call to action, and big calls to action attract a lot of attention.

3. When picking images, be mindful of the market you are selling to. The race of the people in the images is very important. Not because of racism, but because people relate to those who look like themselves. A self-taken photograph usually nets better results than stock photos, even one from a mobile phone.

4. If you are selling a service, then include a photo of the dashboard/main area on the main page. People want to see what it looks like without signing up.

crimsonzagar | 13 years ago
There is an old adage, "a picture is worth a thousand words," for exactly that reason. It helps people scroll less and get the message without having to read too much text.

All one needs to do is consult the artist within and choose the right image for better conversion. That's exactly your point no. 3, like you said, kudos. IMHO, images and their messaging are probably one of the spots where art and math can converge.

villek | 13 years ago
The cases presented don't give much evidence for the impact of bigger images on conversion rates. Only the first case seems to be a direct comparison of images of different sizes. In the other two, the copy and design in general have changed significantly as well. In the second case, the image on the page is actually the same size; only the button has been resized.

In general, though, I do completely agree with the article that the only way to find out what works is to test, test, test.

notahacker | 13 years ago
Examples 2 and 3 as presented in the article couldn't be more horrendous examples of invalid inferences from test results if the author had tried.

It's the cargo cult approach to testing: "let's change everything and if the revamp increases conversions we'll guess what factor had the most effect".

ciex | 13 years ago
Only testing specific examples is not enough. Without testing against a theory, you gain little knowledge from these experiments. I would like to see reports of people testing their beliefs about how web design decisions affect user behaviour, instead of just finding out which design works better for a given site. Of course this is more time-consuming, but it will also yield more valuable insight.

Also, I think it is important to include feedback from actual users in your test results. How did they feel using the different sites? What are the users' feelings about your own concerns (invisible content below the fold, ridiculously big buttons, etc.)?

Without these considerations you are just poking around in the mud with a stick.

dclowd9901 | 13 years ago
> Responsive web designers know that on many sites, as few as 20% of visitors will bother to scroll down.

Is this really an "understood" truth? In our own study for our own site, we found nearly everyone scrolled the page all the time, and that seems to be the case with every colleague I've ever talked to as well. Who goes to a page and says, "fuck it, that scroll wheel is just way too damned difficult"?

ricardobeat | 13 years ago
No idea where they got that number, or what a "responsive designer" is (answers phone calls quickly?). This myth has been debunked long ago.
passionfruit | 13 years ago
I have been surprised to observe the number of older people who still click on the up and down arrows in order to scroll. Moving the cursor onto the scroll arrows takes precision and for someone with poor mouse skills is even more difficult.
Shish2k | 13 years ago
In the case of an online auction, it seems pretty obvious to me that more detailed images of the item being sold will increase interest (I've been scrolling past lots of potentially good deals on graphics cards on eBay recently, because there wasn't a detailed photo of their output sockets...)

In the other cases, it seems more like "a change happened" and "conversion rates went up" - far too many changes at once to imply one specific change caused it :-/

gav | 13 years ago
The better the product content you present to the customer (bigger pictures, different product shots, video, reviews, comparisons, product alternatives, better copy, detailed descriptions, line art with dimensions, etc.), the better the conversion rate.

It's a big differentiator when the same (or equivalent) product is sold by multiple merchants. In my experience, customers are even willing to pay a premium to the merchant with the better content, presumably because they seem more legitimate and less risky.

jordanmessina | 13 years ago
How can you call something a baseline control with completely different messaging and layout than the variation? The last two examples can't even come close to attributing the increase in conversion to larger images. This is just an advertisement for WhichTestWon. Coincidentally, WhichTestWon is the author's employer...
paskster | 13 years ago
I would love to see more case studies on mobile app design. For example, how big should images within a mobile app be to increase user engagement? Especially inside a list view, bigger images can mean fewer items, but might still increase conversion because it looks more appealing.
bennyg | 13 years ago
I think it looking more appealing is the key. Sure, you can fit fewer items, but if people don't want to use it, it doesn't really matter, does it? People are usually more willing to make subconscious tradeoffs when it looks, feels, and works awesomely.
btilly | 13 years ago
Massively improved conversion rates... on what sample size? Can we trust these results?

So try it for yourself. Track results. And make sure you do the statistics right to know if you're looking at signal or noise. (Early noise can be very, very impressive.)
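The "signal or noise" check described here can be sketched as a standard two-proportion z-test, using only the Python standard library; the conversion counts below are made up for illustration, not taken from the article:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Z-test for a difference between two conversion rates.

    conv_a/conv_b: conversions; n_a/n_b: visitors per variant.
    Returns (z, two_sided_p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of no real difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 10% vs 15% looks "massively improved", but on only 100 visitors
# per variant the p-value is around 0.28: indistinguishable from noise.
print(two_proportion_z_test(10, 100, 15, 100))
```

The same 50% relative lift measured over 1,000 visitors per variant does come out significant, which is exactly why the sample size matters as much as the headline percentage.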

nicpottier | 13 years ago
Would be nice to see the actual numbers to judge whether the results are statistically significant.

At first blush all but the last Dell treatment are absolutely horrendous to my eye, so I just don't know how to even judge these.

robomartin | 13 years ago
How would you calculate what would constitute statistically significant results for each of these sites? How about any other site?
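One conventional answer, as a back-of-the-envelope sketch for any site: fix a significance level and power up front, then solve the standard two-proportion formula for the visitors needed per variant to detect a given lift. The baseline rate and lifts below are hypothetical:

```python
import math

def required_sample_size(p_base, lift):
    """Approximate visitors needed per variant to detect an absolute
    conversion-rate lift, at alpha = 0.05 two-sided (z = 1.96) and
    80% power (z = 0.84), using the two-proportion formula.
    """
    z_alpha, z_power = 1.96, 0.84
    p1, p2 = p_base, p_base + lift
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / lift ** 2
    return math.ceil(n)

# Detecting a 2% -> 3% lift takes roughly 3,800 visitors per variant;
# a larger 2% -> 4% lift needs far fewer.
print(required_sample_size(0.02, 0.01))
print(required_sample_size(0.02, 0.02))
```

The smaller the true effect, the larger the sample required before an observed difference means anything, which is why small-sample "massive improvements" deserve suspicion.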
msrpotus | 13 years ago
At a certain point, bigger won't be better, right? What's the maximum effective size? When an image is so big it just takes over your screen?
D-Train | 13 years ago
At some point, it's less about the size than about the quality of the picture. This reminds me of OkCupid (the dating site). They did an analysis of picture quality (and the camera used) to measure their effects on click rates and messages. See http://blog.okcupid.com/index.php/dont-be-ugly-by-accident/

Looking at some really great sites, there are sites with massive pictures, but if the pictures aren't high quality, the site ends up looking cheap.

bicknergseng | 13 years ago
I'd point out that the call to action is also much larger and visible in that Dell case study. The control for that one is pretty poor.
jmount | 13 years ago
Has anyone found a source for any publicly released raw data on this sort of thing? Something one could re-sift and discuss?
tylerdodge | 13 years ago
When I saw this I thought for sure it was an April Fools' joke, but then I saw that it was posted on March 21st.
polskibus | 13 years ago
Does HN filter out April Fools' pranks?