
13 Surprising Results from UI A/B Testing by the Experts

What are the go-to methods for A/B testing UI elements on your website? Learn from 13 business leaders as they share their techniques and the surprising insights gained through A/B testing.

  • Rely on Data Analytics
  • Identify a Single Variable to Test
  • Segment Users Based on Device Type
  • Reduce Cognitive Load for Users
  • Test One Major UI Element
  • Analyze Secondary Metrics for Insights
  • Use Hotjar with Google Analytics
  • Test Impact of Removing Extra Images
  • Experiment with Small UI Changes
  • Consider Emotional and Cognitive Impact
  • Optimize CTA Button Placement
  • Test Specific Call-to-Action Text
  • Combine User Feedback with Hands-On Testing

Surprising Findings from UI Experts

Rely on Data Analytics

Relying on data analytics is my go-to method for A/B testing various UI elements on our website at Trusted Wedding Gown Preservation. For instance, we recently ran an A/B test for our “Call-to-Action” button, varying its color from traditional bridal white to a more vibrant lavender purple. I hypothesized that the conventional white-button variant (A) would outperform the experimental purple-button variant (B). To my surprise, the purple button triggered a 15% increase in form submissions, suggesting that a pop of color can stimulate a better response in our demographic.

Ensuring data-driven choices, not personal biases, influence your A/B testing decisions is a critical learning I gained from this experience. This approach has significantly improved our web interface, offering a better user experience while also enhancing customer engagement and conversion rate.

Mary Lopez
Marketing Manager, Trusted Wedding Gown Preservation


Identify a Single Variable to Test

I start by identifying a single variable to test, such as a call-to-action button, headline placement, or navigation layout, so it’s clear what impact each change has. Tools like Google Optimize or Optimizely make it easy to set up and analyze these tests, especially when paired with clear success metrics like click-through rates or time-on-page.

One test that yielded surprising results was for a SaaS company’s pricing page. We tested a traditional grid layout for pricing tiers against a more narrative-style approach, where the pricing options were embedded within a scrolling story about how customers succeed with their product. I expected the grid to perform better because it was concise and familiar, but the narrative design led to a 35% increase in conversions. It turned out that the additional context helped users feel more confident about their choice, even though it required more scrolling. This taught me that sometimes breaking the mold of traditional design can resonate more deeply with users, as long as it supports their decision-making process.

Eric Hebert
Lead Web Mechanic, Jersey Devil Web Design & Support


Segment Users Based on Device Type

While building our WordPress plugins, I learned that small UI tweaks can have huge impacts. I use VWO for testing because it lets me segment users based on their device type and previous behavior. Recently, I moved our feature comparison table above the fold on mobile devices, thinking it would help conversions, but it actually decreased them by 23%. It turns out our users preferred seeing social proof first.

Joshua Odmark
CIO and Founder, Local Data Exchange


Reduce Cognitive Load for Users

We conduct A/B testing to discover how incremental adjustments can transform user behavior by reducing cognitive load. For instance, we wanted to test our dashboard’s default view on the homepage for returning users. Originally, it showed a detailed breakdown of their portfolio, but we noticed that users with large portfolios took too long just to click through to the page they actually wanted. Our theory was that a personalized summary of the most meaningful information, such as daily updates or notifications, would encourage users to act more quickly. To test this, we ran two versions: one with the existing detailed view, and one with a compact summary and clear navigation prompts.

The results revealed something unexpected. The personalized summary drove a 150% improvement in user satisfaction scores, but what really turned things around was a 25% increase in the use of our more advanced tools. With less cognitive effort spent parsing their portfolios up front, users were free to explore and make their own decisions. This showed how crucial designing for cognitive clarity is, especially for platforms containing sensitive data. If your team is considering A/B testing, I’d recommend scouring your interface for the places where users hesitate or feel overwhelmed.

Thomas Franklin
CEO, Swapped


Test One Major UI Element

Testing one major UI element against multiple subtle variations, rather than testing several elements simultaneously, yields the most actionable insights. This focused approach has helped us make data-driven decisions about our website design without confounding variables.

A recent test of our service/product pages revealed an unexpected insight that challenged our assumptions. We tested our standard “Request a Quote” button against three variations with different micro-copy and design elements. The surprising winner wasn’t our beautifully designed primary CTA, but rather a simple text link reading “See Pricing Examples” placed below the main button. This variant increased click-through rates by 45% and led to a 30% improvement in qualified lead generation. The data suggested that visitors were more comfortable exploring pricing information before committing to a direct conversation.

Effective A/B testing isn’t about validating your existing beliefs—it’s about being open to surprising results that challenge your assumptions. Start with a clear hypothesis, test one significant element at a time, and run tests long enough to gather statistically significant data. This methodical approach helps separate real insights from random fluctuations.
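To make that last point concrete, here is a minimal Python sketch of one common way to separate real insights from random fluctuations: a two-proportion z-test on conversion counts. The counts below are invented for illustration, not data from the test described above.

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)             # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    return erfc(abs(z) / sqrt(2))                        # 2 * (1 - Phi(|z|))

# Hypothetical counts: 3.0% vs. 4.35% conversion over 4,000 visitors each.
p = two_proportion_z_test(conv_a=120, n_a=4000, conv_b=174, n_b=4000)
print(f"p = {p:.4f}")  # well under 0.05, so unlikely to be random noise
```

A small p-value only says the difference is unlikely to be chance; collecting enough traffic for the test to reach that point is what “running tests long enough” buys you.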

Aaron Whittaker
VP of Demand Generation & Marketing, Thrive Digital Marketing Agency


Analyze Secondary Metrics for Insights

We A/B tested a modification to our “Meet Our Experts” page. Option A presented a simple text list of specialists and their names. Option B added a short video introduction for each expert. The initial findings looked like a wash: conversion rates (client inquiries) were the same for both options. But a closer examination yielded intriguing insights. We examined confidence intervals to make sure that what we saw was not accidental, and then we looked at secondary metrics. Option B, with the video introductions, significantly boosted session length and site usage. People were spending time getting to know our experts even when it didn’t translate into an inquiry right away.

It was this unanticipated outcome that shaped our thinking. Conversions were unchanged, but the introduction videos connected people with our experts and presumably helped prospects make more informed choices. Segmenting our audience by traffic source added more nuance to the data. For organic-search users (actively looking for a film provider), opt-in rates were much higher with Option A, the text list. But social-media referrals (presumably unfamiliar with FilmFolk) were more interested in Option B’s intro videos.
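To illustrate the two checks described above, here is a small Python sketch, with invented segment names and counts, that computes a normal-approximation 95% confidence interval for the conversion-rate difference within each traffic segment.

```python
from math import sqrt

def diff_ci_95(conv_a, n_a, conv_b, n_b):
    """Approximate 95% confidence interval for (rate_B - rate_A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return (p_b - p_a) - 1.96 * se, (p_b - p_a) + 1.96 * se

# Hypothetical counts: (conversions_A, visitors_A, conversions_B, visitors_B)
segments = {
    "organic_search": (90, 1500, 60, 1500),
    "social_media": (30, 1200, 55, 1200),
}
for name, counts in segments.items():
    lo, hi = diff_ci_95(*counts)
    verdict = "likely real" if lo > 0 or hi < 0 else "could be chance"
    print(f"{name}: B - A in ({lo:+.3f}, {hi:+.3f}) -> {verdict}")
```

When the interval excludes zero in opposite directions for different segments, you get exactly the pattern described here: the same test favoring different variants for different audiences.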

This experience demonstrates the need for comprehensive A/B-testing analysis. By looking beyond the initial findings, we gathered valuable user data. That enabled us to make data-based changes and deliver a website that not only attracts visitors but also builds trust and relationships through storytelling.

Andrew Cussens
Digital Marketing Specialist | Founder & CEO, FilmFolk


Use Hotjar with Google Analytics

For me, the go-to A/B testing method is to use Hotjar with Google Analytics. Heatmaps and session recordings from Hotjar show user behaviour, and Google Analytics tells us about bounce rates, session times, and conversions.

Using this method yielded a surprising A/B test result:

We were testing the colour and position of a CTA button. Variant A was green and centred; Variant B was a red button placed slightly lower on the page. The lower, red button produced a 15% increase in conversions. Hotjar recordings showed that users read a few lines before engaging with the button, which is where aligning CTA placement with the natural reading flow pays off.

This result confirmed that small changes in colour and placement, based on user behaviour, can have a big impact on engagement.

Mei Ping Mak
Director of SEO and Web, Website Design Asia


Test Impact of Removing Extra Images

We use Omniconvert for our A/B testing, but there are a lot of other great A/B testing tools. We like that this one can be easily applied to any site (no matter the CMS) and that the results can be interpreted using either Bayesian or frequentist algorithms. We find the Bayesian framework for A/B testing to be simpler, less restrictive, and more reliable for interpreting test results.
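As a rough illustration of that Bayesian reading (not Omniconvert’s own implementation), here is a short Python sketch with invented counts: each variant’s conversion rate gets a Beta posterior, and Monte Carlo sampling estimates the probability that variant B beats variant A.

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, draws=100_000):
    """Estimate P(rate_B > rate_A) under uniform Beta(1, 1) priors."""
    wins = 0
    for _ in range(draws):
        rate_a = random.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = random.betavariate(1 + conv_b, 1 + n_b - conv_b)
        wins += rate_b > rate_a
    return wins / draws

# Hypothetical counts: B converts 4.28% vs. A's 4.00% over 5,000 visitors each.
print(prob_b_beats_a(conv_a=200, n_a=5000, conv_b=214, n_b=5000))  # ~0.76
```

Instead of a pass/fail p-value, the output is a direct statement, roughly “B has a 76% chance of being better,” which is the simpler interpretation the Bayesian framework offers.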

We’ll occasionally test major layout changes, but when possible, we like to change one element at a time so we can isolate the impact of individual elements on a page. Test duration varies with a site’s monthly traffic, but one month is our minimum length for A/B tests.
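As a back-of-the-envelope companion to that duration rule, here is a short Python sketch of the standard sample-size formula for comparing two proportions at 5% significance and 80% power; the baseline rate and minimum detectable effect below are assumptions for illustration.

```python
from math import ceil

def visitors_per_variant(baseline, mde, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect an absolute lift."""
    p = baseline + mde / 2            # average rate across the two variants
    return ceil(2 * (z_alpha + z_beta) ** 2 * p * (1 - p) / mde ** 2)

# Hypothetical goal: detect a lift from a 3% to a 4% conversion rate.
print(visitors_per_variant(baseline=0.03, mde=0.01))
# ~5,300 per variant; at ~150 visitors per variant per day, about five weeks
```

A calculation like this is why the shortest sensible test length depends so directly on a site’s monthly traffic.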

We have found some surprising results with A/B testing. I think one particularly surprising result was related to images on pages and blog posts. There’s a common school of thought that users need imagery scattered across a page to remain engaged with the page. The best practice we’ve heard reiterated across the web is to include at least one image per scroll. We had been doing this, but we wanted to test the impact of this best practice. We ran an A/B test where the B variation of our blog template only had one featured/hero image up top and no other images in the content.

What we found was that conversion rates increased by 7% when the extra images were removed. We ran this test on several sites, all of which yielded similar results. Our interpretation of the findings is that users visit pages to receive information, which they get through the text content.

Images typically don’t add information; they give users something they have to scroll past to continue reading. Removing them helped users skim the content faster to get what they needed, and helped more users get far enough through the content to reach the CTAs scattered throughout our posts.

Shaun Tyndall
Founder / Director, Inclind


Experiment with Small UI Changes

I use Optimizely. It allows me to create and track variations of key elements like buttons, navigation menus, or landing-page layouts. I typically start by testing elements that could directly impact conversions, such as call-to-action buttons or form designs. One surprising test I ran involved changing the color of a call-to-action button on a landing page from blue to orange.

While the original blue button had performed well, the orange button resulted in a 25% increase in click-through rates. This was surprising because the blue was aligned with the brand’s primary color, but the contrast of the orange drew more attention and created a sense of urgency. This test reinforced the importance of regularly experimenting with small UI changes to improve user engagement.

Kristin Marquet
Founder & Creative Director, Marquet Media


Consider Emotional and Cognitive Impact

When A/B testing UI design elements, I make sure to pay attention to how design changes make people feel, both emotionally and cognitively. Looking only at click-through or conversion-rate metrics makes it easy to miss the finer details of how users are engaging and what their intent is.

For instance, we tried placing a large CTA button on our website. We initially put the button at the top of the page because we thought users would naturally see it first. Yet when we moved it down below the fold, we saw an unexpected uptick in clicks. The lower placement let users digest the content before the button asked them to respond.

The lesson I took from this is that the user’s experience does not happen in a single moment, and forcing an immediate decision can be frustrating. Offering a more gradual pathway to engagement invites a more deliberate decision-making process.

Anders Bill
Cofounder/CPO, Superfiliate


Optimize CTA Button Placement

At Mail King USA, A/B testing UI elements is a crucial part of optimizing user experience and improving conversions. A common method involves testing variations of key design elements like call-to-action (CTA) buttons, page layout, or images. For instance, we once tested two different CTA button placements on our homepage: one above the fold, the other towards the bottom of the page. The result was surprising: the lower-placed button performed better, which highlighted the importance of user flow and how visitors interact with different content on the page.

This test provided valuable insights, showing that users were more likely to engage with the CTA after reviewing the content, rather than at first glance. Understanding these subtle behavior patterns allows us to continuously refine and improve our website, driving higher engagement and conversion rates.

Steve Neher
CEO, Mail King USA


Test Specific Call-to-Action Text

My go-to method for A/B testing UI elements on my website is using Google Optimize or Optimizely. I typically start by testing one element at a time, whether it’s a CTA button, a headline, or a form layout, to see how small changes impact conversion rates.

One example that yielded surprising results was testing two variations of a CTA button: one with the text “Get a Free Consultation” and one with “Schedule Your Free Consultation.” The change was subtle, but the “Schedule Your Free Consultation” version resulted in a 15% higher conversion rate. It showed me that a more specific call to action resonated better with users and drove more action.

Kyle Morris
CEO / Founder, LawTurbo


Combine User Feedback with Hands-On Testing

I’ve found that over-reliance on automated A/B testing often leads to misleading conclusions. Instead of splitting traffic and analyzing data points, I get better results from direct user feedback combined with hands-on testing.

For example, when revamping a landing page’s call-to-action section, we created two versions and had real users try both while we watched their interactions. The surprising part wasn’t the final choice, but how differently various user segments behaved—traffic from social media had completely different preferences than search visitors.

This human-first approach revealed nuances that traditional split testing would have missed. Now I focus on qualitative feedback first, using analytics data as a secondary validation rather than the primary decision driver. It’s less “scientific” on paper, but produces more reliable real-world results.

Vincent Schmalbach
Web Developer, AI Engineer & SEO Expert, Vincent Schmalbach

