A/B testing: how to measure the effectiveness of changes on your website
When a business owner says, “We updated the design,” the next question should always be, “Did it work?” A nice button, a new photo, or an updated form are just hypotheses that, without verification, remain “at first glance” ideas. A/B testing is the most reliable way to see what actually works and what doesn’t. In this article, we’ll explain how one site increased conversions by 27% using simple A/B tests, and why testing changes isn’t a marketing whim but a business necessity.
Blind design change: when “better” means worse
One client, an online furniture store, decided to update its homepage: a new banner, new buttons, less text. Visually it became “cleaner” and “more stylish.” But within a week, the number of orders dropped.
At first glance, everything should have worked better: the page loaded faster and felt more responsive. But a retrospective A/B test run by the Glyanets team showed that the new version converted 18% worse. The reason? The block with three key delivery benefits, the main purchase trigger, had been removed from the main banner.
How A/B testing works: just the basics
The principle is simple: the audience is split into two parts. One part sees the old version of an element (A), the other sees the new version (B). For example (a minimal code sketch of such a split follows the examples):
an old form with 5 fields vs. a simplified form with 3 fields;
the “Add to Cart” button in red vs. in green;
banner copy saying “10% discount” vs. “Free shipping”.
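To make the split concrete, here is a minimal sketch of one common approach: deterministic assignment, where a visitor’s stable identifier (a cookie or user ID) is hashed so the same person always lands in the same variant. The function and experiment names are illustrative, not the API of any specific testing tool.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "homepage-banner") -> str:
    """Deterministically assign a visitor to variant A or B.

    Hashing (experiment name + user_id) keeps the split stable:
    the same visitor sees the same variant on every repeat visit.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # bucket in 0..99
    return "A" if bucket < 50 else "B"      # 50/50 split

print(assign_variant("visitor-42"))  # always the same answer for this visitor
```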
Data is collected in real time: who clicked, who added an item to the cart, who placed an order. Once each variant has gathered enough traffic, a statistical comparison shows which version performs better. These are not assumptions, but numbers.
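How do you know the difference is real and not random noise? A standard approach is a two-proportion z-test on the conversion rates. Below is a hedged sketch with made-up counts (the numbers are illustrative, not from the case above).

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return p_a, p_b, z, p_value

# Illustrative counts: 120 orders out of 4000 visits on A vs 158/4000 on B
p_a, p_b, z, p = two_proportion_z(120, 4000, 158, 4000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}, p-value = {p:.3f}")
```

A p-value below the conventional 0.05 threshold is the usual basis for calling the difference statistically significant rather than luck.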
What metrics are important to track during A/B tests?
Not every change moves the same numbers, so you need to know what to measure. The most common metrics are (a short calculation sketch follows the list):
CTR (click-through rate) — how often a button or banner is clicked;
Conversion rate — whether the change led to more orders;
Time on page — whether the page became more engaging for the user;
Bounce rate — whether the change drove away part of the audience;
Average order value — whether the change affected how much a purchase is worth.
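As promised, a quick sketch of how these metrics fall out of raw event counts (time on page usually comes straight from an analytics tool such as GA4; all counts and the currency here are illustrative, not real data):

```python
# Illustrative raw counts for one variant over the test period
impressions = 10_000    # visitors who saw the banner/button
clicks      = 430       # visitors who clicked it
sessions    = 10_000
bounces     = 4_100     # sessions that ended after a single page
orders      = 120
revenue_uah = 264_000   # total revenue from those orders (currency assumed)

ctr             = clicks / impressions   # click-through rate
conversion_rate = orders / sessions      # sessions ending in an order
bounce_rate     = bounces / sessions
avg_order_value = revenue_uah / orders   # the "average check"

print(f"CTR {ctr:.1%} | conversion {conversion_rate:.1%} | "
      f"bounces {bounce_rate:.1%} | AOV {avg_order_value:,.0f} UAH")
```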
For example, one test showed that changing a button from orange to dark blue reduced clicks but increased the average order value: there were fewer idle clicks, and the visitors who did click were more likely to buy expensive items.
A/B testing in practice: what it looks like in real life
One example is a change to a product description. Instead of the standard “100% cotton, made in Ukraine,” the team tested “Soft natural fabric. Ideal for summer. Made in Ukraine.” Neither the design nor the price changed, but the new description produced 12% more add-to-cart actions.
Another test concerned the placement of a sale banner. In the old version it sat at the bottom of the page; in the new version it appeared right after the header. The CTR doubled.
Why test constantly: the effect lasts for months
A/B testing is not a one-time event. It is a process that produces consistent improvements. One test yields +5%, another +3%, and so on. In six months, a website can grow by 30–40% in sales without any additional advertising budget. Simply by making the right hypotheses and testing each change.
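The six-month figure is simple compounding: successive wins multiply rather than add. A tiny illustration with hypothetical monthly lifts (the percentages are made up for the example):

```python
from math import prod

# Hypothetical monthly lifts from winning tests: +5%, +3%, +6%, +4%, +5%, +6%
monthly_lifts = [1.05, 1.03, 1.06, 1.04, 1.05, 1.06]

growth = prod(monthly_lifts) - 1
print(f"Compound growth over six months: +{growth:.0%}")  # ≈ +33%
```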
And most importantly: it protects against “subjective decisions” — when a manager wants “a yellow button because it’s trendy.” A/B testing gives the answer in numbers, not emotions.
How Glyanets implements A/B testing in practice
Glyanets approaches A/B testing systematically. We:
form hypotheses based on GA4 and Hotjar analytics;
implement the variants (design, copy, blocks) through dedicated testing tools or custom code;
collect the data and analyze it in Looker Studio;
present the results to the client in a clear format;
launch only the changes that genuinely improve the metrics.
We don’t change the design “because it’s better.” We test, prove, and implement what works. This is a results-oriented approach.