Bwin Germany
GDPR Opt-in Email - A/B Test
Overview
In the context of betting apps and websites, GDPR compliance was a strategic priority, not only from a regulatory and risk perspective but also from a user experience standpoint. Given the handling of sensitive data, such as financial and behavioral information, ensuring clear, accessible, and well-contextualized consent was essential to reduce uncertainty and increase the user's sense of control. This focus on transparency and predictability directly supports trust-building within a naturally high-stakes journey, impacting engagement, retention, and perceived platform credibility.
A/B testing was the most effective approach: it enabled direct comparison between different versions of the GDPR opt-in email in a real environment, capturing user behavior at scale and measuring direct impact on conversion rates, something traditional usability testing cannot do with the same level of accuracy.
Based on these insights, it was possible to reduce friction across the journey and optimize the experience through adjustments in option placement, message clarity, and visual hierarchy, ultimately improving comprehension, trust, and conversion.
Challenge
Alongside the classic A/B test, in which different opt-in variations were exposed to separate user groups, a complementary test was conducted focusing on experience quality, using the UserTesting app.
This platform enables the recruitment of real users who interact with the proposed flow while verbalizing their perceptions, doubts, and decisions in real time. Through guided tasks, it becomes possible to observe how users interpret the content, where they encounter friction, and how they make decisions throughout the journey.
While A/B testing provided quantitative evidence around performance and conversion, UserTesting added a critical qualitative layer, helping explain the "why" behind user behavior and making the results more robust and actionable from a UX perspective.
Results
To define the winning version, we combined quantitative learnings from the A/B test with qualitative insights gathered through UserTesting.
Quantitative:
From a performance standpoint, the winning version was the one that achieved the strongest results across the defined KPIs, particularly opt-in conversion rate, while also ensuring statistical consistency.
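To illustrate what "statistical consistency" means in practice for an opt-in conversion comparison, the sketch below runs a standard two-proportion z-test between two email variants. The numbers are purely hypothetical (they are not the actual Bwin test figures), and this is a minimal stdlib-only illustration, not the analysis pipeline used in the project.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Compare opt-in conversion rates of two email variants.

    conv_* : number of users who opted in for each variant
    n_*    : number of users who received each variant
    Returns the z statistic and the two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: variant B lifts opt-in from 30% to 33%
# on 10,000 sends each.
z, p = two_proportion_z_test(3000, 10_000, 3300, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A version would only be declared the quantitative winner when this kind of test shows a lift unlikely to be noise (conventionally, p below 0.05) across the defined KPIs.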
Qualitative:
We analyzed message clarity, level of user comprehension, friction points, and direct user feedback. Versions that triggered confusion, hesitation, or ambiguous interpretations were discarded, even when their performance was close.
The final decision, therefore, was not based solely on numerical performance, but on the version that delivered a more seamless, understandable, and trustworthy experience—ensuring sustainable results over time.