Engel & Völkers has had its own A/B testing tool since the end of 2017, developed together with the technology provider freiheit.com.
With this digital tool, the performance of one or more landing pages can be measured and improved accordingly. This is done on the basis of predefined KPIs, which at the same time set the goals of the pages to be tested. In these performance comparisons of two or more pages, the focus is usually on usability: either existing elements on the website are placed differently, or new elements are added (such as CTAs or contact forms). Visitors are directed to the respective main or sub-pages according to a fixed percentage weighting of the variants, which is controlled by cookies.
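The weighted, cookie-controlled assignment described above can be sketched roughly as follows. This is an illustrative example, not the tool's actual implementation; the interface and function names are assumptions, and a plain `Map` stands in for the browser's cookie store.

```typescript
// Illustrative sketch of weighted, cookie-sticky variant assignment.
// Names (Variant, pickVariant, assignVariant) are hypothetical, not from the tool.
interface Variant {
  name: string;
  weight: number; // percentage share; all weights in a test should sum to 100
}

// Pick a variant at random according to its percentage weight.
function pickVariant(variants: Variant[]): Variant {
  const roll = Math.random() * 100;
  let cumulative = 0;
  for (const v of variants) {
    cumulative += v.weight;
    if (roll < cumulative) return v;
  }
  return variants[variants.length - 1]; // guard against rounding error
}

// Reuse the variant stored in the cookie jar so a returning visitor
// always sees the same page; otherwise pick and persist a new one.
function assignVariant(
  testId: string,
  variants: Variant[],
  cookieJar: Map<string, string>
): string {
  const key = `ab-${testId}`;
  const existing = cookieJar.get(key);
  if (existing && variants.some((v) => v.name === existing)) return existing;
  const chosen = pickVariant(variants).name;
  cookieJar.set(key, chosen);
  return chosen;
}
```

In a real browser the chosen variant would be written to `document.cookie` (or set server-side) instead of a `Map`, but the weighting logic stays the same.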
During the development of the tool, we made sure to use existing components, such as microservices. Our WebCMS communicates with one of these microservices via the gRPC protocol to retrieve the current tests. Additionally, the tool had to fit into the new cloud environment, which is part of the digital transformation at Engel & Völkers.
Another challenge was integrating the control panel into the existing WebCMS. The control panel is used to create and manage all of our tests. For the front-end of the new A/B testing tool we used the React framework, which is also used by companies such as Netflix and Airbnb. The evaluation of all collected data is handled by Kibana tracking.
A possible test scenario could look like this:
Subject group A is shown a specific design of a page, usually the control variant, while group B is presented the same website with a different design. The distribution in this case would be 50/50. The more variants are tested against each other, the more granular the percentage distribution becomes; the pages can be shown in equal or unequal shares. Performance can be evaluated via click and conversion rates, for example, but also via dwell time or submitted forms. With these simple measurement methods it is easy to find out which variant performs best.
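As a minimal sketch of such an evaluation, the variants can be compared by their conversion rate (e.g. submitted forms per visitor). The data structure and helper names below are assumptions for illustration, not part of the tool:

```typescript
// Hypothetical per-variant tracking numbers, e.g. aggregated from Kibana.
interface VariantStats {
  name: string;
  visitors: number;
  conversions: number; // e.g. CTA clicks or submitted forms
}

// Conversion rate as a fraction of visitors (0 when there is no traffic yet).
function conversionRate(s: VariantStats): number {
  return s.visitors === 0 ? 0 : s.conversions / s.visitors;
}

// Return the variant with the highest conversion rate.
function bestVariant(stats: VariantStats[]): VariantStats {
  return stats.reduce((best, s) =>
    conversionRate(s) > conversionRate(best) ? s : best
  );
}
```

The same pattern works for other KPIs such as dwell time; only the metric function changes.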
For the most meaningful result, it should be noted when creating a test that only one element per page may be changed at a time. Only then can it really be said whether the change has influenced the original state and thus the result.
The variants are always defined in advance with the help of hypotheses, so the results can be verified afterwards. In order to obtain valid results, the test duration, the amount of traffic and the participating groups of people are of elementary importance. The aim of A/B testing is always to optimise a product or service in a way that increases the number of users and continuously improves website performance.
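A common way to check such a hypothesis (a standard statistical technique, not something the article describes for this tool) is a two-proportion z-test: it tells you whether the difference between two conversion rates is large enough, given the traffic, to be unlikely to be pure chance.

```typescript
// Two-proportion z-test for conversion rates (standard formula, illustrative use).
// convA/convB: conversions, nA/nB: visitors per variant.
function zScore(convA: number, nA: number, convB: number, nB: number): number {
  const pA = convA / nA;
  const pB = convB / nB;
  // Pooled conversion rate under the null hypothesis "both variants are equal".
  const pooled = (convA + convB) / (nA + nB);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / nA + 1 / nB));
  return (pB - pA) / standardError;
}
```

A |z| above roughly 1.96 corresponds to significance at the usual 5% level; this also makes concrete why test duration and traffic volume matter, since small samples rarely reach that threshold.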
We will keep you updated on the conclusions we draw from our A/B tests!