A/B testing - also known as bucket testing - allows an organization to evaluate the performance and impact of new features implemented on its website by exposing a small fraction of visitors to them.
In this paper, we propose a novel methodology that can reveal ongoing bucket testing and the various features being tested. To evaluate the effectiveness of our proposed methodology, we first tested the homepages of seven popular websites. We discovered that four of them were actively performing bucket testing during our experiments, and we successfully spotted the different features being tested. Moreover, to investigate the factors that might affect bucket testing, we set up another experiment. Here, we requested web pages from different browsers and recorded several features of the server's response, e.g., the cookies set by the server, the IP address and port of the responding server, and the response time. We observed variations in response time across browsers, which suggests that the type of user agent plays an important role. Finally, we showcase the captured bucket-elements and release our dataset, which can serve as ground truth for future investigations in this direction of research.
Mauro Conti, Ankit Gangwal, Sarada Prasad Gochhayat, Gabriele Tolomei.
Spot the Difference: Your Bucket is Leaking - A Novel Methodology to Expose A/B Testing Effortlessly.
In Proceedings of the 4th IEEE Workshop on Security and Privacy in the Cloud
(IEEE CNS 2018 workshop: SPC 2018), pages 1-7, Beijing, China, May 30, 2018.
DOI: 10.1109/CNS.2018.8433122, ISBN: 978-1-5386-4586-4.