Adaptive A/B Testing

This project implements methods from the paper Stronger Neyman Regret Guarantees for Adaptive Experimental Design. It is built to test and compare adaptive A/B testing techniques: we compare our adaptive, strongly convex, no-variance-regret average treatment effect (ATE) estimation algorithms against the adaptive no-variance-regret algorithm of Dai et al. (2023).
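To make the setting concrete, here is a minimal, self-contained sketch of the general idea behind adaptive designs of this kind: at each round the treatment propensity tracks the Neyman-optimal allocation (proportional to the arms' estimated standard deviations), and the ATE is estimated with inverse-propensity weighting so it stays unbiased under the adaptive assignment. This is an illustrative sketch, not the paper's algorithm; the function name `adaptive_ate`, the clipping floor `p_min`, and the Welford variance tracking are all choices made here for the example.

```python
import math
import random

def adaptive_ate(outcome_fn, T=5000, seed=0, p_min=0.1):
    """Adaptive Neyman allocation with an IPW (Horvitz-Thompson) ATE estimate.

    outcome_fn(arm) returns a noisy outcome for arm in {0, 1}.
    The propensity p_t tracks the Neyman-optimal ratio s1 / (s0 + s1),
    clipped to [p_min, 1 - p_min] so the IPW weights stay bounded.
    """
    rng = random.Random(seed)
    n = [0, 0]            # pulls per arm
    mean = [0.0, 0.0]     # running means per arm
    m2 = [0.0, 0.0]       # running sums of squared deviations (Welford)
    ipw_sum = 0.0
    for _ in range(T):
        # Plug-in standard deviations; default to 1.0 until an arm has data.
        s = [math.sqrt(m2[a] / n[a]) if n[a] > 1 else 1.0 for a in (0, 1)]
        p = min(max(s[1] / (s[0] + s[1]), p_min), 1 - p_min)
        arm = 1 if rng.random() < p else 0
        y = outcome_fn(arm)
        # Welford update for the pulled arm's mean/variance estimates.
        n[arm] += 1
        d = y - mean[arm]
        mean[arm] += d / n[arm]
        m2[arm] += d * (y - mean[arm])
        # IPW term: unbiased for E[Y(1)] - E[Y(0)] given the propensity p.
        ipw_sum += y / p if arm == 1 else -y / (1 - p)
    return ipw_sum / T
```

For example, with arm means 1.0 and 0.0 and a much noisier treatment arm, the allocation drifts toward the treatment arm and the returned estimate concentrates around the true ATE of 1.0.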

Project Structure

  • abtester/: Main library with optimizers and utility functions.
  • scripts/: Scripts for data preprocessing, running experiments, and analysis.

Setup

  • Clone the repository.

  • Navigate to the project directory and install dependencies using pip or Poetry.
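The setup steps above might look like the following; the repository URL is assumed from the project name, and only one of the two install options is needed.

```shell
# Assumed repository URL based on the project name; adjust if needed.
git clone https://github.com/amazon-science/adaptive-abtester.git
cd adaptive-abtester

# Option 1: install with pip (editable install).
pip install -e .

# Option 2: install with Poetry.
poetry install
```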

Usage

  • Preprocess data:

    python scripts/preprocess.py
  • Run experiments:

    python -m scripts.run_experiments
