ADOpy: A Python Package for Adaptive Design Optimization

Yang, J., Pitt, M., Ahn, W.-Y., & Myung, J. (2019). PsyArXiv.

Abstract

Experimental design is fundamental to research, but formal methods for identifying good designs are lacking. Advances in Bayesian statistics and machine learning offer algorithm-based ways to identify good experimental designs. Adaptive design optimization (ADO; Cavagnaro, Myung, Pitt, & Kujala, 2010; Myung, Cavagnaro, & Pitt, 2013) is one such method. It works by maximizing the informativeness and efficiency of data collection, thereby improving inference. ADO is a general-purpose method for conducting adaptive experiments on the fly, and can lead to rapid accumulation of information about the phenomenon of interest in the fewest trials. The technical skills required to use ADO have been a barrier to its adoption. To increase its accessibility to a wide range of researchers, we introduce an open-source Python package, ADOpy, that implements ADO for optimizing experimental design. The package, available on GitHub, is written using high-level modular commands such that users do not have to understand the computational details of the ADO algorithm. In this paper, we first provide a tutorial introduction to ADOpy and ADO itself, and then illustrate its use in three walk-through examples: psychometric function estimation, delay discounting, and risky choice. Simulation results are also provided to demonstrate how ADO-based designs compare with alternative designs (random, staircase).
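To make the idea of ADO concrete, the following is a minimal sketch of one adaptive step for a psychometric-function task: on each trial, choose the stimulus intensity that maximizes the expected information gain (mutual information between response and parameter), then update the posterior with the observed response. This is an illustration of the underlying principle only, not ADOpy's actual API; the grids, logistic form, and slope value are assumptions made for the example.

```python
import math

# Candidate designs (stimulus intensities) and a grid over the single
# unknown parameter (the psychometric threshold). Both grids are
# hypothetical choices for this sketch.
designs = [i / 10 for i in range(1, 10)]   # 0.1 .. 0.9
thetas = [i / 10 for i in range(1, 10)]    # candidate thresholds
prior = [1 / len(thetas)] * len(thetas)    # uniform prior over thresholds

def p_correct(x, theta, slope=10.0):
    """Logistic psychometric function: P(correct | intensity x, threshold theta)."""
    return 1.0 / (1.0 + math.exp(-slope * (x - theta)))

def mutual_information(x, prior):
    """Expected information gain of presenting intensity x under the prior."""
    p_marg = sum(w * p_correct(x, t) for w, t in zip(prior, thetas))
    mi = 0.0
    for y, py in ((1, p_marg), (0, 1.0 - p_marg)):
        for w, t in zip(prior, thetas):
            p = p_correct(x, t) if y == 1 else 1.0 - p_correct(x, t)
            if w > 0 and p > 0 and py > 0:
                mi += w * p * math.log(p / py)
    return mi

def best_design(prior):
    """ADO step: pick the design that maximizes expected information gain."""
    return max(designs, key=lambda x: mutual_information(x, prior))

def update(prior, x, y):
    """Bayesian posterior update after observing response y at intensity x."""
    post = [w * (p_correct(x, t) if y == 1 else 1.0 - p_correct(x, t))
            for w, t in zip(prior, thetas)]
    z = sum(post)
    return [w / z for w in post]

x = best_design(prior)       # optimal stimulus for the next trial
prior = update(prior, x, 1)  # suppose the participant responded correctly
```

Repeating the choose-observe-update loop is what concentrates the posterior quickly, which is the sense in which ADO accumulates information "with the fewest trials" relative to random or staircase designs.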