Amortized Variational Bayesian Regression


Team Information

Team Members

  • Mert Ketenci, PhD Candidate in Computer Science, Graduate School of Arts and Sciences, Columbia University

  • Faculty Advisor: Noemie Elhadad, Associate Professor of Biomedical Informatics, Vagelos College of Physicians and Surgeons, Columbia University

Abstract

Non-linear supervised machine learning algorithms achieve remarkable results in many applications, but often at the expense of interpretability. This limits the applicability of many well-performing models in high-stakes decision-making domains such as healthcare, finance, and criminal justice. As a result, such domains often depend on less accurate but more interpretable linear models. In this paper, we introduce Amortized Variational Bayesian Regression (AVBR). AVBR combines the interpretability of linear models with the predictive power of non-linear function approximators. In particular, AVBR models the output with an instance-wise linear predictor function whose coefficients are produced by an amortized, non-linear inference network. AVBR performs on par with established supervised machine learning algorithms while being able to quantify (1) feature importance, (2) feature importance uncertainty, and (3) predictive uncertainty. We demonstrate the predictive performance and interpretability of AVBR on several benchmark datasets.
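To make the architecture concrete, the following is a minimal sketch (not the authors' implementation) of the idea described above: a small non-linear inference network maps each input to the mean and log-variance of a per-instance Gaussian over linear coefficients, and the prediction is the instance-wise linear combination of those coefficients with the input. All names (`mlp`, `predict`) and the network sizes are illustrative assumptions; a real AVBR model would be trained with a variational objective.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    # Two-layer non-linear inference network (tanh hidden units).
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

d, hdim = 5, 16  # illustrative feature and hidden dimensions

# Untrained (random) inference-network weights; the network outputs 2*d
# values: the mean and log-variance of the instance-wise coefficients.
W1 = rng.normal(size=(d, hdim)); b1 = np.zeros(hdim)
W2 = rng.normal(size=(hdim, 2 * d)); b2 = np.zeros(2 * d)

def predict(x, n_samples=100):
    out = mlp(x, W1, b1, W2, b2)
    mu, log_var = out[:d], out[d:]
    # Sample coefficients beta ~ N(mu, diag(exp(log_var))).
    betas = mu + np.exp(0.5 * log_var) * rng.normal(size=(n_samples, d))
    preds = betas @ x  # instance-wise linear predictions
    # Mean prediction, predictive uncertainty, and coefficient means
    # (interpretable as per-instance feature importances).
    return preds.mean(), preds.std(), mu

x = rng.normal(size=d)
y_hat, y_std, importances = predict(x)
```

Because the coefficients are themselves random variables, the same forward pass yields a point prediction, a predictive uncertainty, and per-feature importances with their own posterior spread, matching the three quantities listed in the abstract.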

Team Lead Contact

Mert Ketenci: griffin.adams@columbia.edu
