This paper is a reproduction of work by Ray et al., which claimed to have uncovered a statistically significant association between eleven programming languages and software defects in projects hosted on GitHub. First, we conduct an experimental repetition. The repetition is only partially successful, but it does validate one of the key claims of the original work: the association of ten programming languages with defects. Next, we conduct a complete, independent reanalysis of the data and statistical modeling steps of the original study. We uncover a number of flaws that undermine the conclusions of the original study: only four languages are found to have a statistically significant association with defects, and even for those the effect size is exceedingly small. We conclude with additional sources of bias that should be investigated in follow-up work, and a few best-practice recommendations for similar efforts.
Thu 24 Oct (displayed time zone: Beirut)
14:00 - 15:30
14:00 (22m) Talk: On the Impact of Programming Languages on Code Quality (TOPLAS, OOPSLA). Emery D. Berger (University of Massachusetts Amherst), Celeste Hollenbeck (Northeastern University), Petr Maj (Czech Technical University), Olga Vitek (Northeastern University), Jan Vitek (Northeastern University).
14:22 (22m) Talk: Casting about in the Dark: An Empirical Study of Cast Operations in Java Programs (OOPSLA). Luis Mastrangelo, Matthias Hauswirth, Nate Nystrom (Università della Svizzera italiana).
14:45 (22m) Talk: On the Design, Implementation, and Use of Laziness in R (OOPSLA).
15:07 (22m) Talk: Aroma: Code Recommendation via Structural Code Search (OOPSLA). Sifei Luan (Facebook, Inc.), Di Yang (University of California, Irvine), Celeste Barnaby (Facebook, Inc.), Koushik Sen (University of California, Berkeley), Satish Chandra (Facebook).