Tuesday 13.10.2020 17:00 — 19:00
Online Event

Virtual Lecture by Alexis Makin: Have we been practicing cumulative science in our EEG lab? An exhaustive analysis of all 6674 brain responses from 40 studies and 2215 participants

There is growing anxiety about the trustworthiness of published science. Bishop (2019) claims that “researchers work in ways almost guaranteed not to deliver meaningful results”. We therefore critically audited our own practice. Our lab investigates the neural response to visual symmetry using an ERP called the Sustained Posterior Negativity (SPN). We organized 249 SPNs from 2215 participants into 40 project folders (now publicly available), then used Bishop’s ‘four horsemen’ framework to grade our SPN work.

Horse 1: Publication bias. The published record overstates effect sizes because weaker effects languish in the file drawer. Our 134 unpublished SPNs were 33% weaker than the 115 published ones. Despite legitimate explanations, we give ourselves a B+ for publication bias (68% in UK terms).

Horse 2: Low statistical power. New power analyses suggest that many SPN results would not be reliably replicated with the original sample sizes. We propose a C- for statistical power (62%).

Horse 3: P-hacking. In ERP research, P-hacking often involves double dipping: examining all the data, then selecting a subset for statistical analysis. We reanalyzed all SPNs using three alternative electrode clusters. Correlation across analyses was reassuringly high (r > 0.92), but some results were problematically dependent on post hoc decisions. We give ourselves a B+ for P-hacking (68%).

Horse 4: HARKing (‘Hypothesizing After the Results are Known’). HARKing beautifies papers by creating a false narrative in which an a priori prediction was confirmed. Pre-registration combats HARKing, but we have only recently engaged with it. We thus propose a B for HARKing (65%).

The grades are merely playful discussion points, but the self-audit was a serious exercise that we encourage others to emulate. First, it forced us to organize all our data, make it public, and extract new value from it (instead of wastefully forgetting it). Second, the complete SPN catalogue allowed new analyses that were not possible with any single data set. Third, it informed future research design. All of this supports trustworthy, cumulative science.
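For readers curious about the file-drawer comparison behind Horse 1, the minimal sketch below shows one way a “percentage weaker” figure can be computed from mean SPN amplitudes. The amplitude values are placeholders, not the lab’s data, and comparing group means is an assumption about how the 33% figure was derived.

```python
# Hedged sketch: comparing published vs. unpublished SPN effect sizes.
# The amplitude values below are hypothetical placeholders.
import numpy as np

# SPN amplitudes in microvolts; the SPN is a negativity, so more negative = stronger.
published = np.array([-1.2, -0.9, -1.5, -1.1])    # placeholder values
unpublished = np.array([-0.8, -0.6, -1.0, -0.7])  # placeholder values

mean_pub = published.mean()
mean_unpub = unpublished.mean()

# Relative weakening of unpublished effects, in the spirit of the "33% weaker" comparison.
weakening = 100 * (1 - mean_unpub / mean_pub)
print(f"Unpublished SPNs are {weakening:.0f}% weaker than published ones")
```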
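The Horse 2 power analyses could, in principle, look something like the following sketch, which uses statsmodels to estimate power for a one-sample t-test against zero at the original sample size. The effect size, sample size, and one-sample framing are all assumptions for illustration; the talk’s actual power calculations may differ.

```python
# Hedged sketch: replication power for an SPN effect at the original sample size.
# Effect size and N are illustrative placeholders, not values from the catalogue.
from statsmodels.stats.power import TTestPower

d = 0.5  # hypothetical Cohen's d for an SPN (one-sample t-test vs. zero)
n = 20   # hypothetical original sample size

power = TTestPower().power(effect_size=d, nobs=n, alpha=0.05)
print(f"Estimated replication power at N={n}: {power:.2f}")

# Sample size needed to reach the conventional 80% power target:
n_needed = TTestPower().solve_power(effect_size=d, power=0.8, alpha=0.05)
print(f"N required for 80% power: {n_needed:.0f}")
```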
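The Horse 3 robustness check, correlating results obtained from alternative electrode clusters, can be illustrated as below. The electrode indices, cluster definitions, and random data are all hypothetical; with the lab’s real catalogue the reported correlations exceeded r = 0.92.

```python
# Hedged sketch: checking robustness of SPN estimates to electrode-cluster choice.
# Cluster definitions and the data array are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical amplitudes: 249 SPNs x 64 electrodes.
amplitudes = rng.normal(size=(249, 64))

# Three alternative posterior clusters (hypothetical electrode indices).
clusters = {
    "cluster_A": [24, 25, 26, 27],
    "cluster_B": [28, 29, 30, 31],
    "cluster_C": [24, 25, 30, 31],
}

# Mean SPN per study under each cluster definition.
spn = {name: amplitudes[:, idx].mean(axis=1) for name, idx in clusters.items()}

# Pairwise correlations between the alternative analyses.
names = list(spn)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r = np.corrcoef(spn[names[i]], spn[names[j]])[0, 1]
        print(f"{names[i]} vs {names[j]}: r = {r:.2f}")
```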


Bishop, D. (2019, April 25). Rein in the four horsemen of irreproducibility. Nature. https://doi.org/10.1038/d41586-019-01307-2