The problem with our data-driven world 

In many fields of research right now, scientists collect data until they see a pattern that appears statistically significant, then use that tightly selected data to publish a paper. Critics have come to call this p-hacking, and the practice draws on a quiver of small methodological tricks that can inflate the statistical significance of a finding. As enumerated by one research group, the tricks can include:

  • “conducting analyses midway through experiments to decide whether to continue collecting data,”
  • “recording many response variables and deciding which to report postanalysis,”
  • “deciding whether to include or drop outliers postanalyses,”
  • “excluding, combining, or splitting treatment groups postanalysis,”
  • “including or excluding covariates postanalysis,”
  • and “stopping data exploration if an analysis yields a significant p-value.”
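
The first trick on the list, checking results midway and stopping as soon as they look significant, is easy to demonstrate. The simulation below is a hypothetical sketch (not from the article or the research group it quotes): it draws data with no real effect, lets an imaginary analyst test after every batch and stop at the first p < 0.05, and shows that this "optional stopping" yields false positives well above the nominal 5% rate.

```python
# Hypothetical simulation of optional stopping under a true null effect.
# Uses only the Python standard library; the function names are ours.
import math
import random
import statistics


def p_value_one_sample(xs):
    """Two-sided one-sample test of mean 0, using a normal approximation
    to the t distribution (reasonable for the sample sizes used here)."""
    n = len(xs)
    se = statistics.stdev(xs) / math.sqrt(n)
    z = abs(statistics.mean(xs)) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))


def false_positive_rate(peeks, n_per_peek=20, trials=2000, alpha=0.05, seed=0):
    """Fraction of null experiments declared 'significant' when the analyst
    tests after each batch and stops collecting as soon as p < alpha."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        data = []
        for _ in range(peeks):
            # Collect another batch of pure noise (no true effect).
            data += [rng.gauss(0, 1) for _ in range(n_per_peek)]
            if p_value_one_sample(data) < alpha:
                hits += 1  # "Significant" finding from noise alone.
                break
    return hits / trials


if __name__ == "__main__":
    print(f"one planned analysis: {false_positive_rate(peeks=1):.3f}")
    print(f"five interim peeks:   {false_positive_rate(peeks=5):.3f}")
```

With a single pre-planned analysis the error rate hovers near the advertised 5%; give the analyst five chances to peek and stop, and it roughly doubles or worse, even though nothing real is in the data.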

Add it all up, and you have a significant problem in the way our society produces knowledge.

Source: The problem with our data-driven world | Fusion
