Putting results in meaningful terms
Quantitative social scientists aim to say whether an effect is statistically significant or not. However, a growing collective of academics argues that this aim is misleading; among other things, it feeds the tendency to overreport support for hypotheses. Leading statisticians argue that, instead of statistical significance thinking, academics should evaluate whether an effect is practically meaningful. Yes, you read that right: not on top of statistical significance, but instead of it (and yes, it is statisticians saying this, not business leaders).
The state of the art
In our open access publication in Social Sciences & Humanities Open, our detailed review of recent top-journal publications across social science disciplines reveals that statistical significance thinking is unfortunately still dominant: 79% of papers evaluate evidence using statistical significance, and concerns about overreporting support for hypotheses seem valid, with an average of 88% of hypotheses supported. So we are not rid of statistical significance thinking yet. Among disciplines, economics has moved furthest away from hypothesis testing, toward a more exploratory style. That shift makes sense: all the economics papers we reviewed that do state hypotheses support them 100% of the time, and such uniform support makes one question whether one might as well write papers without hypotheses.
Hopeful examples
We also see papers that give hope by evaluating practical meaningfulness in creative ways. For instance, some studies evaluate the meaningfulness of an effect by comparing it to the effect of a known important variable. A hypothetical study on charismatic leadership that controls for age could say: 'the difference in job satisfaction between those whose leaders have below-average charisma and those whose leaders have above-average charisma is about as big as being # years younger'. This makes it intuitive to judge whether an effect is practically meaningful; a sketch of the underlying calculation follows below. There is also an important role for data visualizations that let readers engage with the data beyond averages. We encourage such approaches, not only to help other academics understand your results, but also to connect with practitioners.
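To make the benchmarking idea concrete, here is a minimal sketch of how one might put a regression coefficient on the scale of a known variable. The data, variable names, and coefficient values are all made up for illustration; this is not the analysis from any paper we reviewed. The core move is simply dividing the effect of interest by the effect of the benchmark variable.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1_000

# Simulated (hypothetical) data: job satisfaction depends on
# leader charisma and on employee age.
age = rng.normal(40, 10, n)
high_charisma = rng.integers(0, 2, n)  # 1 = leader with above-average charisma
satisfaction = 0.4 * high_charisma - 0.02 * age + rng.normal(0, 1, n)

# Fit an ordinary least squares model with both predictors.
X = sm.add_constant(np.column_stack([high_charisma, age]))
fit = sm.OLS(satisfaction, X).fit()
b_charisma, b_age = fit.params[1], fit.params[2]

# Express the charisma effect on the intuitive "years of age" scale:
# how many years younger would produce the same change in satisfaction?
years_equivalent = b_charisma / abs(b_age)
print(f"Having an above-average-charisma leader is associated with roughly "
      f"the same difference in job satisfaction as being "
      f"{years_equivalent:.1f} years younger.")
```

The point of the ratio is purely communicative: readers who have no feel for a coefficient of 0.4 on a satisfaction scale often do have a feel for what a difference of # years of age means.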