Abstract
Witnessing the ongoing “credibility revolutions” in other disciplines, political science should also engage in meta-scientific introspection. Theoretically, this commentary describes why scientists in academia’s current incentive system work against their self-interest if they prioritize research credibility. Empirically, a comprehensive review of meta-scientific research with a focus on quantitative political science demonstrates that threats to the credibility of political science findings are systematic and real. Yet, the review also shows the discipline’s recent progress toward more credible research. The commentary proposes specific institutional changes to better align individual researcher rationality with the collective good of verifiable, robust, and valid scientific results.
Zusammenfassung
In light of the “credibility revolutions” in other social sciences, questions about the reliability of institutional knowledge production are pertinent to political science as well. This commentary describes why researchers act against their self-interest when they prioritize research validity. A comprehensive review of the meta-scientific literature, with a focus on quantitative political science, points on the one hand to recently initiated reforms for safeguarding reliable research. On the other hand, the review reveals systematic problems with the credibility of published research findings. The commentary proposes concrete measures to align individual researcher incentives with the collective goal of reliable research.
Notes
Different lines of thought in the Open Science movement comprise “the infrastructure school (which is concerned with the technological architecture), the public school (which is concerned with the accessibility of knowledge creation), the measurement school (which is concerned with alternative impact measurement), the democratic school (which is concerned with access to knowledge) and the pragmatic school (which is concerned with collaborative research)” (Fecher and Friesike 2014: 17).
Even if similar discussions are gaining traction in other research cultures (Monroe 2018; Elman et al. 2018; Janz 2018), this commentary focuses on quantitative political science as published in English-language peer-reviewed journals, which has attracted most meta-scientific attention in recent years. Although it is a debate worth having, it is beyond the scope of this commentary to discuss how the evidence and arguments presented here can be applied to other research cultures and publication formats in political science.
Freese and Peterson (2017) call this type of replication verifiability.
These criteria can be ordered hierarchically in the sense that the latter are more likely fulfilled when the former are met.
Note that making one’s work accessible to inter-subjective assessment goes beyond data transparency and includes disclosure of data-analytical and processing procedures. Stockemer et al. (2018) discuss cases in which attempts to replicate prior results failed because neither the syntax nor the published article provided sufficient information to repeat the authors’ analytical steps.
See https://opennessinitiative.org/. Accessed 20 September 2018.
For details on the costs of AJPS’s verification processes, see https://www.insidehighered.com/blogs/rethinking-research/should-journals-be-responsible-reproducibility. Accessed 20 September 2018.
Esarey and Wu (2016) estimate that the true value of statistical relationships is on average 40% smaller than their published value.
Note that HARKing and confirmatory analysis are perfectly reconcilable if new data are collected between the two analytical steps, but not without the collection of new data: “Just as conspiracy theories are never falsified by the facts that they were designed to explain, a hypothesis that is developed on the basis of exploration of a data set is unlikely to be refuted by that same data. Thus, one always needs a fresh data set for testing one’s hypothesis.” (Wagenmakers et al. 2012, p. 633).
References
Angrist, Joshua D., and Jörn-Steffen Pischke. 2010. The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics. Journal of Economic Perspectives 24(2):3–30.
Brodeur, Abel, Nikolai Cook, and Anthony Heyes. 2018. Methods matter: p‑hacking and causal inference in economics. IZA discussion papers, Vol. 11796. Bonn: Institute for the Study of Labor (IZA). http://ftp.iza.org/dp11796.pdf. Accessed 10 September 2018.
Burlig, Fiona. 2018. Improving transparency in observational social science research: A pre-analysis plan approach. Economics Letters 168:56–60.
Camerer, Colin F., Anna Dreber, Eskil Forsell, Teck-Hua Ho, Jürgen Huber, Magnus Johannesson, Michael Kirchler, Johan Almenberg, Adam Altmejd, Taizan Chan, Emma Heikensten, Felix Holzmeister, Taisuke Imai, Siri Isaksson, Gideon Nave, Thomas Pfeiffer, Michael Razen, and Hang Wu. 2016. Evaluating replicability of laboratory experiments in economics. Science 351:1433–1436. https://doi.org/10.1126/science.aaf0918.
Camerer, Colin F., Anna Dreber, Felix Holzmeister, Teck-Hua Ho, Jürgen Huber, Magnus Johannesson, Michael Kirchler, Gideon Nave, Brian A. Nosek, Thomas Pfeiffer, Adam Altmejd, Nick Buttrick, Taizan Chan, Yiling Chen, Eskil Forsell, Anup Gampa, Emma Heikensten, Lily Hummer, Taisuke Imai, Siri Isaksson, Dylan Manfredi, Julia Rose, Eric-Jan Wagenmakers, and Hang Wu. 2018. Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour 2:637–644. https://doi.org/10.1038/s41562-018-0399-z.
Chambers, Chris. 2017. The Seven Deadly Sins of Psychology. A Manifesto for Reforming the Culture of Scientific Practice. Princeton: Princeton University Press.
Chambers, Chris, and Pete Etchells. 2018. Open science is now the only way forward for psychology. The Guardian, 23 August 2018. https://www.theguardian.com/science/head-quarters/2018/aug/23/open-science-is-now-the-only-way-forward-for-psychology. Accessed 31 August 2018.
Cingranelli, David, and Mikhail Filippov. 2018. Are Human Rights Practices Improving? American Political Science Review 112(4):1083–1089. https://doi.org/10.1017/S0003055418000254.
Cook, Bryan G., John Lloyd Wills, David Mellor, Brian A. Nosek, and William J. Therrien. 2018. Promoting Open Science to Increase the Trustworthiness of Evidence in Special Education. Exceptional Children 85(1):104–118. https://doi.org/10.1177/0014402918793138.
Elman, Colin, Diana Kapiszewski, and Arthur Lupia. 2018. Transparent Social Inquiry: Implications for Political Science. Annual Review of Political Science 21(1):29–47. https://doi.org/10.1146/annurev-polisci-091515-025429.
Esarey, Justin, and Ahra Wu. 2016. Measuring the effects of publication bias in political science. Research & Politics 3(3):1–9. https://doi.org/10.1177/2053168016665856.
Fecher, Benedikt, and Sascha Friesike. 2014. Open Science: One Term, Five Schools of Thought. In Opening Science: The Evolving Guide on How the Internet is Changing Research, Collaboration and Scholarly Publishing, ed. Sascha Friesike, Sönke Bartling, 17–47. Wiesbaden: Springer VS.
Fiedler, Klaus, and Norbert Schwarz. 2016. Questionable Research Practices Revisited. Social Psychological and Personality Science 7(1):45–52. https://doi.org/10.1177/1948550615612150.
Findley, Michael G., Nathan M. Jensen, Edmund J. Malesky, and Thomas B. Pepinsky. 2016. Can Results-Free Review Reduce Publication Bias? The Results and Implications of a Pilot Study. Comparative Political Studies 49:1667–1703. https://doi.org/10.1177/0010414016655539.
Fox, Nick, Nathan Honeycutt, and Lee Jussim. 2018. How many psychologists use questionable research practices? Estimating the population size of current QRP users. Preprint PsyArXiv. https://psyarxiv.com/3v7hx/download?format=pdf. Accessed 31 August 2018.
Franco, Annie, Neil Malhotra, and Gabor Simonovits. 2014. Publication bias in the social sciences: Unlocking the file drawer. Science 345(6203):1502–1505. http://science.sciencemag.org/content/345/6203/1502.full.pdf. Accessed 30 August 2018.
Franco, Annie, Neil Malhotra, and Gabor Simonovits. 2015. Underreporting in Political Science Survey Experiments: Comparing Questionnaires to Published Results. Political Analysis 23(2):306–312.
Freese, Jeremy, and David Peterson. 2017. Replication in Social Science. Annual Review of Sociology 43:147–165. https://doi.org/10.1146/annurev-soc-060116-053450.
Gerber, Alan S., and Neil Malhotra. 2008. Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals. Quarterly Journal of Political Science 3(3):313–326. https://doi.org/10.1561/100.00008024.
Gerber, Alan S., Donald P. Green, and David Nickerson. 2001. Testing for Publication Bias in Political Science. Political Analysis 9(4):385–392. https://doi.org/10.1093/oxfordjournals.pan.a004877.
Gerber, Alan S., Neil Malhotra, Conor M. Dowling, and David Doherty. 2010. Publication Bias in Two Political Behavior Literatures. American Politics Research 38(4):591–613.
Gernsbacher, Morton A. 2018. Rewarding Research Transparency. Trends in Cognitive Sciences 22(11):953–956. https://doi.org/10.1016/j.tics.2018.07.002.
Gertler, Aaron L., and John G. Bullock. 2017. Reference Rot: An Emerging Threat to Transparency in Political Science. PS: Political Science & Politics 50(1):166–171. https://doi.org/10.1017/S1049096516002353.
Gervais, Will M., and Ara Norenzayan. 2018. Analytic atheism revisited. Nature Human Behaviour 2. https://doi.org/10.1038/s41562-018-0426-0.
Gherghina, Sergiu, and Alexia Katsanidou. 2013. Data Availability in Political Science Journals. European Political Science 12(3):333–349. https://doi.org/10.1057/eps.2013.8.
Hardwicke, Tom E., and John P.A. Ioannidis. 2018. Mapping the Universe of Registered Reports. BITSS Preprints. https://osf.io/preprints/bitss/fzpcy/. Accessed 10 September 2018.
Ho, Daniel E., Kosuke Imai, Gary King, and Elizabeth A. Stuart. 2007. Matching as Nonparametric Preprocessing for Reducing Model Dependence in Parametric Causal Inference. Political Analysis 15(3):199–236. https://doi.org/10.1093/pan/mpl013.
Humphreys, Macartan. 2018. Declare Design. Presentation at the University of Mannheim, 10.08.2018.
Humphreys, Macartan, Raul Sanchez de la Sierra, and Peter van der Windt. 2013. Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration. Political Analysis 21(1):1–20. https://doi.org/10.1093/pan/mps021.
Ioannidis, John P. A. 2005. Why Most Published Research Findings Are False. PLoS Med 2(8):e124. https://doi.org/10.1371/journal.pmed.0020124.
Ishiyama, John. 2014. Replication, Research Transparency, and Journal Publications: Individualism, Community Models, and the Future of Replication Studies. PS: Political Science & Politics 47(1):78–83. https://doi.org/10.1017/S1049096513001765.
Janz, Nicole. 2018. Replication and transparency in political science—did we make any progress? https://politicalsciencereplication.wordpress.com/2018/07/14/replication-and-transparency-in-political-science-did-we-make-any-progress/. Accessed 14 June 2018.
Kaplan, Robert M., and Veronica L. Irvin. 2015. Likelihood of Null Effects of Large NHLBI Clinical Trials Has Increased over Time. PLoS ONE 10(8):e0132382. https://doi.org/10.1371/journal.pone.0132382.
Kerr, Norbert L. 1998. HARKing: Hypothesizing After the Results Are Known. Personality and Social Psychology Review 2(3):196–217. https://doi.org/10.1207/s15327957pspr0203_4.
Key, Ellen M. 2016. How Are We Doing? Data Access and Replication in Political Science. PS: Political Science & Politics 49(2):268–272. https://doi.org/10.1017/S1049096516000184.
King, Gary. 1995. Replication, Replication. PS: Political Science & Politics 28(3):444–452. https://doi.org/10.2307/420301.
LeBel, Etienne P., Randy J. McCarthy, Brian D. Earp, Malte Elson, and Wolf Vanpaemel. 2018. A Unified Framework to Quantify the Credibility of Scientific Findings. Advances in Methods and Practices in Psychological Science 1(3):389–402. https://doi.org/10.1177/2515245918787489.
Lenz, Gabriel, and Alexander Sahn. 2017. Achieving Statistical Significance with Covariates. BITSS Preprints. https://osf.io/preprints/bitss/s42ba/download?format=pdf. Accessed 10 September 2018.
Lupia, Arthur, and Colin Elman. 2014. Openness in Political Science: Data Access and Research Transparency: Introduction. PS: Political Science & Politics 47(1):19–42. https://doi.org/10.1017/S1049096513001716.
Mellor, David, Alexandra Hartman, and Florian Kern. 2018. Preregistration for Qualitative Research Template. Open Science Framework (OSF). https://osf.io/j7ghv/. Accessed 28 September 2018.
Monroe, Kristen R. 2018. The Rush to Transparency: DA-RT and the Potential Dangers for Qualitative Research. Perspectives on Politics 16(1):141–148. https://doi.org/10.1017/S153759271700336X.
Montgomery, Jacob M., and Brendan Nyhan. 2010. Bayesian Model Averaging: Theoretical Developments and Practical Applications. Political Analysis 18(2):245–270. https://doi.org/10.1093/pan/mpq001.
Motyl, Matt, Alexander P. Demos, Timothy S. Carsel, Brittany E. Hanson, Zachary J. Melton, Allison B. Mueller, J. P. Prims, Jiaqing Sun, Anthony N. Washburn, Kendal M. Wong, Caitlyn Yantis, and Linda J. Skitka. 2017. The state of social and personality science: Rotten to the core, not so bad, getting better, or getting worse? Journal of Personality and Social Psychology 113(1):34–58.
Nelson, Leif D., Joseph Simmons, and Uri Simonsohn. 2018. Psychology’s Renaissance. Annual Review of Psychology 69:511–534. https://doi.org/10.1146/annurev-psych-122216-011836.
Nosek, Brian A., Charles R. Ebersole, Alexander C. DeHaven, and David T. Mellor. 2018. The preregistration revolution. Proceedings of the National Academy of Sciences 115(11):2600–2606. https://doi.org/10.1073/pnas.1708274114.
Open Science Collaboration. 2015. Estimating the reproducibility of psychological science. Science 349(6251):aac4716. https://doi.org/10.1126/science.aac4716.
Pearl, Judea, and Dana Mackenzie. 2018. The Book of Why: The New Science of Cause and Effect. New York: Basic Books.
Rinke, Eike M., and Frank M. Schneider. 2015. Probabilistic misconceptions are pervasive among communication researchers. 65th Annual Conference of the International Communication Association, San Juan, Puerto Rico, 21.–25.05.2015. Mannheim: MZES University of Mannheim.
Rohrer, Julia, Boris Egloff, and Stefan C. Schmukle. 2017. Probing Birth-Order Effects on Narrow Traits Using Specification-Curve Analysis. Psychological Science 28(12):1821–1832. https://doi.org/10.1177/0956797617723726.
Scheliga, Kaja, and Sascha Friesike. 2014. Putting open science into practice: A social dilemma? First Monday 19(9). http://dx.doi.org/10.5210/fm.v19i9.5381.
Schönbrodt, Felix, and David Mellor. 2018. Academic job offers that mentioned open science. Open Science Framework. https://osf.io/7jbnt/. Accessed 13 October 2018.
Shweder, Richard A., and Donald Winslow Fiske. 1986. Introduction: Uneasy Social Science. In Metatheory in Social Science. Pluralisms and Subjectivities, ed. Donald W. Fiske, Richard A. Shweder, 1–18. Chicago: University of Chicago Press.
Simmons, Joseph P., Leif D. Nelson, and Uri Simonsohn. 2011. False-Positive Psychology Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant. Psychological Science 22(11):1359–1366. https://doi.org/10.1177/0956797611417632.
Simonsohn, Uri, Joseph P. Simmons, and Leif D. Nelson. 2015. Specification Curve: Descriptive and Inferential Statistics on All Reasonable Specifications. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2694998.
Spada, Paolo, and Matt Ryan. 2017. The Failure to Examine Failures in Democratic Innovation. PS: Political Science & Politics 50(3):772–778. https://doi.org/10.1017/S1049096517000579.
Sparrow, Betsy. 2018. The importance of contextual relevance. Nature Human Behaviour 2:607. https://doi.org/10.1038/s41562-018-0411-7.
Stark, Philip B. 2018. Before reproducibility must come preproducibility. Nature 557:613.
Steegen, Sara, Francis Tuerlinckx, Andrew Gelman, and Wolf Vanpaemel. 2016. Increasing Transparency Through a Multiverse Analysis. Perspectives on Psychological Science 11(5):702–712. https://doi.org/10.1177/1745691616658637.
Stockemer, Daniel, Sebastian Koehler, and Tobias Lenz. 2018. Data Access, Transparency, and Replication: New Insights from the Political Behavior Literature. PS: Political Science & Politics 51(4):799–803. https://doi.org/10.1017/S1049096518000926.
Tenopir, Carol, Suzie Allard, Kimberly Douglass, Arsev Umur Aydinoglu, Lei Wu, Eleanor Read, Maribeth Manoff, and Mike Frame. 2011. Data Sharing by Scientists: Practices and Perceptions. PLoS ONE 6(6):e21101. https://doi.org/10.1371/journal.pone.0021101.
Vadillo, Miguel A., Natalie Gold, and Magda Osman. 2018. Searching for the bottom of the ego well: failure to uncover ego depletion in Many Labs 3. Royal Society Open Science 5(8):180390. https://doi.org/10.1098/rsos.180390.
Nosek, Brian A., George Alter, George Banks, Denny Borsboom, Sara Bowman, Steven Breckler, Stuart Buck, Christopher Chambers, Gilbert Chin, Garret Christensen, Monica Contestabile, Allan Dafoe, Eric Eich, Jeremy Freese, Rachel Glennerster, Daniel Goroff, Donald Green, Bradford Hesse, Macartan Humphreys, John Ishiyama, Dean Karlan, Alan Kraut, Arthur Lupia, Patricia Mabry, Temina Madon, Neil Malhotra, Evan Mayo-Wilson, Marcia McNutt, Edward Miguel, Elizabeth Paluck, Uri Simonsohn, Courtney Soderberg, Barbara Spellman, James Turitto, Gary VandenBos, Simine Vazire, Eric-Jan Wagenmakers, Rick Wilson, and Tal Yarkoni. 2015. Promoting an open research culture. Author guidelines for journals could help to promote transparency, openness, and reproducibility. Science 348:1422–1425.
de Vries, Ymkje, Annelieke Roest, Peter de Jonge, Pim Cuijpers, Marcus Munafò, and Jojanneke Bastiaansen. 2018. The cumulative effect of reporting and citation biases on the apparent efficacy of treatments: The case of depression. Psychological Medicine. https://doi.org/10.1017/s0033291718001873.
Wagenmakers, E.-J., Ruud Wetzels, Denny Borsboom, Han L. J. van der Maas, and Rogier A. Kievit. 2012. An Agenda for Purely Confirmatory Research. Perspectives on Psychological Science 7(6):632–638. https://doi.org/10.1177/1745691612463078.
Washburn, Anthony N., Brittany E. Hanson, Matt Motyl, Linda J. Skitka, Caitlyn Yantis, Kendal M. Wong, Jiaqing Sun, J. P. Prims, Allison B. Mueller, Zachary J. Melton, and Timothy S. Carsel. 2018. Why Do Some Psychology Researchers Resist Adopting Proposed Reforms to Research Practices? A Description of Researchers’ Rationales. Advances in Methods and Practices in Psychological Science 1(2):166–173. https://doi.org/10.1177/2515245918757427.
Weston, Sara J., and Marjan Bakker. 2018. Preregistration hack-a-shop. Open Science Framework. https://osf.io/vjdwm/. Accessed 15 October 2018.
Weston, Sara J., David Mellor, Marjan Bakker, Olmo van den Akker, Lorne Campbell, Stuart J. Ritchie, William J. Chopik, Rodica I. Damian, Jessica Kosie, and Courtney K. Soderberg. 2018a. Secondary data preregistration. Open Science Framework. https://osf.io/x4gzt/. Accessed 14 October 2018.
Weston, Sara J., Stuart J. Ritchie, Julia M. Rohrer, and Andrew K. Przybylski. 2018b. Recommendations for increasing the transparency of analysis of pre-existing datasets. Preprint PsyArXiv. https://psyarxiv.com/zmt3q/download?format=pdf. Accessed 15 October 2018.
Wilcox, Rand R. 2017. Introduction to Robust Estimation and Hypothesis Testing. A volume in Statistical Modeling and Decision Science. Amsterdam: Elsevier.
Zigerell, L. J. 2017. Reducing Political Bias in Political Science Estimates. PS: Political Science & Politics 50(1):179–183. https://doi.org/10.1017/S1049096516002389.
Cite this article
Wuttke, A. Why Too Many Political Science Findings Cannot Be Trusted and What We Can Do About It: A Review of Meta-Scientific Research and a Call for Academic Reform. Polit Vierteljahresschr 60, 1–19 (2019). https://doi.org/10.1007/s11615-018-0131-7