
What drives the relevance and reputation of economics journals? An update from a survey among economists


Abstract

This paper analyses the interrelationship between perceived journal reputation and its relevance for academics’ work. Based on a survey of 705 members of the German Economic Association (GEA), we find a strong interrelationship between perceived journal reputation and relevance, with a journal’s perceived relevance having a stronger effect on its reputation than vice versa. Moreover, past journal ratings conducted by the Handelsblatt and the GEA directly affect journals’ reputation among German economists and, indirectly, their perceived relevance, but the effect on reputation is more than twice as large as the effect on perceived relevance. In general, citations have a non-linear impact on perceived journal reputation and relevance. While the number of landmark articles published in a journal (as measured by the so-called H-index) increases the journal’s reputation, an increase in the H-index even tends to decrease a journal’s perceived relevance, as long as this is not simultaneously reflected in a higher Handelsblatt and/or GEA rating. This suggests that a journal’s relevance is driven by average article quality, while its reputation depends more on truly exceptional articles. We also identify significant differences in views on journal relevance and reputation between different age groups.


Notes

  1. The use of these rankings for evaluation purposes has been heavily debated in many scientific disciplines and countries (see, e.g., Albers 2009; Franses 2014; Frey and Osterloh 2014), recently even leading to a boycott by business scholars in Germany (see Berlemann and Haucap 2015).

  2. The H-index (see Hirsch 2005) is the maximum number n of articles that have been cited at least n times.
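As an illustration (not part of the paper), the definition in note 2 can be sketched in a few lines of Python:

```python
def h_index(citations):
    """Return the maximum n such that at least n articles
    have been cited at least n times each (Hirsch 2005)."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:  # the rank-th most-cited article has >= rank citations
            h = rank
        else:
            break
    return h

# For citation counts [10, 8, 5, 4, 3] the H-index is 4:
# four articles have at least 4 citations, but not five with at least 5.
```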

  3. Note, however, that journal rankings based on impact factors are remarkably stable even if one removes the journals’ most heavily cited articles from the count (see Seiler and Wohlrabe 2014).

  4. We replaced Australian Journal of Agricultural Economics, Die Weltwirtschaft, Hamburger Jahrbuch für Wirtschafts- und Gesellschaftspolitik, Homo Oeconomicus, Jahrbuch für Neue Politische Ökonomie, Public Finance Quarterly, Public Finance, RWI-Mitteilungen, and Swedish Economic Policy Review (as most of them had ceased to exist) with American Economic Journal: Applied Economics, DIW-Wochenbericht, Economics-The Open-Access Journal, ifo Schnelldienst, International Organization, Journal of the European Economic Association, Nature, Public Finance Review, and Science.

  5. The log-odds transformation for a variable x is defined as log [x/(1 − x)].
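A minimal Python sketch of the transformation in note 5 (for illustration only; the variable name is arbitrary):

```python
import math

def log_odds(x):
    """Log-odds (logit) transformation log[x / (1 - x)] for x in (0, 1)."""
    if not 0 < x < 1:
        raise ValueError("x must lie strictly between 0 and 1")
    return math.log(x / (1 - x))

# log_odds(0.5) -> 0.0, since the odds 0.5/0.5 equal 1
```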

  6. For a more detailed analysis of the interrelationship between respondents’ evaluation of journals and journals’ rating in the journal rankings of the Handelsblatt and the GEA see Bräuninger et al. (2011b).

  7. Note that the coefficients reported only represent the direct effect of each independent variable on Relevance or Reputation. However, as already explained for the 2SLS estimation, calculating the total effects would involve plugging Eq. (1) into Eq. (2) (and vice versa) and successively solving for each variable. In the context of the FRM model, this implies plugging a normal density function into the exponential part of another normal density function. We refrained from calculating the total effects for the FRM model for two reasons. First, solving the resulting equations for each variable is no longer analytically tractable. Second, due to the nonlinear nature of the estimation equation, the total effect of each variable would still depend on the values of all other independent variables, which impedes a meaningful interpretation of the total effects.
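To see why substitution yields tractable total effects in the linear (2SLS) case but not in the nonlinear FRM case, consider a stylized linear simultaneous system (coefficient names here are illustrative, not the paper's estimates):

```python
def total_effect_linear(b, e, c):
    """Total effect of X on Relevance in the stylized linear system
         Relevance  = a + b * Reputation + c * X
         Reputation = d + e * Relevance  + f * Z.
    Substituting the second equation into the first gives
         Relevance * (1 - b*e) = a + b*d + b*f*Z + c*X,
    so the total effect of X on Relevance is c / (1 - b*e),
    a constant (independent of X and Z), provided b*e != 1."""
    if abs(1 - b * e) < 1e-12:
        raise ValueError("no stable reduced form: b*e == 1")
    return c / (1 - b * e)
```

In a nonlinear model the analogous substitution leaves the effect depending on the values of all regressors, which is exactly the interpretability problem the note describes.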

  8. We conducted some additional tests to explore why German economists consider journals with a high H-index less relevant for their daily work. First, we re-estimated the regression models while excluding Nature and Science, both of which have very high H-indices but receive below-average relevance ratings. Second, we interacted H-index with a dummy variable indicating whether the journal primarily focuses on statistics and econometrics, to test whether the negative effect of the H-index can be attributed to econometric journals, which have high H-indices due to the publication of methodological landmark articles but might otherwise not be very relevant for many economists. In both cases, however, the negative coefficient estimate for H-index persisted.

  9. Note that neither the Handelsblatt rating nor the GEA rating was available when Bräuninger and Haucap conducted their survey. Moreover, the H-index was only introduced in 2005 (see Hirsch 2005).

References

  • Albers, S. (2009). Misleading rankings of research in business. German Economic Review, 3, 352–363.

  • Azar, O. H. (2005). The review process in economics: Is it too fast? Southern Economic Journal, 72, 481–491.

  • Beed, C., & Beed, C. (1996). Measuring the quality of academic journals: The case of economics. Journal of Post-Keynesian Economics, 18, 369–396.

  • Berlemann, M., & Haucap, J. (2015). Which factors drive the decision to opt out of individual research rankings? An empirical study of academic resistance to change. Research Policy. doi:10.1016/j.respol.2014.12.002.

  • Besley, T., & Hennessy, P. (2009). Letter to the Queen, dated July 22, 2009, online at: http://www.ft.com/intl/cms/3e3b6ca8-7a08-11de-b86f-00144feabdc0.pdf

  • Blank, R. M. (1991). The effects of double-blind versus single-blind reviewing: Experimental evidence from the American Economic Review. American Economic Review, 81, 1041–1067.

  • Blaug, M. (1997). Ugly currents in modern economics. Policy Options, 18(7), 3–8.

  • Bräuninger, M., & Haucap, J. (2001). Was Ökonomen lesen und schätzen. Perspektiven der Wirtschaftspolitik, 2, 185–210.

  • Bräuninger, M., & Haucap, J. (2003). Reputation and relevance of economics journals. Kyklos, 56, 175–198.

  • Bräuninger, M., Haucap, J., & Muck, J. (2011a). Was schätzen und lesen deutschsprachige Ökonomen heute? Perspektiven der Wirtschaftspolitik, 12, 339–371.

  • Bräuninger, M., Haucap, J., & Muck, J. (2011b). Was lesen und schätzen Ökonomen im Jahr 2011?, DICE Ordnungspolitische Perspektiven 18. https://ideas.repec.org/p/zbw/diceop/18.html.

  • Chang, C.-L., McAleer, M., & Oxley, L. (2011a). What makes a great journal in economics? The singer, not the song. Journal of Economic Surveys, 25, 326–361.

  • Chang, C.-L., McAleer, M., & Oxley, L. (2011b). What makes a great journal great in sciences? Which came first, the chicken or the egg? Scientometrics, 87, 17–40.

  • Chang, C.-L., McAleer, M., & Oxley, L. (2011c). Great expectatrics: Great papers, great journals, great econometrics. Econometric Reviews, 30, 583–619.

  • Colander, D., Goldberg, M., Haas, A., Juselius, K., Kirman, A., Lux, T., & Sloth, B. (2009). The financial crisis and the systemic failure of the economics profession. Critical Review, 21(2), 249–267.

  • Danielson, A., & Delorme, C. D. (1976). Some empirical evidence on the variables associated with the ranking of economics journals. Southern Economic Journal, 43, 1149–1160.

  • Demarest, B., Freeman, G., & Sugimoto, C. R. (2014). The reviewer in the mirror: Examining gendered and ethnicized notions of reciprocity in peer review. Scientometrics, 101, 717–735.

  • Dow, S. C., Earl, P. E., Foster, J., Harcourt, G. C., Hodgson, G. M., Metcalfe, J. S., et al. (2009). The GFC and University economics education: An open letter to the Queen. Journal of Australian Political Economy, 64, 233–235.

  • Dulleck, U., & Kerschbamer, R. (2006). On doctors, mechanics and computer specialists: The economics of credence goods. Journal of Economic Literature, 44, 5–42.

  • Ellis, L. V., & Durden, G. C. (1991). Why economists rank their journals the way they do. Journal of Economics and Business, 43, 265–270.

  • Ellison, G. (2002). Evolving standards for academic publishing: A q-r-theory. Journal of Political Economy, 110, 994–1034.

  • Ellison, G. (2011). Is peer-review in decline? Economic Inquiry, 49, 635–657.

  • Engers, M., & Gans, J. S. (1998). Why referees are not paid (enough). American Economic Review, 88, 1341–1349.

  • Franses, P. H. (2014). Trends in three decades of rankings of Dutch economists. Scientometrics, 98, 1257–1268.

  • Frey, B. S. (2005). Problems with publishing: Existing state and solutions. European Journal of Law and Economics, 19, 173–190.

  • Frey, B. S., & Osterloh, M. (2014). Ranking games. Evaluation Review. doi:10.1177/0193841X14524957.

  • Frey, B. S., & Rost, K. (2010). Do rankings reflect research quality? Journal of Applied Economics, 13, 1–38.

  • Graber, M., Launov, A., & Wälde, L. (2008). Publish or perish? The Increasing Importance of publications for prospective economics professors in Austria, Germany and Switzerland. German Economic Review, 9, 457–472.

  • Handelsblatt. (2011). So funktioniert das VWL-Ranking. http://www.handelsblatt.com/politik/oekonomie/vwl-ranking/methodik-so-funktioniert-das-vwl-ranking/4575334.html

  • Harzing, A. W. (2007). Publish or perish. http://www.harzing.com/pop.htm

  • Hawkins, R. G., Ritter, L. S., & Walter, I. (1973). What economists think of their journals. Journal of Political Economy, 81, 1017–1032.

  • Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102(46), 16569–16572.

  • Hodgson, G. M., & Rothman, H. (1999). The editors and authors of economics journals: A case of institutional oligopoly? Economic Journal, 109, F165–F186.

  • Institute for Scientific Information. (2010). Journal citation report 2009. Philadelphia: Institute for Scientific Information.

  • Institute for Scientific Information. (2011). Journal citation report 2010. Philadelphia: Institute for Scientific Information.

  • Iyengar, K., & Balijepally, V. (2015). Ranking journals using the dominance hierarchy procedure: An illustration with IS journals. Scientometrics, 102, 5–23.

  • Kalaitzidakis, P., Mamuneas, T., & Stengos, T. (2003). Rankings of academic journals and institutions in economics. Journal of the European Economic Association, 1, 1346–1366.

  • Kleibergen, F., & Paap, R. (2006). Generalized reduced rank tests using the singular value decomposition. Journal of Econometrics, 127, 97–126.

  • Krugman, P. (2009). How did economists get it so wrong? The New York Times online, September 2, 2009. http://www.nytimes.com/2009/09/06/magazine/06Economic-t.html

  • Krugman, P. (2013). How the case for austerity has crumbled. The New York Review of Books, June 6, 2013. http://www.nybooks.com/articles/archives/2013/jun/06/how-case-austerity-has-crumbled/

  • Laband, D. N. (1990). Is there value-added from the review process in economics? Preliminary evidence from authors. Quarterly Journal of Economics, 105, 341–353.

  • Lucas, R. E. (2009). In defence of the dismal science. The Economist, 392(8643), 67.

  • Ma, Z., Pan, Y., Yu, Z., Wang, J., Jia, J., & Wu, Y. (2013). A quantitative study on the effectiveness of peer review for academic journals. Scientometrics, 95, 1–13.

  • Oswald, A. J. (2007). An examination of the reliability of prestigious scholarly journals: Evidence and implications for decision-makers. Economica, 74, 21–31.

  • Papke, L. W., & Wooldridge, J. M. (1996). Econometric methods for fractional response variables with an application to 401 (K) plan participation rates. Journal of Applied Econometrics, 11, 619–632.

  • Piketty, T. (2014). Capital in the twenty-first century. Cambridge, MA: Belknap Press of Harvard University Press.

  • Ramalho, E. A., Ramalho, J. J. S., & Henriques, P. D. (2010). Fractional regression models for second stage DEA efficiency analyses. Journal of Productivity Analysis, 34, 239–255.

  • Ramalho, E. A., Ramalho, J. J. S., & Murteira, J. M. R. (2011). Alternative estimating and testing empirical strategies for fractional regression models. Journal of Economic Surveys, 25, 19–68.

  • Reinhart, C., & Rogoff, K. (2013). Open letter to Paul Krugman. May 25, 2013. http://www.carmenreinhart.com/letter-to-pk/

  • Ritzberger, K. (2008). A ranking of journals in economics and related fields. German Economic Review, 9, 402–430.

  • Schläpfer, F. (2010). How much does journal reputation tell us about the academic interest and relevance of economic research? GAIA: Ecological Perspectives for Science & Society, 19(2), 140–145.

  • Schneider, F., & Ursprung, H. W. (2008). The 2008 GEA journal ranking for the economics profession. German Economic Review, 9, 532–538.

  • Seiler, C., & Wohlrabe, K. (2014). How robust are journal rankings based on the impact factor? Evidence from the economic sciences. Journal of Informetrics, 8, 904–911.

  • Sorzano, C. O. S., Vargas, J., Caffarena-Fernández, G., & Iriarte, A. (2014). Comparing scientific performance among equals. Scientometrics, 101, 1731–1745.

  • Staiger, D., & Stock, J. H. (1997). Instrumental variables regression with weak instruments. Econometrica, 65, 557–586.

  • Statalist. (2010). Discussion on "reg3 option -robust-". http://statalist.1588530.n2.nabble.com/reg3-option-robust-td5647547.html

  • Sutter, M., & Kocher, M. (2001). Tools for evaluating research output: Are citation-based rankings of economics journals stable? Evaluation Review, 25, 555–566.

  • The Economist. (2009a). What went wrong with economics? The Economist, 392(8640), 11–12.

  • The Economist. (2009b). The other-worldly philosophers. The Economist, 392(8640), 65–67.

  • Wall, H. J. (2009). Don’t get skewed over by journal rankings. B.E. Journal of Economic Analysis and Policy, 9(1), Article 34.

  • Wooldridge, J. M. (2010). Econometric analysis of cross section and panel data (2nd ed.). Cambridge, MA: MIT Press.

  • Wooldridge, J. M. (2013). Introductory econometrics—A modern approach (5th ed.). Florence: South-Western.

  • Wooldridge, J. M. (2014). Quasi-maximum likelihood estimation and testing for nonlinear models with endogenous explanatory variables. Journal of Econometrics, 182, 226–234.

Acknowledgments

We would like to thank all economists who participated in our survey. Moreover, we thank Elisabeth Flieger, Susanne Schäfers and Olaf Siegert (all of ZBW—Leibniz-Informationszentrum Wirtschaft) for their excellent support in conducting the survey and assembling the data. For comments and very useful discussions we thank Michael Bräuninger, Florian Heiß and two anonymous referees.

Corresponding author

Correspondence to Justus Haucap.

Appendix

See Tables 4, 5, 6 and 7.

Table 4 Table of correlations
Table 5 Determinants of Relevance for different age groups
Table 6 Determinants of Reputation for different age groups
Table 7 Estimated total effects of 2SLS models for different age groups

See Figs. 7, 8, 9, 10 and 11.

Fig. 7: Average marginal effect of H-index on Relevance. Note: Five journals with H-index > 230 have been omitted to improve the visual representation.

Fig. 8: Average marginal effect of H-index on Reputation. Note: Five journals with H-index > 230 have been omitted to improve the visual representation.

Fig. 9: Average marginal effect of Volume on Reputation. Note: Five journals with Volume > 230 have been omitted to improve the visual representation.

Fig. 10: Average marginal effect of HB-Rating on Reputation.

Fig. 11: Average marginal effect of GEA-Rating on Reputation.

About this article

Cite this article

Haucap, J., Muck, J. What drives the relevance and reputation of economics journals? An update from a survey among economists. Scientometrics 103, 849–877 (2015). https://doi.org/10.1007/s11192-015-1542-5

