Popularity Weighted Ranking for Academic Digital Libraries

  • Conference paper
Advances in Information Retrieval (ECIR 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4425)

Abstract

We propose a popularity weighted ranking algorithm for academic digital libraries that uses the popularity factor of a publication venue, overcoming the limitations of impact factors. We compare our method with naive PageRank, citation counts, and the HITS algorithm, three popular measures currently used to rank papers beyond lexical similarity. The ranking results are evaluated by the discounted cumulative gain (DCG) method using four human evaluators. We show that our proposed ranking algorithm improves DCG performance by 8.5% on average compared to naive PageRank, 16.3% compared to citation counts, and 23.2% compared to HITS. The algorithm is also evaluated against click-through data from the CiteSeer usage log.
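The DCG measure referred to in the abstract is the standard one of Järvelin and Kekäläinen: each result's relevance grade contributes its full value at rank 1 and is discounted by log2 of its rank thereafter, so rankings that surface highly relevant papers early score higher. A minimal sketch of the metric (the relevance grades here are illustrative, not the paper's evaluation data):

```python
import math

def dcg(relevances):
    """Discounted cumulative gain over a ranked list of relevance grades.

    The grade at rank 1 counts fully; grades at rank i >= 2 are
    discounted by log2(i), following Jarvelin and Kekalainen.
    """
    score = 0.0
    for rank, rel in enumerate(relevances, start=1):
        score += rel if rank == 1 else rel / math.log2(rank)
    return score

# A ranking that places the most relevant papers first beats the
# same set of papers presented in reverse order.
good = dcg([3, 2, 1, 0])  # best papers at the top
bad = dcg([0, 1, 2, 3])   # best papers at the bottom
print(good > bad)  # True
```

Comparing two ranking algorithms then amounts to running both over the same queries, having judges grade the returned papers, and averaging the per-query DCG values.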

Editor information

Giambattista Amati, Claudio Carpineto, Giovanni Romano

Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Sun, Y., Giles, C.L. (2007). Popularity Weighted Ranking for Academic Digital Libraries. In: Amati, G., Carpineto, C., Romano, G. (eds) Advances in Information Retrieval. ECIR 2007. Lecture Notes in Computer Science, vol 4425. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-71496-5_57

  • DOI: https://doi.org/10.1007/978-3-540-71496-5_57

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-71494-1

  • Online ISBN: 978-3-540-71496-5

  • eBook Packages: Computer Science, Computer Science (R0)
