Data-Driven Student Knowledge Assessment through Ill-Defined Procedural Tasks

Conference paper
Current Topics in Artificial Intelligence (CAEPIA 2009)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5988)

Abstract

Item Response Theory (IRT) is a statistical framework that has been used successfully since the early 20th century to infer student knowledge from tests. Nevertheless, existing well-founded techniques for assessing procedural tasks are generally complex and limited to well-defined tasks. In this paper, we describe how a set of techniques we have developed based on IRT makes it possible to infer declarative student knowledge through procedural tasks. We describe how these techniques have been used with undergraduate students, in the object-oriented programming domain, through ill-defined procedural exercises.
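For readers unfamiliar with IRT, its core idea can be illustrated with the standard three-parameter logistic (3PL) item characteristic curve, which models the probability that a student of a given ability answers an item correctly. The following is a minimal sketch of that general model in Python; the function name and parameter values are illustrative assumptions, not taken from the paper.

```python
import math

def icc_3pl(theta: float, a: float, b: float, c: float) -> float:
    """Three-parameter logistic (3PL) item characteristic curve.

    theta -- student ability (latent trait)
    a     -- item discrimination
    b     -- item difficulty
    c     -- pseudo-guessing parameter (lower asymptote)
    Returns the probability of a correct response.
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item: moderate discrimination, average difficulty,
# and a guessing floor of 0.25 (e.g., four-option multiple choice).
for theta in (-2.0, -1.0, 0.0, 1.0, 2.0):
    p = icc_3pl(theta, a=1.2, b=0.0, c=0.25)
    print(f"ability {theta:+.1f}: P(correct) = {p:.3f}")
```

In a conventional IRT test, assessment amounts to estimating theta (e.g., by maximum likelihood) from a student's pattern of responses; the contribution described in the abstract is to drive such inference from ill-defined procedural exercises rather than standard test items.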

Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Gálvez, J., Guzmán, E., Conejo, R. (2010). Data-Driven Student Knowledge Assessment through Ill-Defined Procedural Tasks. In: Meseguer, P., Mandow, L., Gasca, R.M. (eds) Current Topics in Artificial Intelligence. CAEPIA 2009. Lecture Notes in Computer Science (LNAI), vol 5988. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-14264-2_24

  • DOI: https://doi.org/10.1007/978-3-642-14264-2_24

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-14263-5

  • Online ISBN: 978-3-642-14264-2

  • eBook Packages: Computer Science, Computer Science (R0)
