
Oranges and Apples? Using Comparative Judgement for Reliable Briefing Paper Assessment in Simulation Games

  • Chapter
  • First Online:
Simulations of Decision-Making as Active Learning Tools

Part of the book series: Professional and Practice-based Learning ((PPBL,volume 22))

Abstract

Achieving a fair and rigorous assessment of participants in simulation games represents a major challenge. The difficulty applies not only to the actual negotiation but also to the written assignments that typically accompany a simulation. For one thing, if different raters are involved, it is important to ensure that differences in severity do not affect the grades. Recently, comparative judgement (CJ) has been introduced as a method that allows for team-based grading. This chapter discusses in particular the potential of comparative judgement for assessing briefing papers from 84 students. Four assessors completed 622 comparisons in the Digital Platform for the Assessment of Competences (D-PAC) tool. Results indicate a reliability level of 0.71 for the final rank order, which required a time investment of around 10.5 hours from the team of assessors. In addition, there was no evidence of bias towards the most important roles in the simulation game. The study also details how the obtained rank orders were translated into grades, ranging from 11 to 17 out of 20. These elements showcase CJ’s advantage in reaching adequate reliability levels for briefing papers in an efficient manner.
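The workflow the abstract describes (pairwise comparisons aggregated into a rank order, then mapped onto a grade band) can be sketched in code. The following is a minimal illustration, not the D-PAC implementation: it assumes a Bradley–Terry model fitted with the classic minorize–maximize update, toy paper names, and a simple linear mapping onto the 11–17 out of 20 band the chapter reports; the actual grading rule used in the study may differ.

```python
from collections import defaultdict

def bradley_terry(wins, papers, iters=200):
    """Fit Bradley-Terry strengths from pairwise win counts via MM updates."""
    strength = {p: 1.0 for p in papers}
    for _ in range(iters):
        new = {}
        for p in papers:
            w = sum(wins.get((p, q), 0) for q in papers)  # comparisons p won
            denom = 0.0
            for q in papers:
                n_pq = wins.get((p, q), 0) + wins.get((q, p), 0)
                if n_pq:
                    denom += n_pq / (strength[p] + strength[q])
            new[p] = w / denom if denom else strength[p]
        total = sum(new.values())  # renormalise so strengths stay bounded
        strength = {p: s * len(papers) / total for p, s in new.items()}
    return strength

# Toy judgements: each tuple means the first paper was judged better.
judgements = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]
wins = defaultdict(int)
for winner, loser in judgements:
    wins[(winner, loser)] += 1

papers = ["A", "B", "C"]
strength = bradley_terry(wins, papers)
ranked = sorted(papers, key=strength.get, reverse=True)

# Map the rank order linearly onto a grade band of 11-17 out of 20
# (the band matches the chapter's reported range; the linear rule is assumed).
lo, hi = 11, 17
grades = {p: round(hi - i * (hi - lo) / max(len(ranked) - 1, 1))
          for i, p in enumerate(ranked)}
```

In practice, CJ tools also report a reliability coefficient (such as the scale separation reliability of 0.71 cited above) computed from the spread of the estimated strengths relative to their standard errors; that step is omitted here for brevity.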

Pierpaolo Settembri writes in a personal capacity and the views he expresses in this publication may not be in any circumstances regarded as stating an official position of the European Commission.


Notes

  1.

    Raymond and Usherwood (2013, p. 4) put it extremely clearly: “University faculty must ask themselves what a simulation adds to a student’s knowledge base that cannot be learned more efficiently in a traditional classroom setting, and how this can be measured”. Baranowski and Weir (2015) offer a thorough review of the literature evaluating the effects of simulations, concluding that “a small but growing body of evidence lends support to the contention that students who participate in simulations do in fact learn more than students not taking part in this exercise”. For a different outcome, see Raymond (2010).

  2.

    Alternatively, the data for the assessment can be collected by videotaping the meetings. Although this is a highly intrusive method, it yields material useful for subsequent analyses.

  3.

    Perchoc (2016) mentions the example of the International Relations Department of the College of Europe in Bruges.

  4.

    It should be noted that the success or failure of a simulation does not necessarily mean that participants managed or failed to find an agreement. It is a subjective notion that the instructor defines on the basis of prior criteria and learning objectives.

  5.

    In their Model United Nations simulation programme, a member of the teaching team chairs the final conference “to maintain equity of opportunity in assessment … and to ensure adherence to the rules of procedure” (Obendorf and Randerson 2013, p. 357). They make a similar exception for the activities of the Secretariat. While there might be an undue advantage granted to those who are assigned these roles (hence the need to mitigate or compensate for it in various ways, as explained in this chapter), such exceptions could be detrimental to the realism of the simulation itself, as they create an artificial subordination between different categories of players that has no equivalent in reality, as the authors themselves admit.

  6.

    The details of this simulation game have been provided in the chapter on verisimilitude. The official page of the course is accessible here: https://www.coleurope.eu/course/settembri-p-hermanin-c-worth-j-negotiation-and-decision-making-eu-simulation-game-50h.

  7.

    The combination of participation and written contribution is also common to other modules. For example, Obendorf and Randerson (2013) describe a formal assessment based on four components, with a similar articulation: a written country position paper (25%), participation in the simulation (35%), a binder of research sources (25%) and reflective essays (15%).

  8.

    This assignment has been described in greater detail in the chapter concerning verisimilitude.

  9.

    For a more detailed description of this tool, please refer to the chapter on verisimilitude.

  10.

    In fact, the total pool consisted of 96 papers, but 12 of these were of a different nature: they were assignments for non-institutional actors (journalists, lobbyists, NGOs and other stakeholders), for which a briefing paper was not a suitable format. These 12 assignments were assessed separately, but on the same rationale as in the D-PAC tool. The analysis here focuses exclusively on the larger pool.

  11.

    In fact, it is standard practice for a course to be assessed before, and irrespective of, how students have been graded.


Author information

Correspondence to Pierpaolo Settembri.



Copyright information

© 2018 Springer International Publishing AG

About this chapter


Cite this chapter

Settembri, P., Van Gasse, R., Coertjens, L., De Maeyer, S. (2018). Oranges and Apples? Using Comparative Judgement for Reliable Briefing Paper Assessment in Simulation Games. In: Bursens, P., Donche, V., Gijbels, D., Spooren, P. (eds) Simulations of Decision-Making as Active Learning Tools. Professional and Practice-based Learning, vol 22. Springer, Cham. https://doi.org/10.1007/978-3-319-74147-5_8

  • DOI: https://doi.org/10.1007/978-3-319-74147-5_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-74146-8

  • Online ISBN: 978-3-319-74147-5

  • eBook Packages: Education (R0)
