Should open-book, open-web exams replace traditional closed-book exams in STEM? An evaluation of their effectiveness in different disciplines

Authors

  • Laura Roberts, Swansea University
  • Joanne Berry, Swansea University

DOI:

https://doi.org/10.47408/jldhe.vi28.1030

Keywords:

STEM, open-book exams, online assessments, closed-book exams

Abstract

The mass shift to Open-Book, Open-Web (OBOW) assessments during the pandemic highlighted new opportunities in Higher Education for developing accessible, authentic assessments that can reduce administrative load. Despite a plethora of research emerging on the effectiveness of OBOW assessments within disciplines, few studies currently evaluate their effectiveness across disciplines, where the assessment instrument can vary significantly. This paper aims to evaluate the experiences of students across STEM subjects with OBOW exams, to contribute to an evidence base for emerging post-pandemic assessment policies and strategies. In April 2021, following two cycles of OBOW exams, we surveyed STEM students across a range of subjects to determine their preparation strategy, experiences during the exam, perception of development of higher order cognitive skills, test anxiety, and how they thought these assessments might enhance employability. Overall, students from subjects that use assessment instruments requiring analytical, quantitative-based answers (Maths, Physics, Computer Science and Chemistry) adapted their existing study skills less effectively, felt less prepared and experienced higher levels of stress compared to students of subjects using more qualitative, discursive-based answers (Biosciences and Geography). We conclude with recommendations on how to enhance the use of OBOW exams: these include supporting and developing more effective study skills, ensuring assessments align with intended learning outcomes, addressing the issue of academic integrity, promoting inclusivity, and encouraging authentic assessment. Based on the outcomes of this study, we strongly advise that assessment policies that foster the wholesale roll-out of OBOW assessment consider the inter-disciplinary impacts on learner development, staff training and workload resources.

Author Biographies

Laura Roberts, Swansea University

Laura Roberts is a Professor in Biological Sciences who specialises in ecology and conservation. She is currently the Associate Dean for Education in the Faculty of Science and Engineering at Swansea University. Laura is a Principal Fellow of Advance HE and a Member of the Chartered Institute of Ecology and Environmental Management. Her pedagogic research interests are in student employability and assessment and feedback.

Joanne Berry, Swansea University

Joanne Berry is a Professor in Roman History. She specialises in the material culture of Roman Italy, particularly Pompeii and the other cities in the Bay of Naples. Jo is School Education Lead for Swansea University's School of Culture and Communication and the previous Dean of Assessment and Feedback. She is also a Principal Fellow of Advance HE and has been researching assessment and feedback in higher education, with a current focus on academic integrity.

References

Barber, M., Bird, L., Fleming, J., Titterington-Giles, E., Edwards, E. and Leylands, C. (2021) Gravity assist: Propelling higher education towards a brighter future. Available at: https://www.officeforstudents.org.uk/publications/gravity-assist-propelling-higher-education-towards-a-brighter-future/ (Accessed: 20 March 2021).

Bengtsson, L. (2019) ‘Take-home exams in higher education: a systematic review’, Education Sciences, 9(4). https://doi.org/10.3390/educsci9040267.

Block, R. M. (2012) ‘A discussion of the effect of open-book and closed-book exams on student achievement in an introductory Statistics course’, PRIMUS, 22(3), pp.228-238. https://doi.org/10.1080/10511970.2011.565402.

Dayananda, R., Patil, M., Manjunath, S.N., Parshuram, R. and Kautilya, V. (2021) ‘Study of students’ perception regarding open book assessment and closed book exams’, Indian Journal of Forensic Medicine and Toxicology, 15(1), pp.946-949. https://doi.org/10.37506/ijfmt.v15i1.13537.

Durning, S. J., Dong, T., Ratcliffe, T., Schuwirth, L., Artino, A. R., Boulet, J. R. and Eva, K. (2016) ‘Comparing open-book and closed-book examinations’, Academic Medicine, 91(4). https://doi.org/10.1097/ACM.0000000000000977.

Ebaid, I. E. S. (2021) ‘Cheating among Accounting students in online exams during Covid-19 pandemic: exploratory evidence from Saudi Arabia’, Asian Journal of Economics, Finance and Management, 4, pp.9-19. Available at: https://globalpresshub.com/index.php/AJEFM/article/view/1068 (Accessed: 21 June 2021).

Eurboonyanun, C., Wittayapairoch, J., Aphinives, P., Petrusa, E., Gee, D. W. and Phitayakorn, R. (2021) ‘Adaptation to open-book online examination during the Covid-19 pandemic’, Journal of Surgical Education, 78(3), pp.737-739. https://doi.org/10.1016/j.jsurg.2020.08.046.

Gharib, A., Phillips, W. and Mathew, N. (2012) ‘Cheat sheet or open-book? A comparison of the effects of exam types on performance, retention, and anxiety’, Journal of Psychology Research, 2(8). https://doi.org/10.17265/2159-5542/2012.08.004.

Goothy, S. S. K., Suphal, S., Bandaru, T. S., Movva, S., Manyam, R. and Raju, V. R. (2019) ‘Comparison of academic performance and stress levels in open book test and closed book test and perceptions of undergraduate dental students’, MOJ Anatomy & Physiology, 6(2). https://doi.org/10.15406/mojap.2019.06.00246.

Green, S. G., Ferrante, C. J. and Heppard, K. A. (2016) ‘Using open-book exams to enhance student learning, performance, and motivation’, The Journal of Effective Teaching, 16(1), pp.19-35.

Gu, S., Yuan, W., Zhang, A., Huo, G., Jiang, M., Han, J. and Shen, N. (2022) ‘Online re-examination of postgraduate medical students during the Covid-19 pandemic’, BMC Medical Education, 22(1), 42. https://doi.org/10.1186/s12909-022-03100-8.

Gulikers, J. T. M., Bastiaens, T. J., Kirschner, P. A. and Kester, L. (2006) ‘Relations between student perceptions of assessment authenticity, study approaches and learning outcome’, Studies in Educational Evaluation, 32. https://doi.org/10.1016/j.stueduc.2006.10.003.

Heijne-Penninga, M., Kuks, J. B. M., Hofman, W. H. A. and Cohen-Schotanus, J. (2011) ‘Directing students to profound open-book test preparation: the relationship between deep learning and open-book test time’, Medical Teacher, 33(1), pp.16-21. https://doi.org/10.3109/0142159X.2011.530315.

Jisc. (2020) Assessment rebooted from 2020’s quick fixes to future transformation. From fixes to foresight: Jisc and Emerge Education insights for universities and startups. Available at: https://repository.jisc.ac.uk/7854/1/assessment-rebooted-report.pdf (Accessed: 6 November 2020).

Jisc. (2021) The future of assessment: five principles, five targets for 2025. Available at: https://www.jisc.ac.uk/reports/the-future-of-assessment (Accessed: 8 June 2021).

Johanns, B., Dinkens, A. and Moore, J. (2017) ‘A systematic review comparing open-book and closed-book examinations: evaluating effects on development of critical thinking skill’, Nurse Education in Practice, 27. https://doi.org/10.1016/j.nepr.2017.08.018.

López, J. and Whittington, M. S. (2001) ‘Higher-order thinking in a college course: a case study’, NACTA Journal, 45(4), pp.22-29.

Lukasik, K. M., Waris, O., Soveri, A., Lehtonen, M. and Laine, M. (2019) ‘The relationship of anxiety and stress with working memory performance in a large non-depressed sample’, Frontiers in Psychology, 10(4), pp.1-9. https://doi.org/10.3389/fpsyg.2019.00004.

Maguire, D., Dale, L. and Pauli, M. (2020) ‘Learning and teaching reimagined: a new dawn for higher education? Exploring the 2020 experience as well as the changing aspirations of the nature and shape of learning and teaching for the future’. Available at: https://repository.jisc.ac.uk/8150/1/learning-and-teaching-reimagined-a-new-dawn-for-higher-education.pdf (Accessed: 31 August 2023).

Malone, D. T., Chuang, S., Yuriev, E. and Short, J. L. (2021) ‘Effect of changing from closed-book to formulary-allowed examinations’, American Journal of Pharmaceutical Education, 85(1). https://doi.org/10.5688/ajpe7990.

Michael, K., Lyden, E. and Custer, T. (2019) ‘Open-book examinations (OBEs) in an Ultrasound Physics course: a good idea or a bad experiment?’, Journal of Diagnostic Medical Sonography, 35(3). https://doi.org/10.1177/8756479318821075.

Morrison, A. B. and Richmond, L. L. (2020) ‘Offloading items from memory: individual differences in cognitive offloading in a short-term memory task’, Cognitive Research, 5(1). https://doi.org/10.1186/s41235-019-0201-4.

Myyry, L. and Joutsenvirta, T. (2015) ‘Open-book, open-web online examinations: Developing examination practices to support university students’ learning and self-efficacy’, Active Learning in Higher Education, 16(2). https://doi.org/10.1177/1469787415574053.

Parker, A., Watson, E., Dyke, N. and Carey, J. (2021) ‘Traditional versus open-book exams in remote course delivery: a narrative review of the literature’, in Proceedings 2021 Canadian Engineering Education Association (CEEA-ACEG21) Conference, Charlottetown, PE, Canada, 21-23 June 2021, pp.1-7.

Rahul, K. (2020) ‘Assessing higher education in the Covid-19 era’, Brock Education: A Journal of Educational Research and Practice, 29(2), pp.37-41. https://doi.org/10.26522/brocked.v29i2.841.

Robinson, O. J., Vytal, K., Cornwell, B. R. and Grillon, C. (2013) ‘The impact of anxiety upon cognition: perspectives from human threat of shock studies’, Frontiers in Human Neuroscience, 7(203). https://doi.org/10.3389/fnhum.2013.00203.

Spiegel, T. and Nivette, A. (2021) ‘The relative impact of in-class closed-book versus take-home open-book examination type on academic performance, student knowledge retention and wellbeing’, Assessment & Evaluation in Higher Education, 48(1), pp.1-14. https://doi.org/10.1080/02602938.2021.2016607.

Stowell, J. R. (2015) ‘Online open-book testing in face-to-face classes’, Scholarship of Teaching and Learning in Psychology, 1(1). https://doi.org/10.1037/stl0000014.

Theophilides, C. and Koutselini, M. (2000) ‘Study behavior in the closed-book and the open-book examination: a comparative analysis’, Educational Research and Evaluation, 6(4). https://doi.org/10.1076/edre.6.4.379.6932.

Tyng, C. M., Amin, H. U., Saad, M. N. M. and Malik, A. S. (2017) ‘The influences of emotion on learning and memory’, Frontiers in Psychology, 8, 1454. https://doi.org/10.3389/fpsyg.2017.01454.

Vazquez, J. J., Chiang, E. P. and Sarmiento-Barbieri, I. (2021) ‘Can we stay one step ahead of cheaters? A field experiment in proctoring online open book exams’, Journal of Behavioral and Experimental Economics, 90, 101653. https://doi.org/10.1016/j.socec.2020.101653.

Williams, J. B. and Wong, A. (2009) ‘The efficacy of final examinations: a comparative study of closed-book, invigilated exams and open-book, open-web exams’, British Journal of Educational Technology, 40(2). https://doi.org/10.1111/j.1467-8535.2008.00929.x.

Williamson, M. H. (2018) ‘Online exams: the need for best practices and overcoming challenges’, The Journal of Public and Professional Sociology, 10(1).

Zhang, L. Y., Petersen, A. K., Liut, M., Simion, B. and Alaca, F. (2021) ‘A multi-course report on the experience of unplanned online exams’, in Proceedings of the 52nd ACM Technical Symposium on Computer Science Education. Online, USA 13-20 March. New York: Association for Computing Machinery, pp.17-23. https://doi.org/10.1145/3408877.3432515.

Published

24-09-2023

How to Cite

Roberts, L. and Berry, J. (2023) “Should open-book, open-web exams replace traditional closed-book exams in STEM? An evaluation of their effectiveness in different disciplines”, Journal of Learning Development in Higher Education, (28). doi: 10.47408/jldhe.vi28.1030.

Issue

Section

Papers