
Annals of Computer Science and Information Systems, Volume 11

Proceedings of the 2017 Federated Conference on Computer Science and Information Systems

Semi-real-time analyses of item characteristics for medical school admission tests


DOI: http://dx.doi.org/10.15439/2017F380

Citation: Proceedings of the 2017 Federated Conference on Computer Science and Information Systems, M. Ganzha, L. Maciaszek, M. Paprzycki (eds). ACSIS, Vol. 11, pages 189–194.


Abstract. University admission exams belong to the so-called high-stakes tests, i.e. tests with important consequences for the test taker. Given the importance of the admission process for both the applicant and the institution, routine evaluation of the admission tests and their items is desirable. In this work, we introduce a quick and efficient methodology and an online tool for semi-real-time evaluation of admission exams and their items based on classical test theory (CTT) and item response theory (IRT) models. We generalize some of the traditional item analysis concepts to tailor them to the specific purposes of the admission test. Using the example of a medical school admission test, we demonstrate how an R-based web application may simplify the admissions evaluation workflow and guarantee quick accessibility of the psychometric measures. We conclude that the presented tool is convenient for the analysis of any admission test, or of educational tests in general.
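For readers unfamiliar with the classical test theory measures mentioned in the abstract, the following minimal sketch shows how item difficulty, corrected item-total correlation, and an upper-lower discrimination index can be computed in base R. It uses a hypothetical simulated 0/1-scored response matrix and is an illustration only, not the authors' implementation.

  # Illustrative sketch: CTT item statistics on a hypothetical scored data set.
  # Rows are applicants, columns are items; entries are 1 (correct) or 0 (incorrect).
  set.seed(42)
  responses <- matrix(rbinom(500 * 20, 1, 0.6), nrow = 500, ncol = 20)

  total <- rowSums(responses)

  # Item difficulty: proportion of applicants answering each item correctly.
  difficulty <- colMeans(responses)

  # Corrected item-total correlation: correlation of each item with the total
  # score computed from the remaining items.
  discrimination <- sapply(seq_len(ncol(responses)), function(i)
    cor(responses[, i], total - responses[, i]))

  # Upper-lower index: difference in proportion correct between the top and
  # bottom thirds of applicants ranked by total score (one common variant).
  groups <- cut(total, quantile(total, c(0, 1/3, 2/3, 1)),
                include.lowest = TRUE, labels = c("lower", "middle", "upper"))
  uli <- colMeans(responses[groups == "upper", , drop = FALSE]) -
         colMeans(responses[groups == "lower", , drop = FALSE])

  round(data.frame(difficulty, discrimination, uli), 3)

In an admission setting, such summaries would typically be recomputed as answer sheets arrive, which is the "semi-real-time" use case the paper's web application addresses.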
