Monan Professor in Education
Executive Director, TIMSS & PIRLS International Study Center
Dr. von Davier’s research focuses on developing psychometric models for analyzing data from complex item and respondent samples and on integrating diagnostic procedures into these methods. His areas of expertise include item response theory, latent class analysis, classification and mixture distribution models, diagnostic models, computational statistics, person fit, item fit, and model checking, as well as hierarchical extensions of models for categorical data analysis and the analytical methodologies used in large-scale educational surveys.
Dr. von Davier’s applied research uses these methodologies to analyze data from educational testing, large-scale survey assessments of student skills and adult literacy, computer-based assessments of skills, and questionnaires.
von Davier, M. & Lee, Y. S. (2019). Handbook of Diagnostic Classification Models. Springer: New York. https://www.springer.com/us/book/9783030055837
Bennett, R. & von Davier, M. (2017). Shaping the Landscape of Educational Measurement and Evaluation. Springer Book Series: Methodology of Educational Measurement and Assessment. 1st ed. 2017, XIV, 711 p. 15 illus. ISBN 978-3-319-58689-2
Rutkowski, L., von Davier, M., & Rutkowski, D. (2014). Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis. CRC Press (Chapman & Hall).
von Davier, M. & Hastedt, D. (2012). IERI Monograph Series – Issues and Methodologies in Large Scale Assessments – Volume V. (Editor). IEA/ETS Research Institute (IERI): Hamburg, Princeton. ISBN: 978-088685411-9
von Davier, M., Gonzalez, E., Kirsch, I., & Yamamoto, K. (2012). The Role of International Large-Scale Assessments: Perspectives from Technology, Economy, and Educational Research. Springer: New York.
von Davier, M. & Hastedt, D. (2011). IERI Monograph Series – Issues and Methodologies in Large Scale Assessments – Volume IV. (Editor). IEA/ETS Research Institute (IERI): Hamburg, Princeton. ISBN: 978-088685411-9
von Davier, M. (2011, Guest Editor). Special Issue on Methodological Advances in Educational and Psychological Testing. Psychological Test and Assessment Modeling.
von Davier, M. & Hastedt, D. (2010). IERI Monograph Series – Issues and Methodologies in Large Scale Assessments – Volume III. (Editor). IEA/ETS Research Institute (IERI): Hamburg, Princeton. ISBN: 978-0886854096-8
von Davier, M. & Hastedt, D. (2009). IERI Monograph Series – Issues and Methodologies in Large Scale Assessments – Volume II. (Editor of the volume). IEA/ETS Research Institute (IERI): Hamburg, Princeton. ISBN: 978-0886-85404-1
von Davier, M. & Hastedt, D. (2008). IERI Monograph Series – Issues and Methodologies in Large Scale Assessments – Volume I. (Editor of the volume). IEA/ETS Research Institute (IERI): Hamburg, Princeton. ISBN: 978-0886-85402-7
von Davier, M. & Carstensen, C. H. (2007). Multivariate and Mixture Distribution Rasch Models – Extensions and Applications. (Editor of the volume). Springer: New York. ISBN: 978-0387-32916-1
von Davier, M., Yamamoto, K., Shin, H.-J., Chen, H., Khorramdel, L., Weeks, J., Davis, S., Kong, N., & Kandathil, M. (2019). Evaluating item response theory linking and model fit for data from PISA 2000–2012. Assessment in Education: Principles, Policy & Practice. DOI: 10.1080/0969594X.2019.1586642
Santos, K., de la Torre, J. & von Davier, M. (in press). Adjusting person fit index for skewness in cognitive diagnosis modeling. Journal of Classification.
von Davier, M., Cho, Y., & Pan, T. (2019). Effects of Discontinue Rules on Psychometric Properties of Test Scores. https://doi.org/10.1007/s11336-018-09652-3
Pohl, S. & von Davier, M. (2018). Commentary: On the Importance of the Speed-Ability Trade-Off When Dealing With Not-Reached Items. Frontiers in Psychology, 9. DOI: 10.3389/fpsyg.2018.01988
von Davier, M. (2018). Automated Item Generation with Recurrent Neural Networks. Psychometrika. DOI: 10.1007/s11336-018-9608-y
von Davier, M. (2018) Diagnosing Diagnostic Models: From von Neumann’s Elephant to Model Equivalencies and Network Psychometrics, Measurement: Interdisciplinary Research and Perspectives, 16:1, 59-70, DOI: 10.1080/15366367.2018.1436827
Kirsch, I.S., Thorn, W., & von Davier, M. (2018). Introduction to the Special Issue on Survey Quality, Quality Assurance in Education, https://doi.org/10.1108/QAE-01-2018-0010
von Davier, M. (2018). Detecting and treating errors in tests and surveys, Quality Assurance in Education, https://doi.org/10.1108/QAE-07-2017-0036
Braun, H. & von Davier, M. (2017). The use of test scores from large-scale assessment surveys: psychometric and statistical considerations. Large-scale Assess Educ (2017) 5: 17. https://doi.org/10.1186/s40536-017-0050-x
von Davier, M., Shin, H.-J., Khorramdel, L., & Stankov, L. (2017). The Effects of Vignette Scoring on Reliability and Validity of Self-Reports. Applied Psychological Measurement. First published September 27, 2017. https://doi.org/10.1177/0146621617730389
Rose, N., von Davier, M., Nagengast, B. (2017). Modeling omitted and not-reached items in IRT models. Psychometrika, 82, 795-819. doi:10.1007/s11336-016-9544-7
Yamamoto, K., He, Q., Shin, H. J., & von Davier, M. (2017). Developing a Machine-Supported Coding System for Constructed-Response Items in PISA. ETS Research Report Series, 2017: 1-15. DOI: 10.1002/ets2.12169
Stankov, L., Lee, J., & von Davier, M. (2017). A Note on Construct Validity of the Anchoring Method in PISA 2012. Journal of Psychoeducational Assessment. First published April 4, 2017. DOI: 10.1177/0734282917702270
Weeks, J. P., von Davier, M., & Yamamoto, K. (2016). Using response time data to inform the coding of omitted responses. Special Issue: Current Methodological Issues in Large-Scale Assessments. Psychological Test and Assessment Modeling, Volume 58, 2016 (4), 671-701.
von Davier, M. (2016), High-Performance Psychometrics: The Parallel-E Parallel-M Algorithm for Generalized Latent Variable Models. ETS Research Report Series, 2016: 1–11. doi:10.1002/ets2.12120
von Davier, M., Yamamoto, K., Shin, H.-J., Chen, H., Weeks, J., Davis, S., Kong, N., & Kandathil, M. (2016). Evaluating IRT Linking and Model Fit for Data from PISA 2000-2012. ETS Research Report RR-XX-16, in press.
Rose, N., von Davier, M., Nagengast, B. (2015). Commonalities and differences in IRT-based methods for nonignorable item nonresponses. Psychological Test and Assessment Modeling, 57, 472-498.
Distinguished Research Scientist: Center for Advanced Assessment, National Board of Medical Examiners. Responsibilities include psychometric research, mentoring, and editorial activities; developing methodologies and research agendas for advanced computer-delivered assessments. January 2017 – present.
Senior Research Director: Center for Global Assessment. Leading the center’s research agenda and methodology development and directing its operational psychometric work. Oversight of the research agenda, managing staff, budgeting. Projects involve innovations in applied analysis and the primary statistical methodologies used in international survey assessments such as PISA, TIMSS, PIRLS, and PIAAC. October 2014 – present.
Director, Research: International survey assessment research group, Center for Global Assessment, ETS. Oversight of the research agenda, managing staff, budgeting. Projects involve innovations in applied analysis and the primary statistical methodologies used in international survey assessments such as PISA, TIMSS, PIRLS, and PIAAC. September 2013 – present.
Co-Principal Investigator: IES methodology grant on ‘21st Century Models for 21st Century Large Scale Assessments’. Allocation of funds ($1.5 million) to researchers, management of funds and staff, 2011 – 2014.
Co-Leader: ETS research initiative on foundational psychometric and statistical research. Allocation of annual funds ($1.1 million) to ETS researchers based on a proposal review process; management of funds and staff, 2006 – 2013.
Member Project Management Team: International TIMSS and PIRLS assessments, as ETS representative to International Study Center (ISC) at Boston College, 2002 – present.
Technical Director: NAEP Task Order Component. Developing the research agenda with the client (NCES) and oversight of research projects, reporting, timetables, and budgets in this area of NAEP program research, 2006 – 2008.
Manager: Research component in the IEA-ETS Research Institute (IERI), coordinating research projects. Activities include directing the research agenda, oversight of timelines and execution, and coordinating the pipeline for dissemination in the IERI publication series, 2007 – present.
Co-Director: Cross-divisional working group on “Computers as a tool for learning”, Institute for Science Education (IPN) Kiel, 1998 – 2000.
Group Leader: “Evaluation of computer-based instruction in secondary education”, with 4 direct reports and 8 additional adjunct teachers, at the Institute for Science Education (IPN), Kiel University, 1998 – 2000.
BIGDATA: Collaborative Research: IA: F: Latent and Graphical Models for Complex Dependent Data in Education. NSF-funded project, 2017 – 2020, with the Columbia University Department of Statistics, in collaboration with Professors Jingchen Liu and Zhiliang Zhang. Using response times to model and resolve missing values in item response data.
Cooperation and exchange agreement with Professor S. Pohl, Free University of Berlin, Germany. Funded by the German Research Foundation (Deutsche Forschungsgemeinschaft – DFG). 2016 – 2019.
Psychometric Models for 21st-Century Educational Survey Assessments: Funded by the Institute of Education Sciences (IES) Statistics and Methodology Grant program. Role: Co-principal investigator (with Frank Rijmen), 2011 – 2014; total budget $1.2 million.
National Assessment of Educational Progress (NAEP): Technical direction of NAEP Task Order Component (TOC) projects. Developing proposals, budgeting, and supervision of TOC projects by NAEP research staff, 2005 – 2009.
A tool for improved precision reporting in secondary analysis of national and state level NAEP data. National Assessment of Educational Progress (NAEP) Secondary Analysis Grant. 8/2002- 1/2004.
Project BLK-PISA – analysis of SINUS (improving science education) project schools assessed with PISA 2000 instruments. Research and analysis project at ETS as a subcontractor of the Institute for Science Education (IPN), Germany, 1/2002 – 4/2002.