Investigating Single-Case Experimental Designs

Resulting Publications:

Moeyaert, M., Dehghan-Chaleshtori, M., Xu, X., & Yang, P. (2023). Single-case design meta-analyses in education and psychology: a systematic review of methodology. Frontiers in Research Metrics and Analytics, 8.

Moeyaert, M., & Dehghan-Chaleshtori, M. (2022). Evaluating the design and evidence of single-case experimental designs using the What Works Clearinghouse standards. Evidence-Based Communication Assessment and Intervention, 1-19.

Summary of the Work:

In response to the evolving landscape of educational and psychological research, this project pursued a dual investigation aimed at advancing methodological standards and improving evidence synthesis in single-case experimental designs (SCEDs). Our research team examined meta-analysis in education and psychology, recognizing its potential to inform evidence-based decisions in policy, practice, and theory. In a systematic review of 18 reports, we mapped the landscape of single-case design meta-analyses and identified key methodological trends and empirical evidence. The analysis yielded insights into meta-analytic techniques, data generation and analysis models, design conditions, and statistical properties, offering guidance for methodologists and applied researchers alike.
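To make the modeling side concrete, the following is a minimal sketch, not code from any of the reviewed studies, of the two-level approach commonly used in SCED meta-analysis: repeated measurements (level 1) nested within cases (level 2), with a phase indicator carrying the treatment effect. The simulated data, effect sizes, and model specification are all illustrative assumptions (Python with statsmodels):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)

    # Simulate 6 hypothetical cases, each with 10 baseline and 10
    # intervention sessions; the true treatment effect varies by case.
    rows = []
    for case in range(6):
        true_effect = rng.normal(2.0, 0.5)
        for session in range(20):
            phase = int(session >= 10)  # 0 = baseline, 1 = intervention
            score = 5 + true_effect * phase + rng.normal(0, 1)
            rows.append({"case": case, "session": session,
                         "phase": phase, "score": score})
    data = pd.DataFrame(rows)

    # Two-level model: random intercept and random treatment effect
    # across cases. The fixed 'phase' coefficient estimates the average
    # intervention effect across cases.
    model = smf.mixedlm("score ~ phase", data, groups=data["case"],
                        re_formula="~phase")
    print(model.fit().summary())

The same nesting logic extends to a third level (cases within studies) when combining effects across publications, which is one of the design conditions the review examined.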

We also examined the implications of the updated What Works Clearinghouse (WWC) standards for SCED studies, asking how they affect the quality rating of designs and the synthesis of evidence. Comparing the previous standards with Version 4.1, we found notable differences in evidence analysis, synthesis, and reporting, while design quality ratings remained consistent. A detailed reanalysis of a selected publication on Pivotal Response Training illustrated how evidence synthesis changes under the Version 4.1 standards. The findings underscore the importance of ongoing dialogue, collaboration, and adaptation to evolving standards in ensuring reliable and valid evidence synthesis in SCED studies. By bridging theory with practice, this research aims to help stakeholders in education and psychology make informed decisions grounded in robust empirical evidence.
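For a flavor of how design-quality rating works under the WWC single-case standards, here is a hypothetical sketch for a reversal (ABAB) design. The thresholds paraphrase commonly cited criteria, at least four phases so the design can show three demonstrations of an effect, and at least five data points per phase (three to meet with reservations); the actual Version 4.1 rules are more detailed, so treat this as an illustration rather than an implementation of the standards:

    from typing import List

    def rate_reversal_design(phase_lengths: List[int]) -> str:
        """Roughly rate an ABAB-style phase sequence (illustrative only)."""
        if len(phase_lengths) < 4:
            # Fewer than four phases cannot provide three demonstrations
            # of an intervention effect at three points in time.
            return "Does Not Meet Design Standards"
        if all(n >= 5 for n in phase_lengths):
            return "Meets Design Standards Without Reservations"
        if all(n >= 3 for n in phase_lengths):
            return "Meets Design Standards With Reservations"
        return "Does Not Meet Design Standards"

    print(rate_reversal_design([6, 7, 5, 6]))  # Without Reservations
    print(rate_reversal_design([4, 3, 5, 4]))  # With Reservations
    print(rate_reversal_design([5, 6, 5]))     # Does Not Meet

Because design ratings like these stayed stable across versions while the evidence-synthesis rules changed, the study's comparison focused on the synthesis step.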

Significance of the Work:

This project addresses critical gaps in methodological standards and evidence synthesis in education and psychology. By systematically reviewing single-case design meta-analyses and examining the implications of the updated What Works Clearinghouse (WWC) standards, it strengthens the rigor and reliability of research methodologies and evidence-based practices. Its significance extends beyond academia: the work directly informs decision-making in education and mental health interventions, ultimately influencing the well-being and success of individuals and communities nationwide.

The project also fills a pressing need for methodological guidance and evidence evaluation in SCEDs, which are widely used in educational and psychological research settings. By identifying trends, best practices, and areas for improvement in meta-analytic techniques, it equips researchers, practitioners, and policymakers to make decisions based on robust empirical evidence. Examining the impact of updated standards from the WWC further helps keep evidence synthesis relevant, transparent, and reliable as the research landscape evolves.
