Abstract
Visual analysis is the predominant method of analysis in single-case research (SCR). However, most research suggests that agreement between visual analysts is poor, which may be due to a lack of clear guidelines and criteria for visual analysis, as well as variability in how individuals are trained. We developed a survey containing questions about the content and methods used to teach visual and statistical analysis of SCR data in Verified Course Sequences (VCSs) and distributed it via the VCS Coordinator Listserv. Thirty-seven instructors completed the survey. Results suggest that there is variability across instructors in some fundamental aspects of data analysis (e.g., the number of effects required for a functional relation) but a great deal of consistency in others (e.g., emphasizing visual over statistical analysis). We discuss our results along with their implications both for teaching students to analyze SCR data and for conducting additional research on behavior-analytic training programs.
Ethics declarations
Conflict of interest
Conflict of interest statement removed from manuscript for blinding purposes.
Ethical Approval
The study procedures were reviewed and approved by the University of South Carolina Institutional Review Board.
Informed Consent
Informed consent was obtained from all individual participants included in the study.
Cite this article
Wolfe, K., McCammon, M.N. The Analysis of Single-Case Research Data: Current Instructional Practices. J Behav Educ 31, 28–42 (2022). https://doi.org/10.1007/s10864-020-09403-4