
7 - Helping Teachers Use Progress Monitoring Data for Intervention Decisions

from Part II - Teacher- and System-Level Interventions

Published online by Cambridge University Press: 18 September 2020

Frank C. Worrell, University of California, Berkeley
Tammy L. Hughes, Duquesne University, Pittsburgh
Dante D. Dixson, Michigan State University

Summary

Progress monitoring is an important part of any prevention and intervention model. The data can be used not only to evaluate whether an intervention is working but also to suggest modifications to it. This chapter defines progress monitoring and presents data-based decision making, formative evaluation, and psychometric theory as its foundations. We also discuss the role of general outcome measures and subskill mastery measures in light of the last 25 years of research. Finally, the chapter discusses the role of progress monitoring in a prevention model by providing an overview of decision-making models and of how the data can be used to intensify interventions.
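The decision-making models surveyed in this literature typically compare a student's observed rate of growth against the rate of growth needed to reach a goal (an "aim line"). As an illustration only, and not the chapter's own procedure, the sketch below fits an ordinary-least-squares slope to weekly curriculum-based measurement scores and flags whether growth is on pace; the score values, goal, and function names are invented for the example.

```python
def ols_slope(scores):
    """Ordinary-least-squares slope of scores over equally spaced weeks (0, 1, 2, ...)."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def on_track(scores, baseline, goal, total_weeks):
    """Is the observed weekly growth at least the growth needed to reach the goal?"""
    needed_per_week = (goal - baseline) / total_weeks  # slope of the aim line
    return ols_slope(scores) >= needed_per_week

# Hypothetical data: six weekly oral-reading-fluency scores, goal of 60 in 15 weeks.
weekly_scores = [22, 24, 27, 29, 33, 35]
print(on_track(weekly_scores, baseline=22, goal=60, total_weeks=15))  # → True
```

A flat or shallow trend line relative to the aim line would instead signal that the intervention should be modified or intensified, which is the kind of decision rule the chapter's overview addresses.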

Publisher: Cambridge University Press
Print publication year: 2020


