General Guidance for Assessing Your Students

From the University of Oregon and Amplify

As you plan for the new school year, we at Amplify and the team at the University of Oregon are here to provide continued guidance and support around collecting and using DIBELS 8th Edition data. In this guide, we offer recommendations for beginning-of-year (BOY) benchmark assessment with DIBELS, as well as tips for interpreting benchmark data given the ongoing, widespread disruptions to schooling.

  • What to assess
  • When to assess
  • Recommended administration procedure
  • Interpreting results
  • Instructional planning
  • Progress monitoring

What to assess

In general, we recommend that you assess students using all benchmark measures at the beginning of year (BOY). Administering all measures will give you access to both composite scores and individual subtest scores, which provide a clearer picture of your students’ level of risk and the effects of COVID-19-related academic disruptions. If it is not feasible to administer all benchmark measures, or you need to reduce assessment time, we recommend administering one key measure for each grade:

Grade level(s)    BOY    MOY    EOY
Kindergarten      LNF    NWF    NWF
Grade 1           NWF    ORF    ORF
Grades 2-8        ORF    ORF    ORF

(BOY = beginning of year; MOY = middle of year; EOY = end of year. LNF = Letter Naming Fluency; NWF = Nonsense Word Fluency; ORF = Oral Reading Fluency.)

If you decide to administer only one or a subset of these benchmark measures, note that composite scores won’t be generated, and these measure-level results won’t be available in the Reporting and Analysis Suite for aggregate reporting purposes.

When to conduct BOY benchmark assessments

Many schools are planning for atypical setups in the fall, which will require teaching students new protocols and procedures (e.g., a hybrid of remote and in-person learning, social distancing procedures). When possible, give students time to adapt to these new routines before testing. We recommend beginning assessment after students have been in school for approximately 4 weeks.

Recommended administration procedures 

When possible, we recommend administering DIBELS benchmark testing in person. However, we know that standard in-person administration of DIBELS 8th Edition measures may not be possible for many schools if partially or fully remote operations are necessary at the start of the school year. See mCLASS guidance on remote administration.

We recognize that even if you are able to administer DIBELS in person, you may need to adapt administration to adhere to your state’s social distancing guidelines. If you plan to administer DIBELS from a physical distance, we advise you to consider the following:

  • Record testing sessions. Assessing from a distance may make it difficult for you to hear student responses accurately. When permitted in your setting, we recommend that you audio record student responses to allow for rescoring if you are unsure about a student’s response. 
  • Acknowledge the atypical setup. Reading to an assessor who’s across the room may be a new experience for children, so we recommend directly acknowledging the unusual nature of the situation before testing. We also recommend prompting students to read as loudly and clearly as they can before you begin.
  • Decide how you will manage materials. Because you will be testing from a distance, it may be more difficult to manage student materials and provide visual prompts (e.g., pointing to a word). Plan in advance how you will present, position, and collect materials so the administration runs smoothly.

Interpreting 2020 DIBELS benchmark results

As always, remember that DIBELS results should be used to make informed instructional decisions. When deciding how best to assess your students, consider the extent to which their results will be used to help you make decisions related to the assessment’s primary purposes: to identify student risk, to monitor student progress, and to inform instructional decision making. If you assess students using a combination of in-person and remote testing procedures, comparisons between student scores should be considered with the mode of test administration in mind.

Given that students have experienced two or more months of atypical instruction, expect most students’ reading scores to be lower than you’d typically see at the beginning of the school year. Remember, even students who consistently received remote instruction during the spring still experienced a significant disruption to their normal routine, which likely affected their ability to access instruction to some degree. Because of this disruption, you will likely see more students than usual fall in the “some risk” and “at risk” ranges based on their BOY benchmark scores. If this is the case, take a deep breath and remember that similar slumps occur every year over the summer when students are out of school; in 2020, the slump is simply expected to be greater because of COVID-19 disruptions.

For the majority of students, several months of receiving appropriate, high-quality reading instruction should help get them back on track for reading success, as is usually the case during a typical school year.

Instructional planning considerations

Depending on the number of students in your class or grade who demonstrate reading risk, you may need to adjust how you plan and implement core, strategic, or intensive instruction.

Adjusting core instruction

If many of your students fall in the “some risk” and “at risk” range, you may want to consider how to address this issue at the classroom or grade level, rather than just at the individual student level. For example, consider whether you can address some of these reading difficulties during whole group instruction. Talk to students’ classroom teachers from the previous school year—were they unable to cover specific instructional concepts with students because of school closures? Many core curricula include review content for the first several weeks of lessons. Consider spending time teaching this review content to students before moving on to new material. Or supplement grade-level core instruction with short lessons targeting student skill gaps. For example, if you notice in mCLASS Instruction that many students are making errors with a skill that should have been taught previously (e.g., the suffix -ed), spend five minutes at the beginning of your core reading lesson re-introducing the skill to students using the recommended activities for those skill gaps in mCLASS Instruction.

Additionally, Amplify is offering free instructional resources that fill previous years’ instructional gaps in foundational skills for grades 1-3. In some cases, it may make sense to begin the year by teaching students content from the previous year that they did not receive. However, the goal should always be for students to read and learn grade-level content; any plans to use below-grade-level curricula should be paired with a plan for fast-tracking students through lessons until they are able to transition to on-grade material.

Supplemental intervention

If many students demonstrate reading risk, consider other data sources in addition to their BOY benchmark scores to prioritize supplemental intervention for those who need it most. Some additional data sources you may want to consider include:

  • DIBELS scores from a previous benchmark period, prior to instructional disruptions. What was the student’s level of risk before school closures? We may be more concerned about a student with a history of reading difficulties than a student who was doing just fine. 
  • DIBELS subtest scores in addition to composite scores. Is the student demonstrating risk on many different subtests, or just a couple of subtests? Can the core instruction you provide address their needs, or do they require something more targeted? It may be more important to prioritize supplemental instruction for students when you know they have skill gaps that core instruction will not address. For example, if a third-grade student is in the “at risk” range on NWF and you do not regularly teach phonics in core instruction, you may want to consider giving the student supplemental phonics support to bolster their skills with letter-sound correspondences and blending.
  • Performance on class assignments and during instruction. Does the student struggle to keep up with the content being taught during core instruction? We are more concerned about a student who consistently makes errors during core instruction than a student who appears to be mastering the content taught.
  • Percentile ranks. How does this student’s performance compare to the performance of other students in your class/grade/district? If many students perform below the benchmark cut score, you may need to assign students to supplemental instruction based on your available resources and focus on the lowest-performing 25% of students in the grade. Local percentiles will be added to mCLASS teacher reporting during SY2020.
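If your reporting does not yet show local percentiles, the brief Python sketch below illustrates one way to approximate local percentile ranks from exported benchmark scores and flag the lowest-performing 25% of a grade. This is only an illustration, assuming you can export scores as simple (student, score) pairs; the student names and scores shown are made up, and nothing here reflects how mCLASS itself computes percentiles.

    # Illustrative sketch only: approximate local percentile ranks from a
    # list of (student, composite score) pairs and flag the lowest 25%.
    # The data below are made-up examples, not real DIBELS scores.

    def local_percentiles(scores):
        """Return {student: percentile rank} for (student, score) pairs."""
        n = len(scores)
        return {student: round(100 * sum(1 for _, other in scores if other < score) / n)
                for student, score in scores}

    def lowest_quartile(scores):
        """Return students whose local percentile rank falls below 25."""
        return [s for s, p in local_percentiles(scores).items() if p < 25]

    grade_scores = [("A. Rivera", 312), ("B. Chen", 289), ("C. Ortiz", 341),
                    ("D. Patel", 260), ("E. Kim", 300), ("F. Lopez", 275),
                    ("G. Nguyen", 330), ("H. Smith", 255)]
    print(lowest_quartile(grade_scores))  # ['D. Patel', 'H. Smith']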

Monitoring student progress

Regardless of which students you choose to provide with supplemental and intensive instruction, we recommend monitoring the progress of all students who fall in the “some risk” or “at risk” range on DIBELS to ensure no students fall through the cracks. For students who receive supplemental or intensive intervention, use the progress monitoring recommendation in the mCLASS assessment application. The complete progress monitoring recommendations are included in the DIBELS 8th Edition Administration and Scoring Guide, available on the University of Oregon site. Given the out-of-the-ordinary circumstances, you may find that some students who receive supplemental intervention make quick progress and can be transitioned out of supplemental instruction so that others are able to benefit from it.

For students who do not receive these supports but are still at some risk, consider administering an extra progress monitoring probe between the beginning- and middle-of-year benchmark periods. Monitoring these students’ performance between benchmark windows will let you tell quickly whether a student is falling behind and needs supplemental intervention.

The complete BOY testing guidance from the University of Oregon is available on their site.


© 2021 Amplify Education, Inc. 55 Washington St #800, Brooklyn, NY 11201.