Interim Assessment Practices and Avenues for State Involvement

This report was commissioned by the Technical Issues in Large-Scale Assessment (TILSA) SCASS (State Collaborative on Assessment and Student Standards) of the Council of Chief State School Officers. The report defines interim assessment, differentiates it from formative assessment, and uses case examples to describe how interim assessment can be applied. It highlights jurisdictions where interim assessment appears to have positive effects and identifies constructive roles states can play with regard to interim assessment.

This report is intended to:

  • Disseminate information about what interim assessment is and how it can be used,
  • Clarify differences between interim and formative assessment,
  • Provide examples of state educational agencies’ (SEAs’) involvement and roles with interim assessments, and
  • Highlight educational agencies or jurisdictions where interim assessment appears to be producing benefits.
Content Comments 

Of particular interest, section 3 (pp. 7–8) lays out the roles state agencies may play in supporting (or not supporting) districts’ use of interim assessment. The list is ordered from least to most prescriptive.

  1. Ignore interim assessments. SEAs may take no position on local use of interim assessments.
  2. Provide or support professional development on interim assessment. Generally, only states that fund an item bank or interim assessment system establish policy on the topic and/or provide state training on interim assessment.
  3. Disseminate nonbinding guidance or criteria for districts to use in selecting an interim assessment system. SEAs may issue a fact sheet or information guide on interim assessment, including suggested criteria for selection.
  4. Develop and publish a non-binding consumer’s guide reviewing available interim assessments. The report cites New Mexico, whose SEA clearly stated its role: “The Public Education Department is not endorsing any of the vendors but rather provides a guide to school districts as they are reviewing vendor materials” (New Mexico Public Education Department, 2006, p. 3).
  5. Endorse specific interim products. The report cites Arkansas, Oklahoma, and Texas as examples of states that have encouraged districts to use specific products.
  6. Produce an approved list. Under this option, the SEA not only reviews available interim assessments but, based on that review, identifies which products districts may choose and purchase with state funding.
  7. Facilitate consortia of districts. SEAs can bring together—or support initiatives that bring together—school districts looking to purchase an interim assessment system or construct an item bank. For example, Nebraska, through its School-based, Teacher-led Assessment and Reporting System (STARS), and Rhode Island, through its collaboration with the Rhode Island Skills Commission, have supported district consortia for item development.
  8. Build an item bank as an optional resource for school districts. Arizona, Georgia, Louisiana, Mississippi, Texas, and Utah all support item banks that local educators can use to construct interim assessments.
  9. Require district and school use of interim assessment. Either through a state system or through local tests, the SEA may mandate the use of interim assessments for all schools or for high-priority schools (such as schools in improvement). For example, South Carolina and Georgia require the use of interim assessments as part of their school and district improvement processes.

Although dated, the report's analysis of the use and benefits of interim assessment in seven jurisdictions is worth reading, and its analysis of possible state roles in implementing interim assessment remains timely.