IncluShift Research Brief · 2026-04-18

Indicator 13 at the National Stagnation Rate

Why federal transition-planning compliance has not moved in a decade — and the four field-level checks that close the gap.

By Davit Janunts, M.Ed. Special Education (Lehigh University — Fulbright Foreign Student Program); co-author, Morin, Janunts, et al. (2024), Exceptional Children, 90(2), 145-163, doi:10.1177/00144029231165506.

Summary

IDEA Indicator 13 — the federal measure of whether secondary IEPs include the eight required transition-planning components (34 CFR §300.320(b)) — has hovered in the mid-50-percent range nationally across State Performance Plan / Annual Performance Report (SPP/APR) cycles for more than a decade, despite sustained federal investment in NSTTAC and NTACT:C technical assistance. The eight components are themselves structural — date arithmetic, field-existence checks, and goal-language audits, not pedagogical complexity. This brief sets out why the rate stagnates, the four field-level checks that cleanly distinguish compliant from non-compliant IEPs, and why the operational fix is closer to workflow than to training.

The cost shape — what stagnation means in post-school outcomes

The National Longitudinal Transition Study-2 (Newman et al., 2011, NCSER 2011-3005) followed students who exited special education up to eight years post-high-school. Among the most consistent findings: students whose IEPs included measurable postsecondary goals, transition services explicitly linked to those goals, and documented self-determination instruction were substantially more likely to enroll in postsecondary education, hold competitive integrated employment, and live independently. The effect ran in the same direction across disability category, race, and family income, controlling for measured pre-existing differences.

Mazzotti et al. (2021, Career Development and Transition for Exceptional Individuals, 44(1), 47-64) updated the original Test et al. (2009, Career Development for Exceptional Individuals, 32(2), 115-128) taxonomy of evidence-based predictors of post-school success. Their refined list of 20 predictors — paid work experience, self-determination instruction, parent involvement, inclusion in general-education settings, and others — maps directly onto the eight Indicator 13 components. Districts that miss Indicator 13 components are systematically failing to deliver the predictors the literature most consistently associates with post-school outcomes.

Four field-level checks that distinguish compliant from non-compliant IEPs

1. Age-appropriate transition assessment — completed and documented

The federal regulation requires that postsecondary goals be “based upon age-appropriate transition assessments related to training, education, employment, and where appropriate, independent living skills” (34 CFR §300.320(b)(1)). The structural compliance check is two fields: (a) which assessment was administered (named instrument or structured interview) and (b) when it was administered relative to the IEP date. A checkbox “assessment completed” without a named instrument fails the audit on its face. Test et al. (2009) catalog the most defensible inventories for IDEA contexts; the field is well-developed and the bar for documentation is low.
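The two-field check can be sketched directly. This is a minimal illustration, assuming a flat dictionary record; the field names (`assessment_instrument`, `assessment_date`, `iep_meeting_date`) and the 365-day freshness window are assumptions for illustration, not regulatory requirements.

```python
from datetime import date

def check_transition_assessment(iep: dict, max_age_days: int = 365) -> list[str]:
    """Return Indicator 13 findings for the assessment fields (empty = pass)."""
    findings = []
    instrument = (iep.get("assessment_instrument") or "").strip()
    if not instrument:
        # A bare "assessment completed" checkbox fails: no named instrument.
        findings.append("no named assessment instrument documented")
    administered = iep.get("assessment_date")  # datetime.date or None
    meeting = iep["iep_meeting_date"]          # datetime.date
    if administered is None:
        findings.append("assessment administration date missing")
    elif administered > meeting:
        findings.append("assessment dated after the IEP meeting")
    elif (meeting - administered).days > max_age_days:
        findings.append("assessment predates the IEP by more than the allowed window")
    return findings
```

Both findings are facts about fields, not judgments about the student, which is what makes the check auditable at scale.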

2. Measurable postsecondary goals in all required areas

§300.320(b)(1) requires “appropriate measurable postsecondary goals” in the areas of education or training, employment, and — where appropriate — independent living. The most common Indicator 13 failure is a goal written as a wish (“Student will be successful after high school”) rather than as a measurable target with a verb, an outcome, and a time horizon (“Within one year of graduation, [Student] will enroll in a community-college certificate program in automotive technology”). The check is a regex pass over the goal text plus a confirmation that every required area is populated, including independent living when that need is flagged. Both are field-level checks, not narrative judgment.
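The regex pass can be sketched as follows, under the strong assumption that a short verb list plus a time-horizon pattern approximates “measurable.” The patterns, verb list, and field names here are illustrative only; a production rubric would follow the state’s own Indicator 13 checklist.

```python
import re

MEASURABLE = re.compile(
    r"\bwill\s+(enroll|attend|work|obtain|complete|live|participate)\b", re.IGNORECASE
)
TIME_HORIZON = re.compile(
    r"\b(within|after|upon|by)\b.*\b(graduat\w+|high school|year|month)", re.IGNORECASE
)

def check_postsecondary_goals(iep: dict) -> list[str]:
    """Flag missing or non-measurable goals in each required area."""
    required = ["education_training", "employment"]
    if iep.get("independent_living_flag"):
        required.append("independent_living")
    findings = []
    for area in required:
        text = (iep.get("goals", {}).get(area) or "").strip()
        if not text:
            findings.append(f"{area}: no postsecondary goal documented")
        elif not (MEASURABLE.search(text) and TIME_HORIZON.search(text)):
            findings.append(f"{area}: goal is not written as a measurable target")
    return findings
```

On the two examples in the text, the wish-form goal fails (no measurable verb) while the enrollment goal passes (verb, outcome, and time horizon all present).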

3. Transition services explicitly linked to each postsecondary goal

§300.320(b)(2) requires “transition services (including courses of study) needed to assist the child in reaching those goals.” The structural Indicator 13 audit asks whether every postsecondary goal has at least one named service or instructional activity attached. Mazzotti et al. (2021) note that paid work experience during high school is the single most consistent predictor of post-school employment outcomes; Landmark, Ju & Zhang (2010, Remedial and Special Education, 31(6), 165-176) reach the same conclusion in a meta-synthesis of 21 studies. Whether a paid work-experience activity appears anywhere in the service list is itself a field check.
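The linkage audit reduces to a per-goal membership test plus one keyword scan. The schema (goals and services keyed by goal area) and the keyword list are illustrative assumptions:

```python
# Assumed keyword list for the paid-work scan; a real system would use
# a coded service taxonomy rather than free-text matching.
PAID_WORK_KEYWORDS = ("paid work", "paid employment", "paid internship")

def check_transition_services(iep: dict) -> list[str]:
    """Flag goals with no linked service, and a missing paid-work activity."""
    findings = []
    services = iep.get("services", {})  # goal area -> list of service names
    for area, goal in iep.get("goals", {}).items():
        if goal and not services.get(area):
            findings.append(f"{area}: no transition service linked to the goal")
    flat = " ".join(name.lower() for names in services.values() for name in names)
    if not any(keyword in flat for keyword in PAID_WORK_KEYWORDS):
        findings.append("no paid work-experience activity in the service list")
    return findings
```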

4. Multi-year course of study aligned to the goals

§300.320(b)(2) and the NSTTAC checklist require a multi-year course of study showing that the student is on a credit-bearing path consistent with the postsecondary goals. The audit checks that the IEP contains a multi-year academic plan rather than only the current year, and that the plan ends in the credential the postsecondary goal requires (high-school diploma, certificate of completion, etc.). Districts that maintain a four-year course-of-study artifact pass this check by default; districts that rebuild it annually almost always fail it.
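The course-of-study audit is likewise two field checks. The record shape (`course_of_study`, `goal_credential`, `plan_exit_credential`) and the two-year minimum below are assumptions for illustration; a four-year artifact would pass the span check trivially.

```python
def check_course_of_study(iep: dict, min_years: int = 2) -> list[str]:
    """Flag a single-year plan or a credential mismatch with the goal."""
    findings = []
    plan = iep.get("course_of_study", [])  # list of {"school_year": "...", ...}
    distinct_years = {row.get("school_year") for row in plan}
    if len(distinct_years) < min_years:
        findings.append("course of study does not span multiple years")
    goal_credential = iep.get("goal_credential")       # e.g. "diploma"
    exit_credential = iep.get("plan_exit_credential")  # what the plan ends in
    if goal_credential and exit_credential != goal_credential:
        findings.append("course of study does not end in the credential "
                        "the postsecondary goal requires")
    return findings
```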

Why national compliance stagnates in the mid-50-percent range

Three structural reasons recur across the SPP/APR record and the practitioner literature. First, the form-versus-substance trap: districts complete the boilerplate fields without populating them with substantive content, and self-monitoring is performed by the same case manager who wrote the IEP. Second, the no-feedback loop: Indicator 13 is reported up to OSEP but rarely connected back to the operational case-management workflow that would let case managers see and fix gaps before annual review. Third, generic transition assessment administered too early: an interest inventory administered at age 14 in the absence of any work-based learning rarely produces the specificity Mazzotti et al. (2021) identify as predictive of outcomes.

Wehmeyer, Palmer, Shogren et al. (2012, AJIDD, 117(5), 386-407) and Shogren, Wehmeyer, Palmer et al. (2015, Remedial and Special Education, 36(1), 32-44) establish a causal relationship between self-determination instruction (specifically, the Self-Determined Learning Model of Instruction) and post-school outcomes. Martin, Van Dycke, Christensen et al. (2006, Exceptional Children, 72(3), 299-316) demonstrate that student-led IEP meetings using the Self-Directed IEP curriculum increase student participation in goal-setting, establishing the curriculum as an evidence-based practice. None of these requires new technology. They require that the IEP workflow surface the missing components in time for the annual-review cycle.

Equity guard: the disparity is in the district workflow, not the student

Subgroup compliance gaps in Indicator 13 across race, English-learner status, and disability category are well documented in OSEP’s SPP/APR aggregations. Two observations follow. First, all four field-level checks above audit district output (the IEP, written by adults), not student characteristics — they cannot disparately impact students directly because they do not score students. Second, the equity intervention is the same as the compliance intervention: surface missing components on every IEP, and route them to the case manager before the annual-review meeting. Subgroup gaps close when the underlying compliance gap closes, not because of a different equity workflow.

What changes operationally

The shift is from after-the-fact OSEP-cycle reporting (an annual aggregate, often disconnected from individual case managers) to in-workflow flags inside the IEP-management system. A multi-year course of study, a regex over postsecondary-goal text, the presence of a named transition assessment, and a service linked to each goal — all four are field-level checks the IEP system can run continuously. The output is a per-student queue routed to the case manager who already owns the file, three to six months before the annual-review window, with a one-screen summary of which Indicator 13 components are missing.
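The routing step described above can be sketched as a generic queue builder. The record fields, the 120-day default lead window, and the queue shape are assumptions, not district policy; the individual checks are passed in as callables so any field-level audit can plug in.

```python
from datetime import date, timedelta
from typing import Callable

Check = Callable[[dict], list[str]]  # an IEP record in, findings out

def build_review_queue(ieps: list[dict], checks: list[Check],
                       today: date, lead_days: int = 120) -> list[dict]:
    """Run every field-level check and queue non-compliant IEPs for the
    owning case manager, only inside the pre-annual-review window."""
    queue = []
    for iep in ieps:
        review = iep["annual_review_date"]
        if not (today <= review <= today + timedelta(days=lead_days)):
            continue  # review is past, or too far out to act on yet
        findings = [f for check in checks for f in check(iep)]
        if findings:
            queue.append({"student_id": iep["student_id"],
                          "case_manager": iep["case_manager"],
                          "missing_components": findings})
    return queue
```

The design choice is deliberate: the function never scores a student, only the IEP record, and the output is addressed to the adult who already owns the file.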

The deliverable is not a predictive label on a student. It is a list of fields in the IEP that an adult should look at next.

Disclaimer. This brief is a research-informed analysis of published peer-reviewed transition research, federal regulation, and OSEP compliance reporting. It is not legal advice and is not a clinical or instructional recommendation for a specific student. Districts considering changes to Indicator 13 monitoring should consult their state education agency’s Part B compliance office and qualified special-education counsel.

References

  • Individuals with Disabilities Education Act, 34 CFR §300.320(b) — Transition services in the IEP.
  • Landmark, L.J., Ju, S., & Zhang, D. (2010). Substantiated best practices in transition: Fifteen plus years later. Remedial and Special Education, 31(6), 165-176.
  • Martin, J.E., Van Dycke, J.L., Christensen, W.R., Greene, B.A., Gardner, J.E., & Lovett, D.L. (2006). Increasing student participation in IEP meetings: Establishing the Self-Directed IEP as an evidence-based practice. Exceptional Children, 72(3), 299-316.
  • Mazzotti, V.L., Rowe, D.A., Kwiatek, S., Voggt, A., Chang, W., Fowler, C.H., Poppen, M., Sinclair, J., & Test, D.W. (2021). Secondary transition predictors of postschool success: An update. Career Development and Transition for Exceptional Individuals, 44(1), 47-64.
  • Newman, L., Wagner, M., Knokey, A.M., Marder, C., Nagle, K., Shaver, D., & Wei, X. (2011). The post-high school outcomes of young adults with disabilities up to 8 years after high school: A report from the National Longitudinal Transition Study-2 (NLTS2). NCSER 2011-3005. SRI International.
  • Shogren, K.A., Wehmeyer, M.L., Palmer, S.B., Rifenbark, G.G., & Little, T.D. (2015). Relationships between self-determination and postschool outcomes for youth with disabilities. Remedial and Special Education, 36(1), 32-44.
  • Test, D.W., Fowler, C.H., Richter, S.M., White, J., Mazzotti, V., Walker, A.R., Kohler, P., & Kortering, L. (2009). Evidence-based practices in secondary transition. Career Development for Exceptional Individuals, 32(2), 115-128.
  • Wehmeyer, M.L., Palmer, S.B., Shogren, K.A., Williams-Diehm, K., & Soukup, J.H. (2012). Establishing a causal relationship between intervention to promote self-determination and enhanced student self-determination. American Journal on Intellectual and Developmental Disabilities, 117(5), 386-407.
  • U.S. Department of Education, Office of Special Education Programs. (annual). State Performance Plan / Annual Performance Report (SPP/APR), Indicator 13. Federal compliance reporting cycle.