Indicator 13 requires states to report data on "The percent of youth aged 16 and above with an IEP that includes coordinated, measurable, annual IEP goals and transition services that will reasonably enable the child to meet the postsecondary goals." The sections below summarize the 2006-2007 APR data for Indicator 13.
Data Reported
For 2006-2007, all 60 states and territories reported data for Indicator 13. Table 1 compares the number and percent of states and territories in each percentage range for the baseline and current years.
Table 1. Summary of Number and Percent of I-13 Scores by Percentage Ranges
| Percent | 2005-2006 (Baseline) # (%) | 2006-2007 # (%) |
| --- | --- | --- |
| 95-100* | 6 (10%) | 10 (16.7%) |
| 75-94 | 17 (28.3%) | 15 (25%) |
| 50-74 | 12 (20%) | 16 (26.6%) |
| 25-49 | 10 (16.7%) | 11 (18.3%) |
| 0-24 | 12 (20%) | 8 (13.3%) |
| No Data | 3 (5%) | 0 (0%) |
| Median | 60% | 69% |
| Range | 0-100% | 3-100% |
- For the baseline year (2005-2006), individual student data ranged from 0% to 100%, with a median of 60%; 58.3% of states and territories reported baseline data between 51% and 100%. Six (10%) states and territories met the compliance criterion of 95-100%.
- For 2006-2007, data ranged from 3% to 100%, with a median of 69% (an increase of 9 percentage points); 68.3% (an increase of 10 percentage points) of states and territories reported data between 51% and 100%. Ten (16.7%) states and territories met the compliance criterion.
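The band counts, median, and compliance tally reported above can be derived mechanically from per-state scores. A minimal sketch in Python, using hypothetical scores rather than the actual APR data:

```python
from statistics import median

def summarize(scores):
    """Summarize per-state I-13 percentages: the median, the share of
    states scoring 51-100%, and the count meeting the 95-100% criterion.
    `scores` is a hypothetical list, not the actual APR data."""
    med = median(scores)
    share_51_100 = round(sum(51 <= s <= 100 for s in scores) / len(scores) * 100, 1)
    met_criterion = sum(s >= 95 for s in scores)
    return med, share_51_100, met_criterion

# With six hypothetical scores: median 78.5, 66.7% in the 51-100 band,
# and two states meeting the 95-100% criterion.
result = summarize([3, 40, 69, 88, 97, 100])
```

The same per-state score list supports every statistic in Table 1, so one collection pass suffices for the whole summary.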
Progress and Slippage
Table 2 summarizes the progress or slippage across all 60 states and territories, as well as whether the progress or slippage was explained and whether data were provided to measure the impact of the stated Improvement Activities.
Table 2. Progress or Slippage for 2006-2007
| Type of Change | Number | Percent |
| --- | --- | --- |
| Made Progress | 37 | 61.7% |
| Remained the Same | 3 | 5.0% |
| Had Slippage | 17 | 28.3% |
| Unknown (no baseline data) | 3 | 5.0% |
| Explained Progress/Slippage | 53 | 88.3% |
| Provided Impact Data on Improvement Activities | 8 | 13.3% |
- 40 (66.7%) states and territories made progress or remained the same.
- Of the 17 (28.3%) states and territories that reported slippage, 11 stated that the slippage was due to implementing a more rigorous set of criteria for measuring I-13.
- While almost all (n=53; 88.3%) provided an explanation of what Improvement Activities may have caused their progress or slippage, only 8 (13.3%) provided data on the impact of their Improvement Activities. Most changes were explained using terms such as "we believe," "may have explained," or "because Indicator data improved it can be concluded that Improvement Activities were effective."
- 2 states explained that their progress was the result of decreasing the rigor of their criteria for measuring I-13.
Type of Checklist Used to Collect Data (Validity and Reliability of Data)
States and territories used a variety of checklists to measure Indicator 13, including the NSTTAC I-13 Checklist, an Adapted NSTTAC I-13 Checklist, or their own checklist. Table 3 compares the types of checklists used by states and territories to measure Indicator 13 across the baseline and current years.
Table 3. Type of Checklist Used to Collect Indicator 13 Data
| Type of Checklist | 2005-2006 (Baseline) # (%) | 2006-2007 # (%) |
| --- | --- | --- |
| NSTTAC I-13 Checklist | 12 (20%) | 22 (36.7%) |
| Adapted NSTTAC I-13 Checklist | 0 (0%) | 8 (13.3%) |
| Own Checklist (requirements stated) | 15 (25%) | 12 (20%) |
| Own Checklist (requirements not stated) | 30 (50%) | 3 (5%) |
| No Checklist Reported | 3 (5%) | 15 (25%) |
- 42 (70%) states and territories stated the requirements used to measure I-13. Because all of the requirements were related to the language used in the Indicator, we concluded that these were valid instruments.
- 18 (30%) states and territories did not provide the requirements used to measure I-13, so it is impossible to determine whether they used valid instruments.
- 38 (63.3%) described their reliability/verification process in their APR. This typically included training monitors (both SEA and LEA) and/or a state reviewing data collected via onsite file reviews or by a web-based data collection system.
- The number of states and territories providing an item-by-item summary of their I-13 data increased from 9 (15%) in 2005-2006 to 18 (30%) in 2006-2007.
Type of Checklist Used and Progress/Slippage
Because states and territories used a variety of checklists to measure I-13, we disaggregated the progress or slippage data by type of checklist used. See Table 4.
Table 4. Progress and Slippage by Type of Checklist Used
| Type of Checklist | Progress # (%) | Slippage # (%) | Remained the Same # (%) | Unknown # (%) |
| --- | --- | --- | --- | --- |
| NSTTAC I-13 Checklist | 15 (68.2%) | 6 (27.3%) | 1 (4.5%) | N/A |
| Adapted NSTTAC Checklist | 5 (62.5%) | 3 (37.5%) | N/A | N/A |
| Own Checklist (requirements stated) | 6 (50%) | 4 (33.3%) | 1 (8.3%) | 1 (8.3%) |
| Own Checklist (requirements not stated) | 2 (66.7%) | 1 (33.3%) | N/A | N/A |
| No Checklist Reported | 9 (60%) | 3 (20%) | 1 (6.7%) | 2 (13.3%) |
| Overall | 37 (61.7%) | 17 (28.3%) | 3 (5.0%) | 3 (5.0%) |
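Table 4 is a simple cross-tabulation of checklist type against direction of change. A minimal sketch of that disaggregation, assuming hypothetical per-state records rather than the actual APR data:

```python
from collections import Counter

def disaggregate(records):
    """Cross-tabulate (checklist_type, change) pairs into per-type counts,
    in the manner of Table 4. `records` holds hypothetical per-state tuples."""
    counts = Counter(records)
    checklist_types = sorted({t for t, _ in records})
    changes = ("progress", "slippage", "same", "unknown")
    return {t: {c: counts[(t, c)] for c in changes} for t in checklist_types}

# Hypothetical records for five states (labels chosen for illustration)
records = [
    ("NSTTAC I-13 Checklist", "progress"),
    ("NSTTAC I-13 Checklist", "slippage"),
    ("Own Checklist", "progress"),
    ("Own Checklist", "same"),
    ("No Checklist Reported", "unknown"),
]
table = disaggregate(records)
```

Row percentages like those in Table 4 follow by dividing each cell by its row total.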
- States and territories that used the NSTTAC I-13 Checklist (a valid instrument) had the highest percentage of progress, followed by Own Checklist (requirements not stated; validity of instrument unknown).
Improvement Activities
All 60 states and territories included Improvement Activities. Table 5 compares their stated activities across baseline and the current year.
Table 5. Summary of Improvement Activities
| Improvement Activity | 2005-2006 (Baseline) # (%) | 2006-2007 # (%) |
| --- | --- | --- |
| (A) Improve data collection and reporting &/or (E) Clarify/examine/develop policies and procedures | 53 (92.9%) | 40 (66.7%) |
| (B) Improve systems administration and monitoring | 15 (25.8%) | 38 (63.3%) |
| (C) Provide training/professional development &/or (D) Provide technical assistance | 56 (96.5%) | 60 (100%) |
| (F) Program development | 19 (33.3%) | 14 (23.3%) |
| (G) Collaboration/coordination | 31 (53.4%) | 24 (40%) |
| (H) Evaluation | 5 (8.8%) | 4 (6.7%) |
| (I) Increase/Adjust FTE | 4 (7.0%) | 2 (3.3%) |
| (J) Other | N/A | 1 (1.7%) |
- The three most frequently stated Improvement Activities continued to be (C/D) provide training/professional development/technical assistance, (A/E) improve data collection and reporting/examine policies and procedures, and (B) improve systems administration and monitoring.
- Only 8 (13.3%) states and territories provided data on the impact of their Improvement Activities, including:
  - (C/D) Technical assistance/professional development (n=4), by collecting pre-post data on the content presented (e.g., improved transition components of IEPs)
  - (B) Improved systems administration and monitoring (n=3), by previewing sample files (e.g., using the NSTTAC Checklist to conduct detailed pre-data collection reviews)
  - (G) Collaboration/coordination (n=1), by collecting satisfaction data on interagency linkages (e.g., "good" or "better" connection), the number of students referred in the last two years of school, and the percent of students found eligible for services.
- Of the 45 (75%) who explained progress or slippage but did not provide impact data, all provided some type of process data (e.g., # of workshops held, # of attendees, # of materials produced, # of meetings held).
TA Center Consultation with States
NSTTAC provided various levels of consultation to all 60 states and territories. Table 6 compares the types of consultation provided across baseline and the current year.
Table 6. Summary of NSTTAC Consultation to States and Territories (n = 60)
| Level of Technical Assistance | 2005-2006 (Baseline) # (%) | 2006-2007 # (%) |
| --- | --- | --- |
| Universal/General | 11 (18.3%) | 11 (18.3%) |
| Targeted/Specialized | 38 (63.3%) | 44 (73.3%) |
| Intensive/Sustained | 4 (6.7%) | 5 (8.3%) |
| No Contact | 7 (11.7%) | 0 (0%) |
- 49 (81.7%) states and territories received Targeted or Intensive technical assistance from NSTTAC.
- The most frequent type of Targeted technical assistance was attending a State Planning Institute or an Indicator 1, 2, 13, & 14 Cross-Indicator Regional Meeting.
Highlights of 2006-2007 APR I-13 Data
- All states and territories provided data for 2006-2007.
- For 2006-2007, data ranged from 3% to 100%, with a median of 69% (an increase of 9 percentage points); 68.3% (an increase of 10 percentage points) of states and territories reported data between 51% and 100%.
- 10 (16.7%) states and territories met the compliance criteria of 95-100%.
- 40 (66.7%) states and territories made progress or remained the same.
- Of the 17 (28.3%) states and territories who reported slippage, 11 stated that slippage was due to implementing a more rigorous set of criteria for measuring I-13.
- 2 states explained their progress was the result of decreasing the rigor of their criteria for measuring I-13.
- 42 (70%) states and territories stated the requirements used to measure I-13. Since all requirements were related to the language used in the Indicator, we concluded that these were valid instruments.
- The three most frequently stated Improvement Activities continued to be (C/D) provide training/professional development/technical assistance, (A/E) improve data collection and reporting/examine policies and procedures, and (B) improve systems administration and monitoring.
- Only 8 (13.3%) states and territories provided data on the impact of their Improvement Activities.
- 49 (81.7%) states and territories received Targeted or Intensive technical assistance from NSTTAC. The most frequent type of Targeted technical assistance was attending a State Planning Institute or a Cross-Indicator (1, 2, 13, & 14) Regional Meeting.
Recommendations for Collecting Future I-13 Data
- In order to ensure data are valid, require states and territories to include a copy of their checklist in the APR. This could be done by requiring states to provide an item-by-item summary of their checklist.
- In order to ensure data are reliable (accurate), require APRs to describe the process used to collect reliable data. This means not just verifying that all data were collected, but checking that the data entered are accurate (i.e., would be agreed upon by a second person).
- Provide states and territories with a list of possible methods they can use to determine the impact of their Improvement Activities.
- For ease of reporting and reading, require states and territories to list Improvement Activities in tabular format.
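The reliability check recommended above, verifying that a second person would enter the same values, is often quantified as simple percent agreement between two raters. A minimal sketch, using hypothetical item-level codings rather than actual APR data:

```python
def percent_agreement(rater_a, rater_b):
    """Percent of items on which two independent raters entered the same
    value. The codings below are hypothetical, not actual APR data."""
    if len(rater_a) != len(rater_b):
        raise ValueError("both raters must code the same items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a) * 100

# Two raters agreeing on 3 of 4 checklist items -> 75.0
agreement = percent_agreement(["Y", "Y", "N", "Y"], ["Y", "N", "N", "Y"])
```

Percent agreement is the most basic index; chance-corrected statistics (e.g., Cohen's kappa) are a common next step when agreement by chance is a concern.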