Trends in the Participation and Performance
of Students with Disabilities

Technical Report 50

M. L. Thurlow, R. F. Quenemoen, J. Altman, & M. Cuthbert

December 2008

All rights reserved. Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Thurlow, M., Quenemoen, R., Altman, J., & Cuthbert, M. (2008). Trends in the participation and performance of students with disabilities (Technical Report 50). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.


Table of Contents

Executive Summary
Overview
Methods
Results
     Characteristics of State Regular Assessments
     Quality of Assessment Reporting for Students with Disabilities
     Student Participation and Performance on Regular Reading and Math Assessments
     Student Participation and Performance on Alternate Reading and Math Assessments
Discussion
     Trends in the Public Reporting of Assessment Data
     Trends in Participation and Performance
     Conclusions
References


Executive Summary

This report marks the first analysis conducted by the National Center on Educational Outcomes (NCEO) of trends in the public reporting of state assessment results for students with disabilities. This study followed four analyses of public reporting conducted by NCEO since the passage of the federal No Child Left Behind Act of 2001 (NCLB). Greater numbers of states than ever before are reporting assessment data for students with disabilities disaggregated by grade level and content area. At the same time, states have made improvements in their data collection systems that make this type of reporting possible. Unfortunately, the number of states for which data were available across the four years was relatively small. For the regular assessment, only slightly over half of the states had performance data across this time period, and fewer than ten states had participation data for each of the four years studied. Even with a more lenient approach to looking at alternate assessment data (i.e., including a state if it had at least three years of data), only eight states reported performance data for the alternate assessment across the time period, and no states reported participation data in a consistent manner across each of the four years.

Based on the states with data across years, average percentages of students with disabilities performing at the proficient or above level showed moderate increases across the four years for both reading and math in elementary and middle schools. Performance data for high school students did not show the same gains. Trend data also showed lower percentages of high school and middle school students demonstrating proficient or above performance, as compared to elementary school students. This tendency held for both reading and math. Continued investigations of not only trend data but also the data available for examining trends are an important step in evaluating the changes in participation and performance of students with disabilities over time.


Overview

States have been required to publicly report on the participation and performance of students with disabilities in large-scale assessments since the 1994 reauthorization of the federal Elementary and Secondary Education Act (ESEA), called the Improving America’s Schools Act (IASA). The reauthorization of the Individuals with Disabilities Education Act (IDEA) in 1997 initiated an alignment with ESEA and specified that each state had to report assessment data for children with disabilities "with the same frequency and in the same detail as it reports on the assessment of nondisabled children" (IDEA, 1997). States were required to report the number of children with disabilities participating in regular assessments and in alternate assessments. Reporting on the performance of these children was required beginning in 1998 for regular assessments, and in 2000 for alternate assessments, if numbers would permit statistically sound inferences and not result in individually identifiable information (IDEA, 1997).

In 2001, the No Child Left Behind Act (NCLB) Title I provisions expanded the testing and accountability requirements that had been in place under IASA. Under IASA requirements, each state (or district) receiving Title I funding had to administer student assessments in mathematics and reading/language arts once in each grade band of 3-5, 6-9, and 10-12. NCLB required that, starting no later than 2005-06, states have assessments in place in reading and math for each of grades 3-8 and at least once in grades 10-12, and that by 2007-08 they measure science at least once in each of the grade-level bands of 3-5, 6-9, and 10-12 (NCLB, 2001).

The Individuals with Disabilities Education Improvement Act of 2004 (IDEA 2004) reinforced the reporting provisions of the 1997 IDEA amendments and systematically aligned these requirements with the mandates of NCLB. IDEA 2004 and NCLB set strict requirements for public reporting of participation and performance for students with disabilities in statewide assessments, including assessments with appropriate accommodations and alternate assessments. First, these laws required public reporting of participation of all students, as well as disaggregated reporting of several key subgroups, including students with disabilities. Second, NCLB established criteria for states’ definitions of Adequate Yearly Progress (AYP) used for accountability purposes and required that disaggregated performance results be made public through state/district annual report cards. In addition, IDEA 2004 required states to establish performance goals for students with disabilities that are consistent with the state’s definition of AYP under NCLB and to report annually on progress toward meeting these performance goals.

NCEO has documented the public reporting of state participation and performance information for students with disabilities since 1997. The first NCEO report found that most states did not publicly report information on either participation or performance for students with disabilities and that many students with IEPs were exempted from testing altogether (Thurlow, Langenfeld, Nelson, Shin, & Coleman, 1998). The NCEO reports covering the time span from 1998-2002 showed that states slowly made improvements in their reporting practices for students with disabilities, specifically in disaggregation of the participation and performance results of these students on state assessments (Bielinski, Thurlow, Callender, & Bolt, 2001; Thurlow, House, Boys, Scott, & Ysseldyke, 2000; Thurlow et al., 1998; Thurlow, Nelson, Teelucksingh, & Ysseldyke, 2000; Thurlow, Wiley, & Bielinski, 2003; Ysseldyke, Thurlow, Langenfeld, Nelson, Teelucksingh, & Seyfarth, 1998).

The purpose of this report is to document the participation and performance trends over time for students with disabilities, from school year 2001-02, the baseline year for determining AYP goals under NCLB, through school year 2004-05, the third year that states reported after the NCLB baseline year (VanGetson & Thurlow, 2007). Within this time frame NCEO prepared yearly reports to track public reporting of state assessment results (Klein, Wiley, & Thurlow, 2006; Thurlow & Wiley, 2004; VanGetson & Thurlow, 2007; Wiley, Thurlow, & Klein, 2005). This report compiles state data across this time frame and focuses on the extent to which states reported participation and performance data, and on the participation and performance trends for those states reporting data. The goals of this research were to determine: (1) whether progress has been made by states in publicly reporting the participation and performance of students with disabilities on state assessments, and (2) the nature of trends in the four years of participation and performance data for students with disabilities.


Methods

Data were gathered from previous NCEO reports on state assessment reporting for school years 2001-02 through 2004-05 (Klein et al., 2006; Thurlow & Wiley, 2004; VanGetson & Thurlow, 2007; Wiley et al., 2005). Data points were obtained from reports and inserted into data tables showing participation and performance for each state, by year, for each grade level and content area.

The original data for these reports were gathered from individual state department of education Web sites, with collection generally starting during the winter following a spring-fall assessment cycle. The test name, grade levels assessed, content areas tested, test type, and availability of disaggregated participation and performance data for students with disabilities were recorded for each report. NCEO staff also documented whether participation and performance data were reported each year for students with disabilities. Typically this information was available online for slightly more than half of the states by mid-winter. Each year, these publicly reported data were combined with data points obtained by NCEO during a springtime verification process with state directors of assessment and state directors of special education in order to generate the database of state information for past reports.

Trends in participation and performance could only be analyzed for those states that reported data. In order to maximize the data available for these analyses, a search of current state education Web sites was conducted to find historical assessment data that had been posted after our initial analyses and reports but were now available to the public. Our initial analyses of participation data left only two states with historical participation information going back four years (Connecticut, Kansas). New participation information was uncovered for the other seven states included in this report (Colorado, Iowa, Nebraska, North Carolina, North Dakota, Washington, Wisconsin), enabling our analysis to include four years' worth of information for a total of nine states.

New performance data points from states with available data across four consecutive years were added to the analysis as long as they met the specific guidelines we had used previously for clarity and depth in reporting. These efforts accounted for the inclusion, for both reading and math, of two states at the elementary and middle school level, and four states at the high school level: Iowa—elementary, middle, and high school; Kentucky—high school; Montana—elementary, middle, and high school; Mississippi—high school. Similarly, data points that had been revised for presentation on the Web site were accepted as replacements for data points obtained from NCEO report appendices or from raw data points that were the basis for NCEO figures in earlier reports. Such revisions occurred infrequently, for only one state at the elementary and middle school levels, and for three states at the high school level, for both reading and math (Connecticut—elementary, middle, and high school; New Mexico—high school; New Jersey—high school).

Though alternate assessment participation and performance information was collected in general terms starting in 2001-02, it was not included by individual state in previous analyses of the public reporting of state assessment results for students with disabilities. Thus, the collection of these data was an entirely new undertaking, and all information included in this section is new.

Data for this report were analyzed only for criterion-referenced assessments. The public presentation of assessment information for these assessments, disaggregated for students with disabilities, was considered to be clear only if it met specific guidelines. First, the data must have been presented by states in percentages and not just raw numbers of students—ideally with underlying numbers clearly evident—and must have been disaggregated for "students with disabilities" or "students with IEPs" or "special education students." Second, data must have been disaggregated by grade level and content area. Third, the assessment must have been clearly designated as either a regular assessment (given to most students) or an alternate assessment (given to a smaller number of students). All data and documents were verified by NCEO staff prior to analysis. Trend work began in the fall of 2006.
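
To make these guidelines concrete, the sketch below encodes them as a simple filter over reported data points. This is a minimal illustration only; the record fields (reported_as_percentage, subgroup_label, and so on) are hypothetical stand-ins, not the structure of NCEO's actual database.

```python
# Minimal sketch of the three clarity guidelines described above.
# All field names are hypothetical; they do not reflect NCEO's tooling.

ACCEPTED_LABELS = {
    "students with disabilities",
    "students with IEPs",
    "special education students",
}

def meets_reporting_guidelines(record: dict) -> bool:
    """Return True if a publicly reported data point is 'clear' under the
    three guidelines: percentages (not just raw counts), disaggregation by
    grade level and content area, and a clear regular/alternate designation."""
    return (
        record["reported_as_percentage"]                      # guideline 1
        and record["subgroup_label"] in ACCEPTED_LABELS       # guideline 1
        and record["grade_level"] is not None                 # guideline 2
        and record["content_area"] is not None                # guideline 2
        and record["test_type"] in {"regular", "alternate"}   # guideline 3
    )
```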


Results

Characteristics of State Regular Assessments

Table 1 shows the total number of regular assessments across states, the number of states with more than one assessment, the number of criterion-referenced tests (CRTs), and the percentage of assessments that are CRTs in each year from 2001-02 through 2004-05. Since 2001-02, the number of different regular assessments (n=107-112) and the number of states with multiple regular assessment systems (n=35-36) have stayed relatively constant. There has been a trend toward a higher number and percentage of criterion-referenced tests, while the majority of the remaining assessments are norm-referenced tests (NRTs), in which students' scores are compared against the norms of a defined population.

Table 1. Overview of States’ Regular Assessments

Year      Number of     Number of States with    Number     Percentage of Assessments
          Regular       More than One            of CRTs    That Are CRTs
          Assessments   Assessment
2001-02   111           35                       64         58%
2002-03   110           35                       70         64%
2003-04   112           35                       76         68%
2004-05   107           35                       75         70%

 

Quality of Assessment Reporting for Students with Disabilities

Regular Assessment

Table 2 shows the extent to which assessment information was disaggregated for students with disabilities on regular assessments from 2001-02 through 2004-05. The percentages were relatively stable across the four years.

Table 2. Disaggregated Data for Students with Disabilities on Regular Assessments

                           Participation                 Performance
Year      Total Number     Number of                     Number of
          of Assessments   Assessments    Percent1       Assessments    Percent1
2001-02   111              91             82%            98             88%
2002-03   110              89             81%            91             83%
2003-04   112              94             84%            97             87%
2004-05   107              84             79%            94             88%

1 Percent is derived by dividing the number of assessments reported by the total number of assessments.

Table 3 shows the reporting practices for participation and performance by states rather than by assessments. When viewed this way, there does appear to be variation in reporting practices among states. In 2004-05, all but two states provided participation and performance data for at least some of their state assessments, and every state provided some assessment performance data disaggregated for students with disabilities. This was the first time in the past four years that every state reported disaggregated data. The number of states reporting participation and performance data for all of their regular assessments has remained relatively unchanged during the past four years.

Table 3. Numbers of States Disaggregating Regular Assessment Data for Students with Disabilities

Year      Participation &      Performance Only     Participation &      No Participation
          Performance for      for All State        Performance for      or Performance
          All State            Assessments          SOME State
          Assessments                               Assessments
2001-02   35                   4                    9                    2
2002-03   36                   1                    10                   3
2003-04   35                   2                    11                   2
2004-05   36                   2                    12                   0

Table 4 provides more detailed information about participation in assessments, with percentages based on the total number of assessments given by the states. States increasingly seem to be reporting certain types of information, including the percentage of students with disabilities tested, and the count and percentage of students not tested for various reasons on statewide assessments. For example, there was an increase of 18 percentage points from 2001-02 to 2004-05 in the number of assessments for which "percent of students tested" was reported. Similarly, there was a 15 percentage point increase in the reporting of the number "not tested" and a 21 percentage point increase in the reporting of the percentage "not tested." Other types of information remained relatively stable, including the percent exempted/excluded (a 2 percentage point increase) and the number or percent absent (a 4 percentage point increase).

Table 4. Details of Participation Reporting for Regular Assessments

                           Participation Details Reported Publicly
Year      Total Number     Number    Number Not   Percent   Percent Not   Number or        Percent Exempt
          of Assessments   Tested    Tested       Tested    Tested        Percent Absent   or Excluded
2001-02   111              83%       9%           38%       0%            16%              8%
2002-03   110              90%       10%          41%       16%           17%              8%
2003-04   112              88%       20%          54%       33%           21%              5%
2004-05   107              82%       24%          56%       34%           20%              10%

 

Alternate Assessment

Table 5 shows the number and percentage of states that reported participation and performance data for state alternate assessments for each year from 2001-02 through 2004-05. Steady increases were evident across time in the reporting of both participation and performance, with a 38 percentage point increase in the percentage of states reporting participation and a 34 percentage point increase in the percentage of states reporting performance.

Table 5. Data Reported for Students with Disabilities on Alternate Assessments

                 Participation                     Performance
Year      States Reporting    Percent1      States Reporting    Percent1
2001-02   27                  54%           27                  54%
2002-03   32                  64%           33                  66%
2003-04   35                  70%           34                  68%
2004-05   46                  92%           44                  88%

1 The denominator used to create percentages for this table was the number of states, not the total number of assessments.

Table 6 provides another view of information on state reporting of alternate assessment data. This table shows that from 2001-02 through 2004-05, 15 additional states began reporting at least some information (participation, performance, or both), and 21 additional states began reporting both participation and performance information. By 2004-05, 43 states provided both participation and performance data, and all but three states reported some information about participation and performance for students with disabilities on alternate assessments.

Table 6. Details of States’ Disaggregated Alternate Assessment Data for Students with Disabilities

                          Number of States Reporting
Year      Participation &    Participation    Performance    Did Not
          Performance        Only             Only           Report
2001-02   22                 5                5              18
2002-03   29                 3                4              14
2003-04   33                 2                1              14
2004-05   43                 3                1              3

Table 7 provides information on the details of participation reporting for alternate assessments, such as the number and percentage of students tested, not tested, exempted/excluded, or absent. The percentages in Table 7 are based on the total number of alternate assessments reported by states each year (53 in 2004-05). There is evidence of a general trend toward more states calculating and reporting the percent of students tested and not tested. In 2001-02, no state reported information on students who were exempt or absent for the alternate assessment. By the 2004-05 school year, 9% of state assessments reported exempt or excluded students and 13% reported the number or percent of students absent.

Table 7. Details of Participation Reporting for Alternate Assessments

                           Participation Details Reported Publicly1
Year      Total Number     Number    Number Not   Percent   Percent Not   Percent Exempt   Number or
          of Assessments   Tested    Tested       Tested    Tested        or Excluded      Percent Absent
2001-02   27               52%       0%           11%       0%            0%               0%
2002-03   32               88%       0%           44%       3%            3%               9%
2003-04   32               100%      0%           59%       28%           13%              16%
2004-05   53               87%       30%          74%       36%           9%               13%

1 Percent is derived using the total number of assessments as the denominator, not the total number of states.

Figure 1 shows the number of states that reported some level of performance data for students with disabilities for at least one alternate assessment. In 2004-05, there was a substantial increase over previous years in the number of states providing these data.

Figure 1. Trends in Reporting for Performance on Alternate Assessments.

Student Participation and Performance on Regular Reading and Math Assessments

Participation in Regular Assessments

Participation rates that were publicly reported for reading and math assessments at the elementary, middle, and high school levels were summarized for the years 2001-02 through 2004-05. Each school level (elementary, middle, and high school) was represented by one grade per state, typically grade 4, 8, or 10. If data for any of these grades were unavailable, that school level was represented by grade 5, 7, or 11, depending on the school level. Complete data were available for only eight states for reading and nine states for math. Only these states provided clear reporting of both numerator and denominator numbers across the four years.
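
A short sketch of this grade-substitution rule may be helpful. The data layout (a per-state mapping from grade to participation rate) is an assumption for illustration, not the structure of the underlying reports.

```python
# Illustrative sketch: each school level is represented by one grade per
# state, preferring grades 4, 8, and 10 and falling back to 5, 7, or 11.
PREFERRED_GRADES = {"elementary": (4, 5), "middle": (8, 7), "high": (10, 11)}

def representative_rate(rates_by_grade: dict, level: str):
    """Return the participation rate for the first available grade at the
    given school level, or None if neither grade was reported."""
    for grade in PREFERRED_GRADES[level]:
        if grade in rates_by_grade:
            return rates_by_grade[grade]
    return None

# Example: a state reporting grades 5, 8, and 10 (hypothetical values).
state = {5: 0.97, 8: 0.95, 10: 0.93}
print(representative_rate(state, "elementary"))  # 0.97 (grade 4 missing, use 5)
```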

Elementary Reading – Regular Assessments. Figure 2 presents the participation rates for the regular reading assessments in elementary school from 2001-02 through 2004-05 for eight states. In 2001-02, these states had an average participation rate of 90%, but this average was pulled down by rather low values in two states (Connecticut and North Carolina); the median participation rate was 96%. All eight states tended to show either increases in their rates or ceiling effects across the four-year reporting period, with a mean of 96% and a median of 98% in 2004-05.

Figure 2. Participation Rates for Students with Disabilities on Elementary Reading Assessments

Middle School Reading – Regular Assessments. Figure 3 presents participation rate data for students with disabilities on regular reading assessments in middle school from 2001-02 through 2004-05 for the eight states with data. Positive trends are evident in the mean participation rate (89% to 96%), and in the median rate (95% to 97%) over the four years. These gains were largely accounted for by two states (Connecticut and North Carolina) due to their relatively lower initial rates.

Figure 3. Participation Rates for Students with Disabilities on Middle School Reading Assessments

High School Reading – Regular Assessments. Figure 4 presents participation rate data for students with disabilities on regular assessments in high school for 2001-02 through 2004-05 for the eight states. Mean participation rates increased from 87% to 94% from school year 2001-02 to 2004-05, and median rates rose from 88% to 96% in the same years.

Figure 4. Participation Rates for Students with Disabilities on High School Reading Assessments

Elementary Math – Regular Assessments. Figure 5 shows the nine states reporting mathematics participation data from 2001-02 through 2004-05; these were the same states that reported reading participation data, along with Nebraska. Participation rates for students with disabilities in these states on elementary math assessments from 2001-02 through 2004-05 were relatively stable, except in two states. Connecticut and North Carolina reported lower initial rates in 2001-02, and these states showed the largest absolute gains by 2004-05. Across all nine states, mean rates increased from 92% to 97%, and median scores increased from 96% to 99%.

Figure 5. Participation Rates for Students with Disabilities on Elementary Math Assessments

Middle School Math – Regular Assessments. Figure 6 shows the nine states that reported mathematics participation data from 2001-02 through 2004-05. Participation rates for middle school were relatively stable over the period investigated (except in Connecticut), with the mean increasing from 89% in 2001-02 to 96% in 2004-05. The median increased from 92% to 98%.

Figure 6. Participation Rates for Students with Disabilities on Middle School Math Assessments

High School Math – Regular Assessments. Figure 7 shows the nine states that reported mathematics participation data from 2001-02 through 2004-05. Across the four years, the participation rate for high school showed a mean gain of nine percentage points, rising from 85% to 94%. Four states reported relatively large increases in participation, thus producing an increase in the median from 85% to 97%.

Figure 7. Participation Rates for Students with Disabilities on High School Math Assessments

Performance on Regular Assessments

Trends in the percent of students attaining proficient or above performance levels on their state assessments are displayed for the content areas of reading and math by school level; data were available for a total of 32 states. Nineteen states provided data for at least one grade within each school level: elementary (grades 3-5), middle school (grades 6-8), and high school (grades 9-12). In addition, eight states provided elementary and middle school data, but not high school data: California, Louisiana, Maryland, Michigan, North Carolina, New York, Texas, and Virginia. Four states provided high school data only: Alabama, New Hampshire, New Mexico, and South Carolina. One state, Nevada, provided elementary and high school data, but no middle school data. To view data from the largest number of states possible, we allowed the number of states to differ for each school level and content area.

Elementary Reading – Regular Assessments. Performance data were available for 28 states for elementary reading; these data are shown in Figure 8 in terms of the percentage change in students who are proficient or above from 2001-02 through 2004-05. During this period, the average percent of students proficient for these states increased seven percentage points. For the four-year span, Kansas, Louisiana, Maryland, and Michigan all saw improvements of more than 20 percentage points in the percentage of students with disabilities achieving proficiency on the state’s regular assessment. Eight states showed decreases, including Texas with the largest decrease (from 89% in 2001-02 to 69% in 2004-05) and Massachusetts with the smallest decrease (from 19% in 2001-02 to 18% in 2004-05). The median change for all states was an increase of six percentage points across the four-year span.

Figure 8. Performance Trends for Students with Disabilities on Elementary Reading Assessments

Middle School Reading – Regular Assessment. Performance data were available for 27 states for middle school reading; these data are shown in Figure 9. The average proficiency rate of 25% in 2001-02 increased to 29% by 2004-05. Maryland saw an increase of more than 20 percentage points in the percentage of students with disabilities attaining proficient or above on the state assessment. Across the time span, five states saw decreases. Texas showed a large decrease, from 85% to 61%; Alaska, Connecticut, Montana, and North Carolina all showed smaller decreases in the percentage of students with disabilities proficient or above. The median change in percent proficient across the four-year period was an increase of four percentage points.

Figure 9. Performance Trends for Students with Disabilities on Middle School Reading Assessments

High School Reading – Regular Assessments. Performance data were available for 24 states for high school reading; these data are shown in Figure 10. The average rate of proficiency rose one percentage point for these states between 2001-02 and 2004-05. New Jersey had an increase of more than 20 percentage points in the percentage of students with disabilities achieving proficient or above levels on the state’s regular assessment. Nine states showed decreases, including New Mexico, with the largest decrease in the percentage of students proficient (a 38 percentage point decrease between 2001-02 and 2004-05), and eight other states with smaller decreases in the percent proficient across the four-year span. The median gain for all states across the four-year span was three percentage points.

Figure 10. Performance Trends for Students with Disabilities on High School Reading Assessments

Elementary Math – Regular Assessments. Performance data were available for 28 states for elementary math; these data are shown in Figure 11. During the four years studied, the average rate of students proficient or above increased eight percentage points. The percentage of students with disabilities who were proficient or above increased more than 20 percentage points in Arizona, Kansas, Louisiana, and Maryland. Nine states had decreases across the same period, ranging from 18 percentage points in Texas and 17 percentage points in North Carolina to single-digit drops in seven other states. The median change across all states was an increase of 10 percentage points for the four-year span.

Figure 11. Performance Trends for Students with Disabilities on Elementary Math Assessments

Middle School Math – Regular Assessments. Performance data were available for 27 states for middle school math; these data are shown in Figure 12. The average rate of students proficient increased from 18% in 2001-02 to 22% in 2004-05. Wisconsin showed an increase of more than 20 percentage points while 22 other states saw gains of lesser amounts. Four states showed decreases in the percentage of students with disabilities who were proficient or above, ranging from Texas with the largest decrease (50 percentage points) and Ohio with a moderate decrease (13 percentage points) to single-digit decreases in two other states. The median gain across the four-year span was four percentage points.

Figure 12. Performance Trends for Students with Disabilities on Middle School Math Assessments

High School Math – Regular Assessments. Performance data were available for 24 states for high school math; these data are shown in Figure 13. In 2001-02 the 24 states had an average rate of 23% of students with disabilities who were proficient and above; that rate was 22% in 2004-05. New Jersey showed the largest increase in percentage (24 percentage points) and was the only state to show an increase of more than 20 percentage points. Three states showed declines of more than 20 percentage points during this period (New Mexico, Ohio, and South Carolina), and seven others showed lesser declines in the percent of students proficient or above. Overall, the median change was an increase of one percentage point across all states for the four-year period.

Figure 13. Performance Trends for Students with Disabilities on High School Math Assessments

Student Participation and Performance on Alternate Reading and Math Assessments

Participation on Alternate Assessments

Trends in alternate assessment participation are unclear because state data have only recently begun to be reported publicly on a consistent basis. When the data are reported, they often are reported only as overall reading and math totals, not disaggregated by school level. For example, in 2001-02, 13 states gave a rate of either the percent of students tested or the percent not tested on their alternate assessment. However, the denominator used to arrive at this figure varied dramatically. Some states reported alternate assessment takers as a percentage of all students with disabilities (e.g., 10%), whereas other states reported them as a percentage of all students tested (e.g., 1%). Still others gave the percentage of students tested out of the total eligible for the alternate assessment (e.g., 97%). Some states disaggregated participation rates by grade level or content area while others gave an overall rate. By 2004-05, 38 states provided a percentage of either students tested or not tested. However, not all of these states reported the data clearly for each grade and content area, either aggregating across grades or content areas, presenting the percent of all students tested, or presenting the percent of students out of the total eligible for the alternate assessment.
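
A worked example makes the denominator problem concrete. The counts below are hypothetical, chosen only to reproduce the example rates above; the same group of alternate assessment takers yields three very different "participation rates" depending on which denominator a state uses.

```python
# Hypothetical counts, illustrative only; not data from any state.
alt_takers          = 970      # students who took the alternate assessment
alt_eligible        = 1_000    # students eligible for the alternate assessment
swd_enrolled        = 10_000   # all students with disabilities
all_students_tested = 100_000  # all students tested statewide

print(f"{alt_takers / swd_enrolled:.0%}")         # ~10% of students with disabilities
print(f"{alt_takers / all_students_tested:.0%}")  # ~1% of all students tested
print(f"{alt_takers / alt_eligible:.0%}")         # ~97% of those eligible
```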

There were two states that reported participation using the total number of students with disabilities as the denominator as far back as 2002-03 (year two of this analysis) and also in 2004-05. These states showed little or no change across years (Wisconsin: 7% increased to 8%; North Carolina: no change). Three states reported participation using the total number of enrolled students as the denominator (which leads to small percentages) in both 2002-03 and 2003-04. None of these three states reported in the same way for 2004-05. Massachusetts and New Hampshire reported small changes from 2002-03 to 2003-04, and Washington reported a decrease from 3.6% to 0.7%.

Performance on Alternate Assessments

Trends in alternate assessment performance across states are displayed for the content areas of reading and math by school level. These data were reported with more depth and clarity between 2001-02 and 2004-05 than were participation data for the alternate assessment. Nevertheless, the number of states with these data also was small. In an effort to increase the number of states in each analysis, data were included from any state that had data for at least three of the four years. In all cases, the percentages of students proficient shown are the number proficient divided by the number of alternate assessments taken.
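
A minimal sketch of these two rules is shown below, under assumed field names and data layout; it illustrates the relaxed three-of-four-years inclusion criterion and the proficiency calculation, not NCEO's actual code.

```python
# Sketch of the inclusion rule (data in at least three of the four years)
# and the proficiency calculation described above. Data layout is assumed.
YEARS = ("2001-02", "2002-03", "2003-04", "2004-05")

def include_state(data_by_year: dict) -> bool:
    """Include a state if it reported data in at least three of the four years."""
    return sum(year in data_by_year for year in YEARS) >= 3

def percent_proficient(n_proficient: int, n_taken: int) -> float:
    """Number proficient divided by the number of alternate assessments
    taken, expressed as a percentage."""
    return 100.0 * n_proficient / n_taken
```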

Elementary Reading – Alternate Assessments. Performance data were available for eight states for elementary reading; these data are shown in Figure 14. The average percent of students proficient and above increased 22 percentage points from 2001-02 through 2004-05. During this time, Alaska, Delaware, New York, and Washington all had increases of more than 30 percentage points for students with disabilities who were proficient or above on the alternate assessment. Two states had slight decreases: Colorado and Texas.

Figure 14. Performance Trends for Students with Disabilities on Elementary Reading Alternate Assessments

Middle School Reading – Alternate Assessments. Performance data were available for eight states for middle school reading; these data are shown in Figure 15. In 2001-02 the average percent proficient was 49%. This increased to 70% by 2004-05. During this time, the percentage of students with disabilities who were proficient or above on the alternate assessment increased by more than 30 percentage points in Alaska, New York, and Washington. Two states, Arizona and Texas, saw decreases in the percentage of students who were proficient on the alternate assessment.

Figure 15. Performance Trends for Students with Disabilities on Middle School Reading Alternate Assessments

High School Reading – Alternate Assessments. Performance data were available for six states for high school reading; these data are shown in Figure 16. The average percent proficient of 33% in 2001-02 increased to 65% in 2004-05. Alaska, Delaware, and Washington all saw improvements of 35 percentage points. Two states, Arizona and Colorado, saw slight decreases from 2001-02 to 2004-05 in the percentage of students with disabilities who were proficient on the alternate assessment.

Figure 16. Performance Trends for Students with Disabilities on High School Reading Alternate Assessments

Elementary Math – Alternate Assessments. Performance data were available for eight states for elementary math; these data are shown in Figure 17. The average percent proficient increased 24 percentage points between 2001-02 and 2004-05, including an increase of 12 percentage points between 2002-03 and 2003-04. The percentage of students with disabilities proficient or above increased by more than 20 percentage points in Alaska, New York, and Washington. All states with data showed positive gains in the percentage of students proficient on the alternate assessment.

Figure 17. Performance Trends for Students with Disabilities on Elementary Math Alternate Assessments

Middle School Math – Alternate Assessment. Performance data were available for eight states for middle school math; these data are shown in Figure 18. The 2001-02 average percent proficient of 45% for students with disabilities increased to 67% by 2004-05. Alaska, New York, and Washington saw increases of more than 30 percentage points. Only one state, Arizona, saw a decrease during this time.

Figure 18. Performance Trends for Students with Disabilities on Middle School Math Alternate Assessments

High School Math – Alternate Assessments. Performance data were available for six states for high school math; these data are shown in Figure 19. A 2001-02 average proficient rate of 29% for students with disabilities on the alternate assessment increased to 59% by 2004-05. Five states showed gains, with the most dramatic being that shown by the state of Washington—from 8% of students with disabilities proficient in 2001-02 to 47% of students with disabilities proficient in 2004-05.

Figure 19. Performance Trends for Students with Disabilities on High School Math Alternate Assessments


Discussion

Publicly reported data on the participation and performance of students with disabilities provide important information about how well states are meeting their reporting responsibilities. Where states are reporting well, the public data reveal actual student participation and performance trends. In this report, we looked at the public reporting of state assessment participation and performance data over a four-year time span, from 2001-02 through 2004-05. The start of this time period coincides with the 2001-02 baseline year for determining Adequate Yearly Progress (AYP) and the beginning of implementation of NCLB. The results reflect the NCLB mandate to assess all students, including IDEA-eligible and Section 504-eligible students, and to report not only schoolwide results for AYP calculations but also disaggregated assessment data for subgroups, one of which is students with disabilities.

The time frame of this report also covers the beginning implementation of the 2004 IDEA reauthorization. IDEA 2004 reinforced IDEA 1997, which for the first time in a special education law required state education agencies to report participation and performance results for students with disabilities with the same frequency and detail as for students without disabilities. By studying trends over these years, we asked: (1) whether progress had been made by states in publicly reporting the data of students with disabilities on state assessments, and (2) whether trends existed in the four years of participation and performance data for students with disabilities.

Given the increasing emphasis on public reporting, it is not surprising to find that more states reported more of the data required in 2004-05, the last year of analysis, than in the initial years. It is disappointing, nevertheless, to find that the number of states for which data were available across the four years was relatively small. For the regular assessment, only slightly over half of the states had performance data across this time period, and fewer than 10 states had participation data for each of the four years studied. Even with a more lenient approach to looking at alternate assessment data (i.e., including a state if it had at least three years of data), only eight states reported performance data for the alternate assessment across the time period, and no states reported participation data in a consistent manner across each of the four years (though three-year trend data starting in 2002-03 were available for two states).

Discrepancies between the numbers of states reporting performance data and the numbers reporting participation data for both the regular assessment and the alternate assessment are worrisome. Performance data have little meaning if we do not know the number and percentage of students included in the data on which performance results are based. Furthermore, the finding that the gap between the reporting of proficiency data and participation data narrowed and then increased again, with gaps going from 6% to 2% to 3% to 9% (see Table 2), calls for explanation. It is unclear whether the differences in reporting are related to accountability requirements or can be explained in some other way, such as changes in data management systems making it difficult to retrieve participation data for a year or two.
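
The cited gaps follow directly from Table 2: for each year, subtract the percent of assessments with disaggregated participation data from the percent with disaggregated performance data. A quick check:

```python
# Reporting gaps from Table 2: (participation %, performance %) by year.
table2 = {
    "2001-02": (82, 88),
    "2002-03": (81, 83),
    "2003-04": (84, 87),
    "2004-05": (79, 88),
}
for year, (participation, performance) in table2.items():
    print(year, performance - participation)  # gaps: 6, 2, 3, 9
```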

Trends in the Public Reporting of Assessment Data

When we examined reporting year by year, we found considerable evidence of improvements in reporting. For example, 48 states reported at least some disaggregated results for regular assessments of students with disabilities in school year 2001-02. By 2004-05 all 50 states included at least some disaggregated data in their reporting. While this trend seems positive, it reflects a global look at reporting, counting states even if they report only part of what should be reported. The number of states reporting disaggregated participation and performance data for all of their regular state assessments was only 36 (72%) in 2004-05, just one more than the lowest number during the four-year period. Over one quarter of the states (14 states, or 28%) did not publicly report assessment data for students with disabilities for all of their regular assessments in 2004-05.

Greater increases in the number of states reporting were found for alternate assessments. In 2001-02 only 32 states reported any alternate assessment data (either participation or performance, or both). This increased to 47 states (94%) by 2004-05. Thurlow and Wiley (2006) suggested that the low number of states reporting alternate assessment data in 2001-02, in comparison to the number reporting regular assessment data (48 states), might have been partly attributable to the longer time needed to develop state alternate assessments and meet reporting requirements. Although the increases in reporting may support this hypothesis, it does not explain why some states that reported alternate assessment results did not also report regular assessment results.

The level of detail in which states reported participation data for each of their regular assessments increased across the four years studied. Reporting of the percent of students tested and the percent of students not tested showed increases of nearly 20 percentage points, based on all of the regular assessments administered. Still, the percent of students tested was reported for only slightly over half of the 2004-05 assessments, and the percent of students not tested for only about one third. The percent exempt/excluded and the percent absent were reported even less often. These low rates of reporting occurred despite the fact that the number of students tested was reported for 82% of the assessments. Similar discrepancies and low rates of reporting were observed for details about alternate assessment participation, although the percentage for which details were provided was sometimes higher than for the regular assessment.

Trends in Participation and Performance

Analyses of trends in actual student participation or performance are dependent on the data available. In this study we looked at four years of data, from 2001-02 through 2004-05. Despite relatively high year-to-year rates of reporting, with the rates generally increasing each year, the data available for examining participation and performance trends were limited. Performance data met our criteria more often than participation data, a finding that should raise concerns. To really understand performance results, it is necessary to have a good understanding of participation rates, including the details surrounding those rates.

Even though there were limited numbers of states with data available across the years 2001-02 through 2004-05, we did examine data from those states with data available. In doing so, we recognize the limited scope of these analyses. For instance, those states that reported publicly might be the states in which students are doing better in terms of either participation or performance, and so relying on these data for a picture of trends may overestimate participation or performance across the country or otherwise misidentify trends.

The fact that a state is included here is positive: it indicates that the state has consistently reported data disaggregated for students with disabilities across years. Unfortunately, only eight states did so for participation in regular reading assessments, and only nine states did so for participation in regular math assessments. For those states that reported, participation rates for the regular assessment were relatively stable across years, and the median rates across states were above 95% by the 2004-05 school year. This value is notable because it corresponds to the NCLB accountability criterion that a school must have a participation rate of at least 95%, overall and for each subgroup, to be considered to meet AYP requirements. In the end, however, the inferences that can be drawn about trends in the regular assessment participation of students with disabilities are limited by the small number of states with these data.

Trends in regular assessment performance may be easier to gauge based on the data publicly reported by states. Many more states reported performance data than had reported participation data, and this difference is magnified when looking at data across years. Data were available for between 24 and 28 states, depending on the school level (elementary, middle, high school) and the content area. Nineteen states reported data for all of the content and school levels that we examined. Based on the states with data across years, average percentages of students with disabilities performing at the proficient or above level showed moderate increases across the four years for both reading and math in elementary and middle schools. Performance data for high school students did not show the same kinds of gains. This may be associated with increasing participation rates, since the last students to be included in assessments may have been the lowest performing.

Results across the years also showed lower percentages of high school and middle school students demonstrating proficient or above performance, as compared to elementary school students. This tendency held for both reading and math. Performance of students at the middle school level seemed to be increasing (in terms of the percentage of students proficient or above), more so for reading than for math. The high school data indicated relatively few students with disabilities proficient or above, and even minor decreases in performance across years in many states. These findings, though based on just half the states, mirror those in other recent research efforts (Spellings, 2007; Thurlow, Altman, Cormier, & Moen, 2008).

Trends in performance showed more variability from year to year than trends in participation. Several states reported increases in the percentage of students proficient and above, with changes of as much as 30 percentage points over the four years. At the same time, several other states showed marked drops in the percentage of students proficient and above, including drops as large as 50 percentage points (middle school math in Texas) and 38 percentage points (high school reading in New Mexico). These types of large shifts may reflect a variety of factors, such as changes in the assessment system or curricular and instructional improvements, and further complicate inferences about trends in performance over time.

Alternate assessment participation data could not be analyzed for trends as the methods of calculation and consistency in reporting do not reach back to the first year covered in this report. It is our hope that a similar analysis completed in the future could uncover changes in student participation in alternate assessment systems, especially in relation to the recent changes in guidance and policy surrounding these assessments.

Alternate assessment performance data were available from only six states at the high school level and eight states at the elementary and middle school levels, even when analysis criteria were relaxed to require data from only three years. The percentage of students performing at a proficient or above level on states’ alternate assessments was generally higher than it was for students with disabilities on regular assessments, and gains tended to be larger across the four years, from 2001-02 through 2004-05. The discrepancy between regular assessment percentages and alternate assessment percentages (for example, 29% versus 70% proficient and above for middle school reading) raises the question of whether states have adopted somewhat less rigorous criteria for achieving proficiency on alternate assessments. Of course, these data, and thus the questions that they raise, are limited by the small number of states with data sufficient for the current analysis.

Conclusions

It is vital to treat the changing standards, assessments, policies, and practices within states as a reference point when drawing conclusions from a trends analysis such as this. Changes in participation guidelines within states lead to noticeable changes in the tested population and in states' overall assessment performance. This is especially true for small subgroups such as students with disabilities. For example, in several states an increasing number of students are being tested on grade-level content using the regular assessment, many of them using accommodations. It is quite possible that this new testing population includes a percentage of students who were previously assessed on below grade-level content standards or who were included in alternate assessments. Also, states that were among the first to approve an alternate assessment option for their students have recently begun to raise the bar on the content knowledge required to pass such assessments, which could result in uneven trends in student performance.

Although the analyses of trends identify some areas of increased reporting and improved achievement data, they also re-emphasize the need for states to publicly report data for educators, parents, and others to see. Further, our findings make evident the need for states to include all students with disabilities so that it is possible to examine the same data points across time.

Results of analyses of 2005-06 school year data, currently in process, will provide some indication of whether the trends reported here have changed. Those data should be particularly comprehensive, since 2005-06 was the first year in which all states were required by NCLB to test all students in grades 3 through 8 and once in high school. Continued investigations of not only trends but also the data available for examining trends are an important step in evaluating the increases in participation and performance of students with disabilities over time.


References

Bielinski, J., Thurlow, M. L., Callender, S., & Bolt, S. (2001). On the road to accountability: Reporting outcomes for students with disabilities (Technical Report 32). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/TechnicalReport32.html.

Education for All Handicapped Children Act. (1975). Public Law 94-142. Washington, DC: U. S. Government Printing Office.

Elementary and Secondary Education Act. (1965). Public Law 89-10. Washington, DC: U. S. Government Printing Office.

Improving America’s Schools Act. (1994). Public Law 103-382. Washington, DC: U. S. Government Printing Office.

Individuals with Disabilities Education Act. (1997). Public Law 105-17. Washington, DC: U. S. Government Printing Office.

Individuals with Disabilities Education Improvement Act. (2004). Public Law 108-446. Washington, DC: U. S. Government Printing Office.

Klein, J. A., Wiley, H. I., & Thurlow, M. L. (2006). Uneven transparency: NCLB tests take precedence in public assessment reporting for students with disabilities (Technical Report 43). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/TechnicalReport43.html.

No Child Left Behind Act. (2001). Public Law 107-110. Washington, DC: U. S. Government Printing Office.

Rehabilitation Act of 1973. (1973). Public Law 93-112. Washington, DC: U. S. Government Printing Office.

Spellings, M. (2007). Building on results: A blueprint for strengthening the No Child Left Behind Act. Washington, DC: U.S. Department of Education.

Thurlow, M. L., Altman, J. R., Cormier, M., & Moen, R. (2008). Annual performance reports: 2005-2006 state assessment data. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/APRreport2005-2006.pdf.

Thurlow, M. L., House, A., Boys, C., Scott, D., & Ysseldyke, J. (2000). State participation and accommodations policies for students with disabilities: 1999 update (Synthesis Report 33). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/Synthesis33.html.

Thurlow, M. L., Langenfeld, K. L., Nelson, J. R., Shin, H., & Coleman, J. E. (1998). State accountability reports: What are states saying about students with disabilities? (Technical Report 20). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/TechnicalReport20.html.

Thurlow, M. L., Nelson, J. R., Teelucksingh, E., & Ysseldyke, J. E. (2000). Where’s Waldo? A third search for students with disabilities in state accountability reports (Technical Report 25). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/TechnicalReport25.html.

Thurlow, M. L., & Wiley, H. I. (2004). Almost there in public reporting of assessment results for students with disabilities (Technical Report 39). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://cehd.umn.edu/NCEO/OnlinePubs/Technical39.htm.

Thurlow, M., Wiley, H. I., & Bielinski, J. (2003). Going public: What 2000–2001 reports tell us about the performance of students with disabilities (Technical Report 35). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/TechnicalReport35.html.

VanGetson, G. R., & Thurlow, M. L. (2007). Nearing the target in disaggregated subgroup reporting to the public on 2004-05 assessment results (Technical Report 46). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/Tech46/.

Wiley, H. I., Thurlow, M. L., & Klein, J. A. (2005). Steady progress: State public reporting practices for students with disabilities after the first year of NCLB (2002-03) (Technical Report 40). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/TechnicalReport40.html.

Ysseldyke, J. E., Thurlow, M. L., Langenfeld, K., Nelson, J. R., Teelucksingh, E., & Seyfarth, A. (1998). Educational results for students with disabilities: What do these data tell us? (Technical Report 23). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at www.nceo.info/OnlinePubs/TechnicalReport23.html.