Understanding Out-of-Level Testing in Local Schools: A Second Case Study of Policy Implementation and Effects


Out-of-Level Testing Project Report 12

Published by the National Center on Educational Outcomes

Prepared by:
Jane Minnema • Martha Thurlow • Sandra Hopfengardner Warren

September 2004


This document has been archived by NCEO because some of the information it contains may be out of date.


Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Minnema, J., Thurlow, M., & Warren, S. H. (2004). Understanding out-of-level testing in local schools: A second case study of policy implementation and effects (Out-of-Level Testing Project Report 12). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved [today's date], from the World Wide Web: http://cehd.umn.edu/NCEO/OnlinePubs/OOLT12.html


Overview

Standards-based instruction, with the aim of grade-level achievement for all students, is undoubtedly the most comprehensive educational reform of the recent past. A hallmark of this reform effort is the measurement of student academic achievement with large-scale assessments that are used for accountability purposes. Assessment results are to be made public as a way of accounting for the academic achievement of all subgroups of students. Just as teachers, parents, and students are interested in individual student achievement, policymakers and the public in general are interested in student group achievement that indicates how specific schools, school districts, and states are performing. Never before have schools and states been under such scrutiny for demonstrating improved student outcomes for specific subgroups of students – students with disabilities, English language learners, students receiving free and reduced lunch, and students in general education.

Today’s widespread emphasis on statewide testing used for accountability purposes has essentially been driven by federal mandates. The Elementary and Secondary Education Act (ESEA) of 1994 included a strong mandate that all students, including those with disabilities, participate in states’ standards-based assessments and be counted in states’ accountability programs. Following a similar course in policy implementation, the 1997 amendments to the Individuals with Disabilities Education Act required the participation of students with disabilities in state and district-wide assessments. Most recently, the re-authorization of ESEA, No Child Left Behind Act (NCLB) of 2001, has re-focused states’ attention toward ensuring access to challenging, grade-level standards that are designed for the student’s grade of enrollment.

Reviewing the chronology of federal law that has strengthened the inclusion of students with disabilities in states’ large-scale assessment and accountability programs, and the new emphasis on grade-level standards, does not capture the political and controversial issues that have surrounded the implementation of assessments. This is certainly true for out-of-level testing, which refers to the practice of testing students below their grade of enrollment in states’ large-scale assessment programs. It is possible that no approach to testing has prompted such controversy at all levels of the American educational system – local, state, and federal – as has out-of-level testing.


Out-of-Level Testing Background

Including all subgroups of students in statewide testing has been challenging for states. In order to administer more inclusive large-scale assessments, 14 states in 2001-2002 added an approach to their testing programs known as “out-of-level testing” so that some students could be tested at test levels below their grade of enrollment (Minnema & Thurlow, 2003). Policymakers, educators, and parents of students with disabilities thought that testing a student at the level at which teachers or others indicated the student was being instructed in the classroom would yield more accurate, precise, and useful test results (Thurlow, Minnema, Bielinski, & Guven, 2003; Minnema & Thurlow, 2003). It was also thought that testing students at their “instructional level” would be less frustrating and embarrassing since students could fully engage in completing test items. Other commonly held beliefs about out-of-level testing were that it would produce improved student motivation when taking tests, better attending behavior during test-taking sessions, and enhanced student self-esteem when students answered test items that covered content they knew.

Also circulating in practice were attitudes and beliefs that discounted the value of out-of-level testing (Thurlow, Elliott, & Ysseldyke, 1999). Citing different reasons, other policymakers, educators, and parents thought that out-of-level testing would not yield more accurate, precise, and useful test results because students were tested on material that was developed for much younger students (Thurlow & Minnema, 2001). Since the test material would most likely not be age appropriate, test motivation, attending behavior, and students’ self-esteem could be adversely affected. Possibly the worst consequence of testing students with disabilities out of level is the effect of setting lower expectations for students’ classroom performance or test level selection (Minnema, Thurlow, & Warren, 2004a). In addition, public reporting of out-of-level test results was particularly problematic because data managers were unclear as to how to report the test scores: by the grade of the student’s test level or by the grade of enrollment in school (Minnema & Thurlow, 2003).

The debate over the merit and worth of testing students with disabilities below their grade of enrollment continues to this day. Researchers have begun to tease apart the complications of local and state level reporting (Minnema & Thurlow, 2003), uneven policy implementation (Minnema et al., 2004a), the prevalence of below-grade level testing (Thurlow et al., 2003), and other such issues that surround the implementation of out-of-level testing policies. Nevertheless, research has yet to weigh in on the factual basis of many of the beliefs, attitudes, and perceptions that surface in educational practice.

In order to understand how states actually administered out-of-level testing policies at the local level, we designed a case study to look closely at local educational agencies where students with disabilities were tested below their grade of enrollment. We also sought to determine whether the many popular beliefs in practice about out-of-level testing were actually true. To meet these aims, we implemented two research studies in two school districts in two states. Both of these states were administering out-of-level tests as part of their large-scale assessment programs during the school year 2001-2002 when we collected our data.

This report is an account of the second case study of large-scale assessment practices in a local educational agency where students with disabilities were administered the state’s standards-based tests out of level. The first report (Minnema et al., 2004a) provided the results from the first case study, conducted in another school district in another state. The overall purpose of our research project is to describe the specific effects of testing students with disabilities out of level as well as teachers’ and students’ perceptions of these effects.


State Context

In order to understand our qualitative and quantitative findings about out-of-level testing, it is important to first consider the statewide testing program that was used for below grade level testing in the state selected for the second case study. In 2001-2002, the large-scale assessment program included two criterion-referenced assessments designed to measure students’ mastery of the state’s academic content standards. One assessment measured academic achievement in the content areas of English language arts, mathematics, science, and social studies skills of students in 4th and 8th grades. This test was high stakes, meaning that students who scored below the approaching basic achievement level on both the English language arts and mathematics portions were not promoted to the next grade in school. A second test, also high stakes for high school graduation, assessed English language arts, mathematics, science, and social studies skills of students in 10th grade. This test, a graduation equivalency exam, could be retaken if the student did not pass it. In order to receive a high school diploma, students were required to score at the approaching basic achievement level on both English language arts and mathematics sections of the assessment, as well as either the science or social studies portion. Additionally, a norm-referenced test was administered to students in the 3rd, 5th, 6th, and 8th grades, and another version of this norm-referenced instrument was administered to students in the 9th grade. These tests were given as a statewide program of norm-referenced measurement of student academic achievement.

Students with disabilities in 3rd through 9th grade were eligible to participate in an out-of-level test in lieu of the criterion-referenced, standards-based measures administered in the 4th and 8th grades. The state’s out-of-level testing policy allowed different content areas of the norm-referenced test to be taken at different grade levels according to the student’s abilities. Policy did require that either the English language arts or mathematics subtest be presented at least three grade levels below the student’s grade of enrollment. The science and social studies portions had to be administered at the same test level as either the English language arts or mathematics subtest. Administering all subtests at one grade level below the student’s grade of enrollment was also an option.

State policy required that students meet selection criteria for out-of-level testing and their Individualized Education Programs (IEPs) had to reflect the need for below-grade level testing. These criteria included scoring at the unsatisfactory level on the previous year’s general standards-based assessment in English language arts or mathematics or attaining a total score in reading, language, or mathematics at or below the fifth percentile on the norm-referenced test. In addition, a student’s IEP needed to reflect a functioning grade level in English language arts or mathematics that was at least three grade levels below the grade of enrollment in school. For these students, the norm-referenced test was administered below grade level instead of the general large-scale assessment.

A final condition that needed to be met in order to administer an out-of-level test was parental permission. Parents were required to sign a state-developed form that indicated that the family understood that by taking an out-of-level test, the child was not preparing academically for a standard high school diploma. In other words, a student who was not achieving on-grade level in elementary or middle school would not have the necessary academic skills to pass the graduation exit exam required for high school graduation.
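To make the selection conditions described above easier to follow, the short sketch below combines them into a single check. It is an illustrative summary only, not part of the state’s actual procedures; the parameter names are invented for this example, and the thresholds simply restate the criteria in the preceding paragraphs.

    # Illustrative sketch only: restates the out-of-level eligibility conditions
    # described above. Parameter names are hypothetical, not the state's forms.

    def eligible_for_out_of_level_test(
        unsatisfactory_on_prior_state_test: bool,   # prior year's ELA or math result
        norm_referenced_percentile: float,          # total reading, language, or math score
        iep_levels_below_enrollment: int,           # functioning level documented in the IEP
        parent_permission_signed: bool,             # state-developed permission form
    ) -> bool:
        """Return True when all selection conditions sketched above are met."""
        low_prior_achievement = (
            unsatisfactory_on_prior_state_test or norm_referenced_percentile <= 5.0
        )
        far_below_grade_level = iep_levels_below_enrollment >= 3
        return low_prior_achievement and far_below_grade_level and parent_permission_signed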

The School District

Data were collected in a school district in a southern state that serves approximately 18,595 students in grades pre-kindergarten through 12th grade. The district contains 16 elementary, 6 middle, and 6 high schools. Students in this district are of predominantly Caucasian (66.0%) and African-American (29.7%) ethnic backgrounds. Within the total student population, 41.4% of the district’s students were enrolled in a free and reduced lunch program. On average, the student to teacher ratio is 16.4 students to one teacher.

Four schools from this school district were considered for this case study. School A and School B were a middle school and a high school in one middle-sized city while School C and School D were a middle and high school respectively in a small neighboring community.

School A. The middle school located in the middle-sized city served approximately 691 students in 6th through 8th grades. Of the students attending this school, 33% received free or reduced-price lunch. Class size varied, with 27% of classes having 1-20 students, 58% of classes having 21-26 students, and 15% of classes having 27 or more students. There were, on average, 16.9 students per teacher employed at the school. Students at School A came from a variety of ethnic backgrounds, including 65% Caucasian, 30% African-American, 3% Asian-American, 2% Hispanic, and <1% American Indian.

School B. The middle-sized city’s high school educated approximately 1,329 students in grades 9 through 12. In 46% of the school’s classes, class size was 20 or fewer students, while 28% of classes had 21-26 students and 26% had 27 or more students. For this high school, the student-teacher ratio was approximately 16.8 students per teacher. Students receiving free or reduced-price lunch comprised 24% of the student enrollment. School B had an attendance rate of 91.1% and a dropout rate of 3.8%. The students were predominantly Caucasian (71%), followed by African-American (25%), Asian-American (2%), Hispanic (2%), and American Indian (<1%).

School C. The second middle school in this case study, located in the smaller neighboring community, served approximately 627 students in 6th through 8th grades. Of this school’s student population, 36% received free or reduced-price lunch. Class size varied, with 16% of classes having 1-20 students, 48% of classes having 21-26 students, and 36% of classes having 27 or more students. On average, there were 17.4 students per teacher employed at the school. A variety of ethnicities were represented at School C, including 79% Caucasian, 20% African-American, 1% Hispanic, and <1% American Indian and Asian-American.

School D. There were approximately 631 students in 9th through 12th grades who attended this high school located in the neighboring, smaller-sized community. Class sizes consisted of 20 or fewer students in 48% of the classrooms, with 31% of classes containing 21-26 students and 31% containing 27 or more students. There were approximately 17.1 students per teacher employed at the high school. Students who received free or reduced-price lunch comprised 21% of students enrolled. The attendance rate for School D was 93.4% with a dropout rate of 4.9%. The student body was predominantly Caucasian (81%), followed by African-American (18%), Asian-American (<1%), Hispanic (<1%), and American Indian (<1%).


Method

To study the implementation of an out-of-level testing policy at the local school level, we posed the following research questions:

(1)  What are the instructional effects on students with disabilities who are tested out of level in statewide assessments?

(2)  What are teachers’ learning expectations for students with disabilities who are tested out of level?

(3)  How are students with disabilities selected for an out-of-level test?

(4)  How do students and their parents perceive out-of-level testing?

Research design. We used a mixed methods approach to our case study, in which a “case” was defined as a school district. Although the school district was relatively large, limited resources constrained our data collection to a limited number of participating schools. Two district-level personnel, a special education director and a special education coordinator, selected and recruited two middle schools and two high schools to participate in our study. No elementary schools participated because this state did not administer out-of-level state tests to elementary-aged students. At the school and district levels, we collected both narrative and numeric data through interview, survey, and document review data collection techniques.

Sample. The special education coordinator invited all special and general education teachers and administrators from each school to participate in our case study. Our purposive sample consisted of special education teachers (n = 15), general education teachers (n = 10), and principals (n = 3). To capture a district-level perspective, we also included a district-level test coordinator in our data collection activities. Each participant received a gift certificate from a local department store as a thank you for the time invested in our case study.

Instruments. Our data collection techniques included face-to-face interview protocols and a document review data collection sheet. The face-to-face interviews required approximately 25 minutes for school personnel. The purpose of these educator interviews was to gather their opinions about out-of-level testing and their perceptions of student experiences with it (see Appendix A for copies of the interview protocols). The purpose of the IEP document reviews was to describe instructional features of the students tested out of level. All instruments were pilot tested prior to collecting data.

Procedures. Two NCEO researchers collected all data on-site in the schools. The student interview protocol was reviewed with an NCEO researcher prior to conducting the student interviews to ensure an appropriate interview process. To introduce the research project to district and school staff, we recruited one contact person at the district level. She agreed to read a letter from NCEO explaining the research project prior to our visit. Once on site, she helped distribute and collect written surveys as well as access student IEPs for the document review. In addition, the contact person worked with school staff to schedule the face-to-face interviews.

One data collection activity, addressing the instructional effects of testing students with disabilities below their grade of enrollment, had to be devised during the study because of contextual factors beyond our control. After data collection had been initiated, we learned that the school district was under a decree from the U.S. Office for Civil Rights to provide instruction to all students at the same pace in every general education classroom. In other words, all students, including students with disabilities who were tested out of level, progressed through the curriculum in all content areas at the same pace whether or not the necessary information had been learned. Rather than collecting interview data to answer our research question about the instructional effects of out-of-level testing, we created “scenarios” that described typical students in their classrooms based on conversations with general and special education teachers. We also asked teachers to describe special education instruction as it was delivered in their school. These discussions became part of our interviews, and so were tape recorded and transcribed. These data were analyzed for information about the instruction and learning of students with disabilities who were tested out of level.

Data analysis. We used two basic approaches to analyze the case study data. For qualitative data analysis, all educator interviews were tape recorded, transcribed, and subjected to a content analysis that yielded thematic findings. Since the student interview responses were briefer than the educator responses, we tabulated by response content within the student data set to create frequencies of types of student responses. IEP review data were analyzed with descriptive statistics.
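As a concrete illustration of the tabulation step, the short sketch below shows one way coded student responses could be counted into frequencies. The response categories shown are invented for illustration and are not the study’s actual codes.

    # Minimal sketch, assuming responses have already been coded by content;
    # the category labels below are hypothetical examples, not study data.
    from collections import Counter

    coded_student_responses = [
        "test felt easier", "felt nervous", "test felt easier", "noticed no difference",
    ]

    frequencies = Counter(coded_student_responses)
    for category, count in frequencies.most_common():
        print(f"{category}: {count}")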


Findings

Our findings are organized according to the three data collection activities employed for this case study. The IEP document review is presented first with the scenario results presented second. Face-to-face interview results are presented third.

 

IEP Document Review Results

We reviewed students’ IEPs from Schools A, B, and C where students were tested out of level in statewide testing over three school years. Since no students were tested out of level in School D, the smaller-sized city high school, we did not review IEPs there.

A summary of the number of out-of-level tests in each school is shown in Table 1. In School A, a total of 21 out-of-level tests were administered, with 9 tests given in school year 1999-2000, 8 tests given in school year 2000-2001, and 4 tests given in school year 2001-2002. Three students with disabilities were tested out of level all three years, with one of those students tested only in math two of the three years. Other students were tested in both reading and math for either one or two years. Four out-of-level tests were taken in School B during school year 1999-2000, one out-of-level test during school year 2000-2001, and no out-of-level tests in 2001-2002. There were four out-of-level tests given in School C during school year 1999-2000, one out-of-level test during school year 2000-2001, and no out-of-level tests during school year 2001-2002. No students in either School B or School C were tested out of level all three years.

Table 1. Number of Out-of-Level Tests by School Year by School

School               1999-2000   2000-2001   2001-2002
School A (n = 3)*        9           8           4
School B (n = 0)*        4           1           0
School C (n = 0)*        4           1           0

* Number of students with disabilities tested out of level for all three years.

For each school, we determined the number of grade levels below the students’ grade levels of enrollment that out-of-level tests were administered in reading/language arts and math. These results are presented in Table 2 for each school. Overall, the numbers of out-of-level tests for the two content areas were about the same.

As shown in the table for reading/language arts, most of the tests taken out of level in School A were 4 grades below the grade of enrollment (n = 8). Others ranged between 3 and 6 grade levels below. Of the 3 tests administered out of level in School B, test levels were 4, 5, and 6 levels below students’ grade level in school. There were also 4 out-of-level tests in School C with 1 test at each level from 2 to 5 levels below the grade of enrollment.

Of the 15 math tests administered out of level in School A, the largest number of tests (n = 5) were given 4 levels below grade level. The remaining tests were administered from 1 level below to 6 levels below grade level. In School B, 1 math test was administered 4 levels below and 1 test at 5 levels below; 2 tests were administered 7 grade levels below. The number and test levels of math tests given in School C were spread from 1 level below to 5 levels below grade level.
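For readers who want the “grade levels below” computation spelled out, the sketch below shows how such counts can be derived by subtracting the test grade level from the grade of enrollment and tallying the differences, as summarized in Table 2. The sample records are invented for illustration and are not the study’s data.

    # Illustrative sketch: tally how many grade levels below enrollment each
    # test was administered. The records below are hypothetical examples.
    from collections import Counter

    records = [(7, 3), (8, 4), (8, 4), (6, 5)]  # (grade enrolled, test grade level)

    levels_below = Counter(enrolled - tested for enrolled, tested in records)
    for levels, count in sorted(levels_below.items()):
        print(f"{levels} grade level(s) below: {count} test(s)")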

 

Table 2. Number of Test Grade Levels Below Students’ Grade of Enrollment by Test and School

                       Reading/Language Arts             Math
Grade Levels Below     School A  School B  School C      School A  School B  School C
1                         --        --        --            3         --        1
2                         --        --        1             --        --        --
3                         2         --        1             3         --        1
4                         8         1         1             5         1         2
5                         4         1         1             3         1         1
6                         3         1         --            1         --        --
7                         --        --        --            --        2         --
Total                     17        3         4             15        4         5
Missing                   3         --        --            --        --        --
On Grade Level            --        --        --            --        5         --
Above Grade Level         --        --        --            --        1         --

 

Student descriptors were gleaned from students’ IEPs for each school. We developed summaries of students’ disability categories, grades of enrollment, the grade levels at which students were tested in reading/language arts and math, and the “instructional grade level” for reading/language arts and math. The instructional grade level was determined by the results of norm-referenced tests that were administered to determine the test grade level at which an out-of-level test should be given. Most of the students received the Kaufman Test of Educational Achievement, 2nd Edition (KTEA-II), although a few students received either the Iowa Test of Basic Skills (ITBS) or the Woodcock-Johnson-Revised (WJ-R) to determine an out-of-level test level.

Table 3 shows the student descriptors for the middle school located in the middle-sized city, which had 11 students with disabilities tested out of level across the school years that we studied. Eight of the students tested out of level had learning disabilities (LD) while three students had mild cognitive disabilities (MCD). Two of the students with LD had secondary disabilities, one being a speech/language disability (SPL) and the other a hearing disability (HD). Students in School A were tested out of level for one, two, or three school years. Specifically, four students were tested out of level for only one school year; these students included one student in 7th grade and three students in 8th grade. Six students were tested out of level in two consecutive school years. One student was tested in both 5th and 6th grade, another student in 6th and 7th grades, and three students tested in 7th and 8th grades. One student repeated 7th grade and was tested out of level in both of those school years. Three students were tested out of level over three consecutive school years in 5th, 6th, and 7th grades.

In reading/language arts, most of the students in School A were tested at or close to their instructional level as determined by a standardized test. In fact, two students were tested at the grade of enrollment in reading/language arts and four students were tested at the grade of enrollment in math. The exception was student 1, who was tested at grade level 5 in the first year and grade level 6 in the second and third years even though the standardized test indicated an instructional level of 1.9 in the first year of testing. One other student was tested one grade level above the grade level indicated by a standardized test.

Compared to reading/language arts testing, more students were tested at more grade levels above their instructional level in math. Four students were tested either 3 or 4 grade levels above the level indicated by the standardized test. These students were in 6th, 7th, or 8th grade in school, but scored at approximately the 2nd or 3rd grade levels in math. All of these students were tested out of level in math at test levels closer to their grade of enrollment in school. One student, student 3, was tested at the 7th grade level when in 6th grade and at the 6th grade level when in 7th grade. Reading/language arts tests were administered in the same way, with the second year of testing dropping down a test level. Standardized testing during the second year indicated that the out-of-level test could be a grade level lower. Although these levels appeared inaccurate, both researchers recorded the instructional and testing levels exactly as documented in the student’s IEP.

Table 3. Out-of-Level Tested Student Features by Student in School A

                                       Reading/Language Arts              Math
Student  Disability  Grade Level      Instructional   Grade Level    Instructional   Grade Level
                     Enrolled         Grade Level*    Tested         Grade Level*    Tested
1        LD/SPL      5                1.9             5              2.4             2
                     5                --              6              --              2
                     6                --              6              --              2
2        MCD         5                2.3             2              2.1             2
                     6                2.3             2              2.1             2
                     7                --              2              --              2
3        LD/HD       6                2.5             3              4.5             7
                     7                1.9–3.6         2              2.1–6.6         6
4        LD          7                2.0             3              3.8             7
5        LD          6                --              2              --              3
                     7                2.3             2              3.3             3
                     7                2.1             3              3.9             7
6        MCD         7                --              2              --              3
                     8                2.2             2              2.8             3
7        LD          7                2               3              3               7
                     8                --              2              --              7
8        LD          7                --              3              --              3
                     8                2.8             3              3.5             3
9        LD          8                4.3             4              6.8             7
10       MCD         8                2.4             2              2.1             2
11       LD          8                --              --             --              --

* Based on standardized test scores. Rows without a student number are additional school years of testing for the student listed above.

Table 4 summarizes information on the five students with disabilities in the middle-sized city’s high school who were tested out of level. Three students had mild cognitive disabilities (MCD), one had an other health impairment (OHI), and one had an emotional behavioral disability (EBD). The student with OHI had an orthopedic disability (OD) as a secondary disability. One of the five students was in 8th grade; the remaining four students were in 9th grade. None of these students were tested at a grade level commensurate with the grade of enrollment in school. All students were tested at levels consistent with instructional levels set according to district criteria, in both reading/language arts and math. The levels at which these students with disabilities were tested ranged from 4 to 7 grade levels below the grade of enrollment.

Table 4. Out-of-Level Tested Student Features by Student in School B

                                       Reading/Language Arts              Math
Student  Disability  Grade Level      Instructional   Grade Level    Instructional   Grade Level
                     Enrolled         Grade Level*    Tested         Grade Level*    Tested
1        EBD         8                4.6             4              4.0             4
2        OHI/OD      9                4.5             5              5.5             4
3        MCD         9                4.7             4              2.2             2
4        MCD         9                3.1             3              2.2             2
5        MCD         9                --              --             --              --

* Based on standardized test scores

We display descriptors of students with disabilities (n = 5) who were tested out of level in statewide testing in School C in Table 5. Among the five students for whom we reviewed IEPs, there were two students with learning disabilities (LD), two students with mild cognitive disabilities (MCD), and one student with a hearing disability (HD). One of the students with a mild cognitive disability and the student with a hearing disability had speech/language disabilities (SPL) as a secondary disability. There were two 5th grade students and three 6th grade students tested out of level.

All but one student was tested out of level in reading/language arts and math at the level indicated by the standardized test. When in 6th grade, student 5 was tested in math at the 1st grade level, but there was no instructional grade level or test grade level data available. Test data for reading/language arts were not available for this student either.

 

Table 5. Out-of-Level Tested Student Features by Student in School C

                                       Reading/Language Arts              Math
Student  Disability  Grade Level      Instructional   Grade Level    Instructional   Grade Level
                     Enrolled         Grade Level*    Tested         Grade Level*    Tested
1        HD/SPL      5                2.1             2              3.7             4
2        LD          5                3.7             3              2.1             2
3        MCD         6                1               1              2               2
4        MCD/SPL     6                2.8             2              4.1             2
5        LD          6                --              --             --              1

* Based on standardized test scores

Additional information was examined as part of the document review for the students tested out of level in the three schools. For example, we checked for the signed form, required by the state educational agency, indicating parent approval to test below the grade in which their children were enrolled in school. One file was missing this form, but notes in the file indicated that a conversation with the parent had occurred by telephone and that the parent had agreed with the decision to test out of level.

Data also were collected on the type of instructional setting in which core content was delivered for students who were tested out of level. In the two middle schools (Schools A and C), all of the students who were tested out of level were included in general education instruction for part of each school day. Typically, students attended science and social studies classes in the general education classroom, but received reading/language arts and math instruction in a special education resource room. In School B, the only high school where students with disabilities were tested out of level, students were fully included in general education courses with either consultative special education services provided periodically throughout the school year or resource room support provided for specific core classes.

Our document review also considered the test accommodations that were recommended for specific students. We found that of the 21 students tested out of level in Schools A, B, and C, the majority (n = 18) had the same test accommodations documented in their IEPs. These test accommodations included reading test directions aloud, extending test taking time, administering the test either individually or in a small group, repeating oral test directions, and using a calculator. The only exceptions were two students who did not need to use a calculator and one student who did not require extra time to take the test. One of the students who did not need to use a calculator did need someone to record test answers. We found no IEPs with missing test accommodations information for those students who were tested out of level.

 

Scenario Results

Table 6 presents a description of the features of instruction for students tested out of level in School A. These findings were gleaned from student scenarios that we used when interviewing teachers to determine the instructional effects of testing students out of level in these schools. In this middle school, the only students who participated in out-of-level tests were those students with disabilities who were not included in general education instruction. Instead, instruction occurred in resource classrooms at instructional levels below the grade level in which students were enrolled in school.

Table 6. Features of Instruction for Students Tested Out of Level in School A

General Education
  Setting: Remedial math and reading for general and special education students; included for social studies and science.
  Alignment: Alternate curriculum with attempt at aligning with general education curriculum; curriculum aligned to state standards.
  Pacing: All classes on same topic at varying depths.
  Accommodations: Special education teacher develops accommodations; educational assistants implement accommodations; general education teacher develops modified tests.
  Staffing: Lead teacher provides supplemental support; educational assistants provide 1:1 support.

Special Education
  Setting: Resource room for math and language arts.*
  Alignment: --
  Pacing: Tried to stay with general education pacing in alternate curriculum.
  Accommodations: LEA approved accommodations.
  Staffing: Special education teacher instructs resource classroom.

* All students tested out of level attend resource room.

 

In Table 7, we present the features of special education instruction delivery in the middle-sized city’s high school. Students who had been tested out of level in earlier grades are not eligible to take the high stakes graduation exam. Because of this, students with disabilities are not participating in large-scale assessments at the high school level. Students with disabilities who have been tested on-grade level are included in general education coursework, but are learning surface content rather than the in-depth content that supports meeting academic standards.

Table 7. Features of Instruction for Students Tested Out of Level in School B

General Education
  Setting: --
  Alignment: One curriculum with one set of standards.
  Pacing: All classes on same topic each day.
  Accommodations: --
  Staffing: --

Special Education
  Setting: Full inclusion for all courses, or ½ day high school and ½ day vocational school.
  Alignment: Required general education instruction for Carnegie Units.
  Pacing: Full day high school, same pace as general education with less depth.
  Accommodations: --
  Staffing: --

 

Table 8 shows the features of instruction for students in School C. The middle school students in the smaller city were not achieving on-grade level even when included in general education instruction. The required instructional pace was too rapid for students with disabilities to learn content material in-depth. Most teaching occurred in the resource room where students were instructed below their grade of enrollment and did not meet grade level content standards. All students tested out of level in statewide testing received special education instruction in a resource classroom.

Table 8. Features of Instruction for Students Tested Out of Level in School C

General Education
  Setting: Included for social studies and science.
  Alignment: Curriculum aligned to state standards.
  Pacing: All classes on same topic each day.
  Accommodations: SEA approved accommodations.
  Staffing: Educational assistants and special education teachers support in classrooms.

Special Education
  Setting: Resource room for math and language arts.*
  Alignment: Most teaching in resource room with lower level curriculum; not meeting grade level standards.
  Pacing: Does not support learning in depth as pace is too rapid.
  Accommodations: --
  Staffing: --

* All students tested out of level attend resource room.

A different pattern emerged in the data collected from the high school in the small city (see Table 9). All students with disabilities were fully included in general education coursework with support in general education classrooms and in the resource room as needed. Each student was expected to pass all high school courses, accrue the necessary credits, and receive a high school diploma. Even those students previously tested out of level were expected to meet grade level standards. This high school is known in the surrounding community as one that promotes high school graduation for students with disabilities.

 

Table 9. Features of Instruction for Students Tested Out of Level in School D

General Education
  Setting: **
  Alignment: **
  Pacing: **
  Accommodations: **
  Staffing: **

Special Education
  Setting: Full inclusion; special education teachers provide resource support.*
  Alignment: Standards met to receive high school diploma.
  Pacing: Pace same as general education courses.
  Accommodations: SEA approved accommodations.
  Staffing: Educational assistants support in classroom.

* Students tested out of level fully included in general education.
** No general education teachers interviewed at this school.

 

Next, we asked teachers to think of three students in their classrooms: one with relatively “high” academic ability, one with “moderate” academic ability, and one with relatively “low” ability. Once teachers had students in mind, we asked them a series of questions about the students’ instruction to identify the qualitative differences between special education and general education instruction. In our analysis procedures, we looked more closely at the instructional differences between students in special education who were tested on-level vs. out-of-level. We report these findings in composite student scenarios that represent the instructional experience of students with disabilities who function at varying levels of academic achievement.

 

SCENARIO 1: Special Education Students with High Academic Ability

Jeremy is a 6th grade student with a math disability who attends the middle school in the larger city. He has received special education services since 2nd grade. Because he is able to use instructional and test accommodations, a calculator and number line in particular, he was able to participate in the statewide assessment at the grade in which he is enrolled in school. Jeremy receives instruction in the general education classroom setting for all core content areas. An educational assistant provides constant support during math instruction. For those concepts that are especially difficult for Jeremy to grasp, his special education teacher works either 1:1 or with a small group of students to ensure that all math standards are met.

General Conclusion: Relatively “higher” functioning students with disabilities are included in general education instruction with support. Instruction is delivered on-grade level as is the statewide assessment. These students with disabilities are learning in-depth at the general education instructional delivery pace.

 

 SCENARIO 2: Special Education Students with Moderate Academic Ability

Sara is a student in the middle school located in the small city who has been deaf since birth. She finger spells and signs, which requires a sign language interpreter for all of her core content classes. She uses numerous accommodations, including reduced assignments and test items, a small group setting for testing, and writing in workbooks and test booklets. While Sara is included in general education instruction in math, social studies, and science, she receives academic support from both her special education teacher and her sign language interpreter. Her special education teacher designs most instructional accommodations while her general education teacher modifies classroom tests. Because the general education teacher is required to pace instruction according to a decree by the U.S. Office for Civil Rights, instruction moves ahead before Sara is able to learn material in depth. She receives reading instruction daily in the special education resource room using materials that are four grade levels below the grade in which she is enrolled in school. Since math is an academic strength for Sara, she participated in the statewide test by taking the grade-level assessment with accommodations, including a read-aloud accommodation delivered in sign language.

General Conclusion: “Moderate” functioning students with disabilities are included in general education instruction with extensive teacher support combined with resource room instruction. Instruction in the general education setting is delivered on-grade level, but instruction in the special education setting is delivered below grade level. The statewide assessment was administered on-grade level with accommodations used as needed. These students with disabilities, even though included in general education classes, are not consistently learning in-depth academic content for the grade in which they are enrolled in school.

 

 SCENARIO 3: Special Education Student with Low Academic Ability

Because Joshua has a mild cognitive disability that was identified when he was of preschool age, he has received special education services each year of elementary school. His same age peers are now attending 7th grade, but Joshua is not included in any general education classroom instruction. His special education teacher uses curriculum developed for students in grade 4 with supplemental materials that range in grade level from grade 2 to grade 6. All instruction is delivered 1:1 or in small groups of students whose academic ability is similar to Joshua’s ability. Since Joshua’s academic ability was below the grade in which he is enrolled in school, his IEP team (which included his parents) decided to test him out of level in the statewide test.

 

General Conclusion: “Low” functioning students with disabilities receive all core content instruction in a special education resource classroom. Instruction is delivered at grade levels below the grade in which they are enrolled in school. These students with disabilities are tested out of level in statewide assessments.

 

Principal and Teacher Interview Results

Interview results are presented in terms of major themes. We present these for our two major groups of interviewees—principals and teachers. Since we interviewed only one district test coordinator, we present those findings in the form of direct answers to our interview questions. In analyzing our principals’ data set, we found differences across school levels; we did not find major differences of opinion across middle school and high school teachers’ responses. Thus, we present our thematic findings for the principal subgroup and teacher subgroup in different formats. For our principal data set, each theme is discussed separately in a narrative format. For our teacher data set, thematic findings are presented in a comparative chart format that highlights similarities and differences between special education and general education perspectives. Because no students had been tested out of level in School D, we report our narrative findings from that school separately from the results for the other schools.

District Test Coordinator
We posed the following seven interview questions to the district test coordinator. The responses to each question were as follows.

Q1) Do you think that out-of-level testing is beneficial for your students with disabilities? If so, why? If not, why not?

Out-of-level testing is beneficial for the students because “it gives them an opportunity to perform on the level where they’re working in the classroom.”

On the other hand, out-of-level testing results do “not give us readily comparable data for curriculum decisions with students who are not tested out of level. It’s very difficult to compare where we’re going with all of our students.” Out-of-level test data do provide a “milepost to determine how they are progressing, but … on the school level and the district level, we do get skewed results.”

Q2) Do you, or anyone else, advise your teachers about out-of-level testing? If so, what kinds of things are said?

The district test coordinator does not “personally advise teachers about out-of-level testing,” but “the information that we provide from the [test] contractor does filter down to the teachers.” Rather, the district test coordinator “works with the district special education department to interpret guidelines.” In turn, this department works with school test coordinators who “actually conduct the workshops that advise the teachers who will be the [test] administrators.”

Also, “we have information from the state that deals with a variety of issues such as test security [or] dealing with irregularities.”

Q3) Who actually decides which students with disabilities take an out-of-level test?

For each student with a disability who might be eligible for out-of-level testing, “through the IEP process the student’s level of instruction is determined. We have students who are working on several levels below grade level. Then decisions are made by the team that will determine whether they are eligible.”

Q4) Can you please describe how the decision to test a student out of level is made?

“The team would determine that by the data it has collected on what level the child is working as well as their previous test scores.” In some cases, some students had progressed academically to the point of being able to “move out of the alternate assessment … to out-of-level assessment and back to the regular program.”

Q5) How do IEP teams determine the appropriate level of an out-of-level test? Does the test level typically align with a student’s instructional grade level? Are test levels ever assigned according to a level at which a student is certain to succeed?

“I can’t say that I can be sure of that because I’m not in a position to know what’s happening in every classroom with every student. I would say it is appropriate because I know the staff that works with the program.”

“I don’t think that we make a judgment on assigning a test level where a student can succeed. The criteria are pretty set … that the State has given us as far as the test scores [in terms of] how low or high … as well as the level where the student is working academically in the classroom. I don’t think that we just assign one without a good bit of study.”

Q6) Can you please describe what happens to out-of-level test scores after a student has completed the test? How are these scores included in school reports? In district reports? In state reports?

Students’ test scores are sent to the school district office after a “good bit of clean-up … but it usually doesn’t involve the score itself. It’s more demographics. Once we receive individual test results, we send those home with the students with an explanation. Generally the explanation is geared to the average parent so that they can understand. We also invite them to come to school for a little more in depth discussion of those scores. We want parents to know how well their child has performed on the test, and that’s really not our plan, but the state plan.”

In terms of reporting out-of-level test scores in district or state reports, “I can’t really answer that. I am more of an administrator of the testing, not so much a decision-maker of where do we go now. We have departments that do that.”

Q7) How do you interpret an out-of-level test score? How are out-of-level test scores used by your staff? Are these scores used for student accountability purposes? For system accountability purposes?

The special education department at the district level “do a lot of interpretation of scores … because we want to as best we can make district decisions on curriculum. I know they want to know how well our students are doing. Decisions are made at the school level as to whether the child will be tested out of level again. It also has to say about the comparison, how that child has done with other students that take the test on that level. That helps us make wise decisions for that student and the program that we offer that student.”

“I can’t think of one time that we have done that [equated out-of-level test scores to on-level test scores]. I think that we would have to have a lot of help.”

Out-of-level test scores are not reported publicly at the local level. “I would say that we’re in the beginning stages of learning all the different ways to be able to use it (out-of-level testing).”

Principals

We interviewed principals from Schools A, B, and C. Overall, their responses fell into four major themes.

There are advantages and disadvantages in testing students with disabilities out of level in statewide assessments. Overall, there was no consistent pattern of agreement or disagreement among the three principals about the benefits of out-of-level testing for students with disabilities. One middle school principal thought that “it may be beneficial for the student to get the extra needed assistance, but it creates a hardship on the school … in terms of logistics and scheduling.” This principal compared administering a test below a student’s grade of enrollment to specific student test accommodations as “creating some problems” that could interfere with school-wide test administration. On the other hand, the other middle school principal’s response did not focus on school staff, but on the students with disabilities themselves. Referring to out-of-level testing as “not beneficial for the student,” the principal said, “I feel like the student would have found out more about themselves …. When the test results came back, they really didn’t show anything about the capabilities of the student.”

Another type of perspective emerged from the high school principal’s responses: “I still to this day do not know what benefit the out-of-level testing is going to give a student once they have to face reality.” In order to receive a high school diploma, students must pass a high stakes graduation exam, so achieving below grade level will not prepare students for the “reality” of taking that test. The principal added, “since we don’t have enough [information] about the benefits of out-of-level testing for us to see what the benefits for those children … I’d like to see out-of-level testing [used] within a classroom everyday.” In other words, out-of-level testing could be appropriately used for classroom instructional purposes, but not for high stakes student accountability purposes.

Out-of-level testing does not promote the inclusion of students with disabilities in general education instruction. There were two different perspectives that emerged from our analysis that address the ways in which testing students with disabilities below their grade of enrollment works against successful inclusionary practices. One middle school principal commented when introducing out-of-level testing in a faculty meeting, “We’re going to be doing some out-of-level testing, but it really will not involve the general education teacher. Special education teachers will be going over whether or not they meet the criteria. They’ll be tested in a different setting from what you’ll be testing in your classroom.” Another middle school principal also thought that “out-of-level testing is more or less a special education decision prior to any regular education discussion.” Even so, students with disabilities who were tested out of level in these middle schools were included in general education instruction for core content instruction.

More specifically, the high school principal indicated that out-of-level testing “separates the special education community” from mainstream education. “In our inclusion situations, special education teachers go in and assist the regular education teacher.” With out-of-level testing, “it is two ships sailing in the same direction, and one of those ships is in the shadow of the big one.” Further, this principal hoped for the day when “teachers won’t be looking at kids as having an IEP. That’s just [high school] student whose needs are being met. We won’t interpret anything as special education.”

Out-of-level test scores are not useful for school improvement planning. The three principals made several comments that indicated that out-of-level test scores are not considered in school improvement planning. “I don’t see any way of actually reporting out-of-level test scores that would make sense to anyone in the community or even the school.” When it comes to using those test data, “I don’t know about out-of-level testing when it comes to [preparing school improvement plans.]” In this school, teachers look at statewide test results from the previous year “to look at a group of students,” but out-of-level test results are difficult to use for groups of students. Rather, “a special education teacher would certainly look at out-of-level results for an individual child … to work with some strengths and weaknesses.” But in terms of school improvement planning, another principal indicated that “we’ve had so few [out-of-level test scores] that I would say that we look at our results overall.” The out-of-level testing scores don’t enter into the process where “we lay out [test results] on the table and analyze where our [academic] strengths and weaknesses are” for all students in the school district.

Principals thought that out-of-level test results should be disaggregated in data reports. There was a point of agreement among the three principals on the reporting of out-of-level test results. “If [out-of-level test scores] could be separated, it would be beneficial to me.” When out-of-level test data are intended for use at the system level, “we’re not using the same data” to make comparisons across all students at specific grade levels. “I need raw [test] data to tell me specifically what it is supposed to.” Also, when out-of-level test scores are “added to the [school’s] total, it brings down the entire total score.”

Special Education and General Education Teachers

Our thematic results from the face-to-face interviews with teachers fell into three categories: findings from both special and general educators, findings contributed only by special educators, and findings contributed only by general educators. Our content analysis revealed no patterns in our narrative data that pointed to differing opinions between middle school and high school teachers. Therefore, school levels are not reported in the teacher findings.

Each interview question elicited some similar responses from special education and general education teachers, which we developed into themes. Teachers made similar comments concerning out-of-level testing benefits, test taking behavior and feelings, parent understanding of out-of-level testing, out-of-level testing reporting preferences, and out-of-level test score use. Themes and supporting quotes from both special and general education teachers are presented in Table 10.

Findings that emerged from the content analysis of special education teacher responses are presented in Table 11 as themes and supportive quotes. We found no overlap between these findings and the findings generated from the general education teacher data set.

While minimal in number, there were rich, interesting themes that emerged solely from the general educator data set. These themes and supporting quotes are presented in Table 12.

Table 13 presents contrasting findings from the responses of special education and general education teachers. These themes and supporting quotes reflect opposing opinions.

 Table 10. Thematic Findings Common to Special Education and General Education Teachers

Responses from Special Education and General Education Teachers

Teachers reported successful on-grade level achievement for students with disabilities.

“I was thrilled last year with what she got on her math part. She’s a whiz at math and she really scored well on an on-grade level test.” – special education teacher

“There are some students that do pass the exit exam that are functioning on the 4th grade level when you test them. Mildly mentally impaired students that pass their regular classes.” – special education teacher

“I have some students with disabilities who are coming to me telling me they’re going to graduate from the high school. I’m excited that they have made it!” – special education teacher

“If you have a very conscientious person and a hard worker, and even though they are limited abilities, they can meet the standard. If you read their level of functioning, you wouldn’t think they would succeed, but they do.” – general education teacher

“I was surprised to find out that my 8th grade transition students were sometimes higher functioning than my 9th graders because they had … a whole year of remediation. They didn’t pass the [state test] because their parents chose for them to take it on-grade level. But it was misleading because … the ones who took it out of level passed and were promoted to 9th grade and were actually weaker than some of the 8th grade transition students.” – general education teacher

Out-of-level testing was appropriate for students with disabilities since they cannot achieve on-grade level.

“A lot of the special education kids are not going to pass the exit exam once they get to high school. They work hard … and we try to introduce the standards and teach them everything, but some just don’t have the ability to grasp the concept or retain it.” –  special education teacher

“I don’t think that on-level testing is really fair for students with disabilities, so no – I don’t think it would be very beneficial.” – general education teacher

Out-of-level testing promoted participation in statewide testing for students with disabilities.

“It is beneficial for those students who are not able to meet or who would not be able to perform well on the state test at their grade level.” – special education teacher

“It [out-of-level testing] provides the students that are not always on the grade level an opportunity to take a test and maybe be more successful than they would if they had to do an on-level test.” – special education teacher

“If I was going to give him a test that’s higher than his level, he’s not going to make it anyways. So, if the grade was lower and the material was on his level, I think that he could accomplish that.” – general education teacher

Teachers believed that only certain students with disabilities should be tested out of level.

“For inclusion students, no. Because they’re in regular classrooms participating in grade-level curriculum that [testing out of level] would just put them behind.” – special education teacher

“The only students that would be tested out of level are pulled for resource in reading and math instruction. The only time they should be tested out of level is if they are taught out of level.” – general education teacher

The benefits of out-of-level testing depend on students’ future academic goals.

“Because they’re still required to take an on-level test to get out of high school, so to me … to me they are just taking an extra test – one out-of-level and another on-grade level to take the graduation exit exam.” – special education teacher

“Now if they’re not trying to get a high school diploma, I don’t have as much a problem with it (out-of-level testing). But if they’re trying to get a high school diploma and they’ve got to pass those tests that are on level, there’s just no way that they can go from one to the other.” – general education teacher 

Teachers’ opinions about students’ test taking behavior during out-of-level testing and on-grade level testing varied.

“We have a lot of students that know when they take the [state test] that they’re never going to pass it. And so that, of course, affects their behavior.” – special education teacher

“You would think that they would be able to do it because it’s more at their ability level. I have heard instances where the behavior was still the same. They rushed through the test without really reading it or if it was being read aloud, they didn’t really pay attention. This was true … for students who took on-grade level testing and for out-of-level testing as well.” – special education teacher

“They turn the test upside, put it on the floor, make an airplane out of it – whatever you can do with that test. They’re just through. Everybody else is testing and you have a behavior problem to try to control for the rest of the forty minutes.” – general education teacher

“You know if it’s on their level, they will at least wait and listen for the questions and attempt to answer it. If it’s way above them, they’re not going to bother. They’ll just bubble in [the answers].” – general education teacher

Out-of-level testing promotes increased motivation to perform more successfully.

“If they knew they were going for a separate set of standards, that attitude for taking the out-of-level test increases their motivation.” – special education teacher

“Sometimes I think that to these special students … a motivation to really try hard to get to have a test like everybody else. I think that’s a part of motivation.” – general education teacher

Teachers reported that students experienced a variety of positive and negative feelings when taking an out-of-level test.

“It’s beneficial because they’re not under stress of taking the [state test]. The look on their face is total frustration when taking the state test.” – special education teacher

“When she’s taking the 2nd grade test out of level, she says, ‘bored,’ when she takes the test.” – special education teacher

“When in the room with other students, their work is different. I mean if you’re on the out-of-level, that’s what you are. The other students tend to stigmatize a student who takes a test out of level. They do realize it.” – special education teacher

“No differences between out-of-level and on-grade level testing. Frustration and anxiety were present among all of them.” – special education teacher

“I had different reactions. I had some students that were maybe more mature students that it seemed to bother them that they weren’t taking the same test as their peers. I had other students who were oblivious to what level anyone was taking because it came easier to them and they weren’t frustrated.” – special education teacher

“You have two extremes. You do have the type that are going to bubble in. You can see their frustration. But then you also have the type to try to figure it out even though they can’t. They’re either way ahead or they’re going to sit there for forty minutes on one problem.” – general education teacher

Out-of-level testing is a disservice to parents and students.

“Because in … 9th grade it’s (out-of-level testing) not provided. Parents don’t realize that … giving them a ‘crutch’ along the way will be pulled out from under them once they get there (high school).” – special education teacher

“I think that sometimes it (out-of-level testing) gives them (parents) false hope -- like they’re an ‘A’ student – when they’re an ‘A’ student on a 3rd grade test. When they get to high school … and take an exit exam, the parents get really upset. Because it has previously been told to them, ‘He has an ‘A,’ but it’s a 3rd grade reading level.’ The child doesn’t get the concept either.” – general education teacher

Teachers reported a variety of opinions about how to report out-of-level testing scores to the public.

“I think they should be reported separately with the test level included.” – special education teacher

“If you mix on-level and out-of-level scores, I believe that affects the way that statistics are manipulated. That’s not an actual representation of a student body.” – special education teacher

“If she’s going to take a 2nd grade test, then I’d like to see it reported on a 2nd grade level with other 2nd graders.” – special education teacher

“I don’t think that they should be included in the grade levels. I mean if they’re in 5th grade and tested out of level, I don’t think they should be reported with the 5th grade scores.” – general education teacher

“I think they [out-of-level test scores] should have their own subcategory.” – general education teacher 

Teachers’ learning expectations for students with disabilities are below the grade in which the students are enrolled in school.

“These students, we think, probably will need to be tested out of level so that we can see if they are academically weak like we think that they are.” – special education teacher

“In the long run, a lot of the special education kids are not going to pass the high school exit exam. I feel like we are setting them up for failure. They need some other way to get an education other than college-bound prep.” – special education teacher

“There are some students that are never going, because of the disabilities they are born with… they’re not going to progress.” – general education teacher 

 

Table 11. Thematic Findings for Special Education Teachers 

Out-of-level tests do not necessarily measure what students with disabilities know and can do.

“I didn’t feel when she tested out-of-level that it gave me an accurate result because she should have been able to do better on the test.”

“Sometimes … they were handicapped by their inability to read, but they thought in a more mature level. So then they were placed into a test situation that was much more juvenile. They just couldn’t read the on-grade level test and … show what they know.”

Out-of-level testing does not necessarily mean that the test is easier.

“It seems as though those tested out of level struggled more with the concept. It took them longer and sometimes I had to encourage them to choose the best answer instead of leaving it blank.”

There were no instances of administrative pressure to test students with disabilities out of level.

“When they first came out with out-of-level testing, I heard that the principals were saying, ‘All my special education students will take out-of-level because your school’s report will be higher.’ It did not make scores any higher. I didn’t see that it affected scores. I was not encouraged to give it (state test) out of level.”

“I don’t recall anyone talking to me about that.”

Transitioning from middle school to high school was different for students with disabilities who are tested out of level than for students with disabilities tested on-grade level.

“When my students take the state test [on-grade level] and they fail, they have to go to summer remediation. If they still fail, they go to 8.5 grade level in high school. The kids who do out-of-level go straight to high school.”  

Some students with disabilities prefer on-grade level tests.

“Students with disabilities want to be taking what all the other 9th graders are taking. They’ll accept their modifications but they want the same test.”

“Some of them are horrified. They didn’t want to admit that they aren’t on-level.”

Other students with disabilities accepted being tested out of level.

“They showed out-of-level testing didn’t bother them at all, the ones that I did test.”

The benefits of out-of-level testing depend on a student’s grade of enrollment in school.

“I think for some their disability is a little milder and they’re being successful in the regular program and they’re doing well. So for them, I think that out-of-level is [beneficial].”

Out-of-level testing provides an unfair test advantage to students with disabilities.

“I don’t believe out-of-level testing advances the students on an appropriate level as far as [advancing] grade-wise.”

 “I’m totally opposed to out-of-level testing because they are not receiving the same testing as other students. That’s not fair to the other students. If you only read on a 3rd grade level and you’re an 8th grader, then you need to be taking an 8th grade test.” 

Out-of-level testing does not prepare students with disabilities for passing a high stakes graduation exam.

“If they don’t do out-of-level testing in high school, but they’ve been tested out of level in elementary and middle school – they’re not going to be prepared for what they’re going to face in high school.”

It is more difficult for students who have been tested out of level to receive a regular high school diploma.

“I think that students tested out of level get categorized or pigeonholed. They don’t graduate from high school in the normal way.”

“We have some students tested out of level that are full inclusion, but we have some out-of-level students that are resource for reading and math. Because they are going for a high school diploma, we have to address the on-grade level curriculum. But we do pull in from other levels to meet their needs.”

“They’re being instructed at their grade level of enrollment because they’re going for a high school diploma. We have to meet those benchmarks and standards. They may test out of level at the 2nd grade level, but we are not touching on 2nd grade skills. They’re getting 6th grade instruction.”

Out-of-level test items are not written appropriately for students who are older than the test grade level.

“Some of the questions are worded in a way for my students that she didn’t understand. It had probably been a long time since she’d been asked a question in that way.”

Students with disabilities who are included in general education instruction and tested out of level are not tested on the same grade level.

“Even though they do have the disability, most of our students are functioning at their grade level or maybe one grade level below. When it comes time for testing, we move them down to a much lower level. It’s not what they are used to.”

Testing students with disabilities out of level reduces expectations for learning.

“Continuously testing below level, we’re lowering the expectations and they may have missed some skills that they might need to pass those tests (high school exit exam).”

Out-of-level testing seems to solve immediate problems for students with disabilities.

“Out-of-level testing – it’s like a temporary advantage, but not a long-term solution. A child would probably be frustrated taking a grade-level test … so why put them through that?” 

Some parents understand out-of-level testing better than other parents.

“I’m not sure how much they comprehend as to what level the test is being given at. It depends on the parent.”

“Parents never ask how I think they will do on the exit exam (graduation test). I don’t know if they just don’t want to think about it. Students don’t ask either.”

“I don’t think they understand. There is the statement that says your child will more than likely not receive a high school diploma, but I don’t think they really understand what out-of-level is.”

“They would have to understand that if their student takes an out-of-level test that this is where he would go in the future before they sign the form.”

 

Table 12. Thematic Findings for General Education Teachers

Students with disabilities were tested many levels below their grade of enrollment in school.

“We [high school teachers] had out-of-level tests ranging from first through fourth grade level out-of-level testing.”

The IEP team makes the decision to test a student out of level, and that decision may or may not be made close to testing time.

“We do IEP meetings in the fall … but we test in the spring.” 

Some teachers confused out-of-level testing with test accommodations.

“Like if there’s a multiple choice [test], instead of four choices, we narrow it down to two choices.”

“The test is kind of broken up into four questions then maybe I’ll draw a long line and then another four questions and draw another line, so it’s spaced out differently. On the other test they may get 40 questions. They may get 20 questions [on an out-of-level test].”

 

Table 13. Opposing Findings from Special Education and General Education Teachers

Promoting students with disabilities to the next grade is easier with out-of-level testing.

“We were looking for a way for students with disabilities to be successful and to move on and not be stuck because they were never going to pass the [state test]. So we offered them out-of-level testing to move them on to the next grade.” – special education teacher

Promoting students with disabilities to the next grade is unfair with out-of-level testing.

“The problem is they’ll move up to the next grade with the other kids, but they’re really not on the same level. They’re getting further and further behind the more they do that.” – general education teacher

Out-of-level test scores can help teachers improve academic results for students with disabilities.

“It (out-of-level testing) might show a little greater where students’ weaknesses are so that the teacher could focus a little bit more and raise those levels with more remediation.” – special education teacher

Out-of-level test scores do not help improve academic results for specific students with disabilities.

“Not really. I kind of look at the breakdown from any of the tests whether on-level or out-of-level and see where the weaknesses occurred. I predominately check reading and math and see what the subtest areas were as far as the scores.” – general education teacher

Out-of-level test scores are not useful for instructional planning.

“I can see where it could probably help in planning for the district, but I haven’t used that in planning curriculum or instruction.” – special education teacher

Out-of-level test scores are useful for instructional planning.

“I’ve taken courses where we looked at test scores. I know that I should be informed about which levels of what needs more work. So I plan on using that when I get these test scores back.” – general education teacher

On-grade level test scores are more useful than out-of-level test scores.

“No, I haven’t had an opportunity to look at those (out-of-level test scores). If the student is failing on-level testing, there are deficits that need to be addressed.” – special education teacher

No state test score is that useful for instructional planning.

“Usually we don’t get the test results until the end of the year when I’ve already had the student. As far as the group that I teach, I don’t know [the scores] until the end of the year.” – general education teacher

Out-of-level test data do not contain normative information.

“When she took the out-of-level test, there was nothing really to make a comparison to. Last year she took the grade level and at least it gave me a percentage to go by with other children her age.” – special education teacher

Out-of-level tests yield results that reveal learning strengths and needs for students with disabilities.

“You find out the areas that you have to work on. I have seen copies of the things they can and cannot do. It breaks it [out-of-level test scores] into ‘low, medium, high.’” – general education teacher

 

School D – Small High School

Our interviews with two special education teachers in School D are best summarized by the following theme:

There are no benefits for high school students with disabilities in taking an out-of-level test.

Both high school teachers indicated that “because they are still required to take their own level test to get out of high school … I think that they are just wasting their time by taking an extra test.” Instead, both special education teachers “set high standards for our kids, even though I’m realistic about what a lot of them can do, on-level is what they should be striving for. Even if they don’t have success, they know where they need to be.”

In terms of the student’s attitude toward learning, “the student understands that what we expect is for [the student] to perform as well as the other 25 kids in your classroom. Period. If you need an extra 45 minutes to do it, so be it. If I have to work twice as hard as the other 25 kids, so be it. I don’t think that out-of-level … is an issue for them because they want to achieve the same thing that everybody else is achieving.”

Students tended to carry this same attitude as described by one of the teachers: “One of them was a junior high school student and said, ‘Coach, what is this? And why am I having to do this [out-of-level test]? You’re wasting my time. I need to be in biology!’”

 


Discussion

We conclude this report with a discussion of “grand themes” that were derived by considering our quantitative and qualitative findings holistically. By stepping back from the results of our IEP review and our educator interviews, we gleaned overarching statements that point to global understandings about out-of-level testing. While these conclusions are derived from one school district in which we collected data in four schools, the discussion is general enough that educators and policymakers from other school districts and states can consider whether these findings hold true for their circumstances as well.

School level did not affect teacher opinions.

It would not have been unexpected to see different opinions about out-of-level testing between educators at the middle and high school levels. Since the state in which our research was conducted uses a high stakes exit exam for determining high school graduation, we expected high school teachers to be adamant about disallowing below grade level testing and middle school teachers to be adamant about its use. While we did find teachers who were resolute about either allowing or disallowing out-of-level testing, our data did not yield any patterns that organized these opinions among teachers into consistent categories of responses.

What we found was a microcosm of what we have found in earlier studies of opinions about out-of-level testing in the U.S. (Thurlow & Minnema, 2001). There seems to be no regularity in the quagmire of contentious debate over the merits and value of out-of-level testing of students with disabilities in large-scale assessment programs. It is interesting to note that in earlier research in this state, an interview with the state assessment director also yielded this same finding (Thurlow & Minnema, 2001). It is not known whether this finding may be unique to this state, although our previous research would suggest that it probably is not. Opinions about out-of-level testing tend to vary widely with no clear regularity within stakeholder groups.

Teachers’ learning expectations for students with disabilities who were tested out of level were, as a group, pervasively low.

Throughout the teacher responses from Schools A, B, and C, there was a pervasive attitude of low learning expectations for students with disabilities. Comments were made about students with disabilities in general, treating them as one homogeneous group. In fact, it was not uncommon for teachers to indicate that some students with disabilities, including those with mild learning disabilities or speech/language disorders, would never be able to meet grade level content standards. This tendency was true across all teachers at the middle and secondary levels. For instance, one general educator said, “They have no success rate taking tests that are not on their level. There would be solid Fs and they’d see no success.” A special educator said, “For many of our kids with disabilities, the test requirements are not beneficial at all. I think that for a large majority of the learning disabled kids … they should not be required to take the exit examination.”

When thinking about individual students with disabilities, some special and general education teachers provided testimonial evidence of students who had achieved on-grade level. There were instances of students with disabilities who had received a high school diploma or had passed statewide tests. These isolated cases had “beaten the odds,” but they tended to be regarded as exceptions rather than the norm. School D, where all students with disabilities are expected to graduate from high school and receive the instructional support to do so, is exceptional in setting high learning expectations for all students with disabilities, both as individuals and as a group. Elsewhere, students with disabilities were routinely dismissed as an entire group that is unable to achieve grade level proficiency standards. It is interesting to note that Schools A, B, and C contained personnel who favored testing students with disabilities out of level and routinely administered out-of-level tests. In contrast, the special education teachers in School D were unalterably opposed to out-of-level testing and had not administered a single state test below grade level.

Special education teachers appeared to have a deeper understanding of accountability issues.

In general, special education teachers articulated multiple views about out-of-level testing as it related to accountability issues in greater detail than their general education counterparts. Issues such as the public reporting of test scores, the use of test scores for school-wide planning, and the future ramifications of testing students with disabilities below their grade of enrollment emerged during the special education interviews. These issues also emerged in interviews with general education teachers, but typically only after probing follow-up questions or checks on participants’ comprehension. Overall, general educators did not initiate conversation about accountability issues.

For the most part, general education teachers indicated that they had no direct experience either administering out-of-level tests or selecting the students for whom an out-of-level test was appropriate. This lack of direct experience with out-of-level testing may account for interview responses that were less detailed about testing students with disabilities below their grade of enrollment. In fact, one general education teacher was unable to adequately describe out-of-level testing even though she had instructed students with disabilities who had been tested below the grade in which they were enrolled in school. Instead, this teacher described what would commonly be thought of as accommodations for use in classroom instruction or testing; these accommodations would not have been appropriate for use in a statewide testing situation. It is surprising that general education teachers expressed few ideas about the need to account for students’ progress toward achieving grade level standards. Their comments instead seemed to address the need to make sure that students are tested, rather than placing statewide testing in its context of system accountability.

Administrators expressed both student-centered and system-level viewpoints, while teachers expressed only student-centered views.

To avoid unfairly raising concern about general educators’ understanding of accountability issues, it is important to highlight another pattern in our narrative data. When referencing accountability issues, special education teachers mentioned only student accountability. There were also discrepancies in how out-of-level test scores could be used for instructional planning: some teachers indicated that making comparisons to other students’ test performance was helpful, while others indicated that monitoring progress over time was useful. In reviewing specific test data analyses, we learned that below grade level test scores are not equated to on-grade level scores, making either comparative or monitoring uses difficult. In this case, special education teachers did not necessarily display a thorough understanding of accountability issues. However this finding is interpreted, the overarching “grand theme” is a child-centered perspective on testing students with disabilities out of level. Neither special education nor general education teachers exhibited system-level thinking in responding to our interview questions.
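A minimal illustration of this point, using invented numbers only: two students can earn the same percent correct on tests at different levels, yet without an equating step that places the test levels on a common scale, those results do not support either comparison or progress monitoring against grade-level standards.

```python
# Invented numbers only: identical percent-correct results on tests of
# different levels are not interchangeable without statistical equating.

results = [
    {"student": "S1", "test_grade": 8, "raw": 30, "possible": 40},  # on-grade 8th grade test
    {"student": "S2", "test_grade": 3, "raw": 30, "possible": 40},  # out-of-level 3rd grade test
]

for r in results:
    pct = 100 * r["raw"] / r["possible"]
    print(f"{r['student']}: {pct:.0f}% correct on a grade {r['test_grade']} test")

# Both students show 75% correct, but the 3rd grade result cannot be compared
# with, or combined into, 8th grade results to monitor progress toward
# grade-level standards unless the scores are first equated to a common scale.
```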

It seems logical that we would find the opposite to be true of administrators who, by the nature of their positions, encounter school-wide and district-wide issues on a daily basis. Instead, we found mixed perspectives both within and across the administrators interviewed. Two principals spoke only from a system-centered viewpoint, while the third principal held a primarily child-centered perspective; this principal occasionally addressed system-level concerns, but the majority of the interview responses focused on groups of students within the student body. A fourth interviewee, a district-level employee, indicated that he or she could respond only at the system level and that any classroom references would be based on assumptions.

We found that it was more common for students with disabilities to be tested four, five, or six levels below the grade in which they were enrolled in school than to be tested one or two levels below grade level. This finding prompts a question: if students demonstrate limited academic progress over time, as some of these students have, what instructional changes have been considered to improve student learning? We also found that another norm-referenced test was used to determine the level at which a student should be tested out of level. Given this, it is important to ask whether the content of that other assessment is aligned to content standards, so that the test level designated for statewide testing yields valid measurement of standards-based achievement.

We also found that students were tested at or near a grade level below the level indicated by the other test, and some out-of-level tests were taken below the level identified by the norm-referenced test. Given these results, we ask: What assurance is there that the results of the other test are accurate indicators of students’ instructional levels? Is there any external evidence that verifies these test results? Could students whose out-of-level tests were administered below the level at which they scored on the norm-referenced test have passed tests administered at a higher level? For example, a student with a norm-referenced test level of 2.8 who was tested out of level at the 2nd grade level might have passed a 3rd grade level test.
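The concern in this paragraph can be expressed as a simple check. The sketch below is a hypothetical illustration only: the student records, the whole-grade reading of the norm-referenced level, and the comparison rule are assumptions made for the example, not the district’s actual procedure.

```python
# Hypothetical sketch: flag out-of-level test assignments that fall below the
# instructional level suggested by a norm-referenced test (NRT). The records
# and the whole-grade decision rule are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Student:
    student_id: str
    enrolled_grade: int       # grade of enrollment
    nrt_level: float          # instructional level from the NRT, e.g., 2.8
    assigned_test_grade: int  # grade level of the out-of-level test assigned

students = [
    Student("A01", 8, 2.8, 2),  # NRT suggests late 2nd grade; tested at 2nd grade
    Student("A02", 8, 4.2, 3),  # NRT suggests 4th grade; tested below that level
    Student("A03", 6, 5.6, 5),
]

for s in students:
    nrt_grade = int(s.nrt_level)  # whole-grade portion of the NRT level
    levels_below = s.enrolled_grade - s.assigned_test_grade
    flag = "below the NRT level" if s.assigned_test_grade < nrt_grade else "at or above the NRT level"
    print(f"{s.student_id}: tested {levels_below} levels below enrollment; "
          f"assigned grade {s.assigned_test_grade} is {flag} ({s.nrt_level})")
```

In the invented second record, the assigned test level sits below the level the norm-referenced test suggested, which is the situation the questions above are probing.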

Students’ out-of-level test scores were not included in the school district’s accountability indices.

Possibly the most disconcerting finding of this case study was the uncertainty about how the results of out-of-level testing are used, a problem evident at both the school and the district level. Within the schools, there appeared to be an overall lack of specific information about testing students out of level. This lack of information was particularly evident in general education teachers’ responses to interview questions; they reported having limited or no information about the purpose and administration of out-of-level tests. Some teachers could remember earlier briefings about out-of-level testing at teacher meetings, but there was no mention of intensive training. Some special education teachers also indicated that they had limited information about the rationale for and consequences of testing students with disabilities out of level.

Again at the school level, principals expressed concern about the uncertainty of using out-of-level test data in ways that are useful for students, parents, and the school district. Principals also expressed apprehension that out-of-level testing was not practiced consistently within schools and across the school district. Apprehension was expressed in other divisions of the school district as well. In interviewing the district test coordinator, we asked whether descriptive statistics were used to analyze out-of-level test data. Such an analysis would make school personnel aware of the number of students tested out of level, how they performed, and how far below grade level students were tested – information that is important for monitoring assessments in schools. While indicating the importance of disaggregating out-of-level test data for making assessment decisions, the district test coordinator said:


Conclusion

Our findings raise further concerns about using out-of-level testing for students with disabilities in large-scale assessment programs. While the benefits and disadvantages have perplexed educators, policymakers, parents, and students with disabilities for years, the confusion has become increasingly complex in the era of standards-based reform. That uncertainty remained in the schools in which we collected data. Not only did the effects of out-of-level testing operate at the classroom level, but impacts were also observable at the district level. Teachers hold lower learning expectations for students tested below grade level, as do administrators, who may be the school improvement planners. In fact, administrators revealed that out-of-level test scores are not useful for school improvement planning. This may be in part because out-of-level test scores are not entered into the district’s accountability indices.

Even for schools that claim that few students are tested out of level, one student with disabilities who is not receiving challenging, grade-level instruction is one too many. The following quote is a good summation of our experiences in this school district:

“We have, in comparison with the number of students that take the norm-referenced test, a very small number of students [taking out-of-level tests]. In fact, this year it’s about half what we gave last year. So, we’re not dealing with a significant number – although any number would be IMPORTANT!”

 


References

Minnema, J., & Thurlow, M. (2003). Reporting out-of-level test scores: Are these students included in accountability programs? (Out-of-Level Testing Report 10). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://cehd.umn.edu/NCEO/OnlinePubs/OOLT10.html

Minnema, J., Thurlow, M., & Warren, S. (2004a). Understanding out-of-level testing in local schools: A first case study of policy implementation and effects (Out-of-Level Testing Report 11). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://cehd.umn.edu/NCEO/OnlinePubs/OOLT11.html

Thurlow, M., Elliott, J., & Ysseldyke, J. (1999). Out-of-level testing: Pros and cons (Policy Directions No. 9). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://cehd.umn.edu/NCEO/OnlinePubs/Policy9.htm

Thurlow, M., & Minnema, J. (2001). States’ out-of-level testing policies (Out-of-Level Testing Report 4). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M., Minnema, J., Bielinski, J., & Guven, K. (2003). Testing students with disabilities out of level: State prevalence and performance results (Out-of-Level Testing Report 9). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://cehd.umn.edu/NCEO/OnlinePubs/OOLT9.html


Appendix A

Instruments

 Interview Protocols

            Teacher

            Principal

            District Test Coordinator

            Student

 

Teacher Face-to-Face Interview Protocol

“I am _____ from the University of Minnesota. Your school district has agreed to participate in one of our research studies that is collecting data to understand the effects of testing students with disabilities out of level in large-scale assessments. Part of that research study is our interview. Like we discussed before, I have seven questions to ask you about out-of-level testing in large-scale assessments. I’d like to tape record our conversation if that is all right with you. That way, I will have exactly what you have said to make sure that I don’t make any mistakes when I analyze the responses to these questions. Before we begin however, I have a consent form that I would like for you to read and then sign if agreeable.”

“Thank you. Do you have any questions before we begin?”

Q1) Do you think that out-of-level testing is beneficial for your students? If so, why? If not, why not?

Q2) Do you think that on-grade level testing is beneficial for your students? If so, why? If not, why not?

Q3) How did your students with disabilities behave when taking an on-grade level test? How did they behave when taking an out-of-level test?

PROBE: Did any of your students act out when taking a test on-grade level?

Q4) How do you think students felt about taking a test out of level? How do you know this? Did your students comment about the test booklet? Did the test material seem age appropriate?

PROBE: Did your child think that the out-of-level test was appropriate for his/her age? If so, how do you know?

Q5) Who actually decides which students with disabilities take an out-of-level test?

Q6) Can you please describe how the decision to test a student out of level is made?

Q7) How do IEP teams determine the appropriate level of an out-of-level test? Does the test level typically align with a student’s instructional grade level? Are test levels ever assigned by the level at which a student is certain to be successful? Can teachers identify the grade level of a test by looking at the content of the test items?

Q8) Do any of your school staff, including the administrators, advise you about out-of-level testing? If so, what do they say?

Q9) How will taking the state test out of level affect your student(s) in the future?

PROBE: Is something being done to make sure that your students are catching up to grade level standards?

Q10) Do you think that the student’s parent(s) understand the consequences of taking the state test out of level? Do you think that the student who is tested out of level understands the future consequences of taking the state test out of level?

Q11) Are you familiar with the public reporting of state test scores in your community? I have a question that asks for your opinion from three choices. I assume your students’ names are kept confidential. When test scores are reported to you, the family, and the public, would you like the out-of-level test scores to be compared to:

(✓ one)

-   ___ The grade level of the out-of-level test?

-   ___ The grade level of his/her classmates?

-   ___ No opinion.

Please explain why?

Q12) How do you interpret an out-of-level test score? How do you use out-of-level test scores? Is there a difference in how you use out-of-level test scores and in-level test scores?

 

Principal Face-to-Face Interview Protocol

“I am _____ from the University of Minnesota. Your school district has agreed to participate in one of our research studies that is collecting data to understand the effects of testing students with disabilities out of level in large-scale assessments. Part of that research study is our interview. Like we discussed before, I have seven questions to ask you about out-of-level testing in large-scale assessments. I’d like to tape record our conversation if that is all right with you. That way, I will have exactly what you have said to make sure that I don’t make any mistakes when I analyze the responses to these questions. Before we begin however, I have a consent form that I would like for you to read and then sign if agreeable.”

“Thank you. Do you have any questions before we begin?”

Q1) Do you think that out-of-level testing is beneficial for your students with disabilities? If so, why? If not, why not?

PROBE: Do you know if any students acted out when taking an on-grade level test?

Q2) Who actually decides which students with disabilities take an out-of-level test? Can you please describe how the decision to test a student out of level is made?

Q3) Do you think that your IEP teams consider the future consequences of testing students with disabilities out of level? If so, how do you know? Do you think that the parents understand the consequences of testing students with disabilities out of level? Do the students who are tested out of level?

Q4) Do you, or anyone else, advise your teachers about out-of-level testing? If so, what kinds of things are said? How is the information prioritized? Does anyone advise you about out-of-level testing?

Q5) How do IEP teams determine the appropriate level of an out-of-level test? Does the test level typically align with a student’s instructional grade level? Are test levels ever assigned according to a student’s level of success?

Q6) Can you please describe what happens to out-of-level test scores after a student has completed the test? How are these scores included in school reports? In district reports? In state reports? How are out-of-level test scores used in school improvement plans? How do students benefit from school improvement plans?

Q7) How are out-of-level test scores used by your staff? Are these scores used for student accountability purposes? For system accountability purposes?

Q8) I have a question that asks for your opinion from three choices. In asking this question, I assume students’ names are kept confidential. When test scores are reported to you and to the public, would you like for your student’s test scores to be compared to:

(✓ one)

-   ____ The grade level of the out-of-level test?

-   ____ The grade level of his/her classmates?

-   ____ No opinion.

Please explain why?

 

Special Education Coordinator Face-to-Face Interview Protocol

“I am _____ from the University of Minnesota. Your school district has agreed to participate in one of our research studies that is collecting data to understand the effects of testing students with disabilities out of level in large-scale assessments. Part of that research study is our interview. Like we discussed before, I have seven questions to ask you about out-of-level testing in large-scale assessments. I’d like to tape record our conversation if that is all right with you. That way, I will have exactly what you have said to make sure that I don’t make any mistakes when I analyze the responses to these questions. Before we begin however, I have a consent form that I would like for you to read and then sign if agreeable.”

“Thank you. Do you have any questions before we begin?”

Q1) Do you think that out-of-level testing is beneficial for your students with disabilities? If so, why? If not, why not?

PROBE: Do you know if any students acted out when taking an on-grade level test?

Q2) Who actually decides which students with disabilities take an out-of-level test? Can you please describe how the decision to test a student out of level is made?

Q3) Do you think that your IEP teams consider the future consequences of testing students with disabilities out of level? If so, how do you know? Do you think that the parents understand the consequences of testing students with disabilities out of level? Do the students who are tested out of level?

Q4) Do you, or anyone else, advise your teachers about out-of-level testing? If so, what kinds of things are said? How is the information prioritized? Does anyone advise you about out-of-level testing?

Q5) How do IEP teams determine the appropriate level of an out-of-level test? Does the test level typically align with a student’s instructional grade level? Are test levels ever assigned according to a student’s level of success?

Q6) Can you please describe what happens to out-of-level test scores after a student has completed the test? How are these scores included in school reports? In district reports? In state reports? How are out-of-level test scores used in school improvement plans? How do students benefit from school improvement plans?

Q7) How are out-of-level test scores used by your staff? Are these scores used for student accountability purposes? For system accountability purposes?

Q8) I have a question that asks for your opinion from three choices. In asking this question, I assume students’ names are kept confidential. When test scores are reported to you and to the public, would you like for your student’s test scores to be compared to:

(✓ one)

-   ____ The grade level of the out-of-level test?

-   ____ The grade level of his/her classmates?

-   ____ No opinion.

Please explain why?

 

District Test Coordinator Face-to-Face Interview Protocol

“I am _____ from the University of Minnesota. Your school district has agreed to participate in one of our research studies that is collecting data to understand the effects of testing students with disabilities out of level in large-scale assessments. Part of that research study is our interview. Like we discussed before, I have seven questions to ask you about out-of-level testing in large-scale assessments. I’d like to tape record our conversation if that is all right with you. That way, I will have exactly what you have said to make sure that I don’t make any mistakes when I analyze the responses to these questions. Before we begin however, I have a consent form that I would like for you to read and then sign if agreeable.”

“Thank you. Do you have any questions before we begin?”

Q1) Do you think that out-of-level testing is beneficial for your students with disabilities? If so, why? If not, why not?

Q2) Do you, or anyone else, advise your teachers about out-of-level testing? If so, what kinds of things are said? Does anyone advise you about out-of-level testing? If so, what kinds of things are said?

Q3) Who actually decides which students with disabilities take an out-of-level test?

Q4) Can you please describe how the decision to test a student out of level is made? What are the steps that you go through so that a student can take an out-of-level test?

Q5) How do IEP teams determine the appropriate level of an out-of-level test? Does the test level typically align with a student’s instructional grade level? Are test levels ever assigned according to a level at which a student is certain to succeed?

Q6) Can you please describe what happens to out-of-level test scores after a student has completed the test? How are these scores included in school reports? In district reports? In state reports?

Q7) How do you interpret an out-of-level test score? How are out-of-level test scores used by your staff? Are these scores used for student accountability purposes? For system accountability purposes?

 

Student Face-to-Face Interview Protocol

“Hi. My name is _____ and I am from Minnesota. I have been in your school this week to learn more about out-of-level testing. Do you know what that is? Good.”

If not, continue with … “Do you remember when you took the (name of test) with all of the other students in your school? Do you know if your test was the same test as other (8th or 10th graders)? Good.”

“Do you mind if I ask you a few questions about that test? The questions are easy. I’m sure that you will do very well. It’s not a test! It’s for a research study that I am doing. When we are finished I have a gift card for you to spend at Target. First, I need for you to listen to me read this paper. Then, if you want to answer my questions, I will need for you to sign this paper.”

“Do you have any questions before we begin?”

Q1) Do you like the (test name)? Why or why not? Did it seem okay for your age?

Q2) Do you know what out-of-level testing is? If a friend asked you what an out-of-level test is, what would you say?

Q3) Do you know who decided that you should take the test (use student’s words to describe test)? Did you help make that decision?

Q4) Do you know how taking this test (use student’s language) will change anything for you in school when you are older?

“You’ve done a very good job answering my questions. That’s great! Enjoy spending your gift card at [department store name]. Have a good rest of the day. Thank you.”