Technical Report 23

Educational Results for Students with Disabilities: What Do the Data Tell Us?

by James E. Ysseldyke, Martha L. Thurlow, Karen L. Langenfeld, J. Ruth Nelson, Ellen Teelucksingh, and Allison Seyfarth

Published by the National Center on Educational Outcomes

December, 1998


This document has been archived by NCEO because some of the information it contains is out of date.


Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Ysseldyke, J. E., Thurlow, M. L., Langenfeld, K. L., Nelson, J. R., Teelucksingh, E., & Seyfarth, A. (1998). Educational results for students with disabilities: What do the data tell us? (Technical Report No. 23). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved [today's date], from the World Wide Web: http://cehd.umn.edu/NCEO/OnlinePubs/TechnicalReport23/TechReport23text.html


Executive Summary

Over the past 10 years there has been an increased demand for accountability for the results of education for all students, especially students with disabilities. However, very limited data are available on educational results for students with disabilities. The Individuals with Disabilities Education Act Amendments of 1997 (P.L. 105-17) called for a focus on outcomes, and for data to be included in public reports. Realizing the importance of accounting for the performance of students with disabilities, the National Center on Educational Outcomes (NCEO) analyzed public state accountability reports, with the goal of presenting information on how students with disabilities are doing both academically and nonacademically according to these reports.

We collected 115 public reports between October 1997 and March 1998 from state accountability offices and state special education offices. Using the NCEO framework to organize results, we reproduced the relevant findings to present a picture of what we know about the results of education for students with disabilities.

Our analysis revealed very limited information on students with disabilities in state accountability reports. Almost every state (47) provided data in the Academic and Functional Literacy domain, but only 13 reported on students with disabilities in this domain. These states provided information on how students with disabilities performed on statewide assessments, yet there was a range in the amount and types of data presented. In general, the performance of students with disabilities was considerably below that of students without disabilities. While 38 states reported on students with disabilities in the Participation domain (graduation or exit data, enrollment data, dropout rates, time spent in various settings), only 12 provided data beyond that required for federal reporting. The additional areas were participation in large-scale assessments and family involvement.

Our analysis revealed that states are beginning to report on the performance and progress of students with disabilities. The data that do exist confirm suspicions about low performance, but do not yet provide information on performance over time. We should see dramatic changes in reporting practices when we analyze 1998 state reports. These changes in reporting practices will provide the data needed to monitor the progress and performance of students with disabilities.


Overview

Over the past 10 years there has been an increased demand for accountability for the results of education for all students, especially students with disabilities. There has also been a push to include students with disabilities in school reform activities. While substantial changes have been made in education, there is still a concern that reforms and change initiatives have not led to satisfactory results for students with disabilities.

Very limited data are available on the results of education for students with disabilities. The few reports available generally have presented a bleak picture of outcomes. Most of these reports are from special government studies rather than on-going data collection programs. In the mid-1980s, Congress mandated a longitudinal study that reported on the secondary school experiences of a sample of students with disabilities, as well as post-secondary outcomes in employment, education, and independent living. In this study, Wagner, Newman, D'Amico, Jay, Butler-Nalin, Marder, and Cox (1991) found that only 15% of students with disabilities attended a post-secondary school one year after high school, 30% had not held a paid job, 40% of those employed worked only part-time, 1 in 5 overall had been arrested, and nearly 40% of youth left school by dropping out.

About one-fourth of youth with disabilities had been enrolled in post-secondary vocational schools or 2-year or 4-year colleges by three to five years after leaving high school, almost twice as many as had been enrolled in the first two years after high school (Wagner, D'Amico, Marder, Newman, & Blackorby, 1992). However, in the general population, nearly 68% of youth were enrolled in some type of post-secondary education. Three to five years after high school, only about one in nine students with disabilities had earned some type of post-secondary education degree, certificate, or license.

Results from the National Adult Literacy Survey (NALS) revealed that adults with any type of disability were more likely than those in the total population to perform in the lowest literacy levels (Kirsch, Jungeblut, Jenkins, & Kolstad, 1993). The performance gap between those who reported having a particular disability and those in the total population ranged from 24 to 154 points across the scales used.

More recent analyses of the National Education Longitudinal Study (NELS) of 1988 have shown that students identified by teachers and parents as having a disability earned lower high school grades in core courses, scored lower on math and reading proficiency tests, and were more likely to drop out of school than students without disabilities (Rossi, Herting, & Wolman, 1997). Finally, these students also held lower educational expectations for themselves, as did their parents. The publication of these data and the lack of a more consistent set of data have resulted in a call to look more closely at the results of education for students with disabilities, with hopes of improving services for these students.

There is also limited information about the inclusion or exclusion of students with disabilities in large-scale assessments. The NELS 1988 sample of eighth-grade students was estimated to have excluded about 5.4% of all potential students, due either to limited language proficiency or to mental and physical disabilities (Ingels, 1996). This was similar to the percentage of students excluded from the 1988 NAEP study (5.3%; as cited in Ingels, 1996). Approximately 2.0% were estimated to have been excluded due to language proficiency, leaving about 3.4% excluded due to mental or physical disabilities. This translates roughly to about 34% of students with disabilities, meaning that about 66% of these students were included. These figures may be overestimates, since both studies excluded students in residential and separate school placements.
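The arithmetic behind these figures can be sketched as follows. Note that the roughly 10 percent overall prevalence of disability is not stated in the source; it is an assumption implied by the report's conversion of 3.4 percent of all students into about 34 percent of students with disabilities.

```python
# Sketch of the exclusion arithmetic for the 1988 NELS sample.
# The 10% prevalence figure is an assumption implied by the report's
# numbers, not a value stated in the source.
total_excluded = 5.4       # percent of all potential students excluded
language_excluded = 2.0    # percent excluded for limited language proficiency
disability_excluded = total_excluded - language_excluded   # about 3.4 percent

assumed_prevalence = 10.0  # assumed percent of all students with a disability
share_excluded = disability_excluded / assumed_prevalence * 100  # about 34%
share_included = 100 - share_excluded                            # about 66%
print(f"excluded: {share_excluded:.0f}%  included: {share_included:.0f}%")
```

Under that assumed prevalence, the sketch reproduces the report's estimate that about one-third of students with disabilities were excluded and about two-thirds were included.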

Policymakers, researchers, educators, and families attempted to address many needed educational reforms with the signing of the Individuals with Disabilities Education Act Amendments of 1997 (P.L. 105-17). This law called for a change in focus from “processes or access to education” to “outcomes,” and for major changes in public reporting and accountability procedures (Ysseldyke, Thurlow, Kozleski, & Reschly, 1998). As of July 1998, state education agencies were required to report on the participation and performance of students with disabilities on statewide assessments.

Numerous states are focusing their efforts on improving reporting practices, but several analyses have revealed that many states fall short of what the new requirements mandate. For example, a survey of special education directors (Erickson & Thurlow, 1997) indicated that data gathered on students with disabilities are not publicly reported in most states, but are used primarily for internal review. Only 32 of the regular and unique states (e.g., Guam, Palau) reported having readily available information on the number of students with disabilities who participate in any of their statewide assessments. Of the states that indicated they report such data, only half were able to provide the numbers when asked to do so. Furthermore, the state directors pointed to the altruistic motivation of parents and teachers to “protect” students from testing, and to high stakes for schools, as the leading reasons for not encouraging students with disabilities to participate in assessment programs. It appears that students with disabilities are not encouraged to participate in statewide testing even when participation is appropriate, and when they do participate, participation data usually have not been reported for them.

Our goal was to report on how students with disabilities are doing academically and non-academically. To do this we used the NCEO framework (Ysseldyke, Krentz, Elliott, Thurlow, Erickson, & Moore, 1998) as the basis for our analysis of public state accountability reports. NCEO's framework of educational results goes beyond test participation data. This comprehensive framework, initially created by hundreds of nationally representative stakeholders, includes both academic and nonacademic domains (refer to Figure 1). Stakeholders identified six domains of desired outcomes, including data on responsibility and independence, personal and social well-being, citizenship, academic and functional literacy, physical health, and (student/parent/community) satisfaction. The NCEO framework specifies outcomes, indicators, and sources of data in each of the six results domains. The complete framework shown in Figure 1 includes Inputs/Resources and Educational Processes, as well as Results, but the focus of our analysis was on Results, and certain components of the student-oriented domains within Educational Processes.


Methods

Data for this report were gathered from public documents that report data on the performance of students. The appropriate documents to analyze were identified by using the annual Council of Chief State School Officers (CCSSO) state accountability survey as a guide (Council of Chief State School Officers, 1997, prepublication copy). This annual survey is sent to state accountability offices and used to obtain the titles of each accountability document available from the state. Each state accountability office was contacted by NCEO staff and the documents listed by CCSSO were requested. In addition, NCEO staff requested any additional information specifically on special education that was available from the state. We asked only for published, public data, either in paper form or on the World Wide Web.

Data were gathered between October 1997 and March 1998. Though most reports were obtained by the end of November 1997, the last report was collected in March 1998. Since we were interested in publicly available accountability data for students with disabilities, we specifically requested data from state accountability offices rather than state special education offices. In those cases where data were not available on students with disabilities, we also contacted and requested any published data from state special education offices. We attempted to obtain every available accountability document that included students with disabilities. However, reports are continually being produced and sometimes reports available through one unit in a state department of education are unknown to another department. Thus, it was difficult to verify the extent to which we had obtained all available reports. In some cases, where recent data were not obtained in time for our cut-off dates, older data that were available were used. It is important to keep in mind that the reports obtained from states spanned the school years 1995-96 through 1997-98 even though all reports were obtained during 1997-98.

For this analysis, we obtained 115 accountability reports (see Appendix A). Each report was searched thoroughly for data on students with disabilities. Fifty documents contained outcome data on students with disabilities. The data were then coded according to the NCEO framework (Ysseldyke et al., 1998). Enough data for summary analyses were obtained in only two categories: (1) Educational Results for Systems and Individuals, and (2) Educational Processes, specifically Student-Oriented Domains. Sporadic data were obtained in other domains of the NCEO framework.

Wherever possible, data are presented in this report in the same way that states presented them. In some cases, data were taken from larger databases, or from several different sources, and were reformatted to increase clarity. The data, however, only include information actually stated in the public reports. A summary of cautions about the data included in this report is presented in Table 1.


Specific Data Included in Each Domain

Educational Results. Data from three domains were collected in the area of Educational Results: Academic and Functional Literacy, Personal and Social Well-Being, and Satisfaction. Most of the data included in this area consisted of disaggregated test scores for students with disabilities. When reproducing these data we noted when descriptions of the tests were given and when descriptions of the scoring rubrics or standards used to determine proficiency were provided. We present here the actual data provided, including the scores of students without disabilities for comparison, and definitions of terms used if these were part of the accountability documentation.

Educational Processes. Data from two domains were collected in the area of Educational Processes: Participation and Family Involvement. Most of the data reported by states in these areas are included in the Annual Report to Congress (U.S. Department of Education, 1997), including enrollment, placement, and graduation data. These data are not included in our analysis. We do mention, however, when these data are included in public accountability reports, since data in these reports are more widely available to the general public than are the data in the Annual Report to Congress. When reproducing these data we included the following:

Data not reproduced, but mentioned in this report, include enrollment, placement, and exit from educational programs by students with disabilities.

Every effort was made to gather as much of the publicly available data on students with disabilities as possible, and to be fair, thorough, and consistent in data analysis. When interpreting these data, it is important to keep these considerations in mind:


Results

Of the 115 reports analyzed from 50 states, a total of 59 reports (39 states) included data on students with disabilities in the domain of Academic and Functional Literacy and/or in the student-oriented process domains. The actual educational results and process data on students with disabilities reported in state accountability reports were collated and are reproduced in this document. Because states often produce multiple reports (see Thurlow, Langenfeld, Nelson, Shin, & Coleman, 1998), we opted to analyze all data in terms of state performance (e.g., the number of states reporting on test scores or on the number of students participating in testing).

The reproduced data are presented, categorized by state and domain, in Appendices B and C. In Appendix B are the data on how students with disabilities are doing in domains of Academic and Functional Literacy, Personal and Social Well-Being, and Satisfaction. Descriptions of data sources are provided if the information was in the reports; not all states provided contextual information. Information that states provided on the Student-Oriented Domains of Participation and Family Involvement is listed in Appendix C. Again, any clarifying information in the actual report(s) is included here.

A summary of which states report data on educational results and processes is provided in Table 2. As indicated in the Results area, the most frequent domain for which data were presented was Academic and Functional Literacy. Only two states included other areas (Kansas has Personal and Social Well-Being data; New York has Satisfaction data as well as Academic and Functional Literacy). In the Process area, most states reported on Participation.


Educational Results

Thirteen states disaggregated performance data for students with disabilities in the area of Academic and Functional Literacy (Connecticut, Delaware, Georgia, Louisiana, Maine, New Hampshire, New York, North Carolina, North Dakota, Rhode Island, South Carolina, Texas, Virginia). These states provided information on how students with disabilities performed on statewide assessments. Generally, the data are for one year only. There is very little information included in state accountability documents on how students with disabilities are performing over time and whether there is improvement or progress in performance from year to year.

Three states (Nevada, Oregon, Vermont) completed special studies on the academic and functional literacy of students with disabilities. (These unique indicators that were reported are not gathered annually.) For example, Vermont reported the results of a pilot study on the outcomes of IEP interventions for special education students.

Two states reported on other domains of results for students with disabilities. Kansas, the only state to report on the area of Personal and Social Well-Being, cited the number of violent acts committed by students with disabilities. New York reported data in the domain of Satisfaction: the results of a Consumer Satisfaction Survey on vocational rehabilitation services provided to special education students.

For the 13 states that presented information on statewide assessments, the most frequently reported content areas (see Table 3) were: reading (12 states) and math (11 states). Only six states reported social studies data. Ten states reported on students with disabilities in three or more content areas (Connecticut, Georgia, Maine, New Hampshire, New York, North Carolina, North Dakota, Rhode Island, South Carolina, Texas).

According to CCSSO (1998), 19 states had a high stakes graduation exit exam in 1997. Fifteen of these states reported graduation exam results for regular education students (Florida, Georgia, Indiana, Louisiana, Maryland, Mississippi, New Jersey, New Mexico, New York, North Carolina, Ohio, South Carolina, Tennessee, Texas, Virginia), and only 47% of the 15 states (7 states: Georgia, Louisiana, New York, North Carolina, South Carolina, Texas, Virginia) reported these results for students with disabilities.

Only a handful of states presented any other types of data in the domain of Academic and Functional Literacy. Georgia reported the results of retests on its graduation exam. New York and Texas both provided extensive data on students with disabilities in the area of Academic and Functional Literacy. These two states have state assessments in place, a graduation exam, and end-of-course assessments that include students with disabilities. Furthermore, both of these states have other unique indicators in this domain. New York has an Occupational Education Proficiency Exam. Texas has an assessment, the Texas Academic Skills Program Test (TASP), that provides results of college entrance exams for students entering Texas institutions of higher education. Both of these states provided clear and concise data on students with disabilities and should be viewed as models for their reporting practices.

A synthesis of the state achievement test data for students with disabilities is presented in Table 4. Because differences in tests, standards, rubrics, time of year administered, content difficulty, accommodations provided, exclusion of students, grade tested, and year of data collection make it difficult to aggregate and analyze achievement data across states, we decided to examine how students performed relative to the standards set by their own states. We used the percentage of students above the passing score or other index of “adequate” performance. As indicated in Table 4, approximately 30-50 percent fewer students with disabilities are meeting standards than students without disabilities. Looking at score results within states, students with disabilities generally performed similarly on math and reading assessments. Yet a couple of states did have significant discrepancies between the numbers of students who met their state's standard in these content areas. For example, 31.6% of special education students in New York passed state standards on the Pupil Evaluation Program (PEP) assessment in reading, while 63.7% passed the PEP assessment in math.

The percentage of students with disabilities meeting state standards in reading achievement ranged from 27.5% to 50.4% (see Table 5). Figure 2 depicts the differences between percentages of students meeting standards in reading. The three states that had the smallest discrepancy between the percentage of students with and without disabilities were New York (on one of two tests), Rhode Island, and Connecticut.


Educational Processes

In the area of Student-Oriented Domains, 38 states reported on students with disabilities (see Table 2). Although this number is greater than for the area of Academic and Functional Literacy, approximately 25 percent of states are not reporting in this area. In Table 6 we provide a summary of educational process data, specifically Participation and Family Involvement data. In the area of Participation, states reporting on such indicators as the number of students with disabilities participating in large scale assessments, graduation or exit data, enrollment data, dropout rates, or time spent in various settings are noted in Table 6.

The only Educational Process indicators that were not part of federal reporting requirements when these documents were produced were participation in large-scale assessments and family involvement. Twelve states included these data in reporting on students with disabilities (see Table 6). Approximately the same number of states (13) did not report on any Participation indicators. Six states reported on four or more indicators in the area of Participation (Connecticut, Louisiana, New Jersey, New York, South Dakota, Texas). Overall, only 25% of the states reported on educational process indicators of Student-Oriented domains that they are not required to report to Congress. Thus, little educational process data are reported on students with disabilities that are not already federally mandated.

Information on family involvement was scarce in state reports. Only one state included any information in the domain. Oregon included the number of families and children served through a special parent education program for families considered to be at-risk for having children with disabilities.

Of those requirements that are mandated to be reported in the Annual Report to Congress, the majority of states (33) reported on the enrollment of students with disabilities, making it the most common indicator reported for these students. Ten states (Colorado, Georgia, Kansas, Louisiana, Maine, New Jersey, New York, South Dakota, Texas, Virginia) reported drop-out data on students with disabilities in their public reports. Graduation/exit data on students with disabilities were reported by 11 states (Alaska, Colorado, Connecticut, Georgia, Louisiana, Mississippi, New Jersey, New York, South Dakota, Texas, Virginia). Eleven states reported on the time students with disabilities spent in various settings (Connecticut, Louisiana, New Jersey, New York, Oregon, Rhode Island, South Dakota, Texas, Utah, Vermont, Washington). Three states (New York, Oklahoma, Texas) had unique indicators on students with disabilities: failure to graduate, post-education outcomes, number of students returning to general education, advanced course completion, and retention rates.

Table 7 is a compilation of the participation data available in the state accountability reports. Data provided in this table include:

Twelve states provided some type of participation data of students with disabilities in statewide assessments. Only two states (Connecticut and Maine) provided participation data as the number of students with disabilities who took the test, divided by the population of all students with disabilities at the grade level being tested. Three states (New Jersey, Oregon, and South Carolina) provided just the number of students with disabilities tested. Exemption data, giving the percentage of all students with disabilities who were excluded from testing, were provided by five states (Connecticut, Massachusetts, New York, Oregon, and Texas). Arizona provided only the number of students who were excluded from testing. From the data available (using both participation data in column 4 and exemption data in column 6), it appears that between 50 and 80% of students with disabilities are participating in testing in the 12 states that reported participation data.


Discussion

States are beginning to report data on students with disabilities. The data presented in this report, which were obtained from state reports, are intended to be used as a general overview of the performance of students with disabilities, and should be interpreted with caution for a number of reasons. States gather and report data at different times. States vary in their reporting practices, the types of tests, rubrics and standards used to judge performance, and the amount of data collected on students with disabilities.

Table 8 summarizes the information obtained from state reports in the area of educational results. Although most of the data on students with disabilities provided in state reports were educational process data, approximately 26% of the states did disaggregate performance data on statewide assessments. Generally, the data represented only one year; little information was available on how students with disabilities performed over time and whether there was improvement or progress in performance from year to year. Looking at score results within states, students with disabilities performed similarly on math and reading assessments, yet a couple of states did have significant discrepancies between the numbers of students with disabilities who met the state's criteria in the two areas. For example, 31.6% of special education students in New York passed state criteria on the Pupil Evaluation Program (PEP) assessment in reading while 63.7% passed the PEP assessment in math. The low reading achievement is consistent with the NALS finding that adults with any type of disability were more likely than those in the total population to perform in the lower literacy levels (Kirsch et al., 1993). Despite some consistencies in data across states, and with national data, states are only beginning to report these types of achievement data on students with disabilities, making it impossible to generalize these results to all 50 states.

Table 9 summarizes the information obtained from state reports in the area of Educational Processes, excluding those federal requirements for state reporting. In the area of Student-Oriented Domains, 38 states reported on students with disabilities. Though many of the states reported only enrollment information, 25% of the states did not report any process data on students with disabilities. Twelve states did include participation in large-scale assessment or family involvement information about students with disabilities. Oregon reported on family involvement, and this included the number of families and children served through a special parent education program for families considered to be at-risk for having children with disabilities. This information is pertinent to understanding the extent to which students with disabilities and their families are actively participating in public education.

It was difficult to determine and interpret participation rates in the 12 states that reported such data because it was still not clear exactly what proportion of students with disabilities were participating in tests. Some states only provided the number of students with disabilities tested, leaving out those students with disabilities who were excluded from testing. From the data available, it appears that between 50 and 80% of students with disabilities did participate in testing. This is similar to the estimated 66% of students included in both the 1988 NELS sample and the 1988 NAEP sample (Ingels, 1996). This is still far from the recommended 85% of students with disabilities who should be participating in statewide assessments (Ysseldyke, Thurlow, McGrew, & Shriner, 1994).

There were many difficulties in determining and interpreting participation rates. These difficulties remain the same as those identified by Erickson, Thurlow, and Ysseldyke (1996). Erickson et al. (1996) defined participation rates as “the number of students with disabilities who take the test, divided by the population of all students with disabilities at the particular age or grade level being tested” (pp. 3-4). Erickson et al. (1996) also provided the following recommendations to help states report on the participation of students with disabilities in large-scale assessments:

(1) Identify students with disabilities in statewide assessment programs. Since 13 states published disaggregated data on students with disabilities, we know that this is being done in at least these states.

(2) Standardize procedures for calculating participation rates. Assessments often are not conducted in alignment with December 1st Child Count data (Erickson et al., 1996), so it is difficult to determine the actual percentage of students with disabilities who took the test. For example, New Hampshire provided the total number and percentage of test-takers whose test booklets were coded to indicate that the person had a disability. In a separate section of the report, it also provided the number and percentage of students who were excluded from testing. Adding these two numbers gives the total number of students with disabilities at the time of testing (provided the sum actually reflects all students with disabilities), from which a participation rate can be calculated. However, New Hampshire does not report participation rates on its own, and most states do not provide actual participation rates at all.
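Under the Erickson et al. (1996) definition, a participation rate can be derived from the two numbers a state like New Hampshire publishes. The counts below are hypothetical, chosen only to illustrate the calculation:

```python
# Hypothetical counts illustrating the Erickson et al. (1996) participation
# rate: students with disabilities tested, divided by all students with
# disabilities at the tested grade level.
tested = 850     # test booklets coded as students with disabilities (illustrative)
excluded = 350   # students with disabilities excluded from testing (illustrative)

# Assumes tested + excluded captures every student with a disability at the
# time of testing, which the report cautions may not hold in practice.
total = tested + excluded
participation_rate = tested / total * 100
print(f"participation rate: {participation_rate:.1f}%")
```

With these illustrative counts the rate works out to about 70.8 percent, within the 50 to 80 percent range reported above; the report's point is that most states publish only one of the two counts, so the division cannot be carried out.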

These data are additionally difficult to interpret due to lack of information on important factors, such as the number and percentage of students with disabilities who took the test with and without accommodations, who took parts but not all of the tests, or who were tested below grade level. Exemption data may also fail to include those students with disabilities who were not “excluded” from the test, but were still not tested due to factors such as attendance or parent preference. For example, in Connecticut, 5.6% of students with disabilities were fully exempted from testing, but an additional 14.4% of students with disabilities were reported as “status not recorded.” It is clear from these data that additional information, as well as standardization across states, would make it easier to assess the participation rates of students with disabilities in large-scale testing.

Although states are beginning to report on the performance and progress of students with disabilities, it is still important to look at why more are not reporting these data. There are several possible explanations for the limited amount of information on students with disabilities:

With the recently passed IDEA Amendments, states are now federally mandated to report on the participation and performance of students with disabilities. To fulfill these requirements successfully, states must have clear guidelines about what is expected of their reporting practices. Models or frameworks that exemplify best practices can help expedite this process.

Our data reflect that states were just beginning to report on students with disabilities. In many states, students with disabilities were still “out of sight.” However, we do not know the extent to which they were also “out of mind.” We should see dramatic changes in reporting practices when we analyze 1998 state reports. We hope these changes will reflect the belief that reporting practices must be more inclusive if we are to have any hope of monitoring the progress and performance of students with disabilities.


References

Council of Chief State School Officers. (1997, August). State education accountability and indicator reports: 1997 (prepublication copy). Washington, DC: CCSSO.

Council of Chief State School Officers. (1998, August). Standards, graduation, assessment, teacher licensure, time and attendance: A fifty state report. Washington, DC: CCSSO.

Erickson, R. E., & Thurlow, M. L. (1997). 1997 State special education outcomes. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Ingels, S. J., & Quinn, P. (1996, May). Sample exclusion in NELS: 88. Characteristics of base year ineligible students, changes in eligibility status after four years (NCES 96-723). Washington, DC: National Center for Education Statistics.

Kirsch, I. S., Jungeblut, A., Jenkins, L., & Kolstad, A. (1993, September). Adult literacy in America: A first look at the results of the National Adult Literacy Survey. Washington, DC: Educational Testing Service, National Center for Education Statistics.

Rossi, R., Herting, J., & Wolman, J. (1997, June). Profiles of students with disabilities as identified in NELS: 88 (NCES 97-254). Washington, DC: National Center for Education Statistics.

Thurlow, M. L., Langenfeld, K. H., Nelson, J. R., Shin, H., & Coleman, J. E. (1998, May). State accountability reports: What are states saying about students with disabilities? Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

U.S. Department of Education. (1997). Nineteenth annual report to Congress on the implementation of the Individuals with Disabilities Education Act. Washington, DC: U.S. Department of Education.

Wagner, M., D'Amico, R., Marder, C., Newman, L., & Blackorby, J. (1992). What happens next? Trends in postschool outcomes of youth with disabilities. Menlo Park, CA: SRI International.

Wagner, M., Newman, L., D'Amico, R., Jay, E. D., Butler-Nalin, P., Marder, C., & Cox, R. (1991). Youth with disabilities: How are they doing? Menlo Park, CA: SRI International.

Ysseldyke, J., Krentz, J., Elliott, J., Thurlow, M., Erickson, R., & Moore, M. (1998). Framework for educational accountability. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Ysseldyke, J. E., Thurlow, M. L., Kozleski, E., & Reschly, D. (1998). Accountability for the results of educating students with disabilities: Assessment conference report on the new assessment provisions of the 1997 amendments to the Individuals with Disabilities Education Act. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Ysseldyke, J. E., Thurlow, M. L., McGrew, K. S., & Shriner, J. G. (1994). Recommendations for making decisions about the participation of students with disabilities in statewide assessment programs. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.


Appendix A

State Accountability Reports Included in Analysis

Alabama Department of Education (1994). State of Alabama department of education bulletin 1994 - No. 2 annual report 1994. Statistical and financial data for 1993-1994. Montgomery, AL: Author.

*Alabama Department of Education (1997). System report card [On-line]. Available: World Wide Web: http://www.alsde.edu/.

Alaska Department of Education (1997, April). Summary of Alaska's public school districts' report cards to the public: School year 1995-96. Juneau, AK: Author.

Arizona Department of Education (1997a). Statewide report and appendices: Arizona pupil achievement testing 1995-96 school year. Tempe, AZ: Author.

*Arizona Department of Education (1997b). State report, Arizona student achievement program. Tempe, AZ: Author.

*Arizona Department of Education (1997c). Arizona school report card [On-line]. Available: World Wide Web: http://www.ade.state.az.us/.

*Arkansas Department of Education (1996, December). Arkansas department of education Annual school district 1995-96 report card. Little Rock, AR: Author.

*Arkansas Department of Education (1997). Advisory council for the education of children with disabilities. Little Rock, AR: Author.

California Department of Education (1997). Public school summary statistics [On-line]. Available: World Wide Web: http://www.cde.ca.gov/.

Colorado Department of Education (1997, October). Foundations for high achievement: Safety, civility, literacy. K-12 public education. Denver, CO: Author.

Connecticut State Board of Education (1997a, January). Special education in Connecticut 1995-96. Hartford, CT: Author.

*Connecticut State Board of Education (1997b, February). Profile of our schools: Condition of education in Connecticut 1995-96. Hartford, CT: Author.

*Connecticut State Department of Education (1997c). Strategic school profiles, 1995-96. CD-ROM. Hartford, CT: Author.

Delaware Department of Education (1997, October). State summary report: 1997 Delaware writing assessment program. Dover, DE: Author.

*Delaware Department of Public Instruction (1995, September). Delaware interim assessment report 1993-1995. Dover, DE: Author.

District of Columbia Office of Educational Accountability. (1996, December). A five year statistical glance at D.C. public schools: School years 1991-92 through 1995-96. Washington, DC: Author.

Florida Department of Education (1996a, December). School advisory council report 1995-96: Sample report: elementary school. Tallahassee, FL: Author.

Florida Department of Education (1996b, December). School advisory council report 1995-96: Sample report: middle school. Tallahassee, FL: Author.

Florida Department of Education (1996c, December). School advisory council report 1995-96: Sample report: High school. Tallahassee, FL: Author.

Georgia State Department of Education (1996a, December). 1995-96 Report cards: State, systems and schools. CD-ROM. Atlanta, GA: Author.

*Georgia State Department of Education (1996b, June). Georgia student assessment program. Official state summary. Georgia high school graduation tests, Spring 1996. Atlanta, GA: Author.

*Georgia State Department of Education (1996c). Georgia student assessment program. Official state summary. Curriculum-based assessment program results, 1995-96. Atlanta, GA: Author.

*Georgia State Department of Education (1996d, July). Georgia grades 5 and 8 writing assessments. Official state summary, 1995-96. Atlanta, GA: Author.

*Georgia State Department of Education (1996e). Georgia student assessment program. Official state summary. Norm-referenced test scores, 1995-96. Atlanta, GA: Author.

*Georgia State Department of Education (1996f). Grade three CBA. State summary information, 1996. Atlanta, GA: Author.

Hawaii Department of Education (1996). School status and improvement report: School year 1994-1995. Honolulu, HI: Author.

Hawaii Department of Education (1997, May). The superintendent's seventh annual report on school performance and improvement in Hawaii: 1996. Honolulu, HI: Author.

*Idaho State Department of Education Public School Finance (1997, February). Annual statistical report 1996-1997. Public school certified and noncertified personnel. Boise, ID: Author.

*Idaho State Department of Education (1995). 1995-96 Idaho school profiles. Boise, ID: Author.

*Illinois State Board of Education (1996a). IGAP (Illinois goal assessment program): Summary of student achievement in Illinois 1995-1996. Springfield, IL: Author.

*Illinois State Board of Education (1996b). 1996 Sample school report card, Elementary. Springfield, IL: Author.

*Illinois State Board of Education (1996c). 1996 Sample school report card, High School. Springfield, IL: Author.

Indiana Department of Education (1997). Indiana K-12 school data [On-line]. Available: World Wide Web: http://www.doe.state.in.us/.

Iowa Department of Education (1997). The annual condition of education report, 1997: A report on pre-kindergarten, elementary, and secondary education. Des Moines, IA: Author.

*Kansas State Department of Education (1997, August). Kansas assessment program: Results of 1997 mathematics, reading, science, and social studies. Topeka, KS: Author.

Kansas State Board of Education (1996). Accountability report 1995-96. Topeka, KS: Author.

*Kentucky Department of Education (1996). KIRIS 1995-96 assessment curriculum report. Louisville, KY: Author.

*Kentucky Department of Education (1997, September). Kentucky school and district corrected accountability results. Louisville, KY: Author.

*Louisiana Department of Education (1997a, March). 1995-96 Louisiana progress profiles state report. Baton Rouge, LA: Author.

*Louisiana Department of Education (1997b, March). 1995-96 Louisiana progress profiles district composite report. Baton Rouge, LA: Author.

*Louisiana Department of Education (1997c, April). 1995-96 Louisiana progress profiles district composite report. CD-ROM. Baton Rouge, LA: Author.

Louisiana Department of Education (1997d, July). State special education data profile. Baton Rouge, LA: Author.

Maine Department of Education (1996, October). Maine educational assessment report. Sample, Grade 8. Augusta, ME: Author.

Maine Department of Education (1997). Maine Education Assessment (MEA) scores, 1996-97 school year. [On-line]. Available: World Wide Web: http://www.state.me.us/education/homepage.htm.

Maryland State Department of Education (1996, December). Maryland school performance report, 1996: State and school systems. Baltimore, MD: Author.

*Massachusetts Executive Office of Education (1997a). Massachusetts school district profiles. Boston, MA: Author.

Massachusetts State Department of Education (1997b). 1997 School district profiles [On-line]. Available: World Wide Web: http://www.doe.mass.edu/.

*Michigan Department of Education (1997). Michigan school report [On-line]. Available: World Wide Web: http://www.med.state.mi.us/.

*Minnesota Department of Children, Families, and Learning (1997) [On-line]. Information Available: World Wide Web: http://www.educ.state.mn.us/.

Mississippi Department of Education (1997). Annual report of the state superintendent of public education, 1997. Jackson, MS: Author.

Mississippi Department of Education, Department of Accountability (1996). Mississippi report card [On-line]. Available: World Wide Web: http://mdek12.state.ms.us/report/acc.htm.

Missouri State Board of Education (1996a). 1995-96 Report of the public schools of Missouri. Jefferson City, MO: Author.

*Missouri State Department of Elementary and Secondary Education (1996b, February). Profiles of Missouri public schools. Financial, pupil & staff data for fiscal year 1995-96. Jefferson City, MO: Author.

*Montana Office of Public Instruction (1997, April). Montana statewide summary, 1995-96 Student Assessment Rule 10.56.101, ARM. Helena, MT: Author.

*Montana Office of Public Instruction (1997). Montana public school enrollment data: Fall 1996-97. Helena, MT: Author.

*Nebraska Department of Education (1997). Statistics and facts about Nebraska public schools. Lincoln, NE: Author.

*Nevada Department of Education (1995). Analysis of Nevada school accountability system (based on NRS 385.347) submitted to Nevada State Legislature. Carson City, NV: Author.

New Hampshire Department of Education (1996a). Educational Improvement and Assessment Program: Educational assessment report: End-of-grade six. Concord, NH: Author.

New Hampshire Department of Education (1996b). Educational Improvement and Assessment Program: Educational assessment report: End-of-grade ten. Concord, NH: Author.

New Hampshire Department of Education (1996c). Educational Improvement and Assessment Program: Educational assessment report: End-of-grade three. Concord, NH: Author.

*New Hampshire Department of Education (1996d). Educator report 1995-96: Statistical report #2. Concord, NH: Author.

*New Hampshire Department of Education (1995). Race of pupils enrolled in New Hampshire schools 1989-1995: Statistical report #3. Concord, NH: Author.

*New Hampshire Department of Education (1997). Fall enrollments as of October 1, 1996; student teacher ratio for 1996-97; teacher average salary for 1996-97. Concord, NH: Author.

New Jersey State Department of Education (1995a). April 1995 Grade 11 High School Proficiency Test (HSPT11): State Summary. Trenton, NJ: Author.

New Jersey State Department of Education (1995b). March 1995 Grade 8 Early Warning Test (EWT): State Summary. Trenton, NJ: Author.

New Jersey State Department of Education Office of Special Education Programs (1996). Special education: A statistical report for the 1995-96 school year. Trenton, NJ: Author.

New Mexico State Department of Education (1996, November). The accountability report: Indicators of the condition of public education in New Mexico. Santa Fe, NM: Author.

*New York State Education Department (1996). Comprehensive assessment report (a resource document). Albany, NY: Author.

*North Carolina State Board of Education (1996). Report card 1996: The state of school systems in North Carolina. Raleigh, NC: Author.

North Carolina State Board of Education (1997a, February). The 1995-96 North Carolina state testing results: Multiple-choice end-of-grade and end-of-course tests. Raleigh, NC: Author.

*North Carolina State Board of Education (1997b). North Carolina Open-Ended Assessment Grades 5 and 6. Raleigh, NC: Author.

North Carolina State Board of Education (1997c, July). North Carolina testing program: Report of student performance in writing 1996-1997: Grades 4, 7, and 10. Raleigh, NC: Author.

*North Carolina State Board of Education (1997d). State of the state: Education performance in North Carolina, 1996. Raleigh, NC: Author.

North Dakota Department of Public Instruction (1996). North Dakota school report of the state superintendent of public instruction 1995-1996. Bismarck, ND: Author.

North Dakota Department of Public Instruction (1997). North Dakota 1997 research results (State-wide testing) (NP of the mean NCE). Bismarck, ND: Author.

Ohio Department of Education (1997). Education Management Information System (EMIS) state profile: FY95 vs. FY96 [On-line]. Available: World Wide Web: http://ode000.ode.ohio.gov/www/ims/emis_profiles/state_profile96.txt (from www.ode.ohio.gov, choose “Reports/Data,” then “EMIS Profiles,” then “State Composite Profiles 1996”; information cited is on page 5).

*Oklahoma State Department of Education (1997a, June). Profiles 1996 Oklahoma educational indicators program: District report volume II - eastern Oklahoma. OK: Author.

*Oklahoma State Department of Education (1997b, June). Profiles 1996 Oklahoma educational indicators program: District report volume II - western Oklahoma. OK: Author.

Oklahoma State Department of Education (1997c, April). Profiles 1996 Oklahoma educational indicators program: State report. OK: Author.

Oregon Department of Education (1997a, September). 1996-97 Oregon report card: An annual report to the legislature on Oregon public schools. Salem, OR: Author.

Oregon Department of Education (1997b). 1996 Status report: Special education, student services, and compensatory education. Salem, OR: Author.

*Pennsylvania Department of Education (1997a, June). The 1995-96 Pennsylvania school profiles [On-line]. Available: World Wide Web: http://www.paprofiles.org/.

*Pennsylvania Department of Education (1997b, March). The Pennsylvania system of school assessment: 1995-96 School-by-school results quartile distribution. Harrisburg, PA: Author.

Pennsylvania Department of Education (1996, September). Status report on education in Pennsylvania. Harrisburg, PA: Author.

Rhode Island Department of Elementary and Secondary Education (1997a, June). Rhode Island public schools: 1996 District profiles. Providence, RI: Author.

Rhode Island Department of Elementary and Secondary Education (1997b, April). Statistical profile of special education 1995-96. Providence, RI: Author.

*Rhode Island Department of Elementary and Secondary Education (1997c, May). Student performance in Rhode Island, 1996. Providence, RI: Author.

South Carolina Department of Education (1996a). School performance profile. Columbia, SC: Author.

South Carolina Department of Education (1996b). South Carolina education profiles. Columbia, SC: Author.

South Carolina Department of Education (1996c). State performance profile. Columbia, SC: Author.

South Carolina State Board of Education (1996d, December). What is the penny buying for South Carolina? Columbia, SC: Author.

South Dakota Department of Education & Cultural Affairs (1997, January). 1995-96 Education in South Dakota: A statistical profile. SD: Author.

Tennessee Department of Education (1997, November). 21st Century Report Card 1997 [On-line]. Available: World Wide Web: http://www.state.tn.us/education/rptcrd97.

Texas Education Agency (1996, November). Academic Excellence Indicator System for the state of Texas 1994-95 [On-line]. Available: World Wide Web: http://www.tea.state.tx.us.

The University of the State of New York, the New York State Education Department, and the Office of Vocational and Educational Services for Individuals with Disabilities (1996a). Consolidated special education performance report for 1994-1995: State grant program, section 611 IDEA, preschool grant program, section 619 IDEA, state operated programs, chapter 1 ESEA. Albany, NY: Author.

The University of the State of New York, the New York State Education Department, and the Office of Vocational and Educational Services for Individuals with Disabilities (1996b). VESID: 1996 pocketbook of goals and results for individuals with disabilities. Albany, NY: Author.

The University of the State of New York, and the New York State Education Department (1997a). A report to the Governor and the Legislature on the educational status of the State's schools: Submitted February 1997: Statewide profile of the educational system. Albany, NY: Author.

The University of the State of New York, and the New York State Education Department (1997b). A report to the Governor and the Legislature on the educational status of the State's schools: Submitted February 1997: Statistical profiles of public school districts. Albany, NY: Author.

*Utah State Office of Education (1996a). 1995-96 Accountability reports for all districts and schools. Salt Lake City, UT: Author.

Utah State Office of Education (1996b). 1995-96 State superintendent of public instruction: Annual report. Salt Lake City, UT: Author.

Utah State Office of Education (1996c, January). 1995-96 Annual report of the state superintendent of public instruction: Summary of statistical and financial data. Salt Lake City, UT: Author.

*Vermont Department of Education (1996, March). A scorecard for school finance FY95. Montpelier, VT: Author.

*Vermont Department of Education (1997a, June). Summary of the annual statistical report of schools: Describing and summarizing the FY96 school statistical report. Montpelier, VT: Author.

Vermont Department of Education (1997b, January). Vermont special education: Expenditures, equity and outcomes. Montpelier, VT: Author.

Vermont Department of Education & Center for Rural Studies (1997). Vermont department of education school reports [On-line]. Available: World Wide Web: http://crs.uvm.edu/schlrpt.

*Virginia Department of Education (1997a). 1997 interpretive guide to reports. Richmond, VA: Author.

Virginia Department of Education (1997b). 1997 Virginia summary report: Outcome accountability project. Richmond, VA: Author.

*Washington Office of Superintendent of Public Instruction (1997, February). Washington state assessment program: Grades 4, 8, and 11. WA: Author.

Washington Office of Superintendent of Public Instruction, Special Education (1996, August). Special education: Fourth annual report for special education services in Washington state. WA: Author.

*West Virginia Department of Education (1997, May). West Virginia performance based accreditation system: School district approval status and school accreditation status. WV: Author.

*West Virginia Department of Education (1997). West Virginia report cards 1995-96: State, county & school data. WV: Author.

*Wisconsin Department of Public Instruction (1997, October). Wisconsin statewide school performance report 1997 [On-line]. Available: World Wide Web: http://www.dpi.state.wi.us.

*Wyoming Department of Education (1997, February). Statistical report series no. 1: 1996 School district property valuations, mill levies and bonded debt. Cheyenne, WY: Author.

*Wyoming Department of Education (1997, March). Statistical report series no. 2: 1996 School district property valuations, mill levies and bonded debt. Cheyenne, WY: Author.

*Wyoming Department of Education (1997, January). Statistical report series no. 3: 1995-96 Wyoming public school fund accounting and reporting. Cheyenne, WY: Author.

* Indicates state reports that were collected but did not contain information on students with disabilities (n=55).


Figures and Charts from Technical Report 23 are available by purchasing the report from the NCEO Publications Office, or by viewing the PDF version.