Data on LEP Students in State Education Reports


Minnesota Report 26

Published by the National Center on Educational Outcomes

Prepared by Kristin Liu, Deb Albus, and Martha Thurlow

August 2000


This document has been archived by NCEO because some of the information it contains is out of date.


Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Liu, K., Albus, D., & Thurlow, M. (2000). Data on LEP students in state education reports (Minnesota Report No. 26). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved [today's date], from the World Wide Web: http://education.umn.edu/NCEO/OnlinePubs/MnReport26.html


Overview

Title I of the Elementary and Secondary Education Act mandates that state education agencies develop and implement an assessment system that “allows for disaggregation of results at state, district and school levels, by gender, race, English proficiency and migrant status” (Baker, 1996). The primary goals of Title I legislation are to focus on high standards, promote effective instruction, and improve the quality of school curricula and instruction (U.S. Department of Education, 1996). Schools receiving Title I funds must demonstrate “adequate yearly progress” in student performance, as stated in Public Law 103-382 (Linn & Herman, 1997). Determining adequate yearly progress depends on the existence of high-quality assessment data on the performance of students receiving Title I services, particularly on the achievement of students with limited English proficiency, who are working to learn academic English and content material at the same time.
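For readers who think of this requirement in data terms, the following minimal sketch (Python, with invented sample records; not data or code from any state system) shows the basic operation the mandate describes: the same set of test scores summarized separately for each reporting subgroup, here LEP status.

```python
# Invented sample records -- real data would come from a state's assessment files.
from statistics import mean

records = [
    {"student": "A", "lep": True,  "score": 62},
    {"student": "B", "lep": False, "score": 81},
    {"student": "C", "lep": True,  "score": 55},
    {"student": "D", "lep": False, "score": 74},
]

# Disaggregation: report the same statistic separately for each subgroup.
for status, label in ((True, "LEP"), (False, "non-LEP")):
    group = [r["score"] for r in records if r["lep"] == status]
    print(f"{label}: n={len(group)}, mean score={mean(group):.1f}")
```

A real state system would apply the same grouping by gender, race, and migrant status as well, as the quoted language requires.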

Some educators and policymakers argue against reporting assessment data specifically for students with limited English proficiency, saying that it reinforces the idea that these students are different from their peers (August, Hakuta, & Pompa, 1994), and that information on limited English proficient (LEP) students will be misunderstood or misused to blame LEP students for their lack of achievement (NCES, 1996). However, it is important to keep in mind that the true aim of reporting results is to improve student learning (NCREL, 2000). As Linn and Herman (1997) point out, assessments provide the targets for teaching and learning. Publication of assessment data can motivate educators and others to conduct more in-depth analyses of what students are learning, how students are being taught, and where changes can be made in curriculum and teaching methods to improve student learning. “When properly presented, assessment reports can help build support for schools and for initiatives that educators wish to carry out” (NCREL, 2000).

Each year, states and districts report on the performance of students on achievement tests. The Council of Chief State School Officers (1997) produced a report listing all of the education reports that states indicated they produce. The National Center on Educational Outcomes (NCEO) (Thurlow, Langenfeld, Nelson, & Ysseldyke, 1998) recently studied the extent to which these reports provided data on the performance of students with disabilities, finding that only 12 states had reported any such data.

The purpose of this report is to examine practices in the reporting of LEP student performance data across the 50 states and the District of Columbia. We did this by examining state reports, collected in 1997 and 1998, that included data spanning the 1995-96 through 1997-98 school years. This information provides important evidence of the extent to which states are looking at the performance of their LEP students.


Methods

For our analysis, we used data obtained from states’ public documents that report student testing performance. The documents had been collected by NCEO staff, who contacted each state’s accountability office to request the documents listed by the Council of Chief State School Officers (CCSSO, 1997), along with any supplemental information the state could provide about documents available on the World Wide Web or in print.

Documents were received between October 1997 and March 1998. These reports spanned school years 1995-96 through 1997-98 and were included in an NCEO report focusing on special education students (Thurlow et al., 1998).

For this analysis we used 73 of the 115 reports collected by NCEO. The reports that were excluded had been published primarily as special education reports and did not use LEP descriptors in their reporting. Data analyzed in this report are presented as closely as possible to their original form in the state reports.


Results

The 65 printed reports and nine Internet-only data sources came from the 50 states and the District of Columbia. Among these, nine reports included data that disaggregated LEP student performance on statewide testing. The tests from which the data came included commercial or state-developed graduation standards tests, grade-level tests, and literacy tests. Appendix A lists the documents used for this report. Appendix B briefly describes the information found in the reports, in the same detail as provided by the reports.

Table 1 indicates which states provided disaggregated data on the performance of LEP students on at least one test at one grade level. All reported LEP student data appeared within annual or other regularly published reports. As the table shows, six states reported disaggregated test scores for LEP students (Delaware, Georgia, New Hampshire, North Carolina, Rhode Island, and Virginia).

 

Table 1:  States that Report Disaggregated LEP Student Test Data

State | Yes/No | State | Yes/No | State | Yes/No
Alabama | No | Louisiana | No | Ohio | No
Alaska | No | Maine | No | Oklahoma | No
Arizona | No | Maryland | No | Oregon | No
Arkansas | No | Massachusetts | No | Pennsylvania | No
California | No | Michigan | No | Rhode Island | Yes
Colorado | No | Minnesota | No | South Carolina | No
Connecticut | No | Mississippi | No | South Dakota | No
Delaware | Yes | Missouri | No | Tennessee | No
Florida | No | Montana | No | Texas | No
Georgia | Yes | Nebraska | No | Utah | No
Hawaii | No | Nevada | No | Vermont | No
Idaho | No | New Hampshire | Yes | Virginia | Yes
Illinois | No | New Jersey | No | Washington | No
Indiana | No | New Mexico | No | West Virginia | No
Iowa | No | New York | No | Wisconsin | No
Kansas | No | North Carolina | Yes | Wyoming | No
Kentucky | No | North Dakota | No | District of Columbia | No

Five states reported only the participation of LEP students in testing, not their performance (Alaska, Maryland, New Jersey, Texas, and Washington). Other states reported scores by other categories of students (e.g., Hispanic) without specifying LEP status. For example, 11 states reported scores only by race or ethnicity, with varying degrees of specificity (Arizona, Colorado, Connecticut, Florida, Kansas, Kentucky, Maryland, New Mexico, Pennsylvania, South Carolina, and Texas). One state, Arizona, also reported scores by native/home language group but did not identify students as LEP or non-LEP.

 

Types of Test Scores Reported for LEP Students

Table 2 shows the states that reported disaggregated LEP scores, by year and by the types of test scores reported. Five of the states reported scores from state-developed tests. Of these, two states reported only writing tests (Delaware and North Carolina); the other three reported state test results in multiple subjects. One other state (Rhode Island) used the Metropolitan Achievement Test (MAT), a standardized commercial test, and thus also reported on multiple subjects.

 

Table 2:  Types of LEP Student Test Data Disaggregated

State (type of test) | Years | Grades | Writing | Lang Arts/Rdg | Math | Social Studies | Science | Composite
Delaware (state test) | 1993, 1996, 1997 | 3, 5, 8, 10 | X | | | | |
Delaware (state test) | 1994-1997; 1995-1997 | 3-5, 5-8, and 8-10 | X | X | X | X | X |
Georgia (state test) | 1996 | 11 | X | X | X | X | X |
New Hampshire (state test) | 1996 | End of grade 3 | X | X | X | X | X |
New Hampshire (state test) | 1996 | End of grade 6 | X | X | X | X | X |
North Carolina (state test) | 1996-1997 | 4, 7 | X | | | | |
Rhode Island (MAT) | 1996-1997 | 4, 8, 10 | X | X | X | | |
Virginia (state LTP) | 1994-1995 | 6 to 11 | X | X | X | | | X
Virginia (state LTP) | 1994-1995 | 9 to 11 (ungraded)* | X | X | X | | | X
Virginia (state LTP) | 1994-1995 | None reported (ungraded)** | X | X | X | | | X

LTP = Literacy Test Program
* Grade shown is where students would be placed if they passed the literacy tests.
** Ungraded for reasons other than the literacy testing program (alternative programs).

 

As shown in Table 2, the grades for which data were most commonly reported were 8 and 10. Most states reporting LEP student data did so for writing, followed by reading and math. Four of the six states reported testing LEP students in reading and math (Georgia, New Hampshire, Rhode Island, and Virginia), and five states reported tests in writing (Delaware, New Hampshire, North Carolina, Rhode Island, and Virginia). Only two states (Georgia and New Hampshire) provided data in the other subject areas, social studies and science.

Overall, the documents that reported the performance of LEP students (see Appendix B) showed that LEP students are not performing as well on state assessments as other students.

Tables 3 through 7 show how LEP students performed in each state. They are not meant to support comparisons of students across states, because states use different assessment instruments and may have set different passing levels.

 

Table 3:  Delaware LEP Student Test Data

Grade | Year | % at 4 | % at 3.5 | % at 3 | % at 2.5 | % at 2 | % at 1.5 | % at 1 | Average score | % at 2.5 or above
3rd | 1997 | 0% | 0% | 0% | 33% | 33% | 17% | 17% | 1.9 | 33%
3rd | 1996 | 0% | 0% | 9% | 8% | 28% | 20% | 35% | 1.7 | 17%
3rd | 1993 | 0% | 5% | 9% | 11% | 35% | 12% | 28% | 1.9 | 25%
5th | 1997 | 0% | 0% | 8% | 8% | 21% | 25% | 38% | 1.6 | 16%
5th | 1996 | 0% | 2% | 10% | 6% | 27% | 14% | 41% | 1.7 | 18%
5th | 1993 | 1% | 0% | 10% | 8% | 43% | 21% | 17% | 1.9 | 19%
8th | 1997 | 0% | 0% | 9% | 5% | 68% | 9% | 9% | 2.0 | 14%
8th | 1996 | 6% | 2% | 30% | 11% | 32% | 8% | 11% | 2.4 | 49%
8th | 1993 | 5% | 0% | 23% | 8% | 36% | 8% | 20% | 2.1 | 36%
10th | 1997 | 0% | 0% | 0% | 11% | 67% | 11% | 11% | 1.9 | 11%
10th | 1996 | 2% | 0% | 32% | 16% | 34% | 9% | 7% | 2.3 | 50%
10th | 1993 | 3% | 3% | 30% | 16% | 24% | 11% | 13% | 2.3 | 52%

Note: Percentages are the percent of LEP students receiving each score on the 1-4 scale.

States reporting only writing tests showed LEP students with low passing rates (see Tables 3 and 4). Both Delaware and North Carolina used a 4-point scale. In Delaware, over three test years, the percentage of students scoring at least 2.5 ranged from 17-33% in grade 3, 16-19% in grade 5, 14-49% in grade 8, and 11-52% in grade 10; Delaware did not report the total number of LEP students tested. In North Carolina, 26.0% scored 2.5 or above in grade 4 (N = 730) and 26.2% scored 2.5 or above in grade 7 (N = 649). Compared to the non-LEP student scores reported (see Appendix B), LEP students scored consistently lower across years, which may be expected because these students are learning both language and content at the same time.
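To illustrate how the summary columns in Table 3 follow from the reported score distributions, this short sketch (Python; ours, not part of the original report) recomputes Delaware’s grade 3, 1997 average score and percentage at or above the 2.5 cutoff.

```python
# Recompute Delaware's grade 3, 1997 summary statistics (Table 3) from the
# reported distribution of writing scores on the 1-4 scale.

distribution = {4.0: 0, 3.5: 0, 3.0: 0, 2.5: 33, 2.0: 33, 1.5: 17, 1.0: 17}

# Average score: weight each scale point by the percentage of students at it.
average = sum(score * pct for score, pct in distribution.items()) / 100

# Percentage at or above the 2.5 cutoff used by Delaware and North Carolina.
at_or_above = sum(pct for score, pct in distribution.items() if score >= 2.5)

print(f"Average score: {average:.1f}")       # 1.9, matching Table 3
print(f"% at 2.5 or above: {at_or_above}%")  # 33%, matching Table 3
```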

 

Table 4:  North Carolina LEP Student Test Data

Grade | Year | % at 4 | % at 3.5 | % at 3 | % at 2.5 | % at 2 | % at 1.5 | % at 1 | % NS | No. tested | % at 2.5 or above
4 | 1996-97 | 0% | 0.7% | 11.6% | 13.7% | 57.3% | 5.1% | 8.5% | 3.2% | 730 | 26.0%
7 | 1996-97 | 0% | 0.2% | 13.1% | 12.9% | 50.4% | 5.4% | 13.9% | 4.2% | 649 | 26.2%

Note: Percentages are the percent of LEP students receiving each focused holistic score point.

For states reporting on multiple subjects, reading tended to be the most difficult area for LEP students to pass, although in some states social studies and science were the most difficult, as shown in Table 5.

 

Table 5:  Georgia and Virginia LEP Student Test Data

% of LEP Students Passing State Tests with Multiple Subjects

State | Grade | English (N / % pass) | Math (N / % pass) | Soc. Studies (N / % pass) | Science (N / % pass) | Writing (N / % pass) | All Tests (N / % pass)
Georgia | 11 | 372 / 44% | 373 / 64% | 366 / 33% | 286 / *26-32% | -- | --
Georgia | 10, 11, 12 | 500 / 49% | 515 / 65% | 500 / *41-47% | 488 / *31-37% | 570 / 39% | --
Virginia | 6 | 407 / 40% | 548 / 71% | -- | -- | 402 / 58% | 388 / 32%
Virginia | 7 | 235 / 27% | 215 / 36% | -- | -- | 193 / 45% | 80 / 9%
Virginia | 8 | 294 / 31% | 237 / 42% | -- | -- | 215 / 46% | 93 / 16%
Virginia | 9 | 229 / 20% | 252 / 39% | -- | -- | 211 / 51% | 114 / 11%
Virginia | 10 | 243 / 23% | 201 / 43% | -- | -- | 213 / 44% | 118 / 14%
Virginia | 11 | 246 / 26% | 151 / 54% | -- | -- | 209 / 60% | 111 / 19%
Virginia | Repeat 9 | 81 / 30% | 43 / 12% | -- | -- | 53 / 40% | 21 / 5%
Virginia | Repeat 10 | 63 / 18% | 37 / 16% | -- | -- | 45 / 40% | 19 / 0%

-- means not reported
* means estimated

Among repeat test takers in Virginia, math was also a subject that fewer LEP students passed. Note, too, that the percentage of LEP students passing all tests was low.

Scores for Rhode Island and New Hampshire were not reported as percentages passing; instead, they were reported by proficiency level across subjects (see Tables 6 and 7).

 

Table 6:  Rhode Island LEP Student Test Data

Subject | Grade 4 (N / % Low / % Mid / % High) | Grade 8 (N / % Low / % Mid / % High) | Grade 10 (N / % Low / % Mid / % High)
Reading | 509 / 90.4% / 9.0% / 0.6% | 227 / 95.1% / 4.9% / 0.0% | 199 / 69.3% / 16.6% / 14.1%
Math | 499 / 89.8% / 7.8% / 2.4% | 226 / 85.5% / 12.4% / 1.8% | 194 / 67.5% / 19.6% / 12.9%
Writing | 603 / 76.2% / 23.1% / 0.7% | 352 / 77.5% / 22.2% / 0.3% | 169 / 76.3% / 23.7% / 0.0%

 

Table 7:  New Hampshire LEP Student Test Data

New Hampshire Bilingual/LEP | English (% Basic or above) | Math (% Basic or above) | Social Studies (% Basic or above)
End of grade 3 | 41% | 45% | Not tested
End of grade 6 | 16% | 11% | 11%


Discussion

It is apparent from our analysis that LEP student performance often is not disaggregated from the scores included in education reports in the U.S. Even when the data are disaggregated, they are not necessarily available in all the same ways as performance data for other students. For example, some states had LEP student performance data on Web sites that were not available in printed documents. It may also be that some states provide data on the performance of LEP students in other documents. Because this analysis was conducted using documents already collected by NCEO, we did not have the opportunity to ask for additional documents that might contain only LEP data. When NCEO asked for reports with data on special education students, it received numerous additional documents; whether the same would have occurred for LEP students remains a question.

Further, states do not all classify limited English proficient students using the same criteria, and they may use different terms to refer to these students. For example, in one document we found the term LES (Limited English Speaking). It is important for states to explain their terminology, because “LEP” may refer to all students who are eligible for services or only to students receiving services. Before it is appropriate to look at data across states, there needs to be a common name and common criteria for identifying LEP students.

We noted that states may report test performance for other categories such as language group, race, or ethnicity, but these categories do not indicate whether students are receiving ESL or bilingual services. States may also decide to exempt some or all LEP students from testing, so that their reports contain no LEP data, while other states may test LEP students alongside regular students but report them together; such merging does not allow anyone to track LEP performance. Still other states may provide an alternative test in an LEP student’s first language; these data may appear in a completely different report from regular students’ results, or may not be reported at all. Finally, states may not disaggregate LEP performance consistently over time. LEP student performance may be reported in one study or report but not in later ones, and some states may report LEP performance for some tests but not for others.

We recommend consistent terminology and criteria for identifying LEP students. Further, all states should be expected to report the number of LEP students exempted from testing, as well as the performance of the students tested, as is required for Title I assessments. It should also be the goal of states to track the disaggregated performance of LEP students over time, so that their needs can be better identified.
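A minimal sketch, using invented counts rather than figures from any state report, of the paired statistics this recommendation calls for: the share of identified LEP students actually tested, and the performance of those who were tested.

```python
# Hypothetical counts -- not from any state report -- illustrating the
# recommended reporting: exemptions alongside performance.

lep_identified = 850   # invented: LEP students identified statewide
lep_exempted = 120     # invented: LEP students exempted from testing
lep_passing = 292      # invented: tested LEP students who passed

lep_tested = lep_identified - lep_exempted          # 730 students tested
participation_rate = lep_tested / lep_identified    # share of LEP students tested
passing_rate = lep_passing / lep_tested             # performance of those tested

print(f"LEP participation: {participation_rate:.1%}")       # 85.9%
print(f"LEP passing rate (of tested): {passing_rate:.1%}")  # 40.0%
```

Publishing both figures together matters because a high passing rate can mask low participation when many LEP students are exempted.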

 

Conclusion

While we believe states have come a long way in including students with limited English proficiency in statewide assessments, at this time few states publish documents containing assessment data specifically for LEP students. Among the states that do disaggregate data, one of the biggest issues we noted is a lack of consistency in where results are reported (Internet vs. printed documents), how results are disaggregated (race, ethnicity, or LEP status), and what terminology is used to refer to students with limited English proficiency. These inconsistencies make it difficult to interpret and use the limited amount of disaggregated data that currently exists to improve instruction for LEP students.


References

August, D., Hakuta, K., & Pompa, D. (1994). For all students: LEP students and Goals 2000 (discussion paper). Paper presented at NAE panel meeting No. 15, Washington, DC.

Baker, E. (1996). Assessment requirements under Title I of the Elementary and Secondary Education Act. In E. Baker (Ed.), Improving America’s schools: A newsletter on issues in school reform [On-line]. Available: http://www.ed.gov/pubs/IASA/newletters/assess/pt2.html.

Council of Chief State School Officers. (1997). State education accountability and indicator reports (pre-publication copy). Washington, DC: Author.

Linn, R., & Herman, J. (1997). A policymaker’s guide to standards-led assessment. Denver, CO: ECS Distribution Center.

National Center for Education Statistics. (1996). Proceedings of the conference on inclusion guidelines and accommodations for LEP students in the NAEP: December 5-6, 1994 (NCES 96-861). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

North Central Regional Educational Laboratory. (2000). Critical issue: Reporting assessment results [On-line]. Available: http://www.ncrel.org/sdrs/areas/issues/methods/assment/as600.htm.

Thurlow, M. L., Liu, K., Erickson, R., Spicuzza, R., & El Sawaf, H. (1996). Accommodations for students with limited English proficiency: Analysis of guidelines from states with graduation exams (Minnesota Assessment Report 6). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

U.S. Department of Education. (1996). Mapping out the national assessment of Title I: The interim report [On-line]. Available: http://www.ed.gov/pubs/NatAssess/index.html.

Ysseldyke, J., Thurlow, M. L., Langenfeld, K., Nelson, R., Teelucksingh, E., & Seyfarth, A. (1998). Educational results for students with disabilities: What do the data tell us? (Technical Report 23). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.


Appendix A

State Accountability Reports Included in Analysis

Alabama Department of Education (1997). System report card [On-line]. Available: World Wide Web: http://www.alsde.edu/.

Alaska Department of Education (1997, April). Summary of Alaska’s public school districts’ report cards to the public: School year 1995-96. Juneau, AK: Author.

Arizona Department of Education (1997a). Statewide report and appendices: Arizona pupil achievement testing 1995-96 school year. Tempe, AZ: Author.

Arizona Department of Education (1997b). State report, Arizona student achievement program. Tempe, AZ: Author.

Arkansas Department of Education (1996, December). Annual school district 1995-96 report card. Little Rock, AR: Author.

California Department of Education (1997). Public school summary statistics [On-line]. Available: World Wide Web: http://www.cde.ca.gov/.

Colorado Department of Education (1997, October). Foundations for high achievement: Safety, civility, literacy. K-12 public education. Denver, CO: Author.

Connecticut State Board of Education (1997b, February). Profile of our schools: Condition of education in Connecticut 1995-96. Hartford, CT: Author.

Delaware Department of Education (1997, October). State summary report: 1997 Delaware writing assessment program. Dover, DE: Author.

District of Columbia Office of Educational Accountability. (1996, December). A five year statistical glance at DC public schools: School years 1991-92 through 1995-96. Washington, DC: Author.

Florida Department of Education (1996a, December). School advisory council report 1995-96: Sample report: Elementary school. Tallahassee, FL: Author.

Georgia State Department of Education (1996b, June). Georgia student assessment program. Official state summary. Georgia high school graduation tests, Spring 1996. Atlanta, GA: Author.

Georgia State Department of Education (1995, June). Georgia Student Assessment Program Official State Summary: Georgia High School Graduation Tests, Spring 1995. Atlanta, GA: Author.

Hawaii Department of Education (1997, May). The superintendent’s seventh annual report on school performance and improvement in Hawaii: 1996. Honolulu, HI: Author.

Idaho State Department of Education (1995). 1994-95 Idaho school profiles. Boise, ID: Author.

Illinois State Board of Education (1996a). IGAP (Illinois goal assessment program): Summary of student achievement in Illinois 1993-1996. Springfield, IL: Author.

Illinois State Board of Education (1996c). 1996 sample school report card, high school. Springfield, IL: Author.

Indiana Department of Education (1997). Indiana K-12 school data [On-line]. Available: World Wide Web: http://www.doe.state.in.us/.

Iowa Department of Education (1997). The annual condition of education report, 1997: A report on pre-kindergarten, elementary, and secondary education. Des Moines, IA: Author.

Kansas State Board of Education (1996). Accountability report 1995-96. Topeka, KS: Author.

Kentucky Department of Education (1997, September). Kentucky school and district corrected accountability results. Louisville, KY: Author.

Kentucky Department of Education (1996). KIRIS 1995-96 assessment curriculum report. Louisville, KY: Author.

Louisiana Department of Education (1997a, March). 1995-96 Louisiana progress profiles state report. Baton Rouge, LA: Author.

Louisiana Department of Education (1997c, April). 1995-96 Louisiana progress profiles district composite report [CD-ROM]. Baton Rouge, LA: Author.

Maine Department of Education (1997). Maine Education Assessment (MEA) scores, 1996-97 school year [On-line]. Available: World Wide Web: http://www.state.me.us/education/homepage.htm.

Maryland State Department of Education (1996, December). Maryland school performance report, 1996: State and school systems. Baltimore, MD: Author.

Massachusetts State Department of Education (1996, October). Massachusetts Educational Assessment Program: 1996 statewide summary. (No full citation available.)

Michigan Department of Education (1997). Michigan school report [On-line]. Available: World Wide Web: http://www.med.state.mi.us/.

Mississippi Department of Education (1995). Mississippi report card, 1994-95.

Mississippi Department of Education, Department of Accountability (1996). Mississippi report card [On-line]. Available: World Wide Web: http://mdek12state.ms.us/report/acc.thm.

Missouri State Board of Education (1996a). 1995-96 report of the public schools of Missouri. Jefferson City, MO: Author.

Missouri State Department of Elementary and Secondary Education (1996b, February). Profiles of Missouri public schools: Financial, pupil & staff data for fiscal year 1995-96. Jefferson City, MO: Author.

Montana Office of Public Instruction (1997, April). Montana statewide summary, 1995-96 Student Assessment Rule 10.56.101, ARM. Helena, MT: Author.

Montana Office of Public Instruction (1997). Montana public school enrollment data: Fall 1996-97. Helena, MT: Author.

Nebraska Department of Education (1997). Statistics and facts about Nebraska public schools. Lincoln, NE: Author.

Nevada Department of Education (1995, February). Analysis of Nevada school accountability system (based on NRS 385.347). Submitted to Nevada State Legislature. NV: Author.

New Hampshire Department of Education (1996a). Educational Improvement and Assessment Program: Educational assessment report: End-of-grade six. Concord, NH: Author.

New Hampshire Department of Education (1996c). Educational Improvement and Assessment Program: Educational assessment report: End-of-grade three. Concord, NH: Author.

New Jersey State Department of Education (1995a). April 1995 Grade 11 High School Proficiency Test (HSPT11): State Summary. Trenton, NJ: Author.

New Jersey State Department of Education (1995b). March 1995 Grade 8 Early Warning Test (EWT): State summary. Trenton, NJ: Author.

New Mexico State Department of Education (1996, November). The accountability report: Indicators of the condition of public education in New Mexico. Santa Fe, NM: Author.

New York State Education Department (1996). Comprehensive assessment report (a resource document). Albany, NY: Author.

North Carolina State Board of Education (1996). Report card 1996: The state of school systems in North Carolina. Raleigh, NC: Author.

North Carolina State Board of Education (1997b). North Carolina Open-Ended Assessment Grades 5 and 6. Raleigh, NC: Author.

North Carolina State Board of Education (1997c, July). North Carolina testing program: Report of student performance in writing 1996-1997, grades 4, 7, and 10. Raleigh, NC: Author.

Ohio Department of Education (1997). Education Management Information System (EMIS) state profile: FY95 vs. FY96. Accessed through World Wide Web: www.ode.ohio.gov (choose “Reports/Data,” then “EMIS Profiles,” then “State Composite Profiles 1996”; information cited is on page 5). Direct address for the 1996 file: http://ode000.ode.ohio.gov/www/ims/emis_profiles/state_profile96.txt.

Oklahoma State Department of Education (1997a, June). Profiles 1996 Oklahoma educational indicators program: District report volume II - eastern Oklahoma. OK: Author.

Oklahoma State Department of Education (1997b, June). Profiles 1996 Oklahoma educational indicators program: District report volume II - western Oklahoma. OK: Author.

Oklahoma State Department of Education (1997c, April). Profiles 1996 Oklahoma educational indicators program: State report. OK: Author.

Oregon Department of Education (1997a, September). 1996-97 Oregon report card: An annual report to the legislature on Oregon public schools. Salem, OR: Author.

Pennsylvania Department of Education (1997a, June). The 1995-96 Pennsylvania school profiles [On-line]. Available: World Wide Web: http://www.paprofiles.org/.

Pennsylvania Department of Education (1997b, March). The Pennsylvania system of school assessment: 1995-96 School-by-school results quartile distribution. Harrisburg, PA: Author.

Pennsylvania Department of Education (1996, September). Status report on education in Pennsylvania. Harrisburg, PA: Author.

Rhode Island Department of Elementary and Secondary Education (1997a, June). Rhode Island public schools: 1996 District profiles. Providence, RI: Author.

Rhode Island Department of Elementary and Secondary Education (1997c, May). Student performance in Rhode Island, 1996. Providence, RI: Author.

South Carolina Department of Education (1996a). School performance profile. Columbia, SC: Author.

South Carolina Department of Education (1996b). South Carolina education profiles. Columbia, SC: Author.

South Carolina Department of Education (1996c). State performance profile. Columbia, SC: Author.

South Dakota Department of Education & Cultural Affairs (1997, January). 1995-96 Education in South Dakota: A statistical profile. SD: Author.

South Dakota Department of Education & Cultural Affairs (1997). 1996-97 Education in South Dakota: A statistical profile. SD: Author.

Texas Education Agency (1996, November). Academic Excellence Indicator System for the state of Texas 1994-95 [On-line]. Available: World Wide Web: http://www.tea.state.tx.us.

Texas Education Agency (n.d.). Academic Excellence Indicator System for the state of Texas 1995-96 [On-line]. Available: World Wide Web: http://www.tea.state.tx.us.

The University of the State of New York, the New York State Education Department, & the Office of Vocational and Educational Services for Individuals with Disabilities (1996a). Consolidated special educational performance report for 1994-1995: State grant program, section 611 IDEA; preschool grant program, section 619 IDEA; state operated programs, chapter 1 ESEA. Albany, NY: Author.

The University of the State of New York & the New York State Education Department (1997a). A report to the Governor and the Legislature on the educational status of the State’s schools, submitted February 1997: Statewide profile of the educational system. Albany, NY: Author.

The University of the State of New York & the New York State Education Department (1997b). A report to the Governor and the Legislature on the educational status of the State’s schools, submitted February 1997: Statistical profiles of public school districts. Albany, NY: Author.

Utah State Office of Education (1996b). 1995-96 State superintendent of public instruction: Annual report. Salt Lake City, UT: Author.

Utah State Office of Education (1997, January). 1995-1996 Utah Statewide Testing Program: Appendix to the Annual Report of the State Superintendent of Public Instruction. Salt Lake City, UT: Author.

Vermont Department of Education & Center for Rural Studies (1997). Vermont department of education school reports [On-line]. Available: World Wide Web: http://crs.uvm.edu/schlrpt.

Virginia Department of Education (1997b). 1997 Virginia summary report: Outcome accountability project. Richmond, VA: Author.

Virginia Department of Education (1995, July). Report of the Virginia Literacy Testing Program, Spring 1995. Richmond, VA: Author.

Washington Office of Superintendent of Public Instruction (1997, February). Washington state assessment program: Grades 4, 8, and 11. WA: Author.

Wisconsin Department of Public Instruction (1996, February). Wisconsin Student Assessment System 1995-96 8th grade Knowledge & Concepts Examinations: Average grand composite score results for districts and schools within districts. WI: Author.


Appendix B

Summary of Reports Reviewed and Disaggregated LEP Data

For technical reasons, these report summaries and disaggregated data cannot be provided on-line. To order the printed version of this report, which includes Appendix B, see the Publications Catalog for ordering information.