2016 Survey of States: State Activities Amid Evolving Educational Policies

Martha Thurlow, Christopher Rogers, & Sheryl Lazarus

September 2017

All rights reserved. Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Thurlow, M. L., Rogers, C., & Lazarus, S. S. (2017). 2016 survey of states: State activities amid evolving educational policies. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.



The Mission of the National Center on Educational Outcomes

NCEO Staff

Deb Albus
Linda Goldstone
Sheryl Lazarus
Kristi Liu
Michael Moore
Darrell Peterson
Christopher Rogers
Kathy Strunk
Yi-Chen Wu


Martha Thurlow,
Director

 

IDEAs that Work

NCEO is a collaborative effort of the University of Minnesota, the National Association of State Directors of Special Education (NASDSE), and the Council of Chief State School Officers (CCSSO). NCEO provides national leadership in assisting state and local education agencies in their development of policies and practices that encourage and support the participation of students with disabilities, English learners, and English learners with disabilities in accountability systems and data collection efforts.

NCEO focuses its efforts in the following areas:

  • Knowledge Development on the participation and performance of students with disabilities in state and national assessments and other educational reform efforts.
  • Technical Assistance and Dissemination through publications, presentations, technical assistance, and other networking activities.
  • Leadership and Coordination to build on the expertise of others and to develop leaders who can conduct needed research and provide additional technical assistance.

The Center is supported through a Cooperative Agreement (#H326G110002) with the Research to Practice Division, Office of Special Education Programs, U.S. Department of Education. The Center is affiliated with the Institute on Community Integration at the College of Education and Human Development, University of Minnesota. The contents of this report were developed under the Cooperative Agreement from the U.S. Department of Education, but do not necessarily represent the policy or opinions of the U.S. Department of Education or Offices within it. Readers should not assume endorsement by the federal government.

Project Officer: David Egnor

National Center on Educational Outcomes
207 Pattee Hall
150 Pillsbury Dr. SE
Minneapolis, MN 55455
612/626-1530 • Fax: 612/624-0879 • http://nceo.info

The University of Minnesota is an equal opportunity educator and employer.


Acknowledgments

State Directors of Special Education and State Directors of Assessment and their designees have responded to NCEO surveys over the years to provide a snapshot of their activities, successes, and challenges since the early 1990s. This report, which would not be possible without the support of these individuals, provides a status update on states’ success and challenges, their responses to new educational policies, and their technical assistance needs. We truly appreciate the time taken by respondents to obtain information from other areas or departments, and we hope that this collaborative effort provided an opportunity to increase awareness within and across state programs and departments.

For their support, special thanks go to:

  • David Egnor, Office of Special Education Programs (OSEP), in the U.S. Department of Education
  • Eileen Ahearn, retired, National Association of State Directors of Special Education (NASDSE)
  • June De Leon, University of Guam, for her assistance in obtaining completed surveys from the Pacific unique states
  • Vitaliy Shyyan, National Center on Educational Outcomes
  • Michael Moore, National Center on Educational Outcomes

State Directors of Special Education

ALABAMA
Crystal Richardson

ALASKA
Don Enoch

ARIZONA
Karol Basel

ARKANSAS
Lisa Haley

CALIFORNIA
Kristen Wright

COLORADO
Angela Denning

CONNECTICUT
Isabelina Rodriguez

DELAWARE
Mary Ann Mieczkowski

FLORIDA
Monica Verra-Tirado

GEORGIA
Zelphine Smith-Dixon

HAWAII
Carey Tambio

IDAHO
Charlie Silva

ILLINOIS
Kate Anderson Foley

INDIANA
Pam Wright

IOWA
Barbara Guy

KANSAS
Colleen Riley

KENTUCKY
Gretta Hylton

LOUISIANA
Jamie Wong

MAINE
Janice Breton

MARYLAND
Marcella Franczkowski

MASSACHUSETTS
Marcia Mittnacht

MICHIGAN
Teri Chapman

MINNESOTA
Robyn Widley

MISSISSIPPI
Gretchen Cagle

MISSOURI
Stephen Barr

MONTANA
Frank Podobnik

NEBRASKA
Steve Milliken

NEVADA
Will Jensen

NEW HAMPSHIRE
Santina Thibedeau

NEW JERSEY
John Worthington (interim)

NEW MEXICO
Denise Koscielniak (interim)

NEW YORK
Pat Geary

NORTH CAROLINA
Bill Hussey

NORTH DAKOTA
Gerry Teevens

OHIO
Sue Zake

OKLAHOMA
Todd Loftin

OREGON
Sarah Drinkwater

PENNSYLVANIA
Pat Hozella

RHODE ISLAND
David Sienko

SOUTH CAROLINA
John Payne

SOUTH DAKOTA
Linda Turner

TENNESSEE
Allison Davey (interim)

TEXAS
Gene Lenz

UTAH
Glenna Gallo

VERMONT
Cindy Moran

VIRGINIA
John Eisenberg

WASHINGTON
Doug Gill

WEST VIRGINIA
Pat Homberg

WISCONSIN
Barbara Van Haren

WYOMING
Anne-Marie Williams

AMERICAN SAMOA
Paulo Salave’a (interim)

BUREAU OF INDIAN EDUCATION
Gloria Yepa

DEPARTMENT OF DEFENSE
David Johansen

DISTRICT OF COLUMBIA
Amy Maisterra

GUAM
Yolanda Gabriel

MARSHALL ISLANDS
Frank Horiuchi

MICRONESIA
Arthur Albert

NORTHERN MARIANA ISLANDS
Suzanne Lizama

PALAU
Helen Sengebau

PUERTO RICO
Carlos Rodriguez

U.S. VIRGIN ISLANDS
Renee Charleswell

These were the state directors of special education in September 2016, when NCEO administered the survey.


State Directors of Assessment

ALABAMA
Rebecca Mims

ALASKA
Margaret MacKinnon

ARIZONA
Irene Hunting

ARKANSAS
Hope Allen

CALIFORNIA
Michelle Center

COLORADO
Joyce Zurkowski

CONNECTICUT
Abe Krisst

DELAWARE
Theresa Bennett

FLORIDA
Vince Verges

GEORGIA
Melissa Fincher

HAWAII
Brian Reiter

IDAHO
Heidi Arrate

ILLINOIS
Angela Foxall

INDIANA
Charity Flores

IOWA
Colleen Anderson

KANSAS
Beth Fultz

KENTUCKY
Rhonda Sims

LOUISIANA
Jessica Baghian

MAINE
Charlene Tucker

MARYLAND
Doug Strader

MASSACHUSETTS
Michol Stapel

MICHIGAN
Andrew Middlestead

MINNESOTA
Jennifer Dugan

MISSISSIPPI
Walt Drane

MISSOURI
Shaun Bates

MONTANA
Sue Mohr

NEBRASKA
Valorie Foy

NEVADA
Peter Zutz

NEW HAMPSHIRE
Sandie MacDonald

NEW JERSEY
Jeffrey Hauger

NEW MEXICO
Lisa Chandler

NEW YORK
Steven Katz

NORTH CAROLINA
Lou Fabrizio

NORTH DAKOTA
Robert G. Bauer


OHIO
Jim Wright

OKLAHOMA
Craig Walker

OREGON
Mary Anderson

PENNSYLVANIA
Brian Campbell

RHODE ISLAND
Phyllis Lynch

SOUTH CAROLINA
Liz Jones

SOUTH DAKOTA
Abby Javurek-Humig

TENNESSEE
Deb Malone Sauberer

TEXAS
Gloria Zyskowski

UTAH
Jo Ellen Shaeffer

VERMONT
Michael Hock

VIRGINIA
Shelley Loving-Ryder

WASHINGTON
Deb Came

WEST VIRGINIA
Vaughn Rhudy

WISCONSIN
Lynette Russell

WYOMING
Deb Lindsey


AMERICAN SAMOA
Sam Urhle

BUREAU OF INDIAN EDUCATION
Maureen Lesky

DEPARTMENT OF DEFENSE
Sandy Embler

DISTRICT OF COLUMBIA
Nikki Stewart

GUAM
Robert Malay

MARSHALL ISLANDS
Stanley Heine

MICRONESIA
Miyai M. Keller

PALAU
Raynold Mechol

PUERTO RICO
Angel Canales

U.S. VIRGIN ISLANDS
Alexandria Baltimore-Hookfin

These were the state directors of assessment in September 2016, when NCEO administered the survey.


Executive Summary

This report summarizes the fifteenth survey of states by the National Center on Educational Outcomes (NCEO) at the University of Minnesota. Results are presented for 40 of the 50 regular states and eight of the 11 unique states. The purpose of this report is to provide a snapshot of states’ new initiatives, trends, accomplishments, and emerging issues during a period of new education laws and policies.

Key findings include:

  • Most responding states identified the validity of assessment results as a success and assessing English learners for accountability as a challenge.
  • Most responding states were concerned about possibly exceeding the one percent cap on participation in the alternate assessment based on alternate academic achievement standards (AA-AAS). The strategies most often used to avoid exceeding the cap were providing professional development for Individualized Education Program (IEP) teams and sharing AA-AAS participation data with districts.
  • Only a small number of responding states were planning to develop a state-defined alternate diploma.
  • Nearly all responding states were measuring college- and career-readiness, sometimes using college entrance exams for the state high school assessment.
  • Few responding states publicly reported assessment results disaggregated by disability category even though they did disaggregate by category to examine trends.
  • Most responding states reported that they used direct observation on test day to monitor the provision of accessibility features and accommodations; states also reported being challenged by either training educators to make decisions about accessibility features and accommodations (regular states) or arranging for accessibility features and accommodations (unique states).
  • More than half of responding regular states had made major revisions to their AA-AAS since 2014.
  • Just over one-half of responding regular states indicated that they disaggregated assessment results for English learners with disabilities.

States were continuing to address the need for inclusive assessments while facing new requirements for assessments and accountability systems. States also identified key areas of need for technical assistance to facilitate the successful implementation of inclusive assessments.


Overview of 2016 Survey of States

This report highlights the fifteenth survey of states by the National Center on Educational Outcomes (NCEO). The survey has been conducted for more than two decades to collect information from states about the participation and performance of students with disabilities in assessments during standards-based reform.

As in the past, NCEO asked state directors of special education and state directors of assessment to agree on their responses to the 2016 survey. In compiling their responses, the directors sometimes enlisted the assistance of other individuals in the department who had the best current knowledge of the state’s thinking, policies, and practices for including students with disabilities, and other students, in assessment systems and other aspects of educational reform. In many states, people collaborated on completing NCEO’s 2016 Survey of States.

Forty of the 50 regular states responded to the survey. In addition, eight of 11 unique states completed the survey in 2016. Most survey responses were submitted using an online survey tool. In at least one instance, Word or PDF files were provided to respondents who wished to complete the survey that way.

Survey respondents reported on trends in the large-scale assessment of students with disabilities and other groups of students. Topics addressed assessment participation, assessment performance, use of accessibility tools and accommodations, alternate assessments, and other related topics.


Eleven Unique States

American Samoa
Bureau of Indian Education
Department of Defense
District of Columbia
Guam
Marshall Islands
Micronesia
Northern Mariana Islands
Palau
Puerto Rico
U.S. Virgin Islands


Successful Practices and Recurring Challenges

For several assessment topics, state respondents were asked to indicate whether states had developed successful practices or faced recurring challenges. Respondents rated each item as very challenging, challenging, successful, or very successful. Most regular states reported that validity of general assessment results and validity of English language proficiency (ELP) assessment results were areas of success, and many regular states reported success with assessment accessibility and accommodations (see Table 1). More regular states reported that assessing English learners (ELs) with disabilities for accountability purposes was challenging than reported success in this area; some states found it very challenging.

Unique states were mixed about the areas they experienced as successful or challenging. Still, nearly all unique states found both instructional accessibility and accommodations, and use of assistive technology for assessment activities, to be challenging. Most unique states, and many regular states, indicated that inclusion of students with disabilities in graduation tests was not applicable to them.

Table 1. Successful Practices and Recurring Challenges

Counts are shown in the order Very Challenging / Challenging / Successful / Very Successful / N/A, first for regular states and then for unique states.

Assessment accessibility and accommodations: Regular states 0 / 9 / 19 / 8 / 0; Unique states 0 / 3 / 4 / 0 / 0
Instructional accessibility and accommodations: Regular states 1 / 13 / 18 / 1 / 3; Unique states 0 / 6 / 1 / 0 / 0
Validity of general assessment results: Regular states 0 / 3 / 24 / 9 / 0; Unique states 0 / 3 / 3 / 1 / 0
Validity of English language proficiency (ELP) assessment results: Regular states 0 / 5 / 20 / 10 / 1; Unique states 0 / 2 / 1 / 1 / 3
Validity of alternate assessment results: Regular states 0 / 12 / 20 / 4 / 0; Unique states 0 / 3 / 3 / 1 / 0
Inclusion of students with disabilities in formative assessments: Regular states 1 / 8 / 9 / 4 / 14; Unique states 0 / 3 / 3 / 0 / 1
Assessment of English learners (ELs) with disabilities for accountability purposes: Regular states 5 / 16 / 12 / 3 / 0; Unique states 0 / 4 / 3 / 0 / 0
Inclusion of ELs with disabilities in ELP assessment: Regular states 3 / 9 / 20 / 4 / 0; Unique states 0 / 2 / 2 / 0 / 3
Inclusion of students with disabilities in graduation tests: Regular states 3 / 7 / 5 / 2 / 19; Unique states 0 / 1 / 1 / 0 / 5
Use of assistive technology for assessment activities: Regular states 1 / 18 / 14 / 3 / 0; Unique states 1 / 5 / 1 / 0 / 0

Note: Thirty-six regular states and seven unique states responded to this survey question. State respondents were able to select multiple responses.


Every Student Succeeds Act (ESSA)

The Every Student Succeeds Act (ESSA) was signed into law on December 10, 2015. Two provisions of the law were addressed in the survey: the one percent participation cap for the alternate assessment based on alternate academic achievement standards (AA-AAS) and state-defined alternate diplomas. We also asked about perspectives on the opportunities and challenges of ESSA for students with disabilities, ELs, and ELs with disabilities. Over sixty percent of respondents in regular and unique states indicated that they were concerned about the possibility that the one percent cap would be exceeded (see Table 2).

Table 2: States Concerned about Exceeding the 1% Cap on the Alternate Assessment

Regular States Unique States
Response Count Percent Count Percent
Yes 25 62.5% 4 66.7%
No 15 37.5% 2 33.3%

Note: Forty regular states and six unique states answered this question.
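To make the cap arithmetic concrete, the following minimal sketch uses hypothetical counts (not drawn from the survey) to check whether AA-AAS participation in a subject exceeds one percent of all students assessed in that subject; the district names and numbers are illustrative assumptions only.

```python
# Minimal sketch with hypothetical counts: checking the ESSA 1% cap on AA-AAS
# participation. Under ESSA, the number of students taking the AA-AAS in a
# subject may not exceed 1.0% of all students assessed in that subject statewide.

def participation_rate(aa_aas_count: int, total_assessed: int) -> float:
    """AA-AAS participants as a fraction of all students assessed in the subject."""
    return aa_aas_count / total_assessed

def exceeds_cap(aa_aas_count: int, total_assessed: int, cap: float = 0.01) -> bool:
    """True if the AA-AAS participation rate is above the cap (1% by default)."""
    return participation_rate(aa_aas_count, total_assessed) > cap

# Hypothetical district-level counts for one subject.
districts = {
    "District A": (52, 4_800),   # about 1.08% -- above the cap
    "District B": (9, 1_250),    # 0.72% -- within the cap
}

for name, (aa_aas, total) in districts.items():
    rate = participation_rate(aa_aas, total)
    status = "exceeds" if exceeds_cap(aa_aas, total) else "within"
    print(f"{name}: {rate:.2%} ({status} the 1% cap)")
```

Sharing this kind of district-level participation data was one of the strategies states most often reported using to avoid exceeding the cap (see Figure 1).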

Forty-six states (regular and unique) reported planning to use one or more strategies to ensure that districts do not exceed the one percent cap. Two strategies were identified by most regular states: providing professional development to district special education administrators for communicating to IEP teams, and sharing data with districts about the participation rates of students with significant cognitive disabilities in alternate assessments. Much less common among regular states were revising participation guidelines and providing information to parents of students who had participated in the alternate assessment. At least half of the unique states indicated the strategies of revising participation guidelines and providing information to parents; professional development and data sharing were much less commonly identified by unique states (see Figure 1).

State-defined Alternate Diplomas

ESSA allows states to develop a state-defined alternate diploma that can be counted in the graduation rate used for school accountability. When asked whether the state had a state-defined alternate diploma, states indicated they were involved in a variety of activities related to the development of alternate diplomas (see Figure 2). About two-thirds of the regular states and nearly all of the unique states indicated that they were not planning to develop a state-defined alternate diploma.

The regular states that were developing alternate diplomas indicated that they were in various stages of development. Six states indicated they had already developed a state-defined alternate diploma, with five of these indicating that their diplomas met the ESSA requirements. One unique state indicated that it had developed an alternate diploma but was uncertain whether it met ESSA requirements.

Figure 1: Strategies States Plan to Use to Ensure that Districts Do Not Exceed the 1% Cap

Figure 1 Bar Graph

Note: Thirty-nine regular states and seven unique states answered this question, out of the 40 regular states and eight unique states participating in the survey. Two regular states and one unique state responded “other”; these regular states indicated that they were still determining their strategies, and the unique state indicated that this question was not applicable. State respondents were able to select multiple responses.

Figure 2. Status of State-Defined Alternate Diplomas for Students with Significant Cognitive Disabilities

Figure 2 Bar Graph

Note: Thirty-four regular states and seven unique states answered this question out of the 40 regular states and eight unique states participating in the survey.
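Because ESSA allows a state-defined alternate diploma to be counted in the graduation rate used for accountability, the brief sketch below (hypothetical cohort counts, not survey data) illustrates how including alternate diploma recipients changes a four-year graduation rate; actual ESSA adjusted-cohort calculations involve additional adjustments not shown here.

```python
# Hypothetical sketch: effect of counting state-defined alternate diploma
# recipients as graduates. All counts are illustrative; real ESSA adjusted-cohort
# graduation rates involve additional cohort adjustments (transfers, etc.).
cohort_size = 500
regular_diplomas = 420
alternate_diplomas = 6   # students with the most significant cognitive disabilities

rate_without = regular_diplomas / cohort_size
rate_with = (regular_diplomas + alternate_diplomas) / cohort_size

print(f"Regular diplomas only:        {rate_without:.1%}")  # 84.0%
print(f"Including alternate diplomas: {rate_with:.1%}")     # 85.2%
```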

Most of the regular states and the one unique state reported that their alternate diplomas met only one ESSA requirement. Only one regular state indicated that its alternate diploma met all three requirements (see Table 3).

Table 3: Number of ESSA Requirements Addressed by State-Defined Alternate Diplomas

Regular States Unique States
Response Count Percent Count Percent
1 requirement 9 69.2% 0 0.0%
2 requirements 3 23.1% 0 0.0%
All 3 requirements 1 7.7% 0 0.0%

Note: Thirteen regular states answered this question. The response of one unique state that indicated it did not plan to develop an alternate diploma is not included in this table.

The ESSA requirement met by the greatest number of states (eight regular states and one unique state) was alignment to requirements for a regular diploma (see Figure 3). Seven regular states had alternate diplomas that met the requirement that they were received in the period of FAPE. Four regular states indicated that their alternate diplomas were standards-based.

Figure 3. ESSA Requirements Currently Addressed by State-Defined Alternate Diplomas

Figure 3 Bar Chart

Note: Thirteen regular states and one unique state answered this question. State respondents were able to select multiple
responses.

Of the states reporting alternate diplomas meeting only one requirement, five had diplomas aligned to requirements for regular diplomas, three were received in the period of FAPE, and one was standards-based. Of the regular states reporting that their alternate diplomas met two requirements, two indicated that they were standards-based and awarded within the period of FAPE; one was aligned to requirements for a regular diploma and received in the period of FAPE.

States identified the biggest opportunities and challenges for students with disabilities, ELs, and ELs with disabilities resulting from ESSA. The full list of opportunities and challenges identified by states is included in Appendix A.

Opportunities noted for students with disabilities focused on inclusion in general, and on maintaining high expectations for learning and positive outcomes. Some states also noted ESSA’s emphasis on measuring growth.

Opportunities noted for ELs included increased visibility in the accountability system, attention to EL programs, and emphasis on monitoring and supporting ELs’ progress. One state indicated improvement in exiting criteria. Opportunities for ELs with disabilities echoed those noted for the other groups, with the addition of assessment supports and other improvements in assessment providing more accurate understanding of this group of students and their outcomes. (See Table A-1 in Appendix A.)

Challenges associated with ESSA for students with disabilities included the “1% cap rule” and compliance and implementation concerns. For ELs, states identified challenges in improving appropriate supports and assessments; some states also noted the expanding number of native languages for this student population. Challenges for ELs with disabilities included identification of these students, and disaggregating and interpreting their outcomes data. The alternate diploma guidelines were identified as both an opportunity and a challenge, in that they encourage students with significant cognitive disabilities to earn high school credentials, yet also challenge states’ capacity to set equitable exiting criteria. (See Table A-2 in Appendix A.)


College and Career Readiness

States indicated the assessments they used for measuring college- and career-readiness (CCR) (see Figure 4). Among the consortium-developed assessments, 12 regular states and one unique state used the PARCC assessment, seven regular states and three unique states used the NCSC/MSAA, and six regular states and no unique states used the DLM assessment. Of the 21 regular states and three unique states using at least one consortium assessment, four regular states and one unique state used a combination of Smarter Balanced and NCSC/MSAA, three regular states and one unique state used PARCC and NCSC/MSAA, two regular states used PARCC and DLM, and two regular states used Smarter Balanced and DLM.

Figure 4. High School Assessments of College- and Career-Readiness Used by States

Figure 4 Bar Chart

Note: Forty regular states and seven unique states answered this question. State respondents were able to select multiple responses.

States also reported on their use of ACT and SAT for their high school assessments. Twenty regular states and one unique state reported that they used ACT, and 10 regular states used SAT. Of these 24 regular states and one unique state, seven regular states used both the ACT and SAT. Three regular states and three unique states reported not using any CCR assessments.

States providing additional comments about high school level testing of college and career readiness (21 regular states and two unique states) often indicated that their own assessments measured college and career readiness (see Table 4). Several ACT products were used: four regular states used WorkKeys; two regular states used ACT Aspire; and one regular state used the National Career Readiness Certificate (NCRC). Two unique states reported that students took entrance or placement tests used by their institutions of higher education. One regular state indicated that its CCR measurement is “to be determined.”

Table 4. Othera High School CCR Assessments Reported

Self-Reported Other Responsesb | Count of Regular States | Percentc of Regular States | Count of Unique States | Percentc of Unique States
State-developed testsd 16e 40.0% 1 14.3%
WorkKeys 4 10.0% 0 0.0%
ACT Aspire 2 5.0% 0 0.0%
ASVABf 1 2.5% 0 0.0%
NCRC 1 2.5% 0 0.0%
State-specific college entrance/placement exam 0 0.0% 2 28.6%
To be determined 1 2.5% 0 0.0%

a Twenty-one regular states and two unique states reported “Other” CCR assessments, of the 40 regular state respondents and seven unique state respondents who answered this question.
b These categories are not mutually exclusive; four regular states and one unique state reported more than one “other” assessment.
c These proportions are based on the total numbers of state respondents who answered this question.
d Two states reported that they have used state-developed tests but plan to change to ACT or SAT in 2017.
e One regular state respondent reported that the CCR test was a combination of ACT, SAT, and state-developed test items.
f Armed Services Vocational Aptitude Battery.

States commented on accessibility and accommodations considerations in designing and implementing their assessments (see Table 5). Of the 12 regular states responding, just over half indicated that the testing company that designed their college- and career-readiness high school assessment also developed the accessibility and accommodations policies. Two other approaches employed some form of partnership between the state and the testing company in this work.

Table 5. Statea Approaches to Providing Accessibility and Accommodations on CCR Assessments Not Developed by States or Consortia

If you use a CCR high school assessment that is not a state- or consortium-developed assessment, what has been your approach to providing assessment accessibility and accommodations? 
Answer Options Response Count Response Percent
Testing company also developed the policies for accessibility and accommodations. 7 58.3%
State and testing company worked together to set the policies for accessibility and accommodations. 2 16.7%
State developed the policies for accessibility and accommodations for state accountability purposes while testing company developed the accessibility and accommodations policies for college-entrance purposes. 2 16.7%
Another approach is used to develop accessibility and accommodations policies (please describe your approach). 1b 8.3%

a Twelve regular states and no unique states answered this question.
b The state using “another approach” indicated that its approach is “to be determined.”

States indicated the alignment of their grade 3-8 assessments to college- and career-ready high school assessments. Most regular states indicated that a cross-grade standard setting was conducted or that the assessment was based on a vertical scale (see Figure 5). Unique states most often indicated that they did not know or that they used a suite of assessments available across grades.

Figure 5. Alignment of Grade 3-8 Assessments to CCR High School Assessments

Figure 5 Bar Chart

Note. Twenty-nine regular states and seven unique states answered this question. State respondents were able to select multiple responses.


Participation and Performance

Including students with disabilities in assessment and accountability processes draws attention to how these students participate and perform on large-scale assessments.

Participation Reporting Practices

Participation reporting practices varied across states in both 2014 and 2016 (see Table 6). In 2016, more regular and unique states than in 2014 reported that students who did not participate in the assessment in any way were not counted as participants and received no score. Similarly, more regular states in 2016 than in 2014 reported that students who attended (sat for) the assessment but did not complete enough items to score were not counted as participants and received no score.

Table 6. Reporting Practices for Counting Students as Assessment Participants

Columns (count and percent): (A) NOT counted as participants, and received no score; (B) Counted as participants, but received no score, a score of zero, or the lowest proficiency level; (C) NOT counted as participants, and earned score counted as valid; (D) Counted as participants, and earned score counted as valid.

Students who did not participate in state assessments in any way (e.g., absent on test day, parent refusal)
  Regular states, 2016: A 32 (82.1%); B 7 (17.9%); C 0 (0.0%); D 0 (0.0%)
  Regular states, 2014: A 37 (75.5%); B 7 (14.3%); C 0 (0.0%); D 0 (0.0%)
  Unique states, 2016: A 6 (75.0%); B 2 (25.0%); C 0 (0.0%); D 0 (0.0%)
  Unique states, 2014: A 4 (50.0%); B 0 (0.0%); C 1 (12.5%); D 1 (12.5%)

Students who attended (sat for) assessment, but did not complete enough items to score
  Regular states, 2016: A 18 (46.2%); B 17 (43.6%); C 1 (2.6%); D 3 (7.7%)
  Regular states, 2014: A 12 (24.5%); B 23 (46.9%); C 0 (0.0%); D 7 (14.3%)
  Unique states, 2016: A 1 (12.5%); B 5 (62.5%); C 0 (0.0%); D 2 (25.0%)
  Unique states, 2014: A 2 (25.0%); B 3 (37.5%); C 0 (0.0%); D 1 (12.5%)

Students who used accommodations resulting in invalid scores (e.g., non-standard, modifications)
  Regular states, 2016: A 14 (35.9%); B 17 (43.6%); C 2 (5.1%); D 1 (2.6%)
  Regular states, 2014: A 17 (34.4%); B 20 (40.8%); C 3 (6.1%); D 1 (2.0%)
  Unique states, 2016: A 1 (12.5%); B 4 (50.0%); C 0 (0.0%); D 1 (12.5%)
  Unique states, 2014: A 2 (25.0%); B 1 (12.5%); C 0 (0.0%); D 1 (12.5%)

Note. In 2014, 49 regular states and eight unique states answered this question, out of the 50 regular states and eight unique states participating in the survey. In 2016, 39 regular states and eight unique states answered this question, out of the 40 regular states and eight unique states participating in the survey.
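To illustrate how these counting rules can affect a reported participation rate, the sketch below uses hypothetical counts (not survey data) to compare two of the practices in Table 6: counting only students with valid scores versus also counting students who sat for the test but did not earn a score.

```python
# Hypothetical sketch: how two of the counting rules in Table 6 change a
# reported participation rate. All counts are illustrative.
total_enrolled = 1_000
valid_scores = 940            # tested and earned a valid score
sat_but_no_score = 25         # attended but did not complete enough items to score
did_not_participate = 35      # absent on test day, parent refusal, etc.

# Sanity check that the hypothetical categories cover the enrolled students.
assert valid_scores + sat_but_no_score + did_not_participate == total_enrolled

# Rule A: only students with a valid score count as participants.
rate_a = valid_scores / total_enrolled

# Rule B: students who sat for the test also count as participants,
# even though they receive no score (or the lowest proficiency level).
rate_b = (valid_scores + sat_but_no_score) / total_enrolled

print(f"Rule A participation rate: {rate_a:.1%}")  # 94.0%
print(f"Rule B participation rate: {rate_b:.1%}")  # 96.5%
```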

Reporting Practices for Students by Disability Category

Twenty-two of 38 responding regular states (58%) reported disaggregating assessment results by primary disability category in 2016, a decrease from the 31 of 49 states (63%) in 2014 and the 28 of 49 states (57%) in 2012, but still an increase from the 10 of 49 states (20%) in 2009 and the 17 of 50 states (34%) in 2007. The most frequently listed reason states gave for disaggregating results by disability category in 2016 (see Figure 6) was to examine trends; in contrast to previous years, few states indicated that results were disaggregated for reporting purposes. More often, states indicated that they did so to respond to requests.

Figure 6. Reasons for Reporting General Assessment Results by Disability Category for Regular Statesa,b,c

Figure 6 Bar Chart

a Eighteen regular states reported not disaggregating results by primary disability in both 2016 and 2014.
b In 2016, 38 regular states answered this question; in 2014, 49 regular states answered this question; in 2012, 49 regular states answered this question; in 2009, 49 regular states answered this question; in 2007, 50 regular states answered this question.
c State respondents were able to select multiple responses.
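As a simple illustration of the kind of disaggregation states described, the sketch below computes proficiency rates by primary disability category from hypothetical student-level records; the column names and values are assumptions, not survey data.

```python
# Hypothetical sketch: disaggregating general assessment results by primary
# disability category. Records and column names are illustrative only.
import pandas as pd

records = pd.DataFrame(
    {
        "student_id": ["a1", "a2", "a3", "a4", "a5", "a6"],
        "primary_disability": [
            "Specific Learning Disability", "Autism", "Specific Learning Disability",
            "Speech/Language Impairment", "Autism", "Other Health Impairment",
        ],
        "proficient": [True, False, True, True, False, True],
    }
)

# Student count and proficiency rate for each primary disability category.
by_category = (
    records.groupby("primary_disability")["proficient"]
    .agg(n_students="count", pct_proficient="mean")
)
by_category["pct_proficient"] = (100 * by_category["pct_proficient"]).round(1)
print(by_category)
```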

In 2016, 4 of 8 responding unique states (50%) reported disaggregating results by primary disability. This is a small increase from previous years. The most frequently listed reason for disaggregating results by disability category in 2016 (see Figure 7) was for reporting purposes.

Figure 7. Reasons for Reporting General Assessment Results by Disability Category for Unique Statesa,b,c

Figure 7 Bar Chart

a Four unique states reported not disaggregating results by primary disability in 2016 and three unique states reported not disaggregating results by primary disability in 2014.
b In 2016, four unique states reported disaggregating data; in 2014, five unique states reported disaggregating data; in 2012, two unique states reported disaggregating data; in 2009, one unique state reported disaggregating data; in 2007, no unique states reported disaggregating data.
c State respondents were able to select multiple responses.

States also reported on their practices for disaggregating results by primary disability for students participating in the state’s alternate assessment. The primary reason for doing so among regular states was to examine trends, while the primary reason among unique states was public reporting (see Figure 8).

Figure 8. Reasons for Reporting Alternate Assessment Results by Disability Category for Regular and Unique States

Figure 8 Bar Chart

Note: Forty regular states and eight unique states answered this question. State respondents were able to select multiple responses.


Accessibility and Accommodations

States indicated the ways in which they monitored the provision and use of accessibility features and accommodations in 2016 (see Table 7). For regular states and unique states, the most frequent approach was to directly observe test administrations, including the provision of accessibility features and accommodations on test day. In regular states, the next most frequent approaches were conducting desk audits, completing online record reviews, and interviewing students, teachers, and administrators about accessibility features and accommodations. In unique states, the other approaches used were completing online record reviews, interviewing students, teachers, and administrators, and randomly sending teams into districts/schools to compare IEPs and 504 plans to what teachers say happens in class and during assessments.

Table 7. Ways of Monitoring Accessibility Features and Accommodations

Regular States Unique States
Ways of Monitoring Count Percent Count Percent
We do not monitor the provision of accessibility features. 6 18.1% 0 0.0%
We do not monitor the provision of accommodations. 4 12.1% 0 0.0%
We complete online record reviews. 12 36.4% 1 12.5%
We conduct desk audits. 13 39.4% 0 0.0%
We directly observe test administrations, including the provision of accessibility features and accommodations, on test day. 19 57.6% 8 100.0%
We interview students, teachers, and administrators about accessibility features and accommodations. 12 36.4% 1 12.5%
On a random basis, we send teams into districts/schools to compare IEPs and 504 Plans to what teachers say happens in class and during assessment. 8 24.2% 1 12.5%
On a scheduled basis, we send teams into districts/schools to compare IEPs and 504 Plans to what teachers say happens in class and during assessment. 8 24.2% 0 0.0%
On a targeted basis (using data on accessibility features and accommodations), we send teams into districts/schools to compare IEPs and 504 Plans to what teachers say happens in class and during assessment. 9 27.3% 0 0.0%
Other 10 30.3% 0 0.0%

Note: Thirty-three regular states and eight unique states answered this question. State respondents were able to select multiple responses.

States communicated information about accessibility features and accommodations to districts, schools, and teachers in several ways (see Figure 9). The most frequent approaches used by regular states included making information available on the website, providing webinars, conducting workshops, and providing written manuals or instructions to each district or school. For unique states, the most frequent approaches were providing written manuals or instructions to each district or school and conducting workshops.

Figure 9. Modes of Communicating Accessibility Features and Accommodations Information to Districts, Schools, and Teachers

Figure 9 Bar Chart

Note. Forty regular states and eight unique states answered this question. State respondents were able to select multiple responses.

Regular states continued to examine the validity of the interpretation of results when accessibility features and accommodations were used during the general assessment (see Figure 10). In 2016, 29 of 37 responding states (78%) indicated that they collected data; this was an increase from the 31 of 47 responding states (66%) in 2014. Dependence on reviewing research literature in 2016 (60%) reflected a slight increase from 2012 (57%) and 2014 (49%), but a decrease from 2009 (66%).

Figure 10. Ways That Regular States Examined Validity of Accessibility Features and Accommodations on General Assessmentsa,b

Figure 10 Bar Chart

a In 2016, 37 regular states answered this question; in 2014, 47 regular states answered this question; in 2012, 46 regular states answered this question; and in 2009, 50 regular states answered this question.
b State respondents were able to select multiple responses.

Unique states more often than regular states indicated that they had not examined the validity of the interpretation of results from the general assessment when accessibility features and accommodations were used (see Figure 11). In 2016, five of eight responding states (63%) indicated that they had not examined validity.

Figure 11. Ways That Unique States Examined Validity of Accessibility Features and Accommodations on General Assessmentsa,b

Figure 11 Bar Chart

a In 2016, eight unique states answered this question; in 2014, seven unique states answered this question; in 2012, six unique states answered this question; and in 2009, five unique states answered this question.
b State respondents were able to select multiple responses.

States also provided information on the ways they examined the validity of the interpretation of results from the alternate assessment (see Figure 12). Regular states more often reviewed research literature and collected data, while unique states more often either did not examine validity or collected data.

Figure 12. Ways That Regular and Unique States Examined Validity of Accessibility Features and Accommodations on the Alternate Assessment

Figure 12 Bar Chart

Note. Thirty-seven regular states and eight unique states answered this question. State respondents were able to select multiple responses.

A majority of responding regular states (67%) indicated that they provided different accessibility features and accommodations for their general and alternate assessments; in contrast, a majority of unique states indicated that they were the same (see Figure 13). The following are some of the ways in which states indicated that the accessibility features and accommodations differed for the two assessments:

  • There are additional allowable accommodations for the alternate assessment. The accommodations are built into the test administration procedures and do not require pre-identification for individual students, with the exception of the accommodation to provide a paper/pencil version for deaf and/or blind students.
  • Not all accommodations allowed on the general test are appropriate on the alternate assessment and vice versa.
  • The test design and layout allows for more technology use to be incorporated on the alternate assessment.
  • Students taking the alternate assessments are provided all embedded supports provided by the online system. They are also allowed to use any non-embedded supports and classroom supports needed as designated in their IEPs.
  • Certain accommodations are not available for the alternate assessment as they are already a part of the construct of that assessment.
  • Some supports, such as text-to-speech, are made available to all students taking the alternate assessment; this is different from text-to-speech being an accessibility feature for the general assessment.
  • Some accommodations, such as read aloud, are allowed for sections on the alternate reading assessment, but not on the general assessment.
  • The general assessment has more accessibility features and accommodations.

Figure 13. Different Accessibility Features and Accommodations Provided on General and Alternate Assessments

Figure 13 Bar Chart

Note. Thirty-six regular states and eight unique states answered this question.

States indicated whether they had a process in place for assigning accessibility features and accommodations prior to the administration of the assessment (see Figure 14). The majority of regular states (69%) indicated they had a process, while about half of the unique states did.

Figure 14. Process in Place for Assigning Accessibility Features and Accommodations Prior to the Assessment

Figure 14 Bar Chart

Note. Forty regular states and eight unique states answered this question.

Just over half of the regular states indicated that they were transitioning to Unified English Braille (UEB) by using braille assessments with both UEB and English Braille American Edition (EBAE) (see Figure 15). Half of the unique states indicated that they had not yet begun to plan for the transition.

Figure 15. Transitioning Assessments to Unified English Braille (UEB)

Figure 15 Bar Chart

Note. Forty regular states and eight unique states answered this question.

Accessibility Features and Accommodations Challenges

States noted the challenges associated with providing accessibility features and accommodations (see Figure 16). The most frequently noted challenges by regular states were training educators in making decisions about accessibility features and accommodations; arranging for trained readers, scribes, and sign language interpreters; and having test administrators know which students are to use accessibility features and accommodations. For unique states, the most frequently noted challenges were arranging for and checking on special equipment and having providers of accessibility features and accommodations available. Challenges other than those indicated in Figure 16 included:

  • Lack of available certified interpreters in some parts of the state
  • Making appropriate decisions about text-to-speech and read aloud for reading passages
  • Not completing the personal needs profile in time to have accommodations available on test day
  • Lack of personnel to support 1-to-1 administered accommodations
  • Platform challenges

Figure 16. Challenges in Provision of Accessibility Features and Accommodations

Figure 16 Bar Chart

Note. Thirty-eight regular states and eight unique states answered this question. State respondents were able to select multiple responses.


Alternate Assessments based on Alternate Achievement Standards (AA-AAS)

Most regular and unique states do not have end-of-course alternate assessments for students with the most significant cognitive disabilities (see Figure 17). Less than 18% of regular states and less than 15% of unique states had end-of-course alternate assessments, either for some or all courses.

Figure 17. End-of-Course AA-AAS for Students with the Most Significant Cognitive Disabilities

Figure 17 Bar Chart

Note. Forty regular states and eight unique states answered this question.

More than half of regular states and just over 40% of unique states had made major revisions to their AA-AAS since 2014. Considerably fewer regular and unique states (33% and 14%, respectively) were planning to develop a new or revised AA-AAS in the next two years (see Figure 18).

Figure 18. AA-AAS Changes

Figure 18 Bar Chart

Note. Thirty-nine regular states and seven unique states answered these questions.


English Learners with Disabilities

English learners (ELs) with disabilities are increasing in numbers across regular and unique states, and increased attention is being given to policies and practices for their participation in assessments.

Reporting Practices for English Learners with Disabilities

Fourteen of 39 (36%) responding regular states and one of the seven (14%) responding unique states indicated that they did not disaggregate assessment results for ELs with disabilities (see Figure 19). Regular and unique states that did disaggregate most often did so for the general assessment, followed by the English language proficiency (ELP) assessment and the alternate assessment (AA-AAS).

Figure 19. Reporting Practices for ELs with Disabilities

Figure 19 Bar Chart

Note. Thirty-nine regular states and seven unique states answered this question. State respondents were able to select multiple responses.

Just over 60% of regular states and all unique states indicated that the ELs with disabilities in their states participated in all four domains (reading, writing, speaking, listening) of the state ELP assessment. One-third of regular states reported that some ELs with disabilities were not included in some portions of the ELP assessment, and 5% indicated that some ELs with disabilities were not included in any portion of the ELP assessment (see Figure 20).

Figure 20. How States Included ELs with Disabilities in ELP Assessment Results

Figure 20 Bar Chart

Note. Thirty-nine regular states and five unique states answered this question; two unique states indicated that they do not administer ELP assessments, and one unique state skipped this question for the same reason. State respondents were able to select multiple responses.

Accessibility and Accommodations for English Learners with Disabilities

Accessibility features and accommodations were offered to ELs with disabilities by the majority of regular states, and under half of unique states (see Figure 21). Just over one-fourth of the regular and unique states indicated that accessibility features and accommodations were offered on some section of the ELP assessment. Some comments about accessibility features and accommodations on the ELP assessment were made by regular states; most pointed out that they used the accessibility features and accommodations provided by their test vendor. One regular state pointed out that it provides accessibility features and accommodations. Some of the unique states indicated that they did not have an ELP assessment.

Figure 21. Accessibility Features and Accommodations Use on ELP Assessments

Figure 21 Bar Chart

Note. Forty regular states and seven unique states answered this question. Two unique states used the comment field to report that they do not administer ELP assessments, and one unique state skipped this question for the same reason.

English Learners with the Most Significant Cognitive Disabilities

Most regular and unique states reported that ELs with the most significant cognitive disabilities participated in an alternate ELP assessment (see Figure 22). Eighteen percent of regular states and no unique states indicated that their ELs with significant cognitive disabilities participated in the same ELP assessment as all other ELs.

Figure 22. ELs with the Most Significant Cognitive Disabilities Participation in ELP Assessments

Figure 22 Bar Chart

Note. Thirty-nine regular states and seven unique states answered this question.

Most regular and unique states indicated that they had no plans in the next two years to develop a new or revised alternate ELP assessment (see Figure 23).

Figure 23. Planning to Develop a New or Revised Alternate ELP Assessment for Students with Significant Cognitive Disabilities (in the next two years)

Figure 23 Bar Chart

Note. Thirty-seven regular states and seven unique states answered this question.

Relatively few states indicated that a student with a disability’s exit from EL services was addressed during the IEP process (see Figure 24). Over 50% of regular states and 80% of unique states indicated that exiting from EL services was not addressed in the IEP process.

Figure 24. Exiting from EL Services Addressed in the IEP Process

Figure 24 Bar Chart

Note. Thirty-seven regular states and five unique states answered this question.


Continuing Assessment Issues

Several assessment issues continue to face states. Among those on which states commented in 2016 were assessment audits, student performance growth, improvement plans, graduation requirements, technology, and assessment principles.

Assessment Audits

Nearly half of all responding regular and unique states reported that they had provided leadership to local education agencies to conduct assessment audits that inventory how many assessments are administered throughout the school year (see Figure 25). Still, several respondents (in 13% of regular states and 29% of unique states) did not know whether this had occurred.

Figure 25. State Provided Leadership for Local Education Agencies on Conducting Assessment Audits

Figure 25 Bar Chart

Note. Thirty-eight regular states and seven unique states answered this question.

Assessment audits included consideration of students with disabilities in several ways (see Figure 26). Most often, the audits included special education assessments. Less often, the audits identified special education assessments that were duplicative of, or similar to, assessments used for other purposes. Least often for regular states, though not for unique states, the auditing team included an educator with expertise in special education.

Figure 26. Assessment Audit Approaches

Figure 26 Bar Chart

Note. Fourteen regular states and one unique state answered this question. State respondents were able to select multiple responses.

Student Performance Growth

Most regular states indicated that they were using student performance growth as a measure of student achievement in 2016 (see Figure 27). In contrast, most unique states indicated that they were not using student growth. About equal percentages of regular and unique states indicated that they were developing an achievement measure that would include a growth measure.

Figure 27. Use of Student Growth in Achievement Measure

Figure 27 Bar Chart

Note. Thirty-eight regular states and seven unique states answered this question.

States included students with disabilities in their growth measures either in the same way as other students, or by including only students who took the general assessment and excluding those who took the alternate assessment (see Figure 28). No states (regular or unique) indicated that students in the general assessment were included in the same way as other students but that adjustments were made for students in the alternate assessment.

Figure 28. Inclusion of Students with Disabilities in State’s Growth Measure

Figure 28 Bar Chart

Note. Thirty-seven regular states and one unique state answered this question.

Many regular states (44%) and all responding unique states indicated that they were using Student Growth Percentiles (SGP) for their assessment scores to judge improvement in student academic performance (see Figure 29). A small number of regular states were using transition or value tables, a value-added model, or gain scores to judge improvement.

Figure 29. Growth Model Used to Judge Improvement in Student Academic Performance

Figure 29 Bar Chart

Note. Thirty-four regular states and two unique states answered this question.
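For readers unfamiliar with the Student Growth Percentile (SGP) idea, the sketch below is a rough, simplified illustration using hypothetical scores: it ranks each student's current score among peers with a similar prior-year score. Operational SGP models use quantile regression over multiple prior years rather than the simple score banding shown here.

```python
# Simplified, hypothetical illustration of a student-growth-percentile-style
# calculation. Operational SGP models use quantile regression; here each
# student's current score is simply ranked among peers whose prior-year scores
# fall in the same 25-point band.
from collections import defaultdict

# (student_id, prior_year_score, current_year_score) -- all values hypothetical
students = [
    ("s1", 410, 438), ("s2", 412, 422), ("s3", 409, 455),
    ("s4", 502, 505), ("s5", 505, 540), ("s6", 510, 512),
]

def score_band(prior_score: int, width: int = 25) -> int:
    """Assign students with similar prior-year scores to the same band."""
    return prior_score // width

peers = defaultdict(list)
for _, prior, current in students:
    peers[score_band(prior)].append(current)

def growth_percentile(prior: int, current: int) -> float:
    """Percent of same-band peers whose current-year score is below this student's."""
    band_scores = peers[score_band(prior)]
    below = sum(1 for score in band_scores if score < current)
    return 100.0 * below / len(band_scores)

for sid, prior, current in students:
    print(f"{sid}: prior {prior}, current {current}, "
          f"approx. growth percentile {growth_percentile(prior, current):.0f}")
```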

For regular states, the biggest challenge to including students with disabilities in growth models was that alternate assessments were not included in the models, followed by the lack of data from the same assessment across years (see Figure 30). Most unique states indicated that they had no challenges with their growth models; all students with disabilities were included. Challenges other than those indicated in Figure 30 included:

  • Fewer score points are available on the alternate assessment to demonstrate growth because the assessment is scored using a rubric.
  • Some growth models limit our options for including alternate assessment results.
  • The alternate assessment has not yet provided a reliable and valid way to generate an SGP.
  • We do not have enough students taking the alternate assessment to calculate growth.
  • A need for professional development so that staff in the field understand and use growth data to support school and program improvement.

Figure 30. Challenges in Including Students with Disabilities in Growth Measures

Figure 30 Bar Chart

Note. Thirty-three regular states and two unique states answered this question. State respondents were able to select multiple responses.

State Improvement Plans

States provided information on the measurable results for students with disabilities (the State-Identified Measurable Result–SIMR) that they were using in the improvement plans required by the Office of Special Education Programs, the State Systemic Improvement Plan (SSIP). Nearly three-quarters of responding states, both regular and unique, indicated that they were using measures of achievement (see Figure 31).

Figure 31. State-Identified Measurable Result is Achievement

Figure 31 Bar Chart

Note. Thirty-eight regular states and seven unique states answered this question.

Most states indicated that in addition to special education, the offices of assessment and curriculum and instruction were included in the development, implementation, or analysis of SIMR data (see Figure 32). Just over one-third of regular states and no unique states included the Title III office. Other offices that were mentioned by states included:

  • Accountability
  • Data collections
  • Office of Student Support-LEAs

Figure 32. Offices Involved in Developing, Implementing, or Analyzing SIMR Data

Figure 32 Bar Chart

Note. Twenty-seven regular states and five unique states answered this question. State respondents were able to select multiple responses.

Graduation Requirements

States reported on changes that had occurred in their graduation requirements (see Figure 33). Twenty-five percent of regular states and no unique states indicated that between 2014 and 2016, their graduation requirements had changed. Among the 2016-17 changes reported by states were:

  • Nearly a 20% increase in End of Course requirements.
  • Legislation in 2015 creates statewide graduation requirements to be implemented for the 2021 graduating class; before this, there have been no common graduation requirements.
  • Requirements changed twice. On July 1, 2014, the High School Graduation Exit Exam requirement was replaced with a requirement to take a college-or-career ready assessment to graduate. That requirement was repealed on June 30, 2016.
  • All graduation requirements were removed from the state-level statute as of July 1, 2016. For the next four years, graduation requirements will be based on local policies.
  • Current graduation requirements were temporarily suspended and new requirements are being developed.
  • All students became eligible for a regular high school diploma.

Thirty percent of regular states and no unique states indicated that they expected graduation requirements to change between 2017 and 2018. Among the expected 2017-18 changes reported by states were:

  • Graduation requirements for all students are under review.
  • The Occupational Diploma will no longer be offered. All students will work toward a foundational (with or without additional endorsements) or standard high school diploma.
  • Any changes in graduation requirements will depend on the decision about alternate diplomas.
  • New requirements will be put in place for all students.
  • An alternate diploma is a possibility.
  • Local requirements for credits for students with disabilities cannot exceed the state credit requirement, unless the IEP indicates that the credit requirements for the student should exceed those set by the state.
  • A new assessment will be in place for grade 10 students in 2019; the assessment will be based on college and career readiness.

Figure 33. Changes in Graduation Requirements for Students with Disabilities

Figure 33 Bar Chart

Note. Forty regular states addressed changes in graduation requirements in 2014-2016 and in 2017-2018; seven unique states addressed changes in graduation requirements for both periods of years.

Technology

States reported on the technology-related investments they perceived to be needed to better enable students with disabilities to participate in instruction and assessments (see Figure 34). The needs identified by the most regular states were additional devices, improved bandwidth or capacity for Internet connectivity, and additional adaptive technology. The largest number of unique states identified specialized software that enables the provision of accessibility and accommodations, additional devices, improved bandwidth or capacity for Internet connectivity, and test security provisions. The other needs that were identified included:

  • More training on standard accessibility features in modern operating systems
  • Augmentative communication devices and software

Figure 34. Needed Technology-related Investments for Better Participation of Students with Disabilities in Instruction and Assessment

Figure 34 Bar Chart

Note. Thirty-eight regular states and seven unique states answered this question. State respondents were able to select multiple responses.

Assessment Guidelines, Standards, and Principles

States indicated the guidelines, standards, or principles that they asked their test vendors to comply with (see Figure 35). Most states reported that they required compliance with the APA, AERA, NCME Standards for Educational and Psychological Testing, and the ATP/CCSSO Operational Best Practices for Statewide Large-scale Assessment Programs. One-third of regular states and 50% of unique states required compliance with NCEO’s Principles and Characteristics of Inclusive Assessment Systems in a Changing Assessment Landscape. Other requirements that the states noted included:

  • Accessible Portable Items Protocol (APIP)
  • Question and Test Interoperability (QTI)
  • Universal design
  • Americans with Disabilities Act
  • Smarter Balanced UAGG
  • Peer review requirements
  • Web Accessibility Initiative (WAI)
  • Web Accessibility Initiative-Accessible Rich Internet Applications (WAI-ARIA)

Figure 35. Collaboration with Test Vendors on Developing Assessments Compliant with Guidelines, Standards, or Principles

Figure 35 Bar Chart

Note. Thirty-five regular states and four unique states answered this question. State respondents were able to select multiple responses.


Technical Assistance Needs

Survey respondents ranked the helpfulness of 16 types of technical assistance. The needs are listed in order of rank (from most helpful to least helpful) for regular and unique states in Table 8.

The top three types of technical assistance selected by regular states were: (1) “how to” documents on accessibility and accommodations, alternate assessments, etc. available on Internet for self review; (2) conference calls on hot topics; and (3) webinars on assessment related topics. The top three types of technical assistance identified by unique states were: (1) individual consultation in the state, (2) individual consultation at meetings, and (3) assistance with data analysis.

Table 8. Technical Assistance Ranked by Order of Preference

Regular States | Rank | Unique States
“How to” documents on accessibility and accommodations, alternate assessments, etc., available on the Internet for self-review | 1 | Individual consultation in the state
Conference calls on hot topics | 2 | Individual consultation at meetings
Webinars on assessment-related topics | 3 | Assistance with data analysis
Consultation and review of state materials | 4 | Individual consultation for the state via phone or web-based meeting space
Assistance with data analysis | 5 | Conference calls on hot topics
Awareness materials | 6 | Webinars on assessment-related topics
Small group “clinics” | 7 | Ready-made workshops
Individual consultation in the state | 8 | “How to” documents on accessibility and accommodations, alternate assessments, etc., available on the Internet for self-review
Individual consultation at meetings | 9 | Awareness materials
Opportunities to participate in discussion forums | 10 | Consultation and review of state materials
Ready-made workshops | 11 | Small group “clinics”
Individual consultation for the state via phone or web-based meeting space | 12 | Videos
Descriptions of assessments in other states | 13 | Opportunities to participate in discussion forums
Videos | 14 | Descriptions of assessments in other states
Social media (Facebook, Twitter, LinkedIn) posts that provide links to new resources and important information | 15 | Podcasts
Podcasts | 16 | Social media (Facebook, Twitter, LinkedIn) posts that provide links to new resources and important information

Note. Thirty-four regular states and seven unique states answered this question. The lowest rank indicates the most preferred type of assistance.

When asked whether their rankings of technical assistance materials and strategies would be different if these materials and strategies focused on ELs and ELs with disabilities, the majority of regular states and unique states responded “No.” Those regular states that responded “Yes” provided the following reasons:

  • More one-on-one or smaller coaching groups may be necessary.
  • Because of different populations, the needs of English learners could differ.
  • For English learners with disabilities, the issues are more complex.
  • The topics of technical assistance could focus more on instructional supports and interventions, including supports related to language learning.

Appendix A

Table A-1. Opportunities Associated with Provisions in ESSA

Students with Disabilities
Access to SAT for College Admission
Alternate Diploma options
Continued inclusion
Diplomas
Discussion of graduation requirements
Ensuring they are being included in grade level content and assessment
Equitable access to core learning and assessments
Equitable expectations for learning
ESSA focuses on the continued growth of students instead of the unrealistic expectations of NCLB
Focused subgroup identification
Helping districts understand the appropriate use of accommodations in testing
Higher expectations and increased outcomes
More support and accountability for students with disabilities
New Graduation Requirements for 1%
Opportunity to be measured by growth
Our state is currently in the preliminary stages (collecting stakeholder feedback) of developing a state plan
Promote the use of assistive technologies; raise graduation standards somewhat
Representation growth metrics; recognition of alternate diploma
School ratings focused on improving rates, with a focus on students with disabilities; re-commitment to accommodations to ensure participation for everyone
The 1% cap on participation in the alternate assessment
The Alternate Diploma - to provide students with significant cognitive disabilities an opportunity to earn a high school diploma; 1% participation cap - to ensure that students who truly meet the criteria are utilizing alternate standards and the alternate assessment
English Learners
Closing the achievement gap
ELs will show up and districts will have more accountability; it should allow for more opportunity for growth and better support for newly placed EL students
Equitably including EL growth into accountability
Establish a strong reading program from K-3; implement, monitor, and provide needed support with fidelity
Greater focus for ELs on the English language development and success in content areas
Greater visibility by inclusion in Title 1 accountability
Improved and clarified exit criteria
Included in required statewide assessments and accountability system; thoughtful analysis of native language needs
Including in accountability--growth to proficiency
Inclusion in Title I
Incorporation into Growth model
More accountability for districts to monitor progress
More focus on ELs as they are a part of the whole accountability system
More realistic accountability measures
Our state is currently in the preliminary stages (collecting stakeholder feedback) of developing a state plan
Representation growth metrics; recognition of alternate diploma
Visibility within the accountability system
English Learners with Disabilities
Accommodating for language as well as disability to get a more accurate picture of student outcomes
Allows a state to be more strategic in its inclusion and support of these students
Alternate ELP
Attention to a very low incidence population
Closing the achievement gap
Encourages use of translated tests and universal design
Even if students are missing a domain, they can still take the assessment and their scores will count
Greater focus for ELs on the English language development and success in content areas
Greater visibility by inclusion in Title 1 accountability
Heightened accountability for EL programs due to being accountable under Title 1
Higher expectations and increased outcomes
How to include newly arrived English learners in accountability measures
Identification of these students and reporting of performance
Incorporation into Growth model
More accountability for districts to monitor progress
More focus on ELs as they are a part of the whole accountability system
More realistic accountability measures
Opportunity to be measured by growth
Opportunity to develop a new assessment
Our state is currently in the preliminary stages (collecting stakeholder feedback) of developing a state plan
Potential to collaborate with other states on addressing how to validly and reliably assess the English language of ELs with disabilities
Representation growth metrics; recognition of alternate diploma
Strong collaboration among educators in providing English reading with appropriate modification/adaptation to meet their learning needs

Table A-2. Challenges Associated with Provisions in ESSA

Students with Disabilities
1% cap for taking the alternate assessment
1% cap
1% Cap
1% cap and inclusion in growth model
1% cap change
1% cap waiver requirement
1% participation cap and conflict with IDEA
1% participation in alternate assessments
A blanket 1 percent state cap on alternate assessments is unnecessary and possibly illegal
AA-AAS participation cap
Assessment too difficult
Assigning a weight to the ELP indicator in the state accountability system; business rules around 1% participation cap
Complying with 1% AA-AAS rule
Define new diploma requirements
Defining which students belong in the category of most significant cognitive disabilities, making sure the right population is being targeted for this, and the 1% cap for participation in the alternate assessment
Ensuring that the alternate diploma, if created, is equitable for this unique student population
Equitably including students in accountability for High School
Formulating more realistic accountability measures
Implementation of the new 1% rule
Many of the rules don’t take into account issues with small schools
New 1% guidelines and Alternate Diploma
Our state is currently in the preliminary stages (collecting stakeholder feedback) of developing a state plan
Overidentification
Pool of fully qualified teachers to ensure services and appropriate support are provided with fidelity
Setting exit criteria for students with significant cognitive disabilities
Supporting students who may not meet requirements for a diploma but who also are not eligible for the alternate assessment
English Learners
Assessment too difficult
Developing an accountability system that includes more than assessment data that is able to be applied fairly to all schools
Developing an accountability system to include long term goals and interim performance means
Ensuring equitable access to core content and curriculum
Ensuring that ELs receive the supports and accommodations to which they are entitled
Establishing equitable and research based targets for a diverse EL population
Evaluating and Revamping EL programs
Figuring out how to measure growth; defining what “significant extent” means in relation to native language assessments; assigning a weight to the ELP indicator in the state accountability system
Formulating more realistic accountability measures
How to use new indicator in accountability model
Including in accountability--growth to proficiency
Lack of a reading program that is used across all schools and grade levels
Low incidence of ELs in most schools makes inclusion of ELP data in school performance reports difficult
Many of the rules don’t take into account issues with small schools
Meeting the needs of so many languages
Monitoring ELP assessment extension
n-size; ELs won’t be factored into most school level accountability
Our state is currently in the preliminary stages (collecting stakeholder feedback) of developing a state plan
Striking a balance between including first-year ELs in the ELA assessment for accountability purposes and not over-testing students whose results will not reflect what they know
Supporting students who may not meet requirements for a diploma but who also are not eligible for the alternate assessment
Tests in additional languages
Unintended consequences of providing RLA assessments in other languages for 5 years
English Learners with Disabilities
Assessment too difficult
Developing an alternate ELP assessment that will meet the needs of all eligible ELs
Disaggregating results in reports and interpreting results
Ensuring equity and access for all students regardless of EL or disability status in all provisions of the law
Ensuring that districts are providing services in both areas, especially EL services, and appropriate identification of ELs with significant cognitive disabilities with valid data
Ensuring that ELs receive the supports and accommodations to which they are entitled and developing an alternative assessment for the ELP
Formulating more realistic accountability measures
Funding for the new assessment
How to use new indicator in accountability model
Identification issues around language and disability
Inclusion in the system planning regarding their education
Lack of assessments that are valid and reliable to identify, as well as assess, progress in learning English
Many of the rules don’t take into account issues with small schools
Monitoring accommodations
Our state is currently in the preliminary stages (collecting stakeholder feedback) of developing a state plan
Providing an assessment in the native language when our state is an English only state
Striking a balance between including first-year ELs in the ELA assessment for accountability purposes and not over-testing students whose results will not reflect what they know
Students on alternate assessment and health impairments (deaf/hard of hearing) not being able to exit EL services
Supporting students who may not meet requirements for a diploma but who also are not eligible for the alternate assessment
Test in additional languages