State Alternate Assessments: Status as IDEA Alternate Assessment Requirements Take Effect


NCEO Synthesis Report 35

Published by the National Center on Educational Outcomes

Prepared by:

Sandra J. Thompson • Martha L. Thurlow

June 2000


This document has been archived by NCEO because some of the information it contains is out of date.


Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Thompson, S. J., & Thurlow, M. L. (2000). State alternate assessments: Status as IDEA alternate assessment requirements take effect (Synthesis Report No. 35). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved [today's date], from the World Wide Web: http://education.umn.edu/NCEO/OnlinePubs/Synthesis35.html


Executive Summary

The phrase “alternate assessment” appears in the reauthorized Individuals with Disabilities Education Act, and alternate assessments are required to be in place in all states by July 1, 2000. Alternate assessments are for the small number of students with disabilities who cannot participate in state and district-wide assessment programs. To provide a continuously updated source of information about what states are doing, the National Center on Educational Outcomes (NCEO) developed an on-line survey on the development of alternate assessments. Nearly two years after the survey’s initial design, responses have been received from all 50 states, with 47 states updating their information between March and June 2000. In addition, five educational units that receive IDEA Part B funds (American Samoa, Bureau of Indian Affairs, Marshall Islands, Virgin Islands, and Washington DC) have completed the survey. Guidelines, procedures, and training are all progressing at a feverish pace as this document goes to print. Among the major findings are:

  Who is involved in the development of alternate assessments varies across states, and this divergence is reflected in the nature of the standards and the approaches states take. While many states included state and local special and general educators in the design of their alternate assessment systems, a small number viewed alternate assessment development as a special education initiative.

  The most prevalent alternate assessment approach is the collection of a body of evidence that assesses functional indicators of progress toward state standards, using a variety of performance-based assessment strategies. Nine states plan to base their alternate assessments on separate standards or skill sets that are not linked to general education standards.

  Although only a few states are actually implementing their alternate assessments statewide, most states are close to being ready to do so.

  The areas of greatest development need are scoring procedures and how data will be reported.

  Fifteen states currently have information about their alternate assessments on their Web sites, with several others’ information in draft form.

 

Alternate assessments have evolved over the past two years of development, and this evolution may be expected to continue as states implement them and determine what works best. While this report’s presentation of the approaches states are taking does not imply endorsement of any specific state alternate assessment practice, it does indicate that states are still moving in many directions, despite regulations suggesting directions for development.


Acknowledgments

A synthesis report of this magnitude is only as good as the quality of information gathered. To this end, we at NCEO extend our appreciation to each person who took the time to complete and then update the on-line survey on alternate assessment.

This survey was originally intended to serve only as a continuously updated on-line source of information, but we have found it important to take a “slice” of that information periodically and to carefully analyze the status of alternate assessments through these stages of rapid development. We appreciate the respondents’ willingness to allow information that quickly goes “out of date” or is “not quite ready” to appear in print. To stay up to date, we encourage readers to visit our Web site at http://education.umn.edu/NCEO to view current information and to check their state’s status.


Overview

The countdown is on. The 1997 amendments to the Individuals with Disabilities Education Act require states to have alternate assessment systems in place by July 1, 2000. This report presents an examination of the status of alternate assessments across states as of June 1, 2000, just a month from the deadline. What do the alternate assessment systems look like? Who is involved in their development? Who will participate in these alternate assessment systems? These and other important questions and issues are addressed in this report.

The phrase “alternate assessment” appears in the reauthorized Individuals with Disabilities Education Act (see Appendix A):

As appropriate, the State or local agency (i) develops guidelines for the participation of children with disabilities in alternate assessments for those children who cannot participate in State and district-wide assessment programs; and (ii) develops and, beginning not later than July 1, 2000, conducts those alternate assessments.  PL 105-17, Section 612 (a)(17)

IDEA does not provide specific direction to states about what an alternate assessment is, what it should look like, or how it should be scored or reported, nor does it specify the type or number of alternate assessment participants. It does clarify, in the “Analysis of Comments and Changes” that accompanies the final regulations, that:

If IEP teams properly make individualized decisions about the participation of each child with a disability in general State or district-wide assessments, including the use of appropriate accommodations, and modifications in administration (including individual modifications, as appropriate), it should be necessary to use alternate assessments for a relatively small percentage of children with disabilities.

Most states estimate the number to range from less than one-half of one percent to no more than two percent of the total student population.

When IDEA was enacted in 1997, Kentucky was the only state with an operational alternate assessment system. Maryland was piloting a system and a few other states were in initial stages of development. As this report shows, nearly all states are now progressing through stages of development, pilot testing, and implementation of their alternate assessments.

Several states began their process of developing alternate assessments by establishing their purpose and guiding principles. A principle that has guided development in several states is that students with significant disabilities need opportunities to access a state’s educational standards (Burgess & Kennedy, 1998).

For example, the foundation for Kentucky’s Alternate Portfolio Assessment was the mandate for a totally inclusive assessment, with the same academic expectations for all students and a zero exemption rule. With this principle as a guide, Kentucky developed:

  shared content standards;

  scoring rubrics modeled on regular assessment;

  shared assessment language for teachers, administrators, parents, and the community;

  a formula to integrate scores within a school’s accountability index;

  district and school reports listing all student scores; and

  tracking procedures so that Alternate Portfolio scores are sent back to the student’s neighborhood school to promote ownership for student learning.

Here are examples of guiding principles from four states. Note that the first three states focus on high expectations for student learning (Olsen, 1998). The fourth state focuses on meeting the mandate with as little disruption to the status quo as possible.

State #1

  All children can learn.

  All children are full participants in the school experience.

  All children will participate in the statewide assessment system.

State #2

  Expectations for all students should be high, regardless of the existence of any disability.

  The goals for an educated student must be applicable to all students, regardless of disability.

  Special education programs must be an extension and adaptation of general education programs rather than an alternate or separate system.

State #3

  All children have value, can learn and are expected to be full participants in the school experience.

  School personnel, parents, local and state policymakers, and the students themselves are responsible for ensuring this full participation.

  The Standard Course of Study is the foundation for all students, including students with unique learning needs.

State #4

  Meet the law.

  Nonabusive to students, staff, parents.

  Inexpensive.

  Easy to do and takes little time.

Guidelines, procedures, training—all are being developed at a feverish pace as this document goes to print. Some states responded to requests to update their survey with “Oh, please, couldn’t we wait just a few more weeks? Our committee will be making several decisions in the next month.” As an example of how new all of this is, one state published its “Guide for Participation in Statewide Alternate Assessments” on its Web site just a few weeks before the completion of this report. We have heard about several other guides that are in draft form, with publication dates expected by the time school starts in September.

 

Procedures for Collecting Information

The information used in the development of this report was compiled from an ongoing, on-line survey developed and maintained by the National Center on Educational Outcomes at the University of Minnesota. In the fall of 1997, NCEO began to examine the status of states in the development of alternate assessments. States wanted up-to-date information about what other states were doing in the development of their alternate assessments. The survey was placed on-line early in 1998, when most states were just beginning to consider the development of the alternate assessments required by IDEA 97. Most states updated their earliest responses in the winter of 1999, when the first status report on the development of alternate assessments was completed (Thompson, Erickson, Thurlow, Ysseldyke, & Callender, 1999). States were invited to complete another round of survey updates between March and June 2000.

The information reported here was compiled from the on-line survey as of June 1, 2000. This date is important to note, since the development of alternate assessments is on a fast track, with the status of states changing daily. All 50 states, plus other educational units receiving federal special education funding (American Samoa, Bureau of Indian Affairs, Micronesia, Guam, Marshall Islands, Mariana Islands, Palau, Puerto Rico, Virgin Islands, and Washington DC), were invited to complete the survey. A print copy of the survey is included in Appendix B. Supplemental information was gathered from written material about alternate assessments that states have posted on their Web sites, from personal communication with state officials, and from previously published reports (i.e., Burgess & Kennedy, 1998; Olsen, 1998; Thompson et al., 1999; Warlick & Olsen, 1999).

State department personnel who are assigned the task of facilitating the development of alternate assessments completed the on-line survey. Respondents included both special education and assessment personnel. The respondents’ names can be found on the surveys, along with their e-mail addresses. They can be contacted directly for further information. Although survey questions could be answered only with a password given to each assigned respondent, the on-line survey was designed so that anyone could view any state’s responses, or all states’ responses to a single question. Respondents were able to update their survey responses at any time.

As of June 1, 2000, all 50 states and five other educational units completed the survey at least once, with 47 states and two educational units providing updates within the past three months. Multiple requests for updates were solicited from each state via e-mail, mail, fax, phone, and personal communication. The anecdotal data gathered through the surveys, personal communication, and other written documentation have provided us with a rich base of information to use in the compilation of this report. While the presentation of information on states’ alternate assessments is not meant as an endorsement of the approaches taken, the information should be useful as statewide implementation of these important assessments begins in earnest.


Survey Results

The NCEO on-line survey addressed a variety of components of alternate assessments, including identification of stakeholders, participation guidelines, alignment with state standards, approaches to gathering data, determination of proficiency measures, reporting of results, inclusion in high-stakes systems, and statewide training. Survey results from all states are summarized in Table 1. Results from the other educational units are summarized in Appendix C.

As shown in Figure 1, there has been a great deal of activity over the past year, with many more states addressing each component of their alternate assessment systems than in 1999.    

 

Table 1. Summary of Alternate Assessment Features Addressed by States

State | Stakeholders | Participation Guidelines | Alignment with State Standards (a) | Approach | Proficiency Measures (b) | Reporting (c) | High Stakes (d) | Training
Alabama | X | X | Subset | X | X | X |  | X
Alaska | X | X | Different | X | Same | X | Student | X
Arizona | X | X | Additions | X | X | X | Student | X
Arkansas | X | X | Subset | X | Different | X |  | X
California | X | X | Different | X | Different | X | Both | X
Colorado | X | X | Same | X | Different | Separate | System | X
Connecticut | X | X | Additions | X |  | X | Both | X
Delaware | X | X | Additions | X | X | X | System | X
Florida | X |  | Subset | X | Different | X | System | X
Georgia | X | X | Different | X | X | X |  | X
Hawaii | X | X | Same | X | X | X |  | X
Idaho | X | X | Same | X | X | X |  | X
Illinois | X | X | Subset | X | Different | X |  | X
Indiana | X | X | Additions | X | Different | Separate |  | X
Iowa | X | X | Different | X | X | X |  | X
Kansas | X | X | Subset | X | X | Both |  | X
Kentucky | X | X | Subset | X | Same | Aggregated | System | X
Louisiana | X | X | Same | X | Different | X | Student | X
Maine | X | X | Same | X | Same | X |  | X
Maryland | X | X | Additions | X | Different | Separate | Both | X
Massachusetts | X | X | Same | X | Same | X | Student | X
Michigan | X | X | Different | X | X | X | System | X
Minnesota | X | X | Different | X | Different | X | Student | X
Mississippi | X |  | Uncertain | X |  |  |  | 
Missouri | X | X | Same | X | Same | X |  | X
Montana | X |  | Uncertain |  |  |  |  | 
Nebraska | X | X | Different | X | Same | X |  | X
Nevada | X | X | Subset | X | X | X |  | X
New Hampshire | X | X | Subset | X | X | X |  | X
New Jersey | X | X | Subset | X |  | X | Both | X
New Mexico | X | X | Subset | X | Different | Separate | Both | X
New York | X | X | Same | X | X |  |  | X
North Carolina | X | X | Different | X | Different | X | Student | X
North Dakota | X | X | Same | X | X | X |  | X
Ohio | X | X | Uncertain | X |  | X | Student | 
Oklahoma | X | X | Additions | X | Different | X | System | X
Oregon | X | X | Same | X | Same | X |  | X
Pennsylvania | X | X | Subset | X |  | X |  | 
Rhode Island | X | X | Subset | X | Same |  |  | X
South Carolina | X | X | Subset | X | Same |  |  | X
South Dakota | X | X | Subset | X | Same | X |  | X
Tennessee | X | X | Subset | X | Same | Both |  | X
Texas | X | X | Additions | X | Same | X |  | X
Utah | X | X | Subset | X |  | Separate |  | X
Vermont | X | X | Subset | X | Same | X | System | X
Virginia | X | X | Subset | X | Same | X |  | X
Washington |  | X | Subset | X |  |  |  | 
West Virginia | X | X | Subset | X | Different | X |  | X
Wisconsin | X |  | Subset | X |  | Separate |  | X
Wyoming | X | X | Same | X | Different | Separate |  | X
TOTAL | 49 | 46 | 47 | 49 | 40 | 43 | 18 | 44

(a) Alignment with state standards: same = alternate assesses general education standards; subset = alternate assesses a subset of general education standards; additions = alternate assesses standards in addition to general education standards; different = alternate assesses standards different from general education standards.

(b) Proficiency measures: same = alternate has the same measures of proficiency as the general assessment; different = alternate has different measures of proficiency than the general assessment; X = this area has been addressed but decisions have not been made or were not reported on the survey.

(c) Reporting: aggregated = alternate results are aggregated with general assessment results; separate = alternate results are reported separately from general assessment results; both = alternate results are both aggregated and reported separately from general assessment results; X = this area has been addressed but decisions have not been made or were not reported on the survey.

(d) High stakes: alternate assessment implications have been addressed within: student = state with high stakes for students; system = state with high stakes for schools/districts; both = state with high stakes for both students and systems.

Figure 1. Alternate Assessment Features Addressed by States


Stakeholders

States were asked to identify the stakeholders involved in decisions about the development of their alternate assessment systems. Forty-nine states responded that they involved a variety of stakeholders in several ways. Many states included state and local special and general educators in the design of their alternate assessment systems. In some states, assessment personnel represented general education, while other states also included both state and local general education content specialists (e.g., a math teacher or a language arts consultant).

In contrast, some states viewed the development of an alternate assessment as a special education initiative only. They did not perceive a need to include general educators. As evident later in this report, many of these states designed alternate assessments based on special education standards or skill sets—without considering a connection to general education standards or curricula.

NCEO staff have had opportunities to participate in several stakeholder meetings across the country. We have continually been impressed by the depth of involvement of people who really know the students for whom the alternate assessments are being designed, and advocate passionately for the inclusion of these students in state assessment and accountability systems. For many special educators, this was the first general education initiative they had ever been invited to participate in—and the first time their students would actually be counted with everyone else. These stakeholders are determined to design assessment systems that include EVERYONE, no matter what.

Stakeholder groups were used in a variety of ways, from actually participating in the design of an alternate assessment, to developing principles that would then guide the design, to providing feedback on drafts of alternate assessments, to serving as pilot study implementers. State officials developed some alternate assessment systems with the assistance of a contractor, university personnel, or other consultants. These states often solicited input from a broader stakeholder group of parents, teachers, and advocates to give feedback on the system, or to pilot the system. None of the states reported using students in the design of their alternate assessment systems, except as initial participants in the pilot phase. Table 2 describes the involvement of stakeholders in the development of alternate assessment systems across the 48 states that responded to this item on the survey.

 

Table 2. Identification of Stakeholders

Alabama

A task force is working on the design of our system.

Alaska

We’re using a vendor and a committee of stakeholders. We have a small subcommittee of Dept. Sp. Ed. Staff and district Sp Ed experts who are writing specific performance standards for our alternate assessment. Stakeholders have been involved in all committees and subcommittees.

Arizona

The task force that is developing the alternate assessment involves a small, diverse group. Their work has been widely distributed and public input requested. The field, including parents, has been very helpful in constructing and reconstructing their work.

Arkansas

We have a task force made up of state special ed. and assessment personnel, district special ed and general ed administrators, coordinators and teachers, and parents. A district superintendent chairs the task force.

California

The Special Education Unit identified and recruited stakeholders for its workgroup to produce the statewide guidelines. The workgroup includes state and local agencies, general education and special education, service providers, parents, and assessment experts.

Colorado

Stakeholders have been identified through a variety of sources: Local directors of special education and assessment, the Institutions of Higher Education Forum, the Colorado Special Education Advisory Committee, the PEAK Parent Center, and interested persons we meet at conferences, workshops, on sites and grant projects. In addition, we surveyed colleagues with content area curriculum and instruction expertise for task force participants in those content areas.

Connecticut

A committee comprised of Department personnel from the assessment unit and special education unit, as well as representatives from local school districts, regional education programs, and college and university personnel, has been working on the development of the assessment instrument for Alternate Assessment.
Various stakeholder groups have been identified and have participated in activities related to the development of the Alternate Assessment. These activities include, but are not limited to the following: Focus Groups comprised of district personnel to provide information of a formative nature. Focus Groups comprised of instructional personnel to comment on proposals under consideration by internal Department committee. Department sponsorship of representatives of Parent Assistance Center at national conference on alternate assessment. Workshops offered by Department personnel to parents/advocates interested in understanding the regulatory requirements of alternate assessment. Ad hoc and standing committees were established to participate in decisions about the development of the Alternate Assessments.

Delaware

The Alternate Assessment Advisory Committee was formed at the beginning of the process (Fall 1997), which includes parents, administrators, coordinated agencies, teachers and related services personnel.

Florida

Stakeholders and parents have been involved throughout the process.

Georgia

We have a committee on alternate assessment that is ongoing and meets about once a month. The committee is composed of local system teachers and administrators, college/university staff, school psychologists, parents, and state department staff.

Idaho

The Alternate Assessment Task Force is representative of our state's geographical regions, and includes parents, and school and state personnel who represent a variety of positions, including special ed. teachers, administrators, testing coordinators, curriculum directors and higher education representatives. Some of their roles are overlapping, e.g. three members were actually parents of students with disabilities although only one represented that perspective alone. An expanded workgroup including several teachers of students with significant needs will be completing the process.

Illinois

The Alternate Assessment Task Force consists of various stakeholders.

Indiana

A stakeholder committee representing a wide variety of interests and expertise has been active since fall of 1997 and continues to provide ongoing advice related to IASEP. Parents are an integral part of this taskforce, as are other constituencies.

Iowa

A stakeholder committee consisting of parents, regular and special education teachers, administrators, higher education personnel and area education agency consultants are working on our alternate assessment system.

Kansas

Kansas has two committees working on the various components of the alternate assessment and the extended standards. The committees contain: SPED teachers, parents, general ed. administrators, special school staff, school psychologists, technology consultant, and curriculum adaptations specialists. The University of Kansas Center for Educational Testing and Evaluation is developing the Kansas Alternate Assessment.

Kentucky

We have an advisory board consisting of teachers (regular and special), university personnel, parents, and state department representatives (divisions of assessment and exceptional children) which meets for 3 days every summer to discuss refinements and revisions. This board then sometimes meets for one day during the school year.

Louisiana

A large task force comprised of people from across the state is supported by a small focus group. A contracted facilitator guides both groups. This group will be called to participate in a test review after the field test. A contractor is producing the document and provides technical assistance.

Maine

We have an advisory committee – Learning System Assessment Team – that is guiding the process. We currently have a collaborative work group and an advisory group that includes parents – no students with disabilities.

Maryland

Our standards were chosen by an expert panel and reviewed for content validity. Our advisory committee has been meeting on an annual basis to review procedures and results as well as to make necessary adjustments as appropriate. Parents, advocates, school personnel, test personnel and others are included on the advisory committee.

Massachusetts

Statewide Alternate Assessment Advisory Committee has met regularly since December 1998, and is working on development of the system. In 1999, panels of special educators and content specialists reviewed the state learning standards.

Michigan

Michigan’s Office of Special Education and Early Intervention Services are developing the alternate assessment. Michigan has many years of experience in the development of assessments with extensive involvement of Michigan educators and parents. Numerous committees, comprised of Michigan stakeholders, are intimately involved with various aspects of developing the alternate assessment instruments.

Minnesota

The advisory committee is a major source of parental involvement by involving various advocacy groups. They developed a set of principle statements that guided alternate assessment development. In addition, we have a State Special Education Advisory Committee (this has parental representation) that has been involved. A task force that worked on previously determined state sp. ed. goals developed assessments for reading, writing, and math.

Mississippi

A Task Force has been approved by the State Board of Education and is addressing alternate assessment issues.

Missouri

The Alternate Assessment Committee represented elementary, middle, and secondary teachers of students with significant disabilities, and parents and college faculty from nine regions across the state. The participants had agreed, as part of this experience, to return to their region and conduct a meeting to share the results of the initial meeting with other teachers, local district administrators, and parents. Over 500 additional stakeholders participated in nine regional meetings and provided structured feedback. The Alternate Assessment Committee then reviewed input from the regional meetings.

Montana

OPI/Sped has been working in concert with the OPI School Improvement group as it develops standards. They are also involved with special ed activities.

Nebraska

A statewide taskforce designed Nebraska’s alternate assessment framework. Members include parents, special education teachers, state department personnel, and higher education personnel. They have been working together since the fall of 1998.

Nevada

We have identified a stakeholder group.

New Hampshire

The New Hampshire Advisory Task Force was established in 1998 with various stakeholders represented.

New Jersey

Stakeholders have been involved in the development of the Core Curriculum Content Standards and progress indicators for Students with Severe Disabilities. We have expanded our stakeholder group in the final review of the document and will continue to involve them in all aspects of test development and decision-making.

New Mexico

We have a state task force, designated as the Alternate Assessment Design Team, comprised of special education teachers, assessment specialists, parent, university special education faculty, and state department of education personnel.

New York

NYSED has a Task Force of educators, parents, advocates, etc. who are assisting in test development and piloting. The NYS Education Department is collaborating with a statewide Taskforce along with the state’s testing contractor team.

North Carolina

The Exceptional Children Division and the Accountability Services Division of the NC Department of Public Instruction designed the portfolio. The Alternate Assessment Committee gave much input. Teachers, administrators, etc. participated on the committee.

North Dakota

“Key Informants” (special educators, general educators, parents, administrators, university personnel and North Dakota Dept. of Public Instruction representatives) were brought together to review the issues related to the creation of the alternate assessment and prepare written recommendations regarding this aspect of accountability for results of the education process. We have assembled writing teams (general educators – content area, and special educators) to review the content standards and identify how the content standards and benchmarks will be revised to address the needs of all students.

Ohio

Cross-departmental discussion regarding state model curricula. Interagency discussions on assessment of severely handicapped children and youth. We have had an ongoing work group, and will share materials and process with larger groups of stakeholders as a part of the field-testing.

Oklahoma

Task force includes parents; utilizes IDEA Advisory Panel which also includes parents and individuals with disabilities.

Oregon

Teachers have been the designers of our assessment system in partnership with the Oregon Department of Education. The advisory committees for the Office of Assessment and the Office of Special Education act as advisory to extended assessment efforts.

Pennsylvania

PA has established an alternate assessment work group to address alternate assessment. Bureau of Special Education personnel, Bureau Division of Evaluation personnel, Regional Instructional Support Center personnel, and Special Education Advisory Panel designees represent this group.

Rhode Island

Regular and special educators, university staff, parents, and Department staff.

South Carolina

Regular and special educators, university staff, parents, administrators, and Department staff.

South Dakota

The alternate assessment was developed by two separate workgroups of educators and administrators, including educators from schools of higher education. One of the workgroups developed the assessment instrument and the other workgroup developed the implementation guide, supporting documentation and the field test of the alternate assessment. The groups worked independently and periodically met as one group during the past year. This served as a system of checks and balances. The result was an alternate assessment device that was exposed to many storms and was rewritten several times before both workgroups were comfortable with the results.

Tennessee

A Task Force was formed in May 1998.

Texas

The selection and appointment of the national and state level steering committee membership involved several divisions of the Texas Education Agency: Accountability, Student Assessment, and Special Education. The stakeholder steering committee meets regularly and reviews the work of the test developers; reviews and provides input and oversight for the field trials and field activities; and provides input on item selection, item design, scope of content, format, and every other aspect of the development and implementation of the assessment system, as well as policy and guideline development and training design. The Division of Special Education is involved in a supportive, advisory capacity. The Division of Student Assessment is responsible for the production of assessment instruments that comply with the requirements specified in state law. Steering committees and national and state advisory groups provide technical expertise for development of the general specifications in the contract that is offered to test developers for bid. Various stakeholders are included in the statewide steering committee that works with the selected test developer in the actual production and development of the assessment instrument.

Utah

The steering committee for the development of the alternate assessment included special education teachers, a school psychologist, a physical therapist, a speech and language pathologist, and assessment specialists from the state office of education.

Vermont

Assessment Workgroup involves many different stakeholders including state dept. of ed consultants, university faculty, special education administrators, parent reps, and teachers.

Virginia

We have a steering committee comprised of teachers, administrators, parents, higher education representatives, and technical assistance providers. We also have three active subcommittees for performance indicators, communication and assessment strategies.

West Virginia

Various stakeholders, including general and special education teachers, county and regional education service agencies, special education administrators, parents, general education administrators, representatives of higher education, and WVDE staff have been involved in the development/implementation process.

Wisconsin

The Wisconsin Special Education Council has been involved in the process from the beginning. Teams of educators throughout the state met and developed sample alternate performance indicators.

Wyoming

The Expanded Standards Task Force includes state special ed. and assessment personnel, local special and general education teachers and administrators, parents and advocates. Educators from several districts have been involved in a pilot this year.

 

Participants

IDEA states that IEP teams must determine how each student will participate in large-scale assessments and, if not participating in the general assessment, how the student will be assessed. Many states have written participation guidelines and decision-making processes that can be used by IEP teams at the local level. States base alternate assessment participation decisions on several criteria, including the extent to which a student participates in the general education curriculum, whether a student is expected to graduate with a diploma, what type of skills a student is working on, and how much support a student needs. States have found that decisions about participation affect their approach to alternate assessment development, scoring, and reporting.

Some states have adopted their alternate assessment guidelines as state policy, while others make it clear that their guidelines are meant only as examples to be used at the local level. For example, Florida’s guidelines state, “This assessment system is provided to school districts as a choice of alternate assessment. It is not required.” In another example, Iowa’s guidelines state, “These guidelines are not state policy, but can be used by Area Education Agencies as they go about developing their policies.”

When alternate assessments were in initial stages of development, state personnel discussed who would “take them.” As development progressed, many states decided that the word “take” might be incorrect. Many alternate assessments consist not of a single “test” or “event,” but rather of a compilation of evidence that is collected over an extended period of time. Thus, the terminology has changed to reflect this approach, and now states talk about students who will “participate” in the state’s alternate assessment system. Table 3 shows examples of the process two states have used in developing their participation guidelines.

Early in 1999, 34 states reported that they were establishing eligibility guidelines to assist in determining which students would participate in alternate assessments. The most recent update shows an increase to 46 states that reported establishing participation guidelines for their alternate assessments. Table 4 describes examples of guidelines from 11 states.

 

 Table 3. Examples of the Process of Developing Participation Guidelines in Two States

Colorado

Some of Colorado's guidelines for eligibility began with statute or state school board policy. Other elements came from task forces specifically charged with determining how the IEP team would make decisions. Colorado passed legislation in 1993 instituting standards and assessments. The statute required that state and district assessment results be disaggregated and reported by separate disability category, among other variables. The Standards and Assessment Development and Implementation Council (SADI) recommended a reporting policy to the State Board of Education. The policy required 100% of students in each district to be used as the denominator in calculating the percent of students who perform at the state assessment’s four proficiency levels and for the category of "no scores" (not tested). The SADI Council also recommended that the participation decision be made during the IEP process, rather than by applying categorical or numerical criteria determined at the state level. In Spring 1997, Colorado Student Assessment Program participation guidelines were published with general descriptions of students for whom the assessment may be inappropriate, “a very small number of students with IEPs” who are “working on individualized standards rather than on the district-adopted standards.” Teachers were encouraged to provide appropriate accommodations and allow students to attempt the assessment. Over 80% of students in special education were accounted for on the first CSAP test. By June 1998, guidelines for non-participation were refined. Criteria included consideration of the alignment of the student's program of instruction and the assessments. The most recent state assessment results (Spring 1999) indicated that 1% to 2% of the total student population did not participate in the general assessments due to IEP team decisions. These students will be eligible for alternate assessments.

Texas

The guidelines were developed when Texas started accountability assessment in the mid 1980s. They were disseminated in training activities conducted for special education personnel and for personnel responsible for accountability assessment. The guidelines were developed with personnel and stakeholders from the student assessment and special education divisions of TEA. The guidelines require the IEP Team to consider the student's requirements for instructional content and requirements for accommodations in instruction and testing for FAPE. These requirements are compared to the objective specifications of the accountability assessment and to the allowable accommodations for administration of the state accountability test. The IEP committee must make a determination about the appropriateness of the state accountability assessment for each student in special education on a case-by-case basis. The guidelines for determining whether the state test is appropriate were developed jointly by the Division of Student Accountability and the Division of Special Education and are disseminated through training activities for regional and district level test coordinators. They are also available on the Texas Education Agency web site. Until the state's alternate assessment instrument field test is completed in the spring of 2000, the IEP committee must select an alternate assessment that is appropriate when a student is exempted from the state accountability test.

 

Table 4. Examples of Participation Guidelines

Alaska

It is expected that a small number (less than 2%) of all students will participate in alternate assessments. These will be students whose disabilities are so significant that they are not involved in a standard course of study leading to a high school diploma. When a student’s IEP calls for alternate assessments, the reasons must be documented on the IEP. All IEP meeting participants must understand that alternate assessments do not lead to a high school diploma. In deciding that a student should participate in alternate assessments, an IEP team must ensure that: 1. The student’s cognitive ability and adaptive skill levels prevent completing the standard academic curricula, even with modifications and accommodations. 2. The student requires extensive direct instruction in multiple settings to apply and transfer skills. 3. The student is involved in a functional, basic-skills education program. 4. The student’s inability to complete the standard academic curricula is not the result of extended absences; visual, auditory, or physical disabilities; emotional-behavioral disabilities; specific learning disabilities; or social, cultural, or economic differences.

California

The IEP team should consider: (1) whether the student participates in an academic or functional curriculum; (2) the types of instructional modifications used with the student; (3) whether the student is working toward a regular high school diploma; (4) the preference of the parent and where applicable, the student; and (5) input from other involved agencies. The decision should not be influenced by the student’s social, cultural or economic background, attendance, or by previous record of achievement.

Colorado

The participation decision must be based on the following considerations:

  • The unique needs of the individual student, not the specific disability category or program, and
  • The student’s IEP that documents the need for individualized standards in the assessed content area and the student’s inability to participate even with accommodations.

The decision must not be made on:

  • Poor attendance by the student.
  • Ongoing disruptive behavior by the student.
  • Student’s reading level.
  • Expectation of poor performance for the student.

Idaho

Students with disabilities will qualify to take the alternate assessment when it has been determined by the IEP team and documented on the student's IEP that the student meets the state criteria for taking the alternate assessment. This includes all of the following descriptors: 1. The student's demonstrated cognitive ability and adaptive behavior prevent completion of the general academic curriculum even with program modifications AND 2. The student's course of study is primarily functional and living-skill oriented AND 3. The student is unable to acquire, maintain, or generalize skills and demonstrate performance of those skills without intensive, frequent, and individualized instruction. Students are NOT to be included in the alternate assessment based solely on the fact that they have an IEP, are academically behind due to excessive absences or lack of instruction, or are unable to complete the general academic curriculum because of social, cultural, or economic differences.

Kentucky

The Alternate Portfolio was designed specifically for those students for whom the regular assessment program is not a meaningful measure of learning. Students whose limitations in cognitive functioning prevent the completion of the regular program of studies (even with program modifications and adaptations), and who require extensive instruction in multiple, community-referenced settings to ensure skill acquisition, maintenance, and generalization to real-life contexts, are eligible. IEP teams are required to review the participation guidelines checklist yearly; each qualifying statement must be answered "yes" before assessing the student using the Alternate Portfolio. For those students in question, the checklist is reviewed each year to ensure the student's proper assessment placement. Over the past 7 years, approximately .06% of Kentucky's student population has been assessed yearly through the Alternate Portfolio Assessment. Testing is done at the marker years of 4th grade, 8th grade, and the last year of school for all students.

Maryland

Students not pursuing the Maryland Learning Outcomes are eligible to participate in IMAP. The decision to participate is made by the IEP Team, which considers whether the student's severe cognitive developmental delay has, over a period of time, prevented the student, even with modifications and adaptations, from completing the general course of study. By secondary school age the student is anticipated to be pursuing a Maryland High School Certificate.

Nebraska

There are 3 principles guiding participation decisions for IEP teams: 1. a student’s cognitive ability and adaptive behaviors prevent completion of the general ed. curriculum even with accommodations and modifications; 2. a student’s course of study is primarily functional and life-skills oriented; 3. a student requires intensive, frequent, and individualized instruction to acquire, maintain, and generalize skills. Participation decisions are not based on having an IEP, attendance, behavior, or expectations of poor performance.

Ohio

Advisory panel discussed application of certain ground rules. Does the test provide a meaningful measure? Is the student engaged in instruction in content assessed on current statewide assessments?

Oklahoma

Eligibility will be based upon a determination of the curriculum that a student is receiving, that is, the standard state core curriculum or the functional curriculum. The IEP team will determine exemptions individually.

Oregon

To be eligible for the CLREAS and Extended CIM measures, a student must meet the following criteria. The student: is exempt from the Benchmark 1, 2, 3, and CIM measures; is receiving instruction in a functional daily living skills curriculum appropriate for the student; and has a moderate to severe disability (e.g., mental retardation, autism, multiple disabilities).

Utah

Eligibility will be based upon a determination of the curriculum that a student is receiving, that is, the standard Utah Core Curriculum or the functional curriculum. The IEP team will determine exemptions individually.

 

Standards

Several states designed their alternate assessments to assess progress toward general education standards. Others designed their alternate assessments to assess basic or functional skills rather than progress toward standards. Notably, the states that based alternate assessments on special education skill sets rather than general education standards generally did not include any representatives from general education on their planning, advisory, or stakeholder groups, whereas states that used general education standards as the basis for their alternate assessments usually included general education personnel in these groups.

 The decision about the link between general education standards and those expected of alternate assessment participants may vary as a result of the type of standards a state requires students to meet. Some states focus very narrowly on specific academic standards, while others take a broader approach and include many “functional” or life skills within their standards. States have also looked at whether their general standards could be expanded or adapted to include performance indicators toward which even students with the most significant disabilities might be working.

For some states, even though the content standards assessed by an alternate assessment are the same as those assessed by the general large-scale assessment, the indicators of progress are different—often focusing on functional life skills rather than on academic skills. Performance standards may also be defined differently, in order to differentiate between the types of benchmarks and indicators expected at specific grade levels, and the indicators measured by the alternate assessment. The highly committed stakeholder groups described in the previous section of this report have developed these standards through a long and carefully planned sequence of activities. Table 5 shows examples of that process as it unfolded in three states.

Forty-nine states have academic content standards (AFT, 1999). Early in 1999, 32 states reported that they were working on identifying the curricular or content standards for which an alternate assessment would be developed. As of June 2000, this number rose to 49. After careful consideration of the standards described by survey respondents, we reorganized state activities into four general groups of standards or skills toward which alternate assessment participants might be working.

Alternate assessment participants are working toward:

  • General education standards, with a possible reduction in the number required, and with the expansion of benchmarks and/or performance indicators to include functional/access/life skills (28 states).
  • A combination of general education standards and an additional set of functional skills (7 states).
  • Standards or functional skill sets that have been developed exclusively for alternate assessment participants and then linked back to general education standards (3 states).
  • Standards or functional skill sets that have been developed exclusively for alternate assessment participants and are not connected in any way to general education standards (9 states).
  • Uncertain (3 states).

States included in each group are described in Tables 6 through 9.

 

Table 5. Process for Developing Standards and Indicators for Alternate Assessments in Three States

Colorado

The Colorado Student Assessment Program for Expanded Standards (CSAP-ES) will follow the same content area progression as the general state assessments: first reading and writing, then math, then science. The task force, Expanded Linkages, developed a framework for the alternate assessment based on the expanded standards for Reading and Writing. They mapped clusters of skills, and then described indicators of proficiency. This resulted in the identification of twelve strands of skill indicators, progressing across three levels of skill development. This literacy matrix was tested in a small study of 19 students. Data collection forms were developed to gather information on the validity and observability of the performance indicators.

Michigan

Michigan has invested a considerable amount of time and training in the development of Outcomes for special education. Outcomes and related assessment materials have been developed for students receiving special education services in all 12 eligibility categories recognized in Michigan. The categorical Outcomes materials have evolved into a four-level document known as Addressing Unique Educational Needs of Students with Disabilities. The AUEN materials are noncategorical and are organized around four levels of independence in adult life roles that students with varying levels of impairments can realistically be expected to achieve. The Michigan State Board of Education approved the use of two of the four levels of AUEN materials in the development of a PROPOSED model for instruction and as a foundation for developing an alternate assessment tool to assess the progress of students with moderate (Achieving Supported Independence in Adult Life Roles) and severe (Achieving Participation in Adult Life Roles) cognitive deficits. These two levels of independence do not, at this time, have a relationship with the Michigan Curriculum Frameworks.

Wisconsin

The fourth grade performance standards in Language Arts, Math, Science, and Social Studies were used in the development of the alternate performance indicators. After reviewing the fourth grade performance standards, teams of educators throughout the state identified the population of students the alternate performance indicators would be used with and determined how the standards would be expressed for this population. They then used the process of “backward mapping” to develop alternate performance indicators. During this process each team first identified the specific skills and knowledge that led to mastery of performance standards from the WMAS (Wisconsin Model Academic Standards). After identifying the fundamental skills leading up to mastery of a fourth grade performance standard, the team identified the knowledge and skills appropriate for the student, based on that student’s present level of educational performance. Finally, the team made a list of questions to consider when evaluating the quality of alternate performance indicators and activities. These are listed below.

Is the Alternate Performance Indicator (API):

1. Aligned with a Performance and Content Standard from the Wisconsin Model Academic Standards?
2. Appropriate for the student based on his or her present level of educational performance?
3. Related to the student’s educational program?
4. Stated clearly?
5. Observable and measurable?
6. Applicable across different instructional contexts and settings?
7. Applicable to a variety of student activities and tasks?

Are the activities and tasks used to measure each Alternate Performance Indicator:

1. Consistent with the API?
2. Descriptive of what the student needs to do?
3. Engaging and challenging for the student?
4. Representative of the possible range of student performance?
5. Sensitive to the unique needs of the student?
6. Able to be incorporated into the student’s daily instruction?

 

Table 6. States in Which Alternate Assessment Encompasses General Education Standards

Arizona

We are completing work on the downward extension of our State standards to include skills from birth to the kindergarten level. The areas currently approved by the State Board of Education are: Language Arts, Math, Work Place Skills, and Physical Development. A task force continues to work on the other areas of our standards.

Arkansas

Our standards are an extension of the general education curriculum content standards. They include performance indicators across the functional domains of vocational, community, domestic, and rec/leisure for students participating in Arkansas’ Alternate Portfolio Assessment.

Colorado

The Expanded Standards are derived from the standards used for general education, but expanded to benchmarks beginning at the most functional levels. The process to expand the standards involves looking at the key components of the standards and the access skills necessary to learn them. The combination of key components and access skills can be described separately or in combination within an expanded standard. The Expanded Linkages Task Force developed literacy-related indicators based on the expanded standards for Reading and Writing. These form the basis for a state alternate assessment.

Hawaii

The alternate assessment will be based on standards in language arts and mathematics.

Idaho

Standards are extended downward to their basic and most functional skill requirements. The alternate assessment targets communication skills within the social domain and functional math skills within the vocational domain. Performance levels mirror those used with the direct writing and direct math assessments used within general education.

Illinois

The standards sets from the Illinois Learning Standards are being used as the framework.

Kansas

Kansas has 5 separate curriculum standards from which the state assessments are developed (Reading, Writing, Mathematics, Science, and Social Studies). After careful review and thoughtful consideration of those general curriculum standards, the committee recommended that the reading and writing standards, benchmarks, and indicators be extended to accurately reflect the teaching and learning of students who participate in the alternate assessment. The extended standards contain up to 25 clarifying examples in 5 domains per indicator. The clarifying examples show how students might demonstrate that they have acquired an indicator at school, at work, in the community, in recreation, or at home. Kansas has Standards, Benchmarks, and Indicators for each subject; the extended standards are based upon the "Standard."

Kentucky

Critical Standards are a subset but are exactly like those of regular ed. The remaining reg. ed. standards are optional for inclusion in the alternate assessment.

Louisiana

Selected standards from Louisiana Content Standards were identified for the alternate assessment.

Maine

The Maine Learning Results standards are for ALL students.

Massachusetts

MCAS Alternate Assessment will be based on the same Massachusetts Curriculum Framework learning standards as those being assessed on standard MCAS tests. We use the critical functions of the Massachusetts Curriculum Frameworks content standards assessed by the on-demand Massachusetts Comprehensive Assessment System (MCAS) tests.

Missouri

The Alternate Assessment Committee agreed that the Show-Me Standards are appropriate for the alternate assessment when framed in a functional context. A curriculum framework has been developed based on the Show-Me Standards that incorporates this functional context. This framework is a model that teachers of students with significant cognitive disabilities can use to assure that the Show-Me Standards are incorporated into their curriculum, classroom instruction, and IEPs.

New Hampshire

A State Advisory Task Force has extended our existing Curriculum Frameworks to identify the critical functions of standards in the areas of English Language Arts, Math, Science, Social Studies and Career Development.

New Jersey

A committee of stakeholders has identified a subset of NJ's Core Curriculum Content Standards and developed new cumulative progress indicators that are appropriate for students with severe disabilities.

New Mexico

New Mexico has K-12 Content Standards and Benchmarks. We have expanded these content standards in the areas of Math, Language Arts, Science and Social Studies in the form of performance indicators. Each content area has five clusters of performance indicators.

New York

The standards for students with severe disabilities are the same standards that have been approved by the Board of Regents for all students. However, alternate performance indicators and sample tasks on a basic functional level were developed to reflect appropriate educational outcomes for students with severe disabilities. The standards and alternate performance indicators will assist school personnel and families in understanding what students with severe disabilities need to know to attain the highest level of performance.

North Dakota

Expanded standards have been drafted for Mathematics and English Language Arts. It is anticipated that work on the expanded standards in the areas of Health, Science, and Social Studies will be complete by June 2000.

Oregon

The Career and Life Role Extended Assessment System (CLREAS) is a new extended career and life role (alternate) assessment that extends the measurement of career-related measures below the CAM level to emerging and beginning levels of performance. The information obtained assesses progress and current needs for students whose education consists, in part or in whole, of instruction in “life skills.” The assessment system takes a functional approach to measurement and matches content areas to the Career and Life Role education standards of the Oregon Certificate of Advanced Mastery (CAM). The system “extends” measures for Career and Life Role Standards (CLRS) and assesses skills appropriate for students with moderate to severe disabilities. Career and Life Role strands include: Personal Management, Career Development, Communication, Problem Solving, Team Work, Employment Foundations, and Motor Skills (not a CLRS area).

Rhode Island

A subset was chosen from the general ed. content areas.

South Carolina

Subset of English Language Arts and Mathematics.

South Dakota

Subset of general education standards.

Tennessee

Extensions and adaptations of the curriculum content standards have been used extensively in pilot training.

Texas

The standards include curriculum from grade levels not tested in the regular assessment (TAAS). TAAS begins with 3rd grade curriculum for reading, writing, and math. The test is very closely aligned to the mastery performance standards at each grade level of the Texas Essential Knowledge and Skills (TEKS). Out-of-level testing is not permitted. The alternate test is designed to follow a student through the curriculum in a developmental sequence when the student has not yet been included in grade-level instruction in the subject area to the same extent as students without disabilities. The alternate assessment will measure progress in the state curriculum beginning from grade one performance standards.

Utah

The skills that will be targeted in the alternate assessment are the elements of the “Life Skills” curriculum (a component of the Utah Core Curriculum): lifelong learning, effective communication, collaboration, responsible citizenship, and employability.

Washington

Subset of state standards.

West Virginia

Worked with a stakeholder group and the MSRRC to develop "West Virginia's Alternate Assessment Framework: Linking Instructional Goals and Objectives with Adaptive Skills," the foundation for the alternate assessment. Just as the state-adopted norm-referenced achievement test assesses a sampling of skills from the general curriculum, the Alternate Assessment will measure student performance on a sampling of skills (Instructional Goals and Objectives) from the frameworks in the following areas: English Language Arts, Mathematics, Science, Social Studies, and Process/Workplace Skills. The Framework consists of 126 selected Instructional Goals and Objectives (IGOs) from the general education curriculum that are functional and doable, with appropriate supports, for students with severe disabilities. The IGOs are linked to adaptive skills that students with severe disabilities need for successful functioning in the home, school, and community. Examples of real-world performance skills are provided for each IGO, illustrating ways students can functionally demonstrate achievement of IGOs and facilitating access to the general education curriculum and inclusion in school accountability efforts. The selected IGOs do not form a curriculum for students with severe disabilities, whose IEPs will address needs beyond those included in the Framework. However, the Framework provides entry points for students’ access to and progress in the general curriculum.

Wisconsin

The Wisconsin Model Academic Standards encompass academic content and performance standards for all students in the state. Teams of educators throughout the state met and developed sample alternate performance indicators. These are extensions of the academic standards for all students. They describe how students with disabilities may demonstrate learning associated with designated content and performance standards aligned with WMAS or district developed standards.

Wyoming

Our taskforce developed expanded standards, based on our already developed state standards in Language Arts and Math.

 

Table 7. States in Which Alternate Assessment Encompasses a Combination of General Education Standards and an Additional Set of Functional Skills

Connecticut

Students with moderate impairments participate in the academic curriculum, but at a significantly lower rate. A developmental checklist has been created for students with more significant impairments. The checklist mirrors general education domains.

Delaware

The Design Group developed “Standards and Key Concepts for Functional Programs.” They selected fourteen of the thirty-eight academic standards from the areas of English/Language Arts, Mathematics, Social Studies, and Science, and added five functional domains: Communication, Personal Management, Social, Career/Vocational, and Applied Academics.

Indiana

Indiana elected to create an assessment that includes both state academic standards in language arts, mathematics, science, and social studies and functional proficiencies in the areas of vocational experience, social adjustment, recreation and leisure, and personal adjustment. The current assessment includes over 1,000 skills in 16 subdomains.

Maryland

IMAP (Independence Mastery Assessment Program) uses a combination of standards and extended strands that have been independently developed to coordinate with general curricular areas and content subjects. For example, IMAP utilizes the regular “Skills for Success” developed by the Division of Career Technology and Adult Learning as a basis for the “skill” areas of Communication, Decision Making, and Behavior. The fourth area, Academics, is based on the extended standards in English/Language Arts, Mathematics, Science and Social Studies. There are four other subject content areas derived from the framework: Personal Management, Community, Career/Vocation and Recreation/Leisure. We are currently working on content validity through reviews of national experts, concurrent reviews of NCEO outcomes and other assessment instruments. Secondary content validity analysis consists of reviews of IEP Goal areas of the target population. Rater reliability is currently being analyzed for the student performance events, and portfolio scoring. Validity and reliability studies will continue and be an ongoing component of the alternate assessment as well as reviewing the standards for alignment and the curricular framework as a basis for locally developed curricula.

Minnesota

The Reading, Math and Writing branch of the alternate assessment is aligned with the state’s basic standard assessments. The functional branch has an academic component, but the real emphasis is on functional life skills.

Nevada

A subset of general education standards with some additions.

Oklahoma

Standards have been developed; both extended (Community Living, Personal and Home Management, Recreation and Leisure, Job/Work Opportunities) and expanded (Mathematics, Language Arts, Science, Social Studies, The Arts). In addition, standards for Social Interaction and Self Determination have been established.

 

Table 8. States in Which Alternate Assessment Encompasses Separate Standards or Functional Skill Sets Linked Back to General Education Standards

Alabama

We have developed Alternate Standards that are linked to the Alabama Courses of Study. This document will be disseminated as a resource to IEP Teams to help them understand how students with significant disabilities can be learning the general education curriculum.

Alaska

Independently developed but still based on the Alaska Content Standards.

Vermont

Subset of state standards linked to COACH curriculum areas (e.g., communication, selected academics, personal management & socialization, home/school/community, and vocational/leisure).

 

 

Table 9. States in Which Alternate Assessment Encompasses Separate Standards or Functional Skill Sets Not Linked to General Education Standards

California

While California’s rigorous academic standards are appropriate for most students with mild or moderate disabilities, they have little in common with the functional curriculum provided to the 1-2% of all students who will likely require an alternate assessment. Areas assessed in alternate assessments will include communication, self-care/independent living, motor skills/mobility, functional academics, vocational skills, social/emotional, and recreation/leisure.

Florida

The Performance Assessment System for Students with Disabilities addresses the needs of students at mild, moderate, and severe levels. Each level is based on a set of exit standards (or expectations) with student performance rating scales provided at benchmark levels (grades 1-3, 4-5, 6-8, and 9-12 at the mild level, and ages 6-9, 10-13, 14-17, and 18-21 at the moderate and severe levels). The State Board of Education has adopted these special standards. Students may "float" between the two sets of standards (general and special) until it is determined that a standard or special diploma would be the best choice for the student. Our general education standards are purely academic in focus. The special standards address areas of curriculum/learning, social/emotional, communication, and independent functioning.

Georgia

We are looking at the type of diploma to make decisions about who needs alternate assessments, but we are not necessarily setting curriculum guidelines or benchmarks.

Iowa

Iowa does not have state standards, but all of our districts either have standards or are in the process of developing them. We have drafted standards and benchmarks that local districts may choose to use as the basis for their alternate assessments. The standards are based on those used in many local districts, including language arts, reading, science, math, social studies, art, physical education/health, and work/life skills.

Michigan

Once the Alternate Assessment tryouts have been completed and there are data to support the use of the AUEN as a foundation for alternate assessment, the Office of Special Education and Early Intervention Services will ask the State Board of Education to adopt the AUEN as standards for students with moderate and severe cognitive deficits.

Nebraska

A statewide task force designed a framework of 5 domains including functional academics, personal management, vocational skills, motor skills, and independent living. Each domain includes several skill areas, with targeted behaviors under each skill area.

North Carolina

Standards, called domains and competencies, have been developed for the Alternate Assessment Portfolios. The domains are Communication, with seven competencies; Personal and Home Management, with 14 competencies; Career/Vocational, with 14 competencies; and Community, with seven competencies.

Ohio

We are generally cautious about the prospect of establishing standards that can fairly be applied to the students who have the most severe disabilities. The diversity of this population, combined with serious measurement issues related to item development and reliability, has prompted us to look for other means of conducting alternate assessments.

Virginia

We have a draft document that identifies relevant standards of learning for students participating in Alternate Assessment, real life activities where the standard may be demonstrated, and access skills needed to attain the standards.

 

Approach

Several approaches have been chosen by states to gather data for their alternate assessments. By early 1999, 31 states had considered assessment approaches. As Table 10 shows, very few of these approaches had actually been selected at that time. Presently, 49 states report having at least considered, and most have selected, their specific approach to alternate assessment. The numbers shown in Table 10 for the current year are high since many states have decided to collect data using multiple assessment strategies.

After a careful analysis of these approaches, we have reorganized them into four general categories: portfolio assessment or compilation of a body of evidence, checklist/rating scale of functional/essential skills, IEP analysis, and other. It was difficult to get a clear picture of the approach selected by a few states, and some still have not made a final decision about the approach they plan to use. The assessment strategies used within these categories may be very similar. For example, videotaped observation might be used as a strategy to collect data for a portfolio entry, to validate a selection on a checklist, or to assess progress toward meeting an IEP goal. The approaches selected will undoubtedly become more refined as they are implemented. Tables 11 through 14 describe examples of states that have selected different general approaches.

States are in various stages of development of the approaches they have selected. Some are still in a design phase, many are pilot or field-testing their approach, and a few are involved in statewide implementation. It is interesting to note that, as states progress toward implementation of their alternate assessments, several are carefully considering the relationship between a student’s IEP goals and state standards, benchmarks, and performance indicators. States are looking at data collected as evidence of progress toward meeting IEP goals and considering the use of those data as evidence of progress toward performance indicators established within their alternate assessments. We expect this relationship to continue to evolve in the implementation phase as teachers who collect extensive data on progress toward IEP goals ask why they would be expected to repeat this data collection process for an alternate assessment, especially as IEPs become more standards-based. Tables 15 through 17 show examples of states at each stage of development.

 

Table 10. Assessment Approaches Selected by States

Assessment approach selected        # of states in 1999    # of states in 2000

Observation                                  7                     43

Analysis of existing data                    2                     32

Interviews or surveys                        4                     27

Portfolio                                    3                     26

Testing/adaptive behavior scale              4                     23

 

Table 11. Portfolio Assessment or Compilation of a Body of Evidence

Arkansas

Arkansas’ Alternate Portfolio Assessment System.

Colorado

We collect a “Body of Evidence” that can be used as an alternate to district-level assessment.

Delaware

The Delaware Alternate Portfolio Assessment Design Group developed a portfolio assessment tool. Students, with the help of their IEP team, will develop a portfolio containing evidence of student performance within standards across five domain areas: communication, personal management, social, career/vocational, and applied academics.

Florida

Portfolio that includes teacher observation, video, audio, product completion information, skill level indication.

Georgia

We have a committee on alternate assessments and are developing a protocol for a portfolio type assessment based on the IEP.

Idaho

The Alternate Assessment Workgroup is creating a proficiency and progress rating scale to be used by a team who knows the student best at the building level. Student work samples, review of records, or tests are acceptable ways to document student proficiency and progress. A commercially available adaptive behavior scale may be used to document student proficiency if it was used for some other purpose and is already available.

Illinois

The assessment approach being piloted is a portfolio, collecting work samples to assess the Standards.

Indiana

Researchers at Purdue University are working closely with the Indiana Department of Education to develop the IASEP software that is currently being used, statewide, for all students with the most significant disabilities. We are using a computer-based rating and documentation system that integrates information from a variety of sources. Because the assessment technology is integrated into the IASEP software, it is easy for teachers to capture performance events on an ongoing basis. Any document that could be included in a paper portfolio can be included in the student's record through scanned or digital images. These documents are then easily accessed during case conferences, parent-teacher conferences, etc. An electronic IEP is included in the software along with sections for medical records, AAC/AT history and use, and student demographic/special education data. The goal is to provide each teacher with a seamless electronic assessment and instructional management system.

Iowa

Our task force is considering a portfolio system with performance rubrics. We are considering moving beyond looking at IEP goals.

Kansas

The first part of the Kansas Alternate Assessment consists of a collection of evidence of student progress on the target indicators. This collection, the Evidence File, may contain student work samples from classroom and vocational activities, anecdotal records, etc. The second part of the Proposed Kansas Alternate Assessment is the Performance Evaluation Survey, a standardized survey instrument that will be administered to at least two interviewees or respondents to obtain a broad-based assessment of student performance across domains.

Kentucky

The advisory committee recommended a portfolio assessment to evidence the 28 academic expectations within the context of "best practice" programming. While sharing content standards with all students in Kentucky, a different set of performance indicators may be evidenced in the Alternate Portfolio Assessment. The portfolio is a collection of student work that may span several years.

Louisiana

This is a performance-based student assessment. The teacher actually observes the student performing the skill.

Maryland

The Independence Mastery Assessment Program (IMAP) is parallel to the state general school accountability program in terms of assessment, frequency, reporting, approach (authentic performance events), and public reporting of results. IMAP has three major components: student portfolio, authentic events, and parent survey. The program evaluation includes scoring for all three components, student performance and support.

Massachusetts

A body of evidence documenting a student's level of achievement of state learning standards will be assembled. Structured assessment tasks are provided to teachers as part of alternate assessment resource materials.
Narratives, data charts, graphs, videotape, or digital cameras may be used as part of an "electronic portfolio." Parent surveys may be conducted in person, in writing, or by telephone.

Missouri

We are using a portfolio approach.

Nevada

Electronic and manual portfolio.

New Hampshire

NH is looking at developing a portfolio approach.

North Carolina

The alternate assessment portfolio has been piloted in 14 school systems.

North Dakota

The alternate assessment will contain a body of evidence showing student progress on the state’s standards. It may include work samples, attendance reports, schedules, anecdotal records, etc.

Oklahoma

Portfolio assessment will be utilized as the alternate assessment system.

Oregon

Students taking the CLREAS are assessed on their ability to perform typical daily “routines” while incorporating essential “related skills” for living. A routine is defined as a detailed course of action with core steps important for daily living that involve a beginning, middle, and end. Concurrent with the evaluation of routines, students are assessed on specific related skills identified from the students’ IEP goals and objectives. The second measure is the Certificate of Initial Mastery Extended Measure (CIM-Ex). The Oregon Statewide Assessment System begins with initial benchmark standards at 3rd grade with multiple-choice assessments in Reading/Literature and Mathematics. Students with beginning awareness, emerging skills, and primary academic skills are unable to obtain meaningful scores on the initial benchmark measures. For these students performing well below the first benchmark, curriculum-based measurements are being developed to measure “emerging” skills leading toward the initial benchmark. These measures include performance assessments in phonemic segmentation, oral reading fluency, prosody, reading retells, thought units, correct word sequences, total words written, thought units related to the prompt, thought units with correct grammar, and sentence type. An additional set of academic measures was developed to measure rudimentary academic performances for students taking the Career and Life Role extended measures; these include counting money, reading and writing letters and words, and time. Both the CLREAS and the CIM-Ex were field tested during the 1999-2000 school year, and results will be available in the summer of 2000.

Rhode Island

Portfolio – a collection of student work gathered across the school year and scored during the state assessment window.

South Carolina

Portfolio – collection of student work across the school year.

Tennessee

Portfolio Assessment Model with rubric scoring. Consists of five dimensions for scoring and five entries that are linked to general curriculum standards and reported with the regular TCAP Achievement assessment. Teachers have the option of taping student's performance for the assessment. In the 2000-2001 school year, an option will be offered for use of an electronic portfolio. The portfolio assessment will be conducted throughout the school year, with data collected in spring at the time of other statewide assessments.

Virginia

The Steering Committee and Internal Team recommended a "Collection of Evidence" as our assessment instrument. Representatives from other states presented to our steering committee, and information from those states was used to recommend options for collecting evidence.

Washington

We are using a portfolio with information collected over the school year.

West Virginia

Data for the alternate assessment will be collected using a skill and performance level inventory and a collection of work samples referenced to selected instructional goals and objectives for West Virginia Schools. Each student will be assessed on an individual basis through performance demonstrations in real world situations and settings.

Wyoming

Collection of a body of evidence using a variety of real world performance indicators and assessment strategies.

 

Table 12. Checklist/Rating Scale of Functional/Essential Skills

Connecticut

A locally developed checklist addresses adaptive behaviors. This assessment requires teachers to complete a basic skills checklist in the domains of Communication and Quantitative Skills. The completion of this checklist requires direct observation by teachers or others.

Michigan

Proposed Plan for the Development of Alternate Assessment for Students Receiving Special Education Services includes: (a) Purpose of the Plan (b) Background (c) Philosophy of the AUEN (d) Development of the Alternate Assessment (e) Description of the Proposed Alternate Assessment (f) Committee lists (g) sample assessment activities (h) sample report forms

Minnesota

A task force that worked on previously determined state special education goals developed checklists for reading, writing, and math. The functional piece was developed after work with an advisory committee and other stakeholders; professionals (teachers, psychologists, evaluation specialists, disability consultants, etc.) were gathered to develop the functional checklist.

Nebraska

We are using a checklist approach, allowing IEP teams to assess skill levels of targeted behaviors using a variety of assessment strategies, including observation, work samples, etc.

 

Table 13. IEP Analysis

Alabama

IEP goals will be evaluated to determine whether progress was made, the level of support that was needed to achieve the goal and the range of contexts across which the goal was achieved. IEP goals are categorized into domains and coded.

California

Our workgroup has identified an assessment procedure that is based on the identified common content areas. We have produced an instrument to classify IEP goals into content areas and rate progress/mastery in meeting the goal using a four-point scale.

Georgia

Observation is part of the data collection system to document IEP progress.

Ohio

Task force is developing a means of incorporating information drawn from the IEP into an aggregate form for group reporting. Analysis of data from an eligibility assessment may be used as an option for helping to determine rating of goal attainment.

Utah

We will review progress toward IEP goals through use of a unique instrument, which will serve as the official alternate assessment.

 

Table 14. Other Assessment Approaches

Arizona

Analysis of data from eligibility assessment or other existing assessment.

Connecticut

Out-of-level testing has been available in Connecticut since 1990 and has allowed us to test children who would have otherwise been exempt from standard grade-level assessment. We have expanded this option for Alternate Assessment for students who are working in an academic curriculum but at a lower grade level.

Maine

We will not have a separate instrument; it will be part of our Comprehensive Assessment System, which will also include our state assessment.

Oregon

We have been trying “not” to develop an alternate assessment. More precisely, we are attempting to have all students included in a single comprehensive assessment system and to avoid having an assessment that is the “alternate.” The term “alternate” assessment suggests an undesirable contrast between the “real” assessment and the “alternate” assessment. Oregon’s implementation of the alternate assessment requirement will be a lower end measure of the Career Related Learning (CRL) standards adopted by the State Board for the Certificate of Advanced Mastery (CAM). This (CRL) assessment design, intentionally, does not exclude a student from academic achievement measures. Students assessed on career related learning measures might also be working toward academic standards in reading, math, science, or social science and vice versa. The two are intended to be mutually inclusive whenever possible, not mutually exclusive.

Texas

The alternate test is designed to follow a student through the curriculum in a developmental sequence when the student has not yet been included in grade-level instruction in the subject area to the same extent as students without disabilities. The alternate assessment will measure progress in the state curriculum, beginning with grade one performance standards.

Until the state's alternate assessment instrument field test is completed in the spring of 2000, the IEP committee must select an alternate assessment that is appropriate when a student is exempted from the state accountability test.

Wisconsin

The IEP team has the responsibility to complete the alternate assessment process for an individual child. The review process and data used during this process serve as the alternate assessment if the following criteria are met: 1) the data are recent, reliable, and representative of the student’s performance, 2) the data are aligned with the WMAS or district developed standards and 3) the analysis of the data is summarized and shared with the student’s parents in a meaningful way.

 

Table 15. Initial Design

Alabama

A task force is working on the design of our instrument at this time.

Alaska

We're using a vendor and a committee of stakeholders. We have an assessment model that we will field test next school year.

California

Our workgroup has identified an assessment procedure.

Connecticut

Committee has been working on the development of the assessment instrument.

Georgia

We have a committee on alternate assessments and are developing a protocol for a portfolio type assessment based on the IEP.

Idaho

The Alternate Assessment Workgroup is creating a proficiency and progress rating scale to be used by a team who knows the student best at the building level.

Maine

It is unlikely we will have a separate instrument but it will be part of a comprehensive Local Assessment System that will also include our state Assessment (MEA).

Michigan

Proposed Plan for the Development of Alternate Assessment for Students Receiving Special Education Services includes: (a) Purpose of the Plan (b) Background (c) Philosophy of the AUEN (d) Development of the Alternate Assessment (e) Description of the Proposed Alternate Assessment (f) Committee lists (g) Sample assessment activities (h) Sample report forms.

Mississippi

In preliminary planning stage.

Missouri

A prototype for an alternate assessment is being developed.

Nebraska

We will most likely not create a single instrument but develop a process for districts to address the issues of alternate assessment.

New Jersey

A request for proposal is in development to fund a contractor to build the alternate assessment.

New York

The NYS Education Department is collaborating with a statewide Task Force along with the state's testing contractor team of Advanced Systems in Measurement and Evaluation and the Inclusive Large Scale Standards and Assessment group from the University of Kentucky to develop an alternate assessment.

North Dakota

“Key informants” were brought together to review the issues related to the creation of the alternate assessment and prepare written recommendations regarding this aspect of accountability for results of the education process.

Ohio

Task force is developing a means of incorporating information drawn from the IEP into an aggregate form for group reporting.

Oregon

The discussion is underway.

Pennsylvania

The Alternate Assessment work group will submit a recommended plan for development of an assessment instrument to the Special Education Bureau Chief.

 

Table 16. Pilot/Field Test

Arizona

We have a task force that is developing the assessment. We have the assessment out for field test at this time.

Arkansas

We are currently in the midst of field-testing materials with a few school districts. All materials are works in progress and will be redrafted based on feedback from the field trial districts before they are used in the statewide pilot next school year.

New Mexico

We have created an Alternate Assessment pilot that will be used as the foundation of our Alternate Assessment.

North Carolina

The alternate assessment portfolio has been piloted in 14 school systems.

Wyoming

We are currently field-testing our alternate assessment in about 10 districts across the state.

 

Table 17. Implementation

Indiana

Researchers at Purdue University are working closely with the Indiana Department of Education to develop the IASEP software that is currently being used, statewide, for all students with the most significant disabilities.

Kentucky

The advisory committee recommended a portfolio assessment to evidence the 28 academic expectations within the context of "best practice" programming. While sharing content standards with all students in Kentucky, a different set of performance indicators may be evidenced in the Alternate Portfolio Assessment.

Maryland

The Independence Mastery Assessment Program (IMAP) is parallel to the state general school accountability program in terms of assessment, frequency, reporting, approach (authentic performance events), and public reporting of results. IMAP has three major components: student portfolio, authentic events, and parent survey. The program evaluation includes scoring for all three components, student performance and support.

Minnesota

Minnesota began statewide implementation during the Spring 2000 statewide testing.

 

Measures of Proficiency

At this time, 40 states have worked on establishing the levels of proficiency they plan to use in scoring their alternate assessments. Only 17 states had considered this decision early in 1999. Development of these measures has taken place with systematic and careful thought by stakeholder groups across states. Table 18 shows examples of the process used in two states.

Twelve states are using or plan to use the same measures of proficiency for their alternate assessment as are used in the general statewide assessments (Table 19), while 12 states have selected measures of proficiency that are specific to the alternate assessment (Table 20). Other states have not finalized their plans or did not clearly describe the measures they have selected. As these tables show, there is great variation across states in the measures of proficiency selected. For example, Louisiana uses a three-point rubric (Introductory, Fundamental, and Comprehensive), while Minnesota uses a seven-point scale for its alternate assessment.

Not only is there great variation between states in the labels assigned to their measures of proficiency, but alternate assessments also measure different things in different states. For example, some measure level of skill/performance indicator or percent of competency in mastering a skill/performance indicator, while others measure the degree of progress toward a skill/performance indicator. Some also measure level of independence, while others measure degree of support. Some base their measures on a single component, while others score multiple components. Some aggregate these measures into a single score or level of proficiency, while others report each component separately. Table 21 shows examples of the components measured by four states.

Although most states have determined proficiency levels for their alternate assessments, many have not yet considered who will actually do the scoring, nor have they designed procedures for scoring. There are a variety of methods chosen by the few states that have taken this step. For example, in Minnesota and Wyoming, teachers will be expected to score their own students’ work within predetermined performance or proficiency levels. West Virginia will use people involved in a student’s education, such as teachers, parents, service providers and others to assess individual performance in real world situations and settings. To assure fairness and reliability, California plans to select a 20% random sample of alternate assessment participants to be scored a second time by another credentialed employee of the district who knows the student and has not participated in the first scoring of a student’s IEP. Other states have set up or are planning scoring sessions where educators score student work from other schools within predetermined performance or proficiency levels. Some states, Idaho for example, plan to have student performance scored independently by two or more trained evaluators at the state level. Maryland sends completed alternate assessments to the state for scoring and reporting, with teachers from across the state trained and assigned to small multi-district scoring teams.

 

Table 18. Process Used to Develop Measures of Proficiency in Two States

Colorado

Several task forces have been involved in setting the proficiency levels for the alternate assessment. These were determined by interpreting the benchmark for each standard to essential concepts and foundational skills. We also brought together several major bodies of work produced by other Colorado task forces to identify and describe the Access Skills, or underlying skills needed to meet standards and achieve successful life outcomes and community membership.

Maryland

We are currently working on content validity through reviews of national experts, concurrent reviews of NCEO outcomes and other assessment instruments. Secondary content validity analysis consists of reviews of IEP goal areas of the target population. Rater reliability is currently being analyzed for the student performance events, and portfolio scoring. Validity and reliability studies will continue and be an ongoing component of the alternate assessment as well as reviewing the standards for alignment and the curricular framework as a basis for locally developed curricula.

 

 

Table 19. Same Measures of Proficiency for Alternate and General Statewide Assessments

Alaska

Alaska is using Advanced, Proficient, Below Proficient, and Not Proficient for the alternate assessment.

Kentucky

Within the performance levels of novice, apprentice, proficient, and distinguished, five performance standards were developed based on best practice, specifically to encourage school systems to implement well-researched strategies for effective facilitation of student learning. The standards include student performance of targeted skills, natural supports, interactions with non-disabled peers, instruction in multiple settings, inclusive contexts, and evidence of the 28 academic expectations that all students in Kentucky are working toward.

Maine

We will have global performance standards.

Massachusetts

We have identified performance levels that are consistent with those of standard MCAS tests. Descriptive characteristics were developed for each level.

Missouri

Scoring criteria for the alternate assessment result in the same achievement level scores generated by the general assessment: progressing, nearing proficiency, proficient, and advanced.

Nebraska

Performance standards for the alternate assessment are the same as those for general education standards and are based on the percent of successful performance of targeted behaviors. They include: beginner (0-30%), progressing (30-60%), proficient (60-80%), and advanced (80%+). There is a descriptor of each performance level for each targeted skill.

Oregon

The extended measurement systems are based on the State Board adopted standards for the CIM and CAM. No additional content and performance standards have been developed for these measures. Rather, the same standards are employed and the measures are scaled so that they are appropriate to the instructional level of students performing well below the standards.

Rhode Island

Performance levels mirror the rest of the assessment program.

South Carolina

Alternate assessment performance measures mirror those used in the general assessment program.

South Dakota

We will be using proficiency levels that are already in place in Title I programs in all South Dakota school districts. Using the same levels provides education staff with a certain consistency and familiarity.

Tennessee

Performance standards were established through the Task Force, with scoring and ranking from Step One to Advanced, the same scoring range used on the Terra Nova achievement test in Tennessee, so that test scores can be reported and aggregated with the total school population (unless disaggregation is necessary for confidentiality purposes).

Virginia

Language was included in revised Standards for Accrediting Schools. The Steering Committee recommended using a three-point scale for the scoring rubric that mirrors the Standards of Learning Assessment scoring categories. We are using this rubric in the field test.
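Percent-based cut points like Nebraska’s in the table above (beginner 0-30%, progressing 30-60%, proficient 60-80%, advanced 80%+) can be made operational with a small classifier. This is only an illustrative sketch; in particular, the report does not say which band a boundary score such as 30% falls into, so half-open intervals are assumed here:

```python
def proficiency_level(percent_correct):
    """Map the percent of targeted behaviors performed successfully to a
    performance level, using Nebraska-style cut points (Table 19).
    Boundary handling (half-open intervals) is an assumption."""
    if not 0 <= percent_correct <= 100:
        raise ValueError("percent must be between 0 and 100")
    if percent_correct >= 80:
        return "advanced"
    if percent_correct >= 60:
        return "proficient"
    if percent_correct >= 30:
        return "progressing"
    return "beginner"
```

With a descriptor attached to each level for each targeted skill, the same function could drive both individual score reports and aggregate counts.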

 

Table 20. Different Measures of Proficiency for Alternate and General Statewide Assessments

Arkansas

We have five proficiency levels for the alternate assessment: Independent, Functional Independence, Supported Independence, Emergent, and Not Evident. These are different from the general performance levels, which are based on NAEP and include Advanced, Proficient, Basic and Below Basic.

California

Each IEP goal is assigned a level of progress/mastery: 1. beginning (no progress); 2. transitional (partial progress, met 1-49% of the criteria); 3. intermediate (substantial progress, met 50-99% of the criteria); 4. competent (goal met or exceeded). A form is used to score mastery achieved on IEP goals. The information recorded on the form can be aggregated and reported at the state, district, and school levels.

Colorado

We have a literacy matrix composed of performance indicators at three levels of proficiency.

Florida

The Performance Assessment System for Students with Disabilities is written in three manuals to address the needs of students at the mild, moderate, and severe levels. Each level is based on a set of exit standards (or expectations) with student performance rating scales provided at benchmark levels (grades 1-3, 4-5, 6-8, and 9-12 at the mild level and ages 6-9, 10-13, 14-17, and 18-21 at the moderate and severe levels).

Illinois

A rubric for proficiency levels has been developed.

Indiana

We have established a universal rubric by which all skills and proficiencies are rated (0-4): independent, functionally independent, supported independent, or participant. Summary ratings are provided for each domain, subdomain, standard, and proficiency. Criterion-referenced data are also available, and adjusted ratings based upon psychometric characteristics of the constituent items will be available this summer.

Louisiana

There are three proficiency levels: Introductory, Fundamental, and Comprehensive.

Maryland

Student performance is ranked as: no attempt, beginning, emerging, mastery, or advanced. The total portfolio content is evaluated as none, adequate, or substantial, based on inclusion of student work in each outcome area.

Minnesota

The scaled scoring (1-7) indicators were developed by teachers and arranged in columns. For reading, math, and writing, the left (1-2) column indicates students with little or no understanding. The middle (3-4-5) column describes a student who has understanding. The right (6-7) column describes a student with application skills. Based on field input on the functional alternate assessment, the 1-7 scale was retained but the descriptions differ. Specific to the functional attributes, 1-2 describes a student who has little or no participation with full support, 3-5 describes moderate participation with moderate support, and 6-7 describes full participation with full support.

New Mexico

The special education unit, as a member of the Design Team, has established proficiency levels of Acquisition, Maintenance, Fluency, and Generalization. These levels translate to the levels of Beginning Step, Nearing Proficiency, Proficient, and Advanced used by the CRT.

North Carolina

Proficiency standards for the alternate assessment portfolio (rubrics for the tasks, the portfolio, and the portfolio quality) have been established.

Oklahoma

Four proficiency levels will be used in Reading, Mathematics and Writing. Two levels will be used in all other areas. Support scores will also be utilized.

West Virginia

During the spring testing window, the lead teacher will review the datafolio, in collaboration with other teachers who have collected the data, as appropriate, to rate the student’s performance on each Skill Inventory item based on the Alternate Assessment Skill Inventory rubric. The student receives one of the following ratings for each IGO: awareness, progressing, competent, or generalized. This rating is recorded on the Alternate Assessment Skills Inventory record form and is submitted to the county test coordinator for submission to the Department.

Wyoming

We have three proficiency levels: Beginner, Partially Proficient, and Proficient. These are different from the levels used in the statewide assessment.

 

Table 21. Examples of Components Measured Within Alternate Assessments

Idaho

·   Student performance

·   Opportunities to access multiple school settings

·   Opportunities for interactions with non-disabled peers

·   Generalization of skills

Kentucky

·   Student ability to plan, perform, monitor, and evaluate his or her own performance on a targeted skill related to the academic expectations and usually included in an IEP objective

·   Degree of natural supports used, such as peer buddies, tutors, or work site coworkers as opposed to instructional staff assistance, and the use of appropriate technology and adaptive assistive devices

·   Degree of peer relationships and mutual friendships with non-disabled peers

·   Application of learned skills across multiple school and community settings, particularly in integrated settings

·   Student opportunities to make choices and the use of age-appropriate materials

Maryland

·   Student performance

·   School support

·   Parent perceptions

Missouri

·   Opportunities for interactions with non-disabled peers

·   Opportunity to perform skills in multiple settings

·   Student access to adaptations, modifications, natural supports, and/or assistive technology

·   Student self-evaluation and monitoring of progress

·   Use of age appropriate activities and materials

·   Incorporation of “Show-Me” performance/process standards

West Virginia

·   Level of accuracy and fluency

·   Number of environments in which the skill is demonstrated

·   Intensity of instructional assistance

·   Number of varied demonstrations

 

Reporting

IDEA requires states to report the number and performance of students participating in alternate assessments. States must decide whether their alternate assessment data will be reported as part of or alongside data on the regular assessment, how alternate assessment results will be made available, and, most importantly, how they will be used. Although 43 states reported that they have addressed the issue of reporting (compared to 21 states early in 1999), few have reached consensus on their actual reporting procedures. Most states are still designing and piloting their approach and have not yet decided how the information will be reported once it is collected. Comments such as “the task force will address this issue,” “we’ll get to this later this school year,” “we have a general plan that needs additional thought,” and “no work done yet” summarize the state of reporting across many states at this time. However, some states have done careful work in this area. Examples of reporting decisions are described in Table 22.
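The core reporting requirement, counting the number and performance of alternate assessment participants so results can be reported separately from or alongside regular results, reduces to a simple tally. A hypothetical sketch (the record format and level names are invented for illustration):

```python
from collections import Counter

def summarize_results(records):
    """Tally performance levels separately for each assessment type, so
    alternate and regular results can be reported side by side or combined."""
    summary = {}
    for assessment_type, level in records:
        counts = summary.setdefault(assessment_type, Counter())
        counts[level] += 1
    return summary

# invented records: (assessment type, performance level)
records = [("regular", "proficient"), ("regular", "advanced"),
           ("alternate", "proficient"), ("alternate", "emerging")]
summary = summarize_results(records)
# summary["alternate"] holds both the participation count and the
# performance distribution for alternate assessment participants
```

Whether the two tallies are then merged or kept separate is exactly the aggregation decision the states in Table 22 are weighing.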

 

Table 22. Reporting Decisions

Colorado

Alternate assessment results will be reported separately. The scores of students with IEPs who take the general CSAP are included with all other students who take that test.

Delaware

The Design Group has recommended that scores be aggregated with other special education and general education students. Because of the size of our state and the numbers of some of our low incidence populations, this discussion continues.

Florida

We have given this much thought, and right now we think it will not be statistically sound to aggregate scores of students taking the alternate assessment because there is no one alternate assessment that could be given to all students exempted from regular state and district assessment. It is our feeling that exempted students vary so tremendously in the settings they learn in, the level of assistance they might need to complete an assessment activity, the type of modification or accommodation needed to complete an assessment activity, etc., that an aggregation would not provide valuable or sound information from which to make judgments about student need. We are leaning toward the use of portfolios that demonstrate student capabilities on specified standards (from the newly revised standards) that will link into our special diploma option. Student performance and progress would become a critical component and consideration in writing the quality IEP that we are striving for in Florida.

Idaho

The Alternate Assessment Workgroup is determining reporting format. This will be coordinated with the State Testing Coordinator and Public Relations personnel so Alternate Assessment results will be reported at the same time and in the same way as general statewide assessments are reported.

Kansas

For the 2000 assessment, schools received assessment scores in the following manner: 1) the scores of all students, 2) the scores of general education and gifted students, 3) the scores of students with disabilities, and 4) the scores for LEP students. Schools also received a separate assessment score report for Title I.

Kentucky

Scores are aggregated with those of regular education and are included in the accountability index. Based upon a rating of the five performance levels, a final holistic score of novice, apprentice, proficient, or distinguished is assigned to the student's portfolio, to be included in both the school- and local district-level accountability indices. Because the Alternate Portfolio has the equivalent impact in accountability index calculations as a student who participates in the general CATS assessments, a score of "proficient" from the Alternate Portfolio has the same impact as a student who scores "proficient" in reading, mathematics, science, social studies, writing, arts and humanities, and practical living/vocational living.

Maryland

The Independence Mastery Assessment Program (IMAP) is parallel to the state general school accountability program in time of assessment; frequency; reporting; approach (authentic performance events) and public reporting of results. IMAP has three major components: student portfolio, authentic events and parent survey. The program evaluation includes scoring for all three components, student performance and support. Results should assist the school in improving their program for students and allow the state to be accountable for all students. IMAP is a teacher and parent developed system and values the input from all individuals concerned with the student's progress. It includes reporting student supports provided as well as student performance. IMAP score results are reported in the same manner and frequency as MSPAP scores. Schools are reported at the satisfactory and excellent levels. IMAP score inclusion would be easily folded into the regular school report. Low numbers, which may identify particular students, are a concern.

Nebraska

This is a major issue for task force consideration in view of the fact that no statewide reporting requirements for current assessment practices are in place. There is a reporting form that will be completed and turned in for each alternate assessment participant.

New Mexico

Scores will be reported within the accountability system in addition to the achievement reports by the districts.

Oklahoma

Scores will be reported in a manner similar to other special education and general education scores.

Tennessee

Scoring is designed to align with reporting standards of the Tennessee Comprehensive Assessment Program achievement assessment. Therefore, scores can be aggregated or disaggregated with the total school population.

Wisconsin

Students involved in an alternate assessment process will be reported as “prerequisite” on the reporting form. All individual alternate assessment information reviewed at the IEP team meeting is summarized and shared with parents in a meaningful way.

Wyoming

All students will be included as assessment participants. The performance of alternate assessment participants will be reported alongside the performance on the regular large-scale assessment.

 

High Stakes

Some states have high stakes assessment systems, that is, the assessment system has consequences for schools, students, or both. Often, high stakes systems for students affect graduation status, determining whether or what type of a diploma or other exit documentation a student will receive. Twenty-seven states use assessment results for student accountability purposes, including at least 18 states using passing results as a high school graduation requirement (Guy, Shin, Lee & Thurlow, 1999; Olson, Bond & Andrews, 1999). Assessment results are used for school accountability purposes (including sanctions and rewards for schools or districts) in 41 states (Olson, Bond, & Andrews, 1999). The introduction of alternate assessments casts new light on a system that previously excluded most of these students. As states contend with the development of their alternate assessments, they must also consider how participants in this assessment will be included in their overall accountability system. High stakes legislation is very prevalent right now, with many states having just passed or considering laws affecting high school graduation, school accreditation, and other areas of accountability. States are finding it difficult to place their alternate assessment systems within this moving target.

Alternate assessment participation decisions may be affected by a state’s graduation requirements. For example, requests for alternate assessments might increase in states where students are required to pass a test to graduate with a regular diploma while alternate assessment participants can receive a regular diploma without passing the test. It may also be the case that, in states with rewards or sanctions for schools or districts, more students might be included in the alternate assessment if their scores are removed from a school’s score.

Examples of how eight states with high stakes for students addressed alternate assessments are shown in Table 23. Examples of 11 states with high stakes for educational systems are presented in Table 24.

 

Table 23. High Stakes for Students

Alaska Mandatory graduation exam in reading, writing, and math that must be passed to receive a diploma beginning in 2002. Alternate assessments will be an option for severely disabled students but will lead to a certificate of attendance, not a diploma.
Arizona At this point, the State Board has elected to waive “passing” the alternate assessment as a graduation requirement for those students still addressing the functional level of the academic standards by graduation time. This decision may change when we have enough information about the test and those taking it.
California California's accountability system includes school reports, incentives, and technical assistance. Student accountability includes promotion/retention standards and a high school exit examination.
Louisiana Currently, 5 components of the Louisiana Graduate Exam must be passed for a student to graduate. Students who participate in alternate assessments receive a Certificate of Achievement.
Massachusetts Tests in four subject areas are given in grades 4, 8, and 10. A passing score on the grade 10 MCAS tests in English Language Arts and Mathematics is a requirement for graduation, beginning with the graduating class of 2003. Massachusetts is currently exploring the option of meeting the graduation requirement through an alternate assessment, and intends to develop guidelines on meeting the graduation requirement early in school year 2000-2001.
Minnesota Students need to pass basic standards tests in Reading, Mathematics, and Written Composition in order to graduate from high school. Alternate assessment participants are exempt from these requirements and can still receive a standard high school diploma based on mastery of their IEP goals.
New Mexico Students’ scores are not to be exempted from the high stakes testing. This is crosschecked through the listings of the students from each school who have been exempted from testing. However, students with disabilities may receive a diploma without taking the test if the IEP team so designates. Once our alternate assessment is implemented (Spring 2001), scores of students taking the alternate will be a part of the accountability system. Our accountability system is currently undergoing final revisions.
Ohio Students must pass proficiency tests for graduation. It is unlikely that we will attach high-stakes consequences to the alternate assessment.

Table 24. High Stakes for Educational Systems

Colorado The draft rules place emphasis on CSAP results as the prime determiner of accreditation status. Results that are not satisfactory may result in the district's placement in a corrective action cycle. Another accreditation indicator is the percentage of students participating in the general assessment, and the draft school academic achievement report card displays the scores from the alternate assessment as well as the general assessment. New legislation was passed in April 2000 that rescinds the Accreditation Act. The governor has neither signed nor vetoed it. Currently, discussion on the development of the rules indicates that there may be the possibility of excluding some students with IEPs from the accountability system.
Connecticut We consider the Connecticut testing programs to have moderately high stakes in the following ways. There are financial implications in that test results are one component in the distribution of state funds to local districts. In addition, the state tests are used to determine those districts in need of improvement as required under Title I. State Department of Education partnerships with the urban districts are also based in part on state test results. The test is also moderately high stakes for students in that test results frequently determine placement and remediation efforts. At grade 10, students are awarded a Certificate of Mastery in those subjects in which mastery is achieved.
Delaware The Design Group has recommended that the alternate assessment be included in the current accountability system being developed for all students in Delaware.
Florida Our state test (Florida Comprehensive Assessment Test-FCAT) is currently used as one indicator to calculate school achievement levels. All schools are rated on a five-point scale based on two years of data. Low rated schools who stay that way for two years suffer consequences. Students with disabilities are encouraged to take the state test but only the scores of students who are hospital/homebound and speech impaired are included in the school achievement calculation. All student attendance rates and dropout rates are included in the calculation. Alternate assessment results are not used in the state level accountability system but are used at the school levels for school improvement planning.
Idaho Idaho does not have a high stakes system at this moment. However, exiting standards have been adopted by the 2000 Legislature and will become high stakes in 2004. Our Alternate Assessment addresses domains, rather than matching all our statewide assessments (4) one-for-one. Therefore, it will be a nice match with whatever changes occur due to the adoption of standards. I don't think that scores on the Alternate Assessment will result in sanctions on districts or buildings. However, the AA results, with emphasis on progress, are included in our special education monitoring visits.
Kentucky In Kentucky's performance-based assessment and accountability system, school monetary rewards and sanctions are determined, not by baseline data, but by the amount of improvement from baseline to current-year data. The scores for students in the Alternate Portfolio Assessment are embedded in their school's accountability index using a formula that makes the score difficult to determine, thus protecting the student's right to confidentiality. Scores are tracked to the student's neighborhood school (i.e., the school they would attend if they did not have a disability) to promote ownership of that student's educational program.
Maryland IMAP will be used in the school reports.
Michigan A new accountability system was presented to the Michigan State Board of Education at the March 2000 meeting. Five accountability factors have been identified for the proposed accreditation system: 1. assessment of all students (participation in the assessment system); 2. high academic achievement (percentage of students scoring well on the MEAP state tests); 3. improvement in student performance (adequate yearly progress); 4. high achievement for all students (progress in minimizing achievement gaps); 5. school improvement results (reports on implementation of local plans). In order for the proposed accreditation system to work, all students must participate in the assessment program. Schools that do not report performance for the required percentage of students will be "Unaccredited." Schools will be required to report on a minimum of 80 percent of their students. Not included in the 80 percent will be those students for whom an Individualized Education Program supports the use of an alternate assessment, and those students for whom English is not the primary language and who have been in a U.S. school for less than two years. The new accreditation system will be phased in: for the year 2000-2001, the department will release accreditation status based on participation and achievement factors only, implement self-assessment of school improvement, and set targets for progress and achievement gaps for the next year. The entire system will be operational during the 2001-2002 school year.
Oklahoma Currently, scores will not be used for “high stakes” (Determining low performing/high challenge sites).
Pennsylvania At this time, PA does not include a "high stakes" accountability system. It is projected that with the pending passage and implementation of 22 PA Code Chapter 4, there will be an incentive program established for districts demonstrating increased academic performance. Should this system be established, the alternate assessment work group will make recommendations regarding this incentive program.
Vermont State law requires the commissioner of education to identify technical assistance schools via an "adequate yearly progress" index. Alternate assessment results will be used in the process for identifying technical assistance schools when the instrument is completed and validated (projected for 2002).
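Michigan's 80 percent reporting rule described above (students using an alternate assessment and recent English learners are not counted in the 80 percent) is straightforward arithmetic. A sketch with invented counts; treating the exclusions as removed from the denominator is an assumption based on the survey response:

```python
def meets_participation_rule(enrolled, reported, excluded, threshold=0.80):
    """Check a Michigan-style participation rule: at least `threshold` of the
    non-excluded students must have reported assessment results.
    Excluded: alternate assessment students and recent English learners."""
    eligible = enrolled - excluded
    if eligible <= 0:
        raise ValueError("no eligible students")
    return reported / eligible >= threshold

# invented counts: 500 enrolled, 30 excluded -> 470 eligible;
# 380 reported is about 80.9%, so the school meets the rule
ok = meets_participation_rule(enrolled=500, reported=380, excluded=30)
```

Under the proposed system, a school failing this check would be reported as "Unaccredited," which is why the denominator exclusions matter so much to schools serving large alternate assessment populations.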

 

Training

As states begin to move from pilot phases and field tests to statewide implementation of their alternate assessments, training becomes an important consideration. Some states are finding that the people involved in their pilots (generally the best educators and advocates at the local level) have been involved in the development of alternate assessments from their inception and have gained an extensive understanding of the purpose and process. Statewide implementation will require extensive training of educators who may not have been involved in standards development or implementation of any kind, and may never have heard of alternate assessments. Their initial reaction may be one of intense frustration, anger, and even revolt as they are challenged to collect information they may never have collected before, for a purpose they do not understand. Statewide training efforts over the next few years must be carefully designed and implemented, with ongoing support and encouragement, across states.

Forty-four states reported that they are either in the process of developing training on their alternate assessments or are currently providing training. Examples of these efforts are provided in Tables 25 (developing) and 26 (underway). Training tools, including guides and presentations, are listed in the next section under products. Many of these materials are available on state Web sites.

 

Table 25. Training in Developmental Stage

Alabama

Training plans are being finalized at this time. Developers of the instrument, as well as stakeholders including parents, will be involved in training seminars.

Arkansas

We have training as part of the scope of work with the contractor, and over time, ADE personnel will assist with training, too.

California

Training is scheduled for spring and summer 2000.

Idaho

Awareness level presentation has been made at the annual Special Ed Director's Conference. Statewide training will begin in Sept. 2000.

Illinois

Pilot teacher training was held the last week of March 2000. Feedback sessions with pilot teachers will be held in June. Necessary changes will be made at that time, with initial implementation to begin in fall 2000. Teacher training will be held in the fall.

Iowa

Training is being planned for the 2000-2001 school year.

Louisiana

Statewide training took place for the field test and we will provide regional training sessions this fall.

Massachusetts

Training may begin as early as winter-spring 2000.

Minnesota

State-initiated training sessions on statewide assessment are continually occurring. To some degree, alternate assessment is covered in each session. The focus is on accommodations versus modifications, with some mention of alternate assessment. We will be preparing a better module for these sessions. We will provide numerous inservice sessions on alternate assessment that target special education administrators, teachers, regional testing staff, etc.

Nebraska

We are currently piloting the alternate assessment in 10 districts with 75-100 students. Feedback from this pilot will be used to determine the success of our rubrics and to complete the technical assistance guide. The guide will be disseminated statewide in July 2000, with regional training provided from August to October 2000.

New Jersey

We have been informing local districts that there will be an assessment and of our development plan. This has been within the context of general training on statewide assessment and updates on IDEA implementation.

New Mexico

State department staff will do training of trainers. Trainers will be staff of state regional cooperatives, parents, and staff of IHEs. Trainers will be funded.

North Dakota

Department staff, MPRRC staff, and members of the alternate assessment design team/workgroups will be involved in the statewide training.

Oklahoma

Field study will begin in August 2000. Training will occur in early August for these sites. Awareness training will occur statewide this fall.

South Dakota

We are still in the process of developing this component of the process. We anticipate that we will do a statewide implementation of the alternate assessment process by conducting inservice training activities in several locations throughout the state and using our distance learning system if it is operational by the time we need it.

Texas

This is at the initial stage, since the field test for the new state instrument is not scheduled until spring 2000. The Steering Committee is working with the Divisions of Student Assessment and Accountability and with the Division of Special Education on the implementation of the training design and activities. Training will be coordinated with the statewide system's Regional Education Service Centers. This has proven effective in previous statewide training/dissemination activities to assist school districts in implementing new policy requirements.

 

Table 26. Training Initiatives Underway

Colorado

During the 1998/99 school year, we laid the groundwork with a trainer-of-trainers model. State department staff, district staff, and parents conducted this training. State department staff will assist each district that sent a team to be trained, if it chooses, in conducting trainings within the district. Training specifically related to the alternate assessment will be conducted in spring 2000, when the pilot is initiated.

Connecticut

Special Education and Assessment staff have designed and offered state-wide training on the design of the Alternate Assessment, eligibility criteria, etc.

Delaware

State department staff conduct the training. Eleven teachers who participated in the pilot studies are district consultants providing technical assistance.

Florida

Training is provided on the process of performance-based assessment. Training is conducted by a team representing the state level initiative and teachers provide technical training on components such as writing, scoring criteria, data collection, rating student progress, etc.

Georgia

We conducted a train-the-trainer program throughout the state and trained teams from all LSSs, who are now training their stakeholders.

Indiana

We have currently trained over 1000 teachers and nearly 200 technology coordinators across the state. We have developed a two-day training module that has been implemented by four "lease teachers" (former pilot teachers). These teacher trainers have traveled across the state this year providing training and support to teachers, administrators, and parents. We are also using the training module to instruct preservice teachers about performance-based assessment and the use of multi-media technology in capturing progress data.

Kansas

A large-scale training-of-trainers workshop was held in February 2000, with over 200 individuals attending and almost every school district or cooperative represented. Each person attending received a notebook with all of the training materials they will need when they return to their local district or cooperative to provide training. There will be a follow-up training in early fall to provide the information that we learned from the spring 2000 pilot.

Kentucky

The state department of education contracts with the University of Kentucky (UK) to run the alternate portfolio assessment project. Communication regarding training is usually sent from UK to special education directors and district assessment coordinators through the Department of Education. Training dates and other specifics may also be found on our website under the News & Updates section.

Massachusetts

The department of education provides regional awareness and training activities to educators and parents at least twice each year. A steady flow of mailings, Internet postings, and public meetings has made local educators and administrators aware of the requirement to include all students with disabilities in the state assessment system, including the use of test accommodations, and of who should receive alternate assessments. A team of DOE staff, contractors, and members of the Advisory Committee typically provides training.

Michigan

Tryout and pilot sites participate in a series of workshops that provide information on the alternate assessment and how to administer it. These workshops are facilitated by the Alternate Assessment Workshop Facilitators. Currently, full-day awareness workshops are being developed on Addressing the Unique Educational Needs of Students with Disabilities (AUEN), the foundation document for Michigan's alternate assessment. The plan is to offer these workshops on a regional basis across the entire state. Once there is an understanding of the AUEN foundation document, awareness workshops on the alternate assessment itself will take place regionally across the state beginning in 2000/2001. Sustained learning opportunities related to both the AUEN and the alternate assessment will also be offered statewide; these had not been developed as of April 2000.

Missouri

This year we validated our approach with about 500 students in a voluntary administration. We held 5 regional training sessions for teachers within the voluntary group.

New Hampshire

The contractor trained participants in the Pilot during April, 2000. In the fall, more training will occur for statewide implementation. Contractors have conducted general awareness trainings about the alternate provisions at various workshops.

Oregon

Assessments and professional development modules have been developed and field-tested for both the CLREAS and the CIM-Ex. Trainings have been conducted in all eight regions of Oregon and training-of-trainers workshops have been completed for the 2000 field test.

Tennessee

Statewide training for the pilot was held in September 1999, with on-site technical assistance training throughout the 1999-2000 school year. Statewide training for full implementation will begin in September 2000, and follow-up statewide training for full implementation is planned for December 2000 through January 2001. Training is provided by State Department personnel, ILSSA personnel, and Vanderbilt (subcontractor) personnel.

Utah

Pilot testing of the alternate assessment instrument has been completed. Teacher training has begun and will be completed by Autumn 2000.

Vermont

We provide training and information through our website, videos, many documents, interactive television, conferences, and regional workshops.

Virginia

We have conducted one-day field test training for teachers and administrators involved with the spring field test. We have also conducted four regional awareness presentations for directors of testing and special education. We anticipate conducting statewide training from August through November for teachers in each of the eight superintendents' regions.

West Virginia

Training has occurred with the personnel of several LEAs involved in the Alternate Assessment Pilot Process. In addition, various presentations have been conducted throughout the state explaining the proposed Alternate Assessment process.

Wisconsin

State department staff have been involved in providing inservice throughout the state. This has been a collaboration between the Special Education Team and the Office of Educational Accountability at the department of education.

 

Products

Several states have Web sites and written documents containing information about their alternate assessments (see Table 27). Products from many states are under construction or in draft form for pilot or field testing; these are not listed here or included on Web sites, although they will be over the course of the next year. Contact people for each state are listed online at the top of the survey and include a variety of state department personnel: directors of special education, special education consultants, and assessment directors or consultants.

 

Table 27. Alternate Assessment Resources from States 

Alaska

·   Performance standards based on the Alaska Content Standards and approved by the Alaska State Board of Education & Early Development

·   Participation Guidelines for Alaska Students in State Assessments

·   WEBSITE:  http://www.eed.state.ak.us/tls/assessment/AKParticipationGuide.pdf

California

·   Guideline document that contains definitions of functional domains commonly covered in the instruction of students with severe disabilities

·   WEBSITE: http://www.cde.ca.gov/spbranch/sed/altassmt.pdf

Colorado

·   1997 Access Skills Framework

·   1998 Key Components of Reading, Writing and Math Standards

·   1998 "Alternate Assessment Issues and Practices (State Practice: Colorado)" by Mid-South Regional Resource Center

·   1998 "What Gets Tested, Gets Taught; Who Gets Tested, Gets Taught: Curriculum Framework Development Process (State Practice: Colorado)" Mid-South Regional Resource Center

·   1999 Expanded Standards for Literacy

·   A School and District Assessment Coordinators' Manual is produced for each assessment, along with a "Creating a Standards-Driven IEP" trainer's package of materials, including a presentation guide, script, overheads, and handouts. The guidelines for participation are published in each edition of the manual

·   WEBSITE: http://www.cde.state.co.us

Connecticut

·   Assessment Guidelines containing an overview of the alternate assessment process and the necessary forms for accommodations

·   WEBSITE:  http://www.state.ct.us/sde

Delaware

·   "Standards for Functional Programs in Delaware" (draft). This document contains fourteen content standards adopted from the Delaware Academic Content Standards and eighteen functional standards developed by the Alternate Assessment Design Committee

·   Eligibility Guidelines Checklist (draft)

·   CONTACT: mmieczkowski@state.de.us

Florida

·   “Performance Assessment System for Students with Disabilities” - three manuals to address the needs of students at mild, moderate, and severe levels

Georgia

·   Training manual and PowerPoint presentation. Each LSS was given a training notebook with permission to copy and use. Training manual is available in limited quantity through the Division for Exceptional Students

·   CONTACT: P. Paulette Bragg, pbragg@doe.k12.ga.us

Idaho

·   "Alternate Standards" includes the general education standard, content knowledge and skills at the basic and functional levels, and samples of applications related to the general education standards in the areas of Language Arts/Communication, Math, Social Studies, Science, and Health

·    WEBSITE: http://www.sde.state.id.us/SpecialEd/ Listed under "Publications": Idaho's Alternate Assessment Guide Manual

Indiana

·   The standards and proficiencies that are included in our assessment are accessible through our IASEP computer software. Our 100 core items that are used for leveling purposes can be accessed through several papers which are available on our website

·   Stakeholder meeting minutes and products are available through our website

·   Brochures, newsletters, listservs, and the website have been used to communicate our progress and solicit feedback

·   CONTACT: Mel Davis, meldavis@purdue.edu

·   Parent liaison, Karen Dodson, has been instrumental in communicating with parents across the state and can be reached at 1-800-528-8246

Kansas

·   “Extended Curricular Standards”

·   CONTACT: Lynnett Wright, lwright@ksbe.state.ks.us

Kentucky

·   Teacher's Guide addresses standards in several places

·   WEBSITE:  www.ihdi.uky.edu/projects

Louisiana

·   “General Education Access Guide” includes forms, checklists, reports, example IEPs, etc.; participation criteria and instructions; the alternate assessment itself; District and School Test Coordinator Manuals; Teachers Guide to Statewide Alternate Assessment 

·   WEBSITE:  http://www.doe.state.la.us/DOE/asps/home.asp

·   CONTACT: cprice@mail.doe.state.la.us

Maine

·   Maine Learning Results Standards

·   WEBSITE:  http://www.state.me.us/education.htm

Massachusetts

·   Awareness materials such as newsletters, commonly-asked questions

·   MCAS Participation Requirements publication (updated annually)

·   Downloadable PowerPoint presentation

·   A training manual and curriculum guide for the state's alternate assessment have been developed and are currently being field-tested

Michigan

·   “Addressing Unique Educational Needs of Students with Disabilities” (AUEN) may be purchased through the Center for Educational Networking (CEN) at 1-800-593-9146

·   Proposed Plan for the Development of Alternate Assessment for Students Receiving Special Education Services

·   WEBSITE: http://www.mde.state.mi.us/off/sped/admin_suite/adm_suite.html

·   CONTACT: Peggy Dutcher MDE/OSE/EIS P.O. Box 30008 Lansing, Michigan 48909

Minnesota

·   WEBSITE: www.cfl.state.mn.us/speced/speced.htm

New Hampshire

·   PowerPoint Presentation for general awareness training

·   Q & A document

New Mexico

·   FAQ document

·   Guiding Principles of the Alternate Assessment Design Team

·   Participation Criteria/Eligibility Guidelines

·   Performance Indicators for Math, Language Arts, Science and Social Studies

·   Alternate Assessment Pilot Test

·   CONTACT: Ruth LeBlanc, rleblanc@sde.state.nm.us

New York

·   "The Learning Standards and Alternate Performance Indicators for Students with Severe Disabilities"

·   A definition of who is a student with a severe disability and of the alternate performance indicators is also in Part 100 of the Commissioner's Regulations

·   WEBSITE: www.nysed.gov

North Carolina

·   The standards, called domains and competencies, are listed in the alternate assessment portfolio document

·   CONTACT: W. David Mills, dmills@dpi.state.nc.us

North Dakota

·   Introduction to Expanded Standards and Expanded Standards for Mathematics and English Language Arts (draft)

·   Guidance on Participation of Students in Statewide/Districtwide Assessments

·   WEBSITE: www.dpi.state.nd.us/dpi/speced/index.htm

Oregon

·   WEBSITE: http://www.ode.state.or.us/opte/CAM/crls.pdf

Tennessee

·   Everything we have produced will be on Tennessee's website in the fall of the 2000-2001 school year

·   WEBSITE: www.state.tn.us/education/msped.htm

·   CONTACT: Ann Sanders, asanders@mail.state.tn.us

Texas

·   WEBSITE: http://www.tea.state.tx.us/tea/curric.html

 

Vermont

·   Eligibility Documentation, Protocol, Portfolio Materials & Scoring Rubrics, Report on Field Testing, Cost Analysis, Eligibility Documentation Protocol

·   WEBSITE: http://www.state.vt.us/educ/cses/alt/memo/htm

Virginia

·   Field test manual used during field test training. This manual includes the following sections: 1) Introduction, 2) Selecting the Student, 3) Articulating the Standards (may be removed for public dissemination), 4) Components of the Collection of Evidence, 5) Methods of Assessment, 6) Developing Entries, 7) Recommended Procedures, 8) Q and A, 9) Appendix

West Virginia

·   “West Virginia’s Alternate Assessment Framework: Linking Instructional Goals and Objectives with Adaptive Skills”

Wisconsin

·   Information Update Bulletin No. 98.14 explains the alternate assessment process in Wisconsin

·   PowerPoint presentations on accommodations and alternate assessment are on the website

·   WEBSITE: www.dpi.state.wi.us/dpi/dlsea/een (click on Testing and Assessment)

·   CONTACT: Sandra Berndt, Education Consultant at 608/266-1785

Wyoming

·   Policies for the Participation of ALL Students in District and Statewide Assessment and Accountability Systems

 

Following are two excellent examples of Web site information on alternate assessment:

Vermont Education Matters
120 State Street - Montpelier, VT 05620-2501 - (802) 828-3147 (fax) 828-3140

·         Memo with Important Ordering Information for Assessment Accommodations and Alternate Assessment 

·         Eligibility Documentation for Alternate Assessment

·         Alternate Assessment Options and Examples

·         What is a Lifeskills Assessment?

·         How can we include ALL students in assessments?

·         Accommodation Planning Resources

·         Assessment Participation and IEP's

 

Wisconsin Department of Public Instruction
Students with Disabilities and Statewide Assessment

Since the reauthorization of IDEA, the Special Education Team has been working in conjunction with the Office of Educational Accountability to formulate state policy regarding the inclusion of students with disabilities in statewide assessments. Listed below are links to several recent publications by the Department regarding statewide assessment and students with disabilities. The Department recognizes that educators and parents alike have many questions regarding the implementation and implications of the legislative changes for the education of all students. As such, the Department will be publishing several additional resources to assist educators and parents in facilitating the participation of students with disabilities in state assessments. These resources will be posted on this site as they are released, so we encourage you to periodically return to this page to see what’s new!

Policy Statements/Information

·   Bulletin 98.14: Guidelines for Complying with the Assessment Provisions of the Individuals with Disabilities Education Act

·   Guidelines to Facilitate the Participation of Students with Special Needs in State Assessments

·   Understanding How Scores are Reported for Students with Disabilities on the Proficiency Summary Reports for the Wisconsin Knowledge and Concepts Examinations, 5/99

·   APIs: Alternate Performance Indicators, 5/2000

Brochures

·   What Every Parent of a Student with a Disability Needs to Know About Participation in Assessment Programs and Testing Accommodations

·   Assessing All Students: What Every Teacher Needs to Know

PowerPoint Presentations for Training Purposes

·   Participation of Students with Special Needs in State Assessments (Spring 1999)

·   Educational Assessment: The Decision-Making Process

·   Inclusion of Students with Disabilities in State & District Assessments (January 2000)

Links to Related Sites

·   CESA #1 Assessment Project

·   Office of Educational Accountability

Test Results

·   Knowledge and Concepts Examination Test Results for 1997-98. This provides statewide and district/school statistics by gender, race/ethnicity, limited English proficiency, migrant status, disability status, and economically disadvantaged status.

·   Knowledge and Concepts Examination Test Results for 1998-99. This provides statewide and district/school statistics by gender, race/ethnicity, limited English proficiency, migrant status, disability status, and economically disadvantaged status.


Discussion

States have come a long way in the development of alternate assessments in less than three years—from the introduction of the term “alternate assessment” in IDEA 97, to the development and implementation of a whole new assessment system. As this report shows, state department personnel, local district personnel, parents, and advocates did all of this work with little federal direction, and with tremendous commitment to the students they serve. States have been guided in this effort through the establishment of guiding principles, many of which advocated for opportunities for students with significant disabilities to access and be counted in statewide systems of standards, assessment, and accountability. It is important to note that we are simply reporting what is occurring in the development of alternate assessments, not necessarily endorsing all of the approaches that have been taken. States will want to continually reassess what they are doing, as well as the intended and unintended consequences of their approaches to alternate assessments.

Nearly all states included both special education and general education assessment personnel in the development of their alternate assessment systems. In some states, the initiative to develop alternate assessments was led by state assessment personnel, in others it was led by special education personnel, and in a few it was truly a joint effort. Whatever the case, the effort to develop a more inclusive assessment system through the use of alternate assessments may be a first and very important step toward the goal of a more inclusive educational system, with high standards for ALL students. Burgess and Kennedy (1998) found that stakeholders involved in the development of alternate assessments have discovered a “renewed sense of professional pride, an increased awareness of the vocation, and an increased appreciation for the dedication and commitment to the success of all students.”

Many states have developed written participation criteria that include guidelines for participation in alternate assessments, rather than for exemption. Previous NCEO documents listed criteria on which NOT to base participation decisions (i.e., placement, attendance, special education category, behavior—see Elliott, Thurlow, & Ysseldyke, 1996), but states are now shifting to more specific criteria for what participation decisions SHOULD be based on (i.e., participation in the general education curriculum, graduation expectations, skills a student is working on, and how much support a student needs). This information should be easily accessible and usable by local IEP teams.

Most alternate assessments are designed as alternates to general education large-scale assessments, and are based on general education standards. A few states took a more separatist approach and developed assessments for students with significant disabilities that have no connection or link to general education standards or assessments. These states will need to carefully consider what their new assessments are an alternate to.

States plan to use a variety of assessment approaches, with portfolio assessments being the most popular. No matter what the standards, connection to general education, or assessment approach selected, the actual indicators on which alternate assessment participants are assessed are very similar across states. These indicators include a variety of functional skills oriented to successful independent living outcomes.

Measures of proficiency (scoring) actually vary more across states than assessment approaches. Not only is there variation among states in the labels assigned to their measures of proficiency, but also in exactly what is measured by the alternate assessment. With this much variation among states, it will be difficult to obtain a national picture of alternate assessment performance.

Although most states have determined proficiency levels for their alternate assessments, many have not yet considered who will actually do the scoring, nor have they designed procedures for scoring. The reliability and validity of these scoring procedures may be weak at first, until states have a chance to establish exemplars and anchors upon which to base performance decisions, and scorers receive sufficient training to make valid and reliable judgments.

This report addresses the status of alternate assessment systems from participation decisions through reporting. However, it does not address the next important set of questions dealing with the actual use of the alternate assessment results. For example, Maryland reported, “Results should assist the school in improving programs for students and allow the state to be accountable for all students.” It will be important that states not decide that they have completed the alternate assessment process once the data are collected and “turned in to the state.” How the data are reported and used depends greatly on the purpose of the assessment and on the stakes set for students and states within the accountability system. As alternate assessments are implemented statewide, determining practical uses for the results, beyond just meeting the letter of the law, will be critical.

NCEO plans to “reinvent” this survey shortly after alternate assessment implementation requirements go into effect July 1, 2000. We plan to continue to keep track of where states are as they work toward statewide implementation, examine the development of state processes for evaluating their alternate assessments, and ask states about the usefulness of their alternate assessments. The technology for this type of survey is becoming more sophisticated and easier to use, to make the process of updating and viewing information even easier. Survey results will continue to be available at NCEO’s Web site, located at: http://education.umn.edu/NCEO.


References

American Federation of Teachers. (1999). Making standards matter 1999: An update on state activity. Washington DC: Author.

Burgess, P., & Kennedy, S. (1998). What gets tested gets taught, who gets tested gets taught: Curriculum framework development process. Lexington, KY: University of Kentucky, Mid-South Regional Resource Center.

Elliott, J., Thurlow, M., & Ysseldyke, J. (1996). Assessment guidelines that maximize the participation of students with disabilities in large-scale assessments: Characteristics and considerations (Synthesis Report 25). Minneapolis: University of Minnesota, National Center on Educational Outcomes.

Guy, B., Shin, H., Lee, S., & Thurlow, M. (1999). State graduation requirements for students with and without disabilities (Technical Report 24). Minneapolis: University of Minnesota, National Center on Educational Outcomes.

Olsen, K. (1998). What principles are driving development of state alternate assessments? Lexington, KY: University of Kentucky, Mid-South Regional Resource Center.

Olson, J. F., Bond, L., & Andrews, C. (1999). State student assessment programs: Data from the annual survey. Washington, DC: Council of Chief State School Officers.

Thompson, S., Erickson, R., Thurlow, M., Ysseldyke, J., & Callender, S. (1999). Status of the states in the development of alternate assessments (Synthesis Report 31). Minneapolis: University of Minnesota, National Center on Educational Outcomes.

Warlick, K., & Olsen, K. (1999). How to conduct alternate assessments: Practices in nine states. Lexington, KY: University of Kentucky, Mid-South Regional Resource Center.

Ysseldyke, J. E., & Olsen, K. (1997). Putting alternate assessments into practice: What to measure and possible sources of data (Synthesis Report 28). Minneapolis: University of Minnesota, National Center on Educational Outcomes.

Ysseldyke, J. E., Olsen, K., & Thurlow, M. L. (1997). Issues and considerations in alternate assessments (Synthesis Report 27). Minneapolis: University of Minnesota, National Center on Educational Outcomes.


Appendix A

Alternate Assessment in the Individuals with Disabilities Education Act

 

A.  IN GENERAL.—Children with disabilities are included in general State and district-wide assessment programs, with appropriate accommodations, where necessary. As appropriate, the State or local educational agency—

 

(i)      develops guidelines for the participation of children with disabilities in alternate assessments for those children who cannot participate in State and district-wide assessment programs; and

(ii)      develops and, beginning not later than July 1, 2000, conducts those alternate assessments.

 

B.             REPORTS.—The State educational agency makes available to the public, and reports to the public with the same frequency and in the same detail as it reports on the assessment of nondisabled children, the following:

 

(i)      The number of children with disabilities participating in regular assessments.

(ii)      The number of those children participating in alternate assessments.

(iii)      (I) The performance of those children on regular assessments (beginning not later than July 1, 1998) and on alternate assessments (not later than July 1, 2000), if doing so would be statistically sound and would not result in the disclosure of performance results identifiable to individual children.

 

(II) Data relating to the performance of children described under subclause (I) shall be disaggregated

 

(aa) for assessments conducted after July 1, 1998; and

(bb) for assessments conducted before July 1, 1998, if the State is required to disaggregate such data prior to July 1, 1998. [PL 105-17, Section 612(a)(17)]


IDEA Regulations Pertaining to Standards and Assessment

300.137 Performance goals and indicators.

            The State must have on file with the Secretary information to demonstrate that the State –

(a)  Has established goals for the performance of children with disabilities in the State that -
(1) Will promote the purposes of this part, as stated in 300.1; and

(2) Are consistent, to the maximum extent appropriate, with other goals and standards for all children established by the State;

(b)  Has established performance indicators that the State will use to assess progress toward achieving those goals that, at a minimum, address the performance of children with disabilities on assessments, drop-out rates, and graduation rates;

(c)  Every two years, will report to the Secretary and the public on the progress of the State, and of children with disabilities in the State, toward meeting the goals established under paragraph (a) of this section; and

(d)  Based on its assessment of that progress, will revise its State improvement plan under subpart 1 of Part D of the Act as may be needed to improve its performance, if the State receives assistance under that subpart.

300.138 Participation in assessments.

            The State must have on file with the Secretary information to demonstrate that –

(a) Children with disabilities are included in general State and district-wide assessment programs, with appropriate accommodations and modifications in administration, if necessary;
(b) As appropriate, the State or LEA –

      (1)       Develops guidelines for the participation of children with disabilities in alternate assessments for those children who cannot participate in State and district-wide assessment programs;

      (2)       Develops alternate assessments in accordance with paragraph (b)(1) of this section; and

      (3)       Beginning not later than July 1, 2000, conducts the alternate assessments described in paragraph (b)(2) of this section.

300.139 Reports relating to assessments.

(a)  General. In implementing the requirements of 300.138, the SEA shall make available to the public, and report to the public with the same frequency and in the same detail as it reports on the assessment of nondisabled children, the following information:
(1) The number of children with disabilities participating –

(i) In regular assessments; and

(ii) In alternate assessments.

(2) The performance results of the children described in paragraph (a)(1) of this section if doing so would be statistically sound and would not result in the disclosure of performance results identifiable to individual children –

(i) On regular assessments (beginning not later than July 1, 1998); and

(ii) On alternate assessments (not later than July 1, 2000).

(b)  Combined reports. Reports to the public under paragraph (a) of this section must include–

(1) aggregated data that include the performance of children with disabilities together with all other children; and

(2) disaggregated data on the performance of children with disabilities.

(c)  Timeline for disaggregation of data. Data relating to the performance of children described under paragraph (a)(2) of this section must be disaggregated –
      (1) For assessments conducted after July 1, 1998; and
      (2) For assessments conducted before July 1, 1998, if the State is required to disaggregate the data prior to July 1, 1998.

300.346. Development, review, and revision of IEP.

(a)      Development of IEP. (1) General. In developing each child’s IEP, the IEP team shall consider –
(iii) As appropriate, the results of the child’s performance on any general State or district-wide assessment programs.

300.347 Content of IEP.

(a)  General. The IEP for each child with a disability must include –

(5)         (i) A statement of any individual modifications in the administration of State or district-wide assessments of student achievement that are needed in order for the child to participate in the assessment; and

(ii) If the IEP team determines that the child will not participate in a particular State or district-wide assessment of student achievement (or part of an assessment) a statement of –

                  (A) Why that assessment is not appropriate for the child; and
                  (B) How the child will be assessed.

Analysis of Comments and Changes

If IEP teams properly make individualized decisions about the participation of each child with a disability in general State or district-wide assessments, including the use of appropriate accommodations, and modifications in administration (including individual modifications, as appropriate), it should be necessary to use alternate assessments for a relatively small percentage of children with disabilities.

Alternate assessments need to be aligned with the general curriculum standards set for all students and should not be assumed appropriate only for those students with significant cognitive impairments.

In order to ensure that students with disabilities are fully included in the accountability benefits of State and district-wide assessments, it is important that the State include results for children with disabilities whenever the State reports results for other children. When a State reports data about State or district-wide assessments at the district or school level for nondisabled children, it also must do the same for children with disabilities. Section 300.139 requires that each state aggregate the results of children who participate in alternate assessments with results for children who participate in the general assessment, unless it would be inappropriate to aggregate such scores.


 Appendix B

Survey Instrument

To view the survey instrument, go to http://education.umn.edu/NCEO/survey.htm


Appendix C

Summary Data from Other Educational Units

Ten other educational units receive federal special education funding. They include American Samoa, Bureau of Indian Affairs, Micronesia, Guam, Marshall Islands, Mariana Islands, Palau, Puerto Rico, Virgin Islands, and Washington DC. Survey results have been received at least once from five of these units; two have updated their information in the past three months. As indicated in Table C1, other educational units receiving federal special education funds are still, for the most part, at the beginning stages of developing their alternate assessments.

 

Table C1. Summary of Alternate Assessment Features Addressed by Educational Units

| State | Stakeholders | Participation Guidelines | Aligned with State Standards | Approach | Proficiency Levels | Reporting | Training | High Stakes |
|---|---|---|---|---|---|---|---|---|
| American Samoa |  |  | Subset | X |  | X |  |  |
| Bureau of Indian Affairs |  |  | Combination | X |  |  |  |  |
| Marshall Islands |  |  | Subset |  |  |  |  |  |
| Virgin Islands | X | X | Additions | X | X | X | X |  |
| Washington DC |  |  | Subset | X |  |  |  |  |


Standards

All of the educational units have determined the nature of the standards for their alternate assessments, and all but one have determined the approach that will be used to gather alternate assessment data. As with the 50 states, most of the educational units are basing the alternate assessment on the state standards, either by adding to them or by selecting a subset of them. The educational units provide the following additional information about standards:

American Samoa:  The standards will be a subset of those applied to general education.

Bureau of Indian Affairs:  The standards will be a combination, reflecting the varied states involved.

Marshall Islands:  The standards are a subset of those applied to general education.

Virgin Islands:  The standards will include those applied to general education with some additions.

Washington, DC:  The standards will be a subset of those applied to general education.

 

Approach

As for the approaches that will be used to collect data, three educational units provide additional information:

American Samoa:  Selected approaches include observation, written and face-to-face interviews, and analysis of existing data.

Bureau of Indian Affairs:  Reviewing portfolio systems proposed for general education agency-wide use.

Virgin Islands:  The assessment instrument is in the final revision stage with an expected completion date of June 29.

Stakeholders, Proficiency Levels, Training

Other information from the educational units is less consistently available.  For example, only the Virgin Islands indicated that stakeholders were involved, that proficiency levels had been set, and that training was being developed.  The responses of the Virgin Islands for these aspects of the alternate assessment were as follows:

Stakeholders:  The process began with a joint task force consisting of special and general educators and administrators, parents, and community members.  The next phase involved the assistance of the Southeast Regional Resource Center and a pared-down committee.

Proficiency Levels:  The process and reporting forms are still in draft form.

Training:  Training will take place upon the opening of school.  The SEA along with members of the Assessment Task Force, SERRC, and the PTI will conduct the sessions.

 

Participation Guidelines

Both the Marshall Islands and the Virgin Islands indicated that participation guidelines were in development.  Specific responses about these guidelines were:

Marshall Islands:  Guidelines have been researched; the unit is waiting for the committee to approve the policy and for the forms to be printed.

Virgin Islands:  The information is still in draft form.  It is aligned with the standardized testing requirements.

 

Reporting

Both American Samoa and the Virgin Islands provided additional information about reporting.  Their specific comments were:

American Samoa:  Reporting procedures will be determined with much assistance from the University of Oregon.

Virgin Islands:  This phase involved the assistance of the Southeast Regional Resource Center and a committee.