Instructional and Assessment Accommodations in Maryland


NCEO Maryland/Kentucky Report 6

Published by the National Center on Educational Outcomes

Prepared by:

Jim Ysseldyke • Martha Thurlow • Allison Seyfarth • John Bielinski • Mark Moody • John Haigh

December 1999


This document has been archived by NCEO because some of the information it contains is out of date.


Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Ysseldyke, J., Thurlow, M., Seyfarth, A., Bielinski, J., Moody, M., & Haigh, J. (1999). Instructional and assessment accommodations in Maryland (Maryland/Kentucky Report 6). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved [today's date], from the World Wide Web: http://education.umn.edu/NCEO/OnlinePubs/MdKy6.htm


Overview

Maryland’s School Performance Program (MSPP) is an accountability system that is held up as a model nationwide (Ysseldyke et al., 1996). It is a system based on assessments throughout a student’s schooling, including the Maryland School Performance Assessment Program (MSPAP) at grades 3, 5, and 8, as well as the Maryland Functional Testing Program (MFTP), a graduation test first given at the end of grade 6. During the late 1990s, Maryland was also piloting its Independence Mastery Assessment Program (IMAP), a performance assessment for students with significant disabilities. Additional data, such as student characteristics, financial information, and kindergarten completion, are included with the assessment data to describe the progress of Maryland’s schools. Further details on the assessment system are available in Ysseldyke et al. (1996).

Students with disabilities in Maryland are required to participate in the state assessments (unless individually exempted), and with the implementation of IMAP, all students will be required to participate in some part of the assessment system. Students with disabilities often need accommodations to participate in large-scale assessments (Thurlow, Elliott, & Ysseldyke, 1998; Thurlow, Seyfarth, Scott, & Ysseldyke, 1997). An accommodation is a change in the test or testing environment intended to remove the effect of a disability from a student’s performance on an assessment. In developing its policy on accommodations, Maryland differentiated the appropriateness of each accommodation by the type of test being taken, in order to make clear whether the score would be invalidated by the accommodation, or whether the accommodation was available at all in a particular test (see Appendix B for a list of accommodations).

One of the difficulties in making decisions about accommodations is the lack of appropriate research on accommodations (Thurlow et al., 1997). Many states and policymakers have begun to endorse the idea that in order for students to receive assessment accommodations, they must already be receiving the accommodations in instruction (Thurlow et al., 1997). Further, each accommodation must be justified by the specific instructional needs a student has, and the decision about accommodations should be made by someone who knows the student (Elliott, Thurlow, & Ysseldyke, 1996).

Given that Maryland has had a well-developed accountability system for years, and that it has a reputation for trying to be as inclusive of students with disabilities as possible, it is an ideal place to examine the actual implementation of accommodations policies. There are many questions yet to be answered in this area. Some of the pressing issues that can be addressed by a study in Maryland include the relationship between instructional and testing accommodations, as well as the relationship between accommodations and a number of other variables, such as school grades. The study reported here was designed to examine the relationship between instructional and assessment accommodations for the Maryland state tests. The results of this examination begin to answer whether accommodations are being provided as intended in the system—a way to “level the playing field” for students with disabilities.


Study Procedures

Participants

The Individualized Education Programs (IEPs) of 280 students were examined for the study. All students were on active IEPs at the time that their records were pulled for analysis. Additional details on the characteristics of the students are included in the Results section.

Four Local Education Agencies (LEAs) were identified to participate in the study. These LEAs allowed teams from the project to examine their IEPs. One of the LEAs had a population of more than three quarters of a million people and encompassed a growing, increasingly diverse mix of urban, suburban, and rural communities. Another LEA had a population of more than one quarter million people and was located between two large urban areas. The third LEA, with a population of slightly less than one quarter million, was the largest of the four LEAs. The fourth participating LEA was the smallest, with a population of about 150,000. This LEA was in a mostly rural area north and west of a major metropolitan area.

Instrument Development

The Maryland State Department of Education (MSDE) used a focus group to assist in survey development. The focus group was comprised of Department of Special Education (DSE) staff and LEA teachers, administrators, and school psychologists. The draft survey created by the focus group was revised by staff at the National Center on Educational Outcomes (NCEO), with input from the Kentucky State Department of Education. A copy of the data collection instrument can be found in Appendix A.

Data Collection

The focus group reviewed several possible methods for collecting data on accommodations and performance indicators from students’ Individualized Education Programs (IEPs). The group decided that a coordinator would oversee the training of four pairs of teachers during the summer; these teachers would gather data. For ease of collection, one team member would be from the target LEA and familiar with that district’s IEP process and forms.

MSDE and the coordinator trained the teachers in the use of the instrument, the intent of the project, and the procedures to be used, using examples from their LEA’s IEPs. Each team member was given a chance to review the instrument using his or her LEA’s IEP forms, and to explain the IEP form to other members of the team. Team members were reimbursed for their participation.

Data were collected in each of the LEAs during a two-week period in the summer. Since the Maryland School Performance Assessment Program (MSPAP) covers elementary and middle school (grades 3, 5, and 8), data were collected for grades 1-8.

Part of the intent of the study was to determine whether students were receiving appropriate accommodations, and part was to examine the relationship between school achievement and MSPAP scores. Data relevant to both of these goals were collected. Accommodations used both before and after the state assessment were examined, and course grades and other indicators of achievement were reviewed. This was easier in grades 1-6 because of the location of the student relative to the place of assessment. For example, students tested on MSPAP in the third grade at an elementary school could have their IEPs followed for up to two years prior to and following the assessment. This would be more difficult for fifth graders because they would most likely move to a different school building for middle school and would have their IEPs and other records moved with them; the same issue was true for eighth graders. Teams collected the data and submitted forms to the coordinator for review. When there were questions or unclear areas, student IEPs were re-reviewed.

 

Reporting

Each coordinator submitted the raw data and a summary of all data to MSDE. MSDE copied the forms, and then submitted them to NCEO for analysis.

Maryland’s accommodations for statewide assessments are divided into five categories: scheduling, setting, equipment/technology, presentation, and response. These accommodations are further subdivided into specific allowable accommodations for each statewide assessment (Maryland Functional Testing Program [high stakes for students], CTBS/5 [norm-referenced assessment], and Maryland School Performance Assessment Program [high stakes for schools]). See the chart in Appendix B for Maryland’s specific accommodations policies.

Accommodations available for state assessments are identified at the IEP team meeting and are used for both instruction and assessment. It is possible for a student to receive accommodations on his or her IEP that are used in all content areas. For example, a student might have a reading or extended-time accommodation that applies to any reading, and this would carry over to all subject areas, such as social studies, math, and science.

Modifications to instruction are changes that result in the student working toward different standards, indicators, or extended indicators. Modifications extend beyond accommodations and generally are not allowed in assessments. See Appendix C for examples of modifications.

In some instances, multiple accommodations are used by a student. For example, the use of Braille usually requires a scheduling accommodation (extended time) and sometimes a setting accommodation (individual administration). As a result, a primary accommodation is often linked to secondary accommodations, creating sets of multiple accommodations.

Specific terms used on the data collection form, such as “not well grounded,” were defined for data collectors via a glossary. The glossary is provided in Appendix D.


Results

Subjects

The students whose records were reviewed all received special education services in Maryland. All were below ninth grade at the time of the study, and most were in third grade or above. Boys were represented in the sample more often than girls (70% and 30%, respectively). The sample had more white students than other racial/ethnic groups, with 77% of the sample identified as white, 13% identified as black, 7% identified as Hispanic, and the remainder reported in other categories of ethnicity or missing data. Three percent of the sample was reported to be enrolled in classes for students who speak English as a second language. Table 1 provides additional detail on demographic information.

 

Table 1. Demographic Characteristics of Students Included in Sample

  N Percent
Grade    
1st and younger 8 2.9
2nd 19 6.9
3rd 33 12.0
4th 44 15.9
5th 35 12.7
6th 53 19.2
7th 39 14.1
8th 45 16.3
Gender    
Male 195 69.6
Female 85 30.4
Race/Ethnicity    
White 194 77.0
Black 32 12.7
Hispanic 17 6.7
Asian-American 5 2.0
American Indian 1 0.4
Multi-ethnic 3 1.1
ESL Status    
In ESL classes 6 3.0
Not in ESL classes 197 97.0

Special Education Characteristics of Sample

Included in the survey were students with a number of primary disabilities. Students identified with specific learning disabilities were most frequently represented in the sample (46%), followed by students with speech and language disabilities (25%), multiple disabilities (12%), and other health impairments (11%). When examining the prevalence of disabilities, both low prevalence (hearing impairments, deafness, visual impairments, other health impairments, multiple disabilities, traumatic brain injury, and autism) and high prevalence (speech and language impairments, specific learning disabilities, and unclassified) disabilities were well represented in the sample (25% and 71%, respectively), while moderate prevalence disabilities (mental retardation, serious emotional disturbance) made up only 4% of the sample. Disability type in the sample was primarily cognitive (84%), followed by physical (13%) and emotional (3%) disabilities. Students included in the sample were mainly receiving their services in the general education classroom (57%), though some students received services in a resource room (21%) or in a separate class (22%). Most students were receiving a moderate intensity of services; intensities two and three (on a six-level scale) were the most often reported. Table 2 provides additional detail on the special education characteristics of the sample.

Table 2. Special Education Characteristics of Students Included in Sample

  N Percent
Primary Disability    
Specific Learning Disability 126 45.7
Speech and Language Impairments 69 25.0
Multiple Disabilities 32 11.6
Other Health Impairment 30 10.9
Severe Emotional Disturbance 8 2.9
Mental Retardation 3 1.1
Visual Impairment 2 0.7
Deaf 2 0.7
Hearing Impairment 1 0.4
Autism 1 0.4
Traumatic Brain Injury 1 0.4
Diagnostic/Not Categorized 1 0.4
Setting Receiving Services    
General Education Class 159 57.4
Resource Room 57 20.6
Separate Class 61 22.0
Intensity of Services    
Intensity I 13 4.7
Intensity II 107 38.9
Intensity III 80 29.1
Intensity IV 42 15.3
Intensity V 33 12.0

 

The content on which the students in this sample were working was primarily reading, writing, language usage, and math. Sixty to ninety percent of the sample had IEP goals in each of those areas. About 12% of the sample had IEP goals in the areas of science or social studies. Fifty-eight percent of the sample had IEP goals in areas other than those listed above (see Table 3 for exact numbers of students with IEP goals in each area). These trends appear to hold true regardless of disability type (physical, emotional or cognitive) or disability prevalence (low, moderate, or high). One exception to this trend is that students with physical disabilities appeared to have goals in language usage less often than did students overall (55% versus 74%). Another exception to the trend is that students with high prevalence disabilities, such as specific learning disabilities and speech and language impairments, tended to have IEP goals in math less often than students overall (37% versus 63%). In general, the likelihood of a student having a goal in an area increased as the intensity of services received increased.

Grades Received

For analysis purposes, letter grades were transformed to a 13-point scale, ranging from 0 for an F to 12 for an A. In examining the grades received by students, the first set of comparisons examined whether the grade received differed as a function of whether a student had an IEP goal in that area. Overall, there were rarely grade differences between students with an IEP goal in an area and other students in the sample. Two exceptions emerged. For grades in reading in 1996/97, students without an IEP goal in reading had significantly higher grades than students with an IEP goal in reading. In math, a similar trend emerged for 1995/96 grades, with students without an IEP goal in math receiving higher grades in math than students who did have a goal in the area.
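
The report states only that the grade scale runs from 0 (F) to 12 (A); it does not specify how plus and minus grades were placed on the 13 points. As a purely illustrative sketch of this kind of conversion and comparison, the Python fragment below assumes one plausible mapping, and the grade lists and the helper name mean_grade are invented for the example; they are not study data or the authors' analysis code.

# Illustrative only: one possible coding of letter grades onto the 0-12
# analysis scale. The report specifies F = 0 and A = 12; the placement of
# the plus/minus grades below is an assumption made for this example.
GRADE_POINTS = {
    "F": 0, "D-": 2, "D": 3, "D+": 4,
    "C-": 5, "C": 6, "C+": 7,
    "B-": 8, "B": 9, "B+": 10,
    "A-": 11, "A": 12,
}

def mean_grade(letter_grades):
    """Convert letter grades to scale points and return their mean."""
    points = [GRADE_POINTS[g.strip().upper()] for g in letter_grades]
    return sum(points) / len(points)

# Hypothetical comparison: students with and without a reading IEP goal.
with_goal = ["C", "C+", "B-", "D+"]
without_goal = ["B", "B+", "A-", "C+"]
print(mean_grade(with_goal), mean_grade(without_goal))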

When comparing grades by the intensity of services a student received, it appeared that students at the lower intensities of services (levels one and two) generally were more likely to earn high grades than students receiving a higher intensity of services. The opposite also was true: students receiving a higher intensity of services (levels three, four, and five) were more likely to earn low grades than students who received low intensity services. The likelihood of earning average grades was lowest for students at the level one intensity of services; those students exhibited a great deal of variability, being relatively likely to earn both low and high grades but less likely to earn average grades. At other levels of intensity, the likelihood of earning average grades did not appear to differ. Grades also were analyzed as a function of accommodation. These results are presented in the next section of this report.

 

Instructional Accommodations

If a student had an IEP goal in an area, it was very likely that the student received an accommodation for instruction in that area. An accommodation to instruction is a change in instruction that does not result in a change in the standards or instructional goals for that student compared to other students. In this sample, 60% of all students had an IEP goal in reading, and 45% of all students received an instructional accommodation in reading (Table 3 shows the numbers of students with instructional accommodations in each area).

Table 3. Frequency of IEP Goals, Instructional Accommodations, and Instructional Modifications

  Frequency of IEP Goals Frequency of Instructional Accommodations Frequency of Instructional Modifications
  N Percent N Percent N Percent
Content            
Reading 168 59.6 119 42.2 129 45.7
Writing 182 54.5 122 43.3 145 51.4
Language Usage 103 36.5 64 22.7 77 27.3
Math 114 40.4 82 29.1 100 35.5
Science 16 5.7 29 10.3 29 10.3
Social Studies 18 6.4 28 9.9 31 11.0
Other 162 67.4 82 29.1 115 40.8

 

The types of instructional accommodations provided to students were somewhat dependent on the content area of instruction. In reading instruction, the most commonly reported accommodations included reading an entire test to the student and reading selected sections or vocabulary. For writing instruction, the most common accommodations included writing answers in the test booklet and dictating to an assistant who transcribed for the student. Use of a calculator was the most common instructional accommodation provided in mathematics. Breaks within a testing session, administering tests across days, and repetition of directions were other common accommodations for this sample, particularly in the areas of speech/language, social/emotional needs, and study skills.

In examining the presence of accommodations in instruction, we found no clear differences by disability prevalence or type. However, some trends did appear in the data. It appeared that students with low prevalence disabilities were more likely to receive instructional accommodations than were students with high prevalence disabilities. Also, students with physical disabilities appeared to receive instructional accommodations more frequently than students with cognitive disabilities. These results need to be interpreted with some caution, since the numbers were too small to perform analyses for students with emotional disabilities or students with moderate prevalence disabilities.

Students who were receiving special education services at levels three and four (moderate intensity) were most likely to have accommodations in specific instructional areas, as compared to students in levels one and two (low intensity) and students in level five (very high intensity). This was true in the content areas of reading, writing, language usage, and other. In the areas of math, science, and social studies, it was more likely that students in levels four and five received instructional accommodations than did students in levels one, two, or three.

Grades earned by students did not differ overall for students who received instructional accommodations compared to students who did not receive instructional accommodations. The only course content and year where there was a significant difference (F=4.62, p=.03) was in social studies in 1995/96. Students who received instructional accommodations in social studies had a higher grade point than students who did not receive instructional accommodations. Table 4 provides details on these data.

Table 4. Tests of Mean Differences in Grade between Students Receiving Instructional Accommodations and Those Who Did Not

Year of Grade Subject F Value P Value No (# of Ss) Yes (# of Ss)
1995/96 Reading 0.0312 0.8602 12 118
Writing 0.0281 0.8671 11 112
Language Usage 0.0101 0.9201 11 107
Mathematics 0.0000 1.0000 12 125
Science 0.9426 0.3334 11 123
Social Studies* 4.6196 0.0334 12 123
Other 0.4480 0.5084 2 30
1996/97 Reading 2.7690 0.0981 18 138
Writing 3.8455 0.0518 16 128
Language Usage 0.0060 0.9383 12 103
Mathematics 0.6188 0.4326 18 145
Science 0.0778 0.7807 17 139
Social Studies 1.9394 0.1567 17 141
Other 3.2659 0.0785 4 37

 * The difference between the course grades of students who received instructional accommodations and those who did not was significant.
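
The comparisons in Table 4 are reported as F and p values for two groups (students with and without instructional accommodations in a subject). A one-way ANOVA is one standard way to produce such statistics; the sketch below is an assumed illustration using invented grade values on the 0-12 scale, not the study's data, and it is not claimed to be the software or code the authors actually used.

# Illustrative one-way ANOVA comparing mean course grades (0-12 scale) between
# students who received instructional accommodations and those who did not.
# The grade lists are invented for demonstration; they are not the study data.
from scipy.stats import f_oneway

grades_with_accommodations = [6, 7, 5, 8, 6, 7, 9]
grades_without_accommodations = [7, 6, 8, 7]

f_stat, p_value = f_oneway(grades_with_accommodations, grades_without_accommodations)
print(f"F = {f_stat:.4f}, p = {p_value:.4f}")
# As in Table 4, a p value below .05 would indicate a statistically
# significant difference in mean grades between the two groups.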

Instructional Modifications

Modifications to instruction are changes in instruction that result in a student working toward a different standard or goal than other students in the grade. In this sample, students were as likely or more likely to receive a modification to instruction as an accommodation to instruction. Details on the frequency of instructional modifications provided to students are included in Table 3.

When the reported modifications are examined closely, it becomes clear why more modifications than accommodations were reported. Frequent modifications listed for students in the area of reading included being in a small group, repeating or restating directions, and modifying curriculum materials and objectives. While modifying materials and objectives could change the standard or goal a student is working toward, small groups and repeated directions are both considered to be accommodations rather than modifications. This pattern holds throughout the “modifications” reported for students. Some, such as modifying curricular objectives and materials and adjusting the workload for a student, are common across content areas and are true modifications. Others, such as working in small groups, are more similar to accommodations.

When examining instructional modifications by disability prevalence, a trend emerged that is similar to that in the instructional accommodations. Students with low prevalence disabilities in this sample were more likely to receive instructional modifications than were students with high prevalence disabilities. However, in examining the likelihood of receiving instructional modifications by type of disability, no clear trends emerged, likely due to small numbers in the analyses. Again, these results need to be interpreted with some caution, especially due to low numbers of subjects with moderate prevalence disabilities and emotional disabilities.

Modifications to instruction were provided to students with increasing frequency as the intensity of special education services they received increased. Students at a level five intensity were more likely to receive a modification to instruction than were students at levels one through four. For nearly every content area, there was a linear relationship between the intensity of services received and the likelihood that a student received a modification to instruction.

Accountability Information for Sample

The students in this sample were participating in state tests, such as the Maryland School Performance Assessment Program (MSPAP), and the Maryland Functional Tests (MFT). Poor documentation in some of the files made it difficult to get good estimates of the total numbers of students who participated. For MSPAP, about 6% of students in the sample were exempted (did not take the test because they transferred from out-of-state during the second semester of the year, or they were first-time LEP test takers, or they were not pursuing regular learning outcomes), and less than 1% were excused (did not take the test because of emotional trauma, absence, or use of non-approved accommodations). For MFT, about 2% of students in this sample were exempted, and about 1% were excused. For the Comprehensive Test of Basic Skills (CTBS/4), 2% were exempted, and fewer than 1% were excused.

Reasons for the exemptions or excuses were fairly consistent across MSPAP and MFT. Most students were exempted from the testing due to their pursuit of a “life skills curriculum,” or because they were receiving drastically modified content. On the CTBS, most students were exempted for the same reasons, but a few students were reported to have been exempted due to lack of appropriate accommodations.

Local criterion-referenced tests (CRTs) were available for about 70% of the sample. Of those students who had a local CRT available, 80% took the test; documentation of test participation was not found for 7% of the students in the sample.

The Independence Mastery Assessment Program (IMAP) is an alternate assessment program that is offered for students who are not able to take MSPAP meaningfully. At the time of this study, it was being piloted by the state, and was taken by less than 0.5% of the students in the sample. Because of the small numbers, it is not possible to draw any meaningful conclusions about IMAP participation for the students in this sample.

Testing Accommodations

The testing accommodations provided to students, according to their IEPs, were categorized into five types:

·       Scheduling accommodations, which change the time of day or the length of the testing period.

·       Setting accommodations, which change where the test is offered, such as in a student’s special education classroom or a hospital setting.

·       Equipment accommodations, such as providing a magnifier for a student with a visual impairment.

·       Presentation accommodations, such as offering a Braille version of the test.

·       Response accommodations, such as allowing typed rather than handwritten responses.

Most students in the sample (82%) used some form of testing accommodation for MSPAP, MFT, or CTBS. Scheduling accommodations were most frequently provided, followed by setting, presentation, response, and equipment/format accommodations. In this sample, 27% of students used all five types of accommodations, the most common combination. Details on the frequency of various testing accommodations and combinations provided to students are presented in Table 5.

Table 5. Frequency of Testing Accommodations Provided to Students

  N Percent
Categories    
Scheduling 211 74.8
Setting 199 70.6
Equipment/Format 121 42.9
Presentation 176 62.4
Response 155 55.0
Total number using accommodations 230 81.6
Combinations of Accommodation Categories    
All categories 76 27.0
All but equipment 41 14.5
Scheduling, setting, and presentation 17 6.0
All but response 16 5.7
All but presentation 13 4.6
Scheduling, setting, and response 12 4.3

 

The most common testing accommodations were similar to those most commonly used in instruction. These included the scheduling accommodations of periodic breaks within a testing session, extra response and processing time, and multiple days for testing. Within setting accommodations, special seating and special administrators (special education teachers, aides) were the most commonly used. A calculator was the most common equipment/technology accommodation. Repeating directions and reading portions or all of the test were the most common presentation accommodations. Common response accommodations included having the student mark his or her answers in the test booklet, dictating to an examiner for transcription, or a combination of the two.

When the appropriateness of documentation of assessment accommodations was examined, 20% of the IEPs contained no documentation as to why accommodations were or were not made. For almost 19% of the IEPs, raters indicated that the explanation for why an accommodation was provided was not well grounded. Problems identified by raters included documentation problems, a mismatch between instructional and testing accommodations, and the provision of either too many accommodations (e.g., a calculator for a student who has a reading disability and is reported to be strong in math) or too few accommodations (e.g., not providing repetition of directions for a student diagnosed with an attention deficit hyperactivity disorder).

Compared to testing accommodations, instructional accommodations appeared to be both better documented and better explained. Only 13% of IEPs had missing documentation on why instructional accommodations were or were not made. Additionally, for only 12% of the IEPs did raters indicate that the explanation of the reason for an accommodation was not well grounded. The reasons instructional accommodations were judged not justified were much the same as for testing accommodations: some students received more accommodations than appeared justified, some received fewer, and the documentation for other students was inadequate.

Accommodations Match/Status

In examining the match between instructional and testing accommodations, 88% of students with available data had an instructional accommodation listed on their IEPs (these data were missing for 31% of the sample). Eighty-five percent of students with available data had a testing accommodation listed on their IEPs (these data were missing for 28% of the sample). Of the students who had accommodations listed, 84% had accommodations in instruction that matched those identified for testing.

Students with speech and language disabilities and those with serious emotional disturbances were least likely to receive accommodations either to instruction or testing. Only 60-66% of students in those categories received instructional or testing accommodations, while 85% or more of students in all other federal disability categories received instructional and testing accommodations.

Students at a moderate intensity of service provision, level three, were less likely to receive any instructional accommodations than were students with either less intensity of services (level one) or more intensity of services (levels four and five). Of the students at level three, 83% received instructional accommodations; 90% of students at level one received instructional accommodations; and 97% and 95% of students at levels four and five, respectively, received instructional accommodations. Intensity of services was more linearly related to testing accommodations. As the intensity of services received by students increased, so did the likelihood that they would receive testing accommodations.

Students with low prevalence disabilities were more likely to have their instructional accommodations matched to their testing accommodations (98%) than were students with moderate (71%) or high (84%) prevalence disabilities. Similarly, students with physical disabilities (100%) were more likely to have instructional accommodations matched to testing accommodations than were students with emotional (67%) or cognitive (87%) disabilities. The group with moderate prevalence disabilities, as well as the group with emotional disabilities, had low numbers in the IEP samples, so their results need to be interpreted with some caution.


Discussion

This study was conducted to examine the relationship between instructional and assessment accommodations for the Maryland state tests. This examination provides a first step in assessing whether accommodations are being provided as intended in the system.

The students whose IEPs were included in the sample were all students receiving special education services. The sample was fairly similar to the overall population of special education students, with most students identified as having a learning disability or a speech or language impairment, and most receiving moderate intensity services. Students with emotional disabilities were somewhat underrepresented in the sample.

When IEP goals and accommodations to instruction were examined, it appeared that most students had goals focused on the content areas of reading, writing, and mathematics. Further, the accommodations and modifications to instruction that they received were highly related to the instructional content, such as receiving reading help in reading, using a calculator in mathematics, and so on. Availability of accommodations and modifications was also related to the severity of the disability experienced by the student, in that students receiving more intensive special education services were more likely to receive accommodations and modifications than students receiving less intense interventions. There was apparent confusion over the terms “accommodation” and “modification” for many teachers, in that changes listed as modifications in the IEPs were actually considered to be accommodations by the Maryland State Department of Education (MSDE). Generally, student performance, as measured by school grades, was not affected by the presence of instructional accommodations.

Within this sample, most students took the state and district tests, though some students were reported to be missing documentation of their testing status. When students did not take the tests, if there was documentation available about why they were exempted or excused, it was generally due to the student’s pursuit of a different curriculum or standards from those of their peers (e.g., life skills). Not enough students in this sample took Maryland’s alternate assessment to examine that assessment.

In examining the students who participated in their state and district tests, the majority (82%) had some form of testing accommodations listed on their IEPs. Scheduling accommodations such as periodic breaks, extra response and processing time, and multiple testing days were most frequently identified; equipment/technology accommodations were least frequently identified. Instructional accommodations were judged by respondents to be better documented and better explained than were testing accommodations. The most common reasons respondents found testing or instructional accommodations to be problematic included lack of documentation, offering fewer accommodations than appeared justified, offering more accommodations than appeared justified, and a lack of relationship between instructional and testing accommodations. Of the students who had accommodations listed on their IEPs, 84% were reported to have received accommodations to instruction that matched those provided for state testing. When type of disability was examined, students with physical disabilities were more likely to have instructional accommodations matched to testing accommodations (100%) than were students with cognitive disabilities (87%) or students with emotional disabilities (67%); because this last group was small, the numbers need to be interpreted with caution.

One of the themes throughout the study’s findings was the difficulty in obtaining accurate information, or in judging the information available, due to missing documentation. One possible reason for such poor accounting of accommodations (20% not documented and 19% not well grounded) is that the IEPs were developed early in the training of school personnel on documenting testing accommodations on student IEPs. This is in part evidenced by the better documentation of instructional accommodations and modifications than of testing accommodations and modifications. It is hoped that with further training, documentation errors will be eliminated.

Even with the documentation problems noted, some findings from this study are of note. First, the general lack of relationship between instructional accommodations and school grades (except for social studies grades in one of the two years examined) is worth further investigation. This finding supports the generally agreed upon intent of accommodations: to remove the impact of the disability from a student’s performance without providing undue advantage. This study provides some preliminary evidence that schools may be meeting the intent of instructional accommodations. However, it is important to confirm this by looking at other performance data, to ensure that some other mediating variable is not influencing results.

A more important finding from this study, and the main focus of the research, is the match between instructional and testing accommodations. Overall, students’ instructional accommodations matched those provided for statewide testing (judged to be true for 84% of students who used accommodations). However, some of the abuses that have been of concern were also noted in this sample, such as identifying testing accommodations that were not listed for instruction, and listing testing accommodations that appeared unwarranted. Additionally, testing accommodations were reported to be not well grounded more often than were instructional accommodations. Specific examples of this were reports such as “calculator accommodation does not appear justified—educational assessment lists calculation as a strength and calculation is not listed as a need area.”

It is difficult to make solid judgments from the data gathered since they are devoid of the context in which the decisions were made, and often suffer from problems of documentation. As the problems of documentation get cleared up with further training, it may be worthwhile to again examine student files to determine whether the 16% of students with testing accommodations that do not match instructional accommodations receive the same or compatible accommodations in both situations. Further, it would be important to re-examine the files to determine whether the students for whom the accommodation decision was not well grounded were experiencing documentation problems, or whether there is a problem within the system about how such decisions are being made.

However, even with some of the difficulties identified here, this study demonstrates that for most students, appropriate accommodations to instruction and testing are being listed. Further, instructional accommodations do not appear to be providing students an unfair advantage in school grades when compared to other students. As progress occurs within the educational system in decision making and documentation, it becomes more likely that students will receive accommodations that “level the playing field,” giving them appropriate access to educational opportunities.


References

Elliott, J., Thurlow, M., & Ysseldyke, J. (1996). Assessment guidelines that maximize the participation of students with disabilities in large-scale assessments: Characteristics and considerations (Synthesis Report 25). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M., Seyfarth, A., Scott, D., & Ysseldyke, J. (1997). State assessment policies on participation and accommodations for students with disabilities: 1997 update (Synthesis Report 29). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Ysseldyke, J., Thurlow, M., Erickson, R., Gabrys, R., Haigh, J., Trimble, S., & Gong, B. (1996). A comparison of state assessment systems in Kentucky and Maryland with a focus on the participation of students with disabilities (State Assessment Series, Maryland/Kentucky Report 1). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.


Appendix A

Data Collection Survey

 

                           Survey of Services to Students with Disabilities

 

School Name: _______________________________          LSS School # __________

Student Name: ______________________________          LSS Student ID # __________

Student Date of Birth: ____\____\____     Grade: ______          SSIS Student ID # __________

Gender:   Male ☐     Female ☐          Race/Ethnicity: __________

Disability: ______________________________          Federal Disability Code: ______

Previous Disability (if any): ______________________________          Federal Disability Code: ______

Date of Last IEP: ____\____\____          Date of Last ARD Committee Meeting: ____\____\____

Survey Prepared by: ______________________________          Date Survey Prepared: ____\____\____

 

1a.       In what setting is the student receiving services in accordance with the IEP and/or ARD Minutes? (Check the setting which best describes the student’s learning environment, then consider if English as a Second Language (ESOL) is a service being provided to the student.)

☐ General Education Class – includes a student enrolled in a comprehensive school who receives Special Education and related services OUTSIDE THE GENERAL EDUCATION CLASSROOM for less than 21% of the school day. For Preschool students, includes any combination of regular early childhood settings with no pull-out (e.g., Extended Elementary Education Program, Head Start, or other early childhood settings) as “inside the general education classroom.”

☐ Separate Class – includes a student enrolled in a comprehensive school who receives Special Education and related services (including Preschool pull-out programs) OUTSIDE THE GENERAL CLASSROOM for more than 60% of the school day.

☐ Resource Room/Combined Program – includes a student enrolled in a comprehensive school who receives Special Education and related services (including Preschool pull-out programs) OUTSIDE THE GENERAL CLASSROOM for at least 21%, but no more than 60%, of the school day.

 

1b. English as a Second Language – student is also enrolled in English as a Second Language classes.          ☐ NDF     ☐ Yes     ☐ No

 

2.         What is the intensity of services stated on the IEP?

☐ I             ☐ II             ☐ III            ☐ IV             ☐ V             ☐ VI             ☐ Unknown

 

3a.     Does the student receive services which are provided in an extended school year calendar?          ☐ NDF     ☐ Yes     ☐ No

 

3b.     Does the student receive services which are provided in a program which uses a twelve-month school year calendar?          ☐ NDF     ☐ Yes     ☐ No

 


4a.          According to the student’s IEP, what related service(s) is the student receiving this school year (1996 – 97)?  Is the service provided direct, indirect, or both (as in an inclusion model)?  What is the schedule to provide the service?  (Related services and other supportive services are required to assist a disabled student to benefit from Special Education.  The related services include speech pathology and audiology, psychological services, physical and occupational therapy, recreation, early identification and assessment of disabilities in students, counseling services, and medical services for diagnostic or evaluation purposes.  This also includes health services, social work services in the school, and parent counseling and training.)

 

 

Related Service Type          Direct/Indirect          Schedule/Time [hours per week]
EXAMPLE: Speech Therapy          Direct          3 times/week for 1 hour, total of 3 hours/week
(blank rows provided for each related service)

4b.          According to the student’s IEP, what specific academic goals require accommodation(s) and/or modification(s) in the student’s instructional setting this school year (1996 – 97)?

 

 

IEP Academic Goal Areas          Accommodation/Modification Required?          Specific Accommodation
Reading
Writing
Language Usage
Mathematics
Science
Social Studies


5.       Do the IEP and/or ARD minutes document the decision as to which outcomes the student will be pursuing?

☐ Maryland Learning Outcomes Only
☐ Alternative Outcomes Only
☐ Both Maryland Learning Outcomes and Alternative Outcomes
☐ Documentation not found

 

6.       What year(s) and grade(s) did/will the student participate in MSPAP?

 

 

 

 

School Year          1994 – 95          1995 – 96          1996 – 97          1997 – 98          1998 – 99
Grade 3
Grade 5
Grade 8

7a.     What year(s) and grade(s) did/will the student participate in IMAP?

 

 

 

 

School Year          1994 – 95          1995 – 96          1996 – 97          1997 – 98          1998 – 99
Grade 3
Grade 5
Grade 8

7b.     In the ARD minutes or on the student’s IEP is there documentation to indicate why the student was not taking MSPAP and why IMAP was more appropriate for the student?

☐ No     ☐ Yes     ☐ Not available

Explain [indicate source of information (ARD minutes or student’s IEP)]: ____________________

 


8a.     What year(s) and grade(s) did/will the student participate in CTBS?

 

 

 

 

School Year          1994 – 95          1995 – 96          1996 – 97          1997 – 98          1998 – 99
Grade 3
Grade 5
Grade 8

8b.     What year(s) and grade(s) did/will the student participate in MFT? [MFT is not applicable for this study.]

 

 

 

 

School Year          1994 – 95          1995 – 96          1996 – 97          1997 – 98          1998 – 99
Grade 3
Grade 5
Grade 8

9.       List the student’s end-of-year grades for the 1995 – 96 school year.  For the 1996 – 97 school year, list the most recent grades reported, and indicate if they are mid-year or first-quarter grades.

 

 

Areas (if an area does not fit for a student seeking a certificate, indicate with an “X” and complete the Notes section)          Grade for 1995 – 96          Grade for 1996 – 97 (1st Quarter ☐   Mid-year ☐)
Reading
Writing
Language Usage
Mathematics
Science
Social Studies

Notes: ____________________

 

 


 


10.     If accommodations are made for a student, list them below, one accommodation per row.

 

 

Type          Description
Scheduling
Setting
Equipment/Format
Presentation
Response

[Use the “Accommodations Permitted” Document for details]

 

 

1995 – 96 School Year: Instructional Accommodations (General Education / Special Education) and Test Accommodations, listing the state test name: MSPAP, MFT, CTBS (General Education / Special Education)

1996 – 97 School Year: Instructional Accommodations (General Education / Special Education) and Test Accommodations, listing the state test name: MSPAP, MFT, CTBS (General Education / Special Education)

Examples: (1.B), None, (1.B), (MSPAP) I.B, NONE, NONE, NONE, NONE, NONE, NONE, NONE, (MSPAP) II.G., NONE, NONE, NONE, NONE

(blank rows provided for the student’s accommodations)

11a.   In the ARD minutes, is there documentation to indicate why INSTRUCTIONAL ACCOMMODATIONS were or were not made in the 1995 – 96 school year?

☐ No     ☐ Yes     ☐ Not available

Explain [indicate source of information (ARD minutes or student’s IEP)]: ____________________

In the examiner’s opinion, was the explanation well-grounded? ☐ No     ☐ Yes

 


11b.   In the ARD minutes, is there documentation to indicate why INSTRUCTIONAL ACCOMMODATIONS were or were not made in the 1996 – 97 school year?

☐ No     ☐ Yes     ☐ Not available

Explain [indicate source of information (ARD minutes or student’s IEP)]: ____________________

In the examiner’s opinion, was the explanation well-grounded? ☐ No     ☐ Yes

 

12a.   In the ARD minutes, is there documentation to indicate why ACCOMMODATIONS for state test(s) were or were not made in the 1995 – 96 school year?

☐ No     ☐ Yes     ☐ Not available

Explain [indicate source of information (ARD minutes or student’s IEP)]: ____________________

In the examiner’s opinion, was the explanation well-grounded? ☐ No     ☐ Yes

 

12b.   In the ARD minutes, is there documentation to indicate why ACCOMMODATIONS for state test(s) were or were not made in the 1996 – 97 school year?

☐ No     ☐ Yes     ☐ Not available

Explain [indicate source of information (ARD minutes or student’s IEP)]: ____________________

In the examiner’s opinion, was the explanation well-grounded? ☐ No     ☐ Yes

 

13.     Was the student EXEMPTED by the ARD Committee from state test(s) listed below?

 

 

Test Name          Exempted? (yes, no, dnf)          List the reason(s) for exemption(s) and include the year of the exemption (e.g., ’96 – 7).

Examples:
(i)  The student transferred into the local school system with Limited English Proficiency. (96 – 7)
(ii) The student is in need of functional life skills. (’95 – 7)

MSPAP
MFT          (MFT is currently not applicable for this study.)
CTBS

dnf = documentation not found

 


14.      Was the student EXCUSED by the ARD Committee from state test(s) listed below?

 

 

Test Name          Excused? (yes, no, dnf)          List the reason(s) for being excused and include the year (e.g., ’96 – 7).

Example:
(i) The student demonstrated extreme frustration and was not able to complete the assessment. (’96 – 7)

MSPAP
MFT          (MFT is currently not applicable for this study.)
CTBS

dnf = documentation not found

 

15.      Is there a local Criterion-Referenced Test (CRT)? ☐ No     ☐ Yes

 

16.      Did the student participate in the local CRT? ☐ No     ☐ Yes


Appendix B

Maryland's Accommodations Policies

I. Scheduling Accommodations

Is the Accommodation Permitted? (Yes, No, or NA--Not Applicable and/or Not Yet Available.)

A. Periodic "breaks" needed, within a continuous test session, without exceeding total time allowance.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

B. "Breaks" needed away from testing situation without exceeding total time allowed within same day.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

C. Tests given regularly within a single day/session may be administered over multiple days without exceeding total time allowances.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

D. Extra response and processing time may be necessary. (MSPAP time extensions must allow for participation in group activities.) (For CTBS/5 time extensions, see page 6.)  (MFTP: Yes; CTBS/5: *Yes; MSPAP: Yes)

E. Tests are administered at best time of day for student.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

F. Other, proposed by Local Accountability Coordinator and Special Education or ESL staff and approved by MSDE Assessment Office and MSDE Special Education or ESL staff.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

*Invalidates comparison to national norms.

 

II. Setting Accommodations

Is the Accommodation Permitted? (Yes, No, or NA--Not Applicable and/or Not Yet Available.)

A. General education classroom, with special seating (front of room, carrel, etc.).  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

B. General education classroom, with adjusted grouping.  (MFTP: Yes; CTBS/5: NA; MSPAP: Yes)

C. General education classroom, with additional school support person (instructional assistant, guidance, etc.). Support person is not to help student read or respond to items.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

D. General education classroom, with special education staff as support. Support person is not to help student read or respond to items.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

E. Small group setting with school support staff (speech pathologist, pupil personnel worker, ESL teacher or specialist, etc.) as examiner.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

F. Small group setting with special education teacher as examiner.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

G. Individual administration within the school building.  (MFTP: Yes; CTBS/5: Yes; MSPAP: NA)

H. Individual administration outside school (home, hospital, etc.).  (MFTP: Yes; CTBS/5: Yes; MSPAP: NA)

I. Other, proposed by Local Accountability Coordinator and Special Education or ESL staff and approved by MSDE Assessment Office and MSDE Special Education or ESL staff.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

 

III. Equipment/Technology Accommodations

Is the Accommodation Permitted? (Yes, No, or NA--Not Applicable and/or Not Yet Available.)

A. Large print test materials.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

B. Braille test materials.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

C. Calculator for mathematics testing for special education or 504 students only.  (MFTP: Yes; CTBS/5: No; MSPAP: *Yes)

D. Use of electronic devices (mechanical speller, word processor, computer, augmented communication device, etc.).  (MFTP: Yes; CTBS/5: No; MSPAP: **Yes)

E. Bilingual dictionary (a synonym dictionary in the student’s native language) which is provided in daily instruction.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

F. Other, proposed by Local Accountability Coordinator and Special Education or ESL staff and approved by MSDE Assessment Office and MSDE Special Education or ESL staff.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

 *             Entire tests are administered. Student’s mathematics score is invalidated in the scoring/data processing process. (Specified in the MSPAP Examiner’s Manual as tasks that do not list calculator as a required material for the task.)

**           Entire tests are administered. Student’s language usage score is invalidated in the scoring/data processing process.

               

IV. Presentation Accommodations

Is the Accommodation Permitted? (Yes, No, or NA--Not Applicable and/or Not Yet Available.)

A. Repetition of directions, as needed.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

B. Written copies of orally presented materials, that are found only in examiner's manual.  (MFTP: Yes; CTBS/5: NA; MSPAP: Yes)

C. Accessibility to close-caption or video materials.  (MFTP: NA; CTBS/5: NA; MSPAP: NA)

D. Sign language interpreter, amplification, or visual display required for test directions/examiner-led activities.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

E. Verbatim audiotape of directions. Scripted directions may be re-read in English or a synonym provided in English.  (MFTP: *Yes; CTBS/5: NA; MSPAP: Yes)

F. Verbatim audiotape of presentation of total test.  (MFTP: *Yes; CTBS/5: No; MSPAP: **Yes)

G. Verbatim reading of selected sections of test or vocabulary by examiner or assistant.  (MFTP: *Yes; CTBS/5: No; MSPAP: **Yes)

H. Verbatim reading of entire test to student.  (MFTP: *Yes; CTBS/5: No; MSPAP: **Yes)

I. Other, proposed by Local Accountability Coordinator and Special Education or ESL staff and approved by MSDE Assessment Office and MSDE Special Education or ESL staff.  (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

*              Not applicable to Maryland Functional Reading Test.

**           Entire tests are administered. Student’s reading score is invalidated in the scoring/data processing process.

 

V. Response Accommodations

Is the Accommodation Permitted? (Yes, No, or NA--Not Applicable and/or Not Yet Available.)

A. For machine-scored tests, student marks answers in test booklet. (Transfer to answer sheet completed by school personnel.) (MFTP: Yes; CTBS/5: Yes; MSPAP: NA)

B. For selected response items, student indicates answers by pointing or other method. (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

C. For constructed response (brief or extended) items, student uses word processor. (MFTP: Yes; CTBS/5: NA; MSPAP: **Yes)

D. For constructed response (brief or extended) items, student tapes response for later verbatim transcription by school personnel. (MFTP: Yes; CTBS/5: NA; MSPAP: **Yes)

E. Student's transferred responses (alignment and completeness of hand-filled bubbles) may be checked by school personnel. (MFTP: *Yes; CTBS/5: Yes; MSPAP: No)

F. For constructed response (brief or extended) items, student dictates response to examiner for verbatim transcription by school personnel. (MFTP: Yes; CTBS/5: NA; MSPAP: **Yes)

G. For constructed response (brief or extended) items or oral presentation, student signs response to interpreter of the deaf/hearing impaired. (MFTP: Yes; CTBS/5: NA; MSPAP: NA)

H. Other, proposed by Local Accountability Coordinator and Special Education or ESL staff and approved by MSDE Assessment Office and MSDE Special Education or ESL staff. (MFTP: Yes; CTBS/5: Yes; MSPAP: Yes)

*  Not applicable to the Maryland Writing Test.

** Entire tests are administered. Student's language usage score is invalidated in the scoring/data processing process.
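The tables above pair each accommodation with a per-test ruling (Yes, No, or NA) and, where an asterisk appears, with a score that is invalidated during scoring/data processing. For readers who track accommodations data electronically, the short Python sketch below illustrates one way such a permission matrix could be encoded as a lookup table. It is an illustrative sketch only; the handful of entries shown and the names PERMISSIONS and check_accommodation are assumptions made for illustration and are not drawn from the Maryland policy documents.

    # Illustrative sketch only: re-encodes a few rows of the permission matrix above.
    # Key: (test, accommodation) -> (permitted, score invalidated on that test, if any)
    PERMISSIONS = {
        ("MFTP",   "calculator"): ("Yes", None),
        ("CTBS/5", "calculator"): ("No",  None),
        ("MSPAP",  "calculator"): ("Yes", "mathematics"),        # shown as *Yes in the table
        ("MSPAP",  "word processor"): ("Yes", "language usage"),  # shown as **Yes in the table
        ("CTBS/5", "verbatim reading of entire test"): ("No", None),
        ("MSPAP",  "verbatim reading of entire test"): ("Yes", "reading"),
    }

    def check_accommodation(test, accommodation):
        """Report whether an accommodation is permitted on a test and whether
        any score would be invalidated during scoring/data processing."""
        permitted, invalidated = PERMISSIONS.get((test, accommodation), ("NA", None))
        if permitted != "Yes":
            return f"{accommodation} on {test}: {permitted}"
        if invalidated:
            return (f"{accommodation} on {test}: permitted, but the student's "
                    f"{invalidated} score is invalidated.")
        return f"{accommodation} on {test}: permitted."

    print(check_accommodation("MSPAP", "calculator"))
    print(check_accommodation("CTBS/5", "calculator"))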


Appendix C

Examples of Modifications

Mathematics: Instructional Modification Strategies

·  Reduce the number of problems
·  Eliminate the need to copy problems
·  Simplify and enlarge worksheets
·  Avoid mixing "signs" on a page
·  Minimize the number of lines on a page
·  Provide more time for completion
·  Graph paper
·  Raised number lines
·  Large number lines
·  Life-sized number lines
·  Mnemonic devices
·  "Two-finger" counting aids
·  Instructional strategies
·  Multi-modal instruction
·  Computational aids
·  Color coding strategies
·  Green marker to start / red to stop
·  Peer support
·  Cross-age tutoring

Reading: Instructional Modification Strategies

·  Modifications:
   -  extra time for completion*
   -  shorten assignments
   -  simplify text
   -  highlight key concepts
   -  provide chapter outlines
·  Instructional strategies:
   -  story frame
   -  before, during, after
   -  echo reading
   -  use positive approach
   -  story mapping
   -  vary approach
   -  multi-modality instruction
·  Information organizer
·  Structured study guides
·  "What-you-need-to-know" chart
·  Study carrel for individual work
·  Peer support
·  Cross-age tutoring

Written Expression: Instructional Modification Strategies

·  Modifications:
   -  extra time for completion*
   -  shorten assignments
·  Instructional strategies:
   -  utilize content outlines
   -  "webbing" strategies
   -  process writing strategies
   -  writing/story starters
   -  use positive approaches
·  Study carrel for individual work
·  Formulate sentences aloud
·  Use "finger-for-spacing" strategy
·  Color coding strategies
·  Peer support
·  Cross-age tutoring

 

Organization: Instructional Modification Strategies

·  Bulletin board schedule
·  Pocket schedule
·  Schedule in notebook
·  Appointment book
·  Assignment sheets
·  Reminder cards
·  Strategies to keep work space clear
·  Strategies to organize desk
·  Study carrel for individual work
·  Color coding strategies
·  Peer support
·  Cross-age support
·  Homework journal
·  Structured study guides
·  Post signs and label areas in the room
·  Tape a schedule on the desk

Handwriting: Instructional Modification Strategies

·  Peer support
·  Different kinds of paper
·  Different colors of paper
·  Different line spacing / line colors
·  Tape paper to the desk
·  Chalkboard practice
·  Instructional strategies:
   -  tracing exercises
   -  "talk-through" letter formation
   -  "walk-through" letter formation
   -  write letters "in the air"
   -  dot-to-dot
   -  multi-modality instruction
   -  adapt tests to "fill-in-the-blank"
   -  use multiple choice / true-false
   -  provide additional time*
   -  shorten assignments
   -  photocopy notes, etc.
·  Try different writing implements
·  Paper position
·  Student position
·  Avoid using short pencils
·  Peer dictation
·  Cross-age tutoring

Visual Integration: Instructional Modification Strategies

·  Enlarge worksheets
·  Enlarge reading material
·  Worksheets free of blotches/streaks
·  Change font on worksheets/tests
·  Use wide margins on worksheets
·  Use different colors of paper
·  Change lighting
·  Different line spacing/color
·  Darker lines on paper
·  Raised lines on paper
·  Limit amount of information on page
·  Use "finger-for-spacing" strategy
·  Peer support
·  Multi-modality instruction

 

 

Language Usage: Instructional Modification Strategies

·  Modifications:
   -  allow more time for completion
   -  reduce number of words
·  Instructional strategies:
   -  paired word associations
   -  self-verbalization
   -  imagery
   -  mnemonic devices
   -  multi-modality instruction
·  Peer support
·  Cross-age tutoring
·  "Word wall" of common words
·  Spelling word booklet
·  Problem word lists
·  Word banks

 


Appendix D

Glossary

Accommodations
Accommodations are defined as specific changes in testing conditions, procedures, and/or formatting that do not alter the validity or reliability of the State standard. Accommodations must not compromise the security of the test and should be consistent with the student's IEP, 504, and/or LEP plan. Five categories of accommodations are available for use in each of the three statewide assessments: scheduling, setting, equipment, presentation, and response. Accommodations can be used in both assessment and instruction, formally and informally. Accommodations for specific state assessments are identified in the Maryland State Department of Education document Requirements for Accommodating, Excusing, and Exempting Students in Maryland Assessment Programs.

Comprehensive School
Any public elementary, middle or secondary school.

CRT
Criterion Referenced Test

CTBS
Comprehensive Tests of Basic Skills - The State requires a sample of 250 students at grades 3, 5, and 8.

IMAP
Independence Mastery Assessment Program - Alternate assessment for program improvement and accountability, similar to MSPAP. Performance assessments are conducted at ages commensurate with the MSPAP grades.

LSS
Local School System

MFT
Maryland Functional Tests - Reading, Writing, Math, and Citizenship

MLO
Maryland Learning Outcomes - Reading, Writing, Language Usage, Mathematics, Social Studies, and Science.

Modifications
Modifications are defined as general changes in testing conditions, procedures, and/or formatting that may directly or indirectly compromise either the validity or reliability of the State standard. Modifications may compromise test security and therefore are not recommended for statewide assessments. Modifications are more appropriate for instruction and classroom tests and allow a much wider range of variation than do accommodations. Modifications can be identified on the student's IEP, 504, and/or LEP plan. Modifications can be used effectively in combination with accommodations in instructional and assessment situations when individualized to the student's strengths and needs.

MSPAP
Maryland School Performance Assessment Program - Statewide assessment of educational reform targeted at school improvement through school accountability. Performance assessment conducted in grades 3, 5, and 8 based on the Maryland Learning Outcomes.

 NDF
No Documentation Found - There was no evidence that would assist in answering the question.

SSIS
Student Services Information System - The student identifier (soundex number) usually found on the student's IEP. LSSs may also use a different number or student identifier.

Well Grounded
In reviewing a student's IEP and other related data, the decision reached by the original team appeared to be the best professional decision possible given the available data. In your best professional judgment, did the team make the correct decision, unencumbered by political or parental pressure?