Characteristics of States' Alternate Assessments Based on Modified Academic Achievement Standards in 2008

Synthesis Report 72

Deb Albus, Sheryl S. Lazarus, Martha L. Thurlow, & Damien Cormier

September 2009

All rights reserved. Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Albus, D., Lazarus, S. S., Thurlow, M. L., & Cormier, D. (2009). Characteristics of states' alternate assessments based on modified academic achievement standards in 2008 (Synthesis Report 72). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.


Executive Summary

In April 2007, Federal No Child Left Behind regulations were finalized that provided states with additional flexibility for assessing some students with disabilities. The regulations allowed states to offer another assessment option, alternate assessments based on modified academic achievement standards (AA-MAS). States are not required to have this assessment. According to the regulations, this option is for a small number of students with Individualized Education Programs (IEPs) who, even with appropriate grade-level academic instruction, are unlikely to reach grade-level proficiency within the year covered by the IEP.

The National Center on Educational Outcomes (NCEO) has been tracking and analyzing the characteristics of states’ AA-MAS since 2007. This is the second annual update. The previous NCEO report on test design for AA-MAS (Lazarus, Thurlow, Christensen, & Cormier, 2007) indicated that five states offered an assessment they considered to be an AA-MAS in 2007: Kansas, Louisiana, North Carolina, North Dakota, and Oklahoma. In addition, Maryland indicated it was in the process of developing an AA-MAS. In 2008, three more states had an assessment they considered to be an AA-MAS: California, Connecticut, and Texas. As of March 2009, none of the states had successfully completed the U.S. Department of Education’s peer review process; as of the publication date, one state (Texas) had received approval.

States’ AA-MAS differed in a number of ways from their regular assessments. In 2008, the AA-MAS of all nine states used a multiple-choice format; some states’ assessments also included constructed response items or writing prompts, and two states included performance-based tasks. Design elements differentiating the AA-MAS from a state’s regular assessment included fewer items on the test, removal of a distractor, shorter passages, fewer passages, and simplified language. More than half of the states had fewer items per page on the AA-MAS than on the regular assessment. Analysis of states’ regular assessment blueprints compared to those of the AA-MAS showed some differences in the patterns of emphasis across grade levels.


Overview

Federal No Child Left Behind (NCLB) regulations finalized in April 2007 provided states another assessment option to consider in meeting the goal of including all students in the federal accountability system. In addition to the previously available assessment options for students with disabilities (e.g., taking the regular assessment with or without accommodations, or alternate assessments based on grade-level or alternate achievement standards), the regulations gave states the flexibility to offer an alternate assessment based on modified academic achievement standards (AA-MAS). States may count as proficient up to 2% of all students who met proficiency standards with an AA-MAS (U.S. Department of Education, 2007, April 9). States are not required to offer an AA-MAS.

According to the 2007 regulations, students participating in an AA-MAS must have an IEP, and even with appropriate grade-level content instruction, must be unlikely to achieve proficiency in the year covered by the IEP. Further, students participating in an AA-MAS may be from any disability category (U.S. Department of Education, 2007, April 9). As of the publication date, only one state (Texas) with an assessment it considered to be an AA-MAS had successfully completed the peer review process used by the U.S. Department of Education to determine whether an assessment satisfies federal requirements. The purpose of this report is to compare the characteristics of assessments states identified as AA-MAS in 2007 with those identified in 2008.

In 2007, the National Center on Educational Outcomes (NCEO) tracked and analyzed states’ participation guidelines for the AA-MAS and the characteristics of states’ AA-MAS (Lazarus, Thurlow, Christensen, & Cormier, 2007). Because more information is now publicly available, NCEO is publishing two separate updates: this report on the characteristics of states’ AA-MAS (including assessment design changes) and another synthesis report on participation guidelines (Lazarus, Rogers, Cormier, & Thurlow, 2008). This report builds on the work done in the NCEO report, States’ Alternate Assessments Based on Modified Achievement Standards (AA-MAS) in 2007 (Synthesis Report 67) (Lazarus et al., 2007). The current report covers assessment design changes, as did the previous report, but adds further analyses, including a blueprint comparison between regular assessments and AA-MAS.

Questions guiding the current study were:

1. In August 2008, which states had an assessment that they considered to be AA-MAS?

2. What were the characteristics of these assessments and how had they changed since 2007?

3. What differences, if any, were there between the blueprints of states’ regular assessments and AA-MAS with respect to the number of items and the percentage of coverage given to specific components (e.g., strands) in each subject area, by grade?

Process Used to Find Information about States’ AA-MAS

This report summarizes publicly available information about the characteristics of the AA-MAS for states that, as of August 2008, either had in place an assessment they considered to be this type of alternate assessment or had information on the state Web site about an AA-MAS in development.

Data were gathered from state department of education Web sites by locating all available information on AA-MAS and regular assessments, including general information, frameworks, test specifications, and accommodation policies. Data were gathered on assessment design changes (e.g., AA-MAS question types and characteristic changes) that had been included in the previous year’s report (Lazarus et al., 2007) to compare changes between 2007 and 2008. This report includes information on accommodations that have been incorporated into the design of states’ AA-MAS. For this report we define embedded accommodations as AA-MAS features that would be considered an accommodation on a state’s regular assessment. In other words, if a tool or procedure that is usually considered an accommodation is provided on the AA-MAS (and is available to students participating in the assessment without any IEP documentation), it is considered an embedded accommodation. We looked at accommodations that were allowed on each state’s regular assessment, as well as regular test features that are sometimes considered accommodations—and then looked to see whether any of these accommodations had been integrated into the design of the state’s AA-MAS. Examples of embedded accommodations are listed below:

  • If a state’s AA-MAS used 16-point font size and its regular assessment had 12-point font, the large print accommodation would be considered to be an embedded accommodation.
  • If the calculator was allowed on all sections of a state’s AA-MAS but allowed only on certain portions of the regular test, the calculator accommodation would be considered an embedded accommodation. 
  • If a state’s AA-MAS design included the reading of test questions and items to all participating students (and the regular assessment did not include this feature), the read aloud accommodation would be considered an embedded accommodation.

Note that this report only includes information on embedded accommodations that have been incorporated into the design of the AA-MAS. Detailed information on states’ accommodation policies for the AA-MAS will be included in a forthcoming report.

A comparison was also made between blueprints for states’ AA-MAS and the regular state assessments found on state Web sites. The comparison covered content area changes across grade ranges, elementary through high school. For this analysis, we sampled elementary (4th), middle (8th), and high school (10th) grades for all subjects reported. We limited our analysis to multiple choice items because only two states had constructed response items (other than for writing). If information for any of these grades was not available, the grade below it was used; if there was no assessment in the grade below, information was gathered for the grade above. A complete list of the state documents used to compile information for this report is in Appendix A.

The AA-MAS information collected for each state was placed into a state profile in the form of summary tables. The profiles were then e-mailed to each state in September 2008. States were asked to verify the information; if the profile contained inaccurate information, states were permitted to revise their profiles, providing we could confirm their corrections with posted state information. Five states responded to the request; they either confirmed the accuracy of the information, suggested one document over another, or filled in other information. The verified information was then compiled and summarized in this report.


Results

In July 2007, there were five states that offered an assessment that the state considered to be an AA-MAS (Lazarus et al., 2007): Kansas, Louisiana, North Carolina, North Dakota, and Oklahoma. At that time, Maryland had publicly available information indicating that it was developing an AA-MAS, so it was also included in the 2007 report (Lazarus et al., 2007). In 2008, three additional states were either implementing or in the process of developing an AA-MAS: California, Connecticut, and Texas. Table 1 lists all nine of the states that either were developing or had what they considered to be an AA-MAS in 2008, and provides brief details about each assessment (e.g., content areas and grades assessed).

Figure 1 shows the number of states employing different question types and assessment approaches in 2007 and 2008. The total number of states for each category graphed takes into account all subject areas. For example, if a state used multiple choice and constructed response questions in one subject area, the state was counted in both categories; a category such as multiple choice, however, was not counted twice if it was used for both reading and mathematics. Four states used a combination of question types within a content area assessment.

There were also some observed differences in assessment design characteristics from the preceding year. North Dakota had a performance-based portfolio assessment in 2007 (Lazarus et al., 2007). By 2008, this assessment had evolved into a teacher-mediated, computer-delivered performance-based assessment that used a multiple choice format. Specific information on assessment types and question characteristics for each subject area is provided in Table B1 in Appendix B.

Table 1. AA-MAS Name, Content Areas, and Grades as Described by State

State Assessment Name Content Areas/Grades
California California Modified Assessment (CMA) ELA (3-8); Math (3-7); Science (5,8)
Connecticut1 CMT/CAPT Modified Achievement Standards (CAPT-MAS) ELA and Math (3-8,10-11)
Kansas Kansas Assessment of Multiple Measures (KAMM) Reading and Math (3-8; once in HS), Writing (5,8, once in HS); History/Gov (6,8, once in HS); Science (4,7, once in HS)
Louisiana LEAP Alternate Assessment, Level 2 (LAA2) English and Math (Grades 4-10); Science and Social Studies (4, 8, 11)
Maryland2 Modified Maryland School Assessment (Mod-MSA) and Modified High School Assessment (Mod-HSA) Reading/ELA and Math (3-8, HS) (Information in report and appendices is for Mod-HSA only.)
North Carolina NCEXTEND2 Reading and Math (3-8); Science (4,8,11)
North Dakota North Dakota Alternate Assessment Aligned to North Dakota Content Standards for Students with Persistent Cognitive Disabilities (NDAA2) Reading and Math (3-8); Science (4,8,11)
Oklahoma Oklahoma Modified Alternate Assessment Program (OMAAP) ELA/Reading and Math (3-8, HS); Science (5,8)
Texas Texas Assessment of Knowledge and Skills, Modified (TAKS-M) English and Math (3-11); Science (5,8,10-11); Writing (4,7,10); Social Studies (8,10-11)

1 Under development; Connecticut plans to implement in 2008-09.
2 Under development. Maryland plans to implement its AA-MAS in 2008-09 at the earliest.

Figure 1. Number of States by Assessment Type and Question Characteristics Across Study Years


* North Dakota used a portfolio assessment for its AA-MAS at the time of the 2007 report. Over time it has evolved into a teacher-mediated multiple choice and performance task assessment.

Assessment Design Changes

Figure 2 shows the number of states with specified design changes across the two years. Most states noted using fewer items (n=8), followed by removing a distractor (n=6), shorter passages (n=5), and simplified language (n=5). Segmentation of passages was noted by one state in 2007 and by three states in 2008. See Tables B2 and B3 in Appendix B for more detailed information about design changes, including other changes made by only one state that are not included in the figure.

Figure 2. Selected Design Changes in States’ AA-MAS Across Study Years


Embedded Accommodations

States often embedded into the design of their AA-MAS features that typically appear as accommodations in states’ policies. As shown in Figure 3, five states used fewer items per page and four used larger font sizes. One state embedded the calculator accommodation. See Tables B4 and B5 for additional information about embedded accommodations and for more detailed specifications.

Other accommodations found in state policies were incorporated into the AA-MAS of only a single state in this year’s study (see Table B3 in Appendix B). These included having a scribe for all students (North Dakota), reading questions and answers aloud to all students (Texas), and incorporating manipulatives into the assessment (North Dakota).

Figure 3. Accommodations Incorporated into AA-MAS Across Study Years1


1 Two of the nine states tracked in 2008 (i.e., Connecticut and Maryland) were still in the process of developing their AA-MAS. Therefore, minimal information was available regarding embedded accommodations for those two states.

One state made an interesting change between our 2007 and 2008 analyses. The official Kansas policy regarding calculators did not change: in both years the same accommodations were allowable on the regular test and the KAMM (the AA-MAS in Kansas), and calculators were not allowed on the non-calculator portion of a test. However, the Kansas documents used for the analysis in 2007 went on to say:

Calculator use on non-calculator portions of the assessment is not allowed for any student. However, at this time there are no non-calculator portions on the KAMM assessment. Therefore, because of the current KAMM test design, calculators and calculation devices such as math tables are allowed on the entire KAMM.

But by 2008, Kansas was in the process of changing its KAMM, and 2008 Kansas documents indicated that:

KAMM Math Assessment will be reorganized this year based on the April 2007 release of the final NCLB 2% Regulations. The revised KAMM Math assessment will mirror the organization of the current Kansas General Math Assessment in the following way.

  • The KAMM Math Assessment will be organized into three sections.
  • The use of calculators will be allowed in two sections.
  • The third section will not allow calculator use.
  • In the non-calculator portion, there are numerous items, approximately 50% (depending on the grade level), for which a calculator is not necessary (e.g., recognizing shapes, charts and graphs, time, transformations, etc.).

Therefore, in 2007 the results of our analysis showed that Kansas had incorporated the calculator accommodation into its AA-MAS test design (since calculators were allowed on all sections of the AA-MAS), but in 2008 the state had not incorporated the accommodation, since calculators were allowed on two of three sections of both assessments.

Assessment Blueprints Comparison

Information was also gathered from state blueprints for regular assessments and AA-MAS to compare the number of test items and the percentage of coverage for components of subject area assessments. These data are presented in full, for representative elementary, middle school, and high school grades, in Appendix C. Appendix C also includes more specific assessment information, such as when a state indicated that certain items for an assessment were drawn from multiple grade levels. For example, California, in its elementary science assessment for 5th grade, uses a certain number of items from both 4th and 5th grade content on its regular assessment and on its AA-MAS.

Table 2 displays the differences in the number of total multiple choice items on states’ AA-MAS compared to the regular assessment. This table is based on detailed information in Appendix C (Tables C1-C3 provide information on elementary reading, math, and science, respectively; Tables C4-C6 on middle school reading, math, and science; Tables C7-C9 on high school reading, math, and science; and Table C10 on the number of items for social studies). Differences in numbers of items do not address the content or difficulty of items, nor do they address the rich information found in performance level descriptors. A separate report that provides information on the performance level descriptors used by states with an AA-MAS is forthcoming.

Table 2. Total Numbers of Multiple Choice Items^1 on AA-MAS and Regular Assessment, and Percentage of Regular Items Represented on AA-MAS

                   |       Elementary         |      Middle School       |       High School
State              | AA-MAS   Reg   % of Reg  | AA-MAS   Reg   % of Reg  | AA-MAS   Reg   % of Reg

Reading/ELA^2
California         |   27     42      64%     |   30     42      71%     |
Kansas             |   36     74^3    49%*    |   48     84^3    57%*    |   48     64      75%
Louisiana          |   21     33      64%     |   21     33      64%     |   21     33      64%
North Carolina     |   40     58      69%     |   40     53      75%     |   40     56      71%
Oklahoma           |   40^4   50      80%     |   50                     |   32^4   48^4    67%
Texas              |   32     40      80%     |   38     48      79%     |   22     28      79%

Math
California         |   48     65      74%     |   54     65      83%     |
Kansas             |   40     72^3    56%*    |   40     88^4    45%*    |   40    104      38%*
Louisiana          |   42     60      70%     |   42     60      70%     |   42     60      70%
North Carolina     |   40     82^3    49%*    |   40     80^3    50%*    |   40     80^4    50%*
Oklahoma           |   40^4   45      89%     |   40^3   45      89%     |   40     55      73%
Texas              |   34     42      81%     |   40     50      80%     |   45     56      80%

Science
California         |   48     60      80%     |   54     60      90%     |
Louisiana          |   35     40      85%     |   35     40      88%     |
North Carolina     |   60     80^3    75%     |   60     80^3    75%     |   40     80^4    50%*
Oklahoma           |   41^4   45      91%     |   40^4   45      88%     |   46^4   60      76%
Texas              |   32     40      80%     |   40     50      80%     |   44     55      80%

Social Studies
Kansas             |                          |   49     60      82%     |   52     60      87%
Louisiana          |                          |                          |   32     60      53%*
Texas              |                          |   38     48      79%     |   44     55      80%

Writing
California         |   21     33      64%     |   24     33      73%     |
Texas              |   24     28      86%     |   32     40      80%     |   14     20      70%

^1 Multiple choice items only. Does not include constructed response items or essays.

^2 This table does not include any writing multiple choice items in Reading/ELA. See the separate listing for writing.

^3 Documents noted that the regular assessment included field test items. Field test items could not be disaggregated from other test items for the regular assessment.

^4 Median number of questions.

Note: An asterisk (*) marks cells where the number of items on the AA-MAS is less than 60% of the items on the regular assessment. Blank cells indicate that the assessment was not given at that level or that data were not available.

Table 2 presents a detailed comparison of the number of multiple choice items on states’ AA-MAS and regular assessments for states with publicly available information. The greatest differences in the number of items between the AA-MAS and the regular assessment are flagged in Table 2 (i.e., where a state’s AA-MAS has less than 60% of the number of items on its regular assessment); flagged cells indicate a greater difference in coverage between the two assessments. States with the largest number of multiple choice items on the regular assessment tended to have the largest difference in the percentage of total items between the AA-MAS and the regular assessment. In some cases, however, we observed a comparable percentage difference even with a relatively modest number of items on the regular assessment. An example of a state that had a relatively large number of multiple choice items on its regular assessments is Kansas. The regular 4th grade reading assessment had 58 operational items, compared to 36 questions on the 4th grade KAMM (the AA-MAS) in reading. As shown in Appendix Table C1, an additional 16 multiple choice questions appeared on the 4th grade regular assessment in reading, but some of these were field test items. Thus, the AA-MAS had 62% as many items as the regular assessment, assuming the 58 regular assessment items were all operational. For most content areas and at most grade levels, Oklahoma and Texas had the smallest percentage difference in the total number of AA-MAS items compared to the regular assessment. See Tables C1-C10 in Appendix C for detailed information about the number of items.
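The “% of Reg” values in Table 2 are simple ratios of item counts. As a minimal sketch of the arithmetic (the function name is ours, not the report’s; the item counts are those from the Kansas example above):

```python
# Percentage of regular-assessment items represented on the AA-MAS,
# rounded to the nearest whole percent (function name is illustrative).
def percent_of_regular(aa_mas_items: int, regular_items: int) -> int:
    return round(100 * aa_mas_items / regular_items)

# Kansas 4th grade reading: 36 AA-MAS items vs. 58 operational items on the
# regular assessment (74 items when field test items are included).
print(percent_of_regular(36, 58))  # 62, as discussed in the text
print(percent_of_regular(36, 74))  # 49, as reported in Table 2
```

Whether field test items are counted in the denominator changes the reported percentage noticeably, which is why the text distinguishes the 58 operational items from the 74 total items.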


Example of State with Differences Across Component Areas

Not all states with an AA-MAS reported the number of items or percentage of components for various “strands” within a content area for their regular assessment and their AA-MAS. Among those that did, some differences were observed. An example of the type of differences between the regular assessment and the AA-MAS across strands in one state is provided here. It shows changes in the pattern of coverage within mathematical components across grades 4, 8, and 10.

As shown in Figure 4, at the elementary level there is a difference in the pattern of the percentage of items per strand on the AA-MAS compared to the regular assessment, with a 20% difference for number and number relations. Geometry had the smallest percentage difference, at 4%. For the other strands, the difference in percentage of items ranged from 2-11%. The state’s AA-MAS appears designed to devote 15-20% of the total questions to each strand assessed, but this pattern does not match the emphasis of the regular assessment, which varies between 5% and 40% of the total questions across strands.

Figure 4. State Example: Elementary Math Percentage of Total Number of Questions Devoted to Each Strand for AA-MAS and Regular Assessment


Note: This figure reports percentages rather than number of items. At the elementary level, the state’s AA-MAS had 42 multiple choice items and 2 constructed response items. The state’s regular assessment had 60 multiple choice items and 3 constructed response items.
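The strand-by-strand comparison described above can be expressed as a small computation over blueprint percentages. The strand names and numbers below are invented for illustration only; they mirror the reported pattern (AA-MAS strands at a uniform emphasis, regular-assessment strands ranging widely), not any state’s actual blueprint:

```python
# Hypothetical blueprint percentages (percent of total questions per strand).
# These values are illustrative; they are not taken from any state document.
aa_mas  = {"number": 20, "algebra": 20, "geometry": 20, "measurement": 20, "data": 20}
regular = {"number": 40, "algebra": 15, "geometry": 16, "measurement": 15, "data": 14}

# Absolute difference in emphasis, in percentage points, for each strand.
diffs = {strand: abs(aa_mas[strand] - regular[strand]) for strand in aa_mas}

print(diffs["number"])             # 20: the largest shift in emphasis
print(min(diffs, key=diffs.get))   # "geometry": the smallest difference
```

A comparison like this makes visible whether an AA-MAS preserves the “degree and pattern of emphasis” of the regular blueprint or flattens it into roughly equal strand weights, as in the state example.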

Figure 5 presents the middle school percentages for the same state. The percentage of AA-MAS questions devoted to each strand at the middle school level was similar to the AA-MAS percentages at the elementary level, and also quite similar to those of the state’s regular assessment at this level.

Figure 5. State Example: Middle School Math Percentage of Total Number of Questions Devoted to Each Strand for AA-MAS and Regular Assessment


Note: This figure reports percentages rather than number of items. At the middle school level, the state’s AA-MAS had 42 multiple choice items and 2 constructed response items. This state’s regular assessment had 60 multiple choice items and 4 constructed response items.

Figure 6 shows the comparison at the high school level. This comparison also shows little variation between the regular assessment and the AA-MAS, with differences ranging from 1-10% of total questions across strands. The AA-MAS again has the same percentage for each strand as at the elementary and middle school levels. Most noticeable here is the difference for number and number relations, with 10% more items in this area on the AA-MAS than on the regular assessment (i.e., 20% of the questions on the AA-MAS were “number sense” questions, but only 10% of the questions on the regular assessment were devoted to this strand).

Figure 6. State Example: High School Math Percentage of Total Number of Questions Devoted to Each Strand for AA-MAS and Regular Assessment


Note: This figure reports percentages rather than number of items. At the high school level, the state’s AA-MAS had 42 multiple choice items and 2 constructed response items. This state’s regular assessment had 60 multiple choice items and 4 constructed response items.

States’ blueprints for the regular assessment and the AA-MAS for all content areas that were publicly available were compared for elementary, middle, and high school assessments. Summary information and examples are provided here, with details presented in Appendix C. We examined all content areas for which blueprints were available (e.g., reading, writing, math, science).


Discussion

In 2007, six states either had or were in the process of developing an assessment they considered to be an AA-MAS. In 2008, there were nine states, and as of the publication date only one of them had successfully completed the federal peer review process. As in 2007, multiple choice items were the predominant item type on states’ AA-MAS in 2008. Only two states used constructed response items, other than in writing, where some states used prompts. Two states used performance tasks, one for the entirety of the AA-MAS (North Dakota) and one only for a science portion (Kansas). The three states added in this report showed AA-MAS designs similar to those of the six states that had this assessment option in 2007, for example, fewer items, simplified language, removal of a distractor, and shorter and fewer passages. States had other unique design features in both years, but many of these were difficult to categorize because they focused on the presentation of specific item content.

Several features that were considered accommodations for the state’s regular assessment were embedded into the design of some states’ AA-MAS. In 2008, the most frequently embedded accommodations were fewer items per page and larger font size. Both of these accommodations generally are categorized as presentation accommodations for regular assessments.

In this analysis of AA-MAS, we found for the first time that a state incorporated the use of scribes into the AA-MAS design. In contrast, other states considered a scribe a separate accommodation available for students on the AA-MAS or regular assessments if they individually required one.

The different characteristics observed in these AA-MAS seem to show that assessments across states are targeting different students. This observation agrees with Filbin (2008), who noted that states either appeared to be targeting students right below the regular assessment or right above the alternate assessment based on alternate achievement standards. In our analysis, some states appeared to have fewer changes to blueprints, suggesting these states’ AA-MAS may be geared toward those students just below the regular assessment.

Comparing blueprints can yield useful information on how content coverage may differ across assessments (Marion, 2007). The April 2007 Federal Register Rules and Regulations, in describing assessment design relative to regular content standards, said that an AA-MAS “reflects the same degree and pattern of emphasis as the content standards (balance)” (Section 200.6(a)(3)(i), p. 2). However, in the Standards and Assessments Peer Review Guidance revised December 21, 2007, possible examples of acceptable evidence included a comparison of blueprints that “indicates that the general assessment and the assessment based on modified academic achievement standards were designed to address the same grade level content standards although the item specifications differ” (p. 26). The example showing the comparison of strands for the AA-MAS and regular assessment in one state reflects that state’s interpretation of “balance”: the state maintained the same percentage of items across components at all grades, even though this sometimes diverged from the percentage assessed in those grades on its regular assessments. It appears that states may have very different interpretations of what is meant by “same degree and pattern of emphasis.”

In summary, it is important to continue to track the changes and decisions made by states as they develop their AA-MAS for students who qualify to participate. As states pursue the AA-MAS option, all aspects of the assessments should be analyzed and documented, toward the goal of ensuring quality grade level assessment and academic instruction for all students.


References

Filbin, J. (2008). Lessons from the initial peer review of alternate assessments based on modified achievement standards. U.S. Department of Education, Office of Elementary and Secondary Education Student Achievement and School Accountability Program.

Lazarus, S. S., Rogers, C., Cormier, D., & Thurlow, M. L. (2008). States’ participation guidelines for alternate assessments based on modified academic achievement standards (AA-MAS) in 2008 (Synthesis Report 71). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Lazarus, S. S., Thurlow, M. L., Christensen, L. L., & Cormier, D. (2007). States’ alternate assessments based on modified achievement standards (AA-MAS) in 2007 (Synthesis Report 67). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Marion, S. (2007, July 26). A technical design and documentation workbook for assessments based on modified achievement standards. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

U.S. Department of Education. (2007, April 9). Final rule 34 CFR Parts 200 and 300: Title I-Improving the Academic Achievement of the Disadvantaged; Individuals with Disabilities Education Act (IDEA). Federal Register, 72(67). Washington, DC: Author. Retrieved September 12, 2007, from http://cehd.umn.edu/NCEO/2percentReg/FederalRegApril9TwoPercent.pdf

U.S. Department of Education. (2007, December 21). Standards and assessments peer review guidance: Information and examples for meeting requirements of the No Child Left Behind Act of 2001. Washington, DC: Office of Elementary and Secondary Education (OESE). Retrieved August 17, 2008, from http://www.ed.gov/policy/elsec/guid/saaprguidance.pdf


Appendix A

State Documents Used in Analysis

State Documents and Presentations Used in the Analysis of States’ AA-MAS

California

California Department of Education (n.d.). cma37math. Retrieved on August 7, 2008 from http://www.cde.ca.gov/ta/tg/sr/cmablueprints.asp

California Department of Education (n.d.). cma38ela. Retrieved on August 7, 2008 from http://www.cde.ca.gov/ta/tg/sr/cmablueprints.asp

California Department of Education (n.d.). cmasciblueprints. Retrieved on August 7, 2008 from http://www.cde.ca.gov/ta/tg/sr/cmablueprints.asp

California Department of Education (n.d.). Differences between CST and CMA. Retrieved on August 7, 2008 from http://www.cde.ca.gov/ta/tg/sr/cmastar.asp

California Department of Education (2008, January). Appropriate test variations and accommodations for the 2008 administration of the California Modified Assessment (CMA) based on the study of item format and delivery mode from the CMA. Retrieved on August 7, 2008 from http://www.cde.ca.gov/ta/tg/sr/cmastar.asp

California Department of Education. (2007). 2008 standardized testing and reporting item and estimated time charts. Retrieved on September 18, 2008 from http://www.cde.ca.gov/ta/tg/sr/documents/itemtimecharts08.pdf

California Department of Education (n.d.). California STAR CST blueprints. Retrieved August 7, 2008 from http://www.cde.ca.gov/ta/tg/sr/blueprints.asp

Connecticut

Connecticut State Department of Education (n.d.). Connecticut State Department of Education: Test accommodations form special education, modified assessment system (MAS). Retrieved on August 7, 2008 from http://www.csde.state.ct.us/public/cedar/assessment/mas/index.htm

Kansas

Kansas Department of Education (n.d.). KAMM math studies test specifications. Retrieved on August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=2371#KAMMitemandtest

Kansas Department of Education (n.d.). KAMM reading test specifications. Retrieved on August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=2371#KAMMitemandtest

Kansas Department of Education (n.d.). KAMM science test item specifications. Retrieved on August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=2371#KAMMitemandtest

Kansas Department of Education (n.d.). KAMM social studies test specifications. Retrieved on August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=2371#KAMMitemandtest

Kansas Department of Education (n.d.). KAMM modified writing assessment manual. Retrieved August 7, 2008 from http://www.kansped.org/ksde/assmts/kamm/kamm.html

Kansas Department of Education (n.d.). Kansas alternate assessment & Kansas assessment of modified measures (KAMM), fact sheet 2008-2009. Retrieved August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=2371#KAMMitemandtest

Kansas Department of Education (2007, November). Kansas State Department of Education accommodations manual: How to select, administer, and evaluate accommodations for instruction and assessment. Retrieved on August 7, 2008, from http://www.ksde.org/Default.aspx?tabid=2371#Accommodations

Kansas State Department of Education (2007, May). Make a musical instrument. Retrieved on August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=2371

Kansas State Department of Education (n.d.). Sample problems illustrative of items based on modified academic achievement standards. Retrieved on August 7, 2008, from http://www.ksde.org/Default.aspx?tabid=2371

Kansas State Department of Education (2006, August 4). Act on cut scores for Kansas assessments. Press release letter from Bob Corkins to Kansas State Board of Education. Retrieved on August 7, 2008 from http://www.ksde.org/Portals/0/Special%20Education%20Services/assmts/kamm/PerfLevCutScores.pdf

Kansas State Department of Education (2007, February). Kansas assessments in reading and mathematics, 2006 technical manual for the Kansas general assessments, Kansas Assessments of multiple measures (KAMM), Kansas alternate assessment. Retrieved on August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=2371

Kansas State Department of Education. (2008, July). Questions about the 2008-2009 Kansas assessment of modified measures-KAMM. Topeka: Author. Retrieved on August 7, 2008 from www.kansped.org/ksde/assmts/kamm/kammfaq.pdf

Kansas State Department of Education (n.d.). Kansas fact sheets on regular assessments. Retrieved August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=420 and http://www.ksde.org/LinkClick.aspx?fileticket=P3OIVwqiJZg%3d&tabid=420&mid=5207

Kansas State Department of Education. (2007, October 24). Kansas reading education test specifications. Topeka: Author. Retrieved on August 7, 2008 from http://www.ksde.org/Default.aspx?tabid=159

Louisiana

Louisiana Department of Education (n.d.). LAA2 accommodations 2008. Retrieved on August 29, 2008 from http://www.doe.state.la.us/lde/saa/785.html

Louisiana Department of Education (n.d.). LAA2 test design. Retrieved on August 7, 2008 from http://www.doe.state.la.us/lde/saa/2221.html

Louisiana Department of Education (n.d.). Special populations and accommodations for LEAP and GEE. Retrieved on August 29, 2008 from http://www.doe.state.la.us/lde/saa/785.html

Louisiana Department of Education (n.d.). Chapter 1 LEAP English language arts, grade 4 [Assessment guides for other grades and content areas on same page]. Retrieved on September 18, 2008 from http://www.doe.state.la.us/lde/saa/1341.html

Louisiana Department of Education (2007). LAA2 2006-2007 annual report. Retrieved on August 7, 2008 from http://www.doe.state.la.us/lde/saa/2221.html

Louisiana Department of Education (2007). 2006 LAA2 technical report summary. Retrieved on August 7, 2008, from http://www.doe.state.la.us/lde/saa/2221.html

Louisiana Department of Education (2008, February). LAA2 LEAP alternate assessment, level 2, assessment guide: English language arts and mathematics, Grades 4, 8, 11. Retrieved on August 7, 2008 from http://www.doe.state.la.us/lde/saa/2221.html

Maryland

Maryland State Department of Education (n.d.). School improvement in Maryland: HSA: high school assessment program, what is Mod-HSA? Baltimore: Author. Retrieved on October 30, 2008, from http://mdk12.org/assessments/high_school/index_d2.html

Maryland State Department of Education (n.d.). Mod-HSA example items. Retrieved on October 30, 2008, from http://mdk12.org/assessments/high_school/index_d2.html

Maryland State Department of Education. (2006, October 1). 2006-2007 Maryland accommodations manual: A guide to selecting, administering, and evaluating the use of accommodations for instruction and assessment. Retrieved on August 7, 2008 from http://www.marylandpublicschools.org/NR/rdonlyres/840EFBB6-CD7D-404E-8A77-E978F6D508AA/11347/MDAccommodationsManual.pdf

Maryland State Department of Education (2007). Technical documentation for the Maryland high school assessment program: Algebra/data analysis, biology, English, and government end-of-course assessments. Retrieved November 3, 2008 from http://www.marylandpublicschools.org/MSDE/divisions/planningresultstest/2006+HSA+Technical+Report.htm

Maryland State Department of Education (2008). 2007-2008 Maryland accommodations manual: A guide to selecting, administering, and evaluating the use of accommodations for instruction and assessment. Retrieved October 30, 2008 from: http://mdk12.org/share/pdf/AccommodationsManual.pdf

North Carolina

North Carolina Department of Public Instruction (2006, August 21). The North Carolina testing program, 2006-2007. Retrieved on September 3, 2008 from http://www.dpi.state.nc.us/docs/accountability/NORTHCgeneralpolicies.pdf

North Carolina Department of Public Instruction (2007, July). Understanding the individual student report for the NCEXTEND2 EOG grades 3-8. Retrieved on August 7, 2008, from http://www.ncpublicschools.org/accountability/policies/briefs/

North Carolina Department of Public Instruction (2008, July). North Carolina testing program assessment options, 2008-2009. Retrieved on August 7, 2008 from http://www.ncpublicschools.org/docs/accountability/policyoperations/nctpassessmentoptions.pdf

North Carolina Department of Public Instruction (2007, February). School test coordinator’s handbook. Retrieved on September 3, 2008, from http://www.ncpublicschools.org/docs/accountability/policyoperations/stcHandbook.pdf

North Carolina Department of Public Instruction (2007, October). Test information sheets. Retrieved on October 17, 2008, from http://www.ncpublicschools.org/accountability/testing/eog/reading

North Carolina Department of Public Instruction (2006, May). Test information sheets. Retrieved on October 17, 2008, from http://www.ncpublicschools.org/accountability/testing/eog/math

North Carolina Department of Public Instruction (2007, November). Test information sheets. Retrieved on October 17, 2008, from http://www.ncpublicschools.org/accountability/testing/eog/science

North Carolina Department of Public Instruction (2006, December). North Carolina end-of-course test of English I. Retrieved on October 17, 2008, from: http://www.ncpublicschools.org/docs/accountability/testing/eoc/English1/20071201englishtestinformationsheet.pdf

North Carolina Department of Public Instruction (2007, October). Test information: End-of-course (EOC) mathematics tests. Retrieved October 17, 2008, from http://www.ncpublicschools.org/docs/accountability/testing/eoc/Algebra1/MPGHW121EOCItemsbyGoal.pdf

North Carolina Department of Public Instruction (2007, October). North Carolina end-of-course test of biology. Retrieved October 17, 2008, from http://www.ncpublicschools.org/docs/accountability/testing/eoc/scienceeocbiology.pdf

North Dakota

North Dakota Department of Public Instruction (n.d.). North Dakota alternate assessments 2007-08 (PowerPoint presentation). Retrieved on August 7, 2008, from http://64.233.167.104/u/NDDPI?q=cache:JwKYLdw8SqMJ:www.dpi.state.nd.us/speced/resource/alternate/AA2ppt.pdf+NDAA&hl=en&ct=clnk&cd=5&gl=us&ie=UTF-8

North Dakota Department of Public Instruction (n.d.). Comparison of NDAAI and NDAAII. Retrieved on August 7, 2008, from http://www.dpi.state.nd.us/speced/resource/alternate/index.shtm

North Dakota Department of Public Instruction (2008, September). Revised- students with disabilities and the North Dakota state assessments: Information for parents and educators (2008 NDAA Parent Brochure). Retrieved on August 7, 2008, from http://www.dpi.state.nd.us/speced/resource/alternate/index.shtm

Oklahoma

Garrett, S. (2007). Oklahoma school testing program (OSTP) Oklahoma modified alternate assessment program (OMAAP), test preparation manual. Retrieved on September 17, 2008, from http://www.sde.state.ok.us/AcctAssess/pdf/forms/OMAAP_TPM.pdf

Oklahoma Department of Education. (2008). Curriculum access resource guide- modified (CARG-M) modified priority academic student skills (PASS), reading, gr.3-5. Retrieved on August 7, 2008, from http://www.sde.state.ok.us/AcctAssess/OMAAP.html

Oklahoma Department of Education. (2008). Curriculum access resource guide- modified (CARG-M) modified priority academic student skills (PASS), reading and English II, gr 6-8. Retrieved on August 7, 2008 from http://www.sde.state.ok.us/AcctAssess/OMAAP.html

Oklahoma Department of Education. (2008). Curriculum access resource guide- modified (CARG-M) modified priority academic student skills (PASS), math, gr.3-5. Retrieved on August 7, 2008, from http://www.sde.state.ok.us/AcctAssess/OMAAP.html

Oklahoma Department of Education. (2008). Curriculum access resource guide- modified (CARG-M) modified priority academic student skills (PASS), math and algebra I, gr 6-8. Retrieved on August 7, 2008 from http://www.sde.state.ok.us/AcctAssess/OMAAP.html

Oklahoma Department of Education. (2008). Curriculum access resource guide- modified (CARG-M) modified priority academic student skills (PASS), science and biology I, gr.5 and 8. Retrieved on August 7, 2008, from http://www.sde.state.ok.us/AcctAssess/OMAAP.html

Oklahoma Department of Education. (2008). Oklahoma modified alternate assessment program (OMAAP) mathematics & reading grade 3, parent, student, and teacher guide [also grades 4-8]. Retrieved on August 7, 2008, from http://www.sde.state.ok.us/AcctAssess/OMAAP.html

Oklahoma Department of Education. (2008). Test administration manuals. (reading grades 3-5, reading grades 6-8, math grades 3-5, math grades 6-8, science grades 5 and 8, English II, biology I and algebra I.) Retrieved on September 17, 2008, from http://www.sde.state.ok.us/AcctAssess/testadmin.html

Oklahoma Department of Education. (2008). Test blueprints. [grades 3-8, EOI]. Retrieved on August 7, 2008, from http://www.sde.state.ok.us/AcctAssess/OMAAP.html

Oklahoma Department of Education (n.d.). Test blueprints. Retrieved on August 7, 2008, from http://sde.state.ok.us/AcctAssess/core.html

Texas

Texas Education Agency (n.d.). Blueprints math TAKS-M. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (n.d.). Blueprints reading TAKS-M. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (n.d.). Blueprints science TAKS-M. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (n.d.). Blueprints social studies TAKS-M. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (n.d.). Blueprints writing TAKS-M. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (2008, February). Modification guidelines for reading/ELA. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (2008, February). Modification guidelines for mathematics. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (2008, February). Modification guidelines for science. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (2008, February). Modification guidelines for social studies. Retrieved August 7, 2008 from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (2008, February). Modification guidelines for writing (revising and editing). Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (2008, March). Spring 2008 TAKS-M information brochure (English). Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/taksm/index.html

Texas Education Agency (2008). TAKS-M Grades 3-5 test administration manual 2008: Writing, mathematics, reading, science, social studies. Retrieved on August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/guides/test_admin/2008/TAKSM08_3to5_TAM.pdf#xml=http://www.tea.state.tx.us/cgi/texis/webinator/search/xml.txt?query=administration+manual&db=db&id=c8e062206095b2b5

Texas Education Agency (2008). TAKS-M grades 6-8 test administration manual 2008: Writing, mathematics, reading, science, social studies. Retrieved on August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/guides/test_admin/2008/TAKSM08_6to8_TAM.pdf#xml=http://www.tea.state.tx.us/cgi/texis/webinator/search/xml.txt?query=administration+manual&db=db&id=e0617822e4f5949f

Texas Education Agency (2008). TAKS-M grades 9-12 test administration manual 2008: Writing, mathematics, reading, science, social studies. Retrieved on August 7, 2008, from http://www.tea.state.tx.us/student.assessment/resources/guides/test_admin/2008/TAKSM08_9to11_TAM.pdf#xml=http://www.tea.state.tx.us/cgi/texis/webinator/search/xml.txt?query=administration+manual&db=db&id=90e938497cb95e10

Texas Education Agency (2008). TAKS blueprints. Retrieved August 7, 2008, from http://www.tea.state.tx.us/student.assessment/taks/booklets/index.html

Texas Education Agency (2008). Texas student assessment program: 2008-2009 accommodations manual: Guidelines for selecting, administering, and evaluating the use of accommodations. Retrieved on August 7, 2008 from http://ritter.tea.state.tx.us/student.assessment/admin/AccommManual_2007_08_tagged.pdf.



Appendix B

AA-MAS Characteristics by State

Table B1. Assessment Type and Question Characteristic by Content Area for States’ AA-MAS, 2008

 

State

Reading

Writing

Math

 

Science

Social Studies

Multiple Choice

Constructed Response

Performance Task

Multiple Choice

Constructed Response

Performance Task

Writing Prompt

Multiple Choice

Constructed Response

Performance Task

Multiple Choice

Constructed Response

Performance Task

Multiple Choice

Constructed Response

Performance Task

California

X

X

X

Connecticut1

X

X

X

X

Kansas

X

X

X

X

X

Louisiana

X

X

X

X

X

X

X

X

X

X

Maryland2

X

X

X

X

X

North Carolina3

X

X

X

X

North Dakota4

X

X

X

X

Oklahoma

X

X

X

X

Texas

X

X

X

X

X

X

Shading indicates a state does not have a separate assessment for that content area.

1 Connecticut will implement in 2008-09.

2 Maryland will implement in 2008-09 at the earliest.

3 North Carolina also has an occupational version that includes Occupational English I, Occupational Mathematics I, and Life Skills Science I and II.

4 The North Dakota assessment is administered on a computer with the student and teacher together; the teacher enters the answer choice given by the student. Each question is presented on a single screen. Most questions are multiple choice, with several teacher-initiated questions (these involve printing a screen shot of the item, providing the student with supplies to answer it, and giving the student verbal instructions; the instructions are provided with the item, and the teacher rates the student’s response from several options).

Table B2. Comparison of AA-MAS and Regular Assessment: Design Changes, 2008

State

Design Change

Distractor Removed

Fewer Items

Fewer Passages

Segmenting of Passage

Shorter Passages

Simplified Language

Other

California

X

X

X

X

X*

Connecticut1

X

X

X

Kansas

X

X

X

X*

X

X*

Louisiana

X

X

X*

X

X*

Maryland2

X

X

X*

North Carolina

X

X

X

X

X*

North Dakota

X*

Oklahoma

X

X

X*

X*

Texas

X*

X

X

X*

X*

X*

X*

Total

6

8

4

3

5

5

8

* See Table B3 for specifications and for descriptions of “other” design changes.

1 Connecticut will implement in 2008-09.

2 Maryland will implement in 2008-09 at the earliest.

Table B3. Specifications and Descriptions of Assessment Design Changes and of “Other” Assessment Design Changes, 2008

State

Specification Details and Other Design Changes

California

Other Design Changes

All Content Areas: One column for most items.

Math: Graphics for most items.

Science: Graphics for most items (stems and options).

Connecticut1

Kansas

Specification Details

Shorter Passages: Reduced sentence, paragraph and passage length.

Other Design Changes

Reading/ELA: Use text with topics familiar/common to KAMM students; Create clear, literal, explicit connections within text; Organize and format text to facilitate students’ processing of information related to the overall purpose/theme (e.g., use of subheadings, bulleted lists, repetition of key words/information).

Math: Reduce the complexity of items in the assessment (e.g., limiting decimal places to hundredths vs. thousandths on the regular assessment) and modify other item specifications (e.g., provide a graphic when appropriate; focus on the mathematical relationships, not solving for a missing part).

Louisiana

Specification Details

Shorter Passages: Only at some grade levels (e.g., upper grades).

Other Design Changes

Reading/ELA: No poetry.

Writing: The prompt score uses two dimensions (composing and audience) of the six used in LEAP and GEE; A shorter response to the writing prompt is required; For the information resources section, questions are placed adjacent to the related resources.

Math: Reading difficulty level of test questions is minimized except for necessary mathematical terms.

Maryland2

Other Design Changes

All Content Areas: Administered as paper and pencil test or computerized version.

North Carolina

Other Design Changes

Writing: Grades 4 and 7 use the same prompts as the regular assessment but are scored using modified achievement standards; The response booklet uses larger spacing between lines, with fewer lines overall on which to respond; Test booklets are modified to have fewer printed lines (25 instead of 50), providing more white space between lines for composing responses.

North Dakota

Other Design Changes

All Content Areas: The test is administered on a computer with the student and teacher together; the teacher enters the answer choice given by the student. Each question is presented on a single screen. Most questions are multiple choice, with several teacher-initiated questions (these involve printing a screen shot of the item, providing the student with supplies to answer it, and giving the student verbal instructions; the instructions are provided with the item, and the teacher rates the student’s response from several options).

Oklahoma

Specification Details

Segmentation of Passages: Break apart passages into smaller portions and place the specific questions that pertain to the smaller portion underneath that section.

Other Design Changes

All Content Areas: Eliminate questions that require students to select the better/best answer; Eliminate answer choices that give students the option to make "no change" to the item.

Reading: Display passages in one-column format.

Writing: Simplify the question; Simplify the writer’s checklist; Use a 3-point holistic writing rubric.

Math: Display the number on all sides for questions about perimeter; Avoid items with negative and positive answer choices of the same number (for example, -4 and +4) for lower grade levels; For lower grades, use grids for area questions; Be consistent with qualifiers in the stem and answer choices (e.g., use mL throughout or milliliters throughout); Avoid questions with best or closest, complicated art, and items that ask students to redefine their perception of an object (e.g., fold this item along the dotted line).

Science: Emphasize pictures over text; Simplify cells and other diagrams; Optimize readability; Highlight, if possible; Put a box around formulas to make them stand out.

Texas

Specification Details

Distractor Removed: Delete one answer choice based on content and/or statistics of item.

Reading/ELA: All other distracters must come from the associated part or a previous part; Revise answer choices as necessary to reflect modifications made to the selection.

Segmentation of Passages: Divide the selection into meaningful thought units (parts) with items associated with that unit (part) immediately following it.

Shorter Passages: Delete extraneous information that does not affect development of the selection or any context related to the tested items.

Simplified Language: Change passive voice to active voice when appropriate; Add precise language to provide additional context for clarification.

Reading: Simplify difficult-to-decode or conceptually difficult vocabulary, phrases, or sentences when not tested; Break compound/complex sentences into simpler sentences; Separate contractions except in cases where this makes the sentence awkward; Edit figurative language when not tested by using simpler sentences and plain language and deleting unnecessary words; Change item from an open-ended statement ending with a dash to a direct question or vice versa, as necessary for clarification.

Math: Simplify complex sentence structure and vocabulary in item and answer choices without eliminating math vocabulary.

Texas (continued)

Other Design Changes

All Content Areas: Delete items that cannot be assessed due to passage modifications; Simplify visual complexity of graphics; Revise answer choices to reflect modifications made to the selection; Add precise language to provide additional context for clarification; Direct student attention to graphics; Other changes include a horizontal item layout (full width), a reduced blueprint, and deletion of all embedded field-test items; Spanish TAKS-M tests are not currently available (no side-by-side versions with Spanish and English).

Reading: The test administrator reads to students a pre-reading text that clarifies purpose and explains difficult concepts and vocabulary; Delete one part of a compound answer choice when possible; Paired selections in grades 3-8 are not tested as thematically linked; Delete items that cannot be modified based on guidelines; Delete crossover items, items that test the author’s organization of the entire selection, and open-ended responses for reading selections in grades 9-11.

Math: Reduce the number of variables and simplify digits in items when appropriate; Delete extraneous information, including irrelevant material and unnecessary words, in items or graphics; Change item from an open-ended statement to a direct question or vice versa, as necessary, for clarification; Use consistent language within an item in order to focus student attention on what is being asked; Revise text as necessary to maintain the authenticity and logic of the item after modification; Provide new text and/or reorganize existing text within the question to explain or clarify the graphic; Provide additional graphics to support text, emphasize ideas, and facilitate comprehension; Limit the number of steps and/or operations in multi-step problems; Provide explicit directions to explain a process such as measuring.

Science: Delete one part of compound answer choices when possible; Delete cluster items, griddable items, negative items, and items that cannot be modified based on guidelines; Delete extraneous information, including irrelevant material and unnecessary words, in items or graphics; Simplify complex sentence structure and vocabulary in item and answer choices without eliminating science vocabulary; Change item from an open-ended statement to a direct question or vice versa, as necessary, for clarification; Add precise language to provide additional context for clarification; Use consistent language within an item in order to focus student attention on what is being asked; Provide the appropriate formula and/or conversion from the science chart near the item; Provide explicit directions to explain a process such as measuring; Limit the number of steps and/or operations in multi-step problems; Provide new text and/or reorganize existing text within the question to explain or clarify the graphic; Provide additional graphics to support text, emphasize ideas, and facilitate comprehension; Reduce the number of variables and simplify digits in items when appropriate.

Social Studies: Provide explanatory text in brackets in historical excerpts (quotations); Simplify complex sentence structure and vocabulary in item and answer choices without eliminating social studies vocabulary; Change item from an open-ended statement to a direct question or vice versa, as necessary, for clarification; Use consistent language within an item in order to focus student attention on what is being asked; Revise text as necessary to maintain the authenticity of the item after modifications; Provide additional graphics to support text, emphasize ideas, and facilitate comprehension; Provide new text and/or reorganize existing text within the question to explain or clarify the graphic; Delete items that cannot be modified based on guidelines.

1 Connecticut will implement in 2008-09.

2 Maryland will implement in 2008-09 at the earliest.

Table B4. AA-MAS Embedded Accommodations, Selected States, 2008

 

 

 

State

Accommodation Incorporated into AA-MAS Assessment Design

Breaks as Needed

Calculator

Fewer Items/Page

Key Text Underlined/Bolded

Larger Font Size

Manipulatives

Read Aloud Questions and Answers

Scribe

Other

California

X

X*

Connecticut1

Kansas

X

X

X*

Louisiana

X

X

Maryland2

North Carolina

X

North Dakota

X*

X*

X*

Oklahoma

X*

X

X

Texas

X

X*

X*

X*

X*

Total

1

1

5

3

4

1

1

1

2

* See Table B5 for specifications and for descriptions of design changes.

1 Connecticut will implement in 2008-09. Whether the assessment will contain embedded accommodations could not be determined; detailed accommodations and test design information for this assessment was not available.

2 Maryland will implement in 2008-09 at the earliest. Accommodations information for this assessment is not available.

Table B5. Specifications and Descriptions of Embedded Accommodations, 2008

State

Specification Details and Other Design Changes

California

Specification Details

Larger font: Helvetica sans serif.

Connecticut1

Kansas

Other Embedded Accommodations

Bulleted List

Louisiana

Maryland2

North Carolina

North Dakota

Specification Details

Calculator and Manipulatives: Supplies given to the student for the assessment include pencil and paper, non-permanent marker, calculator, 12" ruler, number line 0-10, concrete math manipulatives (20), non-math textbook, number line from -7 to +7 with .5 intervals (secondary only), and dictionary.

Scribe: This test will be done on the computer with the student and the teacher together. The test requires the teacher to enter the answer choice given by the student.

Oklahoma

Specification Details

Fewer items per page: Minimize questions on the page (limit to 2).

Texas

Specification Details

Key Text Underlined/Bolded: Science and Social Studies: Provide definition of non-tested vocabulary in a text box near item and bold the defined term in the item. Reading: Provide definition of literary terms in a text box near the item and bold the defined term in the item.

Larger Font Size: Larger point size, Verdana font.

Read Aloud Questions and Answers: Oral administration is not available, but reading of test questions and items is part of the design of the reading and math assessments.

Writing test: Only the pre-reading text is allowed; due to the design of the revising and editing section of the writing test, orally reading the test questions and answers is not allowed. It is not possible to provide standard administration procedures that maintain the TEKS objectives for items such as misspelled words, homonym choice, irregular verb forms, or misplaced modifiers.

Other Embedded Accommodations

Bulleted List: Math, Science and Social Studies: Use bullets to clearly organize complex items into smaller, meaningful parts.

1 Connecticut will implement in 2008-09. Whether the assessment will contain embedded accommodations could not be determined; detailed accommodations and test design information for this assessment was not available.

2 Maryland will implement in 2008-09 at the earliest. Accommodations information for this assessment is not available.


Appendix C

Percentages of Items by Elementary, Middle, and High School Representative Grade

Table C1. Elementary Grade: Reading/ELA AA-MAS Assessments (Grade 4 unless otherwise noted, multiple choice unless otherwise noted)

State

 

ELA Component

AA-MAS

Regular

Number of items

Percent

Number of items

Percent

California

Reading/ELA

Word analysis, fluency, and systematic vocabulary development

11

23 %

18

24 %

Reading comprehension (focus on informational materials)

10

21 %

15

20 %

Literary response and analysis

6

12 %

9

12 %

Total Multiple Choice

27

42

Writing

Written and oral English language conventions

11

23 %

18

24 %

Writing strategies

10

21 %

15

20 %

Total Multiple Choice

21

44 %

33

44 %

Total Reading/ELA and Writing

100%

100%

Connecticut

No information available

Kansas

Reading/ELA

Multiple choice

36¹

NA

Multiple measure items (field test)

16¹

NA

Total Multiple Choice

52¹

74¹

Louisiana2

Reading/ELA

Reading and Responding

Multiple Choice

Constructed Response

 

8

2

 

 

 

20

8

Using Information Resources

Multiple Choice

Constructed Response

 

5

1

 

5

2

Proofreading

Multiple Choice

 

8

 

8

Multiple Choice

Constructed Response

21

3

33

10

Writing

Constructed Response

 

1

1

Maryland

No information available

North Carolina

Reading/ELA1

40

100 %

58

100 %

North Dakota

Reading/ELA1

20-30³

NA

Oklahoma

Reading/ELA

Vocabulary

9-11⁴

25 %

12

24 %

Comprehension/critical literacy

17-19⁴

45 %

23

46 %

Literature

6-8⁴

18 %

9

18 %

Research and information

4-6⁴

13 %

6

12 %

Total Multiple Choice

36-44⁴

100 %

50

100 %

Texas

Reading/ELA

Basic understanding

12

38%5

15

38%5

Literary elements

6

19%5

8

20%5

Analysis using reading strategies

6

19%5

7

18%5

Analysis using critical/thinking skills

8

25%5

10

25%5

Total Multiple Choice

32

101%

40

101%

Writing2

Composition

Constructed Response

1

1

Organization (revising and editing)

3

4

Sentence structure (revising and editing)

7

8

Standard usage/word choice (revising and editing)

7

8

Punctuation, capitalization, spelling (revising and editing)

7

8

Total Multiple Choice

Total Constructed Response

24

1

28

1

1 Totals only available. Information not available by strand.

2 Percentages not calculated due to combination of multiple choice and constructed response items.

3 North Dakota included a range of items in their description of items per subject.

4 Oklahoma listed an “ideal number of items” in their test blueprint.

5 Percentage calculated based on number of items.

Note: NA = Not Available
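Several tables note that strand percentages were "calculated based on number of items," and the rounded totals occasionally sum to 99% or 101%. The short sketch below is not from the report; it assumes each strand's percentage is simply its item count divided by the total, rounded to the nearest whole percent, and uses the Texas grade 4 Reading/ELA AA-MAS counts above (12, 6, 6, and 8 items) to show how that rounding produces a 101% total.

```python
# Hypothetical reconstruction of "percentage calculated based on number of items".
# Strand names and counts come from the Texas grade 4 Reading/ELA AA-MAS blueprint.
strand_items = {
    "Basic understanding": 12,
    "Literary elements": 6,
    "Analysis using reading strategies": 6,
    "Analysis using critical/thinking skills": 8,
}

total = sum(strand_items.values())  # 32 multiple-choice items
percents = {name: round(100 * n / total) for name, n in strand_items.items()}

for name, n in strand_items.items():
    print(f"{name}: {n} items -> {percents[name]}%")

# Each strand rounds to 38, 19, 19, and 25 percent; the rounded
# figures sum to 101%, which is why column totals in these tables
# can read 99%, 100%, or 101%.
print("total:", sum(percents.values()), "%")
```

The same arithmetic explains the off-by-one totals throughout Appendix C; the underlying item counts, not the rounded percentages, are the exact blueprint values.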

Table C2. Elementary Grade: Mathematics AA-MAS Assessments (Grade 4 unless otherwise noted, multiple choice unless otherwise noted)

| State | Mathematics component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Number sense | 23 | 48% | 31 | 48% |
| | Algebra and functions | 10 | 21% | 18 | 28% |
| | Measurement and geometry | 10 | 21% | 12 | 18% |
| | Statistics, data analysis, and probability | 5 | 10% | 4 | 6% |
| | Mathematical reasoning | Embedded | | Embedded | |
| | Total Multiple Choice | 48 | 100% | 65 | 100% |
| Connecticut | No information available | | | | |
| Kansas | Mathematics | 40 | | 72 | |
| Louisiana¹ | Number and number relations | | 20% | | 40% |
| | Algebra | | 16% | | 5% |
| | Measurement | | 16% | | 10% |
| | Geometry | | 16% | | 20% |
| | Data analysis, probability, and discrete math | | 15% | | 10% |
| | Patterns, relations and functions | | 17% | | 15% |
| | Total Multiple Choice | 42 | | 60 | |
| | Total Constructed Response | 2 | | 3 | |
| Maryland | No information available | | | | |
| North Carolina | Calculator Active | 27 (all operational) | 67% | 54 (includes some experimental items) | 66% |
| | Calculator Inactive | 13 (all operational) | 33% | 28 (includes some experimental items) | 34% |
| | Total Multiple Choice | 40 | | 82 | |
| North Dakota | Mathematics | 20-30² | | Not available | |
| Oklahoma | Algebraic reasoning | 6-8³ | 18% | 8 | 18% |
| | Number sense | 8-10³ | 22% | 10 | 22% |
| | Geometry | 9-11³ | 25% | 11 | 24% |
| | Measurement | 8-10³ | 22% | 10 | 22% |
| | Data analysis and statistics | 4-6³ | 13% | 6 | 13% |
| | Total Multiple Choice | 35-45³ | 100% | 45 | 100% |
| Texas | Numbers, operations, and quantitative reasoning | 9 | 26%⁴ | 11 | 26%⁴ |
| | Patterns, relationships, and algebraic reasoning | 6 | 18%⁴ | 7 | 17%⁴ |
| | Geometry and spatial reasoning | 5 | 15%⁴ | 6 | 14%⁴ |
| | Measurement | 5 | 15%⁴ | 6 | 14%⁴ |
| | Probability and statistics | 3 | 9%⁴ | 4 | 10%⁴ |
| | Mathematical processes and tools | 6 | 18%⁴ | 8 | 19%⁴ |
| | Total Multiple Choice | 34 | 100% | 42 | 100% |

¹ Number of items not calculated for each strand due to combination of multiple choice and constructed response items included in percentages.
² North Dakota included a range of items in their description of items per subject.
³ Oklahoma listed an "ideal number of items" in their test blueprint.
⁴ Percentage calculated based on number of items.

Note: NA = Not Available

Table C3. Elementary Grade: Science AA-MAS Assessments (Grade 4 unless otherwise noted)

| State | Science component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Science (Grade 5): Physical sciences | 14¹ | 29% | 18² | 30% |
| | Life science | 14³ | 29% | 18⁴ | 30% |
| | Earth sciences | 14⁵ | 29% | 18⁶ | 30% |
| | Investigation and experimentation | 6⁷ | 13% | 6⁸ | 10% |
| | Total Multiple Choice | 48 | 100% | 60 | 100% |
| Connecticut | No information available | | | | |
| Kansas | Science | NA | | 44 | 100% |
| Louisiana | Science as inquiry | 7 | 20%⁹ | 8 | 20%⁹ |
| | Physical science | 7 | 20%⁹ | 8 | 20%⁹ |
| | Life science | 7 | 20%⁹ | 8 | 20%⁹ |
| | Earth and space science | 7 | 20%⁹ | 8 | 20%⁹ |
| | Science and the environment | 7 | 20%⁹ | 8 | 20%⁹ |
| | Total Multiple Choice | 35 | 100% | 40 | 100% |
| | Total Constructed Response | 2 | | 4 | |
| Maryland | No information available | | | | |
| North Carolina | Science | 60 (all operational) | 100% | 80 (includes some experimental items) | |
| North Dakota | Science | 20-30¹⁰ | 100% | NA | |
| Oklahoma | Science (Grade 5): Observe and measure | 8-10¹¹ | 22% | 10 | 22% |
| | Classify | 8-10¹¹ | 22% | 10 | 22% |
| | Experiment | 9-11¹¹ | 25% | 11 | 24% |
| | Interpret and communicate | 12-14¹¹ | 32% | 14 | 31% |
| | Total Multiple Choice | 37-45¹¹ | 101% | 45 | 99% |
| Texas | Science (Grade 5): Nature of science | 11 | 34%⁹ | 13 | 33%⁹ |
| | Life science | 7 | 22%⁹ | 9 | 22%⁹ |
| | Physical science | 7 | 22%⁹ | 9 | 22%⁹ |
| | Earth/space science | 7 | 22%⁹ | 9 | 22%⁹ |
| | Total Multiple Choice | 32 | 100%⁹ | 40 | 101%⁹ |

¹ Of these items, 8 are from grade 5 and 6 are from grade 4.
² Of these items, 11 are from grade 5 and 7 are from grade 4.
³ Of these items, 7 are from grade 5 and 7 are from grade 4.
⁴ Of these items, 9 are from grade 5 and 9 are from grade 4.
⁵ Of these items, 8 are from grade 5 and 6 are from grade 4.
⁶ Of these items, 11 are from grade 5 and 7 are from grade 4.
⁷ Of these items, 4 are from grade 5 and 2 are from grade 4.
⁸ Of these items, 4 are from grade 5 and 2 are from grade 4.
⁹ Percentage calculated based on number of items.
¹⁰ North Dakota included a range of items in their description of items per subject.
¹¹ Oklahoma listed an "ideal number of items" in their test blueprint.

Note: NA = Not Available

Table C4. Middle School Grade: Reading/ELA AA-MAS Assessments (Grade 8 unless otherwise noted)

| State | ELA component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Reading/ELA: Word analysis, fluency, and systematic vocabulary development | 6 | 11% | 9 | 12% |
| | Reading comprehension (focus on informational materials) | 13 | 24% | 18 | 24% |
| | Literary response and analysis | 11 | 20% | 15 | 20% |
| | Total Multiple Choice (Reading/ELA) | 30 | | 42 | |
| | Writing: Written and oral English language conventions | 11 | 20% | 16 | 21% |
| | Writing strategies | 13 | 24% | 17 | 23% |
| | Total Multiple Choice (Writing) | 24 | | 33 | |
| | Total Reading/ELA and Writing | 54 | 99% | 75 | 100% |
| Connecticut | No information available | | | | |
| Kansas | Reading/ELA: Multiple choice items | 48 | NA | | |
| | Multiple measure items (field test) | 16 | NA | | |
| | Total Multiple Choice | 64 | | 84 | |
| Louisiana¹ | Reading/ELA: Reading and Responding items | 8 | | 20 | |
| | Reading and Responding (constructed response) | 1 | | 8 and 1 essay | |
| | Using Information Resources | 5 | | 5 | |
| | Using Information Resources (constructed response) | 1 | | 2 | |
| | Proofreading | 8 | | 8 | |
| | Total Multiple Choice | 21 | | 33 | |
| | Writing (Grade 7, constructed response) | 1 | | 1 | |
| Maryland | No information available | | | | |
| North Carolina | Reading/ELA: Operational | 40 | | 53 | |
| | Embedded experimental items | 0 | | 9 | |
| | Total Multiple Choice | 40 | | 62 | |
| North Dakota | Reading/ELA | 20-30² | NA | | |
| Oklahoma | Reading/ELA (Grade 7): Vocabulary | | 13% | 10 | 20% |
| | Comprehension | | 43% | 20 | 40% |
| | Literature | | 30% | 12 | 24% |
| | Research and information | | 15% | 8 | 16% |
| | Total Multiple Choice | | 99% | 50 | 100% |
| Texas³ | Reading/ELA: Basic understanding | 10 | | 12 | |
| | Literary elements | 8 | | 10 | |
| | Analysis using reading strategies | 8 | | 10 | |
| | Analysis using critical/thinking skills | 12 | | 16 | |
| | Total Multiple Choice (Reading/ELA) | 38 | | 48 | |
| | Writing (Grade 7): Composition (constructed response) | 1 | | 1 | |
| | Organization (revising and editing) | 4 | | 6 | |
| | Sentence structure (revising and editing) | 8 | | 10 | |
| | Standard usage/word choice (revising and editing) | 10 | | 12 | |
| | Punctuation, capitalization, spelling (revising and editing) | 10 | | 12 | |
| | Total Multiple Choice (Writing) | 32 | | 40 | |
| | Total Constructed Response (Writing) | 1 | | 1 | |

¹ Percentages not calculated due to combination of multiple choice and constructed response items.
² North Dakota included a range of items in their description of items per subject.
³ Percentages not calculated due to mixture of Grade 8 Reading/ELA items and Grade 7 writing items in list.

Note: NA = Not Available

Table C5. Middle School Grade: Mathematics AA-MAS Assessments (Grade 8 unless otherwise noted, multiple choice unless otherwise noted)

| State | Mathematics component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Number sense | 18 | 34% | 22 | 34% |
| | Algebra and functions | 20 | 37% | 25 | 38% |
| | Measurement and geometry | 11 | 20% | 13 | 20% |
| | Statistics, data analysis, and probability | 5 | 9% | 5 | 8% |
| | Mathematical reasoning | Embedded | | Embedded | |
| | Total Multiple Choice | 54 | 100% | 65 | 100% |
| Connecticut | No information available | | | | |
| Kansas | Mathematics | 40 | | 72-104¹ | |
| Louisiana² | Number and number relations | | 20% | | 20% |
| | Algebra | | 16% | | 15% |
| | Measurement | | 16% | | 15% |
| | Geometry | | 16% | | 20% |
| | Data analysis, probability, and discrete math | | 15% | | 20% |
| | Patterns, relations and functions | | 17% | | 10% |
| | Total Multiple Choice | 42 | 100% | 60 | 100% |
| | Total Constructed Response | 2 | | 4 | |
| Maryland | No information available | | | | |
| North Carolina | Total Multiple Choice | 40 (all operational) | 100% | 80 (includes some experimental items) | |
| North Dakota | Mathematics | 20-30³ | 100% | NA | |
| Oklahoma | Algebraic reasoning | 7-9⁴ | 20% | 9 | 20% |
| | Number sense | 6-8⁴ | 18% | 8 | 18% |
| | Geometry | 6-8⁴ | 18% | 8 | 18% |
| | Measurement | 10-12⁴ | 27% | 12 | 27% |
| | Data analysis and statistics | 6-8⁴ | 18% | 8 | 18% |
| | Total Multiple Choice | 35-45⁴ | 101% | 45 | 101% |
| Texas | Numbers, operations, and quantitative reasoning | 8 | 20%⁵ | 10 | 20% |
| | Patterns, relationships, and algebraic reasoning | 8 | 20%⁵ | 10 | 20% |
| | Geometry and spatial reasoning | 6 | 15%⁵ | 7 | 14% |
| | Measurement | 4 | 10%⁵ | 5 | 10% |
| | Probability and statistics | 6 | 15%⁵ | 8 | 16% |
| | Mathematical processes and tools | 8 | 20%⁵ | 10 | 20% |
| | Total Multiple Choice | 40 | 100% | 50 | 100% |

¹ Kansas used a range to describe number of items.
² Number of items not calculated for each strand due to inclusion of constructed response items in information reported by state.
³ North Dakota included a range of items in their description of items per subject.
⁴ Oklahoma listed an "ideal number of items" in their test blueprint.
⁵ Percentage calculated based on number of items.

Table C6. Middle School Grade: Science AA-MAS Assessments (Grade 8 unless otherwise noted)

| State | Science component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Motion | 7 | 13% | 8 | 13% |
| | Forces | 7 | 13% | 8 | 13% |
| | Structure of matter | 8 | 15% | 9 | 15% |
| | Earth in the solar system (earth science) | 7 | 13% | 7 | 12% |
| | Reactions | 6 | 11% | 7 | 12% |
| | Chemistry of living systems (life science) | 3 | 6% | 3 | 5% |
| | Periodic table | 6 | 11% | 7 | 12% |
| | Density and buoyancy | 5 | 9% | 5 | 8% |
| | Investigation and experimentation | 5 | 9% | 6 | 10% |
| | Total Multiple Choice | 54 | 100% | 60 | 100% |
| Connecticut | No information available | | | | |
| Kansas | Science | NA | | 60 (Grade 7) | 100% |
| Louisiana | No information available | | | | |
| Maryland | No information available | | | | |
| North Carolina | Science | 60 (all operational) | | 80 (includes some experimental items) | |
| North Dakota | Science | 20-30¹ | | Not available | |
| Oklahoma | Observe and measure | 6-8² | 18% | 8 | 18% |
| | Classify | 6-8² | 18% | 8 | 18% |
| | Experiment | 13-15² | 35% | 16 | 36% |
| | Interpret and communicate | 11-13² | 30% | 13 | 29% |
| | Total Multiple Choice | 36-44² | 101% | 45 | 101% |
| Texas | Nature of science | 11 | 28%³ | 14 | 28%³ |
| | Living systems and the environment | 10 | 25%³ | 12 | 24%³ |
| | Structures and properties of matter | 5 | 13%³ | 6 | 12%³ |
| | Motion, forces, and energy | 5 | 13%³ | 6 | 12%³ |
| | Earth and space systems | 9 | 22%³ | 12 | 24%³ |
| | Total Multiple Choice | 40 | 101% | 50 | 101% |

¹ North Dakota included a range of items in their description of items per subject.
² Oklahoma listed an "ideal number of items" in their test blueprint.
³ Percentage calculated based on number of items.

Note: NA = Not Available

Table C7. High School Grade: ELA AA-MAS Assessments (Grade 10 unless otherwise noted, multiple choice unless otherwise noted)

| State | ELA component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Reading/ELA | In development | | Has tests | |
| Connecticut | No information available | | | | |
| Kansas | Reading/ELA: Multiple choice | 48 | NA | | |
| | Multiple measures (field test) | 16 | NA | | |
| | Total Multiple Choice | 48 | | 64+¹ | |
| Louisiana² | Reading/ELA: Reading and Responding items | 8 | | 20 | |
| | Reading and Responding (constructed response) | 1 | | 10 and 1 extended essay | |
| | Using Information Resources | 5 | | 5 | |
| | Using Information Resources (constructed response) | 1 | | 2 | |
| | Proofreading | 8 | | 8 | |
| | Total Multiple Choice | 21 | | 33 | |
| | Writing (constructed response) | 1 | | 1 | |
| Maryland | Total Multiple Choice | 30-35³ | | 46 | |
| | Total Constructed Response | | | 4 | |
| North Carolina | Reading: Operational items | 40 | | 56 | |
| | Embedded field test items | 0 | | 24 | |
| | Total Multiple Choice | 40 | | 80 | |
| North Dakota | Reading | 20-30⁴ | | Not available | |
| Oklahoma² | Reading/ELA (English II EOI): Vocabulary | 4-5⁵ | | 4-8⁵ | |
| | Comprehension/critical literacy | 9-11⁵ | | 16-20⁵ | |
| | Literature | 12-14⁵ | | 17-20⁵ | |
| | Research and information | 3-5⁵ | | 4-6⁵ | |
| | Total Multiple Choice (Reading/ELA) | 28-35⁵ | | 41-54⁵ | |
| | Writing component (English II EOI): Grammar/usage and mechanics | 7-9⁵ | | 12 | |
| | Writing prompt | 1 | | 1 | |
| | Total Multiple Choice (Writing) | 7-9⁵ | | 12 | |
| | Total Constructed Response (Writing) | 1 | | 1 | |
| Texas² | Reading: Basic understanding | 7 | | 8 | |
| | Literary elements and techniques | 7 | | 8 | |
| | Literary elements and techniques (constructed response) | | | 1 | |
| | Analysis and critical evaluation (reading) | 8 | | 12 | |
| | Analysis and critical evaluation (constructed response) | | | 2 | |
| | Total Multiple Choice (Reading) | 22 | | 28 | |
| | Writing: Composition (constructed response) | 1 | | 1 | |
| | Revising and editing | 14 | | 20 | |
| | Total Multiple Choice (Writing) | 14 | | 20 | |
| | Total Constructed Response (Writing) | 1 | | 1 | |

¹ Kansas indicated that the assessment had "at least 64 items."
² Percentages not calculated due to combination of multiple choice and constructed response items.
³ Maryland used a range to describe the number of items for each session.
⁴ North Dakota used a range of items in their description of items per subject.
⁵ Oklahoma listed an "ideal number of items" in their test blueprint.

Note: NA = Not Available

Table C8. High School Grade: Mathematics AA-MAS Assessments (Grade 10 unless otherwise noted, multiple choice unless otherwise noted)

| State | Mathematics component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Mathematics | In development | | Has tests | |
| Connecticut | No information available | | | | |
| Kansas | Mathematics | 40 | 100% | 104 (High School) | 100% |
| Louisiana¹ | Number and number relations | | 20% | | 10% |
| | Algebra | | 16% | | 15% |
| | Measurement | | 16% | | 15% |
| | Geometry | | 16% | | 20% |
| | Data analysis, probability, and discrete math | | 15% | | 20% |
| | Patterns, relations and functions | | 17% | | 20% |
| | Total Multiple Choice | 42 | 100% | 60 | 100% |
| | Total Constructed Response | 2 | | 4 | |
| Maryland² | Algebra: Multiple choice items | 30-35³ | | 26 | |
| | Constructed response | | | 12 | |
| North Carolina² | Mathematics | 40 (all operational) | | 80 (includes some experimental items) | |
| North Dakota² | Mathematics | 20-30⁴ | | Not available | |
| Oklahoma | Algebra EOI (HS): Number sense and algebraic operations | 10-12⁵ | 27% | 15 | 27% |
| | Relations and functions | 21-23⁵ | 55% | 31 | 56% |
| | Data analysis, probability & statistics | 6-8⁵ | 18% | 9 | 16% |
| | Total Multiple Choice | 37-43⁵ | 100% | 55 | 99% |
| Texas | Functional relationships | 4 | 9%⁶ | 5 | 9%⁶ |
| | Properties and attributes of functions | 4 | 9%⁶ | 5 | 9%⁶ |
| | Linear functions | 4 | 9%⁶ | 5 | 9%⁶ |
| | Linear functions and inequalities | 4 | 9%⁶ | 5 | 9%⁶ |
| | Quadratic, other nonlinear functions | 4 | 9%⁶ | 5 | 9%⁶ |
| | Geometric relationships and spatial reasoning | 4 | 9%⁶ | 5 | 9%⁶ |
| | 2-D and 3-D representations | 4 | 9%⁶ | 5 | 9%⁶ |
| | Measurement | 6 | 13%⁶ | 7 | 13% |
| | Percents, proportions, probability, and statistics | 4 | 9%⁶ | 5 | 9%⁶ |
| | Mathematical processes and tools | 7 | 16%⁶ | 9 | 16% |
| | Total Multiple Choice | 45 | 101% | 56 | 101% |

¹ Number of items not calculated for each strand due to combination of multiple choice and constructed response items included in percentages.
² Totals only available. Information not available by strand.
³ Maryland used a range to describe the number of items for each session.
⁴ North Dakota used a range of items in their description of items per subject.
⁵ Oklahoma listed an "ideal number of items" in their test blueprint.
⁶ Percentage calculated based on number of items.

Table C9. High School Grade: Science AA-MAS Assessments (Grade 10 unless otherwise noted)

| State | Science component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Science | In development | | Has tests | |
| Connecticut | No information available | | | | |
| Kansas | Science: Physical science | NA | | 30 | 50%⁷ |
| | Life science | NA | | 30 | 50%⁷ |
| | Total Multiple Choice | NA | | 60 | 100%⁷ |
| Louisiana¹ | Science as inquiry | 7 | | 8 | |
| | Physical science | 7 | | 10 | |
| | Physical science (constructed response) | | | 1 | |
| | Life science | 7 | | 10 | |
| | Life science (constructed response) | | | 1 | |
| | Earth and space science | 7 | | 6 | |
| | Earth and space science (constructed response) | | | 1 | |
| | Science and the environment | 7 | | 6 | |
| | Science and the environment (constructed response) | | | 1 | |
| | Total Multiple Choice | 35 | | 40 | |
| | Total Constructed Response | 2 | | 3² | |
| Maryland³ | Science (Biology): Total Multiple Choice | 30-35⁴ | | 48 | |
| | Total Constructed Response | | | 7 | |
| North Carolina³ | Science | 40 (all operational) | | 80 (includes some experimental items) | |
| North Dakota³ | Science | 20-30⁵ | | NA | |
| Oklahoma | Science Biology EOI: Observe and measure | 5-7⁶ | 13% | 8 | 13% |
| | Classify | 5-7⁶ | 13% | 8 | 13% |
| | Experiment | 11-13⁶ | 26% | 16 | 27% |
| | Interpret and communicate | 15-17⁶ | 35% | 20 | 34% |
| | Model | 5-7⁶ | 13% | 8 | 13% |
| | Total Multiple Choice | 41-51⁶ | 100% | 60 | 100% |
| Texas | Nature of science | 14 | 32%⁷ | 17 | 31%⁷ |
| | Organization of living systems | 9 | 20%⁷ | 11 | 20%⁷ |
| | Interdependence of organisms | 9 | 20%⁷ | 11 | 20%⁷ |
| | Structures and properties of matter | 6 | 14%⁷ | 8 | 15%⁷ |
| | Motion, forces, and energy | 6 | 14%⁷ | 8 | 15%⁷ |
| | Total Multiple Choice | 44 | 100% | 55 | 101% |

¹ Percentages not calculated due to combination of multiple choice and constructed response items.
² Louisiana's Science as Inquiry dimensions I and II had three constructed response items and one extended constructed response item in two of four strands.
³ Detailed information by strand not available on Web site.
⁴ Maryland used a range to describe the number of items for each session.
⁵ North Dakota used a range of items in their description of items per subject.
⁶ Oklahoma listed an "ideal number of items" in their test blueprint.
⁷ Percentage calculated based on number of items.

Note: NA = Not Available

Table C10. All Grades: Social Studies AA-MAS Assessments (Grade levels noted in table, multiple choice unless otherwise noted)

| State | Social Studies component | AA-MAS items | AA-MAS percent | Regular items | Regular percent |
|---|---|---|---|---|---|
| California | Social Studies | In development | | Has tests | |
| Connecticut | No information available | | | | |
| Kansas | Social studies test (Grade 6) | 42¹ | 100% | 48 | 100% |
| | Social studies test (Grade 8) | 49² | 100% | 60 | 100% |
| | Social studies test (High School): US section | 30 | 57% | 30 | 50% |
| | World section | 22 | 42% | 30 | 50% |
| | Total Multiple Choice (High School) | 52 | 99% | 60 | 100% |
| Louisiana³ | Social Studies, Grade 4: Total Multiple Choice | 32 | | NA | |
| | Total Constructed Response | 2 | | NA | |
| | Social Studies, Grade 8 | No information available | | | |
| | Social Studies, Grade 11: Geography | 8 | | 9 | |
| | Civics | 8 | | 15 | |
| | Economics | 8 | | 12 | |
| | History | 8 | | 24 | |
| | Total Multiple Choice (Grade 11) | 32 | | 60 | |
| | Total Constructed Response (Grade 11) | 2 | | 4 | |
| Maryland | Government: Total Multiple Choice | 30-35⁴ | | 50 | |
| | Total Constructed Response | | | 8 | |
| North Carolina | No test | | | | |
| North Dakota | No test | | | | |
| Oklahoma | No test | | | | |
| Texas | Social Studies, Grade 8: History | 10 | 26% | 13 | 27% |
| | Geography | 5 | 13% | 6 | 13% |
| | Economics and social influences | 7 | 18% | 9 | 19% |
| | Political influences | 10 | 26% | 12 | 25% |
| | Social studies skills | 6 | 16% | 8 | 17% |
| | Total Multiple Choice (Grade 8) | 38 | 99% | 48 | 101% |
| | Social Studies, Grade 10: History | 5 | 13% | 7 | 14% |
| | Geography | 10 | 25% | 12 | 24% |
| | Economics and social influences | 6 | 15% | 7 | 14% |
| | Political influences | 9 | 23% | 12 | 24% |
| | Social studies skills | 10 | 25% | 12 | 24% |
| | Total Multiple Choice (Grade 10) | 40 | 101% | 50 | 100% |
| | Social Studies, Grade 11: History | 10 | 23% | 13 | 24% |
| | Geography | 7 | 16% | 9 | 16% |
| | Economics and social influences | 10 | 23% | 13 | 24% |
| | Political influences | 7 | 16% | 9 | 16% |
| | Social studies skills | 10 | 23% | 11 | 20% |
| | Total Multiple Choice (Grade 11) | 44 | 101% | 55 | 100% |

¹ Of these items, 18 are grade 5 items and 24 are grade 6.
² Of these items, 23 are grade 7 items and 26 are grade 8.
³ Percentages not calculated due to combination of multiple choice and constructed response items.
⁴ Maryland used a range to describe the number of items for each session.

Note: NA = Not Available