Prepared by:
Martha L. Thurlow & Hilda Ives Wiley
August 2004
Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:
Thurlow, M. L., & Wiley, H. I. (2004). Almost there in public reporting of assessment results for students with disabilities (Technical Report 39). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved [today's date], from the World Wide Web: http://education.umn.edu/NCEO/OnlinePubs/Technical39.htm
This report is the sixth analysis of state reports conducted by the National Center on Educational Outcomes (NCEO) to examine the extent to which states publicly report information about students with disabilities in statewide assessments. We present descriptions of statewide testing systems and examine whether these systems included participation and performance information for students with disabilities, as indicated by publicly available data. The majority of our information was obtained by analyzing states’ Department of Education Web sites. If disaggregated information was not posted, the states were then asked to submit public documents that included these results.
For the 2001-2002 school year, 35 states reported both participation and performance data on students with disabilities for their general assessments, an increase from the 28 states that did so in 2000-2001. For the 2001-2002 year, participation data were presented in a variety of ways. The most common was to report the number of students tested, which almost all states reporting participation data did. Twenty-two states went beyond counts to report rates of participation.
General assessment performance data for students with disabilities also were reported in a variety of ways by states. More often than in previous years, performance data reported on states' general assessments compared students with disabilities to general education students or to the total population of students. The results clearly illustrate the achievement gap that exists between these two groups, although the size of the gap varies dramatically across states.
Alternate assessment participation and performance reporting for 2001-2002 was available in 22 states, up considerably from only 13 states in 2000-2001. Other states provided only participation data (5 states) or only performance data (5 states). The data presented on alternate assessments usually consisted of just an overall count of students participating or an overall rate of students passing. Though some states broke participation and performance information down by grade level or content area, many states still provided only aggregated numbers.
Although not all states were yet reporting on the participation and performance of students with disabilities on either the general assessment or the alternate assessments administered in 2001-2002, the number doing so continued to increase from previous years. In addition, the nature of the reporting seemed to have improved. Nevertheless, there are many reasons to still be disappointed. Chief among these is the fact that not all states report, a requirement of IDEA 1997, a law enacted well before this data reporting period. Based on the states that do report, however, it is possible to generate some recommendations for how to move things forward. Several ideas are presented in this report.
Since 1997, the National Center on Educational Outcomes has been collecting data on the inclusion of students with disabilities in statewide assessment systems. This effort was initiated because of the recognition that excluding large numbers of students with disabilities from state assessments presents an inaccurate picture of how all students are performing (Thurlow, House, Boys, Scott, & Ysseldyke, 2000; Thurlow, Ysseldyke, Erickson, & Elliott, 1997; Zlatos, 1994). It is impossible to get a sense of the effectiveness of public education if all participants are not being measured. In addition, if the performance of students with disabilities is not assessed, teachers and administrators feel less pressure to ensure that this group of students is making visible academic progress (Elliott, Thurlow, & Ysseldyke, 1996).
Cibulka and Derlin (1995) proposed several purposes that public reporting of student assessment data serves. First, results can be used to make educated decisions about educational programs and general school effectiveness. Second, it is possible to evaluate the achievement of students. An effective accountability system is one that shares results both with educators and with the general public. When shared with others, data should be clear and accessible so that all stakeholders can use the data to reach helpful conclusions and determine whether school programs are working effectively.
Until the mid-1990s, legislation did not address the exclusion of significant numbers of students with disabilities from large-scale assessments. In 1994, however, the reauthorized Elementary and Secondary Education Act (ESEA) amended Title I to require that states assess all students, including students with disabilities, using state tests to determine their progress toward state standards. Title I also required that disaggregated performance information be reported in a public report of school progress (U.S. Department of Education, 1999). Following ESEA, the Individuals with Disabilities Education Act (IDEA) of 1990 was amended in 1997 to require that states report state and district-wide assessment information for students with disabilities with the same frequency and in the same detail as they report for students without disabilities. Both participation and performance information must be reported for both general assessments and alternate assessments (National Research Council, 1999). States were required to develop alternate assessments to measure the performance of students who were unable to participate in the general assessment; these students most often were students with significant cognitive disabilities.
The 2001 reauthorization of the Elementary and Secondary Education Act, the No Child Left Behind (NCLB) Act, required that by the beginning of the 2002-2003 school year, assessment results be reported at the classroom, district, and state levels (Fast, Blank, Potts, & Williams, 2002). Results also had to be publicly reported by the start of the following school year (i.e., fall 2003). This meant that most states needed to speed up the process of scoring, analyzing, and disseminating results to students, schools, and districts. This has undoubtedly led to more Web-based reporting practices, which are faster and less expensive than printing lengthy reports.
Educational policies over the past decade have become more directive about participation in assessments. In the early 1990s, there was great variation in state guidelines that addressed the participation of students with disabilities in state assessments (Thurlow, Ysseldyke, & Silverstein, 1995). Participation rates of students with disabilities were known to be quite low and to vary widely from one state to another (Erickson, Thurlow, & Thor, 1995; McGrew, Thurlow, Shriner, & Spiegel, 1992; Shriner & Thurlow, 1992). In 1992, only 19 states reported assessment information, and of those 19 states, most reported participation rates of less than 10% of students with disabilities (Shriner & Thurlow, 1992). Many states reported that they did not know how many of their students with disabilities participated. If more than 10% of the population is not being assessed, it is impossible to get an accurate picture of how all students are performing (Thurlow & Thompson, 1999). In examining assessment data from the 2000-2001 school year, we found that 38 states reported participation information on at least some of their state assessments (Thurlow, Wiley, & Bielinski, 2003). Under No Child Left Behind, to be considered as making adequate yearly progress, schools must have 95% of their students participate in the state assessment. This requirement extends to subgroups as well: 95% of each subgroup must be tested. Recent flexibility proposals have allowed the 95% to be achieved through an averaging process across two or three years. This "flexibility" does not allow for wide variations in participation, however: to average 95%, states have to be very close to that percentage in each year that figures into the average.
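The arithmetic behind the averaging flexibility can be sketched briefly. The yearly rates below are invented for illustration; the point is simply that a year far below 95% is very hard to offset:

```python
# Hypothetical illustration of the NCLB 95% participation requirement
# when averaged across two or three years. All rates are invented.

def meets_requirement(yearly_rates, target=95.0):
    """Return True if the average of the yearly participation
    percentages meets or exceeds the target percentage."""
    return sum(yearly_rates) / len(yearly_rates) >= target

# A state at 93.0% one year must exceed 95% in the other year(s)
# for the average to reach 95%.
print(meets_requirement([93.0, 97.5]))        # average 95.25 -> True
print(meets_requirement([85.0, 96.0, 96.0]))  # average ~92.3 -> False
```

Because participation percentages cannot exceed 100, a single year much below 95% effectively sinks the multi-year average, which is why the flexibility tolerates only small year-to-year variation.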
Within the past several years, a growing number of states have begun publicly reporting information about the participation and performance of students with disabilities in their statewide assessment system (Bielinski, Thurlow, Callendar, & Bolt, 2001; Thurlow, Langenfeld, Nelson, Shin, & Coleman, 1998; Thurlow et al., 2003; Ysseldyke, Thurlow, Langenfeld, Nelson, Teelucksingh, & Seyfarth, 1998). During the 2000-2001 academic year, 76% of states reported both participation and performance for students with disabilities on at least some of their state tests (Thurlow et al., 2003). When either participation or performance is considered, this percentage increases to 88%. Clearly states are now beginning to report these disaggregated data, though only 56% of states report this information for all of their state assessments (Robelen, 2004).
Because most states now report assessment results on the Web, we began our search for information by reviewing every state’s Department of Education Web site. We began collecting data in May 2003 and collected information for the 2001-2002 school year. We recorded assessments administered and documented whether participation and performance information was reported for students with disabilities. We also examined the way in which participation was reported and whether participation and performance information was given for students who took the test with accommodations. By May 2003, a large percentage of the states had already posted their 2001-2002 assessment data on-line in a way that made them easy to locate and understand.
On June 5, 2003 we mailed a letter to each state Director of Assessment (see Appendix A) outlining our findings from the state’s Web site. We asked them to review our findings, correct for any misinformation, and provide the public document or Web site at which the correct information was available. We asked that they send us these changes by June 27, 2003. Many states that had changes to make either sent us printed documents with the data or directed us to a Web page that we had not found in our search. Several states gave us dates by which time their disaggregated assessment results should be posted. Overall, we received responses from 20 Directors of Assessment.
In order to ensure that our findings were as accurate as possible, we followed up these efforts with a letter to each state’s Director of Special Education (see Appendix B). These letters were mailed on August 5, 2003. The letters asked the Directors to review our findings and make any changes by August 29, 2003. For states from which we had already received a response from the Director of Assessment, we noted that in the letter by stating that “These results were verified by your state’s Director of Assessment, but if you have anything to add, please let us know.” For states from which we did not hear from the Director of Assessment, we sent the same letter to the Director of Special Education as we had sent to the Director of Assessment. Of the 50 states to which we sent letters, 25 responded with either corrections or to verify that the information that we had was correct.
Appendix C lists all the state mandated general assessments that we identified for the 50 states. This list includes the state, the name of the test, the grades and content areas tested, and whether the state had publicly available disaggregated participation and performance data for students with disabilities for their 2001-2002 state assessments. We identified 111 separate statewide tests or testing systems. Thirty-five states had more than one general assessment.
Figure 1 breaks down the 111 testing systems by type: norm-referenced tests (NRT), criterion-referenced tests (CRT), exit tests used as a gate for graduation or earning a particular type of diploma (EXIT), and tests that combined standardized NRTs with additional state-developed test items (NRT/CRT). While we recognize that many exit exams may also be NRTs, CRTs, or both, the high stakes consequences for students on these exit exams indicated a need to create a separate category for these tests.
Criterion-referenced tests (CRTs) comprise 52% of all the assessments that states administer. In fact, only four states (Hawaii, Iowa, Montana, South Dakota) do not administer a CRT, and an additional four states (Delaware, Florida, Indiana, Missouri) do not have a separate CRT, but rather have added criterion-referenced items to their NRT (NRT/CRT). The next largest number of tests administered are norm-referenced tests (NRTs), which comprise 22% of all assessments, followed closely by exit exams (21%). These numbers are similar to the 2000-2001 assessment pattern, in which 51% of tests were CRTs, 33% were NRTs, and 22% were exit exams (Thurlow et al., 2003).
Figure 2 summarizes the different ways in which general assessment data were reported for all 50 states. Seventy percent of states reported disaggregated participation and performance information on students with disabilities for all their assessments, 8 percent reported performance for all assessments (but not participation data), 18 percent reported participation and performance information for some assessments, and 4 percent did not report any disaggregated information.
Figure 2. States that Disaggregate Assessment Results for Students with Disabilities
Figure 3 indicates which of the 50 states: (1) reported participation and performance for all of their general state assessments (70%), (2) reported performance results on all general assessments, but not participation data (8%), (3) reported participation and performance for some of their general assessments (18%), and (4) did not report either participation or performance results for any of their general assessments (4%). States that reported disaggregated data for students with disabilities at the state level generally reported results at the district and school levels, too.
Thirty-five states reported participation and performance results for students with disabilities on all of their tests. As evident in Figure 3, these states are spread across the U.S.; they include states with both small and large populations. The states that reported disaggregated 2001-2002 data for their general assessments did so regardless of whether they had just one assessment or multiple assessments (20 of the 35 had more than one), and regardless of whether they tested in just a few grades or in as many as 10 grades.
Figure 3. States that Report 2001-2002 Disaggregated Results for Students with Disabilities
Of the 9 states that reported participation and performance information for some of their assessments, the majority (n=7) were only missing data on one test. These states were Florida, Louisiana, South Dakota, Texas, Washington, West Virginia, and Wyoming. Virginia was missing performance data for only one assessment, though participation data were missing for two assessments. It is important to note that all states that reported disaggregated performance data did so at the state level except Wyoming, which only provided disaggregated information at the district level.
As shown in Figure 4, results from our Web searches and mailings revealed that 22 states publicly reported both participation and performance results for their alternate assessment. An additional 5 states reported performance only, and 5 states reported participation only. Thus, 36 percent of states did not report any type of information about their alternate assessment. However, 44% of states did report both participation and performance for their alternate assessment.
Figure 5 illustrates which states reported alternate assessment participation and performance data. There is no obvious geographic pattern to the states that did not report alternate assessment data. The absence of reported data does not mean these states had no alternate assessment in 2001-2002. According to Thompson and Thurlow (2001), all but 2 states had an alternate assessment approach by 2001, and all but 16 states had decided how scores from the alternate assessment would be reported. The states with no information most likely include those 16 states plus some additional states.
Figure 5. States Publicly Reporting Data for the 2001-2002 Alternate Assessment
Among the states identified as providing participation data for students with disabilities, the way in which this information was reported varied (see Appendix D). Figure 6 illustrates the number of assessments with disaggregated participation data and how those participation data were reported. Information is presented in terms of the number of assessments for which participation data were available, not in terms of the number of states. For example, in Alabama there are three assessments and each is counted separately. We used this approach because not all states report participation in the same way across assessments. For example, one state might report only a count of students tested for one assessment, but for another assessment it might report a count tested, a percent tested, and a percent not tested.
Of the 86 general assessments for which participation data were reported, 62 had just a count of students tested; only 15 states reported both a count and a percentage of students tested. Overall, states reported the percent of students tested for 22 assessments; for 14 of these, a count was provided as well as a percentage. For 15 assessments, the number or percentage of students not tested was provided; for 8 assessments, states provided the number or percent of students who were exempted or excluded; and for 6 assessments, states provided the number or percent of students who were absent.
Figure 7 illustrates the participation rates reported in those states for which clear participation rate information was reported. Though the percentage of students tested was given for 22 assessments, those assessments came from only 15 states; an additional 4 states gave the percent of students not tested. While it may have been possible to calculate participation rates for other states as well, using reported information about student enrollment and the number of students tested, we did not take that extra step because our focus was on information that was readily available. However, if a state provided only the percentage of students not tested, we did report the percentage of students tested in the table. It is important that states report the percentage of students tested, in addition to just a count, because a percentage presents a more accurate picture of how many students are participating. These rates should ideally be based on the school enrollment on the day of testing (Ysseldyke et al., 1998); however, using the December 1st Child Count data is an acceptable option if test-day enrollment is not available.
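The calculation we chose not to perform is straightforward: a participation rate divides the number of students with disabilities tested by the enrollment on the day of testing (or the December 1 Child Count). Both counts below are hypothetical:

```python
# Hypothetical sketch of a participation-rate calculation from a
# reported count tested and an enrollment figure. Numbers are invented.

def participation_rate(tested, enrolled):
    """Percent of enrolled students who were tested,
    rounded to one decimal place."""
    return round(100.0 * tested / enrolled, 1)

# e.g., 4,812 students with disabilities tested of 5,203 enrolled
print(participation_rate(4812, 5203))  # -> 92.5
```

The complement works the same way: if a state reports only the percent not tested, the percent tested is simply 100 minus that figure, which is how we filled the table for those states.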
To summarize participation rate information, we selected one grade to portray in Figure 7. In most states, participation in the middle school/junior high school math test was used. If the state tested in more than one grade at the middle school level, the 8th grade test data were used. Appendix E contains information about the tests and exact grades used for Figure 7. Not all states provided data broken down in this way. For example, in Nevada, the grade 8 data are combined for the reading and math test, and the South Carolina data are for all grades combined for the math assessment. Oklahoma, Texas, and West Virginia also provided a rate, but because it was for all subjects and grades, it was not included. Three other states (Illinois, Kentucky, New Hampshire) provided a participation rate that was the number of students with disabilities who participated out of all students, rather than a percent of students with disabilities who were tested. Though this is more helpful than not providing a rate, it was not included in Figure 7 because it was a different type of rate. (Maine provided a rate for the number of students with disabilities who took the test with accommodations, but not for the number who tested without an accommodation; therefore, those results are not included.) It is important to note that the results in Figure 7 were obtained from different types of tests used in these states. Nevertheless, during the 2001-2002 academic year, participation rates ranged from 71.1% to 99.1%. Five of the 12 states had participation rates of 95% or higher.
Figure 8 illustrates how different states reported participation for their alternate assessment. Appendix F outlines in more detail all the ways that information is reported. Twenty-seven states provided participation information for their alternate assessment. Fourteen states provided only a count, and not a percentage, of students tested or not tested. Three states provided the opposite: the percentage of students tested or not tested, but not the number. Ten states gave both a count and the percentage of students either tested or not tested.
In our analysis of state reporting for 2001-2002, we looked at additional characteristics of states’ information. Specifically, we looked at information available on accommodations used, and if available, performance when accommodations were used. We also examined the quality of Web-based reporting.
Fourteen states provided information about students who took an assessment with an accommodation. In some cases, states reported on standard accommodations (those considered appropriate and not ones that change the constructs measured by the assessment); in other cases they reported on nonstandard accommodations (which generally were considered to change the constructs measured – and might be referred to as “non-allowed” – although IEP teams could select them), and in other cases they reported on both or did not specify which.
Table 1 describes the information the 14 states provided. Appendix G contains additional information about the information provided by the 14 states, with details about the participation and performance of students in each category that the state provides. Five states broke down student participation and performance by accommodation (e.g., directions read orally, Braille, extended time), whereas 9 states provided only overall information on students who, in general, used accommodations.
Table 1. States that Reported Information about Accommodations
| State | Standard/Non-standard Accommodation | Participation | Performance | For Whom |
| --- | --- | --- | --- | --- |
| Arkansas | Not specified | Yes | Yes | SWD |
| Colorado | Not specified | Yes | Yes | ALL |
| Indiana | Not specified | Yes | Yes | SWD & General Ed |
| Kentucky | Standard | Yes | Yes | SWD |
| Louisiana | Not specified | Yes | Yes | ALL |
| Massachusetts | Standard | Yes | No | SWD |
| Nebraska | Standard | Yes | No | ALL |
| Nevada | Non-standard (not allowed) | Yes | No | SWD |
| New Hampshire | Non-standard (not allowed) | Yes | No | ALL |
| New Mexico | Standard | Yes | Yes | ALL |
| North Carolina | Standard and Non-standard | Yes | Yes | ALL |
| Rhode Island | Standard | Yes | Yes | SWD |
| West Virginia | Non-standard (not allowed) | Yes | No | SWD & ALL |

Note: SWD = Students with Disabilities
After we examined every state's Department of Education Web site, it became evident that some states presented data in a much more accessible format than others. Because assessment data are reported on the Web in most states, it is crucial that these data be clear and easy to access. We decided to collect data for each state that reported results for students with disabilities online and to examine the quality of the reporting on its Web site. It is important to note, however, that because Web sites are frequently updated, it is possible that some of our findings no longer hold true.
To describe what we found, we identified the elements that we would like to see on a Web site. The following eight elements were evaluated (see Appendix H):
1. The word(s) on which one must click on the Department of Education's homepage.
2. The number of steps, or "clicks," it took to get from the state's homepage to the disaggregated results.
3. The proximity of special education data to general education/all students data.
4. The proximity of special education data to the alternate assessment data.
5. Whether the term "proficient" was defined for student performance.
6. Whether the data all appear on one page when printed, or whether some of the data are cut off.
7. Whether the date of testing appeared on the same page as the assessment results (giving just the year, e.g., 2001, did not count because it did not indicate the academic year from which the data came).
8. Whether at least two years of trend data were available on the same page, or a direct link was given on the page (trend data being 2000-2001 data and 2001-2002 data).
When finding disaggregated data, one must click on a word on each state's Department of Education homepage to begin looking for the data. The clarity of each initial word or phrase varied widely by state. Of the 85 assessments with disaggregated Web-based results, 17 used the word "assessment." Ten used the word "test" or "testing." Five more used the word "accountability." Several other states gave the name of their state test as a first step to retrieving assessment data. These are relatively clear indicators of where to find test data. In contrast, some states used words that did not indicate assessment data, such as "More," "Statistics," "Administrators," and "Programs and Services." If the effort of posting state assessment results is to reflect an intent for the public to view and use the data, states must make that information as publicly accessible as possible.
To get from a Department of Education Homepage to disaggregated participation and performance data, it is often necessary to “click” several times and follow a series of links through the Web page. The number of clicks it took to get from the homepage to the actual data ranged from 1 to 7 clicks. The most common number of clicks between the homepage and the disaggregated data was 3 clicks. Figure 9 shows the range of steps required to find assessment results on Web sites, starting from the homepage. The numbers reflect tests (total n=85), not states, for which we found disaggregated performance results. We found that the more clicks it took to reach the data, the easier it was to get lost along the way and the more difficult it was to find the results again at a later time.
Figure 9. The Number of "Clicks" from the Homepage to the Disaggregated Data
We also examined the proximity of special education data to the data of general education students and the total population of students. States that had the most comprehensive presentation of data posted disaggregated results either with, or clearly linked to, results for all students. Figure 10 illustrates the range of distances that results for students with disabilities were from results for general education students or the total population of students. This figure is based on the total number of tests (n=85), not the number of states. As evident in the figure, over half of the general assessments had results for students with disabilities on the same page with results for general education students or the total population of students.
Figure 10. Proximity of Special Education Data to Data for All Students or General Education
Of all the assessments with Web-based disaggregated data, 69% had results posted for all students on the same page as results for students with disabilities. Another 15% of assessments had their special education results one click away from their general assessment results, and 11% had these data separated by two or more clicks. Though the majority of these assessments had the special education data 3 clicks away from data for general education students, one state had 2 assessments that were 8 clicks away. For 5% of the assessments, a paper document had been scanned onto the computer and disaggregated data were provided with data for all students in the same document, though on different pages. A problem with this, however, is that many documents are long, some over 300 pages. It takes a long time to look through the document to find those data for which you are searching, and it is quite easy to miss data, too.
We also examined the proximity of the special education performance results to performance or participation information for the state’s alternate assessment. Results varied widely. For 11 assessments, these data were on the same page and for 6 assessments the data were in the same document. The alternate assessment information was 1 click away for 2 assessments, 2 clicks away for 12 assessments, and 3 clicks away for 9 assessments. For the remaining 12 assessments, alternate assessment data were at least 4 or more clicks away, and for three of those, the data were 8 clicks apart.
Another quality issue that we assessed was whether a definition of proficiency was provided where performance results were presented. Many people look at assessment data to see the percentage of proficient students in a certain grade, subject, or subgroup. States need to indicate performance levels either with written terms (e.g., basic, proficient, advanced) or by defining the categories (e.g., if numbers are used, a key indicating what each number stands for). Of the 85 assessments in our analysis, 65 provided a definition of proficiency. Many states used the performance levels of Below Basic, Basic, Proficient, and Advanced; others used only a distinction between the percent passing or meeting the standard and the percent not doing so.
Two other quality issues that we explored were whether the data could be printed on one page and whether the date of the assessment was provided with the results. In several cases we tried printing the data, but what appeared on the Web site did not all print onto a page, even using a horizontal (landscape) page layout; frequently, data on the right side of the page were cut off. For 19 of the 85 assessments (22%), the data that appeared on the Web site did not all print. The second indicator reflected a belief that it is important to have the date of testing on the same page as the test results so that people know the year of the data they are examining. To be considered as having the date, states could not give only one year (e.g., 2002), because it was unclear whether those data came from the 2001-2002 school year or the 2002-2003 school year. States needed to give an academic year (e.g., 2001-2002) or provide the month or season (e.g., fall 2001) in which the assessment occurred. Twenty-two assessments (26%) did not provide the date with their assessment data.
A final quality issue we chose to examine was whether two years of assessment trend data were reported. Of the 85 assessments, 36 (42%) did provide at least two years of trend data (2001-2002 and 2000-2001). We only counted states that had two years of data available together on the same page or one link apart. While we recognize that more assessments might have data from the previous year available on the Web site, we believed that it was too difficult to examine trends and patterns when the data were spread out.
States such as Nevada and Washington presented particularly clear and thorough results. Disaggregated assessment results were posted only three clicks from the homepage. For Nevada, results for students with disabilities were given on the same page as results for all students, and in Washington, results were only one click apart. Though Nevada did not provide results for its alternate assessment, participation for Washington’s alternate assessment was given on the same page as results for both all students and students with disabilities; both states defined “proficiency.” The results for Nevada and Washington had the date of testing on them and were able to be printed without any part being cut off. Though Nevada did not provide trend data, Washington had data from the previous three years of testing just one click away.
We examined the performance of all students, and then the performance of students with disabilities. When examining performance across states, it is important to remember that the scores from each state are based on different tests. These tests may emphasize different standards and are likely to differ in difficulty. In addition, there is great variability across states in terms of the percentages of students with disabilities who have been included in the assessments. Thus, it is not appropriate to compare performance across states. It is possible, however, to examine the performance differences within each state between all students and students with disabilities.
Performance results are reported for both reading and math assessments because these are the content areas assessed by most states and the first that NCLB requires to be assessed, reported, and included in accountability. For greater comparability, and because states are moving away from norm-referenced tests toward wider use of criterion-referenced tests, we report performance only on CRTs. We also report performance on exit exams that students are required to pass to graduate from high school with a standard diploma.
We separated grade levels into three categories: elementary (3-5), middle school (6-8), and high school (9-12). For our summary, we chose to present only one grade for each level. When available, 4th grade was used to represent the elementary level, 8th grade the middle school level, and 10th grade the high school level. These grades were chosen because they are the grades at which the greatest number of states test students. If data from those grades were not available, the grade below was used, followed by the remaining grade if no other data were available. The number in parentheses next to the state’s name indicates the grade from which the data were obtained. Appendix I reports the name of the test we used and the grade.
Although most states reported the performance of all students and then the performance of subgroups, such as students with disabilities, some states did not report the performance of all students. When these data were not available, the performance of general education students was used instead. Because the performance of general education students as a group may be slightly higher than the performance of all students as a group, we have marked with an asterisk those states whose “all students” results are actually based only on general education students. These data are presented in Figures 11-18.
Figure 11. Elementary School Reading Performance on Criterion-Referenced Tests
Figure 12. Middle School Reading Performance on Criterion-Referenced Tests
Figure 13. High School Reading Performance on Criterion-Referenced Tests
As evident in Figures 11-13, the performance of students with disabilities in reading is generally much lower than the performance of all students. Though the gap is greater in some states than in others, students with disabilities always performed below all students, and the gap widens as students move from elementary to high school. At the elementary level, the widest gap was 43.7 points, in New Jersey. In middle school, the greatest gap was 56 points, in both New Jersey and Delaware. At the high school level, the largest gap was 67 points, in Connecticut. Though these are the largest gaps, the pattern is the same for most states.
Performance of all students and students with disabilities on states’ 2001-2002 mathematics assessments is shown in Figures 14-16. The figures cover elementary, middle, and high school in a manner similar to the reading figures, and the same cautions apply.
Figure 14. Elementary School Mathematics Performance on Criterion-Referenced Tests
Figure 15. Middle School Mathematics Performance on Criterion-Referenced Tests
Figure 16. High School Mathematics Performance on Criterion-Referenced Tests
As shown in Figures 14-16, the gap between students with disabilities and all students is quite similar to the gap found for reading assessments. The gap for math assessments exists in all states and varies considerably from state to state. The gap also increases by grade level. In elementary grades, the largest gap was 39.2 points in Delaware. In middle school, the largest gap was 51.2 points in New Jersey, and in high school it was 64.3 points in Connecticut.
Figures 17 and 18 show the results of high school reading and math exit exams. States administer exit exams in different grades; the number in parentheses next to the state’s name indicates the grade from which the data come. Only those states that report disaggregated results for students with disabilities are included in these figures. These results also reflect only the first administration of the exit exam. States offer multiple retest opportunities for their exit exams, and the percent passing increases with each retest; the gaps between general and special education students often become very small on retesting.
Figure 17. Percent Passing Minimum Competency/High School Reading Exit Exam
Figure 18. Percent Passing Minimum Competency/High School Mathematics Exit Exam
The figures presented here for first-time testing show that large gaps exist for exit exams, though the percent of students passing varies widely by state. For both reading and math, New Jersey had the largest gap (52.3 points for reading; 50.9 points for math). The gap on reading tests was small in both Louisiana (13-point difference) and Maryland (11.7-point difference), though the percent passing in Louisiana was very low and the percent passing in Maryland was quite high. For math, the gap was again smallest in Louisiana (16-point difference) and Maryland (17.6-point difference), and again the percent passing in Louisiana was low whereas the percent passing in Maryland was relatively high.
This sixth analysis of state education public reporting shows that states continue to make progress in the amount of information they report on the participation and performance of students with disabilities. Still, it is disappointing to see that not all states are reporting disaggregated information, and fewer than half of the states are reporting both participation and performance information for their alternate assessment.
A total of 48 states reported some information for their state assessments. Of these states, 35 reported participation and performance for all of their assessments, an additional 9 states provided participation and performance information for some of their assessments, and 4 states reported performance data for all of their tests, though not participation. The number of states reporting both participation and performance rose from 28 during the 2000-2001 academic year to 35 in 2001-2002. Participation rates for students with disabilities, though still variable, fell in a much narrower range than in 2000-2001: rates ranged from 30% to 97.4% in 2000-2001, compared with 71.1% to 99.1% in 2001-2002.
When examining alternate assessments, only 32 states reported any information. Though this is an increase from 25 states during the 2000-2001 year, states clearly are not reporting on their alternate assessments at the same level as on their general assessments. Twenty-two states provided both participation and performance data for their alternate assessments (up from 13 states in 2000-2001), 5 states gave performance data only, and 5 states gave participation data only. The lower level of alternate assessment reporting seems to be due only in part to the fact that some states were still developing their alternate assessments. Many of the non-reporting states had alternate assessments and even had reporting plans (Thompson & Thurlow, 2001), but simply had not reported.
When examining the ways that states report participation information for the general assessment, we found that for most general assessments (72%) only a count of students tested was reported. For 22 assessments (26%), states reported the percentage of students tested. Although a rate is a more meaningful way to convey participation than the number of students tested alone, only about one-fourth of all assessments included a rate. For 15 assessments (17%), states provided either the number or percentage of students who were not tested, and for 6 assessments (7%), states gave either the number or percentage of students who were absent from testing.
When we examined the performance of students on the general assessment, we found large gaps between students with disabilities and all students. Though some gaps were considerably larger than others, the gaps were noticeable in every state that provided performance data, and they increase as students get older.
This was the second year that we systematically examined the quality of Web-based reporting and how easily accessible the data were. Though many state Department of Education home pages provided clear first links to their data by using words such as “Assessments,” “Student Testing,” or “Accountability,” other states still used vague terms such as “Statistics,” “Administrators,” and “Programs and Services.” The most common number of clicks it took to reach the disaggregated data from the home page was three. For 69% of the assessments, data for special education students and all students were presented on the same page; for 15% of the assessments, the data were one click apart. We also examined quality issues such as whether all the data printed, whether the data were dated, and whether more than one year of data was presented so that trends could be examined. If districts, schools, teachers, parents, and others are going to use publicly reported data, it is essential that the data not only be reported but also be easily accessible. There were no noticeable improvements in the accessibility of data from the 2000-2001 reports.
With the push from No Child Left Behind to provide assessment data to schools by the start of the school year, Web-based reporting has clearly become the primary vehicle for sharing data with the public. It is crucial, then, that the data be easy to both locate and comprehend. Based on our analyses of both Web-based and paper reports, we make the following recommendations:
Report not only the number of students with disabilities assessed, but also the percentage assessed. When states provide the number of students assessed, this information is less helpful than when a percentage is provided. By giving a percent, people are able to get a more accurate picture of how many students are participating in the state assessment system.
Ensure that Web-based assessment information is dated, so the viewer knows the testing year, and that it can be printed. When states indicate the testing year by giving only one year (e.g., 2002), it is unclear whether the data are from the fall or the spring. States should provide either the entire school year, such as 2001-2002, or a month or season, such as Fall 2001. Another quality issue is ensuring that all of the test data print so that people can have a hard copy of the test information. If Web-based data will not all print onto a standard piece of paper, we recommend providing a “print format” option that renders the data in a standard, printable layout.
Report results for the alternate assessment. Though states are finally beginning to provide participation and performance data for their general assessments, they are still slow to report that information for their alternate assessments. This information should be provided so that the public can see how all students are performing.
Report the number and percent of students with disabilities using accommodations. Because of their disabilities, many students are not able to take the general assessment in the standard format and thus are provided with accommodations. Many states either do not count the scores from some of these accommodated administrations or count them as “not proficient” because the accommodations are non-standard. In some states, the number of students participating with non-standard accommodations is quite high. If these numbers are not reported, the picture painted of how all students are doing will be inaccurate.
Although there has been improvement in the reporting of assessment participation and performance data for students with disabilities, there is still much more that can be done. To some extent, the greatest improvement is in the mere reporting of the data. This should have been accomplished several years ago. What is still needed in many states – not all, of course – is a serious commitment to ensuring that the reporting for students with disabilities is on a par with that of other students. In other words, it should be just as easy to find these data, and they should be just as clear as the data for general education students. Equality in presentation – easily accessed and transparent – should be the first criterion on which the reporting of data for students with disabilities is judged. While we are almost there, we are not there yet.
Bielinski, J., Thurlow, M.L., Callender, S., & Bolt, S. (2001). On the road to accountability: Reporting outcomes for students with disabilities (Technical Report 32). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://education.umn.edu/nceo/OnlinePubs/Technical32.htm
Chibulka, J.G., & Derlin, R.L. (1995). State educational performance reporting policies in the U.S.: Accountability’s many faces. (ERIC Abstract ED 401 613).
Elliott, J.L., Thurlow, M.L., & Ysseldyke, J.E. (1996). Assessment guidelines that maximize the participation of students with disabilities in large-scale assessments: Characteristics and considerations (Synthesis Report 25). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://education.umn.edu/nceo/OnlinePubs/synthesis25.html
Erickson, R.N., Thurlow, M.L., & Thor, L. (1995). 1994 state special education outcomes. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.
Fast, E.F., Blank, R.K., Potts, A., & Williams, A. (2002). A guide to effective accountability reporting. Washington, DC: Council of Chief State School Officers.
McGrew, K.S., Thurlow, M.L., Shriner, J.G., & Spiegel, A.N. (1992). Inclusion of students with disabilities in national and state data collection programs (Technical Report 2). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.
National Research Council (1999). Testing, teaching, and learning: A guide for states and school districts. Committee on Title 1 Testing and Assessment, R. F. Elmore and R. Rothman, (Eds.) Board on Testing and Assessment, Commission on Behavioral and Social Sciences and Education. Washington, DC: National Academy Press.
Robelen, E.W. (2004). States given more leeway on test rule. Education Week, 23(30), 1, 28-29.
Shriner, J.G., & Thurlow, M.L. (1992). State special education outcomes 1991. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.
Thompson, S. J., & Thurlow, M. L. (2001). 2001 state special education outcomes: A report on state activities at the beginning of a new decade. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://education.umn.edu/nceo/OnlinePubs/2001StateReport.html
Thurlow, M.L., House, A., Boys, C., Scott, D., & Ysseldyke, J. (2000). State participation and accommodations policies for students with disabilities: 1999 update (Synthesis Report 33). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://education.umn.edu/nceo/OnlinePubs/Synthesis33.html
Thurlow, M.L., Langenfeld, K.L., Nelson, J.R., Shin, H., & Coleman, J.E. (1998). State accountability reports: What are states saying about students with disabilities? (Technical Report 20). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://education.umn.edu/nceo/OnlinePubs/Technical20.htm
Thurlow, M.L., & Thompson, S.J. (1999). District and state standards and assessments: Building an inclusive accountability system. Journal of Special Education Leadership, 12, 3-10.
Thurlow, M., Wiley, H.I., & Bielinski, J. (2003). Going public: What 2000-2001 reports tell us about the performance of students with disabilities (Technical Report 35). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://education.umn.edu/nceo/OnlinePubs/Technical35.htm
Thurlow, M.L., Ysseldyke, J.E., Erickson, R.N., & Elliott, J.L. (1997). Increasing the participation of students with disabilities in state and district assessments (Policy Directions No. 6). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://education.umn.edu/nceo/OnlinePubs/Policy6.html
Thurlow, M.L., Ysseldyke, J.E., & Silverstein, B. (1995). Testing accommodations for students with disabilities. Journal of Remedial and Special Education, 16, 260-270.
U.S. Department of Education. (1999). Peer reviewer guidance for evaluating evidence of final assessments under Title I of the Elementary and Secondary Education Act. Washington, DC: Author.
Ysseldyke, J.E., Thurlow, M.L., Langenfeld, K., Nelson, J.R., Teelucksingh, E., & Seyfarth, A. (1998). Educational results for students with disabilities: What do the data tell us? (Technical Report 23). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Available at http://education.umn.edu/nceo/OnlinePubs/TechnicalReport23/technical_report_23.htm
Zlatos, B. (1994). Don’t test, don’t tell: Is “academic red-shirting” skewing the way we rank our schools? The American School Board Journal, 181, 24-28.
The National Center on Educational Outcomes is examining states’ public reports on 2001-2002 school year assessment results. We have reviewed your state’s Web site for both participation and performance data on your statewide assessments. Attached tables reflect what we believe to be the tests your state administers and the results that we have found thus far on the Web (Table 1), how participation information is reported for students with disabilities (if it is available) (Table 2), and whether information is given about students who took assessments with individual accommodations (Table 3).
Please review the tables and verify their accuracy. Our goal is to (a) identify all components of each state’s testing system, (b) determine whether each state reports disaggregated test results for students with disabilities, (c) describe the way participation information is presented, and (d) describe how states report results for students who took the test with accommodations or modifications.
If any data element is inaccurate, please provide us with the public document and/or Web site that contains the accurate information. Address your responses to Hilda Ives Wiley at the above address.
If you have any questions about our request, please call Hilda Ives Wiley at (612) 626-8913 or email: ives0016@umn.edu. If we do not hear from you by Friday, June 27, 2003, we will assume that our summaries are accurate.
Thank you for taking the time to verify our findings.
Sincerely,
Hilda Ives Wiley
Graduate Research Assistant
Martha Thurlow
Director
Table 1: Tests Administered and Results Found

State | Test | Grades Tested | Subject Areas | Participation | Performance
Alabama | Direct Assessment of Writing | 5,7 | Writing | Yes | Yes
Alabama | High School Graduation Exam | 11,12 | Reading, Math, Science | Yes | Yes
Alabama | SAT-9 | 3-8 | Reading, Language, Math, Science, Social Studies | Yes | Yes
Alabama | Alternate Assessment | 3-8, 11, 12 | Not specified | Yes | Yes

(The Participation and Performance columns indicate whether disaggregated results for students with disabilities are reported.)
Table 2: Participation Information for Students with Disabilities

State | Test | Count Tested | Count Not Tested | Count Exempt | Count Excluded | % Tested | % Not Tested | % Exempt | % Excluded | Count and/or Percent Absent
AL | Direct Assessment of Writing | | | | | | | | |
AL | High School Graduation Exam | | | | | | | | |
AL | SAT-9 | | | | | | | | |
AL | Alternate Assessment | | | | | | | | |

Blank cell = No data
Table 3: Accommodations

Test | Standard Administration: Participation | Standard Administration: Performance | Nonstandard Administration: Participation | Nonstandard Administration: Performance
Direct Assessment of Writing | | | |
High School Graduation Exam | | | |
SAT-9 | | | |

Please place a Yes or No into each box to describe information that is publicly reported.
(Two forms were used, depending on input from the Assessment Director. The example here is the letter used when the tables were verified by the Assessment Director; if there was no verification, the letter was the same as in Appendix A.)
The National Center on Educational Outcomes is examining states’ public reports on 2001-2002 school year assessment results. We have reviewed your state’s Web site for both participation and performance data on your statewide assessments. Attached tables reflect what we believe to be the tests your state administers and the results that we have found thus far on the Web (Table 1), how participation information is reported for students with disabilities (if it is available) (Table 2), and whether information is given about students who took assessments with individual accommodations (Table 3). These results were verified by your state’s Director of Assessment, but if you have anything to add, please let us know.
Please review the tables and verify their accuracy. Our goal is to (a) identify all components of each state’s testing system, (b) determine whether each state reports disaggregated test results for students with disabilities, (c) describe the way participation information is presented, and (d) describe how states report results for students who took the test with accommodations or modifications.
If any data element is inaccurate, please provide us with the public document and/or Web site that contains the accurate information. Address your responses to Hilda Ives Wiley at the above address.
If you have any questions about our request, please call Hilda Ives Wiley at (612) 626-8913 or email: ives0016@umn.edu. If we do not hear from you by Friday, August 29, 2003, we will assume that our summaries are accurate.
Thank you for taking the time to verify our findings.
Sincerely,
Hilda Ives Wiley
Graduate Research Assistant
Martha Thurlow
Director
Table 1: Tests Administered and Results Found

State | Test | Grades Tested | Subject Areas | Participation | Performance
Alabama | Direct Assessment of Writing | 5,7 | Writing | Yes | Yes
Alabama | High School Graduation Exam | 11,12 | Reading, Math, Science | Yes | Yes
Alabama | SAT-9 | 3-8 | Reading, Language, Math, Science, Social Studies | Yes | Yes
Alabama | Alternate Assessment | 3-8, 11, 12 | Not specified | Yes | Yes

(The Participation and Performance columns indicate whether disaggregated results for students with disabilities are reported.)
Table 2: Participation Information for Students with Disabilities

State | Test | Count Tested | Count Not Tested | Count Exempt | Count Excluded | % Tested | % Not Tested | % Exempt | % Excluded | Count and/or Percent Absent
AL | Direct Assessment of Writing | | | | | | | | |
AL | High School Graduation Exam | | | | | | | | |
AL | SAT-9 | | | | | | | | |
AL | Alternate Assessment | | | | | | | | |

Blank cell = No data
Table 3: Accommodations

Test | Standard Administration: Participation | Standard Administration: Performance | Nonstandard Administration: Participation | Nonstandard Administration: Performance
Direct Assessment of Writing | No | No | No | No
High School Graduation Exam | No | No | No | No
SAT-9 | No | No | No | No

Please change our No to a Yes if the information is publicly available.
State |
Assessment Component |
Grades |
Subject |
Disaggregated Special Education Data |
||
Part |
Perf |
|||||
Alabama |
Direct Assessment of Writing [CRT] |
5,7 |
Writing |
Yes |
Yes |
|
High School Graduation Exam [EXIT] |
11, 12
|
Reading, Language, Math, Science |
Yes |
Yes |
||
Stanford Achievement Test, 9th ed. (SAT-9) [NRT] |
3-8 |
Reading, Language, Math, Science, Social Studies |
Yes |
Yes |
||
Alaska |
California Achievement Test, 6th ed. (CAT-6) [NRT] |
4,5,7,9 |
Reading, Language, Math |
Yes |
Yes |
|
Benchmark Exams [CRT] |
3,6,8 |
Reading, Writing, Math |
Yes |
Yes |
||
High School Graduation Qualifying Exam [EXIT] |
10 |
Reading, Writing, Math |
Yes |
Yes |
||
Arizona |
Stanford Achievement Test, 9th ed. (SAT-9) [NRT] |
2-11 |
Reading, Language, Math |
Yes |
Yes |
|
AZ Instrument to Measure Standards (AIMS) [CRT] |
3,5,8 |
Reading, Math, Writing |
Yes |
Yes |
||
AIMS [EXIT] |
10 |
Reading, Math, Writing |
Yes |
Yes |
||
Arkansas |
Stanford Achievement Test, 9th ed. (SAT-9) [NRT] |
5,7,10 |
Complete Battery |
Yes |
Yes |
|
AR Comprehensive Testing, Assessment, and Accountability Program (ACTAAP) including End-of-Course (EOC) [CRT] |
4,6,8, 9-12 |
Literacy [Reading & Writing] (4,6,8,11), Math (4,6,8), EOC–Algebra I (9-12), EOC-Geometry (9-12) |
Yes |
Yes |
||
California |
Standardized Testing And Reporting Program (STAR) • SAT-9 [NRT]
• Spanish Assessment of Basic Education (SABE/2) [NRT]
• Content Standard [CRT] |
2-11
2-11
2-11 |
Reading, Language, Math Spelling (2-8), Science (9-11), Social Science (9-11)
Reading, Language, Math Spelling (2-8)
English/Language Arts, Math (2-7,11) [Algebra I, II; Geometry; Integrated 1,2,3 for 8-10] |
Yes
Yes
Yes |
Yes
Yes
Yes |
|
Colorado |
CO Student Assessment Program (CSAP) [CRT] |
3-10 |
Reading (3-10), Math (5-10), Writing (4,7,10), Science (8) |
Yes |
Yes |
|
Connecticut |
CT Mastery Test (CMT) [CRT] |
4,6,8 |
Reading, Math, Writing |
Yes |
Yes |
|
CT Academic Performance Test (CAPT) [CRT] |
10 |
Reading, Math, Writing, Science |
Yes |
Yes |
||
Delaware |
DE Student Testing Program (DSTP) [SAT-9 for R,M with other criterion measures; [NRT/CRT] |
3-6,8,10,11 |
Reading (3,5,8,10), Writing (3,5,8,10), Math (3,5,8,10), Science (4,6,8,11), Social Studies (4,6,8,11) |
Yes |
Yes
|
|
Florida |
FL Comprehensive Assessment Test (FCAT) includes SAT-9 [NRT/CRT] |
3-10 |
Reading (NRT 3-10/CRT 4,8,10), Math (NRT 3-10/CRT 5,8,10), Writing (CRT 4,8,10) |
Yes
|
Yes
|
|
High School Competency Test (HSCT) [EXIT] (for those not exempted by their FCAT performance in 10th grade ) |
11 |
Reading, Math |
No |
No |
|
|
Georgia
|
GA High School Graduation Test (GHSGT) [EXIT] |
11 |
English/Language Arts, Math, Science, Social Studies, Writing |
Yes |
Yes |
|
Criterion-Referenced Competency Tests (CRCT) [CRT] |
1-8 |
Reading, English/Language Arts, Math, Science (3-8), Social Studies (3-8) |
Yes |
Yes |
|
|
Middle Grades Writing Assessment [CRT] |
5,8 |
Writing |
Yes |
Yes |
|
|
Hawaii |
Stanford Achievement Test, 9th ed. (SAT-9) [NRT] |
3,5,7,9 |
Reading, Math |
No |
No
|
|
Idaho
|
ID Direct Assessments [CRT] |
4,8,11 |
Math (4,8), Writing (4,8,11) |
Yes |
Yes |
|
Iowa Tests of Basic Skills (ITBS) [NRT]
Tests of Achievement and Proficiency (TAP) [NRT] |
3-8
9-11 |
Reading, Language, Math, Science (3,5,7), Social Studies (3,5,7) Sources of Information (3,5,7)
Reading, Writing, Math, Science (9), Social Studies (9), Information Processing (9) |
Yes (only report grades 4,8, & 11)
Yes |
Yes (only report grades 4,8, & 11)
Yes |
|
|
Illinois |
IL Standards Achievement Test (ISAT) [CRT] |
3,4,5,7,8 |
Reading (3,5,8), Math (3,5,8), Writing (3,5,8), Science (4,7), Social Studies (4,7) |
Yes |
Yes |
|
Prairie State Achievement Exam [CRT] |
11 |
Reading, Math, Writing, Science, Social Studies |
Yes |
Yes |
|
|
Indiana |
IN Statewide Testing for Educational Progress (ISTEP+) [NRT/CRT] |
3,6,8 |
Language Arts, Math
|
Yes |
Yes
|
|
Graduation Qualifying Exam [EXIT] |
10 |
Language Arts, Math |
Yes |
Yes |
|
|
Iowa |
ITBS/ITED [NRT] (VOLUNTARY participation) |
3-12 (only report on grades 4,8,10) |
Reading, Math, Science (9-11), Social Studies (9-11) |
Yes
|
Yes |
|
Kansas |
KS Assessment System [CRT]
|
4-8,10,11 |
Reading (5,8,11), Math (4,7,10), Science (4,7,10), Social Studies (6,8,11) |
Yes |
Yes
|
|
Kentucky |
Comprehensive Test of Basic Skills, 5th ed. (CTBS/5) [NRT] |
3,6,9 |
Reading, Language, Math |
Yes |
Yes |
|
KY Core Content Test [CRT] |
4,5,7,8, 10-12 |
Reading (4,7,10), Math (5,8,11), Writing (4,7,12), Science (4,7,11), Social Studies (5,8,11), Arts & Humanities (5,8,11), Practical Living & Vocational Studies (5,8,10) |
Yes |
Yes |
|
|
Louisiana |
Developmental Reading Assessment (DRA) [CRT] |
2,3 |
Reading |
No |
No |
|
LA Educational Assessment Program (LEAP 21) [CRT] |
4,8 |
English/Language Arts, Math, Science, Social Studies |
Yes |
Yes |
||
Graduation Exit Exam- 21 [EXIT] |
10, 11 |
Language Arts (10), Math (10), Science (11), Social Studies (11) |
Yes |
Yes |
||
Iowa Tests of Basic Skills/Iowa Tests of Educational Development [NRT] |
3,5-7,9 |
Complete Battery |
Yes |
Yes |
||
Maine |
Maine Educational Assessment (MEA) [CRT] |
4,8,11 |
Reading, Writing, Health, Science, Math, Social Studies, Visual & Performing Arts |
No |
No |
|
Maryland |
MD School Performance Assessment Program (MSPAP) [CRT] |
3,5,8 |
Reading, Writing, Language Usage, Math, Science, Social Studies |
Yes |
Yes |
|
MD Functional Tests [EXIT] |
9,11 |
Reading, Math, Writing, |
Yes |
Yes |
||
Comprehensive Tests of Basic Skills, 5th ed. (CTBS/5) [NRT] |
2,4,6 |
Reading, Language, Math |
Yes |
Yes |
||
High School Assessment [CRT] |
9-12 |
English, Biology, Geometry, Government, Algebra |
No |
Yes |
||
Massachusetts |
MA Comprehensive Assessment System (MCAS) [CRT] |
3,4-8,10 |
Reading (3), English Language Arts (4,7,10), Math (4,6,8,10), Science/Technology (5,8), History/Social Science (5,8) |
Yes |
Yes |
|
Michigan |
MI Educational Assessment Program (MEAP) [CRT] |
4,5,7,8,11
|
Reading (4,7,11), Math (4,8,11), Writing (5,7,11), Social Studies (5,8,11), Science (5,8,11) |
Yes
|
Yes
|
|
Minnesota |
MN Comprehensive Assessment (MCA) [CRT] |
3,5 |
Reading, Math, Writing (5) |
Yes |
Yes |
|
Basic Standards Exam [EXIT] |
8,10 |
Reading (8), Math (8), Writing (10) |
Yes |
Yes |
||
Mississippi |
Grade Level Testing Program · Comprehensive Tests of Basic Skills, 5th ed. (CTBS/5) [NRT]
· MS Curriculum Test (MCT) [CRT]
· Writing Assessment [CRT] |
5,8
2-8
4,7 |
Reading, Language, Math
Reading, Language, Math
Writing |
Yes
Yes
Yes |
Yes
Yes
Yes |
|
Functional Literacy Exam (FLE) [EXIT] |
11 |
Reading, Math, Writing |
Yes |
Yes |
||
Missouri |
MO Assessment Program (MAP) (Terra Nova survey) [NRT/CRT] |
3-5,7-11 |
Communication Arts (3,7,11), Math (4,8,10), Science (3,7,10), Social Studies (4,8,11), Heath & Physical Education (5,9) |
Yes |
Yes |
|
Montana |
Iowa Tests of Basic Skills/ Iowa Tests of Educational Development (ITBS/ITED) [NRT] |
4,8,11 |
Reading, Math, Science, Social Studies, Language Arts |
Yes |
Yes |
|
Nebraska |
Nebraska Statewide Writing Assessment [CRT] |
4 |
Writing |
Yes |
Yes |
|
Assessment of State Mathematics Standards [CRT] |
4,8,11 |
Math |
Yes |
Yes |
||
Nevada |
Terra Nova Comprehensive Tests of Basic Skills, 5th ed. (CTBS/5) [NRT] |
4,8,10 |
Reading, Language, Math, Science |
Yes |
Yes |
|
Nevada Criterion Referenced Test [CRT] |
3,5 |
Reading, Math |
Yes |
Yes |
||
NV High School Proficiency Exam [EXIT] |
10-12 |
Reading, Math, Writing |
Yes |
Yes |
||
New Hampshire |
NH Educational Improvement and Assessment Program (NHEIAP) [CRT] |
3,6,10 |
English Language Arts, Math, Science (6,10), Social Studies (6,10) |
Yes |
Yes |
|
New Jersey |
Elementary School Proficiency Assessment (ESPA) [CRT]
Grade Eight Proficiency Assessment (GEPA) [CRT] |
4
8 |
Language Arts/Literacy, Math
Language Arts/Literacy, Math, Science |
Yes
Yes |
Yes
Yes |
|
High School Proficiency Assessment (HSPA) [EXIT] |
11 |
Language Arts Literacy, Math |
Yes |
Yes |
||
New Mexico |
NM Achievement Assessment Program (NMAAP) (CTBS/5 & other criterion measures) [NRT/CRT] |
3-9 |
Reading, Language, Math, Science, Social Studies |
Yes |
Yes |
|
NM High School Competency Exam [EXIT] |
10 |
Reading, Language Arts, Math, Science, Social Studies |
Yes |
Yes |
||
NM Writing Assessment Program [CRT] |
4,6 (8 optional) |
Writing |
Yes |
Yes |
||
New York |
Occupational Education Proficiency Exams [EXIT] |
9-12 |
Occupational Education |
No |
No |
|
Regents Comprehensive Exams [EXIT] |
9-12 |
English, Foreign Languages, Math, History/Social Studies, Science |
No |
No |
||
Regents Competency Test [EXIT] |
9-12 |
Reading, Math, Science, Writing, Global Studies, US Hist & Gov’t |
Yes |
Yes |
||
NY State Assessment Program [CRT] |
4,8 |
English/Language Arts, Math, Science |
No |
Yes |
||
North Carolina |
Grade 3 Pre-test [CRT] |
3 |
Reading, Math |
Yes |
Yes |
|
End of Grade [CRT] |
3-8 |
Reading, Math |
Yes |
Yes |
||
Writing test [CRT] |
4,7 |
Writing |
Yes |
Yes |
||
Computer Skills [CRT] |
8 |
Computer |
Yes |
Yes |
||
Competency Test [EXIT] |
9 |
Reading, Math |
Yes |
Yes |
||
End of Course [CRT] |
9-12 |
Biology, Chemistry, Economics, English I, Physical Science, Physics, U.S. History, Algebra I, Algebra II, & Geometry |
Yes |
Yes |
||
North Dakota |
ND State Assessment [CRT] |
4,8,12 |
Reading/Language Arts, Math |
No |
Yes |
|
Ohio |
OH Proficiency Tests [CRT] |
4,6,10 |
Reading, Writing, Math, Science, Citizenship |
Yes (district level only) |
Yes |
|
OH Proficiency Test [EXIT] |
9 |
Reading, Writing, Math, Science, Citizenship |
Yes (district level only) |
Yes |
||
Oklahoma |
Core Curriculum Tests [CRT] |
5,8,11 |
Reading, Math, Writing, Science, History/Constitution/Government, Geography, OK History, Art |
Yes |
Yes |
|
Oregon |
OR State Assessment [CRT]
Certificate of Mastery for 10th [EXIT] |
3,5,8,10 |
Reading/Literature, Math, Math Problem Solving (5,8,10), Writing, Science (8,10) |
No
No |
No
No |
|
Pennsylvania |
PA System of School Assessment (PSSA) [CRT] |
3,5,6,8,9,11 |
Reading (3,5,8,11), Math (3,5,8,11), Writing (6,9,11) |
Yes |
Yes |
|
Rhode Island |
New Standards Reference Examinations [CRT]
RI State Writing Assessment [CRT]
RI Health Education Assess [CRT] |
4,8,10
3,7,11
5,9 |
Reading, Math, Writing
Writing
Health |
Yes
Yes
Yes |
Yes
Yes
Yes |
|
South Carolina |
Palmetto Achievement Challenge Tests (PACT) [CRT] |
3-8 |
English/Language Arts, Math |
Yes (but not broken down by grade level) |
Yes (but not broken down by grade level) |
|
High School Exit Exam [EXIT] |
10 |
Reading, Math, Writing |
No |
Yes |
||
South Dakota |
Stanford Achievement Test, 9th ed. (SAT-9) [NRT]
Stanford Writing Assessment [NRT] |
2,4,8,11
5,9 |
Reading, Language Arts, Math, Environment (2), Science (4,8,11), Social Studies (4,8,11)
Writing |
Yes
No |
Yes
No |
|
Tennessee |
TN Comprehensive Assessment (TCAP) (Terra Nova CTBS/5) [NRT] |
3-8, 11 |
Reading, Language, Math, Science, Social Studies (3-8), Writing (4,7,11) |
Yes |
Yes |
|
TN Competency Test [EXIT] |
9-12 |
Math, Language Arts |
No |
Yes |
||
Gateway Testing Initiative [CRT] |
9-12 |
Math (End-of-Course in Algebra I, II, Geometry, Tech. I) |
Yes |
Yes |
||
Texas |
TX Assessment of Academic Skills (TAAS) [CRT] |
3-8 |
Reading, Math, Writing, Science, Social Studies; Spanish version for 3-6 |
Yes |
Yes |
|
Exit Level TAAS [EXIT] |
10-12 |
|
Yes |
Yes |
||
Statewide End-of-Course Tests [CRT] |
9-12 |
Algebra I, English II, US History, Biology |
No |
No |
||
Reading Proficiency Tests in English [CRT] |
3-12 |
English Reading Proficiency |
Yes |
Yes |
||
Utah |
Stanford Achievement Test, 9th ed. (SAT-9) [NRT] |
3,5,8,11 |
Reading, Language, Math, Science, Social Science, Thinking Skills (5,8,11) |
Yes |
Yes |
|
Core Criterion-Referenced Tests [CRT] |
1-12 |
Language Arts, Math, Science (4-12) |
Yes |
Yes |
||
Vermont |
VT Comprehensive Assessment System [CRT] |
2,4,5,8,10,11 |
Reading (2), English/ Language Arts (4,8,10), Math (4,8,10), Science (5,11) |
Yes |
Yes |
|
Virginia |
Standards of Learning (SOL) [CRT] |
3,5,8 |
English (3), English: Reading/Literature and Research (5,8), English: Writing (5,8), Math, History, Science, Computer Technology (5, 8) |
No |
Yes |
|
Standards of Learning [EXIT] |
9-12 (may be taken at an earlier grade if course-work completed) |
English (9-11), Math (Algebra I, II, & Geometry), History/Social Science (World History I & II, Geography, US History), Science (Earth, Biology, Chemistry) |
No |
Yes |
||
VA State Assessment Program (VSAP) (SAT-9, Form TA) [NRT] |
4,6,9 |
Reading, Language, Math [Science, Social Studies are optional] |
Yes |
Yes |
||
Washington |
WA Assessment of Student Learning (WASL) [CRT] |
4,7,10 |
Reading, Writing, Listening, Math |
Yes |
Yes |
|
Iowa Tests of Basic Skills/Iowa Tests of Educational Development (ITBS/ITED) [NRT] |
3,6,9 |
Reading, Language (6), Expression (9), Math (3,6), Quantitative Thinking (9) |
No |
No |
||
West Virginia |
Stanford Achievement Test, 9th ed. (SAT-9) [NRT] |
3-11 |
Basic Skills (Reading, Math, Language) |
Yes |
Yes |
|
WV Writing Assessment [CRT] |
4,7,10 |
Writing |
No |
No |
||
Wisconsin |
WI Knowledge and Concepts Exam (WKCE) [CRT] |
4,8,10 |
Reading, Language Arts, Math, Science, Social Studies |
Yes |
Yes |
|
WI Reading Comprehension Test (WRCT) [CRT] |
3 |
Reading |
Yes |
Yes |
||
Wyoming |
WY Comprehensive Assessment System (WyCAS) [CRT] |
4,8,11 |
Reading, Writing, Math |
Yes (district level only) |
Yes (district level only) |
|
Terra Nova Comprehensive Tests of Basic Skills, 5th ed. (CTBS/5) [NRT] |
4,8,11 |
Reading, Language, Math |
No |
No |
State |
Test |
Count |
Count Not Tested |
Count Exempt |
Count Excluded |
Percent of students tested |
Percent of students not tested |
Percent Exempt |
Percent Excluded |
Count and/or Percent Absent |
AL |
HS Graduation Exam |
|
|
|
|
|
|
|
|
|
SAT-9 |
|
|
|
|
|
|
|
|
|
|
Alabama Direct Assessment of Writing |
|
|
|
|
|
|
|
|
|
|
AK |
CAT-6 |
|
|
|
|
|
|
|
|
|
Benchmark Exams |
|
|
|
|
|
|
|
|
|
|
HSGQE |
|
|
|
|
|
|
|
|
|
|
AZ |
SAT-9 |
|
|
|
|
|
|
|
|
|
AIMS |
|
|
|
|
|
|
|
|
|
|
AIMS-EXIT |
|
|
|
|
|
|
|
|
|
|
AR |
SAT-9 |
|
|
|
|
|
|
|
|
|
ACTAAP |
|
|
|
|
|
|
|
|
|
|
CA |
STAR: • SAT-9 • SABE/2 • Content Standard |
|
|
|
|
|
|
|
|
|
CO |
CSAP |
|
|
|
|
|
|
|
|
|
CT |
CMT |
|
|
|
|
|
|
|
|
|
CAPT |
|
|
|
|
|
|
|
|
|
|
DE |
DSTP (SAT-9) |
|
|
|
|
|
|
|
|
|
FL |
FCAT (includes SAT-9) |
|
|
|
|
|
|
|
|
|
GA |
GHSGT |
|
|
|
|
|
|
|
|
|
CRCT |
|
|
|
|
|
|
|
|
|
|
Writing Assessment |
|
|
|
|
|
|
|
|
|
|
ID |
IDA |
|
|
|
|
|
|
|
|
|
ITBS TAP |
|
|
|
|
|
|
|
|
|
|
IL |
ISAT |
|
|
|
|
|
|
|
|
|
PSAE |
|
|
|
|
|
|
|
|
|
|
IN |
ISTEP+ |
|
|
|
|
|
|
|
|
|
GQE |
|
|
|
|
|
|
|
|
|
|
IA |
ITBS/ITED |
|
|
|
|
|
|
|
|
|
KS |
KAS |
|
|
|
|
|
|
|
|
|
KY |
KCCT |
|
|
|
|
|
|
|
|
|
CTBS/5 |
|
|
|
|
|
|
|
|
|
|
LA |
ITBS/ITED |
|
|
|
|
|
|
|
|
|
LEAP-21 |
|
|
|
|
|
|
|
|
|
|
GEE-21 |
|
|
|
|
|
|
|
|
|
|
ME |
MEA |
|
|
|
|
|
|
|
|
|
MD |
MSPAP |
|
|
|
|
|
|
|
|
|
MFT |
|
|
|
|
|
|
|
|
|
|
CTBS/5 |
|
|
|
|
|
|
|
|
|
|
MA |
MCAS |
|
|
|
|
|
|
|
|
|
MI |
MEAP |
|
|
|
|
|
|
|
|
|
MN |
MCA |
|
|
|
|
|
|
|
|
|
BSE |
|
|
|
|
|
|
|
|
|
|
MS |
Grade Level Testing Program · CTBS/5 · MCT · Writing Assessment |
|
|
|
|
|
|
|
|
|
FLE |
|
|
|
|
|
|
|
|
|
|
MO |
MAP (Terra Nova survey) |
|
|
|
|
|
|
|
|
|
MT |
ITBS/ITED |
|
|
|
|
|
|
|
|
|
NE |
NE Writing Assessment |
|
|
|
|
|
|
|
|
|
Assessment of State Math Standards |
|
|
|
|
|
|
|
|
|
|
NV |
Terra Nova CTBS/5 |
|
|
|
|
|
|
|
|
|
NH |
NHEIAP |
|
|
|
|
|
|
|
|
|
NJ |
ESPA/GEPA/HSPA |
|
|
|
|
|
|
|
|
|
NM |
NMAAP |
|
|
|
|
|
|
|
|
|
NM High School Competency Exam |
|
|
|
|
|
|
|
|
|
|
NM Writing Assessment Program |
|
|
|
|
|
|
|
|
|
|
NY |
Regents Competency Test |
|
|
|
|
|
|
|
|
|
NC |
End of Grade |
|
|
|
|
|
|
|
|
|
End of Course |
|
|
|
|
|
|
|
|
|
|
Grade 3 Pretest |
|
|
|
|
|
|
|
|
|
|
Computer Skills |
|
|
|
|
|
|
|
|
|
|
Writing Test |
|
|
|
|
|
|
|
|
|
|
Competency Test |
|
|
|
|
|
|
|
|
|
|
OH |
OH Proficiency Tests |
|
|
|
|
|
|
|
|
|
OK |
CCT |
|
|
|
|
|
|
|
|
|
PA |
PSSA |
|
|
|
|
|
|
|
|
|
RI |
New Standards Reference Examinations
RI State Writing Assessment
Health Assessment |
|
|
|
|
|
|
|
|
|
SC |
PACT |
|
|
|
|
|
|
|
|
|
SD |
SAT-9 |
|
|
|
|
|
|
|
|
|
TN |
TCAP |
|
|
|
|
|
|
|
|
|
Gateway Testing Initiative |
|
|
|
|
|
|
|
|
|
|
TX |
TAAS |
|
|
|
|
|
|
|
|
|
TAAS-EXIT |
|
|
|
|
|
|
|
|
|
|
RPTE |
|
|
|
|
|
|
|
|
|
|
UT |
SAT-9 |
|
|
|
|
|
|
|
|
|
CCRT |
|
|
|
|
|
|
|
|
|
|
VT |
VCAS |
|
|
|
|
|
|
|
|
|
VA |
VSAP |
|
|
|
|
|
|
|
|
|
WA |
WASL |
|
|
|
|
|
|
|
|
|
WV |
SAT-9 |
|
|
|
|
|
|
|
|
|
WI |
WKCE |
|
|
|
|
|
|
|
|
|
WRCT |
|
|
|
|
|
|
|
|
|
State |
Grade |
Subject |
Test Name |
CO |
8 |
Math |
CSAP |
CT |
8 |
Math |
CMT |
FL |
8 |
Math |
FCAT |
KS |
7 |
Math |
KSAP |
MD |
8 |
Math |
MSPAP |
MA |
8 |
Math |
MCAS |
MO |
8 |
Math |
MAP |
NE |
8 |
Math |
Assessment of State Mathematics Standards |
NV |
8 |
Entire TerraNova |
TerraNova |
SC |
Aggregate of 3-8 |
Math |
PACT |
WA |
7 |
Math |
WASL |
WI |
8 |
Math |
WKCE |
State |
Test |
Count |
Count Not Tested |
Count Exempt |
Count Excluded |
Percent of students tested |
Percent of students not tested |
Percent Exempt |
Percent Excluded |
Count and/or Percent Absent |
AL |
Alternate |
· |
|
|
|
|
|
|
|
|
AK |
Alternate |
· |
|
|
|
|
|
|
|
|
AR |
Alternate Portfolio |
· |
|
|
|
|
|
|
|
|
CO |
Alternate |
· |
· |
|
|
|
· |
|
|
|
CT |
Alternate |
|
|
|
|
· |
|
|
|
|
FL |
Alternate |
· |
|
|
|
|
|
|
|
|
GA |
Alternative |
· |
|
|
|
|
|
|
|
|
KY |
Alternate |
· |
|
|
|
· |
|
|
|
|
LA |
Alternate |
· |
|
|
|
|
|
|
|
|
MA |
MCAS Alternate |
· |
|
|
|
· |
|
|
|
|
MI |
Alternate |
· |
|
|
|
· |
|
|
|
|
MO |
Alternate |
· |
|
|
|
|
|
|
|
|
MT |
Alternate |
· |
|
|
|
|
|
|
|
|
NE |
Alternate Assessment for the Assessment of State Math Standards |
|
|
|
|
· |
|
|
|
|
NV |
Alternate |
· |
|
|
|
|
|
|
|
|
NH |
Alternate |
· |
|
|
|
· |
|
|
|
|
NJ |
Alternate |
· |
|
|
|
|
|
|
|
|
NM |
Alternate |
· |
|
|
|
|
|
|
|
|
NC |
Alternate |
· |
|
|
|
· |
|
|
|
|
PA |
Alternate |
· |
|
|
|
|
|
|
|
|
SC |
Alternate |
· |
|
|
|
|
· |
|
|
|
UT |
Alternate |
· |
|
|
|
|
|
|
|
|
WA |
Alternate |
|
|
|
|
· |
|
|
|
|
WV |
Alternate |
· |
|
|
|
|
|
|
|
|
WI |
Alternate |
· |
|
|
|
· |
|
|
|
|
WY |
Alternate |
|
· |
|
|
|
· |
|
|
|
Grade |
Subject |
Accommodation |
Participation |
Performance |
Arkansas: SAT-9 “IEP Students” |
||||
5 |
Reading |
Signing Directions |
18 |
PR=15 |
Preferential seating/ small group testing |
1164 |
PR=6 |
||
Individual testing |
204 |
PR=6 |
||
Student marks booklet and teacher transfers answers to answer sheet/ student responds verbally and teacher records responses |
25 |
PR=16 |
||
Magnifying glass |
X |
PR=57 |
||
Noise buffers |
X |
PR=75 |
||
Individualized scheduling |
446 |
PR=7 |
||
Braille |
X |
PR=8 |
||
Large print |
13 |
PR=47 |
||
Multiple accommodations |
498 |
PR=5 |
||
5 |
Math |
Signing Directions |
18 |
PR=15 |
Preferential seating/ small group testing |
1250 |
PR=8 |
||
Individual testing |
221 |
PR=9 |
||
Student marks booklet and teacher transfers answers to answer sheet/ student responds verbally and teacher records responses |
26 |
PR=9 |
||
Magnifying glass |
X |
PR=45 |
||
Noise buffers |
X |
PR=33 |
||
Individualized scheduling |
472 |
PR=8 |
||
Braille |
X |
PR=5 |
||
Large print |
13 |
PR=36 |
||
Multiple accommodations |
516 |
PR=6 |
||
7 |
Reading |
Signing Directions |
20 |
PR=16 |
Preferential seating/ small group testing |
938 |
PR=6 |
||
Magnifying glass |
X |
PR=76 |
||
Individual testing |
197 |
PR=6 |
||
Student marks booklet and teacher transfers answers to answer sheet/ student responds verbally and teacher records responses |
25 |
PR=8 |
||
Individualized scheduling |
363 |
PR=6 |
||
Braille |
X |
PR=10 |
||
Large print |
X |
PR=40 |
||
Multiple accommodations |
433 |
PR=8 |
||
7 |
Math |
Signing Directions |
20 |
PR=21 |
Preferential seating/ small group testing |
1002 |
PR=7 |
||
Individual testing |
202 |
PR=7 |
||
Student marks booklet and teacher transfers answers to answer sheet/ student responds verbally and teacher records responses |
26 |
PR=7 |
||
Magnifying glass |
X |
PR=76 |
||
Individualized scheduling |
362 |
PR=7 |
||
Braille |
X |
PR=3 |
||
Large print |
X |
PR=34 |
||
Multiple accommodations |
441 |
PR=9 |
||
10 |
Reading |
Signing Directions |
21 |
PR=10 |
Preferential seating/ small group testing |
686 |
PR=6 |
||
Individual testing |
131 |
PR=4 |
||
Student marks booklet and teacher transfers answers to answer sheet/ student responds verbally and teacher records responses |
25 |
PR=3 |
||
Magnifying glass |
X |
PR=78 |
||
Noise buffers |
X |
PR=11 |
||
Individualized scheduling |
177 |
PR=6 |
||
Braille |
X |
PR=8 |
||
Large print |
11 |
PR=19 |
||
Multiple accommodations |
127 |
PR=6 |
||
10 |
Math |
Signing Directions |
22 |
PR=25 |
Preferential seating/ small group testing |
698 |
PR=21 |
||
Individual testing |
140 |
PR=18 |
||
Student marks booklet and teacher transfers answers to answer sheet/ student responds verbally and teacher records responses |
25 |
PR=22 |
||
Magnifying glass |
X |
PR=76 |
||
Noise buffers |
X |
PR=21 |
||
Individualized scheduling |
177 |
PR=19 |
||
Braille |
X |
PR=27 |
||
Large print |
11 |
PR=37 |
||
Multiple accommodations |
131 |
PR=20 |
||
Colorado: CSAP “All Students” |
||||
4 |
Reading |
Braille version |
2 |
X |
Large-print version |
18 |
33 |
||
Teacher-read directions only |
1507 |
11 |
||
Use of number line |
X |
X |
||
Scribe |
518 |
33 |
||
Signing |
28 |
11 |
||
Assistive communication device |
14 |
X |
||
Extended/modified timing |
4292 |
37 |
||
Oral presentation of entire test |
X |
X |
||
8 |
Reading |
Braille version |
2 |
X |
Large-print version |
13 |
X |
||
Teacher-read directions only |
1185 |
6 |
||
Use of number line |
X |
X |
||
Scribe |
173 |
34 |
||
Signing |
18 |
17 |
||
Assistive communication device |
5 |
X |
||
Extended/modified timing |
1219 |
23 |
||
Oral presentation of entire test |
X |
X |
||
10 |
Reading |
Braille version |
3 |
X |
Large-print version |
7 |
X |
||
Teacher-read directions only |
415 |
7 |
||
Use of number line |
X |
X |
||
Scribe |
45 |
40 |
||
Signing |
5 |
X |
||
Assistive communication device |
35 |
3 |
||
Extended/modified timing |
876 |
24 |
||
Oral presentation of entire test |
X |
X |
||
5 |
Math |
Braille version |
6 |
X |
Large-print version |
11 |
X |
||
Teacher-read directions only |
1147 |
14 |
||
Use of number line |
9 |
X |
||
Scribe |
306 |
30 |
||
Signing |
17 |
6 |
||
Assistive communication device |
11 |
X |
||
Extended/modified timing |
2539 |
30 |
||
Oral presentation of entire test |
1681 |
14 |
||
8 |
Math |
Braille version |
2 |
X |
Large-print version |
11 |
X |
||
Teacher-read directions only |
630 |
3 |
||
Use of number line |
6 |
X |
||
Scribe |
109 |
19 |
||
Signing |
18 |
6 |
||
Assistive communication device |
3 |
X |
||
Extended/modified timing |
1754 |
31 |
||
Oral presentation of entire test |
971 |
1 |
||
10 |
Math |
Braille version |
6 |
X |
Large-print version |
7 |
X |
||
Teacher-read directions only |
348 |
1 |
||
Use of number line |
2 |
X |
||
Scribe |
32 |
9 |
||
Signing |
8 |
X |
||
Assistive communication device |
37 |
3 |
||
Extended/modified timing |
1087 |
22 |
||
Oral presentation of entire test |
211 |
0 |
||
Indiana- ISTEP+ (Grades 3,6,8) and GQE (Grade 10) “Special Ed” |
||||
3 |
English/ L. Arts |
Accommodations |
5206 |
15% |
Math |
Accommodations |
5036 |
25% |
|
6 |
English/ L. Arts |
Accommodations |
7646 |
6% |
Math |
Accommodations |
7485 |
18% |
|
8 |
English/ L. Arts |
Accommodations |
7414 |
14% |
Math |
Accommodations |
7371 |
18% |
|
10 |
English/ L. Arts |
Accommodations |
6284 |
14% |
Math |
Accommodations |
5726 |
16% |
|
Indiana- ISTEP+ (Grades 3,6,8) and GQE (Grade 10) “General Ed” |
||||
3 |
English/ L. Arts |
Accommodations |
486 |
47% |
Math |
Accommodations |
443 |
51% |
|
6 |
English/ L. Arts |
Accommodations |
309 |
17% |
Math |
Accommodations |
295 |
30% |
|
8 |
English/ L. Arts |
Accommodations |
244 |
26% |
Math |
Accommodations |
236 |
24% |
|
10 |
English/ L. Arts |
Accommodations |
520 |
12% |
Math |
Accommodations |
404 |
22% |
|
Kentucky- KY Core Content Test “Students with Disabilities” |
||||
4 |
Reading |
Accommodations |
4758 (80% of SWDs) |
37 |
7 |
Reading |
Accommodations |
4117 (71% of SWDs) |
14 |
10 |
Reading |
Accommodations |
2479 (61% of SWDs) |
1 |
5 |
Math |
Accommodations |
5006 (81% of SWDs) |
14 |
8 |
Math |
Accommodations |
3701 (68% of SWDs) |
3 |
11 |
Math |
Accommodations |
1866 (62% of SWDs) |
1 |
Kentucky- CTBS/5 “Students with Disabilities” |
||||
3 |
Reading |
Accommodations |
3821 (71% of SWDs) |
NP=33 |
6 |
Reading |
Accommodations |
4192 (74% of SWDs) |
NP=25 |
9 |
Reading |
Accommodations |
3490 (64% of SWDs) |
NP=18 |
3 |
Math |
Accommodations |
3821 (71% of SWDs) |
NP=27 |
6 |
Math |
Accommodations |
4192 (74% of SWDs) |
NP=16 |
9 |
Math |
Accommodations |
3490 (64% of SWDs) |
NP=11 |
Louisiana- ITBS “All Students” |
||||
3 |
Reading |
Calculator Used |
15109 (36%) |
PR=43 |
7 |
Reading |
Calculator Used |
30414 (55%) |
PR=44 |
9 |
Reading |
Calculator Used |
31474 (63%) |
PR=46 |
3 |
Math |
Calculator Used |
15109 (36%) |
PR=51 |
7 |
Math |
Calculator Used |
30414 (55%) |
PR=49 |
9 |
Math |
Calculator Used |
31474 (63%) |
PR=50 |
Maine – MEA “Students who took all or part of the assessment with accommodations: Identified Disability” |
||||
4 |
Reading |
Accommodations |
1747 (73% of accommodated students) |
- |
8 |
Reading |
Accommodations |
1588 (84% of accommodated students) |
- |
11 |
Reading |
Accommodations |
963 (87% of accommodated students) |
- |
8 |
Math |
Accommodations |
1606 (83% of accommodated students) |
- |
11 |
Math |
Accommodations |
927 (86% of accommodated students) |
- |
Massachusetts- MCAS “Students with Disabilities” |
||||
4 |
Reading |
Accommodations |
81% of SWDs |
- |
7 |
Reading |
Accommodations |
80% of SWDs |
- |
10 |
Reading |
Accommodations |
75% of SWDs |
- |
4 |
Math |
Accommodations |
77% of SWDs |
- |
8 |
Math |
Accommodations |
75% of SWDs |
- |
10 |
Math |
Accommodations |
74% of SWDs |
- |
Nebraska- Statewide Writing Assessment “Students Receiving Accommodations” |
||||
4 |
Writing |
Accommodations |
1,097 (5.25%) |
- |
Nevada- TerraNova “Students with Disabilities” (“Special Conditions” = accommodations that are not allowed) |
||||
4 |
Reading |
Special Conditions (accommodations that are not allowed) |
794 |
- |
8 |
Reading |
Special Conditions |
84 |
- |
10 |
Reading |
Special Conditions |
328 |
- |
4 |
Math |
Special Conditions |
794 |
- |
8 |
Math |
Special Conditions |
840 |
- |
10 |
Math |
Special Conditions |
328 |
- |
New Hampshire- NHEIAP “All Students” Use of non-standard accommodations (not allowed) |
||||
3 |
Reading |
Nonstandard Accommodations |
50 |
- |
6 |
Reading |
Nonstandard Accommodations |
13 |
- |
10 |
Reading |
Nonstandard Accommodations |
4 |
- |
3 |
Math |
Nonstandard Accommodations |
8 |
- |
6 |
Math |
Nonstandard Accommodations |
2 |
- |
10 |
Math |
Nonstandard Accommodations |
1 |
- |
New Mexico- NMAAP “All Students” |
||||
4 |
Reading |
Presentation |
142 |
NP=17.5 |
Response |
5 |
NP=X |
||
Timing |
301 |
NP=26 |
||
Presentation/Response |
11 |
NP=6.5 |
||
Presentation/Timing |
1648 |
NP=19 |
||
Response/Timing |
24 |
NP=20 |
||
Presentation/Response/Timing |
544 |
NP=18 |
||
8 |
Reading |
Presentation |
166 |
NP=18 |
Response |
10 |
NP=X |
||
Timing |
830 |
NP=18 |
||
Presentation/Response |
20 |
NP=10.5 |
||
Presentation/Timing |
1073 |
NP=13 |
||
Response/Timing |
51 |
NP=18 |
||
Presentation/Response/Timing |
556 |
NP=14 |
||
9 |
Reading |
Presentation |
162 |
NP=17 |
Response |
9 |
NP=X |
||
Timing |
685 |
NP=20 |
||
Presentation/Response |
32 |
NP=16 |
||
Presentation/Timing |
619 |
NP=16 |
||
Response/Timing |
34 |
NP=18 |
||
Presentation/Response/Timing |
437 |
NP=15 |
||
4 |
Math |
Presentation |
142 |
NP=16 |
Response |
5 |
NP=X |
||
Timing |
301 |
NP=17.5 |
||
Presentation/Response |
11 |
NP=9 |
||
Presentation/Timing |
1648 |
NP=16 |
||
Response/Timing |
24 |
NP=18 |
||
Presentation/Response/Timing |
544 |
NP=14 |
||
8 |
Math |
Presentation |
166 |
NP=10 |
Response |
10 |
NP=X |
||
Timing |
830 |
NP=14 |
||
Presentation/Response |
20 |
NP=8.5 |
||
Presentation/Timing |
1073 |
NP=10 |
||
Response/Timing |
51 |
NP=10 |
||
Presentation/Response/Timing |
556 |
NP=10 |
||
9 |
Math |
Presentation |
162 |
NP=10.5 |
Response |
9 |
NP=X |
||
Timing |
685 |
NP=12 |
||
Presentation/Response |
32 |
NP=10.5 |
||
Presentation/Timing |
619 |
NP=9 |
||
Response/Timing |
34 |
NP=12 |
||
Presentation/Response/Timing |
437 |
NP=9 |
||
New Mexico- NM High School Competency Exam “All Students” |
||||
10 |
Reading |
Presentation |
339 |
48.3% |
Response |
61 |
77.6% |
||
Timing |
507 |
57% |
||
Presentation/Timing |
409 |
40.6% |
||
Response/Timing |
18 |
83.3% |
||
Presentation/Response |
70 |
54.3% |
||
Presentation/Response/Timing |
127 |
29.2% |
||
10 |
Math |
Presentation |
339 |
38.8% |
Response |
61 |
58.6% |
||
Timing |
507 |
45.9% |
||
Presentation/Timing |
409 |
37.3% |
||
Response/Timing |
18 |
83.3% |
||
Presentation/Response |
70 |
56.5% |
||
Presentation/Response/Timing |
127 |
34.4% |
||
North Carolina- Grade 3 Pretest “All Students” |
||||
3 |
Reading |
Braille Edition |
3 (0%) |
- |
Large Print Edition |
48 (0%) |
45.8% |
||
Assistive Technology/Devices |
34 (0%) |
44.1% |
||
Braille Writer |
2 (0%) |
- |
||
Cranmer Abacus |
10 (0%) |
- |
||
Dictation to Scribe |
169 (.2%) |
30.2% |
||
Interpreter/Translator Signs/Cues Test (not allowed) |
24 (0%) |
- |
||
Magnification Devices |
8 (0%) |
- |
||
Student Marks Answers in Test Book |
5,104 (5%) |
38.0% |
||
Test Administrator Reads Test Aloud (in English) (not allowed) |
6,490 (6.4%) |
32.8% |
||
Typewriter/Word Processor |
6 (0%) |
- |
||
Hospital/Home Testing |
8 (0%) |
- |
||
Multiple Testing Sessions |
2,379 (2.3%) |
38.4% |
||
Scheduled Extended Time |
7,590 (7.5%) |
38.5% |
||
Testing in a Separate Room |
7,392 (7.3%) |
37.6% |
||
English/Native Language Dictionary/Electronic Translator |
127 (.1%) |
45.7% |
||
One Test Item Per Page |
11 (0%) |
- |
||
Unpublished Accommodation |
16 (0%) |
- |
||
Math |
Braille Edition |
3 (0%) |
- |
|
Large Print Edition |
48 (0%) |
69.6% |
||
Assistive Technology/Devices |
34 (0%) |
61.8% |
||
Braille Writer |
2 (0%) |
- |
||
Cranmer Abacus |
10 (0%) |
- |
||
Dictation to Scribe |
169 (.2%) |
50.3% |
||
Interpreter/Translator Signs/Cues Test (not allowed) |
24 (0%) |
- |
||
Magnification Devices |
8 (0%) |
- |
||
Student Marks Answers in Test Book |
5,104 (5%) |
65.9% |
||
Test Administrator Reads Test Aloud (in English) (not allowed) |
6,490 (6.4%) |
65.7% |
||
Typewriter/Word Processor |
6 (0%) |
- |
||
Hospital/Home Testing |
8 (0%) |
- |
||
Multiple Testing Sessions |
2,379 (2.3%) |
63.9% |
||
Scheduled Extended Time |
7,590 (7.5%) |
67.1% |
||
Testing in a Separate Room |
7,392 (7.3%) |
66.0% |
||
English/Native Language Dictionary/Electronic Translator |
127 (.1%) |
74.8% |
||
One Test Item Per Page |
11 (0%) |
- |
||
Unpublished Accommodation |
16 (0%) |
- |
||
Rhode Island- New Standards Reference Examinations “Students with Disabilities” |
||||
4 |
Reading |
IEP with Accommodations |
65 |
0% |
Math |
IEP with Accommodations |
65 |
20% |
|
8 |
Reading |
IEP with Accommodations |
57 |
0% |
Math |
IEP with Accommodations |
57 |
4% |
|
10 |
Reading |
IEP with Accommodations |
70 |
0% |
Math |
IEP with Accommodations |
70 |
2% |
|
West Virginia- SAT-9 “General ed. and Special Ed.” |
||||
3-11 |
SAT-9 Overall |
Overall tested with nonstandard accommodations |
25,360 |
- |
General ed. tested with nonstandard accommodations |
5,751 (3.1%) |
- |
||
Special ed. tested with nonstandard accommodations |
19,609 (10.5%) |
- |
Web-Based Reporting
State |
Test |
Word (a) |
Click (b) |
Prox-all (c) |
Prox-alt. (d) |
Prof.-def. (e) |
Print (f) |
Date (g) |
Trend (h) |
AL |
DAW |
Reports |
4 |
Same page |
2 clicks |
Yes (% meeting standards) |
Yes |
Yes |
No |
HSGE |
Reports |
4 |
Same page |
2 clicks |
Yes (% passing) |
Yes |
Yes |
No |
|
SAT-9 |
Reports |
4 |
Same page |
2 clicks |
No (PR) |
Yes |
Yes |
No |
|
AK |
CAT-6 |
Assessments |
3 |
2 clicks |
2 clicks |
No |
Yes |
Yes |
No |
Benchmark Exams |
Assessments |
3 |
Same page |
2 clicks |
Yes (Adv./ Prof.) |
Yes |
Yes |
Yes |
|
HSGQE |
Assessments |
3 |
Same page |
2 clicks |
Yes (% Proficient) |
Yes |
Yes |
No |
|
AZ |
SAT-9 |
Special Education |
4 |
7 clicks |
N/A |
No (PR) |
Yes |
Yes |
No |
AIMS |
Accountability and Standards |
5 |
3 clicks |
N/A |
Yes (Meets the stand. & exceeds the stand.) |
Yes |
Yes |
No |
|
AIMS-Exit |
Accountability and Standards |
5 |
3 clicks |
N/A |
Yes (Meets the stand. & exceeds the stand.) |
Yes |
Yes |
No |
|
CA |
Content Standard |
STAR (Stanford 9) |
4 |
1 click |
N/A |
Yes- % Prof. & % Adv. |
Yes |
No |
No |
SAT-9 |
STAR (Stanford 9) |
4 |
1 click |
N/A |
No (PR) |
Yes |
No |
No |
|
SABE/2 |
STAR (Stanford 9) |
4 |
2 clicks |
N/A |
No (PR) |
Yes |
Yes |
Yes |
|
CO |
CSAP |
Assessment |
3 |
Same page |
6 clicks |
Yes (% Prof. & % Adv.) |
Yes |
No |
No |
CT |
CMT |
State CMT Results, 2002 |
1 |
Same doc. |
Same doc. |
Yes (% scoring within the goal range) |
Yes |
Yes |
Yes |
CAPT |
State CAPT Results, 2002 |
1 |
Same doc. |
Same doc. |
Yes (% at or above goal) |
Yes |
Yes |
Yes |
|
DE |
DSTP |
DSTP- Delaware Student Testing Program |
5 |
1 click |
N/A |
Yes (% Meets or exceeds standard) |
Yes |
Yes |
Yes |
FL |
FCAT |
Special Education (pull-down bar) |
3 |
6 clicks |
N/A |
No (no def. of levels) |
Yes |
Yes |
Yes |
GA |
CRCT |
More |
7 |
Same page |
8 clicks |
Yes (Meets & Exceeds) |
No- cuts off right |
Yes |
Yes |
GHSGT |
More |
7 |
Same page |
8 clicks |
Yes (Pass and Pass Plus) |
No-cuts off right |
Yes |
Yes |
|
Writing Assessment |
More |
7 |
Same page |
8 clicks |
Yes (On target & Exceeds target) |
No-cuts off right |
Yes |
Yes |
|
ID |
IDA |
Statistics |
2 |
1 click |
N/A |
No (no def.) |
Yes |
Yes |
Yes |
ITBS |
Statistics |
2 |
1 click |
N/A |
No (%ile of avg. standard score) |
Yes |
Yes |
Yes |
|
TAP |
Statistics |
2 |
1 click |
N/A |
No (%ile of avg. standard score) |
Yes |
Yes |
Yes |
|
IRI |
Statistics |
2 |
1 click |
N/A |
No (no def.) |
Yes |
Yes |
Yes |
|
IL |
ISAT |
Administrators |
4 |
Same page |
Same doc. |
Yes (meets & exceeds standards) |
No- cuts off right |
No |
No |
PSAE |
Administrators |
4 |
Same page |
Same doc. |
Yes (meets & exceeds standards) |
No- cuts off right |
No |
No |
|
IN |
ISTEP |
ISTEP and Info Center |
3 |
Same page |
N/A |
Yes (# and % pass) |
No-Cuts off right |
Yes |
No |
|
GQE |
ISTEP and Info Center |
3 |
Same page |
N/A |
Yes (# and % above) |
No-Cuts off right |
Yes |
No |
IA |
ITBS/ITED |
Reports, Data, & Statistics |
3 |
Same page |
N/A |
Yes (% at or above prof. level) |
Yes |
Yes |
Yes |
KS |
KAS |
Building, district, & state report cards |
2 |
1 click |
6 clicks |
Yes (% prof., adv. & exemplary) |
Yes |
No |
No |
KY |
CTBS |
Testing and Reporting |
5 |
Same page |
Same page |
No (PR) |
Yes |
Yes |
No |
KCCT |
Testing and Reporting |
6 |
Same page |
Same page |
Yes (Prof. & distinguished) |
Yes |
Yes |
No |
|
MD |
MSPAP |
Testing |
4 |
Same page |
2 clicks |
Yes (% satisfactory) |
Yes |
No |
Yes |
MFT |
Testing |
4 |
Same page |
2 clicks |
Yes (% passing) |
Yes |
No |
Yes |
|
HSA |
Testing |
4 |
Same page |
2 clicks |
No (Median NPR) |
Yes |
No |
No |
|
CTBS/5 |
Testing |
4 |
Same page |
2 clicks |
No (Median NPR) |
Yes |
No |
Yes |
|
MA |
MCAS |
Assessment/ Accountability |
4 |
Same doc. |
3 clicks |
Yes (% prof. & advanced) |
Yes |
Yes |
Yes |
MN |
MCA |
MN Connecting Learning, Accountability, Students, and Schools |
5 |
Same page |
N/A |
Yes (definition given) |
Yes |
Yes |
Yes |
BST |
MN Connecting Learning, Accountability, Students, and Schools |
5 |
Same page |
N/A |
Yes (% passing) |
Yes |
Yes |
Yes |
|
MS |
MCT |
Assessment, Students (pull-down bar) |
5 |
1 click |
Same page |
Yes (% prof. & adv.) |
Yes |
Yes |
No |
FLE |
Assessment, Students (pull-down bar) |
5 |
1 click |
Same page |
Yes (% pass) |
Yes |
Yes |
No |
|
Writing Assessment |
Assessment, Students (pull-down bar) |
5 |
1 click |
Same page |
No |
Yes |
Yes |
No |
|
CTBS/5 |
Assessment, Students (pull-down bar) |
5 |
1 click |
Same page |
No (PR) |
Yes |
Yes |
No |
|
MO |
MAP |
Student Assessment/ MAP |
2 |
Same page |
Same page |
Yes (% prof. & adv.) |
No- cuts off right |
Yes |
Yes |
MT |
ITBS/ITED |
Assessment (MontCAS) (pull-down bar) |
5 |
Same page |
Same page |
Yes (% proficient & advanced) |
No- cuts off right |
No |
Yes |
NE |
Nebraska Statewide Writing Assessment |
Click to view the 2001-2002 state of the schools report |
3 |
Same page |
1 click |
Yes (% meeting or exceeding standards) |
Yes |
Yes |
No |
Assessment of State Mathematics Standards |
Click to view the 2001-2002 state of the schools report |
3 |
Same page |
1 click |
Yes (% meeting or exceeding standards) |
Yes |
Yes |
No |
|
NV |
CRT |
NV Statewide Education Database |
3 |
Same page |
N/A |
Yes (% meets and exceeds standards) |
Yes |
Yes |
No |
HSPA |
NV Statewide Education Database |
3 |
Same page |
N/A |
Yes (% pass) |
Yes |
Yes |
No |
|
NH |
NHEIAP |
Reports and Statistics |
4 |
Same doc. |
Same doc. |
Yes (% prof. or above) |
Yes |
Yes |
No |
NJ |
ESPA |
Assessment (pull-down bar) |
5 |
Same page |
6 clicks |
Yes (% prof. & adv.) |
Yes |
Yes |
No |
GEPA |
Assessment (pull-down bar) |
5 |
Same page |
6 clicks |
Yes (% prof. & adv.) |
Yes |
Yes |
No |
|
HSPA |
Assessment (pull-down bar) |
4 |
Same page |
5 clicks |
Yes (% prof. & adv.) |
Yes |
Yes |
No |
|
NM |
NMAAP |
Executive summary for the NM Articulated Assessment Program (Spring 2002) |
1 |
Same page |
3 clicks |
No (Median PR) |
Yes |
Yes |
No |
NM Writing Assessment Program |
Executive summary for the NM Articulated Assessment Program (Spring 2002) |
1 |
Same page |
3 clicks |
No (no definition) |
Yes |
No |
Yes |
|
NM High School Competency Exam |
Executive summary for the NM Articulated Assessment Program (Spring 2002) |
1 |
Same page |
3 clicks |
Yes (% passing) |
Yes |
Yes |
No |
|
NC |
End of Grade |
Reports & Statistics |
8 |
Same page |
2 clicks |
Yes (% at or above level III) |
Yes |
Yes |
Yes |
End of Course |
Reports & Statistics |
8 |
Same page |
2 clicks |
Yes (% at or above level III) |
Yes |
Yes |
Yes |
|
Writing Assessment |
Reports & Statistics |
3 |
Same page |
Same doc. |
Yes (% at or above 2.5) |
Yes |
Yes |
Yes |
|
Competency Test |
Reports & Statistics |
5 |
Same page |
N/A |
Yes (% Proficient) |
Yes |
Yes |
Yes |
|
Computer Skills |
Reports & Statistics |
5 |
Same page |
N/A |
Yes (% Proficient) |
Yes |
Yes |
Yes |
|
Grade 3 Pretest |
Reports & Statistics |
4 |
Same page |
N/A |
Yes (Percent at or above Level III) |
Yes |
Yes |
Yes |
|
ND |
ND State Assessment |
Programs and Services |
3 |
Same page |
N/A |
Yes (% prof. & adv.) |
Yes |
Yes |
No |
OH |
OPT |
Data and select local report cards |
3 |
Same page |
N/A |
Yes (% at or above prof.) |
Yes |
Yes |
No |
PA |
PSSA |
K-12 schools |
6 |
Same page |
7 clicks |
Yes (% prof. & adv.) |
Yes |
Yes |
No |
RI |
NSRE |
Infoworks! |
3 |
3 clicks |
3 clicks |
Yes (% prof.) |
No- cuts off right |
No |
No |
RI State Writing Assessment |
Infoworks! |
3 |
3 clicks |
3 clicks |
Yes (% prof.) |
No-cuts off right |
No |
No |
|
RI Health Education Assessment |
Infoworks! |
3 |
3 clicks |
3 clicks |
Yes (% prof.) |
No- cuts off right |
No |
No |
|
SC |
PACT |
Test Scores |
3 |
Same page |
6 clicks |
Yes (% prof. & adv.) |
No- cuts off right |
No |
No |
HSEE |
Test Scores |
3 |
Same page |
6 clicks |
Yes (% meet. stand.) |
Yes |
No |
Yes |
|
SD |
SAT-9
|
2001-2002 Education in SD: District and Statewide Profiles |
5 |
Same page |
N/A |
Yes (% prof. & adv.) |
No- cuts off right |
Yes |
No |
TN |
TCAP Ach. |
Tests |
2 |
Same page |
N/A |
No (Median NP) |
Yes |
No |
No |
TCAP Comp. |
Tests |
2 |
Same page |
N/A |
Yes (% passing) |
No- cuts off right |
Yes |
Yes
|
|
Gateway Testing Initiative |
Tests |
4 |
Same page |
N/A |
Yes (% proficient) |
Yes |
Yes |
No |
|
TX |
TAAS |
AEIS Reports |
3 |
Same page |
N/A |
Yes (% passing) |
No (cuts off on right) |
Yes |
Yes |
TAAS-EXIT |
AEIS Reports |
3 |
Same page |
N/A |
Yes (% passing) |
No (cuts off right) |
Yes |
Yes |
|
RPTE |
Assessment/ Testing |
4 |
Same page |
N/A |
Yes (advanced) |
Yes |
Yes |
No |
|
UT |
CCRT |
Evaluation and Assessment |
2 |
Same page |
3 clicks |
No (Med. %ile Scores) |
No- cuts off right |
No |
Yes |
SAT-9 |
Evaluation and Assessment |
2 |
Same page |
3 clicks |
Yes (% Mastery) |
Yes |
No |
No |
|
VT |
VCAS |
School Data and Reports |
3 |
Same page |
N/A |
Yes (% Achiev & Honors) |
Yes |
No |
No |
VA |
SOL |
Reports |
3 |
Same page |
3 clicks |
Yes (Passing rate) |
Yes |
Yes |
Yes |
WA |
WASL |
Assessment and Research |
3 |
1 click |
Same page |
Yes (% who met standard) |
Yes |
Yes |
Yes |
WV |
SAT-9 |
Special Education |
3 |
Same page |
Same page (particip.); 2 (perf.) |
No (no description) |
No- cuts off right |
Yes |
No |
WI |
WKCE |
Statistics and Reports |
4 |
Same page |
Same page |
Yes (prof. & adv.) |
Yes |
Yes |
No |
WRCT |
Statistics and Reports |
4 |
Same page |
5 clicks |
Yes (Prof & Advanced) |
Yes |
No |
No |
a. Word on the main Web page that indicates results (e.g., "Assessment Data")
b. Number of clicks from the homepage to disaggregated results (e.g., 4 clicks)
c. Proximity of special education data to "all students" or "regular education" data (e.g., same page)
d. Proximity of alternate assessment data to disaggregated data (e.g., 3 clicks)
e. Is the term "proficient" defined? (yes or no)
f. Do all of the data appear on one page when printed, or do some get cut off or print in white so they are not visible? (yes or no; specify the problem)
g. Is the date of testing on the same page as the results? (yes or no; the year alone does not count, it must include spring/fall or the month)
h. Are at least two years of trend data available on the same page, or is a direct link given on the page with the 2001-2002 data? (yes or no)
State | Subject | Grade | Type of Test | Test Name
Alabama | Reading and Math | 11 | EXIT | High School Graduation Exam
Alaska | Reading and Math | 3,8 | CRT | Benchmark Exams
 | Reading and Math | 10 | EXIT | HSGQE
Arizona | Reading and Math | 3,8 | CRT | AIMS
 | Reading and Math | 10 | EXIT | AIMS Exit
Arkansas | Reading and Math | 4,8 | CRT | ACTAAP
California | Reading and Math | 4,7 | CRT | Content Standard
Colorado | Reading | 4,8,10 | CRT | CSAP
 | Math | 5,8,10 | CRT | CSAP
Connecticut | Reading and Math | 4,8,10 | CRT | CMT
Delaware | Reading and Math | 3,8,10 | NRT/CRT | DSTP
Georgia | Reading and Math | 4,8 | CRT | CRCT
 | Reading and Math | 11 | EXIT | GHSGT
Illinois | Reading and Math | 3,8,11 | CRT | ISAT
 | Reading and Math | 11 | EXIT | PSAE
Kansas | Reading | 5,8,11 | CRT | KAS
 | Math | 4,7,10 | CRT | KAS
Kentucky | Reading | 4,7 | CRT | KCCT
 | Math | 5,8 | CRT | KCCT
Louisiana | Reading and Math | 4,8 | CRT | LEAP 21
 | Reading and Math | 10 | EXIT | GEE 21
Maryland | Reading and Math | 3,8 | CRT | MSPAP
 | Reading and Math | 9 | EXIT | MFT
Massachusetts | Reading | 4,7,10 | CRT | MCAS
 | Math | 4,8,10 | CRT | MCAS
Michigan | Reading | 4,7 | CRT | MEAP
 | Math | 4,8 | CRT | MEAP
Minnesota | Reading and Math | 3 | CRT | MCA
 | Reading and Math | 8 | EXIT | BST
Mississippi | Reading and Math | 4,8 | CRT | MS Curriculum Test
Missouri | Reading | 3,7,11 | CRT | MAP
 | Math | 4,8,10 | CRT | MAP
Nebraska | Math | 4,8,11 | CRT | Assessment of State Mathematics Standards
Nevada | Reading and Math | 3 | CRT | NV Criterion-Referenced Test
 | Reading | 11 | EXIT | Graduation Exam
 | Math | 10 | EXIT | Graduation Exam
New Hampshire | Reading and Math | 3,6,10 | CRT | NHEIAP
New Jersey | Reading and Math | 4,8 | CRT | ESPA; GEPA
 | Reading and Math | 11 | EXIT | HSPA
New Mexico | Reading and Math | 10 | EXIT | NM High School Competency Exam
New York | Reading and Math | 4,8 | CRT | NY State Assessment Program
North Carolina | Reading and Math | 4,8 | CRT | End of Grade
North Dakota | Reading and Math | 4,8,12 | CRT | ND State Assessment
Ohio | Reading and Math | 4,6,10 | CRT | OH Proficiency Test
 | Reading and Math | 9 | EXIT | OH Proficiency Test
Pennsylvania | Reading and Math | 5,8,11 | CRT | PSSA
South Carolina | Reading and Math | 10 | EXIT | High School Exit Exam
Texas | Reading and Math | 4,8 | CRT | TAAS
Utah | Reading | 4,8,10 | CRT | Core Criterion-Referenced Tests
 | Math | 4,7 | CRT | Core Criterion-Referenced Tests
Virginia | Reading and Math | 3,8 | CRT | Standards of Learning
Washington | Reading and Math | 4,7,10 | CRT | WASL
Wisconsin | Reading and Math | 4,8,10 | CRT | WKCE