District Court, exhibits four through nine from memorandum brief.

Year 2 Evaluation: The Effectiveness of the PreK-2 Literacy Program in the Little Rock School District, 1999-2000 and 2000-2001

Presented to the Board of Education, Little Rock School District, October 2001

Prepared by: Dr. Bonnie A. Lesley, Dr. Ed Williams, Patricia Price, Pat Busbea, Ann Freeman, Ken Savage, Anita Gilliam, Sharon Kiilsgaard

Table of Contents

Section I: Introduction
  Introduction
  Research Questions
  Methodology
  Outline of Program Evaluation Sections
  Outline of Appendices

Section II: Background on Program Design
  Background on Program Requirements: Design of the PreK-3 Literacy Program
  Background on Program Requirements: LRSD Strategic Plan
  Background on Program Requirements: Revised Desegregation and Education Plan

Section III: The Assessments
  The Assessments: Observation Survey
  The Assessments: Developmental Reading Assessment
  Definition of "Readiness" vs. "Proficiency"
  Reliability and Validity: National Study
  Reliability and Validity: LRSD Study
  Developmental Appropriateness of Testing Instruments
  The Assessments: Achievement Level Tests in Reading and Language Usage

Section IV: Alignment with National Research on Early Literacy

Section V: Description of Tables
  Table 1: Kindergarten, 1999-2000, Fall to Spring Black and Non-Black Performance
  Table 2: Kindergarten, 2000-01, Fall to Spring Black and Non-Black Performance
  Table 3: Grade 1, 1999-2000, Fall to Spring Black and Non-Black Performance
  Table 4: Grade 1, 2000-01, Fall to Spring Black and Non-Black Performance
  Table 5: Grade 2, 1999-2000, Fall to Spring Black and Non-Black Performance
  Table 6: Grade 2, 2000-01, Fall to Spring Black and Non-Black Performance
  Table 7: Cohort 1, Kindergarten Fall 1999 and Grade 1 Spring 2001
  Table 8: Cohort 2, Grade 1 Fall 1999 and Grade 2 Spring 2001
  Table 9: Grades K-2, 1999-2000, Fall to Spring Performance, All Students
  Table 10: Grades K-2, 2000-01, Fall to Spring Performance, All Students
  Table 11: Percent of Maximum Scores, Kindergarten Black Students
  Table 12: Percent of Maximum Scores, Kindergarten Non-Black Students
  Table 13: Percent of Maximum Scores, Kindergarten All Students
  Table 14: Percent of Maximum Scores, Grade 1 Black Students
  Table 15: Percent of Maximum Scores, Grade 1 Non-Black Students
  Table 16: Percent of Maximum Scores, Grade 1 All Students
  Table 17: Percent of Maximum Scores, Grade 2 Black Students
  Table 18: Percent of Maximum Scores, Grade 2 Non-Black Students
  Table 19: Percent of Maximum Scores, Grade 2 All Students
  Table 20: Cohort 1, All Students, Kindergarten Fall 1999 and Grade 1 Spring 2001
  Table 21: Cohort 2, All Students, Grade 1 Fall 1999 and Grade 2 Spring 2001
  Table 22: Percent Readiness, DRA, Black and Non-Black Students
  Table 23: Percent Readiness, DRA, All Students
  Table 24: Grade 2 Reading, ALT, Black and Non-Black Comparisons
  Table 25: Grade 2 Reading, ALT, All Students
  Table 26: Grade 2 Language Usage, ALT, Black and Non-Black Comparisons
  Table 27: Grade 2 Language Usage, ALT, All Students
Section VI: Analysis of Results, 1999-2000 and 2000-2001
  Letter Identification
  Word Test
  Concepts about Print
  Writing Vocabulary
  Hearing and Recording Sounds
  Developmental Reading Assessment

Section VII: Additional Data
  Achievement Gap Among Schools
  Impact of Professional Development

Section VIII: Program Evaluation Findings and Recommendations for Improvement
  Research Question 1: Program Effectiveness
  Research Question 2: Achievement Disparities
  Research Question 3: Professional Development
  Research Question 4: Four Literacy Models
  Research Question 5: Program Strengths and Weaknesses
  Research Question 6: Cost Effectiveness
  Recommendations for Improvement
    Instruction
    Parent Involvement
    Interventions
    Professional Development
    Schools Identified for Improvement
    Year 3 Program Evaluation

Section IX: Bibliography

Section X: School-Level Data
  Letter Identification, Kindergarten
  Word Test, Kindergarten
  Concepts about Print, Kindergarten
  Writing Vocabulary, Kindergarten
  Hearing and Recording Sounds, Kindergarten
  Developmental Reading Assessment, Kindergarten
  Letter Identification, Grade 1
  Word Test, Grade 1
  Concepts about Print, Grade 1
  Writing Vocabulary, Grade 1
  Hearing and Recording Sounds, Grade 1
  Developmental Reading Assessment, Grade 1
  Word Test, Grade 2
  Writing Vocabulary, Grade 2
  Hearing and Recording Sounds, Grade 2
  Developmental Reading Assessment, Grade 2
  Cohort 1: Letter Identification, Black and Non-Black
  Cohort 1: Word Test, Black and Non-Black
  Cohort 1: Concepts about Print, Black and Non-Black
  Cohort 1: Writing Vocabulary, Black and Non-Black
  Cohort 1: Hearing and Recording Sounds, Black and Non-Black
  Cohort 1: Developmental Reading Assessment, Black and Non-Black
  Cohort 2: Word Test, Black and Non-Black
  Cohort 2: Writing Vocabulary, Black and Non-Black
  Cohort 2: Hearing and Recording Sounds, Black and Non-Black
  Cohort 2: Developmental Reading Assessment, Black and Non-Black
  Percent Readiness, Developmental Reading Assessment, K-2
  Percent Readiness, DRA, Rank Order, K-2
  Percent Readiness, DRA, Black and Non-Black
  Grade 2 ALT, Reading, All Students
  Grade 2 ALT, Reading, Black and Non-Black
  Grade 2 ALT, Language Usage, All Students
  Grade 2 ALT, Language Usage, Black and Non-Black

Appendices
A. PreK-3 Literacy Program Plan
B. Section 5.2.1 of the Revised Desegregation and Education Plan's March 2000 Interim Compliance Report
C. Section 5.2.1 of the Revised Desegregation and Education Plan's March 2001 Compliance Report
D. Presentation to the Board of Education, January 2000 (update on program implementation and early results)
E. "Update on the Implementation of the PreK-3 Literacy Program Plan," "Highlights of Grades K-2 Results: Developmental Reading Assessment, 1999-2000 and 2000-2001," and a copy of the slides for the June 2001 presentation to the Board of Education

Year 2 Evaluation: The Effectiveness of the PreK-2 Literacy Program in the Little Rock School District, 1999-2000 and 2000-2001

Section I: Introduction

Introduction

During March 2000 the Little Rock School District provided to the Board of Education, the federal court, the Office of Desegregation Monitoring, and administrators an Interim Compliance Report, which included a status report on the implementation of the PreK-3 Literacy Program (pp. 93-105) relating to the Revised Desegregation and Education Plan (RDEP). In August 2000 the Planning, Research, and Evaluation (PRE) office provided to the Board and staff a draft copy of a program evaluation for the first year of implementation of the K-2 Literacy Program. At least two subsequent drafts were developed as more data became available, but these were not presented to the Board of Education; they were discussed only among staff members. An implementation update was provided to the Board in January 2001 by the curriculum staff on the status of program implementation, including an analysis of available data and an outline of next steps. Then in March 2001 the staff provided a summary evaluation in the Compliance Report (pp. 72-93) relating to the Revised Desegregation and Education Plan that was filed with the federal court and provided to members of the Board of Education.

The Board of Education approved on second reading in March 2001 a new policy on program evaluation. Policy IL: Evaluation of Instructional Programs requires that the staff evaluate the instructional programs designated by the Board of Education in its annual approval of the program evaluation agenda. Each evaluation is to "provide valuable insights into how programs are operating, the extent to which they are serving the intended purpose of increasing student achievement, the strengths and weaknesses, the cost-effectiveness, and directions for the future." In August 1999, 2000, and 2001, the Board of Education included the PreK-2 literacy program on its approved research agenda for the following year.

An interim program evaluation was provided to the Board of Education in June 2001, the first analysis of the scores on the Developmental Reading Assessment in grades K-2 for 1999-2000 and 2000-2001. At that time the scores were reported as the percent of students at each grade level, by race, who met the standard for "readiness," the level that would predict success at the next grade level (level 2 at kindergarten, level 16 at grade 1, and level 24 at grade 2). Copies of that report, plus the summary and the slides, were immediately sent via e-mail to principals to use in their own analysis and to provide to teachers and parents. (See Appendix E.) Elementary principals used these materials in their August 2001 preschool inservice sessions.

This "Year 2 Evaluation of the Effectiveness of the PreK-2 Literacy Program in the Little Rock School District" builds on the information provided in all earlier reports. It is intended to meet the requirements specified in Policy IL for the 2000-01 school year, as well as to fulfill the requirements in Section 2.7.1 of the Revised Desegregation and Education Plan for the PreK-3 Literacy Program Plan.
The grade levels evaluated include only kindergarten through grade 2. Another report will include grades 3 through 5.

The curriculum staff received from PRE on July 19, 2001, the report on the mean scores for K-2 students on both the Observation Survey and the Developmental Reading Assessment for 2000-01. Achievement Level Test data were available earlier, but they had not yet been disaggregated by race. This program evaluation, therefore, differs from, but builds upon, the evaluation report that was presented to the Board of Education in June. It includes a much more detailed analysis of data; it includes the results of the five sub-tests of the Observation Survey; and it includes the average performance scores for each school on each sub-test, not just the percent of students meeting the standard. It also includes the results of the grade 2 Achievement Level Tests in reading and language usage. The new data permit the staff to calculate and analyze the scores in a different way (mean performance vs. percent readiness), and they permit the calculation of a black to non-black student ratio so that the degree to which the achievement gap is narrowed can be measured, as well as how the gap has changed over the two years of program implementation.

One caution in comparing the 1999-2000 and 2000-01 pre-test scores on the Observation Survey and the Developmental Reading Assessment is that some schools did not complete their fall testing by the deadline in 1999, and so their pre-test scores were higher than they would have been had the testing been done in a timely manner. There were instances when there were several weeks' difference in the test date, so this variance would affect the pre-test scores. The kindergarten pre-test scores in fall 2000, for instance, were generally lower than those for fall 1999, for both black and non-black students. These differences do not necessarily indicate that this past year's kindergarten class was that much weaker than the one the year before, especially when this past year's end-of-year scores were higher than the previous class's end-of-year scores.

The third and fourth tests administered are the Achievement Level Tests in reading and language usage that are given in spring of grade 2. Those scores, combined with the results of the Observation Survey and the Developmental Reading Assessment, enable the District to assess the effectiveness of the early literacy program in LRSD, including its impact on "the improvement of the academic achievement of African American children."

Research Questions

Using the obligations set forth in the Revised Desegregation and Education Plan (RDEP), the Board's Strategic Plan, and the Board's Policy IL, the following research questions were established to guide this study:

1. Are the new curriculum standards/benchmarks, instructional strategies, and materials effective in teaching primary grade students how to read independently and understand words on a page? (See Section 5.2.1a of RDEP and Strategy 2 of the Strategic Plan.)
2. Is the new program effective in improving and remediating the academic achievement of African American students? (See Section 2.7 of RDEP.)
3. Is there a relationship between teacher participation in professional development and student achievement? (See Policy IL expectation to examine cost effectiveness and Strategy 7 of the Strategic Plan.)
4. Is there evidence of success in each of the four literacy models in use: Early Literacy Learning in Arkansas (ELLA) only; ELLA and Reading Recovery; Success for All; and Direct Instruction? (See Section 2.7 of RDEP.)
5. What are the program's strengths and weaknesses? (See Policy IL.)
6. Is the program cost effective? (See Policy IL and Strategy 3 of the Strategic Plan.)

Methodology

An interdisciplinary team was assembled to prepare the program evaluation for the PreK-2 literacy program for Year 2. Several staff members provided assistance and support in the construction of 27 separate tables of district-level data to display not only the mean scores for each sub-test, by race and for all students, on the Observation Survey and the Developmental Reading Assessment, but also the percent who scored at or above the "readiness" level on the Developmental Reading Assessment and the median RIT score on the sub-tests of the Achievement Level Tests. Calculations were verified three times by separate staff members to ensure the highest possible degree of accuracy.

Among the calculations made to assist in the analysis of data were the number of points of growth from fall to spring for each of the two years, from spring to spring, and from fall of one grade to spring of the following grade (a two-year growth). Black to non-black ratios were calculated to determine the degree to which black students were attaining essential knowledge and skill at the same level as non-black students. Growth ratios were also determined, that is, the degree to which growth in a given year by black students was at the same level as, or higher than, that of non-black students. The percent of growth for one year of instruction and then two years of instruction in the program was calculated for each level and each sub-test, although these calculations were not used in the section on "findings" or in the recommendations made for improvement. And, finally, the mean score on the Observation Survey and the Developmental Reading Assessment was divided by the maximum possible score to determine the average percent for each score. An additional table was constructed to display the achievement gap between/among schools for each sub-test at each grade level.

The District's statistician conducted three statistical studies that informed the study: one of the average number of days of teacher participation in professional development on the implementation of ELLA, by program model; another of descriptive statistics relating teacher participation in professional development on ELLA implementation to student achievement; and a third to determine the validity of the Observation Survey and the Developmental Reading Assessment in relationship to the Achievement Level Tests. Finally, 87 tables of school-level data were constructed to add to the study and to provide the critical information for school-level staff members to conduct their own analyses at the school level.

Throughout the writing of this report, individual staff members, both program staff and assessment specialists, were interviewed and queried in order to clarify issues of program implementation, testing administration, instructional procedures, and data interpretation. Their assistance was invaluable.
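The calculations described in the methodology above are simple arithmetic on the fall and spring score reports. The sketch below, in Python, illustrates how they might be reproduced; the sample values, the sub-test names in the dictionary, and the readiness cut score used in the example are illustrative assumptions rather than the District's actual data files or procedures.

```python
# Illustrative sketch (not the District's actual procedure) of the calculations
# named in the Methodology: fall-to-spring growth, black to non-black ratios,
# growth ratios, percent of maximum score, and percent readiness on the DRA.

MAX_SCORES = {  # maximum possible sub-test scores cited later in this report
    "Letter Identification": 54,
    "Word Test": 20,
    "Concepts about Print": 24,
    "Hearing and Recording Sounds (K-1)": 37,
}

def growth(fall_mean, spring_mean):
    """Points of growth from fall to spring."""
    return spring_mean - fall_mean

def black_nonblack_ratio(black_mean, nonblack_mean):
    """Ratio of black to non-black mean scores; 1.0 indicates parity."""
    return black_mean / nonblack_mean

def growth_ratio(black_fall, black_spring, nonblack_fall, nonblack_spring):
    """Ratio of black growth to non-black growth over the same period."""
    return growth(black_fall, black_spring) / growth(nonblack_fall, nonblack_spring)

def percent_of_maximum(mean_score, subtest):
    """Mean score expressed as a percent of the maximum possible score."""
    return 100.0 * mean_score / MAX_SCORES[subtest]

def percent_ready(dra_levels, cut_score):
    """Percent of students at or above the DRA 'readiness' cut score."""
    ready = sum(1 for level in dra_levels if level >= cut_score)
    return 100.0 * ready / len(dra_levels)

if __name__ == "__main__":
    # Hypothetical kindergarten Letter Identification means, fall and spring.
    print(growth(15.0, 48.0))                       # points of growth
    print(black_nonblack_ratio(47.0, 50.0))         # spring black/non-black ratio
    print(growth_ratio(14.0, 47.0, 17.0, 50.0))     # growth ratio
    print(percent_of_maximum(48.0, "Letter Identification"))
    # Hypothetical DRA levels scored against the kindergarten cut score of 2.
    print(percent_ready([1, 2, 3, 4, 0, 2, 6, 1], cut_score=2))
```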
The research studies that guided the initial design of the PreK-3 Literacy Program Plan were again reviewed, especially the research on the identification of prerequisite knowledge and skills that children must acquire on their pathway to learning to read. These findings were once again mapped against the implementation plan for LRSD, as well as against the assessment instruments, to ensure ongoing alignment. Serendipitously, the National Center for Education Statistics published a report in July 2001 entitled Educational Achievement and Black-White Inequality, which proved to be very helpful in interpreting Little Rock results in a national context and which is cited in this program evaluation, along with other external studies.

Multiple strategies to analyze the data were employed so as to establish as thoroughly and comprehensively as possible a basis for determining the program's quality. The detailed analysis is found in Section VI. No attempt was made in this study to analyze the results for limited-English proficient children, since that program is evaluated separately. It is important to note, however, that the scores of limited-English proficient students are included in each school's results. The District requires them to take the tests so that their progress in learning English, as well as in learning to read, may be monitored.

And, finally, credible research studies were consulted, as were informed staff, in the determination of recommendations for improvement or next steps in becoming even more effective.

Before the program evaluation was published, it was reviewed by many individuals, including Dr. Steve Ross of the University of Memphis, and groups, including the Early Literacy program staff, PRE staff representatives, and School Services staff. The District is grateful to all who offered feedback and suggestions for the improvement of this report. To the best of the writer's ability, the suggestions for improvement were incorporated into the draft. Others were added to recommendations for the Year 3 study.

Outline of Program Evaluation Sections

This report is organized into ten sections:

1. Section I includes the Introduction, as well as a delineation of the Research Questions for the study and a description of the methodologies employed.
2. Section II provides background information on the program design and its relationship to the Strategic Plan and the Revised Desegregation Plan.
3. Section III describes the selection of appropriate assessments for grades K-2 and the processes by which "readiness" standards were established for each grade level for the Developmental Reading Assessment. It also includes information on national and local validation studies of the Observation Survey and the Developmental Reading Assessment, as compared to the Achievement Level Test.
4. The literacy plan's design in relationship to the findings in national research studies on early literacy is described in Section IV. This section also includes an alignment of the research with the assessments selected by the District.
5. Three major sections on data analysis follow. Section V is a description of each of the tables constructed from the data reports to assist the writers of this report and its readers in analyzing the results on the eight measurements: the five sub-tests of the Observation Survey (OS), the Developmental Reading Assessment (DRA), and the reading and language usage sub-tests of the Achievement Level Tests (ALTs).
6. Section VI is a detailed analysis of the data in each table and a comparison of 1999-2000 and 2000-01 data, by race.
7. Additional data are provided in Section VII on the achievement gap among schools and on some statistical studies that were conducted relating to program effectiveness and the relationship between teacher participation in professional development and the achievement of their students.
8. Following the data analysis is Section VIII, which summarizes the program strengths and weaknesses and specifies the implications for instruction, with specific recommendations for improvements in 2001-2002.
9. Section IX is the Bibliography for the study.
10. Section X includes 87 tables of school-level data. Those interested in individual school performance or comparisons are encouraged to use the model in this report for data analysis at the District level to conduct similar analyses at the school level.

Behind Section X are appendices A-E for more background and further reference:

A. "PreK-3 Literacy Program Plan"
B. Section 5.2.1 of the Revised Desegregation and Education Plan's March 2000 Interim Compliance Report
C. Section 5.2.1 of the Revised Desegregation and Education Plan's March 2001 Compliance Report
D. Presentation to the Board of Education, January 2000 (update on program implementation and early results)
E. "Update on the Implementation of the PreK-3 Literacy Program Plan," "Highlights of Grades K-2 Results: Developmental Reading Assessment, 1999-2000 and 2000-01," and a copy of the slides for the June 2001 presentation to the Board of Education

II. Background on Program Design

Background on Program Requirements: Design of the PreK-3 Literacy Program

During early fall 1998 a committee was formed in the Division of Instruction of the Little Rock School District to design a new elementary literacy program, with an emphasis on the primary grades of PreK-3. The processes and ultimate design of that plan are described in the PreK-3 Literacy Program Plan in Appendix A.

All elementary schools in the Little Rock School District are expected to teach the same curriculum standards and grade-level benchmarks, regardless of the instructional strategies and/or materials that are selected according to the various implementation models. Twenty-seven of the District's 35 schools are implementing the Early Literacy Learning in Arkansas (ELLA) instructional strategies that are the content of the professional development program for PreK-2 teachers. This model was developed through a collaborative effort that included the Reading Recovery Training Center at the University of Arkansas at Little Rock, the Arkansas Reading Recovery teacher leaders, and the Arkansas Department of Education. Nine schools are implementing the Reading Recovery program, a first-grade intervention developed by Marie Clay. Seven schools are implementing the Success for All model that was developed at Johns Hopkins University; Little Rock schools receive their training for this program from the University of Memphis. Both ELLA and Success for All training are designed from the same research base on early literacy; they differ in implementation strategies and materials. One school is implementing Direct Instruction through an approved waiver from the District program.
Both the Success for All schools and the Direct Instruction school are supplementing their programs, in some cases, with ELLA strategies for greater effectiveness. According to Busbea (2000):

In ELLA the importance of helping students feel like readers and writers on the first day of school is stressed. In order to achieve such a goal, teachers must provide students with the needed materials and opportunities for literacy activities. A balanced literacy approach is used to give students these opportunities. The children are engaged in whole text, but they are given formal instruction based on their strengths and needs (30-31).

The literacy components taught in the ELLA professional development program, again according to Busbea, are as follows:

- Read aloud
- Shared reading
- Guided reading
- Familiar reading
- Modeled writing or shared writing
- Interactive writing
- Writing aloud
- Revising and editing
- Independent writing and conferencing
- Phonetic skills
- Classroom management

Each school is required to dedicate a two and one-half hour block of uninterrupted time daily for literacy instruction.

Background on Program Requirements: LRSD Strategic Plan

The District adopted its Strategic Plan in 1996, and it was updated in fall 1998. Three of the eleven strategies were important in the development of the PreK-2 Literacy Program Plan:

Strategy 2: In partnership with our community, we will establish standards in the core curriculum (reading/language arts, mathematics, science, and social studies) at each appropriate level, as well as develop the means of assessing whether students have met these standards.

Strategy 3: We will develop and implement a broad range of alternatives and interventions for students scoring below the 50th percentile on standardized tests or who are at serious risk of not achieving District standards in the core curriculum.

Strategy 7: We will design a comprehensive staff development system to best achieve the mission and objectives in the Strategic Plan.

Background on Program Requirements: Revised Desegregation and Education Plan

The charge to the design committee of the PreK-3 Literacy Plan included three major sections of the Revised Desegregation and Education Plan that was approved by the federal court in February 1998: Section 2.7, Section 2.7.1, and Section 5.2.1. The first of these sections (2.7) establishes the obligation to improve the achievement of students, especially those who are African American.

Section 2.7: LRSD shall implement programs, policies, and/or procedures designed to improve and remediate the academic achievement of African-American students, including but not limited to Section 5 of this Revised Plan.

On January 21, 1998, Mr. John Walker, on behalf of the Joshua Intervenors, signed an agreement with the Little Rock School District that was filed with the federal court, which included the following statement:

With regard to the achievement disparity, the January 16 Revised Plan recognizes that the only legitimate means to eliminate the racial disparity in achievement is by improving African-American achievement (2).

To that end and to address the obligation in Section 2.7, the staff made a conscious decision to emphasize "designed to improve ... the academic achievement of African-American students," rather than to "remediate" that achievement, given the failure of most remediation efforts not only in Little Rock, but across the country.
This is not to say that the District abandoned its remediation efforts. It did not. Re-teaching, tutoring, Title I programs, computer-assisted instruction, inter-sessions in the Extended Year schools, after-school programs, summer school, and Reading Recovery (a first-grade intervention in some schools) continued as much as ever, but as supplements to the efforts going on in every classroom to prevent as much failure as possible, rather than trying to correct failure after it had occurred. These remediation efforts are documented in the schools' School Improvement Plans and their Title I Plans. And, of course, the Success for All program implemented in seven LRSD elementary schools and Direct Instruction at Washington Magnet can be described as both preventative and remedial in nature.

This decision to emphasize prevention of failure vs. remediation is supported in the published work of the National Research Council, Preventing Reading Difficulties in Young Children (1998); in the research in scores of studies sponsored by the International Reading Association; and in the work of Marie Clay, who developed the Reading Recovery program. The National Research Council concluded in its massive study the following:

The majority of reading problems faced by today's adolescents and adults are the results of problems that might have been avoided or resolved in their early childhood years. It is imperative that steps be taken to ensure that children overcome these obstacles during the primary grades (5).

Marie Clay writes the following:

Teachers and parents of 11- to 16-year-olds often believe that schools have done nothing for the reading difficulties of the young people they are concerned about. Yet the older child has probably been the focus of a whole sequence of well-intentioned efforts to help, each of which has done little for the child. This does not mean that children do not sometimes succeed with a brilliant teacher, a fantastic teacher-child relationship, a hard-working parent-child team. What it does mean is that the efforts often fail (15).

Dorothy Strickland makes a similar finding:

Historically, educators focused their attention on remediation, allowing children to fail before help was given. The importance of intervening early and effectively is well established among educators and social service providers (325).

She explains that "the cycle of failure often starts early in a child's school career" and that "there is a near 90% chance that a child who is a poor reader at the end of grade 1 will remain a poor reader at the end of grade 4." Therefore, as the child continues to experience "failure and defeat," he/she becomes likely to drop out of school (326). Also, she states that "supplementary remedial programs such as Title I and replacement programs that substitute for regular, in-class instruction have had mixed results over the years" (326). She concludes:

Those who have turned their attention to early intervention state that it is ultimately less costly than years of remediation, less costly than retention, and less costly to students' self-esteem. This final point may be the most compelling of all because the savings in human suffering and humiliation is incalculable. Teachers in remedial programs often observe that students who feel they are failures frequently give up and stop trying to learn despite adequate instructional opportunities (326).
Linda Dorn (1998), Reading Recovery Teacher Leader Trainer and developer of the Arkansas Early Literacy and Literacy Coach model, and her colleagues French and Jones explain this shift in understanding about teaching as follows:

Recently, Linda asked a group of teachers in a college course how they taught reading to their lowest achieving children. From their responses, it was clear that their theory was a deficit one guided by their concern about how much the children did not know. Traditionally, we have tested children to identify their weak areas and then designed instruction based on what they do not know. This theory of learning is in direct opposition to what research tells us about how the brain acquires information and then organizes related information into larger networks. ... instruction that is based on inadequate background is grounded in a deficit model, which may force young learners to rely on low-level processes (24-25).

In their summary of Chapter 1, they wrote:

Prevention of reading problems must begin in the early grades. If children are not reading on grade level by the end of third grade, their chance of success in later years is minimal. One significant characteristic of problem readers is their lack of literacy experiences during their preschool years. Schools must compensate by providing the children with rich literacy classroom programs and supplemental literacy services that focus on early intervention (15).

In other words, those who persist in insisting on remediation of learning as the primary emphasis for the lowest-achieving children doom those children to lessons that never get beyond the rote memorization of basic information, and those children will never have an opportunity to understand anything well, much less apply higher-order thinking skills. Dorn et al. (1998) urge teachers, therefore, to "identify the strengths of young children and use this information as the basis for designing rich learning experiences that emphasize problem-solving" (p. 25). In these ways, schools can prevent failure.

It should be noted that the District sees its HIPPY and expansive pre-kindergarten program as a part of its overall prevention-of-failure efforts. (See the Compliance Report of March 2001, pp. 72-73, for a breakdown of the 1,312 youngsters involved in early childhood education during 2000-01.)

The second section from the Revised Plan (2.7.1) requires the District to conduct annual assessments of English language arts and mathematics in order to determine their effectiveness in improving the achievement of African American students, and then to take appropriate action if a program is not effective by either modifying the program's implementation or replacing it.

Section 2.7.1: LRSD shall assess the academic programs implemented pursuant to Section 2.7 after each year in order to determine the effectiveness of the academic programs in improving African-American achievement. If this assessment reveals that a program has not and likely will not improve African-American achievement, LRSD shall take appropriate action in the form of either modifying how the program is implemented or replacing the program.

Prior to fall 1999 there was not in place a reading assessment (except the eight-week assessments in the Success for All schools) that measured student progress in their acquisition of learning-to-read skills in the early grades.
For a time the SAT9 was administered in grades 2-3, but it was not used to drive instructional practice as much as it was used to identify students for the gifted/talented program. The Literacy Benchmark examination required by the State of Arkansas in grade 4 was the first formal assessment of whether students could read independently. The design committee believed strongly that to comply with the Revised Plan and also, importantly, to be able to diagnose potential reading difficulties, as well as to identify progress and growth of individual students, classrooms, schools, and the District, an annual assessment would be required. The District could not afford to wait until grade 4 to find out whether every student had learned to read independently, a goal established in the Revised Plan.

After a review of the available literacy assessments for young children and after consulting with the experts involved in the Early Literacy Learning in Arkansas (ELLA) professional development program and with specialists at the Arkansas Department of Education, District staff decided to adopt two sets of measurements: the Observation Survey of Early Literacy Achievement developed by Marie M. Clay and the Developmental Reading Assessment developed by Joetta Beaver. Subsequently, because of a need to have a measurement for the identification of students for the grade 3 gifted/talented program, the Achievement Level Test developed by the Northwest Evaluation Association in collaboration with LRSD teacher teams was added to the assessment plan for grade 2. The results of these assessments would be the primary basis for evaluating program effectiveness.

The third section (5.2.1) of the Revised Desegregation and Education Plan establishes several curriculum, instruction, professional development, assessment, and parental involvement obligations:

Reading/Language Arts Section 5.2.1: Primary Grades. LRSD shall implement at least the following strategies to improve the academic achievement of students in kindergarten through third grade:

a. Establish as a goal that by the completion of the third grade all students will be reading independently and show understanding of words on a page;
b. Focus teaching efforts on reading/language arts instruction by teaching science and social studies through reading/language arts and mathematics experiences;
c. Promote thematic instruction;
d. Identify clear objectives for student mastery of all three reading cueing systems (phonics, semantics, and syntax) and of knowing-how-to-learn skills;
e. Monitor the appropriateness of teaching/learning materials to achieving curricular objectives and the availability of such materials in all classrooms;
f. Establish uninterrupted blocks of time for reading/language arts and mathematics instruction;
g. Monitor student performance using appropriate assessment devices;
h. Provide parents/guardians with better information about their child's academic achievement in order to help facilitate the academic development of the students;
i. Provide pre-kindergarten, kindergarten, and first grade learning readiness experiences for students who come to school without such experiences;
j. Train teachers to manage successful learning for all students in diverse, mainstreamed classrooms;
k. Use the third and/or fourth grade as a transition year from focused reading/language arts and mathematics instruction to a more traditional school day; and
l. Provide opportunities for students to perform and display their academic training in a public setting.

Rather than repeat in this program evaluation the information provided in a number of earlier reports, the relevant pages from those earlier reports are included in the appendices. The document in Appendix E entitled "Update on the Implementation of the PreK-3 Literacy Program Plan" includes the following list of initiatives that have been implemented from the PreK-3 plan and which require emphasis (pp. 2-3):

- Title I programming was restructured and aligned with the District's program.
- A moratorium was placed on adding any new supplemental reading/language arts programs. Some programs in previous use were abandoned. A waiver was granted to Washington Magnet to keep its Direct Instruction program.
- Curriculum standards, instructional strategies, instructional materials, assessments, and professional development were tightly aligned.
- Each school established a sacred, uninterrupted, two and one-half hour daily block for the teaching of reading/language arts.
- A new English-as-a-Second-Language program was implemented that is also tightly aligned with the District's general education program.
- New assessments that are developmentally appropriate and aligned with the curriculum and instructional program were implemented.
- Animated Literacy, a phonemic awareness program, was implemented in kindergarten.
- Early Literacy Learning in Arkansas (ELLA) was implemented in grades K-2, with Pre-ELLA added in fall 2000 for prekindergarten students.
- More than $350,000 was expended in the purchase of reading and other curriculum support materials during the past two years.
- A committee has almost completed work on a new elementary report card.
- Most primary teachers experienced a minimum of one week of ELLA training, with follow-ups as necessary and appropriate. (See the Compliance Report in Appendix C for lists of professional development sessions.)
- The Parent-School Compact was revised, and the Student Academic Improvement Plan (SAIP) was developed and implemented.
- The Parent Program was restructured in May 2000.
- An ESL Parent Coordinator was employed in spring 2001.

III. The Assessments

Marie Clay makes the point repeatedly in her book, An Observation Survey of Early Literacy Achievement (1993), that no one observation task is satisfactory on its own when one needs to make important instructional decisions for children (p. 20). She would find strong support from Grant Wiggins, a national expert in assessment. In his book, Assessing Student Performance: Exploring the Purpose and Limits of Testing (1993), Wiggins wrote: "One test signifies nothing, let us emphatically repeat, but five or six tests signify something. And that is so true that one might almost say, 'It matters very little what the tests are so long as they are numerous'" (13). In the Little Rock School District, the tests are numerous.

The Assessments: Observation Survey

Below is summary information about what the five sub-tests in the Observation Survey measure.

Letter Identification
This sub-test answers the following questions: What letters does the child know? Which letters can he/she identify? All letters, lower and upper case, are tested. The observation includes an analysis of the child's preferred mode of identifying letters, the letters a child confuses, and the unknown letters (Clay, p. 43). The maximum score is 54.
This test is administered in grades K-1. Word Test The student is tested over the most frequently occurring words in whatever basic reading texts are being used. Scores on this measure are useful in determining a child's "readiness to read." (Clay, p. 53) The maximum score is 20. This test is administered in grades K-2. Concepts about Print This sub-test (5-10 minutes) includes testing whether the student knows the front of the book, that the print (not the picture) tells the story, that there are letters, that are clusters of letters called words, that there are first letters and last letters in words, that you can choose upper or lower case letters, that spaces are there for a reason, and that different punctuation marks have meanings. Scores on this measure have proven to be a sensitive indicator of behaviors that support reading acquisition. (Clay, p. 47) The maximum score is 24. This test is administered in grades K-1. Writing Vocabulary The student is asked to write down in ten minutes all the words he/she knows how to write, starting with his/her own name and making a personal list of words 1-28-020130 14 he/she has managed to learn. There is no maximum score. This test is administered in grades K-2. Hearing and Recording Sounds in Words The teacher asks the child to record a dictated sentence. The child's performance is scored by counting the child's representation of the sounds (phonemes) by letters (graphemes). The maximum score is 37 at grades K-1 and is 64 at grade 2. This test is administered in grades K-2. The Assessments: Developmental Reading Assessment The Developmental Reading Assessment is a one-on-one assessment of reading skillsprimarily accuracy of oral reading and comprehension through reading and re-telling of narrative stories. The assessment consists of stories that increase in difficulty. Factors which contribute to the gradient of difficulty of the stories include the number of words on a page, complexity of vocabulary, length of the stories, degree of support from the pictures, as well as complexity of sentence and story structure. The assessment formats are as follows: Levels A-2 (Kindergarten Grade Level), 7-8 minutes 1. Teacher selects book 2. Teacher introduces text 3. Teacher reads one or two pages 4. Child points and reads rest of story; teacher takes running record 5. Teacher asks print questions 6. Teacher asks preference questions Levels 3-16 (First Grade Level), 10-15 minutes 1. Teacher selects book 2. Teacher introduces text 3. Child looks at pictures; tells what is happening 4. Child reads story aloud; teacher takes running record 5. Child retells story 6. Teacher asks response questions 7. Teacher asks preference questions Levels 18-44 (Second Grade Level), 15-20 minutes 1. Teacher selects range of three texts 2. Child previews and chooses one 3. Teacher introduces text 4. Child reads first 2-4 paragraphs aloud 5. Child predicts what will happen in story 6. Child reads complete story silently in another location 7. Child retells story 8. Teacher asks response questions 9. Child reads selected portion of text; teacher takes running record 1-28-020131 15 I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I 10. Teacher asks preference questions 11. Teachers asks one or two inference questions (L~vels 28-44). "Readiness" levels for the Little Rock School District have been established as follows: Kindergarten- Level 2 Grade I-Level 16; and Grade 2-Level 24. 
The explanation below (developed in summer 2000) on "Definition of 'Readiness' vs. 'Proficiency"' is a delineation of the District's efforts to define appropriate cut scores for each grade level so that a determination could be made of the percent of students who are achieving a standard of "readiness" for success at the next grade level. Definition of "Readiness" vs. "Proficiency" The Arkansas Department of Education has defined performance at four levels: Below Basic, Basic, Proficient, and Advanced for the Benchmark examinations that are administered at grades 4, 6, and 8 and the end-of-level examinations for designated high school courses. "Proficient" is the performance standard that all students should achieve. The ADE definition follows: Proficient students demonstrate solid academic performance for the grade tested and are well-prepared for the next level of schooling. They can use Arkansas' established reading, writing, and mathematics skills and knowledge to solve problems and complete tasks on their own. Students can tie ideas together and explain the ways their ideas are connected. The Developmental Reading Assessment allows teachers to assess reading "levels" of students through a one-on-one test reading conference between teacher and student. Teachers observe student performance during the test, make notes on reading behaviors, and score the performance as they go along. The desire was to establish appropriate cut points that would define "proficient" performance. To gauge which "level" is equivalent to how Arkansas defines "proficiency," the staff used national reading standards for each grade level as defined in Reading and Writing Grade by Grade: Primary Literacy Standards from Kindergarten through Third Grade (New Standards Primary Literacy Committee, National Center on Education and the Economy and the University of Pittsburgh, 1999). The staff then identified the DRA level that corresponds to that specific performance. Standards and DRA equivalents by grade level follow: 1-28-020132 16 Grade Level Readinl! Standards DRA Level Kindergarten Children at the end of kindergarten should Assessment texts A through 2 consist of a repeated word or Gradel Grade 2 understand that every word in a text says sentence pattern with natural language structures. The simple something specific. They can demonstrate this illustrations include animals and objects familiar to primary competence by reading Level B books that they children and highly support the text. One or two lines of text have not seen before, but that have been appear on the left page and are large and well spaced so that previewed for them, attending to each word in children can point as they read. The number of words in the seouence and l!Cttinl! most of them correct. texts ranees from ten to thirtv-six . By the end of the year, we expect first-grade Assessment texts 16 through 28 arc stories with beginnings, students to be able to: middles, and ends, throughout which problems are presented read Level 16 books that they have not seen and resolved. The characters are either imaginary (giants and before, but that have been previewed for them, elves) or animals with human characteristics. The content with 90 percent or better accuracy of word begins to move beyond children 's personal experiences and recognition (self-correction allowed). builds a basis with which to compare and contrast other When they read aloud, we expect first graders stories. 
Literacy language structures are integrated with to sound like they know what they are reading. natural language. Some description of characters and setting Fluent readers may pause occasionally to work is included. Illustrations provide moderate to minimum out difficult passages. By the end of the year, support. The text may be three to twelve lines above or we expect first-grade students to be able to beneath the illustrations, or a full page. The number of words independently read aloud from Level I books in these texts starts at 266 and increases with each level of that have been previewed for them, using difficulty. intonation, pauses and emphasis that signal the structure of the sentence and the meaning of the text. By the end of the year, we expect second-grade Assessment texts 16 through 28 arc stories with beginnings, students to be able to independently read aloud middles, and ends, throughout which problems are presented unfamiliar Level 24 books with 90 percent or and resolved. The characters are either imaginary (giants and better accuracy of word recognition (self- elves) or animals with human characteristics. The content correction allowed). begins to move beyond children's personal experiences and builds a basis with which to compare and contrast other stories. Literacy language structures are integrated with natural language. Some description of characters and sett ing is included. Illustrations provide moderate to minimum support. The text may be three to twelve lines above or beneath the illustrations, or a full page. The number of words in these texts starts at 266 and increases with each level of difficultv. The staff also considered the work of others who use the DRA in their determination of appropriate cut points to define proficiency at each grade level. Several states and many school districts have adopted the DRA for early literacy assessment. One example is the chart establishing "proficiency levels" developed by the East Baton Rouge Parish School System in Louisiana. They have determined that "On Grade Level" is defined by a kindergarten student's performance at levels 1, 2 on the DRA; grade 1 is levels 16, 18; and grade 2 is levels 24, 28. "Above Grade Level" is defined as levels 3-14 at kindergarten; levels 20-28 at grade 1; and levels 30-38 at grade 2. In Lindsay, California, the "Approaching Proficiency" levels are defined similarly: level 2 at kindergarten; levels 10-12 at grade 1; and level 24 at grade 2. 1-28-020133 17 I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I A program evaluation conducted by the Austin, Texas, Independent School District indicates that the "Grade Level" performance on the DRA was defined as level 2 at kindergarten; level 16 at grade 1; and levels 24-28 at grade 2. The State of Ohio defined "Success Indicators" for reading for each grade level. These can be compared to the national standards developed by the National Center for Education and the Economy: At the end of kindergarten, children should be able to write in a left to right/top to bottom manner, have a firm grasp of letters and their sounds, and recognize a few simple words. By the end of first grade, students should be using and integrating phonics and reading strategies as they read, writing simple stories, reading independently, and demonstrating comprehension of stories through drawing, writing, discussion, and dramatization. 
By the end of second grade, students should be reading silently for extended periods and reading orally with appropriate use of punctuation. They should demonstrate that they can gather information by reading, predict how stories will end, compare and contrast story elements, sequence evens from a story, retell a story, and relate what they read to their lives. The State of Connecticut uses the Developmental Reading Assessment as a part of their state accountability system in grades 1-3. Grade 1 students who perform at or below level 10 and grade 2 students who perform at or below level 16 at the end of the year are identified as "substantially deficient." Such students then receive a personal or individual reading plan that outlines additional instructional support and monitors student progress-similar to the District's Student Academic Improvement Plan (SAIP). Although Connecticut does not identify grade-level proficiency levels, they have established the literacy standard for LEP students to exist the bilingual program: at kindergarten the student must perform at level 2; at grade 1 level 16; and at grade 2 level 28. Ve-rmont, likewise, uses the DRA in their state assessment program and has established similar levels of proficiency. Joetta Beaver, the developer of the Developmental Reading Assessment (published by Celebration Press in 1997), suggests that districts should define proficiency levels so that students performing below those levels receive necessary interventions and remediation. Her recommended proficiency levels are levels 1-2 for kindergarten; levels 16-18 at grade 1; and levels 24-28 at grade 2. All these efforts to define proficiency are either exactly aligned with the decisions made by LRSD staff or are very close. 1-28-020134 18 Given, however, the difficulty of establishing with confidence an equivalent definition of "proficiency" that would predict achievement on the grade 4 Benchmark examination, District staff members have made the decision to use what in their best judgment are the appropriate cut scores (based on all the research cited), but to use the term "Readiness" to define the desired performance. When the District has multiple years of data and when the 1999-2000 kindergarten students take the Grade 4 Benchmark examination in spring 2004, then the staff can do some statistical calculations that will enable the District to set cut scores that reliably predict "Proficient" performance on the grade 4 Benchmark. Reliability and Validity: National Study The development of the Developmental Reading Assessment began in 1988 by a team of teacher-researchers. According to the national validation study, "the purpose of the assessment was to guide teachers' ongoing observations of student progress over time within a literature-based reading program" (p. 2). Over the next six years there were numerous revisions in response to teacher feedback. In spring 1996 the first formal validation study was conducted. Seventy-eight teachers from various parts of the United States and Canada participated. (p. 3) The results of the study were very positive, and where the correlations were not as strong as they possibly could be, revisions to the instrument were made to strengthen validity. In summary, the DRA was found to be a valid assessment. 
Teachers found it very helpful in determining individual students' instructional text reading level; describing his/her performance as a reader; selecting appropriate interventions and/or focus for instruction; and identifying students who may be reading below proficiency (11). A reliability study of the Developmental Reading Assessment was conducted in spring 1999 by Dr. E. Jane Williams. In this study eighty-seven teachers from ten states participated. All had prior experience in administering the DRA .. The findings were that both the inter-rater reliability and the internal consistency of the test were strong to very strong (6). The construct validity of the DRA was also established through an additional study. Construct validity ensures that the test measures what was intended that it measure. The statistics for this study were done using DRA individual student scores compared to individual scores on the Iowa Test of Basic Skills. They correlated positively, and for the ITBS Total Reading subscale, very positively. The conclusion, then, was that "the DRA validly measures a child's ability to decode and understand/comprehend what he/she has read" (6). Of importance to the LRSD was another conclusion to this study: It should be noted that a major purpose of the DRA is to help guide instruction. Ninety-eight percent of the teachers and raters agreed or strongly agreed to the 19 I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I I statement that the information gained about the reader during the DRA conference helped them better identify things that the child needed to do or learn next ( 9). It was the intent of the design committee and is the intent of the curriculum staff that the multiple assessments selected for grades K-2 be used to drive instruction-for the data gathered from those assessments to be used to assist teachers in deciding what to do next for each individual child. LRSD embraces the joint position statement of the International Reading Association and the National Association for the Education of Young Children that was adopted in 1998: Throughout these critical years accurate assessment of children's knowledge, skills, and dispositions in reading and writing will help teachers better match instruction with how and what children are learning. However, early reading and writing cannot be measured as a set of narrowly defined skills on standardized tests. These measures often are not reliable or valid indicators of what children can do in typical practice, nor are they sensitive to language variation, culture, or the experience of young children. Rather, a sound assessment should be anchored in real-life writing and reading tasks ... and should support individualized diagnosis needed to help young children continue to progress in reading and writing" (20). Reliability and Validity: LRSD Study The following correlational matrix constructed by the District's statistician in spring 2001 displays the relationships between the scores on the Achievement Level Tests (AL Ts) and the Observation Survey and Developmental Reading Assessment scores. Correlational Matrix, Spring 2001 ALT Reading RIT, ALT Reading Goal RITs, Observation Survey, and DRA Scores Goal I : Word Goal 2: Goal 3: Goal 4: Observation Observation Observation Meaning Literal Interpretive Evaluative Survey: Survey: Survey: Compreben Compreben Comprehen Word Test Writing Dictation sion sion 1ion . 
Correlational Matrix, Spring 2001
ALT Reading RIT, ALT Reading Goal RITs, Observation Survey, and DRA Scores

                                     Goal 1:  Goal 2:  Goal 3:  Goal 4:  OS: Word  OS: Writing  OS:        DRA
                                     Word     Literal  Interp.  Eval.    Test      Vocabulary   Dictation
                                     Meaning  Comp.    Comp.    Comp.
Reading RIT Score                    0.937    0.940    0.922    0.917    0.280     0.467        0.638      0.788
Goal 1: Word Meaning                          0.839    0.805    0.815    0.255     0.438        0.602      0.733
Goal 2: Literal Comprehension                          0.823    0.822    0.223     0.418        0.577      0.724
Goal 3: Interpretive Comprehension                              0.795    0.199     0.410        0.535      0.696
Goal 4: Evaluative Comprehension                                         0.207     0.413        0.574      0.719
Observation Survey: Word Test                                                      0.276        0.351      0.360
Observation Survey: Writing Vocab.                                                              0.442      0.478
Observation Survey: Dictation                                                                              0.683

All correlations are significant at the .05 level. N's range from 1577 to 1684.

While all the relationships are significant at the .05 level, some relationships are stronger than others. All of the ALT scores relate strongly to the DRA, with values of .696 to .788. Among the Observation Survey sub-tests, only Hearing and Recording Sounds (Dictation) has a value above .50 with the DRA--.683. Also, the correlational values within the Observation Survey are lower. The staff anticipated this result, since the Observation Survey measures learning-how-to-read skills, while the Developmental Reading Assessment measures more difficult comprehension skills. The large sample size gives power to this matrix and contributes to significance even at apparently low correlational values.

The statistician subsequently ran a statistical test called Cronbach's Alpha, which is a reliability test for the internal consistency of an assessment. Reliability is a measure of a test's stability; that is, if one gives the same test more than once, a reliable test would produce a similar or the same result. A test with an acceptable Alpha indicates that the variability in scores is a result of the test taker, while a low Alpha indicates that the variability in scores is a result of a poorly designed or inconsistent test. A test with an Alpha of .60 or greater is usually considered to be internally consistent. The Alpha coefficients for the Observation Survey and the Developmental Reading Assessment for both fall and spring administrations are as follows:

            Kindergarten   Grade 1   Grade 2
Fall        .63            .66       .74
Spring      .85            .62       .65

Therefore, both the Observation Survey and the Developmental Reading Assessment appear to have stability and are internally consistent. The Alpha for the spring grade 2 Achievement Level Test is .97. These data indicate that the Developmental Reading Assessment is a valid and reliable test. The lower correlation values of the Observation Survey are more likely a product of those sub-tests measuring pre-reading knowledge and skills, as opposed to the reading comprehension skills measured on the grade 2 Achievement Level Test.
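For readers who wish to see how these two statistics are calculated, the following is a minimal sketch in Python. It is illustrative only--the small score arrays are hypothetical, and this is not the District statistician's actual code.

```python
# Illustrative sketch: a Pearson correlation (one cell of the matrix above)
# and Cronbach's Alpha. All numbers below are hypothetical.
import numpy as np

# Hypothetical paired scores: each position is one student (DRA level, ALT Reading RIT).
dra = np.array([16.0, 20.0, 12.0, 24.0, 18.0])
rit = np.array([175.0, 182.0, 168.0, 190.0, 178.0])
r = np.corrcoef(dra, rit)[0, 1]
print(f"Pearson r: {r:.3f}")

# Hypothetical sub-test scores: rows are students, columns are sub-tests.
items = np.array([
    [40.0, 10.0, 15.0],
    [50.0, 14.0, 20.0],
    [30.0,  6.0, 10.0],
    [54.0, 18.0, 22.0],
    [45.0, 12.0, 17.0],
])
k = items.shape[1]
item_var = items.var(axis=0, ddof=1)        # variance of each sub-test
total_var = items.sum(axis=1).var(ddof=1)   # variance of students' total scores
alpha = (k / (k - 1)) * (1 - item_var.sum() / total_var)
print(f"Cronbach's Alpha: {alpha:.2f}")
```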
Developmental Appropriateness of Testing Instruments

The sub-tests of the Observation Survey and the Developmental Reading Assessment are administered one-on-one by the classroom teacher to the student. The teacher scores the student's performance, based upon rubrics and scoring instructions provided to the teacher in a mandated training session and in writing. The teacher then bubbles in each child's level of performance on the answer sheet and sends those answer sheets to the Director of Early Literacy for processing and the compilation of scoring reports. One caution, therefore, in interpreting the data is that the teacher has scored his or her own students' performance, and bias may be possible. The District has conducted a procedure to verify the accuracy of the spring scores--those most likely to be influenced by bias.

Students' spring scores are matched with their fall scores the following year, and if there is a wide discrepancy, that score can be flagged. When there is a pattern of one teacher's spring scores being significantly higher than the same students' fall scores the next year, an investigation must be conducted. One school with suspiciously high spring scores was flagged for review in fall 2000. However, when the match of scores was run, the staff found absolutely no evidence of cheating. The fall 2000 scores were closely in line with those of the previous spring, even though the children in the fall were in several different schools and more than seven teachers administered the fall tests. The staff also has collected some anecdotal evidence that a few teachers may, in fact, be under-reporting student achievement rather than over-reporting it, due to their own low expectations.
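The matching-and-flagging check described above can be illustrated with a minimal sketch. The record layout, the 5-point threshold, and the function name below are hypothetical; this is only an illustration of the idea, not the District's actual procedure.

```python
# Illustrative sketch of a spring-to-next-fall verification check.
# Each record pairs one student's spring score with the same student's
# score the following fall; the threshold is hypothetical.
from statistics import mean

def flag_teachers(records, threshold=5.0):
    """Return teachers whose students' spring scores exceed the same
    students' next-fall scores by more than `threshold` points on average."""
    drops_by_teacher = {}
    for rec in records:
        drop = rec["spring"] - rec["next_fall"]
        drops_by_teacher.setdefault(rec["teacher"], []).append(drop)
    return [t for t, drops in drops_by_teacher.items() if mean(drops) > threshold]

# Hypothetical data: Teacher A's spring scores far exceed the scores the same
# children earned the next fall; Teacher B's do not.
records = [
    {"teacher": "A", "spring": 20, "next_fall": 9},
    {"teacher": "A", "spring": 22, "next_fall": 12},
    {"teacher": "B", "spring": 15, "next_fall": 14},
    {"teacher": "B", "spring": 17, "next_fall": 15},
]
print(flag_teachers(records))  # ['A'] -- flagged for review, not proof of over-reporting
```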
To avoid even the appearance of bias, some would recommend that the District use a standardized examination, with individual students marking their own answers and the answer sheets scored by machine. The problem with this approach is that the results would likely be even more questionable than the ones produced through one-on-one testing. Experts in early literacy and in early education have developed strongly stated positions against the use of standardized tests for young children, ages 3 through 8. For example, a position statement, Learning to Read and Write: Developmentally Appropriate Practices for Young Children, was issued in 1998 by the International Reading Association (IRA) and the National Association for the Education of Young Children (NAEYC). The section on assessment follows:

Group-administered, multiple-choice standardized achievement tests in reading and writing skills should not be used before third grade or preferably even before fourth grade. The younger the child, the more difficult it is to obtain valid and reliable indices of his or her development and learning using one-time test administrations. Standardized testing has a legitimate function, but on its own it tends to lead to standardized teaching--one approach fits all--the opposite of the kind of individualized diagnosis and teaching that is needed to help young children continue to progress in reading and writing (11).

A 1987 position paper by NAEYC, Standardized Testing of Young Children 3 Through 8 Years of Age, is even more explicit:

Young children are not good test takers. The younger the child, the more inappropriate paper-and-pencil, large-group test administrations become. Standards for administration of tests require that reasonable comfort be provided to the test taker (AERA, APA, & NCME, 1985). Such a standard must be broadly interpreted when applied to young children. Too often, standardized tests are administered to children in large groups, in unfamiliar environments, by strange people, perhaps during the first few days of school or under other stressful conditions. During such test administrations, children are asked to perform unfamiliar tasks, for no reason that they can understand. For test results to be valid, tests are best administered to children individually in familiar, comfortable circumstances by adults whom the child has come to know and trust and who are also qualified to administer the tests (5).

In conclusion, therefore, the staff made the determination that the Observation Survey and the Developmental Reading Assessment met all the criteria for selecting good assessment instruments for the children in K-2 classrooms. They were closely aligned with the curriculum and teaching strategies that were to be used by teachers; they measured the learning-to-read skills that were essential for children becoming independent readers; they provided teachers with necessary diagnostic and summative data; they were developmentally appropriate; their administration procedures met test administration standards for young children; and their results were much more likely to be valid and reliable than if a standardized test were used.

The Assessments: Achievement Level Tests in Reading and Language Usage

The Achievement Level Test (ALT) at grade 2 in reading and language usage was first administered in spring 2000. The ALTs are a series of tests that are aligned with the Little Rock School District curriculum and the Arkansas state standards. Because the scores lie along one continuum over the grade levels, they allow staff and others who are interested to calculate the amount of growth for individual students, classrooms, schools, and the District as a whole from year to year. With the ALTs, students take tests at a level that matches their current achievement level. The test should be challenging, but neither too difficult nor too easy. Because the tests match the achievement level of the student, teachers receive accurate information that helps them to monitor each student's academic growth. ALTs are not timed, and they take about one hour per subject for most students. The District scores the ALTs, and the results are returned to the schools as quickly as possible, sometimes within 48 hours. Any necessary retesting is completed first, so school reports cannot be printed until all testing is finished, and district reports cannot be completed until all schools finish their testing. Reports are also produced for parents, teachers, and administrators. Once a student has been through two administrations of the ALTs, a trend report is produced for parents that allows them to monitor the growth of their child compared to the growth of the District and the growth of the national group that takes the test.

Student progress is reported in a scale score called the Rasch Unit (RIT), an equal-interval measure. It can be compared to measuring a child's physical growth in inches and then comparing that growth to an expected growth chart: the test measures achievement growth on the RIT scale and compares the growth to an expected national growth chart. By monitoring the growth of students, staff can pinpoint areas where individual students might need extra help or attention. District staff and Campus Leadership Teams use the information to make data-driven decisions about school improvement plans, curriculum and instructional changes, and professional development needs. The scores are also used in program evaluations.
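As a rough illustration of the growth comparison described above, the sketch below compares one student's fall-to-spring RIT growth with an expected growth value. The expected-growth figures are hypothetical placeholders, not the actual national norms used with the ALTs.

```python
# Illustrative sketch: compare observed RIT growth with an expected value.
# The expected-growth numbers are hypothetical, not the published norms.
EXPECTED_RIT_GROWTH = {2: 12.0, 3: 10.0}  # hypothetical points per year, by grade

def growth_report(grade, fall_rit, spring_rit):
    growth = spring_rit - fall_rit
    expected = EXPECTED_RIT_GROWTH[grade]
    status = "at or above expected growth" if growth >= expected else "below expected growth"
    return growth, expected, status

print(growth_report(2, 176, 189))  # (13, 12.0, 'at or above expected growth')
```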
There are four goals/standards that are measured on the reading sub-test:

1. Word Meaning
   A. Phonetic skills
   B. Context clues
   C. Synonyms, antonyms, homonyms
   D. Component structure (prefix, suffix, origin, roots)
   E. Multiple meanings
2. Literal Comprehension
   A. Recall/identify significant details
   B. Identify main idea
   C. Locate information
   D. Follow directions
   E. Sequence details
3. Interpretive Comprehension
   A. Inference
   B. Identify cause and effect
   C. Author's purpose
   D. Prediction
   E. Summarize
   F. Identify literary elements (character, plot, setting, theme, etc.)
4. Evaluative Comprehension
   A. Evaluate conclusions, validity (supporting context)
   B. Identify fact and opinion
   C. Identify literary techniques (figurative language, mood, tone, etc.)
   D. Distinguish text forms
   E. Identify bias, stereotypes

Three goals/standards are tested on the Language Usage sub-test:

1. Writing Process
   A. Prewriting skills
   B. Drafting and revising
   C. Editing/proofreading
   D. Choosing appropriate format
   E. Sentence choice appropriate to purpose
   F. Paragraph skills (topic and concluding sentences, indenting, etc.)
2. Grammar and Usage
   A. Sentence patterns
   B. Phrases and clauses
   C. Noun forms
   D. Verb usage: tenses, irregular verbs, subject-verb agreement
   E. Adjective forms
   F. Adverb forms
   G. Pronoun forms
   H. Pronoun-antecedent agreement
   I. Negative forms
3. Mechanics
   A. End punctuation
   B. Commas
   C. Apostrophes
   D. Enclosing punctuation
   E. Titles
   F. Beginning capitalization
   G. Proper nouns and adjectives
   H. Capital I

The staff made a deliberate decision to delay the use of this formal, group-administered test until the end of second grade. Even then, many teachers, principals, central office staff, and parents question its usefulness in measuring learning-to-read skills and knowledge. The data are included in this program evaluation because they exist and because they provide another measurement of student achievement that may be used to inform decision-making about the program.

IV. Alignment with National Research on Early Literacy

Background on the Context: National Research on Early Literacy

A publication of the National Research Council (1998), Preventing Reading Difficulties in Young Children, is nationally recognized, and it was used to a high degree in the design of the LRSD PreK-2 literacy program. Below is a short summary of the report's recommendations for early learners:

Prekindergarten: Preschool programs ... should be designed to provide optimal support for cognitive, language, and social development. Within this broad focus, however, ample attention should be paid to skills that are known to predict future reading achievement, especially those for which a causal role has been demonstrated.

Kindergarten: Kindergarten instruction should be designed to stimulate verbal interaction; to enrich children's vocabularies; to encourage talk about books; to provide practice with the sound structure of words; to develop knowledge about print, including the production and recognition of letters; and to generate familiarity with the basic purposes and mechanisms of reading.

Beginning readers need explicit instruction and practice that lead to an appreciation that spoken words are made up of smaller units of sounds, familiarity with spelling-sound correspondences and common spelling conventions and their use in identifying printed words, "sight" recognition of frequent words, and independent reading, including reading aloud. Fluency should be promoted through practice with a wide variety of well-written and engaging texts at the child's own comfortable reading level.
Children who have started to read independently, typically second graders and above, should be encouraged to sound out and confirm the identities of visually unfamiliar words they encounter in the course of reading meaningful texts, recognizing words primarily through attention to their letter-sound relationships. Although context and pictures can be used as a tool to monitor word recognition, children should not be taught to use them to substitute for information provided by the letters in the word.

Because the ability to obtain meaning from print depends so strongly on the development of word recognition accuracy and reading fluency, both of the latter should be regularly assessed in the classroom, permitting timely and effective instructional response when difficulty or delay is apparent.

Beginning in the earliest grades, instruction should promote comprehension by actively building linguistic and conceptual knowledge in a rich variety of domains, as well as through direct instruction about comprehension strategies such as summarizing the main idea, predicting events and outcomes of upcoming texts, drawing inferences, and monitoring for coherence and misunderstandings. This instruction can take place while adults read to students or when students read themselves.

Once children learn some letters, they should be encouraged to write them, to use them to begin writing words or parts of words, and to use words to begin writing sentences. Instruction should be designed with the understanding that the use of invented spelling is not in conflict with teaching correct spelling. Beginning writing with invented spelling can be helpful for developing understanding of the identity and segmentation of speech sounds and sound-spelling relationships. Conventionally correct spelling should be developed through focused instruction and practice. Primary-grade children should be expected to spell previously studied words and spelling patterns correctly in their final written products. Writing should take place regularly and frequently to encourage children to become more comfortable and familiar with it.

Throughout the early grades, time, materials, and resources should be provided with two goals: (a) to support daily independent reading of texts selected to be of particular interest for the individual student, and beneath the individual student's capacity for independent reading, and (b) to support daily assisted or supported reading and rereading of texts that are slightly more difficult in wording or in linguistic, rhetorical, or conceptual structure in order to promote advances in the student's capabilities.

Throughout the early grades, schools should promote independent reading outside school by such means as daily at-home reading assignments and expectations, summer reading lists, encouraging parent involvement, and working with community groups, including public librarians, who share this goal (7-9).

Similar research is quoted, and similar recommendations are found, in an earlier study from the Center for the Study of Reading at the University of Illinois at Urbana-Champaign (1990), Beginning to Read: Thinking and Learning about Print by Marilyn Jager Adams. Then in April 2000, with the publication of the findings of the National Reading Panel in their report, Teaching Children to Read: An Evidence-Based Assessment of the Scientific Research Literature on Reading and Its Implications for Reading Instruction, one finds similar findings and recommendations.
The research-based practices for kindergarten and primary grades advocated by the International Reading Association (IRA) and the National Association for the Education of Young Children (NAEYC) in their 1998 position paper, Learning to Read and Write: Developmentally Appropriate Practices for Young Children, follow:

- daily experiences of being read to and independently reading meaningful and engaging stories and informational texts;
- a balanced instructional program that includes systematic code instruction along with meaningful reading and writing activities;
- daily opportunities and teacher support to write many kinds of texts for different purposes, including stories, lists, messages to others, poems, reports, and responses to literature;
- writing experiences that allow the flexibility to use nonconventional forms of writing at first (invented or phonic spelling) and over time move to conventional forms;
- opportunities to work in small groups for focused instruction and collaboration with other children;
- an intellectually engaging and challenging curriculum that expands knowledge of the world and vocabulary; and
- adaptation of instructional strategies or more individualized instruction if the child fails to make expected progress in reading or when literacy skills are advanced (10).

This research base undergirds the work of Linda Dorn of the University of Arkansas at Little Rock, developer of the Arkansas Early Literacy and Literacy Coach model that is recommended by the Arkansas Department of Education and was adopted by the Little Rock School District.

The alignment between the research on what works in early literacy and the assessments selected by the District to measure children's progress in these pre-reading and early reading skills should be evident when comparing the list of recommended practices cited above with the descriptions of what is tested in the assessments described in the preceding section. For example, "knowledge about print" is assessed in the sub-test on the Observation Survey called "Concepts about Print." The "production and recognition of letters" is assessed in "Letter Identification." "Recognition of frequent words" is assessed in the sub-test "Word Test." "Word recognition and reading fluency" are tested in the "Word Test" and on the Developmental Reading Assessment. "Writing words and parts of words" is tested in "Writing Vocabulary." "Linguistic and conceptual knowledge" is tested on the Developmental Reading Assessment, in "Writing Vocabulary," and in "Hearing/Recording Sounds."

V. Description of Tables

Numerous tables displaying test data for each of the three assessments used in K-2 literacy are included in this section. Only the District-level results are reported here; the tables displaying data for each school on the Observation Survey, the Developmental Reading Assessment, and the Achievement Level Test are in Section IX. Tables 1A, 2A, etc., include a calculation of the "Percent Improvement." Some statisticians do not see value in this calculation, since it can sometimes mislead a reader. For instance, it is possible to show a greater percent of improvement for a low-performing group than for a higher-performing group, even when the lower group gained fewer total points than the higher group.
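A small worked example of this caveat, sketched in Python with hypothetical numbers, is shown below.

```python
# Hypothetical illustration of the "Percent Improvement" caveat: the lower-
# performing group gains fewer points yet shows a larger percent improvement,
# because the improvement is computed relative to a much smaller fall score.
def percent_improvement(fall, spring):
    return (spring - fall) / fall * 100

low_fall, low_spring = 2.0, 6.0       # gained 4 points
high_fall, high_spring = 10.0, 20.0   # gained 10 points

print(percent_improvement(low_fall, low_spring))    # 200.0 (percent)
print(percent_improvement(high_fall, high_spring))  # 100.0 (percent)
```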
On the other hand, many readers are familiar with the calculation, since it is commonly used in the news media to report changes in stock prices, changes in the crime rate, and other issues of interest to the general public. The evaluators made a decision to leave the calculation in the tables in Section V and in the analysis of those tables in Section VI so that readers may draw their own conclusions about its use. It is not used, however, in any way in arriving at the findings or the recommendations for improvement in Section VIII.

Table 1 Description

Table 1 displays the mean performance levels of kindergarten black and non-black students in 1999-2000--both the fall pre-test and the spring post-test scores. The third set of data in this table, "B/NB Ratio," is a calculation of the black student scores divided by the non-black student scores, as a method of determining the achievement gap at each stage of testing. For instance, black students entered kindergarten in fall 1999 scoring 0.95 on the DRA, as compared to non-black students, who scored 2.72. If one divides 0.95 by 2.72, one finds that entering black kindergarten students' scores were 35 percent of non-black kindergarten students' scores.

The "Growth" column in the first two sets of data is simply a subtraction of the fall scores from the spring scores to determine the year's growth. One can compare the "Growth" columns for black and non-black students to determine whether black students were growing at the same pace as non-black students in terms of total points. The "Growth" column in the third set of columns, "B/NB Ratio," is a calculation of the number of points gained by black kindergarten students divided by the number of points gained by non-black kindergarten students. This ratio defines the degree to which black student growth approximates non-black student growth over the year. Where this ratio is equal to or more than 100 percent, black student growth for the year equaled or exceeded non-black student growth.

Table 1: Kindergarten, 1999-2000, Fall to Spring Black and Non-Black Performance
                         Black Students            Non-Black Students        B/NB Ratio
Sub-Test                 Fall    Spring  Growth    Fall    Spring  Growth    Fall   Spring  Growth
Letter Identification    27.59   48.48   20.89     34.08   50.30   16.22     81%    96%     129%
Word Test                 1.75   11.33    9.58      3.05   14.91   11.86     57%    76%      81%
Concepts about Print      6.54   14.30    7.76      9.50   17.56    8.06     69%    81%      96%
Writing Vocabulary        2.93   14.50   11.57      4.70   22.13   17.43     62%    66%      66%
Hearing/Recording         3.58   17.02   13.44      6.66   24.37   17.71     54%    70%      76%
DRA                       0.95    3.09    2.14      2.72    7.12    4.40     35%    43%      49%
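The calculations behind these columns can be sketched in a few lines of Python, using the DRA row of Table 1 as the example. This is a minimal illustration of the arithmetic described above, not the evaluators' code.

```python
# Growth, B/NB ratios, and percent improvement for the DRA row of Table 1.
black_fall, black_spring = 0.95, 3.09
nonblack_fall, nonblack_spring = 2.72, 7.12

black_growth = black_spring - black_fall            # 2.14
nonblack_growth = nonblack_spring - nonblack_fall   # 4.40

fall_ratio = black_fall / nonblack_fall             # -> 35%
spring_ratio = black_spring / nonblack_spring       # -> 43%
growth_ratio = black_growth / nonblack_growth       # -> 49%
black_pct_improvement = black_growth / black_fall   # -> 225% (see Table 1A)

print(f"{fall_ratio:.0%} {spring_ratio:.0%} {growth_ratio:.0%} {black_pct_improvement:.0%}")
```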
Table 1A Description

Table 1A includes the same data for 1999-2000 as Table 1, except for two columns. In addition to the number of points of "Growth" displayed in Table 1, Table 1A includes for both black and non-black students a column called "Percent Improvement." This column indicates the rate of growth; that is, the number of growth points in Table 1 for a given sub-test was divided by the fall score to calculate the growth rate for that year. By comparing the two columns, one can determine whether black students grew at a rate less than, equal to, or greater than the rate of non-black growth on each sub-test.

Table 1A: Kindergarten, 1999-2000, Fall to Spring Black and Non-Black Performance, with Percent Improvement
                         Black Students                      Non-Black Students
Sub-Test                 Fall    Spring  Growth  % Imprv.    Fall    Spring  Growth  % Imprv.
Letter Identification    27.59   48.48   20.89    76%        34.08   50.30   16.22    48%
Word Test                 1.75   11.33    9.58   547%         3.05   14.91   11.86   389%
Concepts about Print      6.54   14.30    7.76   119%         9.50   17.56    8.06    85%
Writing Vocabulary        2.93   14.50   11.57   395%         4.70   22.13   17.43   371%
Hearing/Recording         3.58   17.02   13.44   375%         6.66   24.37   17.71   266%
DRA                       0.95    3.09    2.14   225%         2.72    7.12    4.40   162%

Table 2 Description

Table 2 includes the same data as Table 1 for kindergarten students, except for school year 2000-01.

Table 2: Kindergarten, 2000-01, Fall to Spring Black and Non-Black Performance
                         Black Students            Non-Black Students        B/NB Ratio
Sub-Test                 Fall    Spring  Growth    Fall    Spring  Growth    Fall   Spring  Growth
Letter Identification    27.43   49.38   21.95     33.02   51.06   18.04     83%    97%     122%
Word Test                 1.38   13.41   12.03      2.59   16.32   13.73     53%    82%      88%
Concepts about Print      5.95   16.02   10.07      8.30   18.41   10.11     72%    87%     100%
Writing Vocabulary        1.96   18.82   16.86      3.36   26.42   23.06     58%    71%      73%
Hearing/Recording         2.16   19.59   17.43      4.66   25.69   21.03     46%    76%      83%
DRA                       0.35    3.56    3.21      0.85    7.47    6.62     41%    48%      48%

Table 2A Description

Table 2A includes the same data as Table 1A for kindergarten students, except for school year 2000-01.

Table 2A: Kindergarten, 2000-01, Fall to Spring Black and Non-Black Performance, with Percent Improvement
                         Black Students                      Non-Black Students
Sub-Test                 Fall    Spring  Growth  % Imprv.    Fall    Spring  Growth  % Imprv.
Letter Identification    27.43   49.38   21.95    80%        33.02   51.06   18.04    55%
Word Test                 1.38   13.41   12.03   872%         2.59   16.32   13.73   530%
Concepts about Print      5.95   16.02   10.07   169%         8.30   18.41   10.11   122%
Writing Vocabulary        1.96   18.82   16.86   860%         3.36   26.42   23.06   686%
Hearing/Recording         2.16   19.59   17.43   807%         4.66   25.69   21.03   451%
DRA                       0.35    3.56    3.21   917%         0.85    7.47    6.62   779%

Table 3 Description

Table 3 includes the same data for 1999-2000 as Table 1, except that Table 3 displays grade 1 data.

Table 3: Grade 1, 1999-2000, Fall to Spring Black and Non-Black Performance
                         Black Students            Non-Black Students        B/NB Ratio
Sub-Test                 Fall    Spring  Growth    Fall    Spring  Growth    Fall   Spring  Growth
Letter Identification    47.44   52.80    5.36     49.54   52.96    3.42     96%   100%     157%
Word Test                 5.75   16.87   11.12      7.89   18.34   10.45     73%    92%     106%
Concepts about Print     13.81   19.46    5.65     15.70   20.91    5.21     88%    93%     108%
Writing Vocabulary       13.54   37.11   23.57     15.65   44.04   28.39     87%    84%      83%
Hearing/Recording        17.25   30.87   13.62     21.98   34.11   12.13     78%    91%     112%
DRA                       4.29   16.67   12.38      6.68   24.37   17.69     64%    68%      70%

Table 3A Description

Table 3A includes the same data for 1999-2000 as Table 1A, except that Table 3A displays grade 1 data.

Table 3A: Grade 1, 1999-2000, Fall to Spring Black and Non-Black Performance, with Percent Improvement
                         Black Students                      Non-Black Students
Sub-Test                 Fall    Spring  Growth  % Imprv.    Fall    Spring  Growth  % Imprv.
Letter Identification    47.44   52.80    5.36    11%        49.54   52.96    3.42     7%
Word Test                 5.75   16.87   11.12   193%         7.89   18.34   10.45   132%
Concepts about Print     13.81   19.46    5.65    41%        15.70   20.91    5.21    33%
Writing Vocabulary       13.54   37.11   23.57   174%        15.65   44.04   28.39   181%
Hearing/Recording        17.25   30.87   13.62    79%        21.98   34.11   12.13    55%
DRA                       4.29   16.67   12.38   289%         6.68   24.37   17.69   265%

Table 4 Description

Table 4 displays the same data for 2000-01 as Table 2, except that Table 4 displays grade 1 data.
Table 4: Grade 1, 2000-01, Fall to Spring Black and Non-Black Performance
                         Black Students            Non-Black Students        B/NB Ratio
Sub-Test                 Fall    Spring  Growth    Fall    Spring  Growth    Fall   Spring  Growth
Letter Identification    48.95   53.01    4.06     49.66   53.08    3.42     99%   100%     119%
Word Test                 5.81   17.33   11.52      8.49   18.53   10.04     68%    94%     115%
Concepts about Print     13.51   19.76    6.25     16.11   21.22    5.11     84%    93%     122%
Writing Vocabulary       12.94   40.16   27.22     16.15   45.44   29.29     80%    88%      93%
Hearing/Recording        17.49   31.70   14.21     23.55   34.40   10.85     74%    92%     131%
DRA                       3.72   17.94   14.22      7.95   25.41   17.46     47%    71%      81%

Table 4A Description

Table 4A displays the same data for 2000-01 as Table 2A, except that Table 4A displays grade 1 data.

Table 4A: Grade 1, 2000-01, Fall to Spring Black and Non-Black Performance, with Percent Improvement
                         Black Students                      Non-Black Students
Sub-Test                 Fall    Spring  Growth  % Imprv.    Fall    Spring  Growth  % Imprv.
Letter Identification    48.95   53.01    4.06     8%        49.66   53.08    3.42     7%
Word Test                 5.81   17.33   11.52   198%         8.49   18.53   10.04   118%
Concepts about Print     13.51   19.76    6.25    46%        16.11   21.22    5.11    32%
Writing Vocabulary       12.94   40.16   27.22   210%        16.15   45.44   29.29   181%
Hearing/Recording        17.49   31.70   14.21    81%        23.55   34.40   10.85    46%
DRA                       3.72   17.94   14.22   382%         7.95   25.41   17.46   220%

Table 5 Description

Table 5 includes the same data for 1999-2000 as Table 1, except that Table 5 displays grade 2 data. Letter Identification and Concepts about Print are not administered after grade 1.

Table 5: Grade 2, 1999-2000, Fall to Spring Black and Non-Black Performance
                         Black Students            Non-Black Students        B/NB Ratio
Sub-Test                 Fall    Spring  Growth    Fall    Spring  Growth    Fall   Spring  Growth
Word Test                16.11   18.93    2.82     18.07   19.80    1.73     89%    96%     163%
Writing Vocabulary       35.09   50.27   15.18     36.91   60.99   24.08     95%    82%      63%
Hearing/Recording        42.16   50.34    8.18     48.96   57.17    8.21     86%    88%     100%
DRA                      17.81   27.92   10.11     24.21   36.00   11.79     74%    78%      86%

Table 5A Description

Table 5A displays the same data for 1999-2000 as Table 1A, except that Table 5A displays grade 2 data.

Table 5A: Grade 2, 1999-2000, Fall to Spring Black and Non-Black Performance, with Percent Improvement
                         Black Students                      Non-Black Students
Sub-Test                 Fall    Spring  Growth  % Imprv.    Fall    Spring  Growth  % Imprv.
Word Test                16.11   18.93    2.82    18%        18.07   19.80    1.73    10%
Writing Vocabulary       35.09   50.27   15.18    43%        36.91   60.99   24.08    65%
Hearing/Recording        42.16   50.34    8.18    19%        48.96   57.17    8.21    17%
DRA                      17.81   27.92   10.11    57%        24.21   36.00   11.79    49%

Table 6 Description

Table 6 displays the same data for 2000-01 as Table 2, except that Table 6 displays grade 2 data.

Table 6: Grade 2, 2000-01, Fall to Spring Black and Non-Black Performance
                         Black Students            Non-Black Students        B/NB Ratio
Sub-Test                 Fall    Spring  Growth    Fall    Spring  Growth    Fall   Spring  Growth
Word Test                16.00   18.06    2.06     17.60   18.91    1.31     91%    96%     157%
Writing Vocabulary       29.80   55.76   25.96     35.43   63.97   28.54     84%    87%      91%
Hearing/Recording        45.50   51.60    6.10     52.44   56.78    4.34     87%    91%     141%
DRA                      18.20   28.75   10.55     26.01   35.88    9.87     70%    80%     107%

Table 6A Description

Table 6A displays the same data for 2000-01 as Table 2A, except that Table 6A displays grade 2 data.
Table 6A: Grade 2, 2000-01, Fall to Spring Black and Non-Black Performance, with Percent Improvement
                         Black Students                      Non-Black Students
Sub-Test                 Fall    Spring  Growth  % Imprv.    Fall    Spring  Growth  % Imprv.
Word Test                16.00   18.06    2.06    13%        17.60   18.91    1.31     7%
Writing Vocabulary       29.80   55.76   25.96    87%        35.43   63.97   28.54    81%
Hearing/Recording        45.50   51.60    6.10    13%        52.44   56.78    4.34     8%
DRA                      18.20   28.75   10.55    58%        26.01   35.88    9.87    38%

Table 7 Description

Table 7 displays black and non-black students' performance for a cohort of students as they moved from kindergarten in fall 1999-2000 to the end of grade 1 in spring 2000-01. In other words, the table displays the evidence of two years of growth. Although the data include all students enrolled for the full year in each of the two years, not just those who were in LRSD for both years, they provide a good picture of the growth of a cohort of students over a two-year period, whereas Tables 1-6 compared different groups of students at a given grade level. The black/non-black ratios displayed in the third set of columns were calculated by dividing the black student scores by the non-black scores. A growth ratio at or above 100 percent in the last column indicates that black growth over the two-year period equaled or exceeded non-black growth, based on total points of growth.

Table 7: Cohort 1, Kindergarten Fall 1999-2000 and Grade 1 Spring 2000-01
                         Black Students              Non-Black Students          B/NB Ratio
Sub-Test                 Fall 99  Spr. 01  Growth    Fall 99  Spr. 01  Growth    Fall   Spring  Growth
Letter Identification    27.59    53.01    25.42     34.08    53.08    19.00     81%   100%     134%
Word Test                 1.75    17.33    15.58      3.05    18.53    15.48     57%    94%     101%
Concepts about Print      6.54    19.76    13.22      9.50    21.22    11.72     69%    93%     113%
Writing Vocabulary        2.93    40.16    37.23      4.70    45.44    40.74     62%    88%      91%
Hearing/Recording         3.58    31.70    28.12      6.66    34.40    27.74     54%    92%     101%
DRA                       0.95    17.94    16.99      2.72    25.41    22.69     35%    71%      75%

Table 7A Description

Table 7A displays the same data as Table 7 except that, instead of only the number of "Growth" points, the columns for black and non-black students also show "Percent Improvement." This calculation is the result of dividing the number of growth points in Table 7 by the fall 1999 score for black and then non-black students to determine the growth rate. A comparison of these two columns reveals the degree to which the program is especially effective for African-American students, as compared to non-black students.

Table 7A: Cohort 1--Percent Improvement, Kindergarten Fall 1999-2000 and Grade 1 Spring 2000-01
                         Black Students                         Non-Black Students
Sub-Test                 Fall 99  Spr. 01  Growth  % Imprv.     Fall 99  Spr. 01  Growth  % Imprv.
Letter Identification    27.59    53.01    25.42      92%       34.08    53.08    19.00     56%
Word Test                 1.75    17.33    15.58     890%        3.05    18.53    15.48    508%
Concepts about Print      6.54    19.76    13.22     202%        9.50    21.22    11.72    123%
Writing Vocabulary        2.93    40.16    37.23    1271%        4.70    45.44    40.74    867%
Hearing/Recording         3.58    31.70    28.12     785%        6.66    34.40    27.74    417%
DRA                       0.95    17.94    16.99    1788%        2.72    25.41    22.69    834%

Table 8 Description

Table 8 is similar to Table 7, except that the cohort data displayed are for fall 1999-2000 grade 1 black and non-black students and spring 2000-01 grade 2 black and non-black students. Letter Identification and Concepts about Print were not administered after grade 1.
Table 8: Cohort 2, Grade 1 Fall 1999-2000 and Grade 2 Spring 2000-01
                         Black Students              Non-Black Students          B/NB Ratio
Sub-Test                 Fall 99  Spr. 01  Growth    Fall 99  Spr. 01  Growth    Fall   Spring  Growth
Letter Identification    47.44    N/A      N/A       49.54    N/A      N/A       96%    N/A     N/A
Word Test                 5.75    18.06    12.31      7.89    18.91    11.02     73%    96%     112%
Concepts about Print     13.81    N/A      N/A       15.70    N/A      N/A       88%    N/A     N/A
Writing Vocabulary       13.54    55.76    42.22     15.65    63.97    48.32     87%    87%      87%
Hearing/Recording        17.25    51.60    34.35     21.98    56.78    34.80     78%    91%      99%
DRA                       4.29    28.75    24.46      6.68    35.88    29.20     64%    80%      84%

Table 8A Description

See the descriptions of Tables 7A and 8 above.

Table 8A: Cohort 2--Percent Improvement, Grade 1 Fall 1999-2000 and Grade 2 Spring 2000-01
                         Black Students                         Non-Black Students
Sub-Test                 Fall 99  Spr. 01  Growth  % Imprv.     Fall 99  Spr. 01  Growth  % Imprv.
Letter Identification    47.44    N/A      N/A       N/A        49.54    N/A      N/A      N/A
Word Test                 5.75    18.06    12.31     214%        7.89    18.91    11.02    140%
Concepts about Print     13.81    N/A      N/A       N/A        15.70    N/A      N/A      N/A
Writing Vocabulary       13.54    55.76    42.22     312%       15.65    63.97    48.32    309%
Hearing/Recording        17.25    51.60    34.35     199%       21.98    56.78    34.80    158%
DRA                       4.29    28.75    24.46     570%        6.68    35.88    29.20    437%

Table 9 Description

Table 9 displays the kindergarten, grade 1, and grade 2 performance of all students in 1999-2000, including the amount of fall-to-spring growth on each sub-test. This table includes only those students who were present for both fall and spring testing, not all those enrolled.

Table 9: Grades K-2, 1999-2000, Fall to Spring Performance, All Students
                         Kindergarten              Grade 1                   Grade 2
Sub-Test                 Fall    Spring  Growth    Fall    Spring  Growth    Fall    Spring  Growth
Letter Identification    29.72   49.05   19.33     48.11   52.86    4.75     N/A     N/A     N/A
Word Test                 2.18   12.48   10.30      6.43   17.34   10.91     16.76   19.23    2.47
Concepts about Print      7.52   15.37    7.85     14.41   19.91    5.50     N/A     N/A     N/A
Writing Vocabulary        3.51   16.99   13.48     14.20   39.30   25.10     35.71   53.80   18.09
Hearing/Recording         4.59   19.41   14.82     18.75   31.89   13.14     44.34   52.51    8.17
DRA                       1.52    4.40    2.88      5.05   19.11   14.06     19.85   30.50   10.65

Table 9A Description

Table 9A calculates the growth rate for all students from fall to spring in 1999-2000.

Table 9A: Grades K-2, 1999-2000, Fall to Spring Performance, All Students, with Percent Improvement
                         Kindergarten                Grade 1                     Grade 2
Sub-Test                 Fall    Spring  % Imprv.    Fall    Spring  % Imprv.    Fall    Spring  % Imprv.
Letter Identification    29.72   49.05    65%        48.11   52.86    10%        N/A     N/A     N/A
Word Test                 2.18   12.48   472%         6.43   17.34   170%        16.76   19.23    15%
Concepts about Print      7.52   15.37   104%        14.41   19.91    38%        N/A     N/A     N/A
Writing Vocabulary        3.51   16.99   384%        14.20   39.30   177%        35.71   53.80    51%
Hearing/Recording         4.59   19.41   323%        18.75   31.89    70%        44.34   52.51    18%
DRA                       1.52    4.40   189%         5.05   19.11   278%        19.85   30.50    54%

Table 10 Description

Table 10 is similar to Table 9 except that it includes 2000-01 data for all students.
Table 10: Grades K-2, 2000-01, Fall to Spring Performance, All Students
                         Kindergarten              Grade 1                   Grade 2
Sub-Test                 Fall    Spring  Growth    Fall    Spring  Growth    Fall    Spring  Growth
Letter Identification    29.05   49.79   20.74     49.07   53.02    3.95     N/A     N/A     N/A
Word Test                 1.81   14.29   12.48      6.68   17.67   10.99     16.48   18.33    1.85
Concepts about Print      6.67   16.75   10.08     14.29   20.21    5.92     N/A     N/A     N/A
Writing Vocabulary        2.41   21.07   18.66     14.02   41.72   27.70     31.59   58.35   26.76
Hearing/Recording         3.00   21.42   18.42     19.46   32.48   13.02     47.53   53.07    5.54
DRA                       0.52    4.80    4.28      5.10   20.24   15.14     20.56   30.93   10.37

Table 10A Description

See the descriptions of Tables 9A and 10 above.

Table 10A: Grades K-2, 2000-01, Fall to Spring Performance, All Students, with Percent Improvement
                         Kindergarten                Grade 1                     Grade 2
Sub-Test                 Fall    Spring  % Imprv.    Fall    Spring  % Imprv.    Fall    Spring  % Imprv.
Letter Identification    29.05   49.79    71%        49.07   53.02     8%        N/A     N/A     N/A
Word Test                 1.81   14.29   690%         6.68   17.67   165%        16.48   18.33    11%
Concepts about Print      6.67   16.75   151%        14.29   20.21    41%        N/A     N/A     N/A
Writing Vocabulary        2.41   21.07   774%        14.02   41.72   198%        31.59   58.35    85%
Hearing/Recording         3.00   21.42   614%        19.46   32.48    67%        47.53   53.07    12%
DRA                       0.52    4.80   823%         5.10   20.24   297%        20.56   30.93    50%

Table 11 Description

Table 11 displays, for each sub-test, the percent of the maximum score that black kindergarten students on average attained in school years 1999-2000 and 2000-01. Each test score is divided by the maximum score to calculate the percent score.

Table 11: Percent of Maximum Scores--Kindergarten Black Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Letter Identification    54     27.59      51%    48.48        90%    27.43      51%    49.38        91%
Word Test                20      1.75       9%    11.33        57%     1.38       7%    13.41        67%
Concepts about Print     24      6.54      27%    14.30        60%     5.95      25%    16.02        67%
Writing Vocabulary       None    2.93      N/A    14.50        N/A     1.96      N/A    18.82        N/A
Hearing/Recording        37      3.58      10%    17.02        46%     2.16       6%    19.59        53%
DRA                      44      0.95       2%     3.09         7%     0.35       1%     3.56         8%

Table 12 Description

See Table 11. Table 12 is the same, except that the data are for non-black students.

Table 12: Percent of Maximum Scores--Kindergarten Non-Black Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Letter Identification    54     34.08      63%    50.30        93%    33.02      61%    51.06        95%
Word Test                20      3.05      15%    14.91        75%     2.59      13%    16.32        82%
Concepts about Print     24      9.50      40%    17.56        73%     8.30      35%    18.41        77%
Writing Vocabulary       None    4.70      N/A    22.13        N/A     3.36      N/A    26.42        N/A
Hearing/Recording        37      6.66      18%    24.37        66%     4.66      13%    25.69        69%
DRA                      44      2.72       6%     7.12        16%     0.85       2%     7.47        17%

Table 13 Description

See Table 11. Table 13 is the same, except that the data are for all students.

Table 13: Percent of Maximum Scores--Kindergarten All Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Letter Identification    54     29.72      55%    49.05        91%    29.05      54%    49.79        92%
Word Test                20      2.18      11%    12.48        62%     1.81       9%    14.29        71%
Concepts about Print     24      7.52      31%    15.37        64%     6.67      28%    16.75        70%
Writing Vocabulary       None    3.51      N/A    16.99        N/A     2.41      N/A    21.07        N/A
Hearing/Recording        37      4.59      12%    19.41        52%     3.00       8%    21.42        58%
DRA                      44      1.52       3%     4.40        10%     0.52       1%     4.80        11%
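As a brief illustration of the percent-of-maximum calculation (a sketch only, using the black kindergarten Letter Identification figures from Table 11):

```python
# Percent of maximum score: spring 2000 black kindergarten Letter
# Identification mean of 48.48 out of a maximum possible score of 54.
score, maximum = 48.48, 54
print(round(score / maximum * 100))  # 90 (percent), as shown in Table 11
```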
Table 14 Description

See Table 11 description. This table is the same, except that the data are for grade 1 black students.

Table 14: Percent of Maximum Scores--Grade 1 Black Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Letter Identification    54     47.44      88%    52.80        98%    48.95      91%    53.01        98%
Word Test                20      5.75      29%    16.87        84%     5.81      29%    17.33        87%
Concepts about Print     24     13.81      58%    19.46        81%    13.51      56%    19.76        82%
Writing Vocabulary       None   13.54      N/A    37.11        N/A    12.94      N/A    40.16        N/A
Hearing/Recording        37     17.25      47%    30.87        83%    17.49      47%    31.70        86%
DRA                      44      4.29      10%    16.67        38%     3.72       8%    17.94        41%

Table 15 Description

See Table 11 description. This table is the same, except that the data are for grade 1 non-black students.

Table 15: Percent of Maximum Scores--Grade 1 Non-Black Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Letter Identification    54     49.54      92%    52.96        98%    49.66      92%    53.08        98%
Word Test                20      7.89      39%    18.34        92%     8.49      42%    18.53        93%
Concepts about Print     24     15.70      65%    20.91        87%    16.11      67%    21.22        88%
Writing Vocabulary       None   15.65      N/A    44.04        N/A    16.15      N/A    45.44        N/A
Hearing/Recording        37     21.98      59%    34.11        92%    23.55      64%    34.40        93%
DRA                      44      6.68      15%    24.37        55%     7.95      18%    25.41        58%

Table 16 Description

See Table 11 description. This table is the same, except that the data are for grade 1--all students.

Table 16: Percent of Maximum Scores--Grade 1 All Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Letter Identification    54     48.11      89%    52.86        98%    49.07      91%    53.02        98%
Word Test                20      6.43      32%    17.34        87%     6.68      33%    17.67        88%
Concepts about Print     24     14.41      60%    19.91        83%    14.29      60%    20.21        84%
Writing Vocabulary       None   14.20      N/A    39.30        N/A    14.02      N/A    41.72        N/A
Hearing/Recording        37     18.75      51%    31.89        86%    19.46      53%    32.48        88%
DRA                      44      5.05      11%    19.11        43%     5.10      12%    20.24        46%

Table 17 Description

See Table 11 description. This table is the same, except that the data are for grade 2 black students.

Table 17: Percent of Maximum Scores--Grade 2 Black Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Word Test                20     16.11      81%    18.93        95%    16.00      80%    18.06        90%
Writing Vocabulary       None   35.09      N/A    50.27        N/A    29.80      N/A    55.76        N/A
Hearing/Recording        64     42.16      66%    50.34        79%    45.50      71%    51.60        81%
DRA                      44     17.81      40%    27.92        63%    18.20      41%    28.75        65%

Table 18 Description

See Table 11 description. This table is the same, except that the data are for grade 2 non-black students.

Table 18: Percent of Maximum Scores--Grade 2 Non-Black Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Word Test                20     18.07      90%    19.80        99%    17.60      88%    18.91        95%
Writing Vocabulary       None   36.91      N/A    60.99        N/A    35.43      N/A    63.97        N/A
Hearing/Recording        64     48.96      77%    57.17        89%    52.44      82%    56.78        89%
DRA                      44     24.21      55%    36.00        82%    26.01      59%    35.88        82%

Table 19 Description

See Table 11 description. This table is the same, except that the data are for grade 2--all students.

Table 19: Percent of Maximum Scores--Grade 2 All Students
Sub-Test                 Max.   Fall 1999  Pct.   Spring 2000  Pct.   Fall 2000  Pct.   Spring 2001  Pct.
Word Test                20     16.76      84%    19.23        96%    16.48      82%    18.33        92%
Writing Vocabulary       None   35.71      N/A    53.80        N/A    31.59      N/A    58.35        N/A
Hearing/Recording        64     44.34      69%    52.51        82%    47.53      74%    53.07        83%
DRA                      44     19.85      45%    30.50        69%    20.56      47%    30.93        70%

Table 20 Description

Table 20 displays the all-student cohort data for fall 1999-2000 kindergarten students and end-of-year 2000-01 grade 1 students.
Table 20: Cohort 1--All Students, Kindergarten Fall 1999-2000 and Grade 1 Spring 2000-01
Sub-Test                 Fall 1999   Spring 2001   Growth
Letter Identification    29.72       53.02         23.30
Word Test                 2.18       17.67         15.49
Concepts about Print      7.52       20.21         12.69
Writing Vocabulary        3.51       41.72         38.21
Hearing/Recording         4.59       32.48         27.89
DRA                       1.52       20.24         18.72

Table 20A Description

Table 20A calculates the growth rate for the fall 1999-2000 kindergarten and 2000-01 grade 1 cohort.

Table 20A: Cohort 1--All Students, Percent Improvement, Kindergarten Fall 1999-2000 and Grade 1 Spring 2000-01
Sub-Test                 Fall 1999   Spring 2001   Growth   % Imprv.
Letter Identification    29.72       53.02         23.30      78%
Word Test                 2.18       17.67         15.49     711%
Concepts about Print      7.52       20.21         12.69     169%
Writing Vocabulary        3.51       41.72         38.21    1089%
Hearing/Recording         4.59       32.48         27.89     608%
DRA                       1.52       20.24         18.72    1232%

Table 21 Description

Table 21 is similar to Table 20 except that it includes the fall 1999-2000 grade 1 and end-of-year 2000-01 grade 2 cohort data.

Table 21: Cohort 2--All Students, Grade 1 Fall 1999-2000 and Grade 2 Spring 2001
Sub-Test                 Fall 1999   Spring 2001   Growth
Letter Identification    48.11       N/A           N/A
Word Test                 6.43       18.33         11.90
Concepts about Print     14.41       N/A           N/A
Writing Vocabulary       14.20       58.35         44.15
Hearing/Recording        18.75       53.07         34.32
DRA                       5.05       30.93         25.88

Table 21A Description

See the descriptions of Tables 20A and 21 above.

Table 21A: Cohort 2--All Students, Percent Improvement, Grade 1 Fall 1999-2000 and Grade 2 Spring 2001
Sub-Test                 Fall 1999   Spring 2001   Growth   % Imprv.
Letter Identification    48.11       N/A           N/A       N/A
Word Test                 6.43       18.33         11.90     185%
Concepts about Print     14.41       N/A           N/A       N/A
Writing Vocabulary       14.20       58.35         44.15     311%
Hearing/Recording        18.75       53.07         34.32     183%
DRA                       5.05       30.93         25.88     512%

Table 22 Description

Table 22 includes, for the Developmental Reading Assessment at all three grades tested, the percent of black and non-black students who scored at or above the "readiness" level. Also shown is the performance disparity (gap) between black and non-black students for each of the two years of testing data and, in the last column, the change in that gap.

Table 22: Percent Readiness, Developmental Reading Assessment, Black and Non-Black Students
                    Spring 2000                      Spring 2001
Grade               Black   Non-Black   Gap          Black   Non-Black   Gap        Gap Change (+/-)
Kindergarten        69.3    84.7        15.4         77.0    88.8        11.8       -3.60
Grade 1             48.3    71.2        22.9         57.4    77.3        19.9       -3.00
Grade 2             63.8    81.6        17.8         69.8    86.8        17.0       -0.80

Table 23 Description

Table 23 provides District-level data on the percent of students at each grade level who scored at or above the "readiness" level for each of the two years of the testing.

Table 23: Percent Readiness, Developmental Reading Assessment, All Students
Kindergarten, Spring 2000: 72.2 [remaining values illegible in the source copy]

Table 24 Description

Table 24 includes District-level ALT data on the performance of black, non-black, and all students on the spring 2000 and spring 2001 administrations of the Reading and Language Usage sub-tests of the Achievement Level Tests for grade 2.

Table 24: Grade 2 Reading, Achievement Level Test, Median RIT Scores, Black and Non-Black Comparisons
[values illegible in the source copy]

Table 25 Description

Table 25 includes the median RIT score on the grade 2 Achievement Level Test--Reading for spring 2000 and spring 2001.

Table 25: Grade 2 Reading, Achievement Level Test, Median RIT Scores, All Students
[values illegible in the source copy]

Table 26 Description

Table 26 is the same as Table 24, except that the data display the results of the Language Usage sub-test.

Table 26: Grade 2 Language Usage, Achievement Level Test, Median RIT Scores, Black and Non-Black Comparisons
[values illegible in the source copy]

Table 27 Description

Table 27 is the same as Table 25, except that the data display the results of the Language Usage sub-test.

Table 27: Grade 2 Language Usage, Achievement Level Test, Median RIT Scores, All Students
Spring 2001: 188 [remaining values illegible in the source copy]

VI. Analysis of Results, 1999-2000 and 2000-01

For each sub-test of the Observation Survey, for the Developmental Reading Assessment at grades K-2, and for each sub-test of the Achievement Level Tests at grade 2, results are analyzed below in several ways, particularly in terms of what they reveal about the achievement of African American children.

Letter Identification

Letter Identification is tested at the kindergarten and grade 1 levels. Out of a maximum of 54 points, students performed on this measure as follows in 1999-2000 and 2000-01:

Fall and Spring Performance Comparisons

Black kindergarten students scored 27.59 on the fall test in 1999-2000 and 27.43 in fall 2000-01--a difference of .16. Although the 2000-01 group performed at a slightly lower level on the fall test than those in 1999-2000, they ended the year a little stronger--from 48.48 in spring 1999-2000 to 49.38 in spring 2000-01, a difference of .90. (See Tables 1 and 2.)

Non-black kindergarten students scored 34.08 on the fall 1999-2000 test and 33.02 in fall 2000-01--a difference of 1.06. Non-black kindergarten students, just as black students, started lower in fall 2000-01 than in fall 1999-2000, yet they too ended the year a little stronger than the previous year's group--from 50.30 in 1999-2000 to 51.06 in 2000-01, a difference of .76. (See Tables 1 and 2.)

Black grade 1 students scored 47.44 on Letter Identification in fall 1999 and 48.95 in fall 2000--an improvement of 1.51 points, perhaps indicating the strength of the 1999-2000 kindergarten instructional program for African-American students, even in its first year of implementation. As in kindergarten, the grade 1 black students in 2000-01 ended the year stronger than the grade 1 black students in 1999-2000--from 52.80 in spring 2000 to 53.01 in spring 2001, a difference of .21. (See Tables 3 and 4.)

Non-black grade 1 students scored 49.54 in fall 1999 and 49.66 in fall 2000--a difference of .12. As in kindergarten, and as for black students, the grade 1 non-black students ended spring 2001 at a higher level than spring 2000--from 52.96 in 2000 to 53.08 in 2001, a difference of .12--the same amount of difference, then, as the fall-to-fall scores. (See Tables 3 and 4.)

One-Year Growth

Black kindergarten students grew 20.89 points on Letter Identification in 1999-2000 and 21.95 points in 2000-01, again indicating more growth in the second year of the program implementation than in year one for African-American students. (See Tables 1 and 2.)

Non-black kindergarten students also grew more in the second year of the program--from 16.22 points in 1999-2000 to 18.04 in 2000-01. (See Tables 1 and 2.)

Black grade 1 students grew 5.36 points in Letter Identification in 1999-2000 and 4.06 in 2000-01.
Perhaps the reason for the smaller amount of growth is that the 2000-01 grade 1 students started the year closer to the maximum score than the 1999-2000 grade 1 students did. (See Tables 3 and 4.)

Non-black grade 1 students grew 3.42 points in both 1999-2000 and 2000-01. (See Tables 3 and 4.)

Growth Rate (Percent Improvement)

Black kindergarten students' percent improvement (rate of growth) in 1999-2000 on Letter Identification was 76 percent, as compared to 80 percent in 2000-01. (See Tables 1A and 2A.)

Non-black kindergarten students' percent improvement in 1999-2000 was 48 percent, as compared to 55 percent in 2000-01. (See Tables 1A and 2A.)

Black grade 1 students' percent improvement in 1999-2000 was 11 percent, as compared to 8 percent in 2000-01. (See Tables 3A and 4A.)

Non-black grade 1 students' percent improvement was 7 percent in both 1999-2000 and 2000-01. (See Tables 3A and 4A.)

Kindergarten Spring and Grade 1 Fall Comparison

Black kindergarten students ended the 1999-2000 school year with a score of 48.48, and they entered grade 1 with a score of 48.95--a slight improvement of .47, indicating no regression over the summer. (See Tables 1 and 4.)

Non-black kindergarten students ended the 1999-2000 year with a score of 50.30, and they entered grade 1 in 2000-01 with a score of 49.66--a slight regression over the summer of .64. (See Tables 1 and 4.)

Black to Non-Black Ratios

In fall 1999 the black kindergarten students' scores on Letter Identification were 81 percent of those of non-black students. By the end of that year their scores were 96 percent of those of non-black students--an improvement of 15 percentage points. The achievement gap was virtually closed, therefore, on this measure by the end of the kindergarten year. (See Table 1.)

In fall 2000 the black kindergarten students' scores started the year at 83 percent of those of non-black students--two points higher than at the beginning of the previous kindergarten year. By the end of the year their scores were 97 percent of those of non-black students--one point closer to closing the achievement gap on this measure than at the end of the previous kindergarten year. (See Table 2.)

In fall 1999 the black grade 1 students' scores were 96 percent of those of non-black students. By the end of the year their scores were almost exactly the same as those of non-black students--100 percent. The achievement gap was closed on this measure. (See Table 3.)

Again in fall 2000 the black grade 1 students' scores were 99 percent of those of non-black students, and by the end of the year the achievement gap closed, with black scores at 100 percent of non-black scores on this measure. (See Table 4.)

Black to Non-Black Growth Ratios

In 1999-2000 black kindergarten student growth was 129 percent of non-black student growth. In 2000-01 black kindergarten growth continued to exceed non-black growth--this time at 122 percent. (See Tables 1 and 2.)

In 1999-2000 black grade 1 student growth was 157 percent of non-black student growth. That pattern continued in 2000-01, when black grade 1 growth was 119 percent of non-black growth. (See Tables 3 and 4.)

Kindergarten--Grade 1 Cohort (Fall 1999 to Spring 2001)

Black kindergarten students grew from 27.59 in fall 1999 to 53.01 in spring 2000-01, when they were in grade 1--a total of 25.42 points. (See Table 7.)

Non-black kindergarten students grew from 34.08 in fall 1999 to 53.08 in spring 2001, when they were in grade 1--a total of 19.00 points. (See Table 7.)
Kindergarten--Grade 1 Cohort (Fall 1999 to Spring 2001) Growth Rate

Black kindergarten students' percent improvement from fall 1999-2000 to spring 2000-01 in grade 1 was 92 percent. Given that black students began kindergarten knowing a little more than half of their letters, they almost doubled their knowledge in this area over the two-year period. (See Table 7A.)

Non-black kindergarten students' percent improvement from fall 1999 to spring 2001 in grade 1 was 56 percent. Even though black students grew at a rate considerably higher than non-black students over the two years, non-black students also continued to improve. (See Table 7A.)

Black to Non-Black Ratios for Kindergarten to Grade 1 Cohort

In fall 1999 the black kindergarten students' scores were 81 percent of those of non-black students. By the end of grade 1, the achievement gap was closed, with black scores at 100 percent of non-black scores. (See Table 7.) During the two-year period, black growth was 134 percent of non-black growth on this measure. (See Table 7.)

Word Test

The Word Test, with a maximum score of 20, is administered at all three grade levels, K-2. Observations about student performance in 1999-2000 and 2000-01 follow:

Fall and Spring Performance Comparisons

Black kindergarten students scored 1.75 on the fall test in 1999-2000 and 1.38 in fall 2000-01--a difference of .37, repeating the pattern seen on the Letter Identification test of slightly lower performance in fall 2000 than in fall 1999. Again, however, just as in Letter Identification, the spring 2001 scores were higher than the spring 2000 scores: the spring 2000 score was 11.33, and the spring 2001 score was 13.41--a difference of 2.08 points and a good increase in decoding skill. (See Tables 1 and 2.)

Non-black kindergarten students scored 3.05 in fall 1999 and 2.59 in fall 2000--a slightly lower score, by .46. Again, however, the pattern holds: the spring 2001 scores were higher than the spring 2000 scores--from 14.91 in spring 2000 to 16.32 in spring 2001, a difference of 1.41 points. (See Tables 1 and 2.)

Black grade 1 students scored 5.75 on the Word Test in fall 1999 and 5.81 in fall 2000--continuing the pattern of higher scores at the beginning of the year for students who had been in the program two years. Also, black students in spring 2001 had higher scores than those in spring 2000: spring 2000 scores for grade 1 black students were 16.87, and they were 17.33 in spring 2001--a difference of .46 points. (See Tables 3 and 4.)

Non-black grade 1 students scored 7.89 in fall 1999 and 8.49 in fall 2000--an increase of .60. In spring 2000 the scores were 18.34, and in spring 2001 they were 18.53--again higher, by .19. (See Tables 3 and 4.)

Black grade 2 students scored 16.11 on the Word Test in fall 1999 and 16.00 in fall 2000--down .11. The spring performance for 2000 was 18.93, and the spring performance for 2001 was 18.06--down .87. (See Tables 5 and 6.)

Non-black grade 2 students scored 18.07 in fall 1999 and 17.60 in fall 2000--down .47 from the previous year. The spring performance for 2000 was 19.80, and for spring 2001 it was 18.91--down .90. (See Tables 5 and 6.)

One-Year Growth

Black kindergarten students grew 9.58 points in 1999-2000 and 12.03 points in 2000-01, again indicating more growth in the second year of the program implementation than in year one.
One-Year Growth

Black kindergarten students grew 9.58 points in 1999-2000 and 12.03 points in 2000-01, again indicating more growth in the second year of program implementation than in year one. (See Tables 1 and 2.)

Non-black kindergarten students grew 11.86 points in 1999-2000 and 13.73 points in 2000-01. (See Tables 1 and 2.)

Black grade 1 students grew 11.12 points in 1999-2000 and 11.52 points in 2000-01. (See Tables 3 and 4.)

Non-black grade 1 students grew 10.45 points in 1999-2000 and 10.04 points in 2000-01. One possible reason for this reduced growth is that the non-black students in grade 1 were already approaching the maximum score of 20 on this measure. (See Tables 3 and 4.)

Black grade 2 students grew 2.82 points in 1999-2000 and 2.06 points in 2000-01. (See Tables 5 and 6.)

Non-black grade 2 students grew 1.73 points in 1999-2000 and 1.31 points in 2000-01. (See Tables 5 and 6.)

Growth Rate (Percent Improvement)

Black kindergarten students' percent improvement (rate of growth) in 1999-2000 was 547 percent, as compared to 872 percent in 2000-01. Although black kindergarten students did not grow as many points as non-black students in either 1999-2000 or 2000-01, their percent of improvement, or growth rate, far exceeded that of non-black students. (See Tables 1A and 2A.)

Non-black kindergarten students' percent improvement was 389 percent in 1999-2000 and 530 percent in 2000-01. (See Tables 1A and 2A.)

Black grade 1 students' growth rate was 193 percent in 1999-2000, as compared to 198 percent in 2000-01. At grade 1 in 2000-01, black students not only had a higher growth rate than in 1999-2000, they also grew more in terms of points. (See Tables 3A and 4A.)

Non-black grade 1 students' growth rate was 132 percent in 1999-2000, and in 2000-01 it was 118 percent. Again, black students' higher growth rate indicates a closing of the achievement gap on this measure. (See Tables 3A and 4A.)

Black grade 2 students' growth rate in 1999-2000 was 18 percent--considerably lower than in kindergarten and grade 1, but attributable to scores approaching the maximum of 20. In 2000-01 the growth rate was 13 percent. (See Tables 5A and 6A.)

Non-black grade 2 students' growth rate was 10 percent in 1999-2000 and 7 percent in 2000-01. (See Tables 5A and 6A.)

Kindergarten Spring and Grade 1 Fall Comparison; Grade 1 Spring and Grade 2 Fall Comparison

Black kindergarten students ended the 1999-2000 school year with a score of 11.33 on the Word Test, and they entered grade 1 in fall 2000 with a score of 5.81--indicating, most likely, little reinforcement of school vocabulary during the summer months. (See Tables 1 and 4.)

Non-black kindergarten students ended the 1999-2000 school year with a score of 14.91, and they entered grade 1 in fall 2000 with a score of 8.49--again indicating little reinforcement of school vocabulary during the summer months. (See Tables 1 and 4.)

Black grade 1 students ended the 1999-2000 school year with a score of 16.87, and they entered grade 2 in fall 2000 with a score of 16.00--a slight regression of .87. Summer regression may decline as students begin to read independently; it is notable that the regression between grades 1 and 2 is much smaller than between kindergarten and grade 1. (See Tables 3 and 6.)

Non-black grade 1 students ended the 1999-2000 school year with a score of 18.34 and began grade 2 in fall 2000 with a score of 17.60--a regression of .74. (See Tables 3 and 6.)
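The summer comparisons above rest on a simple difference: the fall mean for the next grade in the following school year minus the spring mean for the prior grade. A minimal sketch using figures already cited in this section follows; the helper function is ours, not the evaluation's.

    # Illustrative sketch only: summer change computed as the next grade's fall
    # mean minus the prior grade's spring mean; negative values indicate
    # regression over the summer.

    def summer_change(prior_spring_mean, next_fall_mean):
        return next_fall_mean - prior_spring_mean

    # Word Test, black students, grade 1 spring 2000 to grade 2 fall 2000 (Tables 3 and 6)
    print(round(summer_change(16.87, 16.00), 2))   # -0.87, the slight regression noted above

    # Letter Identification, black students, kindergarten spring 2000 to grade 1 fall 2000 (Tables 1 and 4)
    print(round(summer_change(48.48, 48.95), 2))   # 0.47, a slight improvement (no regression)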
Black to Non-Black Ratios

In fall 1999 the black kindergarten students' scores were 57 percent of those of non-black students. By the end of the year they were 76 percent of those of non-black students--an improvement of 19 percentage points in the first year of instruction. (See Table 1.)

In fall 2000 the black kindergarten students' scores were only 53 percent of those of non-black students. By the end of the year, however, they were 82 percent of those of non-black students--an improvement of 29 percentage points. (See Table 2.)

In fall 1999 the black grade 1 students' scores were 73 percent of those of non-black students. By the end of the year, however, they were 92 percent of those of non-black students--an improvement of 19 percentage points. (See Table 3.)

In fall 2000 the black grade 1 students' scores were 68 percent of those of non-black students. By the end of the year, however, they were 94 percent of those of non-black students--an improvement of 26 percentage points in one year and an indication that the achievement gap is almost closed. (See Table 4.)

In fall 1999 the black grade 2 students' scores were 89 percent of those of non-black students. By the end of the year, however, they were 96 percent of those of non-black students--an improvement of 7 percentage points in one year. (See Table 5.)

In fall 2000 the black grade 2 students' scores were 91 percent of those of non-black students. By the end of the year they were 96 percent, and the achievement gap on this measure was virtually closed after three years of instruction. (See Table 6.)

Black to Non-Black Growth Ratios

In 1999-2000 black kindergarten student growth was 81 percent of non-black student growth. In 2000-01 black kindergarten growth was 88 percent of non-black student growth. (See Tables 1 and 2.)

In 1999-2000 black grade 1 student growth exceeded that of non-black students--106 percent. The growth ratio in 2000-01 for grade 1 students was 115 percent, so, again, black growth exceeded non-black growth in grade 1. (See Tables 3 and 4.)

In 1999-2000 black grade 2 student growth greatly exceeded the growth of non-black students--163 percent. The growth ratio in 2000-01 continued at a high rate--157 percent. Black students made their greatest gains in closing the achievement gap on this measure in grade 2. (See Tables 5 and 6.)

Kindergarten--Grade 1 Cohort (Fall 1999 and Spring 2001)

Black kindergarten students grew from 1.75 in fall 1999 to 17.33 in spring 2001, when they were in grade 1--a total of 15.58 points. (See Table 7.)

Non-black students in this cohort grew from 3.05 in fall 1999 to 18.53 in spring 2001--a total of 15.48 points. (See Table 7.)

Grade 1--Grade 2 Cohort (Fall 1999 and Spring 2001)

Black grade 1 students grew from 5.75 to 18.06 in spring of grade 2--a total of 12.31 points. (See Table 8.)

Non-black grade 1 students grew from 7.89 to 18.91 in spring of grade 2--a total of 11.02 points. (See Table 8.)

Kindergarten--Grade 1 Cohort Growth Rate

Black kindergarten students' percent improvement from fall 1999 to spring 2001 in grade 1 was 890 percent. (See Table 7A.)

Non-black kindergarten students' percent improvement from fall 1999 to spring 2001 in grade 1 was 508 percent. (See Table 7A.)

Black grade 1 students' percent improvement from fall 1999 to spring 2001 in grade 2 was 214 percent. (See Table 8A.)

Non-black grade 1 students' percent improvement from fall 1999 to spring 2001 in grade 2 was 140 percent. (See Table 8A.)
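The growth ratios reported under "Black to Non-Black Growth Ratios" above follow the same arithmetic throughout the section. The sketch below is illustrative only; the function name is ours, and it assumes, consistent with the reported figures, that the ratio divides black point growth by non-black point growth over the same period.

    # Illustrative sketch only: the growth-ratio definition is an assumption
    # inferred from the figures reported in this section.

    def growth_ratio(black_growth, nonblack_growth):
        # Black point growth as a percent of non-black point growth; values over
        # 100 indicate that black students gained more points.
        return 100.0 * black_growth / nonblack_growth

    # Word Test, one-year growth in 1999-2000 (Tables 1, 2, 5, and 6)
    print(round(growth_ratio(9.58, 11.86)))   # 81 percent, kindergarten
    print(round(growth_ratio(2.82, 1.73)))    # 163 percent, grade 2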
Black to Non-Black Ratios for Kindergarten to Grade 1 Cohort

In fall 1999 the black kindergarten students' scores were 57 percent of those of non-black students. By spring 2001, at the end of grade 1, the black scores were 94 percent of those of non-black students.
</dcterms_description>

</dcterms_description>

</item>
</items>