Court filings regarding Court order filed May 12, 2004 requiring glossary of acronyms and educational terms, plaintiff's response to the order, and Arkansas Department of Education (ADE) project management tool.


<?xml version="1.0" encoding="utf-8"?>
<items type="array"> <item>

<dcterms_description type="array">

<dcterms_description>Court filings: District Court, order; District Court, plaintiff's notice of filing documents in response to the Court order filed May 12, 2004; District Court, plaintiff's notice of filing glossary of acronyms and educational terms in response to the Court's order filed May 12, 2004; District Court, notice of filing, Arkansas Department of Education (ADE) project management tool This transcript was create using Optical Character Recognition (OCR) and may contain some errors. A072A (Rev.8182) ECEIVED MAY 1 -" 2004 OFFICE OF ESEGREGATION MONITORING IN THE UNITED STATES DISTRICT,COURT EASTERN DISTRICT OF ARKANSAS MAY 1 2 - LITTLE ROCK DIVISION JAMES W. McCORMACK, CLERK .By: ______ --=-=-=-=::-:-:::-:= DEP CLERK LITTLE ROCK SCHOOL DISTRICT PLAINTIFF V. No. 4:82CV00866 WRW/JTR PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL. RECEIVED DEFENDANTS INTERVENORS INTERVENORS MRS. LORENE JOSHUA, ET AL. KATHERINE KNIGHT, ET AL. MAY 1 '. 2004 OFFICE OF DESEGREGATION MONITORING ORDER In preparing for the June 14 and 15 evidentiary hearing on LRSD 's Compliance Report, it is apparent that a number of matters need to be brought to the attention of counsel for LRSD and Joshua: (1) The LRSD Board, in approving the October 10, 2002 "Compliance Plan," also adopted "IL-Rl ," which sets forth ''the written procedures for evaluating the 2. 7 programs." While the October 10, 2002 Compliance Plan is attached as Exhibit A to LRSD's March 14, 2003 ''Notice Of Filing Program Evaluations Required By Paragraph C Of The Court's Compliance Remedy'' (docket entry #3745), "IL-R 1" is not attached to that document or otherwise included in the record. Counsel for LRSD must immediately provide me with a copy of "IL-RI." (2) Exhibit A to LRSD's "Compliance Report" is an October 25, 2002 letter from Mr. John Fendley, one of LRSD's attorneys, to all parties, responding to certain written "concerns" raised by Joshua's counsel, Mr. 
John Walker, regarding LRSD's proposed "Compliance Plan." In order for the Court to place Mr. Fendley's October 25, 2002 letter in context, I need the following additional documents: (a) Mr. Walker's October 10 and 24, 2002 letters to Mr. Fendley raising his "concerns" about the "Compliance Plan"; and (b) a copy of the document that Mr. Fendley repeatedly quotes Mr. Walker referring to in his October 10 and October 24, 2002 letters as "your document."¹ Counsel for LRSD must immediately provide me with copies of the foregoing documents. (3) In my September 13, 2002 Memorandum Opinion, I thought I made it clear that I am a big fan of plain English and have no desire to learn the acronym-filled lexicon of the professional educator. Therefore, I am now directing counsel to comply with the following rules in all oral and written communications with the Court in this case: (a) Do not use any educational acronyms unless they are first defined. The pleadings that I have reviewed to date in preparing for the June 14 and 15 hearing are littered with references to "SAIPs," "DRAs," "DIBELs," "ELLA," "CRT," "SMART," "THRIVE," "ACTAAP," "SREB," "CREP," and "SFA." Counsel for LRSD must immediately prepare a glossary which defines all acronyms used in all exhibits attached to LRSD's Compliance Report. A copy of this glossary is to be provided forthwith. ¹I speculate that "your document" is probably LRSD's "Compliance Plan," which I already have. If my speculation is correct, LRSD's counsel should so advise me and need not provide the Court with a copy of that document. -2- (b) During the hearing on June 14 and 15, please instruct your witnesses to testify using plain English, not professional educatorese. Based upon the parties' previous written submissions and testimony taken in earlier hearings, I fear this may pose a significant challenge for some of the witnesses (and me).
If so, I encourage these witnesses to begin now to practice speaking in plain English, so that they will be ready to testify by the June 14 and 15 hearing. (4) On or before June 7, 2004, counsel for Joshua and LRSD must submit proposed Findings of Fact and Conclusions of Law on the issue of whether LRSD has substantially complied with its obligations under Section VII of the Court's September 13, 2002 Memorandum Opinion and 2.7.1 of the Revised Plan. (5) On April 22, 2004, we had a telephone conference during which LRSD's Compliance Hearing was rescheduled from April 26 and 27, 2004, to June 14 and 15, 2004. During that telephone conference, I stated that I would make every effort to render my decision on LRSD's Compliance Report by June 30, 2004. Based upon my current work load, I now believe the earliest I will be able to enter my decision is thirty to sixty days after the conclusion of the evidentiary hearing in this matter. IT IS SO ORDERED. DATED this ___ day of May, 2004. -3- FAX COVER SHEET, UNITED STATES DISTRICT COURT, EASTERN DISTRICT OF ARKANSAS. TO: Chris Heller, 376-2147; Sam Jones, 376-9442; Steve Jones, 375-1027; John Walker, 374-4187; Robert Pressman, 781-862-1955; Timothy Gauger, 682-2591; Mark Hagemeier, 682-2591; Ann Marshall, 371-0100; Mark Burnette, 375-1940; Clay Fendley, 907-9798. DATE: [handwritten, illegible]. Telephone: 501-604-5140. Fax Number: 501-604-5149. There are __ pages, including this Cover Sheet, being sent by this facsimile transmission. MESSAGE SENT BY: Office of Judge Wm., U.S. District Court, 600 West Capitol, Room 423, Little Rock, Arkansas 72201; Matt Morgan, LRSD Law Clerk, 501-604-5141. IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION. RECEIVED MAY 14, 2004, OFFICE OF DESEGREGATION MONITORING. LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF v. NO. 4:82CV00866 WRW, PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL, DEFENDANTS; MRS.
LORENE JOSHUA, ET AL, INTERVENORS; KATHERINE KNIGHT, ET AL, INTERVENORS. PLAINTIFF'S NOTICE OF FILING DOCUMENTS IN RESPONSE TO THE COURT'S ORDER FILED MAY 12, 2004. Plaintiff Little Rock School District ("LRSD") for its Notice of Filing states: 1. Attached are the following documents requested by the Court in its Order filed May 12, 2004: A. Little Rock School District Proposed Compliance Plan Revised Plan 2.7.1 (Appendix 1 of which is "IL-R1"); B. Letter from John W. Walker to Chris Heller dated October 10, 2002; and, C. Letter from John W. Walker to Chris Heller dated October 23, 2002 (received by fax on October 24, 2002). 2. As to Mr. Walker's references to "your document," the Court is correct that Mr. Walker is referring to the Proposed Compliance Plan attached hereto as Exhibit A. Page 1 of 3. 3. As to the educational acronyms, Counsel has requested that the authors of the comprehensive evaluations immediately prepare a glossary of acronyms used in their respective evaluations. These will be consolidated into a single glossary for all exhibits and provided to the Court as soon as possible. Respectfully Submitted, LITTLE ROCK SCHOOL DISTRICT, FRIDAY, ELDREDGE &amp; CLARK, Christopher Heller (#81083), 2000 Regions Center, 400 West Capitol, Little Rock, AR 72201-3493, (501) 376-2011. BY: [signature] Christopher Heller. Page 2 of 3. CERTIFICATE OF SERVICE. I certify that a copy of the foregoing has been served on the following people by depositing a copy of same in the United States mail on May 13, 2004: Mr. John W. Walker, JOHN W. WALKER, P.A., 1723 Broadway, Little Rock, AR 72201; Mr. Sam Jones, Wright, Lindsey &amp; Jennings, 2200 Nations Bank Bldg., 200 West Capitol, Little Rock, AR 72201; Mr. Steve Jones, JACK, LYON &amp; JONES, P.A., 425 W. Capitol, Suite 3400, Little Rock, AR 72201-3472; Judge J. Thomas Ray, U.S. District Courthouse, 600 West Capitol Avenue, Suite 149, Little Rock, AR 72201; Ms. Ann Marshall, Desegregation Monitor, 1 Union National Plaza, 124 W.
Capitol, Suite 1895, Little Rock, AR 72201; Mr. Tim Gauger, Mr. Mark A. Hagemeier, Office of the Attorney General, 323 Center Street, 200 Tower Building, Little Rock, AR 72201; Mr. Clayton Blackstock, Mr. Mark Burnett, 1010 W. Third Street, Little Rock, AR 72201. [signature] Christopher Heller. Page 3 of 3. Little Rock School District Compliance Committee Proposed Compliance Plan Revised Plan 2.7.1. EXHIBIT A. The District Court's Compliance Remedy. On September 13, 2002, the District Court issued its Memorandum Opinion (hereinafter "Opinion") finding that the Little Rock School District ("LRSD") had substantially complied with all areas of the Revised Desegregation and Education Plan ("Revised Plan"), with the exception of Revised Plan 2.7.1. Section 2.7.1 provided: LRSD shall assess the academic programs implemented pursuant to Section 2.7¹ after each year in order to determine the effectiveness of the academic programs in improving African-American achievement. If this assessment reveals that a program has not and likely will not improve African-American achievement, LRSD shall take appropriate action in the form of either modifying how the program is implemented or replacing the program. The District Court's Opinion set forth a detailed "Compliance Remedy" to be implemented by the LRSD. The Opinion first stated: Because LRSD failed to substantially comply with the crucially important obligations contained in 2.7.1, it must remain under court supervision with regard to that section of the Revised Plan until it: (a) demonstrates that a program assessment procedure is in place that can accurately measure the effectiveness of each program implemented under 2.7 in improving the academic achievement of African-American students; and (b) prepares the program evaluations identified on page 148 of the Final Compliance Report and uses those evaluations as part of the program assessment procedure contemplated by 2.7.1 of the Revised Plan.
The Opinion then outlined the "details" of the Compliance Remedy as follows: A. For the entire 2002-03 school year and the first semester of the 2003-04 school year, through December 31, 2003, LRSD must continue to assess each of the programs implemented under 2.7 to improve the academic achievement of African-American students. LRSD now has over three years of testing data and other information available to use in gauging the effectiveness of those programs. I expect LRSD to use all of that available data and information in assessing the effectiveness of those programs and in deciding whether any of those programs should be modified or eliminated. ¹Revised Plan 2.7 provided, "LRSD shall implement programs, policies and/or procedures designed to improve and remediate the academic achievement of African-American students, including but not limited to Section 5 of this Revised Plan." B. LRSD must maintain written records regarding its assessment of each of those programs. These written records must reflect the following information: (a) the written criteria used to assess each program during the 2002-03 school year and the first semester of the 2003-04 school year; (b) the results of the annual assessments of each program, including whether the assessments resulted in program modifications or the elimination of any programs; and (c) the names of the administrators who were involved with the assessment of each program, as well as at least a grade level description of any teachers who were involved in the assessment process (e.g., all fourth grade math teachers; all eighth grade English teachers, etc.). C. LRSD must use Dr. Nunnerly² or another expert from outside LRSD with equivalent qualifications and expertise to prepare program evaluations on each of the programs identified on page 148 of the Final Compliance Report. I will accept all program evaluations that have already been completed by Dr.
Nunnerly or someone with similar qualifications and approved by the Board. All program evaluations that have not yet been completed on the remaining programs identified on page 148 of the Final Compliance Report must be prepared and approved by the Board as soon as practicable, but, in no event, later than March 15, 2003. In addition, as these program evaluations are prepared, LRSD shall use them, as part of the program assessment process, to determine the effectiveness of those programs in improving African-American achievement and whether, based on the evaluations, any changes or modifications should be made in those programs. In addition, LRSD must use those program evaluations, to the extent they may be relevant, in assessing the effectiveness of other related programs. *** F. On or before March 15, 2004, LRSD must file a Compliance Report which documents its compliance with its obligations under 2.7.1. Any party, including Joshua, who wishes to challenge LRSD's substantial compliance with 2.7.1, as specified above, may file objections with the court on or before April 15, 2004. Thereafter, I will decide whether the LRSD has substantially complied with 2.7.1, as specified in the Compliance Remedy, and should be released from all further supervision and monitoring. ²The Court is clearly referring to Dr. John Nunnery. Proposed Compliance Plan. As the Compliance Committee understands the District Court's Opinion, the Compliance Remedy requires the LRSD to: 1. Continue to administer student assessments through the first semester of 2003-04; 2. Develop written procedures for evaluating the programs implemented pursuant to Revised Plan 2.7 to determine their effectiveness in improving the academic achievement of African-American students; 3.
Maintain written records of (a) the criteria used to evaluate each program; (b) the results of the annual student assessments, including whether an informal program evaluation resulted in program modifications or the elimination of any programs; and (c) the names of the administrators who were involved with the evaluation of each program, as well as at least a grade level description of any teachers who were involved in the evaluation process; 4. Prepare a comprehensive program evaluation of each academic program implemented pursuant to Revised Plan 2.7 to determine its effectiveness in improving the academic achievement of African-American students and to decide whether to modify or replace the program; and 5. Submit for Board approval the program evaluations identified on page 148 of the LRSD's Final Compliance Report that have been completed, and complete, with the assistance of an outside expert, the remaining evaluations identified on page 148 of the LRSD's Final Compliance Report. What follows is an explanation of how the Compliance Committee derived these five requirements from the District Court's Opinion, and what the Compliance Committee proposes to do to comply with each requirement. Assessment and Evaluation. When first read, the District Court's Compliance Remedy seemed simple and straightforward, but as the Compliance Committee attempted to develop this Proposed Compliance Plan, numerous questions arose. The most fundamental question related to the District Court's use of the term "assessment" in Paragraphs A and B of the Compliance Remedy. The ambiguity of this term was the subject of testimony at the hearing. The District Court included in its Opinion Dr. Lesley's testimony on the difference between "assessment" and "evaluation," see Opinion, p. 152, but it is unclear whether the Court accepted this testimony.
are derived from student assessments, and "program evaluations," which are used to determine the effectiveness of programs. See Opinion, p. 152 (''LRSD acknowledged in the Interim Compliance Report that it was required: (a) to use both the testing data and the "program evaluations" to determine the effectiveness of the key academic programs implemented pursuant to 2.7 ... " (emphasis in original)). Even so, the District Court appears to have used the term "assessment" in some instances to refer to only student assessments and in other instances to refer to both student assessments and evaluations. This required the Compliance Committee to determine the District Court's intended meaning. In making this determination, the Compliance Committee considered the context in which the term was used, the District Court's findings of fact as set forth in the Opinion, what would be in the best interest of African-American students, and hopefully, common sense. An explanation of each requirement of the Compliance Remedy is provided below. To avoid any ambiguity, Compliance Committee hereinafter uses the term "assessment" to refer to student assessments and the term "evaluation" to refer to the program evaluations, whether formal or informal. 1. Continue to administer student assessments through the first semester of 2003-04. This requirement derives from Paragraph A of the Compliance Remedy. Given Paragraph A's reference to ''testing data," it seems clear that Paragraph A concerns, in part, student assessments. The Compliance Committee proposes to comply with this part of Paragraph A by implementing the 2002-03 Board-approved assessment plan. The 2002-03 Board-approved assessment plan incorporates four changes that have been made since the LRSD's Final Compliance Report. First, the Board eliminated the fall administrations of the Achievement Level Tests (ALTs) in 2001-02. 
The administration recommended this for three reasons: (1) the loss of instructional time resulting from testing and test preparation; (2) fall results did not provide significantly different information from the previous spring's results; and (3) the cost of administering and scoring the tests. Second, the fall administration of the Observation Surveys and Developmental Reading Assessment will only be used by the teacher for diagnostic purposes. The scores will not be reported to or maintained by the LRSD. This change saves considerable time in test administration and allows more time for instruction. It was approved by the Board on September 26, 2002. Third, the LRSD will no longer administer the ALTs. The administration recommended the complete elimination of the ALTs for the following reasons: (1) the lack of alignment with the content and format of the State Benchmarks; (2) the loss of instructional time resulting from testing and test administration; (3) the new federal accountability requirements in the No Child Left Behind Act require annual testing by the State in grades 3-8, making the LRSD's administration of the ALTs redundant; and (4) the costs of administering and scoring the tests. The Board approved this change on September 26, 2002. Finally, the Arkansas Department of Education ("ADE") has moved the administration of the SAT9 from the fall to the spring, effective 2002-03.
The 2002-03 Board-approved assessment plan calls for the administration of the following student assessments in English language arts and mathematics: Kindergarten: Observation Surveys (5), Developmental Reading Assessment; Grade 1: Observation Surveys (5), Developmental Reading Assessment; Grade 2: Observation Surveys (3), Developmental Reading Assessment, norm-referenced test to be identified for gifted/talented screening; Grade 4: Benchmark Literacy examination, Benchmark Mathematics examination; Grade 5: SAT9 Total Battery; Grade 6: Benchmark Literacy examination, Benchmark Mathematics examination; Grade 7: SAT9 Total Battery; Grade 8: Benchmark Literacy examination, Benchmark Mathematics examination; Grades 7-10: End-of-Course Algebra I examination; Grades 9-11: End-of-Course Geometry examination; Grade 10: SAT9 Total Battery; Grade 11: End-of-Level Literacy examination. All of these assessments are administered in the spring. Consequently, the final student assessment before March 15, 2004, will be administered in the spring of 2003. 2. Develop written procedures for evaluating the programs implemented pursuant to 2.7 to determine their effectiveness in improving the academic achievement of African-American students. This requirement derives from the opening paragraph of the Compliance Remedy. To comply with this requirement, two proposed regulations have been drafted, IL-R1 for formal evaluations and IL-R2 for informal evaluations, attached as Appendixes 1 and 2, respectively. Proposed regulation IL-R1 combines generally accepted principles of program evaluation with practices that have been in place in the LRSD for the past two years. See, e.g., Robby Champion, "Map Out Evaluation Goals," Journal for Staff Development, Fall 2002, attached as Appendix 3. This regulation will be submitted to the Board, Office of Desegregation Monitoring ("ODM") and the Joshua Intervenors ("Joshua") for review and comment before being finalized.
Proposed regulation IL-R2 specifically addresses the next requirement and is discussed therewith. 3. Maintain written records of (a) the criteria used to evaluate each program; (b) the results of the annual student assessments, including whether an informal program evaluation resulted in program modifications or the elimination of any programs; and (c) the names of the administrators who were involved with the evaluation of each program, as well as at least a grade level description of any teachers who were involved in the evaluation process. This requirement derives from Paragraph B of the Compliance Remedy. Paragraph B apparently came about as a result of the District Court's concern about the LRSD making program modifications based on informal evaluations of student assessment data. See Opinion, p. 155 ("I have grave reservations about anyone this side of Solomon being wise enough to use two or three semesters' worth of erratic composite test scores to make reliable decisions about which remediation programs for LRSD's African-American students were actually working."). Proposed regulation IL-R2 was drafted to specifically address this requirement. It prohibits substantial program modifications from being made without a written record as required by Paragraph B. This regulation will also be submitted to ODM and Joshua for review and comment before being finalized. Proposed regulation IL-R1 also complies with this requirement. It mandates that the criteria used to formally evaluate a program be identified as the research questions to be answered, the first of which will be, "Has this curriculum/instruction program been effective in improving and remediating the academic achievement of African-American students?" See Appendix 1, IL-R1, p. 5. Recommended program modifications and the members of the evaluation team are routinely included in formal evaluations.
As to the results of annual student assessments, the LRSD will continue to maintain a computer database with the results of annual student assessments administered pursuant to the Board-approved assessment plan. 4. Prepare a comprehensive program evaluation of each academic program implemented pursuant to 2.7 to determine its effectiveness in improving the academic achievement of African-American students and to decide whether to modify or replace the program. This requirement derives from Paragraph A of the Compliance Remedy. To comply with this requirement, the Compliance Committee proposes to prepare the following new, comprehensive evaluations: (a) Primary Reading/Language Arts, (b) Middle and High School Literacy and (c) K-12 Mathematics and Science. Each evaluation will be prepared in accordance with proposed Regulation IL-R1 and will incorporate all available student assessment data relevant to the program being evaluated. Based on Paragraph F of the Compliance Remedy, the LRSD understands these evaluations must be submitted to the Court on or before March 15, 2004. Some may argue that Paragraph A and Paragraph C together require the LRSD to prepare new, comprehensive evaluations of all the programs identified on page 148 of the LRSD's Final Compliance Report. The Compliance Committee considered and rejected this argument for three reasons. First, Paragraph A's description of the programs to be evaluated differs from that of Paragraph C. Paragraph A states that the LRSD "must continue to assess each of the programs implemented under 2.7 ..." The Compliance Committee understands this to mean that the LRSD should continue to prepare evaluations of "some of the key programs," as identified in the Interim Compliance Report. See Opinion, p. 151 ("In addition to the "Assessment Plan," 2.
7.1 of the Interim Compliance Report noted that the LRSD was preparing "evaluations" of some of the key programs designed to improve African-American achievement in order to provide a more in-depth look at the effectiveness of those programs." (emphasis in original)). In contrast to Paragraph A, Paragraph C requires the LRSD to prepare evaluations "of each of the programs identified on page 148 of the Final Compliance Report." The Compliance Committee understands this to mean that the LRSD should complete all of the evaluations identified on page 148 of the Final Compliance Report and submit those to the Court. See Opinion, p. 156 ("[A]s of March 15, 2001, the date the Final Compliance Report was filed with the Court: (1) PRE had prepared only draft evaluations of some of the programs in question; (2) none of those evaluations had been approved by the Board ...." (emphasis in original)). The District Court's statement in Paragraph C that it will accept evaluations already completed and approved by the Board further indicates that Paragraph C does not require new, comprehensive evaluations. Second, recognizing this distinction between Paragraph A and Paragraph C resolves a potential conflict between Paragraph C and Paragraph F. Paragraph C provides, "All program evaluations that have not yet been completed on the remaining programs identified on page 148 of the Final Compliance Report must be prepared and approved by the Board as soon as practicable, but, in no event, later than March 15, 2003." However, Paragraph F does not require the LRSD to file a compliance report on its compliance with Revised Plan 2.7.1 until March 15, 2004. The Compliance Committee concludes that March 15, 2004, is the deadline for submitting the new, comprehensive evaluations of "the programs implemented pursuant to 2.7." See Paragraph A of Compliance Remedy. This is consistent with Paragraph A's requirement that the LRSD include assessment data through December 31, 2003.
Obviously, such data could not be included in an evaluation filed on or before March 15, 2003. Finally, it makes the most sense for the LRSD to expend the greatest time and resources preparing evaluations of the programs designed to improve African-American achievement. While the requirement for new, comprehensive evaluations derives from Paragraph A, some may argue that Paragraph C's requirement that the LRSD use an outside expert "to prepare evaluations of each of the programs identified on page 148 of the Final Compliance Report" applies to the new, comprehensive evaluations. The Compliance Committee hopes the District Court and the parties agree that the team approach to program evaluation set forth in proposed regulation IL-R1 renders this argument moot. Proposed Regulation IL-R1 states that the program evaluation team must include "[a]n external consultant with expertise in program evaluation, the program area being evaluated, statistical analysis, and/or technical writing ...." Appendix 1, p. 4. The exact role of the external consultant "may vary, depending upon the expertise required for the production of the program evaluation." Id. The Compliance Committee believes that the LRSD's practice over the last two years of using the team approach to program evaluation has produced credible evaluations. Moreover, participation of the LRSD staff on the evaluation team provides them an excellent learning experience that they do not typically receive when an evaluation is prepared entirely by an outside expert. The evaluations prepared over the last two years using the team approach are as follows: 1. Dr. Steve Ross was the external consultant in the production of the Early Literacy program evaluation for 1999-2000 and 2000-01. He was asked to read a near-final draft and to provide feedback, which he did. His suggestions were then incorporated into the final report before it was published and disseminated.
Other team members included Bonnie Lesley (associate superintendent), Patricia Price (program director), Pat Busbea (program specialist), Ed Williams (statistician), and Ken Savage (computer programmer). 2. Dr. Julio Lopez-Ferraro is the National Science Foundation ("NSF") program officer who oversees the LRSD's implementation of the grant-funded Comprehensive Partnership for Mathematics and Science Achievement ("CPMSA"). NSF trained a team of LRSD staff to produce the mandated annual program evaluations for this initiative and then assembled an external team of practitioners and researchers who came to the LRSD each year to validate our findings and provide written feedback. The LRSD team members who participated in writing of the annual progress reports included Vanessa Cleaver (project director), Dennis Glasgow (director of mathematics and science), Bonnie Lesley (associate superintendent and co-project investigator), Virginia Johnson (CPMSA program evaluator), Ed Williams (statistician), and Ken Savage (computer programmer). 3. Mr. Mark Vasquez, an attorney and former employee of the Office for Civil Rights in Dallas, has been retained by the LRSD for the past three years to provide guidance in the design and production of the English as a Second Language ("ESL") program evaluation. Other team members have been Bonnie Lesley (associate superintendent), Karen Broadnax (program supervisor), Ed Williams (statistician), Ken Savage (computer programmer), and Eddie McCoy (program evaluator). 4. Dr. Larry McNeal, a professor at the University of Arkansas at Little Rock in education administration and a private consultant in program evaluation, was retained by the LRSD to lead the team that produced the program evaluation for the Charter School. Other members of that team included Linda Watson (assistant superintendent), Krista Young (program director), and Ed Williams (statistician). Dr. McNeal wrote this report.
The team approach, supported by an external expert, ensures that all areas of expertise (program, implementation, technical and evaluative) are included. No one person would have all the knowledge and skills that a team would have. As these examples show, the external expert does not always perform the same role in every project. Rather, the role changes, depending on the expertise that is required for a credible report. 5. Submit for Board approval the program evaluations identified on page 148 of the LRSD's Final Compliance Report that have been completed, and complete, with the assistance of an outside expert, the remaining program evaluations identified on page 148 of the LRSD's Final Compliance Report. The following program evaluations identified on page 148 of the Final Compliance Report have been completed: 1. Early Literacy. A comprehensive report for 1999-2000 and 2000-01 was prepared, completed, and presented to the Board in fall 2001. An update to this report for 2001-02 was presented to the Board in June 2002, with an emphasis on the improved achievement of African-American students and closing the achievement gap. 2. Mathematics and Science. Three years (1998-99, 1999-2000, and 2000-01) of program evaluations as required by the NSF were prepared, presented to the Board, and submitted to NSF, and NSF has responded to each evaluation. 3. Extended Year Schools. The LRSD staff prepared, completed, and presented to the Board in the spring of 2002 an evaluation of the Extended Year Schools. 4. Elementary Summer School. The LRSD staff prepared, completed, and provided to the School Services Division an evaluation of elementary summer school programs for 2000-01. 5. HIPPY. The HIPPY program was evaluated by the LRSD staff in July 1999. The report was prepared, completed, and submitted to the program director and the Cabinet. 6. Charter School. This program evaluation was prepared, completed, and presented to the Board in June 2001. 7. ESL.
The Office for Civil Rights has required the LRSD to prepare a program evaluation in this area for each of the past three years: 1999-2000, 2000-01, and 2001-02. The first two of these reports have been prepared, completed, submitted to the Board, and submitted to OCR. (A third program evaluation will be completed in October when state scores arrive and will be ready by the March 15, 2003 deadline.)

8. Lyceum Scholars Program. Two separate evaluations of this alternative education school program were prepared by the LRSD staff.

9. Southwest Middle School's SEDL Program. Southwest Middle School was the recipient of a two-year technical assistance grant from the Southwest Educational Development Lab ("SEDL") to build professional community. SEDL prepared a comprehensive program evaluation that included Southwest among other grant recipients outside the LRSD. The LRSD staff provided SEDL data for this evaluation.

10. Onward to Excellence (Watson Elementary). A grant from ADE funded a partnership between Watson Elementary and the Northwest Educational Development Lab to implement a school improvement initiative. The LRSD staff provided data to Watson's principal for preparation of program evaluations. The principal submitted two annual program evaluations to ADE.

11. Collaborative Action Team ("CAT"). This one-year partnership with SEDL provided in 2000-01 for establishing and training a Collaborative Action Team of parent and community volunteers, supported by LRSD staff, to improve parent involvement. SEDL wrote a 249-page evaluation of its three-year grant-funded program, in which the LRSD was included only in the last year. The LRSD staff provided SEDL data for this evaluation.

12. Vital Link. The LRSD staff prepared a program evaluation, and it was provided to the project director.

A question arises as to which of these evaluations are acceptable to the Court without additional work. The first sentence of Paragraph C of the Compliance Remedy provides, "LRSD must use Dr.
Nunnerly (sic) or another expert from outside LRSD with equivalent qualifications and expertise to prepare program evaluations of each of the programs identified on page 148 of the Final Compliance Report." The second sentence of Paragraph C states that the District Court "will accept all program evaluations that have already been completed by Dr. Nunnerly (sic) or someone with similar qualifications." It is unclear whether an "expert from outside the LRSD" must have prepared the completed evaluations for them to be accepted by the District Court, or whether it is sufficient that they were prepared by someone within the LRSD with "similar qualifications."

The District Court's findings of fact suggest that the District Court will accept only program evaluations already completed by an outside expert. The District Court noted that Dr. Lesley testified "that, by the end of November 2000, it was her opinion that no one in PRE had the expertise to prepare program evaluations." Opinion, p. 153. Thus, the District Court likely concluded that the only acceptable program evaluations would be those prepared by persons outside the LRSD.

Applying this standard, the Compliance Committee believes that the following evaluations are acceptable to the Court, following Board approval, without additional work: Early Literacy, Mathematics and Science, Charter School, ESL, Southwest Middle School's SEDL Program, and CAT. The remaining program evaluations identified on the bottom of page 148 of the Final Compliance Report must be "completed" by an outside expert. They are: Extended Year Schools, Middle School Implementation, Elementary Summer School, HIPPY, Campus Leadership Teams ("CLTs"), Lyceum Scholars Program, Onward to Excellence, and Vital Link. The Compliance Committee's proposal for completing each of these evaluations will be discussed below.
In deciding how to go about completing these evaluations, the Compliance Committee focused on what makes sense to do at this time, considering the goal of improving African-American achievement and the limitations inherent in asking an expert to "complete" an evaluation.

Extended Year Schools. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

Middle School Implementation. A draft of this evaluation was presented to the Board in July and August 2000, but it was never completed. The Compliance Committee proposes retaining an outside expert to rewrite the report and, if possible, prepare an evaluation based on the existing data.

Elementary Summer School. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

HIPPY. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

CLTs. The LRSD staff conducted a survey of CLTs during 2000-01. A summary of the survey findings was presented during a CLT training session, but no formal report was ever prepared. The Compliance Committee proposes retaining an outside expert to review the survey data and, if possible, prepare an evaluation based on the existing survey data.

Lyceum Scholars Program. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

Onward to Excellence. This evaluation was completed by the LRSD staff.
The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

Vital Link. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

Action Plan Timeline

The Compliance Committee proposes implementation of this Compliance Plan in accordance with the following timeline. Each item lists the task, the timeline, and the responsible staff.

1. Provide copies of this proposed Compliance Plan to ODM and Joshua for their reactions. Week of September 30, 2002. Clay Fendley, Ken James.

2. Incorporate, as possible, suggested revisions from ODM and Joshua. Week of October 7, 2002. Attorneys, Ken James, Compliance Team.

3. Place Compliance Plan on the agenda for Board review and approval. October 10, 2002. Ken James, Attorneys.

4. Place 2002-03 Program Evaluation Agenda on the Board's agenda for review and approval. October 24, 2002. Ken James, Bonnie Lesley.

5. Place on Board agenda for approval two previously presented program evaluations (early literacy and charter school). October 24, 2002. Bonnie Lesley, Linda Watson.

6. Place on Board agenda for approval the evaluations of Southwest Middle School's SEDL program and the Collaborative Action Team (also conducted by SEDL). November 2002. Bonnie Lesley.

7. Place on Board agenda for approval the previously presented ESL program evaluations for 1999-2000 and 2000-01, plus the new evaluation for 2001-02. November 2002. Bonnie Lesley, Karen Broadnax.

8. Place on Board agenda for approval the three previously presented program evaluations for the NSF-funded CPMSA program, plus the new Year 4 report for 2001-2002. December 2002. Bonnie Lesley, Vanessa Cleaver, Dennis Glasgow.

9. Issue Requests for Proposals (RFPs) from available external experts to review and complete the eight remaining program evaluations listed on page 148. Mid-October 2002. Bonnie Lesley, Darral Paradis.

10. Form a screening team to determine recommendations to the Superintendent for designating external experts to review and complete the eight remaining program evaluations listed on page 148. Late October 2002. Ken James, Compliance Team.

11. Select and negotiate consulting contracts with designated external experts. Mid-November 2002. Bonnie Lesley.

12. Assign appropriate staff to each external expert to provide needed information, data, access to program staff, etc. Mid-November 2002. Ken James, Bonnie Lesley.

13. Monitor the work to ensure timely completion. Mid-November 2002-February 2003. Bonnie Lesley.

14. As each paper is completed and ready for circulation, send copies to ODM and Joshua for their review and comments. December 2002-February 2003. Bonnie Lesley.

15. As each paper is completed, place on the Board's agenda the item to be reviewed and approved. December 2002-February 2003. Ken James, Bonnie Lesley.

16. Write Interim Compliance Report relating to programs on page 148 to be completed. March 15, 2003. Attorneys, Compliance Committee.

17. Establish staff teams for each of the three programs on the Board's Program Evaluation Agenda to be completed for 2002-2003 (Elementary Literacy, Secondary Literacy, and K-12 Mathematics/Science). March 1, 2003. Bonnie Lesley.

18. Publish RFPs to identify external experts to serve on each of the two staff teams for the Board's Program Evaluation Agenda (K-12 mathematics/science external experts are provided by NSF). March 1, 2003. Bonnie Lesley, Darral Paradis.

19. Establish consulting contracts with the two external experts required for the Elementary Literacy and Secondary Literacy program evaluations. Late March 2003. Bonnie Lesley.

20. Train each program evaluation team, including the external expert, on the requirements of the approved Compliance Plan and IL-R. May 2003. Bonnie Lesley.

21. Monitor the completion of the work on all three program evaluations required in the Board's Program Evaluation Agenda. May-October 2003. Bonnie Lesley.

22. Send copies of the completed Elementary Literacy program evaluation to ODM and Joshua for information. With October 2003 Board agenda packet. Ken James, Bonnie Lesley.

23. Complete the evaluation of the Elementary Literacy program and place on the Board's agenda for approval. October board meeting, 2003. Bonnie Lesley, Pat Price.

24. Send copies of the Secondary Literacy program evaluation to ODM and Joshua for information. With November 2003 Board agenda packets. Ken James, Bonnie Lesley.

25. Complete the evaluation of the Secondary Literacy program and place on the Board's agenda for approval. November board meeting, 2003. Bonnie Lesley, Pat Price.

26. Send copies of the completed CPMSA program evaluation to ODM and Joshua for information. With December 2003 Board agenda packet. Ken James, Bonnie Lesley.

27. Complete the five-year evaluation of the CPMSA project (science and mathematics) and place on the Board's agenda for approval. December board meeting, 2003. Bonnie Lesley, Vanessa Cleaver, Dennis Glasgow.

28. Write Section 2.7.1 Final Compliance Report for federal court and file with Court. March 15, 2004. Ken James, Attorneys, Compliance Team.

Appendix 1
Proposed IL-R1

LITTLE ROCK SCHOOL DISTRICT
NEPN CODE: IL-R1

PROGRAM EVALUATION AGENDA

Purpose

The purpose of these regulations is to provide guidance to the staff involved in the evaluation of programs required in the Board's Program Evaluation Agenda. They do not necessarily apply to grant-funded programs if the funding source requires other procedures and provides funding for a required evaluation.
Criteria for Program Evaluations

Policy IL specifies that the evaluations of programs approved in its Board-approved Program Evaluation Agenda shall be conducted according to the standards developed by the Joint Committee on Standards for Educational Evaluation. (See Joint Committee on Standards for Educational Evaluation, James R. Sanders, Chair (1994). The Program Evaluation Standards, 2nd Edition: How to Assess Evaluations of Educational Programs. Thousand Oaks, CA: Sage Publications.) They are as follows:

Utility Standards

The utility standards are intended to ensure that an evaluation will serve the information needs of intended users. These standards are as follows:

Stakeholder identification. People involved in or affected by the evaluation should be identified so that their needs can be addressed.

Evaluator credibility. The people conducting the evaluation should be both trustworthy and competent to perform the evaluation so that the evaluation findings achieve maximum credibility and acceptance.

Information scope and sequence. Information collected should be broadly selected to address pertinent questions about the program and should be responsive to the needs and interests of clients and other specified stakeholders.

Values identification. The perspectives, procedures, and rationale used to interpret the findings should be described carefully so that the bases for value judgments are clear.

Report clarity. Evaluation reports should describe clearly the program being evaluated, including its context and the purposes, procedures, and findings of the evaluation, so that essential information is provided and understood easily.

Report timeliness and dissemination. Significant interim findings and evaluation reports should be disseminated to intended users so that they can be used in a timely fashion.

Evaluation impact.
Evaluations should be planned, conducted, and reported in ways that encourage follow-through by stakeholders, so that the likelihood that the evaluation will be used is increased.

Feasibility Standards

Feasibility standards are intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.

Practical procedures. Evaluation procedures should be practical so that disruption is kept to a minimum while needed information is obtained.

Political viability. The evaluation should be planned and conducted with anticipation of the different positions of various interest groups so that their cooperation may be obtained, and so that possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted.

Cost-effectiveness. The evaluation should be efficient and produce information of sufficient value so that the resources expended can be justified.

Propriety Standards

The propriety standards are intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.

Service orientation. Evaluations should be designed to assist organizations to address and effectively serve the needs of the full range of targeted participants.

Formal agreements. Obligations of the formal parties to an evaluation (what is to be done, how, by whom, and when) should be agreed to in writing so that these parties are obligated to adhere to all conditions of the agreement or to formally renegotiate it.

Rights of human subjects. Evaluations should respect human dignity and worth in their interactions with other people associated with an evaluation so that participants are not threatened or harmed.

Complete and fair assessments.
The evaluation should be complete and fair in its examination and recording of strengths and weaknesses of the program being evaluated so that strengths can be built upon and problem areas addressed.

Disclosure of findings. The formal parties to an evaluation should ensure that the full set of evaluation findings, along with pertinent limitations, are made accessible to the people affected by the evaluation, as well as any others with expressed legal rights to receive the results.

Conflict of interest. Conflict of interest should be dealt with openly and honestly so that it does not compromise the evaluation processes and results.

Fiscal responsibility. The evaluator's allocation and expenditure of resources should reflect sound accountability procedures and be prudent and ethically responsible so that expenditures are accounted for and appropriate.

Accuracy Standards

Accuracy standards are intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated.

Program documentation. The program being evaluated should be described and documented clearly and accurately so that the program is identified clearly.

Context analysis. The context in which the program exists should be examined in enough detail so that its likely influences on the program can be identified.

Described purposes and procedures. The purposes and procedures of the evaluation should be monitored and described in enough detail so that they can be identified and assessed.

Defensible information sources. The sources of information used in a program evaluation should be described in enough detail so that the adequacy of the information can be assessed.

Valid information. The information-gathering procedures should be chosen or developed and then implemented in a manner that will ensure that the interpretation arrived at is valid for the intended use.

Reliable information.
The information-gathering procedures should be chosen or developed and then implemented in a manner that will ensure that the information obtained is sufficiently reliable for the intended use.

Systematic information. The information collected, processed, and reported in an evaluation should be reviewed systematically so that the evaluation questions are answered effectively.

Analysis of quantitative information. Quantitative information in an evaluation should be analyzed appropriately and systematically so that the evaluation questions are answered effectively.

Analysis of qualitative information. Qualitative information in an evaluation should be analyzed appropriately and systematically so that the evaluation questions are answered effectively.

Justified conclusions. The conclusions reached in an evaluation should be justified explicitly so that stakeholders can assess them.

Impartial reporting. Reporting procedures should guard against distortion caused by personal feelings and biases of any party so that the evaluation reports reflect the evaluation findings fairly.

Metaevaluation. The evaluation itself should be evaluated formatively and summatively against these and other pertinent standards so that its conduct is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses.

Program Evaluation Procedures

The following procedures are established for the evaluation of programs approved by the Board of Education in its annual Program Evaluation Agenda:

1. The Division of Instruction shall recommend to the Superintendent annually, before the budget for the coming year is proposed, the curriculum/instruction programs for comprehensive program evaluation. The recommendation shall include a proposed budget, a description of other required resources, and an action plan for the completion of the reports. Criteria for the proposed agenda are as follows:

A. Can the results of the evaluation influence decisions about the program?
B. Can the evaluation be done in time to be useful?

C. Is the program significant enough to merit evaluation?

(See Joseph S. Wholey, Harry P. Hatry, and Kathryn Newcomer (1994). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass Publishers. 5-7.)

2. The Superintendent shall recommend to the Board of Education for approval the proposed Program Evaluation Agenda, with anticipated costs and an action plan for completion.

3. For each curriculum/instruction program to be evaluated as per the Program Evaluation Agenda, the Associate Superintendent for Instruction shall establish a staff team with a designated leader to assume responsibility for the production of the report according to the timelines established in the action plan approved by the Board of Education.

4. Each team shall include, at a minimum, one or more specialists in the curriculum/instruction program to be evaluated, a statistician, a programmer to assist in data retrieval and disaggregation, and a technical writer. If additional expertise is required, then other staff may be added as necessary.

5. An external consultant with expertise in program evaluation, the program area being evaluated, statistical analysis, and/or technical writing shall be retained as a member of the team. The role of the external consultant may vary, depending upon the expertise required for the production of the program evaluation.

6. The team leader shall establish a calendar of regularly scheduled meetings for the production of the program evaluation. The first meetings will be devoted to the following tasks:

A. Provide any necessary training on program evaluation that may be required for novice members of the team, including a review of the Board's policy IL and all of the required criteria and procedures in these regulations, IL-R.

B.
Assess the expertise of each team member and make recommendations to the Associate Superintendent for Instruction related to any additional assistance that may be required.

C. Write a clear description of the curriculum/instruction program that is to be evaluated, with information about the schedule of its implementation.

D. Agree on any necessary research questions that need to be established in addition to the question, "Has this curriculum/instruction program been effective in improving and remediating the academic achievement of African-American students?" (See Policy IL, 2.7.1 of the Revised Desegregation and Education Plan, and Judge Wilson's Compliance Remedy.)

E. Generate a list of the data required to answer each research question, and assign responsibility for its collection and production. All available and relevant student performance data must be included. (See Judge Wilson's Compliance Remedy.)

F. Decide who will be the chief writer of the program evaluation.

G. Plan ways to provide regular progress reports (e.g., dissemination of meeting minutes, written progress reports, oral reports to the Superintendent's Cabinet and/or Compliance Team) to stakeholders, including the Associate Superintendent for Instruction, the Superintendent of Schools, the Office of Desegregation Monitoring (until Unitary Status is achieved), and the Joshua Intervenors (until Unitary Status is achieved). (See Joellen Killion (2002). Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council (NSDC); Robby Champion (Fall 2002). "Map Out Evaluation Goals." Journal of Staff Development. 78-79; Thomas R. Guskey (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin Press; Blaine R. Worthen, James R. Sanders, and Jody L. Fitzpatrick (1997). "Participant-Oriented Evaluation Approaches." Program Evaluation: Alternative Approaches and Practical Guidelines. 153-169; Beverly A. Parsons (2002).
Evaluative Inquiry: Using Evaluation to Promote Student Success. Thousand Oaks, CA: Corwin Press; and Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (1994). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass Publishers.)

7. Subsequent meetings of the program evaluation team are required for the following tasks: to monitor the completion of assignments; to collaborate in the interpretation and analysis of data; to pose any necessary new questions to be answered; to review drafts and provide feedback to the writer; to formulate recommendations, as required, for program improvement, especially to decide if a recommendation is required to modify or abandon the program if the findings reveal that the program is not being successful for the improvement of African-American achievement; to assist in final proofreading; and to write a brief executive summary, highlighting the program evaluation findings and recommendations.

8. A near-final copy of the program evaluation must be submitted to the Associate Superintendent for Instruction at least one month before the deadline for placing the report on the Board's agenda for review and approval. This time is required for final approval by staff, for final editing to ensure accuracy, and for submission to the Superintendent.

9. When the program evaluation is approved for submission to the Board of Education for review and approval, copies of the Executive Summary and complete report must be made for them, for members of the Cabinet, for ODM (until Unitary Status is achieved), and for the Joshua Intervenors (until Unitary Status is achieved).

10. The program evaluation team shall plan its presentation to the Board of Education on the findings and recommendations.

11. The Associate Superintendent for Instruction shall prepare the cover memorandum to the Board of Education, including all the required background information (see Judge Wilson's "Compliance Remedy"):

A.
If program modifications are suggested, the steps that the staff members have taken or will take to implement those modifications. If abandonment of the program is recommended, the steps that will be taken to replace the program with another with more potential for the improvement and remediation of African-American students. (See Section 2.7.1 of the Revised Desegregation and Education Plan and Judge Wilson's Compliance Remedy.)

B. Names of the administrators who were involved in the program evaluation.

C. Name and qualifications of the external expert who served on the evaluation team.

D. Grade-level descriptions of the teachers who were involved in the assessment process (e.g., all fourth-grade math teachers, all eighth-grade English teachers, etc.).

12. When the program evaluation is approved by the Board of Education, the team must arrange to have the Executive Summary and the full report copied and design a plan for communicating the program evaluation findings and recommendations to other stakeholders. This plan must then be submitted to the Associate Superintendent for approval.

13. Each program evaluation team shall meet with the Associate Superintendent for Instruction after the completion of its work to evaluate the processes and product and to make recommendations for future program evaluations. (See Joellen Killion (2002). "Evaluate the Evaluation." Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council. 46, 123-124.)

Appendix 2
Proposed IL-R2

LITTLE ROCK SCHOOL DISTRICT
NEPN CODE: IL-R2

INFORMAL PROGRAM EVALUATION

Introduction

The purpose of this regulation is to ensure that a written record exists explaining a decision to significantly modify an academic program. It is not the intent of this regulation to require a formal program evaluation before every significant program modification.
Definitions

"Academic Program" means one of the core curriculum programs of English/Language Arts, Mathematics, Science, or Social Studies.

"Significantly modify" means a material change in the content or delivery of an academic program implemented throughout the entire District.

Written Record

A written record must be prepared and maintained explaining a decision to significantly modify an academic program. The written record required by this regulation must include the following information: (a) the written criteria used to evaluate the program; (b) a summary of the student assessment data or other data on which the decision was based; and (c) the names of the administrators who were involved with the evaluation of each program, as well as at least a grade-level description of any teachers who were involved in the evaluation process (e.g., all fourth-grade math teachers; all eighth-grade English teachers, etc.).

Appendix 3

Robby Champion, "Map Out Evaluation Goals," Journal of Staff Development, Fall 2002, 78.

Map out evaluation goals
A master plan can guide you down the rocky path of evaluation

When you launch a major professional development evaluation, regardless of the project's scope, you may quickly find yourself on a slippery, often rocky road, with twists and unexpected turns. Before venturing too far and becoming disillusioned about program evaluation, create a master plan. While it requires an upfront investment of time and may delay starting, it quickly becomes an invaluable road map that helps you avoid delays and detours along the way. Developing an evaluation master plan is most useful when you are launching a major, summative program evaluation. A "summative" evaluation is done at major junctures in a program's life cycle and emphasizes
documenting impact. Information from summative evaluations is used to make important decisions about the initiative, such as whether to continue, alter, expand, downsize, or eliminate it. A "formative" evaluation, on the other hand, means monitoring and collecting data, often informally and spontaneously, throughout program implementation. Formative evaluation helps show implementers where to make adjustments so a program can eventually achieve significant results.

A thoughtfully prepared master plan for a major evaluation effort would:

Focus the evaluation effort and help implementers avoid being sidetracked by leadership changes and new opinions;

Create a realistic timeline and work plan that provides needed momentum for the work;

Be a key informational document to provide an overview and answer specific questions throughout the process;

Help recruit people to assist with the project on the myriad evaluation tasks;

Give the message that the evaluation will be open and not secretive.

(Robby Champion is president of Champion Training &amp; Consulting. You can contact her at Champion Ranch at Trumbell Canyon, Mora, NM 87732, (505) 387-2016, fax (505) 387-5581, e-mail: Robbychampion@aol.com.)

Whether your evaluation must be completed within a few months or will extend for several years, think through four phases of work before starting.

PHASE I: ORGANIZE THE PROCESS

1. Form a steering committee, including any needed outside expertise.
2. Learn more about program evaluation together.
3. Write a clear description of each program to be evaluated.
4. Agree on the primary purpose of the evaluation.
5. Plan how you will keep everyone informed along the way.

Steering committees, charged specifically with program evaluation, are important to focus attention and maintain the energy and momentum needed for the evaluation. They also help build a spirit of collaboration and open inquiry.
And they keep the evaluation on track when other priorities might push the effort aside.

Provide steering committee members with the tools to succeed. Members need not be evaluation experts, but they do need information, support, and guidance to make informed decisions. They need background material to learn about program evaluation and examples of good evaluation studies. Finally, they need access to experts on professional development, measurement, and the content areas of the training programs.

Before launching any evaluation effort, have a written description of each program to be evaluated. You would be amazed at the number of people who do not have a clear idea of what you mean by the "New Teacher Induction Program" or the "Early Literacy Initiative" since so many different initiatives are being undertaken simultaneously around the school or district.

PHASE II: DESIGN THE EVALUATION

1. Generate questions to guide the evaluation.
2. Generate potential data sources/instruments to address the questions.
3. Using a matrix to provide a bird's-eye view, agree on the most important questions and the best data sources.
4. Decide if collecting data from a sample group is warranted to make the evaluation manageable.
5. Determine the evaluation approach that makes sense: quantitative vs. qualitative/naturalistic.
6. Gather or create the instruments for data collection.
7. Determine a realistic schedule for collecting data.
8. Create a system for collecting, analyzing, and interpreting data.

Decisions made in Phase II are critical. They determine the technical quality of your evaluation. In the questions you select, you determine what to examine and what to ignore. When you finish with the design phase, your program evaluation will be shaped to use a quantitative or a qualitative model, or a mixture of the two.
In the design phase, you make other major decisions, such as whether to use a sample group. You also decide whether to do an in-depth case study or survey the whole population, whether to use examples of student work instead of official documents such as student grades or standardized test scores, or whether to judge adult learners' understanding of the training content with performance tasks during training or by exit tests, classroom observations, or student feedback. ON THE WEB: See an example of a matrix to help guide evaluations at www.nsdc.org/library/jsd/champion234.html. If the programs to be evaluated already have stated indicators of long-term impact, generating appropriate evaluation questions is much simpler than when programs have only vague, lofty goals. The steering committee may drift into the realm of program planning as you encounter hurdles like fuzzy program outcomes. To avoid making misinformed evaluation design decisions, involve program leaders in your discussions. Developing or gathering instruments and then collecting the data are the most expensive steps in any evaluation. Think strategically about which data to collect, from whom to collect it or where to find it, and the best time to collect it. Your organization may already be collecting data for another purpose that now can be used for program evaluation. Some public records, such as student attendance, may be valuable if, for example, "20% increase in student attendance at all grade levels" is one of your program's indicators of impact. PHASE III: PREPARE TO REPORT 1. Determine which audiences will want to know the results. 2. Consider several forums and formats to disseminate the results. 3. Plan reports, presentations, photo displays, graphs, charts, etc. Remember that your job is to make the evaluation results useful to your organization, so consider a range of ways to provide information to various groups. 
Consider briefs in the school or district newsletter, a handout updating staff about the schedule for data collection, five-minute progress updates in faculty meetings, bulleted statements on your web site, a digital picture album of the program's results in classrooms with photos of students, and hallway displays of student work. If your final report is a formal document complete with examples of your data collection instruments, consider writing an executive summary of five pages or less to help readers get the essential information. PHASE IV: CREATE THE WORK PLAN 1. List all tasks to be completed for the whole evaluation. 2. Create a realistic timeline. 3. Assign work. 4. Distribute the master plan. You will have to be creative to accomplish all the evaluation tasks. In education, we rarely have the luxury of contracting outsiders for the entire project. Enlist steering committee members, partners, graduate students from the local university, and other talented critical friends to get the work done. One caution: For formal or summative evaluations to be credible, avoid using insiders such as the program designers or implementers (coaches, mentors, trainers, or facilitators) to perform critical evaluation tasks that call for objectivity and distance. And be sure to get ongoing, high-quality technical expertise for the critical technical analysis. A CATALYST FOR REFLECTION Completing a major program evaluation usually serves as the catalyst for serious reflection on the current designs, policies, and practices of your professional development programs: their goals, content, processes, and contexts. In fact, revelations are often so powerful that they bring about the realization that major changes are needed if significant results are really expected from professional development. 
People frequently conclude that designing the evaluation should be the first step in the program planning process, rather than an afterthought during implementation. JOHN W. WALKER SHAWN CHILDS JOHN W. WALKER, P.A. ATTORNEY AT LAW 1723 BROADWAY LITTLE ROCK, ARKANSAS 72206 TELEPHONE (501) 374-3758 FAX (501) 374-4187 OF COUNSEL ROBERT McHENRY, P.A. DONNA J. McHENRY 8210 HENDERSON ROAD LITTLE ROCK, ARKANSAS 72210 PHONE: (501) 372-3425 FAX (501) 372-3428 EMAIL: mchenryd@swbell.net Mr. Chris Heller Friday, Eldredge &amp; Clark 2000 Regions Center 400 West Capitol Little Rock, AR 72201 Via Facsimile - 376-2147 October 10, 2002 Re: Little Rock School District v. PCSSD, et al. Case No. 4:82CV00866 Dear Chris: This refers to your letter of October 4, 2002, providing LRSD's proposed Compliance Plan. The court's remedy and the general subject matter are too complex for us to provide all comments and objections we may ultimately have before today's Board meeting. We do note the following: 1. More consideration is needed of the programs to be identified as "implementation pursuant to Section 2.7 . .. ", which are to be subjected to a "comprehensive program evaluation . . . " Your document at page 7 identifies three areas. We note the absence of specific reference and detail regarding interventions / "scaffolding" -- areas of vital importance given the achievement patterns of African American students. We note also that the LRSD compliance report cited many more programs as designed to fulfill Section 2.7. 2. In a discussion prior to his testimony in the hearing before Judge Wilson, we understood Dr. Ross to indicate that the existing evaluation of the Pre-K - 2 literacy program was not adequate. The notation at page 4 of your document of the changed use of the Observation Survey and the DRA relates to part of the concerns he expressed. This undermines the LRSD argument (page 11) that the existing evaluation, upon Board approval, will satisfy a part of the court's remedy. 
3. The LRSD discussion about satisfying the court's order regarding the evaluations mentioned at page 148 of the compliance report does not seem to take account of the material provided, which describes an adequate evaluation. 4. We question the period for implementation of a remedy which the court has identified and, therefore, the LRSD schedule. Once again, these comments should not be taken to be the full range of concerns, which Joshua may ultimately have about the court's remedy and the Compliance Plan. Nor do we intend to waive our concerns about the court setting forth a remedy, without first hearing from the parties and the ODM with regard to the court's views on an appropriate remedy. JWW:js cc: Ms. Ann Marshall All Counsel of Record EXHIBIT 5 OCT.24.2002 9:06AM JOHN W WALKER PA NO.963 P.2 JOHN W. WALKER SHAWN CHILDS JOHN W. WALKER, P.A. ATTORNEY AT LAW 1723 BROADWAY LITTLE ROCK, ARKANSAS 72206 TELEPHONE (501) 374-3758 FAX (501) 374-4187 OF COUNSEL ROBERT McHENRY, P.A. DONNA J. McHENRY 8210 HENDERSON ROAD LITTLE ROCK, ARKANSAS 72210 PHONE: (501) 372-3425 FAX (501) 372-3428 EMAIL: mchenryd@swbell.net Mr. Christopher Heller FRIDAY, ELDREDGE &amp; CLARK 400 W. Capitol, Suite 2200 Little Rock, Arkansas 72201 October 23, 2002 Re: LRSD v. PCSSD Dear Chris: This letter sets forth additional comments of the Joshua Intervenors concerning the LRSD Compliance Plan. We are offering these comments, although we are unable to discern that the comments we offered earlier were given consideration. 1. In using historical student assessment results, attention should be given to the quality of the data. In the past, LRSD has used results on the DRA and the Observation Survey in ways not consistent with the purposes of those instruments. 
In addition, because teachers provided scores for their own students, the past use made of the data was in conflict with the district's recognition in the newly enacted Regulation IL-R1 that "Conflict of Interest" must be avoided. 2. We are concerned about the manner in which the regulation describes the "team" process for preparing evaluations, again in the context of "conflict of interest." In order to insure that "conflict of interest" is avoided, the "external consultant" needs to write the report and control the context of the analysis. Paragraphs 3, 5 and 6 of the "Program Evaluation Procedures" do not guarantee that the external expert will have these roles. Of course, if reports were prepared in the manner which we describe, there would be no bar to LRSD staff preparing comments to the Board with a differing interpretation of the evaluation results. 3. We continue to be concerned about the global, general manner in which the content of planned evaluations is described (page 7 of the document, first paragraph). For example, the Board has adopted a policy and two regulations dealing with remediation for students whose performance is below par. Studying the actual implementation of these standards (in all or a representative sample of schools) is of vital importance to the Intervenor class because class members are so much more likely than other students to exhibit unsatisfactory performance on the Benchmark and Stanford Achievement Tests. A satisfactory description by the School Board of the evaluations which it requires the staff to undertake should make clear that the actual implementation of remediation activities in district schools is to receive careful consideration. EXHIBIT C 10/24/2002 THU 09:03 [TX/RX NO 8580] This is surely an important contextual factor (see "Accuracy Standards," para. 2). 4. 
We understand from the Plan that the LRSD plans evaluations of programs deemed to be particularly directed to achievement of African American students for the indefinite future, not simply for the period necessary to satisfy the court. We would like to receive the Board's assurance that this is the case. We would appreciate your providing this letter to the Superintendent and the members of the school board. JWW:lp cc: All Counsel Ms. Ann Marshall Judge Thomas Ray IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION RECEIVED MAY 2 n 2004 OFFICE OF DESEGREGATION MONITORING LITTLE ROCK SCHOOL DISTRICT PLAINTIFF V. LR-C-82-866 PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL DEFENDANTS MRS. LORENE JOSHUA, ET AL INTERVENORS KATHERINE KNIGHT, ET AL INTERVENORS PLAINTIFF'S NOTICE OF FILING DOCUMENTS IN RESPONSE TO THE COURT'S ORDER FILED MAY 12, 2004 Plaintiff Little Rock School District ("LRSD") for its Notice of Filing states: 1. In response to the Court's Order filed May 12, 2004, attached is a Glossary of Acronyms and Educational Terms. Respectfully Submitted, LITTLE ROCK SCHOOL DISTRICT FRIDAY, ELDREDGE &amp; CLARK Christopher Heller (#81083) 2000 Regions Center 400 West Capitol Page 1 of 2 CERTIFICATE OF SERVICE I certify that a copy of the foregoing has been served on the following people by depositing a copy of same in the United States mail on May 24, 2004: Mr. John W. Walker JOHN W. WALKER, P.A. 1723 Broadway Little Rock, AR 72201 Mr. Robert Pressman 22 Locust Avenue Lexington, MA 02173 Mr. Sam Jones Wright, Lindsey &amp; Jennings 2200 Nations Bank Bldg. 200 West Capitol Little Rock, AR 72201 Mr. Steve Jones JACK, LYON &amp; JONES, P.A. 425 W. Capitol, Suite 3400 Little Rock, AR 72201-3472 Mr. Mark T. Burnette Attorney at Law 1010 W. 3rd Little Rock, AR 72201 Ms. Ann Marshall Desegregation Monitor 1 Union National Plaza 124 W. 
Capitol, Suite 1895 Little Rock, AR 72201 Judge J. Thomas Ray U.S. District Courthouse 600 West Capitol Little Rock, AR 72201 Mr. Mark A. Hagemeier Office of the Attorney General 323 Center Street 200 Tower Building Little Rock, AR 72201 Page 2 of 2 GLOSSARY OF ACRONYMS AND EDUCATIONAL TERMS Below are identifications and/or definitions of acronyms and other educational terms that appear in exhibits. While most of the acronyms and terms are generically defined and equally applicable to most school districts in Arkansas, many are defined specifically in relation to the Little Rock School District. ACSIP (Arkansas Comprehensive School Reform Improvement Plan) - Plan required by the State which specifically sets steps for school improvement. AFRAMER (African-American) ALP (Alternative Language Program) - Another name for ESL. ALT (Achievement Level Tests) - Tests the LRSD developed, with the assistance of a commercial testing firm, for the purpose of measuring student achievement growth within a school year. The test items were selected from a menu in the test firm's item bank, so all the questions had been used numerous times in schools across the country. Students in grades 3-11 took these tests in the fall and spring of each year. The LRSD discontinued the ALTs in September 2002. ANCOVA (Analysis of Covariance) ANOVA (Analysis of Variance) - Statistical test with one outcome. AP (Advanced Placement) - High-level courses with curriculum developed by the College Board which allow students to test for earned college-level credit while in high school. AR (Accelerated Reader) - A program based on the premise that students become more motivated to read if they are tested on the content of the books they have read and are rewarded for correct answers. Students read books at predetermined levels of difficulty, individually take a test on a computer, and receive some form of reward when they score well. 
AYP (Adequate Yearly Progress) - Amount of improvement in proficiency required each year to reach total proficiency under NCLB (2013). Benchmark Examination - One of the criterion-referenced examinations implemented by the Arkansas Department of Education (ADE) for all Arkansas public schools in the 4th, 6th, 8th, and 11th grades and in selected high school courses. The tests are based on the state's curriculum as outlined in the curriculum frameworks. Test results are categorized as Below Basic, Basic, Proficient, and Advanced. BL (Balanced Literacy) - An approach to literacy instruction that focuses on providing instruction that addresses students' individual strengths and needs through whole group and flexible grouping to enhance student development in all of the language arts areas: reading, writing, spelling, listening, and speaking. CAP (Concepts About Print) - One of the assessments included in the Observation Survey Assessment which assesses children's knowledge of book concepts. CAT (Collaborative Action Team) - A process designed to increase stakeholders' involvement in schools. CBL (Calculator-Based Laboratories) - Probes used to collect data for classrooms. CLT (Campus Leadership Teams) - A term used to refer to school-based leadership committees. CMP (Connected Mathematics Project) - Mathematics curriculum resource used in grades 6-8 in the Little Rock School District. CREP (Center for Research in Educational Policy) - An organization based at the University of Memphis that conducts program evaluations for educational organizations. Dr. Steve Ross and Dr. John Nunnery are two researchers for CREP. CRT (Criterion Referenced Tests) - Tests that LRSD curriculum specialists, teachers, and other staff developed using the state's curriculum frameworks and the district's curriculum to guide item development. 
CSR (Comprehensive School Reform) - A whole school reform model. DI (Direct Instruction) - A reading program that uses very explicit instructional language and follows a highly prescriptive program of instruction that is implemented according to a predetermined scope and sequence of skills. DIBELS (Dynamic Indicators of Basic Early Literacy Skills) - A system utilizing a variety of assessments to monitor a child's progress in developing specific literacy skills which have predictive value for future reading achievement. The assessments include, but are not limited to, letter identification, phoneme segmentation, and oral reading fluency. DRA (Developmental Reading Assessment) - The second of two assessments given to LRSD students in grades K-2. This assessment consists of stories that increase in difficulty as the child's reading ability increases. Students are evaluated on a variety of reading skills, including comprehension. DSA (Developmental Spelling Assessment) - An assessment to monitor student progress along a spelling developmental continuum. ELLA (Early Literacy Learning in Arkansas) - A statewide three-year staff development process designed to assist teachers in grades K-2 in implementing instructional techniques that support emergent learners. ELLA helps enhance teachers' understanding of how students learn to read and encourages them to use a balanced literacy approach in the classroom. EOC (End-of-Course Exam) - State-developed criterion-referenced tests implemented in Arkansas schools as part of the Arkansas Comprehensive Testing, Assessment, and Accountability Program (ACTAAP). Currently, end-of-course exams are administered only in Algebra I and geometry. EXPLORE - An American College Testing (ACT) program designed to help 8th and 9th graders examine a broad range of options for their future. EXPLORE helps prepare students for their high school course work as well as their post-high school choices. 
ESL (English as a Second Language) - Refers to students for whom English is not their native language. EYE (Extended Year Education) - Applies to schools with atypical school calendars without a long summer break. FEPE (Fluent English Proficient Exited) - Students who are released from the ESL program due to proficiency in English. GT (Gifted and Talented) HBE (Home-Based Educators) - Employees of the Home Instruction for Parents of Preschool Youngsters (HIPPY) Program. HIPPY (Home Instruction for Parents of Preschool Youngsters) - A parent-involvement readiness program for young children. The program, which has been operating in the United States since 1984, offers home-based early childhood education for three-year-old children, working with their parent(s) as their first teacher. The HIPPY program provides parents with carefully developed materials, curriculum, and books designed to strengthen their children's early literacy skills and their social, emotional, and physical development. HLM (Hierarchical Linear Model) HSCP (Home, School, and Community Partnership) - A precursor to the Collaborative Action Team (CAT). HSTW (High Schools That Work) - A school-wide reform model for high schools that is based on the key practices of successful high schools. IRC (Instructional Resource Center) - Offices of curriculum staff for LRSD. ITBS (Iowa Tests of Basic Skills) - Norm-referenced assessment currently used by LRSD, replacing the Stanford Achievement Test. JR TEAMS (Joint Recruiting and Teaching for Effecting Aspiring Minorities in Science) - A two-week multidisciplinary pre-college science and engineering program offered through a partnership with the University of Arkansas at Little Rock aimed at increasing the number of minority students pursuing degrees in science and engineering. 
LEP (Limited English Proficient) - Identifies students not proficient in English. LPAC (Language Proficiency Assessment Committee) LPTQ (Literacy Program Teacher Questionnaire) MANOVA (Multiple Analysis of Variance) - Statistical tests with multiple outcomes. MSS (Middle School Survey) - A survey completed by teachers and students on the implementation of the middle school model. NALMS (Not Assessed Language Minority Students) NCE (Normal Curve Equivalent) - A type of standard score, NCE scores are normalized standard scores on an equal interval scale from 1 to 99, with a mean of 50. The NCE was developed by RMC Research Corporation in 1976 to measure the effectiveness of the Title I Program across the United States. An NCE gain of 0 means that the Title I Program produced only an average gain, or the expected gain if there was no Title I Program. (Students must answer more items correctly on the posttest than on the pretest in order to maintain the same NCE.) All NCE gains greater than 0 are considered positive. NCLB (No Child Left Behind) - Federal legislation requiring vast assessment and increased standards for American public schools. NCTM (National Council of Teachers of Mathematics) - An organization of math teachers and specialists that has provided the standards for K-12 mathematics. NPR (National Percentile Rank) - National percentile ranks indicate the relative standing of a student in comparison with other students in the same grade in the norm (reference) group (in this case, the nation) who took the test at a comparable time. Percentile ranks range from a low of 1 to a high of 99, with 50 denoting average performance for the grade. The percentile rank corresponding to a given score indicates the percentage of students in the same grade in the norm group obtaining scores equal to or less than that score. 
For example, a student earning a percentile rank of 62 achieved a score that was equal to or better than the scores earned by 62% of the students in the national sample. NSES (National Science Education Standards) - The standards established for K-12 science education. NSF (National Science Foundation) - A government entity created in 1950 to promote excellence in science and to fund research. The LRSD received funds from NSF through a multiyear grant to improve mathematics and science instruction and achievement, naming the program Comprehensive Partnerships for Mathematics and Science Achievement (CPMSA). Grant funding ended August 31, 2003. NWEA (Northwest Evaluation Association) - A company that developed the Achievement Level Tests. OTE (Onward to Excellence) - A whole school restructuring model. PD (Professional Development) - Term used to describe the training provided to teachers to enhance their instructional or classroom management skills. PHLOTE (Primary Home Language Other Than English) PLAN - An American College Testing (ACT) guidance resource for 10th graders. PLAN helps students measure their current academic development, explore career or training options, and make plans for the remaining years of high school and post-graduation years. As a pre-ACT test, PLAN is a good predictor of success on the ACT. Typically, PLAN is administered in the fall of the sophomore year. PRE (Planning, Research, and Evaluation) - A department of the Little Rock School District. Pre-AP (Pre-Advanced Placement) - Courses designed for middle school and high school to prepare students for success in Advanced Placement level courses. Pre-K-3 (Pre-kindergarten through 3rd Grade) RIT (Rasch Unit) - A type of scaled score. RR (Reading Recovery) - An intensive early-intervention literacy program developed in New Zealand and used in this country for many years. The program is based on helping children with poor reading readiness skills develop the skills common to proficient readers. 
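The normed scores defined in this glossary (NCEs, national percentile ranks, and z-scores) are related by simple arithmetic. The sketch below is illustrative only and is not part of the filed glossary; it assumes a normal distribution and the conventional NCE standard deviation of 21.06, which is chosen so that NCEs 1, 50, and 99 line up with percentile ranks 1, 50, and 99.

```python
from statistics import NormalDist

NCE_SD = 21.06  # conventional NCE standard deviation

def z_to_nce(z: float) -> float:
    """Normal Curve Equivalent: equal-interval scale, mean 50, SD 21.06."""
    return 50 + NCE_SD * z

def z_to_percentile(z: float) -> float:
    """National percentile rank implied by a z-score under a normal model."""
    return 100 * NormalDist().cdf(z)

# Average performance (z = 0) sits at NCE 50 and the 50th percentile,
# and the z-score at the 99th percentile maps to an NCE of about 99.
z99 = NormalDist().inv_cdf(0.99)
print(z_to_nce(0.0), round(z_to_percentile(0.0)))  # 50.0 50
print(round(z_to_nce(z99)))  # 99
```

Unlike percentile ranks, which bunch up near the middle of the distribution, NCEs are on an equal-interval scale, which is why gains are averaged in NCE units rather than percentile units.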
SAIP (Student Academic Improvement Plan) - A personalized plan required by the State for lower-achieving students on ACTAAP Benchmark tests. Includes both areas of deficiencies and plans for remediation. SAT 9 (Stanford Achievement Test, 9th Edition) - A general education test used widely across the United States. It compares a student's performance on the test to a representative national norm group of students. For many years, the publisher of the SAT-9 has had a contract with the ADE to provide tests to all students in the state's public schools in grades five, seven, and ten. The results are widely reported for every school district in the state, and each district receives data in varying formats to allow analysis of student performance by school, class, gender, race, or wealth. (Beginning in the 2003-04 school year, the state will require a similar nationally-normed test, the Iowa Tests, rather than the SAT.) SEDL (Southwest Educational Development Laboratory) - A private, not-for-profit education research and development corporation based in Austin, Texas. SEDL works with educators, parents, community members, and policymakers in the southwestern states to develop and implement effective strategies to address pressing educational problems. SEM (Science, Engineering, and Mathematics) SFA (Success for All) - A school-based achievement-oriented program for disadvantaged students in pre-K through grade five. The program is designed to prevent or intervene in the development of learning problems in the early years by effectively organizing instructional and family support resources within the regular classroom. Specifically, the goal of Success for All is to ensure that virtually every student in a high-poverty school will finish the 3rd grade with grade-level reading skills. 
SLET (Secondary Literacy Evaluation Team) SMART (Summer Mathematics Advanced Readiness Training) - A two-week half-day summer program for rising 8th and 9th grade students who will be enrolled in Algebra I during the upcoming school year. SMART provides opportunity for students to gain the knowledge, skills, and confidence needed to succeed in Algebra I. SpEd - Special Education SREB (Southern Regional Education Board) - A private, not-for-profit education research and development corporation based in Atlanta, GA. SREB works with schools, educators, and policymakers in the southern states to develop and implement effective strategies to address pressing educational problems. One school-wide reform model, developed and sustained by SREB, is High Schools That Work (HSTW). SS (Scaled Score) - A type of standard score. A scaled score is calculated based on the difficulty of the questions and the number of correct responses. Scaled scores are useful for comparing student performance over time and across grades. All norm-referenced scores are derived from the Scaled Score. Standard Score - Standard scores are a universally understood score system. Standard scores are used to place raw scores in context. For example, a raw score on a test doesn't mean much by itself because it isn't compared to any group or scale. Standard scores offer two advantages over conventional "raw scores": standard scores take into account the relative difficulties of various exams and assignments, and standard scores make it possible to measure improvement. TAP (Teacher Advancement Program) - A strategy to attract, retain, motivate, and develop talented people to the teaching profession by rewarding good teachers with higher salaries. THRIVE (Project THRIVE, a follow-up component to SMART) - A Saturday academy for students who are enrolled in Algebra I. Students participate in ten (10) Saturday sessions during the school year. 
Two primary goals of Project THRIVE are 1) to strengthen mathematical skills required to be successful in Algebra I, and 2) to prepare students for the State End-of-Course examination in Algebra I. URM (Underrepresented Minority Populations) - Includes American Indian/Alaskan Native, Black or African-American, and Hispanic or Latino. VOC (Writing Vocabulary) - One of the assessments included in the Observation Survey Assessment; it assesses children's writing vocabulary. WRAT (Wide Range Achievement Test) Z-scores - A test score that is converted to a common scale wherein scores from sets of data with different units can be compared. ARKANSAS DEPARTMENT OF EDUCATION 4 STATE CAPITOL MALL, LITTLE ROCK, ARKANSAS 72201-1071 (501) 682-4475 http://arkedu.k12.ar.us Dr. Kenneth James, Director RECEIVED JUN 1 2004 OFFICE OF DESEGREGATION MONITORING May 28, 2004 Mr. M. Samuel Jones, III Wright, Lindsey &amp; Jennings 200 West Capitol, Suite 2000 Little Rock, AR 72201 Mr. John W. Walker John Walker, P.A. 1723 Broadway Little Rock, AR 72201 Mr. Mark Burnette Mitchell, Blackstock, Barnes, Wagoner, Ivers &amp; Sneddon P.O. Box 1510 Little Rock, AR 72203-1510 Mr. Christopher Heller Friday, Eldredge &amp; Clark 400 West Capitol, Suite 2000 Little Rock, AR 72201-3493 Mr. Stephen W. Jones Jack, Lyon &amp; Jones 425 West Capitol, Suite 3400 Little Rock, AR 72201 Ms. Ann Marshall One Union National Plaza 124 West Capitol, Suite 1895 Little Rock, AR 72201 RE: Little Rock School District v. Pulaski County Special School District, et al. U.S. District Court No. 4:82-CV-866 Dear Gentlemen and Ms. Marshall: Per an agreement with the Attorney General's Office, I am filing the Arkansas Department of Education's Project Management Tool for the month of May 2004 in the above-referenced case. If you have any questions, please feel free to contact me at your convenience. 
General Counsel Arkansas Department of Education SS:law cc: Mark Hagemeier STATE BOARD OF EDUCATION: Chair - JoNell Caldwell, Little Rock Vice Chair - Shelby Hillman, Carlisle Members: Sherry Burrow, Jonesboro Luke Gordy, Van Buren Calvin King, Marianna Randy Lawson, Bentonville MaryJane Rebick, Little Rock Diane Tatum, Pine Bluff Jeanna Westmoreland, Arkadelphia An Equal Opportunity Employer UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION RECEIVED JUN 1 2004 OFFICE OF DESEGREGATION MONITORING LITTLE ROCK SCHOOL DISTRICT PLAINTIFF V. No. LR-C-82-866 PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, et al DEFENDANTS NOTICE OF FILING In accordance with the Court's Order of December 10, 1993, the Arkansas Department of Education hereby gives notice of the filing of the ADE's Project Management Tool for May 2004. Respectfully Submitted, Scott Smith, #92251 Attorney, Arkansas Department of Education #4 Capitol Mall, Room 404-A Little Rock, AR 72201 501-682-4227 CERTIFICATE OF SERVICE I, Scott Smith, certify that on May 28, 2004, I caused the foregoing document to be served by depositing a copy in the United States mail, postage prepaid, addressed to each of the following: Mr. M. Samuel Jones, III Wright, Lindsey &amp; Jennings 200 West Capitol, Suite 2000 Little Rock, AR 72201 Mr. John W. Walker John Walker, P.A. 1723 Broadway Little Rock, AR 72201 Mr. Mark Burnette Mitchell, Blackstock, Barnes, Wagoner, Ivers &amp; Sneddon P.O. Box 1510 Little Rock, AR 72203-1510 Mr. Christopher Heller Friday, Eldredge &amp; Clark 400 West Capitol, Suite 2000 Little Rock, AR 72201-3493 Mr. Stephen W. Jones Jack, Lyon &amp; Jones 425 West Capitol, Suite 3400 Little Rock, AR 72201 Ms. 
Ann Marshall One Union National Plaza 124 West Capitol, Suite 1895 Little Rock, AR 72201 IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION RECEIVED JUN 1 2004 OFFICE OF DESEGREGATION MONITORING LITTLE ROCK SCHOOL DISTRICT, ET AL PLAINTIFFS V. NO. LR-C-82-866 PULASKI COUNTY SPECIAL SCHOOL DISTRICT, ET AL DEFENDANTS MRS. LORENE JOSHUA, ET AL INTERVENORS KATHERINE W. KNIGHT, ET AL INTERVENORS ADE'S PROJECT MANAGEMENT TOOL In compliance with the Court's Order of December 10, 1993, the Arkansas Department of Education (ADE) submits the following Project Management Tool to the parties and the Court. This document describes the progress the ADE has made since March 15, 1994, in complying with provisions of the Implementation Plan and itemizes the ADE's progress against timelines presented in the Plan. IMPLEMENTATION PHASE ACTIVITY I. FINANCIAL OBLIGATIONS A. Use the previous year's three-quarter average daily membership to calculate MFPA (State Equalization) for the current school year. 1. Projected Ending Date Last day of each month, August - June. 2. Actual as of May 31, 2004 Based on the information available at April 30, 2004, the ADE calculated the Equalization Funding for FY 03/04, subject to periodic adjustments. B. Include all Magnet students in the resident District's average daily membership for calculation. 1. Projected Ending Date Last day of each month, August - June. This project was supported in part by a Digitizing Hidden Special Collections and Archives project grant from The Andrew W. Mellon Foundation and Council on Library and Information Resources.</dcterms_description>

</dcterms_description>

</item>
</items>