Compliance court filings

FILED U.S. DISTRICT COURT EASTERN DISTRICT ARKANSAS APR 19 2004 JAMES W. McCORMACK, CLERK

IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS LITTLE ROCK DIVISION

LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF v. No. 4:82CV00866 WRW/JTR PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL., DEFENDANTS; MRS. LORENE JOSHUA, ET AL., INTERVENORS; KATHERINE KNIGHT, ET AL., INTERVENORS

ORDER

Please file a list of your expected witnesses and exhibits by noon, day after tomorrow, April 21, 2004. For each witness you expect to call, please set forth the amount of time you expect to spend on direct examination. If you want a conference call regarding the presentation of evidence at the hearing next week, please call Ms. Mary Johnson at 501-604-5144 forthwith.

IT IS SO ORDERED this 19th day of April, 2004. Wm. R. Wilson, Jr., JUDGE

RECEIVED APR 21 2004 OFFICE OF DESEGREGATION MONITORING

UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION

LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF v. No. LR-C-82-866 PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, et al., DEFENDANTS

SUPPLEMENT TO RESPONSE TO COURT ORDER BY SEPARATE DEFENDANT ARKANSAS DEPARTMENT OF EDUCATION

Comes now Separate Defendant Arkansas Department of Education (ADE), by and through its attorneys, Attorney General Mike Beebe and Assistant Attorney General Mark A. Hagemeier, and for its Response to the Court's Order dated April 19, 2004, states: ADE does not plan to call any witnesses or offer any exhibits at the hearings currently scheduled before the Court on April 27-28, 2004.

Respectfully Submitted, MIKE BEEBE, Attorney General, By
MARK A. HAGEMEIER, #94127, Assistant Attorney General, 323 Center Street, Suite 200, Little Rock, AR 72201-2610, (501) 682-3643

CERTIFICATE OF SERVICE

I, Mark A. Hagemeier, Assistant Attorney General, do hereby certify that I have served the foregoing by depositing a copy in the United States Mail, postage prepaid, this day of April 2004, addressed to:

Stephen W. Jones, Jack, Lyon & Jones, 3400 TCBY Tower, 425 W. Capitol, Little Rock, AR 72201
Christopher Heller, Friday, Eldredge & Clark, 2000 Regions Center, 400 W. Capitol, Little Rock, AR 72201-3493
M. Samuel Jones, III, Wright, Lindsey & Jennings LLP, 200 W. Capitol, Suite 2300, Little Rock, AR 72201-3699
John W. Walker, John Walker, P.A., 1723 Broadway, Little Rock, AR 72201
Ann Brown Marshall, ODM, One Union National Plaza, 124 West Capitol, Suite 1895, Little Rock, AR 72201
Mr. Mark Burnette, Attorney at Law, 1010 W. 3rd, Little Rock, AR 72201
IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION

LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF v. CASE NO. 4:82CV866WRW PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL., DEFENDANT; MRS. LORENE JOSHUA, ET AL., INTERVENORS; KATHERINE KNIGHT, ET AL., INTERVENORS

THE JOSHUA INTERVENORS' EXHIBIT LIST

The Joshua Intervenors may use the following exhibits during the hearing scheduled for April 26 and 27, 2004:

1) LRSD Policy IL (Evaluation of Instructional Programs), CX 575
2) LRSD Regulation IL-R1 (Program Evaluation Agenda)
3) Text of Plan, Sections 2.7 and 2.7.1
4) Review of Year Two Evaluations, Steven M. Ross, Ph.D. (Provided to Intervenors by Counsel for the LRSD, October 25, 2002)
5) Memoranda from Superintendent James to LRSD Board of Education (Prepared by Associate Superintendent for Instruction Bonnie A. Lesley):
a) Approval of the Charter School Program Evaluation, October 24, 2002
b) Approval of the SEDL's Program Evaluation for the Collaborative Action Team Project, November 21, 2002
c) Approval of Program Evaluation for Southwest Middle School's Partnership with Southwest Education Development Lab (SEDL), November 21, 2002
d) Campus Leadership Team Program Evaluation, February 13, 2003
e) HIPPY Program Evaluation, February 13, 2002
f) Onward to Excellence Program Evaluation, February 13, 2003
g) Campus Leadership Team Program Evaluation, February 13, 2003
h) Vital Link Program Evaluation, February 13, 2003
i) Middle School Transition Program Evaluation, February 27, 2003
j) Lyceum Scholars Program Evaluation, February 27, 2003
k) Extended Year Education (EYE) Program Evaluation, February 27, 2003
l) Elementary Summer School Program Evaluation, February 27, 2003
6) Guidelines for Completing Eight Program Evaluations in LRSD, Steven M. Ross, Ph.D. (Filed by LRSD March 14, 2003)
7) Letter from Chris Heller to Ann Marshall and John W.
Walker, October 27, 2003
8) Letter from Chris Heller to John W. Walker, January 12, 2004
9) LRSD Literacy Program Evaluation
10) An Evaluation of Mathematics and Science Programs in the Little Rock School District from 1998 to 2003
11) The LRSD's Implementation of the Court's Compliance Remedy, March 30, 2004
12) Resume, Walter M. Haney, Ed.D. (Professor, Lynch School of Education, Senior Research Associate, Center for the Study of Testing, Evaluation and Educational Policy, Boston College)
13) Grade to Grade Progression Data for LRSD and Arkansas, By Race
14) Vita, Richard C. Hunter, Ed.D. (Professor of Educational Administration and Head of the Educational Organization and Leadership Department)

Joshua reserves the right to utilize the exhibits as listed by the defendants.

Respectfully submitted,

Robert Pressman, 22 Locust Avenue, Lexington, MA 02421, 781-862-1955, Mass Bar 405900

John W. Walker / Rickey Hicks, John W. Walker, P.A., 1723 Broadway, Little Rock, AR 72206, 501-374-3758, Ark. 64046

Elaine R. Jones, President & Director-Counsel; Norman Chachkin; Theodore Shaw; NAACP Legal Defense and Educational Fund, Inc., 99 Hudson Street, New York, NY 10013-2897, 212-965-2200

CERTIFICATE OF SERVICE

I do hereby state that a copy of the foregoing has been served on all counsel of record on this 21st day of April, 2004 by placing a copy of same in the United States mail postage prepaid.

RECEIVED APR 26 2004 OFFICE OF DESEGREGATION MONITORING

FILED APR 21 2004 JAMES W. McCORMACK, CLERK

IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION

LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF v. CASE NO. 4:82CV866WRW PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL., DEFENDANT; MRS. LORENE JOSHUA, ET AL., INTERVENORS; KATHERINE KNIGHT, ET AL., INTERVENORS

THE JOSHUA INTERVENORS' WITNESS LIST

The Joshua Intervenors may call the following persons as witnesses during the hearing scheduled for April 26 and 27, 2004:

1.
Gene Jones, Office of Desegregation Monitoring - 1 hour
2. Walt Haney, Ed.D., Expert - 1 1/4 hours
3. Richard Hunter, Ed.D., Expert - 45 minutes
4. Margie Powell, Office of Desegregation Monitoring - 1 hour
5. Dennis Glasgow, Little Rock School District - 20 minutes
6. Ann Marshall, Office of Desegregation Monitoring - 20 minutes
7. Willie Morris, Arkansas Department of Education - 20 minutes
8. Morris Holmes, Interim Superintendent, Little Rock School District - 1/4 hour
9. Junious Babbs, Associate Superintendent, Little Rock School District - 15 minutes
10. Ethel Dunbar, Principal at Franklin Elementary School, LRSD - 10 minutes
11. David Smith, Principal at Southwest Middle School, LRSD - 10 minutes
12. Cassandra Norman, Principal at McClellan High School, LRSD - 10 minutes
13. Karl Brown, Assistant Superintendent, PCSSD - 5 minutes
14. Bobby Acklin, Assistant Superintendent, NLRSD - 5 minutes

Joshua reserves the right to call witnesses listed by the Little Rock School District.

Respectfully submitted,

Robert Pressman
22 Locust Avenue, Lexington, MA 02421, 781-862-1955, Mass Bar 405900

John W. Walker / Rickey Hicks, John W. Walker, P.A., 1723 Broadway, Little Rock, AR 72206, 501-374-3758, Ark. 64046

Elaine R. Jones, President & Director-Counsel; Norman Chachkin; Theodore Shaw; NAACP Legal Defense and Educational Fund, Inc., 99 Hudson Street, New York, NY 10013-2897, 212-965-2200

CERTIFICATE OF SERVICE

I do hereby state that a copy of the foregoing has been served on all counsel of record on this 21st day of April, 2004 by placing a copy of same in the United States mail postage prepaid.

FILED APR 21 2004 JAMES W. McCORMACK, CLERK

RECEIVED APR 21 2004 OFFICE OF DESEGREGATION MONITORING

IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION

LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF v. No. LR-C-82-866 PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL., DEFENDANTS; MRS. LORENE JOSHUA, ET AL., INTERVENORS; KATHERINE KNIGHT, ET AL., INTERVENORS

LITTLE ROCK SCHOOL DISTRICT WITNESS LIST AND EXHIBIT LIST

The Little Rock School District expects to call the following witnesses and present the following exhibits at the hearing scheduled to begin on April 26, 2004, except Dr. Lesley, whose testimony will be presented by deposition.

WITNESS LIST

1. Dr. Steven M. Ross, Director, Center for Research in Education Policy, University of Memphis - expected direct examination time - 1 hour
2. Dr. Bonnie Lesley, former LRSD Associate Superintendent for Curriculum and Instruction - expected direct examination time - 1 hour
3. Dennis Glasgow, Interim Associate Superintendent for Curriculum and Instruction - expected direct examination time - 1 hour
4. Dr. Ed Williams, LRSD Research Specialist - expected direct examination time - 30 minutes
5. Krista Underwood, Director of Early Childhood and Elementary Literacy - expected direct examination time - 30 minutes
Page 1 of 4

6. Suzi Davis, Director of Secondary English - expected direct examination time - 30 minutes
7. Vanessa Cleaver, Director of National Science Foundation Grant - expected direct examination time - 30 minutes. EXHIBIT LIST 1. Program Evaluations and Accompanying Memoranda submitted to the LRSD Board of Directors for approval on October 24, 2002, November 21, 2002, December 19, 2002, February 13, 2003 and February 27, 2003 (These were attached to our Notice of Filing on March 14, 2003 in Volumes I - IV)
2. September 26, 2002 Program Evaluation Agenda, 2002-03
3. October 4, 2002 letter from Clay Fendley transmitting Compliance Plan to counsel and Ms. Marshall
4. October 10, 2002 memo to Dr. Ken James from Ann Marshall re LRSD's Compliance Plan
5. October 10, 2002 Memo to LRSD Board from Dr. Bonnie Lesley
6. October 11, 2002 letter from Clay Fendley to Counsel and Ann Marshall regarding Compliance Remedy
7. October 17, 2002 Request for Qualifications of Revised Desegregation and Education Plan Program Evaluation Consultant
8. October 25, 2002 letter from Clay Fendley to Counsel and Ann Marshall
9. November 4, 2002 letter to John Walker and Ann Marshall from Bonnie Lesley
10. Guidelines for Completing Eight Program Evaluations in LRSD prepared by Dr. Ross
11. December 3, 2002 letter to Ann Marshall from Bonnie Lesley
12. December 3, 2002 letter to John Walker from Bonnie Lesley
13. January 27, 2003 Memo to Dr. Ken James from Dr. Bonnie Lesley regarding Contracted Services - Dr. Ross
14. February 13, 2003 Memo to LRSD Board from Dr. Lesley regarding Information on Completion of Eight Program Evaluations for Submission to Federal Court

Page 2 of 4

15. April 8, 2003 letter from John Walker to Clay Fendley
16. Response to ODM and Joshua Objections, by Dr. Steven M. Ross
17. Changes in Science Curriculum, by Dennis Glasgow
Respectfully Submitted,

LITTLE ROCK SCHOOL DISTRICT
Friday, Eldredge & Clark
Christopher Heller (#81083)
2000 Regions Center
400 West Capitol
Little Rock, AR 72201-3493
(501) 376-2011

Christopher Heller

Page 3 of 4

CERTIFICATE OF SERVICE

I certify that a copy of the foregoing has been served on the following people by depositing a copy of same in the United States mail on April 21, 2004:
Mr. John W. Walker, JOHN W. WALKER, P.A., 1723 Broadway, Little Rock, AR 72201
Judge J. Thomas Ray, U.S. District Courthouse, 600 West Capitol, Little Rock, AR 72201
Mr. Sam Jones, Wright, Lindsey & Jennings, 2200 Nations Bank Bldg., 200 West Capitol, Little Rock, AR 72201
Mr. Mark Burnette, Attorney at Law, 1010 W. 3rd, Little Rock, AR 72201
Mr. Steve Jones, JACK, LYON & JONES, P.A., 425 W. Capitol, Suite 3400, Little Rock, AR 72201-3472
Ms. Ann Marshall, Desegregation Monitor, 1 Union National Plaza, 124 W. Capitol, Suite 1895, Little Rock, AR 72201
Mr. Mark A. Hagemeier, Office of the Attorney General, 323 Center Street, 200 Tower Building, Little Rock, AR 72201

Christopher Heller

Page 4 of 4

FILED U.S. DISTRICT COURT EASTERN DISTRICT ARKANSAS MAY 12 2004 JAMES W. McCORMACK, CLERK

RECEIVED OFFICE OF DESEGREGATION MONITORING

IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS LITTLE ROCK DIVISION

LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF v. No. 4:82CV00866 WRW/JTR PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL., DEFENDANTS; MRS. LORENE JOSHUA, ET AL., INTERVENORS; KATHERINE KNIGHT, ET AL., INTERVENORS

ORDER

In preparing for the June 14 and 15 evidentiary hearing on LRSD's Compliance Report, it is apparent that a number of matters need to be brought to the attention of counsel for LRSD and Joshua:

(1) The LRSD Board, in approving the October 10, 2002 Compliance Plan, also adopted IL-R1, which sets forth the written procedures for evaluating the 2.7 programs. While the October 10, 2002 Compliance Plan is attached as Exhibit A to LRSD's March 14, 2003 Notice Of Filing Program Evaluations Required By Paragraph C Of The Court's Compliance Remedy (docket entry #3745), IL-R1 is not attached to that document or otherwise included in the record. Counsel for LRSD must immediately provide me with a copy of IL-R1.

(2) Exhibit A to LRSD's Compliance Report is an October 25, 2002 letter from Mr.
John Fendley, one of LRSD's attorneys, to all parties, responding to certain written concerns raised by Joshua's counsel, Mr. John Walker, regarding LRSD's proposed Compliance Plan. In order for the Court to place Mr. Fendley's October 25, 2002 letter in context, I need the following additional documents: (a) Mr. Walker's October 10 and 24, 2002 letters to Mr. Fendley raising his concerns about the Compliance Plan;
and (b) a copy of the document that Mr. Fendley repeatedly quotes Mr. Walker referring to in his October 10 and October 24, 2002 letters as "your document."[1] Counsel for LRSD must immediately provide me with copies of the foregoing documents.

(3) In my September 13, 2002 Memorandum Opinion, I thought I made it clear that I am a big fan of plain English and have no desire to learn the acronym-filled lexicon of the professional educator. Therefore, I am now directing counsel to comply with the following rules in all oral and written communications with the Court in this case:

(a) Do not use any educational acronyms unless they are first defined. The pleadings that I have reviewed to date in preparing for the June 14 and 15 hearing are littered with references to SAIPs, DRAs, DIBELs, ELLA, CRT, SMART, THRIVE, ACTAAP, SREB, CREP, and SFA. Counsel for LRSD must immediately prepare a glossary which defines all acronyms used in all exhibits attached to LRSD's Compliance Report. A copy of this glossary is to be provided forthwith.

[1] I speculate that "your document" is probably LRSD's Compliance Plan, which I already have. If my speculation is correct, LRSD's counsel should so advise me and need not provide the Court with a copy of that document.

(b) During the hearing on June 14 and 15, please instruct your witnesses to testify using plain English - not professional educatorese. Based upon the parties' previous written submissions and testimony taken in earlier hearings, I fear this may pose a significant challenge for some of the witnesses (and me). If so, I encourage these witnesses to begin now to practice speaking in plain English, so that they will be ready to testify by the June 14 and 15 hearing.
(4) On or before June 7, 2004, counsel for Joshua and LRSD must submit proposed Findings of Fact and Conclusions of Law on the issue of whether LRSD has substantially complied with its obligations under Section VII of the Court's September 13, 2002 Memorandum Opinion and 2.7.1 of the Revised Plan.

(5) On April 22, 2004, we had a telephone conference during which LRSD's Compliance Hearing was rescheduled from April 26 and 27, 2004, to June 14 and 15, 2004. During that telephone conference, I stated that I would make every effort to render my decision on LRSD's Compliance Report by June 30, 2004. Based upon my current work load, I now believe the earliest I will be able to enter my decision is thirty to sixty days after the conclusion of the evidentiary hearing in this matter.

IT IS SO ORDERED, DATED this ___ day of May, 2004.

RECEIVED MAY 14 2004 OFFICE OF DESEGREGATION MONITORING

IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION

LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF v. NO. 4:82CV00866 WRW PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL., DEFENDANTS; MRS. LORENE JOSHUA, ET AL., INTERVENORS; KATHERINE KNIGHT, ET AL., INTERVENORS

PLAINTIFF'S NOTICE OF FILING DOCUMENTS IN RESPONSE TO THE COURT'S ORDER FILED MAY 12, 2004

Plaintiff Little Rock School District (LRSD) for its Notice of Filing states:

1. Attached are the following documents requested by the Court in its Order filed May 12, 2004:

A. Little Rock School District Proposed Compliance Plan Revised Plan 2.7.1 (Appendix 1 of which is IL-R1);
B. Letter from John W. Walker to Chris Heller dated October 10, 2002
and, C. Letter from John W. Walker to Chris Heller dated October 23, 2002 (received by fax on October 24, 2002).

2. As to Mr. Walker's references to "your document," the Court is correct that Mr. Walker is referring to the Proposed Compliance Plan attached hereto as Exhibit A.

Page 1 of 3

3. As to the educational acronyms, Counsel has requested that the authors of the comprehensive evaluations immediately prepare a glossary of acronyms used in their respective evaluations. These will be consolidated into a single glossary for all exhibits and provided to the Court as soon as possible.

Respectfully Submitted,

LITTLE ROCK SCHOOL DISTRICT
Friday, Eldredge & Clark
Christopher Heller (#81083)
2000 Regions Center
400 West Capitol
Little Rock, AR 72201-3493
(501) 376-2011

Christopher Heller

Page 2 of 3

CERTIFICATE OF SERVICE

I certify that a copy of the foregoing has been served on the following people by depositing a copy of same in the United States mail on May 13, 2004:

Mr. John W. Walker, JOHN W. WALKER, P.A., 1723 Broadway, Little Rock, AR 72201
Ms. Ann Marshall, Desegregation Monitor, 1 Union National Plaza, 124 W. Capitol, Suite 1895, Little Rock, AR 72201
Mr. Sam Jones, Wright, Lindsey & Jennings, 2200 Nations Bank Bldg., 200 West Capitol, Little Rock, AR 72201
Mr. Steve Jones, JACK, LYON & JONES, P.A., 425 W. Capitol, Suite 3400, Little Rock, AR 72201-3472
Judge J. Thomas Ray, U.S. District Courthouse, 600 West Capitol Avenue, Suite 149, Little Rock, AR 72201
Mr. Tim Gauger, Mr. Mark A. Hagemeier, Office of the Attorney General, 323 Center Street, 200 Tower Building, Little Rock, AR 72201
Mr. Clayton Blackstock, Mr. Mark Burnett, 1010 W.
Third Street, Little Rock, AR 72201

Page 3 of 3

Christopher Heller

Little Rock School District Compliance Committee
Proposed Compliance Plan
Revised Plan 2.7.1

EXHIBIT A

The District Court's Compliance Remedy

On September 13, 2002, the District Court issued its Memorandum Opinion (hereinafter "Opinion") finding that the Little Rock School District (LRSD) had substantially complied with all areas of the Revised Desegregation and Education Plan ("Revised Plan"), with the exception of Revised Plan 2.7.1. Section 2.7.1 provided:

LRSD shall assess the academic programs implemented pursuant to Section 2.7[1] after each year in order to determine the effectiveness of the academic programs in improving African-American achievement. If this assessment reveals that a program has not and likely will not improve African-American achievement, LRSD shall take appropriate action in the form of either modifying how the program is implemented or replacing the program.

The District Court's Opinion set forth a detailed Compliance Remedy to be implemented by the LRSD. The Opinion first stated:

Because LRSD failed to substantially comply with the crucially important obligations contained in 2.7.1, it must remain under court supervision with regard to that section of the Revised Plan until it: (a) demonstrates that a program assessment procedure is in place that can accurately measure the effectiveness of each program implemented under 2.7 in improving the academic achievement of African-American students;
and (b) prepares the program evaluations identified on page 148 of the Final Compliance Report and uses those evaluations as part of the program assessment procedure contemplated by 2.7.1 of the Revised Plan.

The Opinion then outlined the details of the Compliance Remedy as follows:

A. For the entire 2002-03 school year and the first semester of the 2003-04 school year, through December 31, 2003, LRSD must continue to assess each of the programs implemented under 2.7 to improve the academic achievement of African-American students. LRSD now has over three years of testing data and other information available to use in gauging the effectiveness of those programs. I expect LRSD to use all of that available data and information in assessing the effectiveness of those programs and in deciding whether any of those programs should be modified or eliminated.

[1] Revised Plan 2.7 provided, "LRSD shall implement programs, policies and/or procedures designed to improve and remediate the academic achievement of African-American students, including but not limited to Section 5 of this Revised Plan."

B. LRSD must maintain written records regarding its assessment of each of those programs. These written records must reflect the following information: (a) the written criteria used to assess each program during the 2002-03 school year and the first semester of the 2003-04 school year;
(b) the results of the annual assessments of each program, including whether the assessments resulted in program modifications or the elimination of any programs;
and (c) the names of the administrators who were involved with the assessment of each program, as well as at least a grade level description of any teachers who were involved in the assessment process (e.g., all fourth grade math teachers;
all eighth grade English teachers, etc.).

C. LRSD must use Dr. Nunnerly[2] or another expert from outside LRSD with equivalent qualifications and expertise to prepare program evaluations on each of the programs identified on page 148 of the Final Compliance Report. I will accept all program evaluations that have already been completed by Dr. Nunnerly or someone with similar qualifications and approved by the Board. All program evaluations that have not yet been completed on the remaining programs identified on page 148 of the Final Compliance Report must be prepared and approved by the Board as soon as practicable, but, in no event, later than March 15, 2003. In addition, as these program evaluations are prepared, LRSD shall use them, as part of the program assessment process, to determine the effectiveness of those programs in improving African-American achievement and whether, based on the evaluations, any changes or modifications should be made in those programs. In addition, LRSD must use those program evaluations, to the extent they may be relevant, in assessing the effectiveness of other related programs.

* * *

F. On or before March 15, 2004, LRSD must file a Compliance Report which documents its compliance with its obligations under 2.7.1. Any party, including Joshua, who wishes to challenge LRSD's substantial compliance with 2.7.1, as specified above, may file objections with the court on or before April 15, 2004. Thereafter, I will decide whether the LRSD has substantially complied with 2.7.1, as specified in the Compliance Remedy, and should be released from all further supervision and monitoring.

[2] The Court is clearly referring to Dr. John Nunnery.

Proposed Compliance Plan

As the Compliance Committee understands the District Court's Opinion, the Compliance Remedy requires the LRSD to:

1. Continue to administer student assessments through the first semester of 2003-04;
2. Develop written procedures for evaluating the programs implemented pursuant to Revised Plan 2.7 to determine their effectiveness in improving the academic achievement of African-American students;
3. Maintain written records of (a) the criteria used to evaluate each program;
(b) the results of the annual student assessments, including whether an informal program evaluation resulted in program modifications or the elimination of any programs;
and (c) the names of the administrators who were involved with the evaluation of each program, as well as at least a grade level description of any teachers who were involved in the evaluation process;
4. Prepare a comprehensive program evaluation of each academic program implemented pursuant to Revised Plan 2.7 to determine its effectiveness in improving the academic achievement of African-American students and to decide whether to modify or replace the program;
and 5. Submit for Board approval the program evaluations identified on page 148 of the LRSD's Final Compliance Report that have been completed, and complete, with the assistance of an outside expert, the remaining evaluations identified on page 148 of the LRSD's Final Compliance Report.

What follows is an explanation of how the Compliance Committee derived these five requirements from the District Court's Opinion, and what the Compliance Committee proposes to do to comply with each requirement.

Assessment and Evaluation

When first read, the District Court's Compliance Remedy seemed simple and straightforward, but as the Compliance Committee attempted to develop this Proposed Compliance Plan, numerous questions arose. The most fundamental question related to the District Court's use of the term "assessment" in Paragraphs A and B of the Compliance Remedy. The ambiguity of this term was the subject of testimony at the hearing. The District Court included in its Opinion Dr. Lesley's testimony on the difference between assessment and evaluation, see Opinion, p. 152, but it is unclear whether the Court accepted this testimony.

It is clear that the District Court understood the distinction between testing data, which are derived from student assessments, and program evaluations, which are used to determine the effectiveness of programs. See Opinion, p. 152 ("LRSD acknowledged in the Interim Compliance Report that it was required: (a) to use both the testing data and the program evaluations to determine the effectiveness of the key academic programs implemented pursuant to 2.7 ..." (emphasis in original)). Even so, the District Court appears to have used the term "assessment" in some instances to refer to only student assessments and in other instances to refer to both student assessments and evaluations. This required the Compliance Committee to determine the District Court's intended meaning.
In making this determination, the Compliance Committee considered the context in which the term was used, the District Court's findings of fact as set forth in the Opinion, what would be in the best interest of African-American students, and, hopefully, common sense. An explanation of each requirement of the Compliance Remedy is provided below. To avoid any ambiguity, the Compliance Committee hereinafter uses the term "assessment" to refer to student assessments and the term "evaluation" to refer to the program evaluations, whether formal or informal.

1. Continue to administer student assessments through the first semester of 2003-04.

This requirement derives from Paragraph A of the Compliance Remedy. Given Paragraph A's reference to "testing data," it seems clear that Paragraph A concerns, in part, student assessments. The Compliance Committee proposes to comply with this part of Paragraph A by implementing the 2002-03 Board-approved assessment plan. The 2002-03 Board-approved assessment plan incorporates four changes that have been made since the LRSD's Final Compliance Report. First, the Board eliminated the fall administrations of the Achievement Level Tests (ALTs) in 2001-02. The administration recommended this for three reasons: (1) the loss of instructional time resulting from testing and test preparation;
(2) fall results did not provide significantly different information from the previous spring's results;
and (3) the cost of administering and scoring the tests. Second, the fall administration of the Observation Surveys and Developmental Reading Assessment will only be used by the teacher for diagnostic purposes. The scores will not be reported to or maintained by the LRSD. This change saves considerable time in test administration and allows more time for instruction. It was approved by the Board on September 26, 2002. Third, the LRSD will no longer administer the ALTs. The administration recommended the complete elimination of the ALTs for the following reasons: (1) the lack of alignment with the content and format of the State Benchmarks;
(2) the loss of instructional time resulting from testing and test administration;
(3) the new federal accountability requirements in the No Child Left Behind Act, which require annual testing by the State in grades 3-8, making the LRSD's administration of the ALTs redundant;
and (4) the costs of administering and scoring the tests. The Board approved this change on September 26, 2002. Finally, the Arkansas Department of Education (ADE) has moved the administration of the SAT9 from the fall to the spring, effective 2002-03. The 2002-03 Board-approved assessment plan calls for the administration of the following student assessments in English language arts and mathematics:

Kindergarten: Observation Surveys (5); Developmental Reading Assessment
Grade 1: Observation Surveys (5); Developmental Reading Assessment
Grade 2: Observation Surveys (3); Developmental Reading Assessment
Grade 4: Norm-referenced test to be identified for gifted/talented screening; Benchmark Literacy examination; Benchmark Mathematics examination
Grade 5: SAT9 Total Battery
Grade 6: Benchmark Literacy examination; Benchmark Mathematics examination
Grade 7: SAT9 Total Battery
Grade 8: Benchmark Literacy examination; Benchmark Mathematics examination
Grades 7-10: End-of-Course Algebra I examination
Grades 9-11: End-of-Course Geometry examination
Grade 10: SAT9 Total Battery
Grade 11: End-of-Level Literacy examination

All of these assessments are administered in the spring. Consequently, the final student assessment before March 15, 2004, will be administered in the spring of 2003.

2. Develop written procedures for evaluating the programs implemented pursuant to 2.7 to determine their effectiveness in improving the academic achievement of African-American students.

This requirement derives from the opening paragraph of the Compliance Remedy. To comply with this requirement, two proposed regulations have been drafted, IL-R1 for formal evaluations and IL-R2 for informal evaluations, attached as Appendixes 1 and 2, respectively. Proposed regulation IL-R1 combines generally accepted principles of program evaluation with practices that have been in place in the LRSD for the past two years.
See, e.g., Robby Champion, "Map Out Evaluation Goals," Journal of Staff Development, Fall 2002, attached as Appendix 3. This regulation will be submitted to the Board, the Office of Desegregation Monitoring (ODM) and the Joshua Intervenors (Joshua) for review and comment before being finalized. Proposed regulation IL-R2 specifically addresses the next requirement and is discussed therewith.

3. Maintain written records of (a) the criteria used to evaluate each program;
(b) the results of the annual student assessments, including whether an informal program evaluation resulted in program modifications or the elimination of any programs;
and (c) the names of the administrators who were involved with the evaluation of each program, as well as at least a grade-level description of any teachers who were involved in the evaluation process.

This requirement derives from Paragraph B of the Compliance Remedy. Paragraph B apparently came about as a result of the District Court's concern about the LRSD making program modifications based on informal evaluations of student assessment data. See Opinion, p. 155 ("I have grave reservations about anyone this side of Solomon being wise enough to use two or three semesters' worth of erratic composite test scores to make reliable decisions about which remediation programs for LRSD's African-American students were actually working."). Proposed regulation IL-R2 was drafted to specifically address this requirement. It prohibits substantial program modifications from being made without a written record as required by Paragraph B. This regulation will also be submitted to ODM and Joshua for review and comment before being finalized. Proposed regulation IL-R1 also complies with this requirement. It mandates that the criteria used to formally evaluate a program be identified as the research questions to be answered, the first of which will be, "Has this curriculum/instruction program been effective in improving and remediating the academic achievement of African-American students?" See Appendix 1, IL-R1, p. 5. Recommended program modifications and the members of the evaluation team are routinely included in formal evaluations. As to the results of annual student assessments, the LRSD will continue to maintain a computer database with the results of annual student assessments administered pursuant to the Board-approved assessment plan.

4.
Prepare a comprehensive program evaluation of each academic program implemented pursuant to 2.7 to determine its effectiveness in improving the academic achievement of African-American students and to decide whether to modify or replace the program.

This requirement derives from Paragraph A of the Compliance Remedy. To comply with this requirement, the Compliance Committee proposes to prepare the following new, comprehensive evaluations: (a) Primary Reading/Language Arts, (b) Middle and High School Literacy and (c) K-12 Mathematics and Science. Each evaluation will be prepared in accordance with proposed regulation IL-R1 and will incorporate all available student assessment data relevant to the program being evaluated. Based on Paragraph F of the Compliance Remedy, the LRSD understands these evaluations must be submitted to the Court on or before March 15, 2004. Some may argue that Paragraph A and Paragraph C together require the LRSD to prepare new, comprehensive evaluations of all the programs identified on page 148 of the LRSD's Final Compliance Report. The Compliance Committee considered and rejected this argument for three reasons. First, Paragraph A's description of the programs to be evaluated differs from that of Paragraph C. Paragraph A states that the LRSD must continue to assess "each of the programs implemented under 2.7 ..." The Compliance Committee understands this to mean that the LRSD should continue to prepare evaluations of some of the key programs, as identified in the Interim Compliance Report. See Opinion, p. 151 ("In addition to the Assessment Plan, 2.7.1 of the Interim Compliance Report noted that the LRSD was preparing 'evaluations of some of the key programs designed to improve African-American achievement in order to provide a more in-depth look at the effectiveness of those programs.'" (emphasis in original)).
In contrast to Paragraph A, Paragraph C requires the LRSD to prepare evaluations of each of the programs identified on page 148 of the Final Compliance Report. The Compliance Committee understands this to mean that the LRSD should complete all of the evaluations identified on page 148 of the Final Compliance Report and submit those to the Court. See Opinion, p. 156 ([A]s of March 15, 2001, the date the Final Compliance Report was filed with the Court
(1) PRE had prepared only draft evaluations of some of the programs in question;
(2) none of those evaluations had been approved by the Board .... (emphasis in original)). The District Court's statement in Paragraph C that it will accept evaluations already completed and approved by the Board further indicates that Paragraph C does not require new, comprehensive evaluations. Second, recognizing this distinction between Paragraph A and Paragraph C resolves a potential conflict between Paragraph C and Paragraph F. Paragraph C provides, "All program evaluations that have not yet been completed on the remaining programs identified on page 148 of the Final Compliance Report must be prepared and approved by the Board as soon as practicable, but, in no event, later than March 15, 2003." However, Paragraph F does not require the LRSD to file a compliance report on its compliance with Revised Plan 2.7.1 until March 15, 2004. The Compliance Committee concludes that March 15, 2004, is the deadline for submitting the new, comprehensive evaluations of the programs implemented pursuant to 2.7. See Paragraph A of the Compliance Remedy. This is consistent with Paragraph A's requirement that the LRSD include assessment data through December 31, 2003. Obviously, such data could not be included in an evaluation filed on or before March 15, 2003. Finally, it makes the most sense for the LRSD to expend the greatest time and resources preparing evaluations of the programs designed to improve African-American achievement. While the requirement for new, comprehensive evaluations derives from Paragraph A, some may argue that Paragraph C's requirement that the LRSD use an outside expert to prepare evaluations of each of the programs identified on page 148 of the Final Compliance Report applies to the new, comprehensive evaluations. The Compliance Committee hopes the District Court and the parties agree that the team approach to program evaluation set forth in proposed regulation IL-R1 renders this argument moot.
Proposed Regulation IL-R1 states that the program evaluation team must include "[a]n external consultant with expertise in program evaluation, the program area being evaluated, statistical analysis, and/or technical writing ...." Appendix 1, p. 4. The exact role of the external consultant may vary, depending upon the expertise required for the production of the program evaluation. Id. The Compliance Committee believes that the LRSD's practice over the last two years of using the team approach to program evaluation has produced credible evaluations. Moreover, participation of the LRSD staff on the evaluation team provides them with an excellent learning experience that they do not typically receive when an evaluation is prepared entirely by an outside expert. The evaluations prepared over the last two years using the team approach are as follows:
1. Dr. Steve Ross was the external consultant in the production of the Early Literacy program evaluation for 1999-2000 and 2000-01. He was asked to read a near-final draft and to provide feedback, which he did. His suggestions were then incorporated into the final report before it was published and disseminated. Other team members included Bonnie Lesley (associate superintendent), Patricia Price (program director), Pat Busbea (program specialist), Ed Williams (statistician), and Ken Savage (computer programmer).

2. Dr. Julio Lopez-Ferraro is the National Science Foundation (NSF) program officer who oversees the LRSD's implementation of the grant-funded Comprehensive Partnership for Mathematics and Science Achievement (CPMSA). NSF trained a team of LRSD staff to produce the mandated annual program evaluations for this initiative and then assembled an external team of practitioners and researchers who came to the LRSD each year to validate our findings and provide written feedback. The LRSD team members who participated in the writing of the annual progress reports included Vanessa Cleaver (project director), Dennis Glasgow (director of mathematics and science), Bonnie Lesley (associate superintendent and co-project investigator), Virginia Johnson (CPMSA program evaluator), Ed Williams (statistician), and Ken Savage (computer programmer).

3. Mr. Mark Vasquez, an attorney and former employee of the Office for Civil Rights in Dallas, has been retained by the LRSD for the past three years to provide guidance in the design and production of the English as a Second Language (ESL) program evaluation. Other team members have been Bonnie Lesley (associate superintendent), Karen Broadnax (program supervisor), Ed Williams (statistician), Ken Savage (computer programmer), and Eddie McCoy (program evaluator).

4. Dr.
Larry McNeal, a professor of education administration at the University of Arkansas at Little Rock and a private consultant in program evaluation, was retained by the LRSD to lead the team that produced the program evaluation for the Charter School. Other members of that team included Linda Watson (assistant superintendent), Krista Young (program director), and Ed Williams (statistician). Dr. McNeal wrote this report.

The team approach, supported by an external expert, ensures that all areas of expertise (program, implementation, technical and evaluative) are included. No one person would have all the knowledge and skills that a team would have. As these examples show, the external expert does not always perform the same role in every project. Rather, the role changes, depending on the expertise that is required for a credible report.

5. Submit for Board approval the program evaluations identified on page 148 of the LRSD's Final Compliance Report that have been completed, and complete, with the assistance of an outside expert, the remaining program evaluations identified on page 148 of the LRSD's Final Compliance Report.

The following program evaluations identified on page 148 of the Final Compliance Report have been completed:
1. Early Literacy. A comprehensive report for 1999-2000 and 2000-01 was prepared, completed, and presented to the Board in fall 2001. An update to this report for 2001-02 was presented to the Board in June 2002, with an emphasis on the improved achievement of African-American students and closing the achievement gap.

2. Mathematics and Science. Three years (1998-99, 1999-2000, and 2000-01) of program evaluations as required by the NSF were prepared, presented to the Board, and submitted to NSF, and NSF has responded to each evaluation.

3. Extended Year Schools. The LRSD staff prepared, completed, and presented to the Board in the spring of 2002 an evaluation of the Extended Year Schools.

4. Elementary Summer School. The LRSD staff prepared, completed, and provided to the School Services Division an evaluation of elementary summer school programs for 2000-01.

5. HIPPY. The HIPPY program was evaluated by the LRSD staff in July 1999. The report was prepared, completed, and submitted to the program director and the Cabinet.

6. Charter School. This program evaluation was prepared, completed, and presented to the Board in June 2001.

7. ESL. The Office for Civil Rights has required the LRSD to prepare a program evaluation in this area for each of the past three years: 1999-2000, 2000-01, and 2001-02. The first two of these reports have been prepared, completed, submitted to the Board, and submitted to OCR. (A third program evaluation will be completed in October when state scores arrive and will be ready by the March 15, 2003 deadline.)

8. Lyceum Scholars Program. Two separate evaluations of this alternative education school program were prepared by the LRSD staff.

9. Southwest Middle School's SEDL Program. Southwest Middle School was the recipient of a two-year technical assistance grant from the Southwest Educational Development Lab (SEDL) to build professional community.
SEDL prepared a comprehensive program evaluation that included Southwest among other grant recipients outside the LRSD. The LRSD staff provided SEDL data for this evaluation.

10. Onward to Excellence (Watson Elementary). A grant from ADE funded a partnership between Watson Elementary and the Northwest Educational Development Lab to implement a school improvement initiative. The LRSD staff provided data to Watson's principal for preparation of program evaluations. The principal submitted two annual program evaluations to ADE.

11. Collaborative Action Team ("CAT"). This one-year partnership with SEDL provided in 2000-01 for establishing and training a Collaborative Action Team of parent and community volunteers, supported by LRSD staff, to improve parent involvement. SEDL wrote a 249-page evaluation of its three-year grant-funded program, in which the LRSD was included only in the last year. The LRSD staff provided SEDL data for this evaluation.

12. Vital Link. The LRSD staff prepared a program evaluation, and it was provided to the project director.

A question arises as to which of these evaluations are acceptable to the Court without additional work. The first sentence of Paragraph C of the Compliance Remedy provides, "LRSD must use Dr. Nunnerly (sic) or another expert from outside LRSD with equivalent qualifications and expertise to prepare program evaluations of each of the programs identified on page 148 of the Final Compliance Report." The second sentence of Paragraph C states that the District Court will accept all program evaluations that have already been completed by Dr. Nunnerly (sic) or someone with similar qualifications. It is unclear whether an expert from outside the LRSD must have prepared the completed evaluations for them to be accepted by the District Court, or whether it is sufficient that they were prepared by someone within the LRSD with similar qualifications.
The District Court's findings of fact suggest that the District Court will accept only program evaluations already completed by an outside expert. The District Court noted that Dr. Lesley testified that, by the end of November 2000, it was her opinion that no one in PRE had the expertise to prepare program evaluations. Opinion, p. 153. Thus, the District Court likely concluded that the only acceptable program evaluations would be those prepared by persons outside the LRSD. Applying this standard, the Compliance Committee believes that the following evaluations are acceptable to the Court, following Board approval, without additional work: Early Literacy, Mathematics and Science, Charter School, ESL, Southwest Middle School's SEDL Program and CAT. The remaining program evaluations identified on the bottom of page 148 of the Final Compliance Report must be completed by an outside expert. They are: Extended Year Schools, Middle School Implementation, Elementary Summer School, HIPPY, Campus Leadership Teams (CLTs), Lyceum Scholars Program, Onward to Excellence and Vital Link. The Compliance Committee's proposal for completing each of these evaluations will be discussed below. In deciding how to go about completing these evaluations, the Compliance Committee focused on what makes sense to do at this time, considering the goal of improving African-American achievement and the limitations inherent in asking an expert to complete an evaluation.

Extended Year Schools. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

Middle School Implementation. A draft of this evaluation was presented to the Board in July and August 2000, but it was never completed. The Compliance Committee proposes retaining an outside expert to rewrite the report and, if possible, prepare an evaluation based on the existing data.
Elementary Summer School. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

HIPPY. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

CLTs. The LRSD staff conducted a survey of CLTs during 2000-01. A summary of the survey findings was presented during a CLT training session, but no formal report was ever prepared. The Compliance Committee proposes retaining an outside expert to review the survey data and, if possible, prepare an evaluation based on the existing survey data.

Lyceum Scholars Program. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

Onward to Excellence. This evaluation was completed by the LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

Vital Link. This evaluation was completed by LRSD staff. The Compliance Committee proposes retaining an outside expert to review the report and, if possible, draw conclusions and make recommendations based on the existing data.

Action Plan Timeline

The Compliance Committee proposes implementation of this Compliance Plan in accordance with the following timeline. For each step, the target date and the persons responsible are noted in brackets.

1. Provide copies of this proposed Compliance Plan to ODM and Joshua for their reactions. [Week of September 30, 2002; Clay Fendley, Ken James, Attorneys]

2. Incorporate, as possible, suggested revisions from ODM and Joshua. [Week of October 7, 2002; Ken James, Compliance Team]

3. Place the Compliance Plan on the agenda for Board review and approval. [October 10, 2002; Ken James, Attorneys]

4. Place the 2002-03 Program Evaluation Agenda on the Board's agenda for review and approval. [October 24, 2002; Ken James, Bonnie Lesley]

5. Place on the Board's agenda for approval two previously presented program evaluations (Early Literacy and Charter School). [October 24, 2002; Bonnie Lesley, Linda Watson]

6. Place on the Board's agenda for approval the evaluations of Southwest Middle School's SEDL program and the Collaborative Action Team (also conducted by SEDL). [November 2002; Bonnie Lesley]

7. Place on the Board's agenda for approval the previously presented ESL program evaluations for 1999-2000 and 2000-01, plus the new evaluation for 2001-02. [November 2002; Bonnie Lesley, Karen Broadnax]

8. Place on the Board's agenda for approval the three previously presented program evaluations for the NSF-funded CPMSA program, plus the new Year 4 report for 2001-2002. [December 2002; Bonnie Lesley, Vanessa Cleaver, Dennis Glasgow]

9. Issue a Request for Proposals (RFP) from available external experts to review and complete the eight remaining program evaluations listed on page 148. [Mid-October 2002; Bonnie Lesley, Darral Paradis]

10. Form a screening team to determine recommendations to the Superintendent for designating external experts to review and complete the eight remaining program evaluations listed on page 148. [Late October 2002; Ken James, Compliance Team]

11. Select and negotiate consulting contracts with designated external experts. [Mid-November 2002; Bonnie Lesley, Ken James]

12. Assign appropriate staff to each external expert to provide needed information, data, access to program staff, etc. [Mid-November 2002; Bonnie Lesley]

13. Monitor the work to ensure timely completion. [Mid-November 2002 to February 2003; Bonnie Lesley]

14. As each paper is completed and ready for circulation, send copies to ODM and Joshua for their review and comments. [December 2002 to February 2003; Bonnie Lesley]

15.
As each paper is completed, place on the Board's agenda the item to be reviewed and approved. [December 2002 to February 2003; Ken James, Bonnie Lesley]

16. Write the Interim Compliance Report relating to the programs on page 148 to be completed. [March 15, 2003; Attorneys, Compliance Committee]

17. Establish staff teams for each of the three programs on the Board's Program Evaluation Agenda to be completed for 2002-2003 (Elementary Literacy, Secondary Literacy, and K-12 Mathematics/Science). [March 1, 2003; Bonnie Lesley]

18. Publish RFPs to identify external experts to serve on each of the two staff teams for the Board's Program Evaluation Agenda (K-12 mathematics/science external experts are provided by NSF). [March 1, 2003; Bonnie Lesley, Darral Paradis]

19. Establish consulting contracts with the two external experts required for the Elementary Literacy and Secondary Literacy program evaluations. [Late March 2003; Bonnie Lesley]

20. Train each program evaluation team, including the external expert, on the requirements of the approved Compliance Plan and IL-R. [May 2003; Bonnie Lesley]

21. Monitor the completion of the work on all three program evaluations required in the Board's Program Evaluation Agenda. [May to October 2003; Bonnie Lesley]

22. Send copies of the completed Elementary Literacy program evaluation to ODM and Joshua for information. [With October 2003 Board agenda packet; Ken James, Bonnie Lesley]

23. Complete the evaluation of the Elementary Literacy program and place on the Board's agenda for approval. [October board meeting, 2003; Bonnie Lesley, Pat Price]

24. Send copies of the Secondary Literacy program evaluation to ODM and Joshua for information. [With November 2003 Board agenda packets; Ken James, Bonnie Lesley]

25. Complete the evaluation of the Secondary Literacy program and place on the Board's agenda for approval. [November board meeting, 2003; Bonnie Lesley, Pat Price]

26. Send copies of the completed CPMSA program evaluation to ODM and Joshua for information. [With December 2003 Board agenda packet; Ken James, Bonnie Lesley]

27. Complete the five-year evaluation of the CPMSA project (science and mathematics) and place on the Board's agenda for approval. [December board meeting, 2003; Bonnie Lesley, Vanessa Cleaver, Dennis Glasgow]

28. Write the Section 2.7.1 Final Compliance Report for federal court and file with the Court. [March 15, 2004; Ken James, Attorneys, Compliance Team]

Appendix 1: Proposed IL-R1

LITTLE ROCK SCHOOL DISTRICT
NEPN CODE: IL-R1
PROGRAM EVALUATION AGENDA

Purpose

The purpose of these regulations is to provide guidance to the staff involved in the evaluation of programs required in the Board's Program Evaluation Agenda. They do not necessarily apply to grant-funded programs if the funding source requires other procedures and provides funding for a required evaluation.

Criteria for Program Evaluations

Policy IL specifies that the evaluations of programs approved in its Board-approved Program Evaluation Agenda shall be conducted according to the standards developed by the Joint Committee on Standards for Educational Evaluation. (See Joint Committee on Standards for Educational Evaluation, James R. Sanders, Chair (1994). The Program Evaluation Standards, 2nd Edition: How to Assess Evaluations of Educational Programs. Thousand Oaks, CA: Sage Publications.) They are as follows:

Utility Standards

The utility standards are intended to ensure that an evaluation will serve the information needs of intended users. These standards are as follows:
Stakeholder identification. People involved in or affected by the evaluation should be identified so that their needs can be addressed.

Evaluator credibility. The people conducting the evaluation should be both trustworthy and competent to perform the evaluation so that the evaluation findings achieve maximum credibility and acceptance.

Information scope and selection. Information collected should be broadly selected to address pertinent questions about the program and should be responsive to the needs and interests of clients and other specified stakeholders.

Values identification. The perspectives, procedures, and rationale used to interpret the findings should be described carefully so that the bases for value judgements are clear.

Report clarity. Evaluation reports should describe clearly the program being evaluated, including its context and the purposes, procedures, and findings of the evaluation, so that essential information is provided and understood easily.

Report timeliness and dissemination. Significant interim findings and evaluation reports should be disseminated to intended users so that they can be used in a timely fashion.

Evaluation impact. Evaluations should be planned, conducted, and reported in ways that encourage follow-through by stakeholders, so that the likelihood that the evaluation will be used is increased.

Feasibility Standards

Feasibility standards are intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.

Practical procedures. Evaluation procedures should be practical so that the disruption is kept to a minimum while needed information is obtained.

Political viability. The evaluation should be planned and conducted with anticipation of the different positions of various interest groups so that their cooperation may be obtained, and so that possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted.

Cost-effectiveness.
The evaluation should be efficient and produce information of sufficient value so that the resources expended can be justified.

Propriety Standards

The propriety standards are intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.

Service orientation. Evaluations should be designed to assist organizations to address and effectively serve the needs of the full range of targeted participants.

Formal agreements. Obligations of the formal parties to an evaluation (what is to be done, how, by whom, and when) should be agreed to in writing so that these parties are obligated to adhere to all conditions of the agreement or to formally renegotiate it.

Rights of human subjects. Evaluations should respect human dignity and worth in their interactions with other people associated with an evaluation so that participants are not threatened or harmed.

Complete and fair assessments. The evaluation should be complete and fair in its examination and recording of strengths and weaknesses of the program being evaluated so that strengths can be built upon and problem areas addressed.

Disclosure of findings. The formal parties to an evaluation should ensure that the full set of evaluation findings, along with pertinent limitations, are made accessible to the people affected by the evaluation, as well as any others with expressed legal rights to receive the results.

Conflict of interest. Conflict of interest should be dealt with openly and honestly so that it does not compromise the evaluation processes and results.

Fiscal responsibility. The evaluator's allocation and expenditure of resources should reflect sound accountability procedures and be prudent and ethically responsible so that expenditures are accounted for and appropriate.
Accuracy Standards

Accuracy standards are intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated.

Program documentation. The program being evaluated should be described and documented clearly and accurately so that the program is identified clearly.

Context analysis. The context in which the program exists should be examined in enough detail so that its likely influences on the program can be identified.

Described purposes and procedures. The purposes and procedures of the evaluation should be monitored and described in enough detail so that they can be identified and assessed.

Defensible information sources. The sources of information used in a program evaluation should be described in enough detail so that the adequacy of the information can be assessed.

Valid information. The information-gathering procedures should be chosen or developed and then implemented in a manner that will ensure that the interpretation arrived at is valid for the intended use.

Reliable information. The information-gathering procedures should be chosen or developed and then implemented in a manner that will ensure that the information obtained is sufficiently reliable for the intended use.

Systematic information. The information collected, processed, and reported in an evaluation should be reviewed systematically so that the evaluation questions are answered effectively.

Analysis of quantitative information. Quantitative information in an evaluation should be analyzed appropriately and systematically so that the evaluation questions are answered effectively.

Analysis of qualitative information. Qualitative information in an evaluation should be analyzed appropriately and systematically so that the evaluation questions are answered effectively.

Justified conclusions.
The conclusions reached in an evaluation should be justified explicitly so that stakeholders can assess them.

Impartial reporting. Reporting procedures should guard against distortion caused by personal feelings and biases of any party so the evaluation reports reflect the evaluation findings fairly.

Metaevaluation. The evaluation itself should be evaluated formatively and summatively against these and other pertinent standards so that its conduct is appropriately guided, and on completion, stakeholders can closely examine its strengths and weaknesses.

Program Evaluation Procedures

The following procedures are established for the evaluation of programs approved by the Board of Education in its annual Program Evaluation Agenda:

1. The Division of Instruction shall recommend to the Superintendent annually, before the budget for the coming year is proposed, the curriculum/instruction programs for comprehensive program evaluation. The recommendation shall include a proposed budget, a description of other required resources, and an action plan for the completion of the reports. Criteria for the proposed agenda are as follows:

A. Can the results of the evaluation influence decisions about the program?
B. Can the evaluation be done in time to be useful?
C. Is the program significant enough to merit evaluation?

(See Joseph S. Wholey, Harry P. Hatry, and Kathryn Newcomer (1994). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass Publishers. 5-7.)

2. The Superintendent shall recommend to the Board of Education for approval the proposed Program Evaluation Agenda, with anticipated costs and an action plan for completion.
3. For each curriculum/instruction program to be evaluated as per the Program Evaluation Agenda, the Associate Superintendent for Instruction shall establish a staff team with a designated leader to assume responsibility for the production of the report according to the timelines established in the action plan approved by the Board of Education.

4. Each team shall include, at a minimum, one or more specialists in the curriculum/instruction program to be evaluated, a statistician, a programmer to assist in data retrieval and disaggregation, and a technical writer. If additional expertise is required, then other staff may be added as necessary.

5. An external consultant with expertise in program evaluation, the program area being evaluated, statistical analysis, and/or technical writing shall be retained as a member of the team. The role of the external consultant may vary, depending upon the expertise required for the production of the program evaluation.

6. The team leader shall establish a calendar of regularly scheduled meetings for the production of the program evaluation. The first meetings will be devoted to the following tasks:

A. Provide any necessary training on program evaluation that may be required for novice members of the team, including a review of the Board's policy IL and all of the required criteria and procedures in these regulations, IL-R.

B. Assess the expertise of each team member and make recommendations to the Associate Superintendent for Instruction related to any additional assistance that may be required.

C. Write a clear description of the curriculum/instruction program that is to be evaluated, with information about the schedule of its implementation.

D. Agree on any necessary research questions that need to be established in addition to the question, Has this curriculum/instruction program been effective in improving and remediating the academic achievement of African-American students?
(See Policy IL, 2.7.1 of the Revised Desegregation and Education Plan, and Judge Wilson's Compliance Remedy.)

E. Generate a list of the data required to answer each research question, and assign responsibility for its collection and production. All available and relevant student performance data must be included. (See Judge Wilson's Compliance Remedy.)

F. Decide who will be the chief writer of the program evaluation.

G. Plan ways to provide regular progress reports (e.g., dissemination of meeting minutes, written progress reports, oral reports to the Superintendent's Cabinet and/or Compliance Team) to stakeholders, including the Associate Superintendent for Instruction, the Superintendent of Schools, the Office of Desegregation Monitoring (until Unitary Status is achieved), and the Joshua Intervenors (until Unitary Status is achieved).

(See Joellen Killion (2002). Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council (NSDC);
Robby Champion (Fall 2002). Map Out Evaluation Goals. Journal of Staff Development. 78-79;
Thomas R. Guskey (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin Press;
Blaine R. Worthen, James R. Sanders, and Jody L. Fitzpatrick (1997). Participant-Oriented Evaluation Approaches. Program Evaluation: Alternative Approaches and Practical Guidelines. 153-169;
Beverly A. Parsons (2002). Evaluative Inquiry: Using Evaluation to Promote Student Success. Thousand Oaks, CA: Corwin Press;
and Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (1994). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass Publishers.)

7. Subsequent meetings of the program evaluation team are required for the following tasks: to monitor the completion of assignments; to collaborate in the interpretation and analysis of data; to pose any necessary new questions to be answered; to review drafts and provide feedback to the writer; to formulate recommendations, as required, for program improvement, especially to decide if a recommendation is required to modify or abandon the program if the findings reveal that the program is not being successful for the improvement of African-American achievement; to assist in final proofreading; and to write a brief executive summary, highlighting the program evaluation findings and recommendations.

8. A near-final copy of the program evaluation must be submitted to the Associate Superintendent for Instruction at least one month before the deadline for placing the report on the Board's agenda for review and approval. This time is required for final approval by staff, for final editing to ensure accuracy, and for submission to the Superintendent.

9. When the program evaluation is approved for submission to the Board of Education for review and approval, copies of the Executive Summary and complete report must be made for them, for members of the Cabinet, for ODM (until Unitary Status is achieved), and for the Joshua Intervenors (until Unitary Status is achieved).

10. The program evaluation team shall plan its presentation to the Board of Education on the findings and recommendations.

11. The Associate Superintendent for Instruction shall prepare the cover memorandum to the Board of Education, including all the required background information (see Judge Wilson's Compliance Remedy):

A. If program modifications are suggested, the steps that the staff members have taken or will take to implement those modifications. If abandonment of the program is recommended, the steps that will be taken to replace the program with another with more potential for the improvement and remediation of African-American students. (See Section 2.7.1 of the Revised Desegregation and Education Plan and Judge Wilson's Compliance Remedy.)

B. Names of the administrators who were involved in the program evaluation.

C. Name and qualifications of the external expert who served on the evaluation team.

D. Grade-level descriptions of the teachers who were involved in the assessment process (e.g., all fourth-grade math teachers, all eighth-grade English teachers, etc.).

10.
When the program evaluation is approved by the Board of Education, the team must arrange to have the Executive Summary and the full report copied and design a plan for communicating the program evaluation findings and recommendations to other stakeholders. This plan must then be submitted to the Associate Superintendent for approval.

11. Each program evaluation team shall meet with the Associate Superintendent for Instruction after the completion of its work to evaluate the processes and product and to make recommendations for future program evaluations. (See Joellen Killion (2002). Evaluate the Evaluation. Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council. 46, 123-124.)

Appendix 2: Proposed IL-R2

LITTLE ROCK SCHOOL DISTRICT
NEPN CODE: IL-R2

INFORMAL PROGRAM EVALUATION

Introduction

The purpose of this regulation is to ensure that a written record exists explaining a decision to significantly modify an academic program. It is not the intent of this regulation to require a formal program evaluation before every significant program modification.

Definitions

Academic Program means one of the core curriculum programs of English/Language Arts, Mathematics, Science or Social Studies. Significantly modify means a material change in the content or delivery of an academic program implemented throughout the entire District.

Written Record

A written record must be prepared and maintained explaining a decision to significantly modify an academic program. The written record required by this regulation must include the following information: (a) the written criteria used to evaluate the program;
(b) a summary of the student assessment data or other data on which the decision was based
and (c) the names of the administrators who were involved with the evaluation of each program, as well as at least a grade-level description of any teachers who were involved in the evaluation process (e.g., all fourth-grade math teachers,
all eighth-grade English teachers, etc.).

Appendix 3: Robby Champion, Map Out Evaluation Goals, Journal of Staff Development, Fall 2002

ROBBY CHAMPION

Map out evaluation goals

A master plan can guide you down the rocky path of evaluation

When you launch a major professional development evaluation, regardless of the project's scope, you may quickly find yourself on a slippery, often rocky road with twists and unexpected turns. Before venturing too far and becoming disillusioned about program evaluation, create a master plan. While it requires an upfront investment of time and may delay starting, it quickly becomes an invaluable road map that helps you avoid delays and detours along the way.

Developing an evaluation master plan is most useful when you are launching a major, summative program evaluation. A summative evaluation is done at major junctures in a program's life cycle and emphasizes documenting impact. Information from summative evaluations is used to make important decisions about the initiative, such as whether to continue, alter, expand, downsize, or eliminate it. A formative evaluation, on the other hand, means monitoring and collecting data, often informally and spontaneously, throughout program implementation. Formative evaluation helps show implementers where to make adjustments so a program can eventually achieve significant results.

A thoughtfully prepared master plan for a major evaluation effort would:

Focus the evaluation effort and help implementers avoid being sidetracked by leadership changes and new opinions;
Create a realistic timeline and work plan that provides needed momentum for the work;

(Robby Champion is president of Champion Training & Consulting. You can contact her at Champion Ranch at Trumbell Canyon, Mora, NM 87732, (505) 387-2016, fax (505) 387-5581, e-mail Robbychampion@aol.com.)
Help recruit people to assist with the project on the myriad evaluation tasks
Give the message that the evaluation will be open and not secretive.

Whether your evaluation must be completed within a few months or will extend for several years, think through four phases of work before starting.

PHASE I: ORGANIZE THE PROCESS

1. Form a steering committee, including any needed outside expertise.
2. Learn more about program evaluation together.
3. Write a clear description of each program to be evaluated.
4. Agree on the primary purpose of the evaluation.
5. Plan how you will keep everyone informed along the way.

Steering committees, charged specifically with program evaluation, are important to focus attention and maintain the energy and momentum needed for the evaluation. They also help build a spirit of collaboration and open inquiry. And they keep the evaluation on track when other priorities might push the effort aside.

Provide steering committee members with the tools to succeed. Members need not be evaluation experts, but they do need information, support, and guidance to make informed decisions. They need background material to learn about program evaluation and examples of good evaluation studies. Finally, they need access to experts on professional development, measurement, and the content areas of the training programs.

Before launching any evaluation effort, have a written description of each program to be evaluated. You would be amazed at the number of people who do not have a clear idea of what you mean by the New Teacher Induction Program or the Early Literacy Initiative, since so many different initiatives are being undertaken simultaneously around the school or district.

PHASE II: DESIGN THE EVALUATION

1. Generate questions to guide the evaluation.
2. Generate potential data sources/instruments to address the questions.
3. Using a matrix to provide a bird's-eye view, agree on the most important questions and the best data sources.
4.
Decide if collecting data from a sample group is warranted to make the evaluation manageable. 5. Determine the evaluation approach that makes sense
quantitative vs. qualitative/naturalistic.
6. Gather or create the instruments for data collection.
7. Determine a realistic schedule for collecting data.
8. Create a system for collecting, analyzing, and interpreting data.

Decisions made in Phase II are critical. They determine the technical quality of your evaluation. In the questions you select, you determine what to examine and what to ignore. When you finish with the design phase, your program evaluation will be shaped to use a quantitative or a qualitative model or a mixture of the two.

In the design phase, you make other major decisions, such as whether to use a sample group. You also decide whether to do an in-depth case study, whether to survey the whole population, whether to use examples of student work instead of official documents such as student grades or standardized test scores, or whether to judge adult learners' understanding of the training content with performance tasks during training or by exit tests, classroom observations, or student feedback.

(ON THE WEB: See an example of a matrix to help guide evaluations at www.nsdc.org/library/jsd/champion234.html.)

If the programs to be evaluated already have stated indicators of long-term impact, generating appropriate evaluation questions is much simpler than when programs have only vague, lofty goals. The steering committee may drift into the realm of program planning as you encounter hurdles like fuzzy program outcomes. To avoid making misinformed evaluation design decisions, involve program leaders in your discussions.

Developing or gathering instruments and then collecting the data are the most expensive steps in any evaluation. Think strategically about which data to collect, from whom to collect it or where to find it, and the best time to collect it. Your organization may already be collecting data for another purpose that now can be used for program evaluation.
Some public records, such as student attendance, may be valuable if, for example, a 20% increase in student attendance at all grade levels is one of your program's indicators of impact.

PHASE III: PREPARE TO REPORT

1. Determine which audiences will want to know the results.
2. Consider several forums and formats to disseminate the results.
3. Plan reports, presentations, photo displays, graphs, charts, etc.

Remember that your job is to make the evaluation results useful to your organization, so consider a range of ways to provide information to various groups. Consider briefs in the school or district newsletter, a handout updating staff about the schedule for data collection, five-minute progress updates in faculty meetings, bulleted statements on your web site, a digital picture album of the program's results in classrooms with photos of students, and hallway displays of student work. If your final report is a formal document complete with examples of your data collection instruments, consider writing an executive summary of five pages or less to help readers get the essential information.

PHASE IV: CREATE THE WORK PLAN

1. List all tasks to be completed for the whole evaluation.
2. Create a realistic timeline.
3. Assign work.
4. Distribute the master plan.

You will have to be creative to accomplish all the evaluation tasks. In education, we rarely have the luxury of contracting outsiders for the entire project. Enlist steering committee members, partners, graduate students from the local university, and other talented critical friends to get the work done. One caution: For formal or summative evaluations to be credible, avoid using insiders such as the program designers or implementers (coaches, mentors, trainers, or facilitators) to perform critical evaluation tasks that call for objectivity and distance. And be sure to get ongoing, high-quality technical expertise for the critical technical analysis.
A CATALYST FOR REFLECTION

Completing a major program evaluation usually serves as the catalyst for serious reflection on the current designs, policies, and practices of your professional development programs: their goals, content, processes, and contexts. In fact, revelations are often so powerful that they bring about the realization that major changes are needed if significant results are really expected from professional development. People frequently conclude that designing the evaluation should be the first step in the program planning process, rather than an afterthought during implementation.

John W. Walker, P.A.
Attorney At Law
1723 Broadway
Little Rock, Arkansas 72206
Telephone (501) 374-3758
FAX (501) 374-4187

JOHN W. WALKER
SHAWN CHILDS

OF COUNSEL
ROBERT McHENRY, P.A.
DONNA J. McHENRY
8210 Henderson Road
Little Rock, Arkansas 72210
Phone: (501) 372-3425
Fax: (501) 372-3428
Email: mchenryd@swbell.net

Via Facsimile - 376-2147

October 10, 2002

Mr. Chris Heller
Friday, Eldredge & Clark
2000 Regions Center
400 West Capitol
Little Rock, AR 72201

Re: Little Rock School District v. PCSSD, et al., Case No. 4:82CV00866

Dear Chris:

This refers to your letter of October 4, 2002, providing LRSD's proposed Compliance Plan. The court's remedy and the general subject matter are too complex for us to provide all comments and objections we may ultimately have before today's Board meeting. We do note the following:

1. More consideration is needed of the programs to be identified as implementation pursuant to Section 2.7 ..., which are to be subjected to a comprehensive program evaluation .... Your document at page 7 identifies three areas. We note the absence of specific reference and detail regarding interventions/scaffolding, areas of vital importance given the achievement patterns of African American students. We note also that the LRSD compliance report cited many more programs as designed to fulfill Section 2.7.

2.
In a discussion prior to his testimony in the hearing before Judge Wilson, we understood Dr. Ross to indicate that the existing evaluation of the Pre-K - 2 literacy program was not adequate. The notation at page 4 of your document of the changed use of the Observation Survey and the DRA relates to part of the concerns he expressed. This undermines the LRSD argument (page 11) that the existing evaluation, upon Board approval, will satisfy a part of the court's remedy.

3. The LRSD discussion about satisfying the court's order regarding the evaluations mentioned at page 148 of the compliance report does not seem to take account of the material provided, which describes an adequate evaluation.

4. We question the period for implementation of a remedy which the court has identified and, therefore, the LRSD schedule.

Once again, these comments should not be taken to be the full range of concerns which Joshua may ultimately have about the court's remedy and the Compliance Plan. Nor do we intend to waive our concerns about the court setting forth a remedy without first hearing from the parties and the ODM with regard to the court's views on an appropriate remedy.

Sincerely,
John W. Walker

JWW:js
cc: Ms. Ann Marshall
All Counsel of Record

John W. Walker, P.A.
Attorney At Law
1723 Broadway
Little Rock, Arkansas 72206
Telephone (501) 374-3758
FAX (501) 374-4187

October 23, 2002

JOHN W. WALKER
SHAWN CHILDS

OF COUNSEL
ROBERT McHENRY, P.A.
DONNA J. McHENRY
8210 Henderson Road
Little Rock, Arkansas 72210
Phone:
(501) 372-3425
Fax: (501) 372-3428
Email: mchenryd@swbell.net

Mr. Christopher Heller
FRIDAY, ELDREDGE & CLARK
400 W. Capitol, Suite 2200
Little Rock, Arkansas 72201

Re:
LRSD v. PCSSD

Dear Chris:

This letter sets forth additional comments of the Joshua Intervenors concerning the LRSD Compliance Plan. We are offering these comments, although we are unable to discern that the comments we offered earlier were given consideration.

1. In using historical student assessment results, attention should be given to the quality of the data. In the past, LRSD has used results on the DRA and the Observation Survey in ways not consistent with the purposes of those instruments. In addition, because teachers provided scores for their own students, the past use made of the data was in conflict with the district's recognition in the newly enacted Regulation IL-R1 that conflict of interest must be avoided.

2. We are concerned about the manner in which the regulation describes the team process for preparing evaluations, again in the context of conflict of interest. In order to insure that conflict of interest is avoided, the external consultant needs to write the report and control the context of the analysis. Paragraphs 3, 5 and 6 of the Program Evaluation Procedures do not guarantee that the external expert will have these roles. Of course, if reports were prepared in the manner which we describe, there would be no bar to LRSD staff preparing comments to the Board with a differing interpretation of the evaluation results.

3. We continue to be concerned about the global, general manner in which the content of planned evaluations is described (page 7 of the document, first paragraph). For example, the Board has adopted a policy and two regulations dealing with remediation for students whose performance is below par. Studying the actual implementation of these standards (in all or a representative sample of schools) is of vital importance to the Intervenor class because class members are so much more likely than other students to exhibit unsatisfactory performance on the Benchmark and Stanford Achievement Tests.
A satisfactory description by the School Board of the evaluations which it requires the staff to undertake should make clear that the actual implementation of remediation activities in district schools is to receive careful consideration. This is surely an important contextual factor (see Accuracy Standards, para. 2).

4. We understand from the Plan that the LRSD plans evaluations of programs deemed to be particularly directed to achievement of African American students for the indefinite future, not simply for the period necessary to satisfy the court. We would like to receive the Board's assurance that this is the case.

We would appreciate your providing this letter to the Superintendent and the members of the school board.

Sincerely,
John W. Walker

JWW:lp
cc: All Counsel
Ms. Ann Marshall
Judge Thomas Ray

IN THE UNITED STATES DISTRICT COURT
EASTERN DISTRICT OF ARKANSAS
WESTERN DIVISION

LITTLE ROCK SCHOOL DISTRICT PLAINTIFF

V. No. LR-C-82-866

PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL. DEFENDANTS
MRS. LORENE JOSHUA, ET AL. INTERVENORS
KATHERINE KNIGHT, ET AL. INTERVENORS

RECEIVED MAY 26 2004 OFFICE OF DESEGREGATION MONITORING

PLAINTIFF'S NOTICE OF FILING DOCUMENTS IN RESPONSE TO THE COURT'S ORDER FILED MAY 12, 2004

Plaintiff Little Rock School District (LRSD) for its Notice of Filing states:

1. In response to the Court's Order filed May 12, 2004, attached is a Glossary of Acronyms and Educational Terms.

Respectfully Submitted,

LITTLE ROCK SCHOOL DISTRICT
Friday, Eldredge & Clark
Christopher Heller (#81083)
2000 Regions Center
400 West Capitol
Little Rock, AR 72201-3493
(501) 376
By: Christopher Heller

Page 1 of 2

CERTIFICATE OF SERVICE

I certify that a copy of the foregoing has been served on the following people by depositing a copy of same in the United States mail on May 24, 2004:

Mr. John W. Walker
JOHN W. WALKER, P.A.
1723 Broadway
Little Rock, AR 72201

Mr. Mark T. Burnette
Attorney at Law
1010 W. 3rd
Little Rock, AR 72201

Mr. Robert Pressman
22 Locust Avenue
Lexington, MA 02173

Mr. Sam Jones
Wright, Lindsey & Jennings
2200 Nations Bank Bldg.
200 West Capitol
Little Rock, AR 72201

Ms. Ann Marshall
Desegregation Monitor
1 Union National Plaza
124 W. Capitol, Suite 1895
Little Rock, AR 72201

Judge J. Thomas Ray
U.S. District Courthouse
600 West Capitol
Little Rock, AR 72201

Mr. Steve Jones
JACK, LYON & JONES, P.A.
425 W. Capitol, Suite 3400
Little Rock, AR 72201-3472

Mr. Mark A. Hagemeier
Office of the Attorney General
323 Center Street
200 Tower Building
Little Rock, AR 72201

Christopher Heller

Page 2 of 2

GLOSSARY OF ACRONYMS AND EDUCATIONAL TERMS

Below are identifications and/or definitions of acronyms and other educational terms that appear in exhibits. While most of the acronyms and terms are generically defined and equally applicable to most school districts in Arkansas, many are defined specifically in relation to the Little Rock School District.

ACSIP (Arkansas Comprehensive School Reform Improvement Plan) - Plan required by the State which specifically sets steps for school improvement.

AFRAMER (African-American)

ALP (Alternative Language Program) - Another name for ESL.

ALT (Achievement Level Tests) - Tests the LRSD developed, with the assistance of a commercial testing firm, for the purpose of measuring student achievement growth within a school year. The test items were selected from a menu in the test firm's item bank, so all the questions had been used numerous times in schools across the country. Students in grades 3-11 took these tests in the fall and spring of each year.
The LRSD discontinued the ALTs in September 2002.

ANCOVA (Analysis of Covariance)

ANOVA (Analysis of Variance) - Statistical test with one outcome.

AP (Advanced Placement) - High-level courses with curriculum developed by the College Board which allow students to test for earned college-level credit while in high school.

AR (Accelerated Reader) - A program based on the premise that students become more motivated to read if they are tested on the content of the books they have read and are rewarded for correct answers. Students read books at predetermined levels of difficulty, individually take a test on a computer, and receive some form of reward when they score well.

AYP (Adequate Yearly Progress) - Amount of improvement in proficiency required each year to reach total proficiency under NCLB (2013).

Benchmark Examination - One of the criterion-referenced examinations implemented by the Arkansas Department of Education (ADE) for all Arkansas public schools in the 4th, 6th, 8th, and 11th grades and in selected high school courses. The tests are based on the state's curriculum as outlined in the curriculum frameworks. Test results are categorized as Below Basic, Basic, Proficient, and Advanced.

BL (Balanced Literacy) - An approach to literacy instruction that focuses on providing instruction that addresses students' individual strengths and needs through whole group and flexible grouping to enhance student development in all of the language arts areas: reading, writing, spelling, listening, and speaking.

CAP (Concepts About Print) - One of the assessments included in the Observation Survey Assessment which assesses children's knowledge of book concepts.

CAT (Collaborative Action Team) - A process designed to increase stakeholders' involvement in schools.

CBL (Calculator-Based Laboratories) - Probes used to collect data for classrooms.
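The glossary defines ANOVA as a statistical test with one outcome (and, later, MANOVA as its multiple-outcome counterpart). As a minimal sketch of what the one-way ANOVA test computes, the snippet below derives the F statistic (between-group variance over within-group variance) from scratch; the function name and the three score lists are invented for illustration, not actual district data.

```python
# Stdlib-only sketch of the one-way ANOVA F statistic:
# F = (between-group sum of squares / df_between) /
#     (within-group sum of squares / df_within).
from statistics import mean

def one_way_anova_f(*groups):
    """F ratio for a one-way ANOVA over two or more groups of scores."""
    all_scores = [x for g in groups for x in g]
    grand = mean(all_scores)
    k = len(groups)         # number of groups
    n = len(all_scores)     # total observations
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical achievement scores for three instructional programs;
# a large F suggests group means differ more than chance would explain.
f = one_way_anova_f([70, 72, 68], [75, 78, 77], [80, 83, 82])
```

In practice one would compare F against the F distribution with (k-1, n-k) degrees of freedom to get a p-value; the hand computation above is only meant to make the "one outcome, several groups" idea concrete.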
CLT (Campus Leadership Teams) - A term used to refer to school-based leadership committees.

CMP (Connected Mathematics Project) - Mathematics curriculum resource used in Grades 6-8 in the Little Rock School District.

CREP (Center for Research in Educational Policy) - An organization based at the University of Memphis that conducts program evaluations for educational organizations. Dr. Steve Ross and Dr. John Nunnery are two researchers for CREP.

CRT (Criterion-Referenced Tests) - Tests that LRSD curriculum specialists, teachers, and other staff developed using the state's curriculum frameworks and the district's curriculum to guide item development.

CSR (Comprehensive School Reform) - A whole-school reform model.

DI (Direct Instruction) - A reading program that uses very explicit instructional language and follows a highly prescriptive program of instruction that is implemented according to a predetermined scope and sequence of skills.

DIBELS (Dynamic Indicators of Basic Early Literacy Skills) - A system utilizing a variety of assessments to monitor a child's progress in developing specific literacy skills which have predictive value for future reading achievement. The assessments include, but are not limited to, letter identification, phoneme segmentation, and oral reading fluency.

DRA (Developmental Reading Assessment) - The second of two assessments given to LRSD students in grades K-2. This assessment consists of stories that increase in difficulty as the child's reading ability increases. Students are evaluated on a variety of reading skills, including comprehension.

DSA (Developmental Spelling Assessment) - An assessment to monitor student progress along a spelling developmental continuum.

ELLA (Early Literacy Learning in Arkansas) - A statewide three-year staff development process designed to assist teachers in grades K-2 in implementing instructional techniques that support emergent learners.
ELLA helps enhance teachers' understanding of how students learn to read and encourages them to use a balanced literacy approach in the classroom.

EOC (End-of-Course Exam) - State-developed criterion-referenced tests implemented in Arkansas schools as part of the Arkansas Comprehensive Testing, Assessment, and Accountability Program (ACTAAP). Currently, end-of-course exams are administered only in Algebra I and geometry.

EXPLORE - An American College Testing (ACT) program designed to help 8th and 9th graders examine a broad range of options for their future. EXPLORE helps prepare students for their high school course work as well as their post-high school choices.

ESL (English as a Second Language) - Refers to students for whom English is not their native language.

EYE (Extended Year Education) - Applies to schools with atypical school calendars without a long summer break.

FEPE (Fluent English Proficient Exited) - Students who are released from the ESL program due to proficiency in English.

GT (Gifted and Talented)

HBE (Home-Based Educators) - Employees of the Home Instruction for Parents of Preschool Youngsters (HIPPY) Program.

HIPPY (Home Instruction for Parents of Preschool Youngsters) - A parent-involvement readiness program for young children. The program, which has been operating in the United States since 1984, offers home-based early childhood education for three-year-old children, working with their parent(s) as their first teacher. The HIPPY program provides parents with carefully developed materials, curriculum, and books designed to strengthen their children's early literacy skills and their social, emotional, and physical development.
HLM (Hierarchical Linear Model)
HSCP (Home, School, and Community Partnership) - A precursor to the Collaborative Action Team (CAT)
HSTW (High Schools That Work) - A school-wide reform model for high schools that is based on the key practices of successful high schools
IRC (Instructional Resource Center) - Offices of curriculum staff for LRSD.
ITBS (Iowa Test of Basic Skills) - Norm-referenced assessment currently used by LRSD, replacing the Stanford Achievement Test
JR TEAMS (Joint Recruiting and Teaching for Effecting Aspiring Minorities in Science) - A two-week multidisciplinary pre-college science and engineering program offered through a partnership with the University of Arkansas at Little Rock aimed at increasing the number of minority students pursuing degrees in science and engineering
LEP (Limited English Proficient) - Identifies students not proficient in English
LPAC (Language Proficiency Assessment Committee)
LPTQ (Literacy Program Teacher Questionnaire)
MANOVA (Multivariate Analysis of Variance) - Statistical tests with multiple outcomes
MSS (Middle School Survey) - A survey completed by teachers and students on the implementation of the middle school model.
NALMS (Not Assessed Language Minority Students)
NCE (Normal Curve Equivalent) - A type of standard score. NCE scores are normalized standard scores on an equal-interval scale from 1 to 99, with a mean of 50. The NCE was developed by RMC Research Corporation in 1976 to measure the effectiveness of the Title I Program across the United States. An NCE gain of 0 means that the Title I Program produced only an average gain, or the expected gain if there was no Title I Program. (Students must answer more items correctly on the posttest than on the pretest in order to maintain the same NCE.) All NCE gains greater than 0 are considered positive.
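The score scales defined in this glossary are related by simple formulas: a z-score places a raw score in standard-deviation units, and an NCE is a normalized score rescaled to a mean of 50 with a standard deviation of 21.06, chosen so that NCEs 1, 50, and 99 coincide with percentile ranks 1, 50, and 99. A minimal sketch of these conversions (illustrative only; the function names are not district software):

```python
from statistics import NormalDist

_STD_NORMAL = NormalDist()  # standard normal: mean 0, standard deviation 1

def percentile_to_nce(percentile_rank: float) -> float:
    """Convert a percentile rank (1-99) to a Normal Curve Equivalent.

    The percentile rank is first mapped to the z-score below which that
    fraction of the norm group falls, then rescaled to mean 50 and
    standard deviation 21.06.
    """
    z = _STD_NORMAL.inv_cdf(percentile_rank / 100)  # z-score for that rank
    return 50 + 21.06 * z

def z_score(raw_score: float, mean: float, sd: float) -> float:
    """Place a raw score on the common z scale: (x - mean) / sd."""
    return (raw_score - mean) / sd
```

By construction, a percentile rank of 50 maps to an NCE of exactly 50, while percentile ranks 1 and 99 map to NCEs of approximately 1 and 99; between those anchor points the two scales diverge because the NCE is an equal-interval scale and the percentile rank is not.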
NCLB (No Child Left Behind) - Federal legislation requiring vast assessment and increased standards for American public schools
NCTM (National Council of Teachers of Mathematics) - An organization of math teachers and specialists that has provided the standards for K-12 mathematics
NPR (National Percentile Rank) - National percentile ranks indicate the relative standing of a student in comparison with other students in the same grade in the norm (reference) group (in this case, the nation) who took the test at a comparable time. Percentile ranks range from a low of 1 to a high of 99, with 50 denoting average performance for the grade. The percentile rank corresponding to a given score indicates the percentage of students in the same grade in the norm group obtaining scores equal to or less than that score. For example, a student earning a percentile rank of 62 achieved a score that was equal to or better than the scores earned by 62% of the students in the national sample.
NSES (National Science Education Standards) - The standards established for K-12 science education
NSF (National Science Foundation) - A government entity created in 1950 to promote excellence in science and to fund research. The LRSD received funds from NSF through a multiyear grant to improve mathematics and science instruction and achievement, naming the program Comprehensive Partnerships for Mathematics and Science Achievement (CPMSA). Grant funding ended August 31, 2003.
NWEA (Northwest Evaluation Association) - A company that developed the Achievement Level Tests
OTE (Onward to Excellence) - A whole-school restructuring model
PD (Professional Development) - Term used to describe the training provided to teachers to enhance their instructional or classroom management skills.
PHLOTE (Primary Home Language Other than English)
PLAN - An American College Testing (ACT) guidance resource for 10th graders.
PLAN helps students measure their current academic development, explore career or training options, and make plans for the remaining years of high school and post-graduation years. As a pre-ACT test, PLAN is a good predictor of success on the ACT. Typically, PLAN is administered in the fall of the sophomore year.
PRE (Planning, Research, and Evaluation) - A department of the Little Rock School District
Pre-AP (Pre-Advanced Placement) - Courses designed for middle school and high school to prepare students for success in Advanced Placement level courses.
Pre-K-3 (Pre-kindergarten through 3rd Grade)
RIT (Rasch Unit) - A type of scaled score.
RR (Reading Recovery) - An intensive early-intervention literacy program developed in New Zealand and used in this country for many years. The program is based on helping children with poor reading readiness skills develop the skills common to proficient readers.
SAIP (Student Academic Improvement Plan) - A personalized plan required by the State for lower-achieving students on ACTAAP Benchmark tests. Includes both areas of deficiency and plans for remediation.
SAT 9 (Stanford Achievement Test, 9th Edition) - A general education test used widely across the United States. It compares a student's performance on the test to a representative national norm group of students. For many years, the publisher of the SAT 9 has had a contract with the ADE to provide tests to all students in the state's public schools in grades five, seven, and ten. The results are widely reported for every school district in the state, and each district receives data in varying formats to allow analysis of student performance by school, class, gender, race, or wealth. (Beginning in the 2003-04 school year, the state will require a similar nationally-normed test, the Iowa Tests, rather than the SAT.)
SEDL (Southwest Educational Development Laboratory) - A private, not-for-profit education research and development corporation based in Austin, Texas.
SEDL works with educators, parents, community members, and policymakers in the southwestern states to develop and implement effective strategies to address pressing educational problems.
SEM (Science, Engineering, and Mathematics)
SFA (Success for All) - A school-based achievement-oriented program for disadvantaged students in pre-K through grade five. The program is designed to prevent or intervene in the development of learning problems in the early years by effectively organizing instructional and family support resources within the regular classroom. Specifically, the goal of Success for All is to ensure that virtually every student in a high-poverty school will finish the 3rd grade with grade-level reading skills.
SLET (Secondary Literacy Evaluation Team)
SMART (Summer Mathematics Advanced Readiness Training) - This is a two-week, half-day summer program for rising 8th and 9th grade students who will be enrolled in Algebra I during the upcoming school year. SMART provides opportunity for students to gain the knowledge, skills, and confidence needed to succeed in Algebra I.
SpEd - Special Education
SREB (Southern Regional Education Board) - A private, not-for-profit education research and development corporation based in Atlanta, GA. SREB works with schools, educators, and policymakers in the southern states to develop and implement effective strategies to address pressing educational problems. One school-wide reform model, developed and sustained by SREB, is High Schools That Work (HSTW).
SS (Scaled Score) - A type of standard score. A scaled score is calculated based on the difficulty of the questions and the number of correct responses. Scaled scores are useful for comparing student performance over time and across grades. All norm-referenced scores are derived from the Scaled Score.
Standard Score - Standard scores are a universally understood score system. Standard scores are used to place raw scores in context.
For example, a raw score on a test doesn't mean much by itself because it isn't compared to anyone or to any scale. Standard scores offer two advantages over conventional raw scores: (1) standard scores take into account the relative difficulties of various exams and assignments, and (2) standard scores make it possible to measure improvement.
TAP (Teacher Advancement Program) - A strategy to attract, retain, motivate, and develop talented people to the teaching profession by rewarding good teachers with higher salaries.
THRIVE (Project THRIVE, a follow-up component to SMART) - This is a Saturday academy for students who are enrolled in Algebra I. Students participate in ten (10) Saturday sessions during the school year. Two primary goals of Project THRIVE are 1) to strengthen mathematical skills required to be successful in Algebra I, and 2) to prepare students for the State End-of-Course examination in Algebra I.
URM (Underrepresented Minority Populations) - Includes American Indian/Alaskan Native, Black or African-American, and Hispanic or Latino.
VOC (Writing Vocabulary) - One of the assessments included in the Observation Survey Assessment
WRAT (Wide Range Achievement Test)
Z-scores - A test score that is converted to a common scale wherein scores from sets of data with different units can be compared.

IN THE UNITED STATES DISTRICT COURT EASTERN DISTRICT OF ARKANSAS WESTERN DIVISION

LITTLE ROCK SCHOOL DISTRICT, PLAINTIFF V. No. LR-C-82-866 PULASKI COUNTY SPECIAL SCHOOL DISTRICT NO. 1, ET AL., DEFENDANTS MRS. LORENE JOSHUA, ET AL., INTERVENORS KATHERINE KNIGHT, ET AL., INTERVENORS

RECEIVED JUN - 9 2004 OFFICE OF DESEGREGATION MONITORING

The Joshua Intervenors' Proposed Findings of Fact and Conclusions of Law Concerning the LRSD's Implementation of the Section 2.7.1 Compliance Remedy

On September 13, 2002, this court held that the LRSD had failed to substantially comply with Section 2.7.1 of the agreed upon desegregation and education plan. [Mem. Opin.
at 150-60] Accordingly, the court set forth a Compliance Remedy. [Id. at 170-72] This court's September 2002 opinion identified the purpose of Section 2.7.1, the importance of substantial compliance with its terms, and the capacity which the LRSD must demonstrate as one element of its burden to justify the termination of the court's supervision. This court wrote:

. . . I find that the purpose of Sec. 2.7.1 was to make sure that the programs under Sec. 2.7 actually worked to improve the academic achievement of African-American students. I further find that LRSD's substantial compliance with Sec. 2.7.1 was crucial to its commitment to improve the academic achievement of African-American students
for, without performing a rigorous annual assessment of each of the many dozens of programs implemented under Sec. 2.7, it would be impossible to determine which programs were working and should be continued and which programs were not working and should be discontinued, modified, or replaced with new programs. [at 150; emphasis in original]
I conclude that the court should continue supervision and monitoring of LRSD's compliance with this crucially important section of the Revised Plan in order to ensure that LRSD has in place an effective assessment program that will allow it to identify and improve those programs that are most effective in remediating the academic achievement of African-American students. [at 168]

These elements of the court's opinion help to frame the issues presented by the Joshua Intervenors' opposition to the LRSD effort to be released from court supervision, heard by the court on June 14-15, 2004. Based upon the record, the court enters the following findings of fact and conclusions of law.[1]

I. Findings of Fact

A. The Lack of Capacity of the LRSD to Perform the Requisite Assessments and Evaluations

(1.) Based upon the facts set forth in paragraphs 2 through 26, the court finds that the LRSD has failed to "[demonstrate] that a program assessment procedure is in place that can accurately measure the effectiveness of each program implemented under Section 2.7 in improving the academic achievement of African-American students
." ["Compliance Remedy," Mem. Opin. at 170;
see also id. at 168] [Haney, Hunter, Jones, Marshall testimony]

[1] LRSD and Joshua Intervenors' exhibits are cited "LRX at --" and "JX at --," respectively. Witnesses are cited by their last names.

The Lack of Adequate Staff

(2.) In its ruling of September 13, 2002, this court cited the recognition of the school board and upper-echelon administrators that the LRSD had been without the capacity to prepare what the court termed "in-depth and analytic" program evaluations. [Mem. Opin. at 156;
see id. at 153 (Dr. Lesley)
at 156-57 (school board)
at 157 (Superintendent Carnine)]
at 159 (Dr. Lesley).

(3.) The LRSD Compliance Plan was heavily dependent on actions by former Associate Superintendent Bonnie Lesley. [LRX 3 ("Action Plan Timeline" at 15-16)] Dr. Lesley left the district for employment out of state on March 14, 2003. [JX 11 at 5] The slow pace of filling her position played a substantial part in continuing the lack of adequate staffing for the assessment/evaluation task. [Id. (ODM notes filling of position on an interim basis on June 26, 2003)] [Jones and Marshall testimony]

(4.) Overall, the evidence establishes that subsequent to the court's entry of the Compliance Remedy, the LRSD has continued to have an inadequately staffed evaluation/assessment capacity. [JX 11 at 2 (third paragraph), 5, 6, 16 (second paragraph) (ODM report, March 30, 2004);
Jones and Marshall testimony] The ODM report states in part:

In the summer of 2001, the associate superintendent who led PRE had resigned and that position had remained empty. As a result, the top positions in both PRE and the instructional division were vacant at the critical time for preparing program evaluations. [footnote omitted] [JX 11 at 16]

PRE refers to the Department of Planning, Research and Evaluation.

The Failure to Identify the Programs Subject to the Compliance Remedy

(5.) In the opinion of September 13, 2002, this court found that the LRSD had identified "many dozens of programs [as] implemented under Section 2.7 [of the agreed upon Plan] . . ." [Mem. Opin. at 150] The court's Compliance Remedy provides in part as follows:

A. For the entire 2002-03 school year and the first semester of the 2003-04 school year, through December 31, 2003, LRSD must continue to assess each of the programs implemented under Section 2.7 to improve the academic achievement of African-American students. . . . [Mem. Opin. at 170;
emphasis added] Nevertheless, despite inquiries from ODM, the LRSD never identified, with clarity, the programs which it deems to be subject to this mandate. [JX 11 at 23 (ODM report, March 30, 2004)
Jones testimony]

Standards for Conducting Evaluations

(6.) In the light of the court's opinion [Mem. Opin. at 151-52;
153
156-58], the LRSD properly concluded [LRX 3 at 7 (LRSD Compliance Plan)] that it must each year complete some comprehensive evaluations of key parts of the curriculum "designed to improve and remediate the academic achievement of African-American students." [Plan Section 2.7]

(7.) In 2000, Dr. Ross met with the LRSD Compliance Committee. A part of the discussion is described in the ODM report, March 30, 2004, as follows:

. . . [Dr. Ross] also described the program evaluation process, which included a classroom observation plan developed at the University of Memphis. The observations were to ensure that programs were being consistently implemented in the classrooms throughout the district. [JX 11 at 3;
Jones testimony]

(8.) Dr. Ross prepared for the LRSD a document, dated December 3, 2002, regarding the completion of 8 of the 14 "page 148" evaluations (that is, evaluations listed on page 148 of the March 2002 interim compliance report). It is titled "Guidelines for Completing Eight Program Evaluations in Little Rock School District." [JX 6] The document articulates, among others, the following premise [JX 6 at 1]:
Program evaluations that focus predominately on student achievement outcomes while lacking sufficient implementation data have reduced value due to inability to determine the nature of the 'treatment.' The study will also fail to inform policymakers about the practicality of the program, how it was used and reacted to by stakeholders, or whether and/or how it needs to be improved to impact at-risk learners.

[Footnote 3: The LRSD concluded that 6 of the 14 "page 148" evaluations could be approved by the school board "without additional work." [LRX 3 at 5] Dr. Ross' Guidelines addressed the completion of the other 8 "page 148" evaluations. [JX 6 at 12]]

(9.) On October 10, 2002, the LRSD school board adopted Regulation IL-R1, titled "Program Evaluation Agenda." The Regulation sets forth standards and procedures for the content of program evaluations in the LRSD. [JX 2]

(a) LRSD Regulation IL-R1 [JX 2 at 3] identifies the need for the evaluation process to satisfy "accuracy standards," including one concerning "program documentation":

Program Documentation. The program being evaluated should be described and documented clearly and accurately so that it is identified clearly. . . .

(b) LRSD Regulation IL-R1 also contains the following provision ("Program Evaluation Procedures") [JX 2 at 4-5]:

6.C. Write a clear description of the curriculum/instruction program that is to be evaluated, with information about the schedule of its implementation.

(c) Regulation IL-R1 provides in part [JX 2 at 5] that the first meetings [of the evaluation team] "will be devoted to the following tasks": . . .

D. Agree on any necessary research questions that need to be established in addition to the question, 'Has this curriculum/instruction program been ineffective in improving and remediating the academic achievement of African-American students?' (See Policy IL, 2.7.1 of the Revised Desegregation and Education Plan, and Judge Wilson's Compliance Remedy.)
Thus, LRSD policy recognized that the court's Compliance Remedy required a focus on individual programs (". . . program . . .").

(10.) LRSD Policy IL ("Evaluation of Instructional Programs") [JX 1] provides that all program evaluations will follow standards established by the National Joint Committee on Standards for Education Evaluation. Policy IL-R1 further identifies these standards as The Program Evaluation Standards, 2nd Edition: How to Assess Evaluations of Educational Programs (Thousand Oaks, CA: Sage Publications). [JX 2 at 1] These standards include the following content in the section on "accuracy standards" [at 125, 127-28]:

STANDARD The program being evaluated should be described and documented clearly and accurately, so that the program is clearly defined.

Overview It is necessary for the evaluator to gain a solid understanding of the program being evaluated, including both the way it was intended to be and the way it actually was implemented, and to convey this description to others. Failure to gain such understanding will lead to an evaluation that, when completed, is likely to be of questionable use. A valid characterization of a program as it actually was implemented will describe its unique features and component parts in order to facilitate comparisons of the program with similar programs. A good description of the program will also facilitate attempts to associate components of the program with its effects. *

GUIDELINES A. Ask the client and the other stakeholders to describe orally, and, if possible, in writing the intended and the actual program with reference to such characteristics as personnel, cost, procedures, location, facilities, setting, activities, objectives, nature of participation, and potential side effects. . . . C. Engage independent observers to describe the program if time and budget permit. D.
Set aside time at the beginning of the evaluation to observe the program and the staff and participants who are involved. . . .

The Literacy Evaluation (filed by LRSD on March 12, 2004)

(11.) The LRSD offers as one comprehensive evaluation the "Little Rock School District Literacy Program Evaluation." The LRSD provided or approved a list of research questions for this study not including the question quoted in para. 9(c), identified by the LRSD as a necessary element of any evaluation to be a part of the effort to satisfy the court's Compliance Remedy. [Footnote 4: See Literacy Evaluation at 1; at 4 (indicating that the question most relevant to the Compliance Remedy was given lesser emphasis).] Where the focus was to be the impact of individual programs on African-American achievement and the possible need for program changes, this omission led to an evaluation with an insufficient focus on particular programs and their impact on the intervenor class.

(12.) The Literacy Evaluation contains insufficient description of the program(s) being evaluated to satisfy LRSD or professional standards. See Literacy Evaluation at 10-11; paragraphs 9-10, supra; Haney, Hunter, Jones testimony. This is particularly the case at the middle school and high school levels. [Literacy Evaluation at 11] Interviews of middle and high school teachers revealed a lack of knowledge of any literacy plan at those levels. [Literacy Evaluation at 7, 13, 43]

(13.) The Literacy Evaluation does not provide senior administrators or the school board information on particular programs adequate to determine whether any particular program should be eliminated, modified, or better implemented. [Hunter and Jones testimony] There are evaluation models allowing a focus on individual programs. [Haney and Hunter testimony;
LRX 16 at 6]

(14.) The Literacy Evaluation provides scant information on the extent of implementation of any particular program. As Sec. 2.7.1 refers to "modifying how the program is implemented," this deficiency is highly significant. [Haney, Hunter testimony]

(15.) The Literacy Evaluation is in the main an evaluation of student test scores, rather than an evaluation of the impact of particular education programs. [Literacy Evaluation at 44-47; Haney and Hunter testimony] The Literacy Evaluation is marked by several technical problems (absence of data on use of teacher questionnaires; lack of demographic information on teachers in focus groups; inadequate data on students whose files were excluded from analyses). [Hunter testimony]

(16.) There are at least two problems in the analyses of the trends in African-American students' scores on successive versions of Arkansas benchmark tests. [Literacy Evaluation at 44-45] There is no discussion of "equating" successive versions (that is, considering whether later versions are of comparable rigor). There is no consideration of whether the pattern reported at upper grades is attributable to the dropping out, disproportionately, of black students with weaker achievement levels. [Haney and Marshall testimony; JX 13]

(17.) To satisfy professional standards for evaluations, a report that addresses progress on standardized tests should include other data bearing on the presence or absence of academic progress, such as grade-to-grade progression data (that is, whether students are being promoted or retained) and dropout data. [Haney and Marshall testimony]

(18.) The Literacy Evaluation is deficient when measured against the standards earlier articulated by Dr. Ross. See paragraphs 7-8. The text of the Literacy Evaluation shows that it "focus[es] predominately on student achievement outcomes while lacking sufficient implementation data . . ." The description of programs is exceedingly terse and, at grade levels 10-12, almost non-existent. [Literacy Evaluation at 10-11] It reflects no observation of classrooms by outside observers to assess actual program implementation. The latter problem is a consequence of the schedule for the evaluation adopted by the LRSD, as well as the inadequacy of funding (not sufficient to pay for classroom observation). This study cannot help to answer the question "whether and/or how [the literacy program] needs to be improved to impact at-risk learners." LRSD Regulation IL-R1 [JX 2] includes as one criterion for identifying evaluation topics the following question [at 4]: "Can the results of the evaluation influence decisions about the program?" See also LRX 16 at 6 (memorandum by Dr. Ross dated April 2004 recognizing the parameters of the Literacy Evaluation).

The Math-Science Evaluation (filed by LRSD on March 12, 2004)

(19.) The LRSD offers as one comprehensive evaluation "An Evaluation of Mathematics and Science Programs in the Little Rock School District from 1998 to 2003." The Math-Science Evaluation contains insufficient description of the program(s) being evaluated to satisfy LRSD or professional standards. [See Math-Science Evaluation at 5-10;
paragraphs 9-10, supra
Haney, Hunter, Jones testimony]

(20.) The Math-Science Evaluation does not provide senior administrators or the school board information on particular programs adequate to determine whether any individual program should be eliminated, modified, or better implemented. [Hunter and Jones testimony]

(21.) The Math-Science Evaluation identifies methods for determining the extent of implementation of educational programs, but does not provide results for the math-science program. [Math-Science Evaluation at 11] As Sec. 2.7.1 refers to "modifying how the program is implemented," this deficiency is highly significant. [Haney, Hunter testimony]

(22.) The Math-Science Evaluation is in the main an evaluation of student test scores
This project was supported in part by a Digitizing Hidden Special Collections and Archives project grant from The Andrew W. Mellon Foundation and Council on Library and Information Resources.