Memorandum of Understanding

This Memorandum of Understanding (hereinafter MOU), effective the seventeenth day of October 2005 (hereinafter the Effective Date), is entered into by and between James S. Catterall (hereinafter Evaluator), Graduate School of Education & Information Studies, University of California, Los Angeles, CA 90024, and Little Rock School District (hereinafter Sponsor), whose offices are located at 810 West Markham Street, Little Rock, AR 72201.

WITNESSETH

WHEREAS, Sponsor, to comply with the June 30, 2004 Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division, and Program Evaluation Standards, will hire outside consultants to prepare formal, step-two evaluations; and

WHEREAS, Evaluator possesses unique knowledge and experience relating to such formal step-two evaluations and Program Evaluation Standards;

NOW, THEREFORE, in consideration of the premises and the mutual covenants and conditions hereinafter recited, the Sponsor and Evaluator do hereby agree as follows:

1. Definitions

For purposes of this MOU, the following definitions apply:

1.1 Compliance Remedy shall mean the entire June 30, 2004 Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division in Little Rock School District v. Pulaski County Special School District No. 1 et al., Mrs. Lorene Joshua et al. and Katherine Knight et al. Intervenors (Exhibit A).

1.2 MOU Period shall mean the period commencing on the Effective Date of this MOU and terminating on October 31, 2006. The term of this MOU may be extended by the mutual written consent of the duly authorized representatives of Evaluator and Sponsor.
1.3 Formal step-two evaluation (hereinafter Evaluation) shall mean a summative evaluation of Sponsor's A+ Program (hereinafter A+) conducted by the Evaluator according to the Sponsor's Comprehensive Program Assessment Process and described more fully in Exhibit B, which is incorporated herein by reference. The Evaluation particularly ascertains the performance of African-American students.

1.4 Comprehensive Program Assessment Process (Exhibit B) shall mean the process required by the Compliance Remedy, adopted by Sponsor's Board of Directors on December 16, 2004, and incorporated as Appendix B in the first quarterly written update by the Sponsor to the Office of Desegregation Monitoring and Joshua, December 1, 2004.

1.5 Evaluation Funds shall mean those funds paid by the Sponsor to the Evaluator in accordance with this MOU.

1.6 Evaluation Team shall mean the Evaluator and any personnel under the Evaluator's direction and control who are supported in whole or in part by the Evaluation Funds.

1.7 Planning, Research, and Evaluation (hereinafter PRE) shall mean Sponsor's department, which shall represent the Sponsor and oversee the Evaluation.

1.8 Proprietary Information shall mean any data, information, concepts, routines, artwork, design work, advertising copy, specifications, or improvement that is commercially valuable, not generally available to or known in the industry, and belonging to Evaluator. Proprietary Information shall not include information which: (a) is or becomes a part of the public domain through no act or omission of the receiving party; (b) was in the receiving party's lawful possession prior to the disclosure and had not been obtained by the receiving party either directly or indirectly from the disclosing party; (c) is lawfully disclosed to the receiving party by a third party without restriction on disclosure; (d) is independently developed by the receiving party; or (e) is disclosed by operation of law.
1.9 Confidential Information shall mean data or information related to the identities of individuals such as Sponsor's students, teachers, administrators (including PRE) or Board of Directors; guardians or relatives of such students; community members; or any other individuals related to the Evaluation.

2. Evaluation

2.1 During the MOU Period, the Evaluator shall conduct an Evaluation of Sponsor's A+ on behalf of Sponsor in accordance with the Compliance Remedy (Exhibit A), within the mutually agreed schedule (Exhibit C), and substantially in accordance with the terms and conditions of this MOU.

2.2 The Evaluation's name is A+.

3. Payments

3.1 Sponsor shall pay Evaluator the Evaluation Funds in the following manner:

3.1.1 Amount: Not to exceed Forty-five Thousand dollars (US $45,000.00).
3.1.2 Rate: $1,500 per day for effort, plus travel expenses.
3.1.3 Travel: Travel expenses for travel between Los Angeles, CA and Little Rock, AR, including Little Rock accommodations and meals, not to exceed $6,000.00 (economy class airfare only).
3.1.4 To be Paid: Upon invoice for effort (days) expended, stated in invoice.
3.1.5 Invoices: Shall state days of effort.

3.2 Payments under the terms of this MOU shall be made by check payable to:

Payee: James S. Catterall
Taxpayer ID: 141-38-3478
Address: 120 N. Topanga Canyon Blvd., Suite 203, Topanga, CA 90290

3.3 Anything herein to the contrary notwithstanding, should this MOU terminate early pursuant to Article 8 herein, Evaluator and Sponsor shall agree upon the estimate of the percentage of completeness of the Evaluator's services rendered hereunder as of the date such notice is given. The Sponsor shall pay the Evaluator a pro rata fee based upon the agreed estimated percentage of completion such that payment will at least include all project costs incurred by Evaluator prior to the date of early termination.

10/17/05

4.
Non-Exclusivity and Disclosure

Nothing in this MOU shall be construed to limit the freedom of the Evaluator to engage in similar research performed independently under other grants, contracts, or agreements with parties other than Sponsor. If the Evaluator undertakes any research or evaluation that uses data from this Evaluation, Evaluator shall disclose such research or evaluation to PRE.

5. Publication and Disclosure

The Evaluator shall have the right to present at symposia and national or regional professional meetings, and to publish in scientific or other publications, the results of the Evaluations conducted under this MOU. Evaluator agrees to make such publication(s) conveniently available to PRE.

6. Confidentiality and Non-Disclosure

The Sponsor and Evaluator expressly acknowledge that Evaluator may need to provide to Sponsor information that Evaluator considers to be Proprietary Information. Sponsor agrees to hold Proprietary Information in strict confidence during the term of this MOU and for a period of two years after the termination or expiration of this MOU, except as required by law. Similarly, the Evaluator shall protect Confidential Information and prevent its disclosure in any manner, except as required by law. Not later than two years after the termination or expiration of this MOU, Evaluator shall destroy all Confidential Information or return it to Sponsor.

7. Ownership and Patents

The Evaluator shall have sole and exclusive ownership rights to any intellectual property, including but not limited to copyrights and/or inventions of a product, device, process, or method, whether patentable or unpatentable (an Invention), deriving from the Evaluator's efforts, exclusive of any data or information, arising out of the Evaluation. Data or information furnished to Evaluator by Sponsor shall remain the property of the Sponsor.

8.
Termination

This MOU shall remain in effect for the MOU Period unless extended in accordance with the terms of this MOU, as set forth in Section 1.2. In the event that either Evaluator or Sponsor shall be in default of any of its obligations under this MOU and shall fail to remedy such default within thirty (30) days after receipt of written notice thereof, the party not in default shall have the option of canceling this MOU by giving thirty (30) days written notice of termination to the other party. Termination of this Agreement shall not affect the rights and obligations of the parties, which shall have accrued prior to termination. No termination of this MOU, however effectuated, shall release either party from its rights and obligations under Articles 3 through 17 herein.

9. Indemnification

Sponsor agrees to defend, indemnify, and hold harmless the Evaluator and its officers and employees (all such parties are hereinafter referred to collectively as the Indemnified Parties) from and against any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including reasonable attorneys' fees and court costs) arising directly or indirectly out of the Evaluation or the design, manufacture, sale, or use of any embodiment or manifestation of the Evaluation, regardless of whether any and all such liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs) arise in whole or in part from the negligence of any of the Indemnified Parties. Notwithstanding the foregoing, Sponsor will not be responsible for indemnification of Evaluator pursuant to this Article 9 for any liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs) which arise solely from: (a) the gross negligence or intentional misconduct of Evaluator; (b) actions by Evaluator in violation of applicable laws or regulations; or (c) violations of this MOU.
The Sponsor agrees to provide a diligent defense against any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs), brought against the Indemnified Parties with respect to the subject of the indemnity contained in this Article 9, whether such claims or actions are rightfully or wrongfully brought or filed. Evaluator shall be indemnified by Sponsor after Evaluator has completed the following: (a) within a reasonable time after receipt of notice of any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses, or after the commencement of any action, suit, or proceeding giving rise to the right of indemnification, notify Sponsor, in writing, of said liability, claims, lawsuits, losses, demands, damages, costs, and expenses and send to the Sponsor a copy of all papers served on the Indemnified Party; and (b) allow Sponsor to retain control of any such liability, claims, lawsuits, losses, demands, damages, costs, and expenses, including the right to make any settlement.

10. Independent Contractors

Sponsor and Evaluator shall act as independent parties, and nothing contained in this MOU shall be construed or implied to create an agency or partnership. Neither Sponsor nor Evaluator shall have the authority to contract or incur expenses on behalf of the other except as may be expressly authorized by collateral written agreements. No member of the Evaluation Team shall be deemed to be an employee of Sponsor.

11. Use of Evaluator Name

The use by either Sponsor or Evaluator of the other's name or any other names, insignia, symbol(s), or logotypes associated with the other party, or any variant or variants thereof, in advertising, publicity, or other promotional activities is expressly prohibited, unless required by law or the other party provides written consent.

12.
Severability

If any one or more of the provisions of this MOU shall be held to be invalid, illegal, or unenforceable, the validity, legality, or enforceability of the remaining provisions of this MOU shall not in any way be affected or impaired thereby.

13. Waiver

The failure of any party hereto to insist upon strict performance of any provision of this MOU or to exercise any right hereunder will not constitute a waiver of that provision or right. This MOU shall not be effective until approved by Evaluator's President or his official designee. Whenever the consent or approval of the Evaluator is required or permitted hereunder, such consent or approval must be given by the Evaluator's President or his official designee.

14. Notices

Any notice or communication required or permitted to be given or made under this MOU by one of the parties hereto to the other shall be in writing and shall be deemed to have been sufficiently given or made for all purposes if mailed by certified mail, postage prepaid, return receipt requested, addressed to such other party at its respective address as follows:

If to Sponsor:
Karen DeJarnette, Ph.D.
Director, PRE Department
Little Rock School District
810 West Markham Street
Little Rock, AR 72201-1306
Phone (501) 447-3387, Fax (501) 447-7609

If to Evaluator:
James S. Catterall, Ph.D.
Research and Evaluation Office
120 N. Topanga Canyon Blvd., Suite 203
Topanga, CA 90290

15. Assignment

Neither Sponsor nor Evaluator shall assign its rights or obligations under this MOU without the prior written consent of the other party.

16. Entirety

This MOU represents the entire agreement of Sponsor and Evaluator, and it expressly supersedes all previous written and oral communications between them. Neither Sponsor nor Evaluator was induced to enter into this Agreement by any statements or representations not contained in this MOU. This MOU may be modified only by written amendment executed by the Sponsor and the Evaluator.

17.
Headings

The headings of sections and subsections, if any, to the extent used herein are for convenience and reference only and in no way define, limit, or describe the scope or intent of any provision hereof, and therefore shall not be used in construing or interpreting the provisions hereof.

IN WITNESS WHEREOF, Sponsor and Evaluator have caused this MOU to be executed in duplicate counterpart original by their duly authorized representatives to be effective as of the Effective Date.

SPONSOR

By: ____________________ Date: __________
Darral Paradis, Director
Procurement Department
Little Rock School District

EVALUATOR

By: ____________________ Date: __________
James S. Catterall, Ph.D.

Exhibit A

COMPLIANCE REMEDY

The Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division in Little Rock School District v. Pulaski County Special School District No. 1 et al., Mrs. Lorene Joshua et al. and Katherine Knight et al. Intervenors, is incorporated here by reference. Evaluator has a copy.

Exhibit B

COMPREHENSIVE PROGRAM ASSESSMENT PROCESS
Little Rock School District

LITTLE ROCK SCHOOL DISTRICT
NEPN CODE: IL-R

Comprehensive Program Assessment Process

Purpose

The purpose of these regulations is to provide guidance in the appraisal of programs and to comply with requirements of the US District Court for the Eastern District. They do not necessarily apply to grant-funded programs if the funding source requires other procedures and provides resources for a required evaluation.

Criteria for Program Evaluations

Policy IL specifies that the evaluations of programs approved in its Board-approved Program Evaluation Agenda will be conducted according to the standards developed by the Joint Committee on Standards for Educational Evaluation. (See Joint Committee on Standards for Educational Evaluation, James R. Sanders, Chair (1994).
The Program Evaluation Standards, 2nd Edition: How to Assess Evaluations of Educational Programs. Thousand Oaks, CA: Sage Publications.)

There are four attributes of an evaluation:

Utility (U) - evaluations are informative, timely, and influential
Feasibility (F) - evaluations must be operable in the natural setting and must not consume more resources than necessary
Propriety (P) - rights of individuals must be protected
Accuracy (A) - evaluations should produce sound information

Prospective, controlled, summative evaluations are at one end of a spectrum of activities that review District operations. Other activities in this continuum include formative and less formal and rigorous evaluations, regular and occasional assessments, and fast or brief "snapshots." As rigor and formality diminish along the range of reviews, fewer standards apply. Examples of how the standards apply are found in the following table, adapted from The Program Evaluation Standards, pages 18 and 19:

Checklist for Applying the Standards

The reader should interpret the information provided in this table with reference both to the Standards (cited above) and the peculiar circumstances of given program reviews. Double plus signs (++) indicate that standards are fully addressed. Single pluses (+) mean that the standard is a concern but not necessarily fully addressed, and zeros (0) point to standards not usually applicable. Not all summative evaluations will fully satisfy every standard, and other examples may observe more standards than indicated here.
Note, however, that all reviews fully observe the human rights and impartial reporting standards.

Checklist of Evaluation Standards for Examples of Program Reviews

Standards rated:
U1 Stakeholder Identification
U2 Evaluator Credibility
U3 Information Scope & Selection
U4 Values Identification
U5 Report Clarity
U6 Report Timeliness & Dissemination
U7 Evaluation Impact
F1 Practical Procedures
F2 Political Viability
F3 Cost Effectiveness
P1 Service Orientation
P2 Formal Agreements
P3 Rights of Human Subjects
P4 Human Interaction
P5 Complete & Fair Assessment
P6 Disclosure of Findings
P7 Conflict of Interest
P8 Fiscal Responsibility
A1 Program Documentation
A2 Context Analysis
A3 Described Purposes and Procedures
A4 Defensible Information Sources
A5 Valid Information
A6 Reliable Information
A7 Systematic Information
A8 Analysis of Quantitative Data
A9 Analysis of Qualitative Data
A10 Justified Conclusions
A11 Impartial Reporting
A12 Meta-evaluation

[The original table rates each standard above (++, +, or 0, per the key in the preceding paragraph) across four review types: Summative evaluations, Informal Assessments, Formative Assessments (School Portfolios), and Snapshots.]

Program Evaluation Procedures

The following procedures are established for the evaluation of programs approved by the Board of Education in its annual Program Evaluation Agenda:

1. The Planning, Research, and Evaluation (PRE) Department will recommend to the Superintendent annually, before the budget for the coming year is proposed, the curriculum/instruction programs for comprehensive program evaluation. The recommendation will include a proposed budget, a description of other required resources, and an action plan for the completion of the reports.
Criteria for the proposed agenda are as follows:

A. Will the results of the evaluation influence decisions about the program?
B. Will the evaluation be done in time to be useful?
C. Will the program be significant enough to merit evaluation?

(See Joseph S. Wholey, Harry P. Hatry, and Kathryn Newcomer (1994). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass Publishers. 5-7.)

2. The Superintendent will recommend to the Board of Education for approval the proposed Program Evaluation Agenda, with anticipated costs and an action plan for completion.

3. For each curriculum/instruction program to be evaluated as per the Program Evaluation Agenda, the Director of PRE will establish a staff team with a designated leader to assume responsibility for the production of the report according to the timelines established in the action plan approved by the Board of Education.

4. Each team will include, at a minimum, one or more specialists in the curriculum/instruction program to be evaluated, a statistician, a programmer to assist in data retrieval and disaggregation, and a technical writer. If additional expertise is required, then other staff may be added as necessary.

5. An external consultant with expertise in program evaluation, the program area being evaluated, statistical analysis, and/or technical writing will be retained as a member of the team. The role of the external consultant may vary, depending upon the expertise required for the production of the program evaluation.

6. The team leader will establish a calendar of regularly scheduled meetings for the production of the program evaluation. The first meetings will be devoted to the following tasks:

A. Provide any necessary training on program evaluation that may be required for novice members of the team, including a review of the Board's policy IL and all of the required criteria and procedures in these regulations, IL-R.

B.
Assess the expertise of each team member and make recommendations to the Director of PRE related to any additional assistance that may be required.

C. Write a clear description of the curriculum/instruction program that is to be evaluated, with information about the schedule of its implementation.

D. Agree on any necessary research questions that need to be established in addition to the question, "Has this curriculum/instruction program been effective in improving and remediating the academic achievement of African-American students?"

E. Generate a list of the data required to answer each research question, and assign responsibility for its collection and production. All available and relevant student performance data should be included. (See Judge Wilson's Compliance Remedy.)

F. Decide who will be the chief writer of the program evaluation.

G. Plan ways to provide regular progress reports (e.g., dissemination of meeting minutes, written progress reports, oral reports to the Superintendent's Cabinet) to stakeholders.

(See Joellen Killion (2002). Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council (NSDC); Robby Champion (Fall 2002). Map Out Evaluation Goals. Journal of Staff Development. 78-79; Thomas R. Guskey (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin Press; Blaine R. Worthen, James R. Sanders, and Jody L. Fitzpatrick (1997). Participant-Oriented Evaluation Approaches. Program Evaluation: Alternative Approaches and Practical Guidelines: 153-169; Beverly A. Parsons (2002). Evaluative Inquiry: Using Evaluation to Promote Student Success. Thousand Oaks, CA: Corwin Press; and Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (1994). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass Publishers.)
7. Subsequent meetings of the program evaluation team are required for the following tasks: to monitor the completion of assignments; to collaborate in the interpretation and analysis of data; to pose any necessary new questions to be answered; to review drafts and provide feedback to the writer; to formulate recommendations, as required, for program improvement, especially to decide if a recommendation is required to modify or abandon the program if the findings reveal that the program is not being successful for the improvement of African-American achievement; to assist in final proofreading; and to write a brief executive summary, highlighting the program evaluation findings and recommendations.

8. A near-final copy of the program evaluation must be submitted to the Director of PRE at least one month before the deadline for placing the report on the Board's agenda for review and approval. This time is required for final approval by staff, for final editing to ensure accuracy, and for submission to the Superintendent.

9. When the program evaluation is approved for submission to the Board of Education for review and approval, copies of the Executive Summary and complete report must be made for the Board and for members of the Cabinet.

10. The program evaluation team will plan its presentation to the Board of Education on the findings and recommendations.

11. The Director of PRE will prepare the cover memorandum to the Board of Education, including all the required background information:

A. If program modifications are suggested, the steps that the staff members have taken or will take to implement those modifications.

B. If abandonment of the program is recommended, the steps that will be taken to replace the program with another with more potential for the improvement and remediation of African-American students.

C. Names of the administrators who were involved in the program evaluation.
D. Name and qualifications of the external expert who served on the evaluation team.

E. Grade-level descriptions of the teachers who were involved in the assessment process (e.g., all fourth-grade math teachers, all eighth-grade English teachers, etc.).

12. When the program evaluation is approved by the Board of Education, the team must arrange to have the Executive Summary and the full report copied and design a plan for communicating the program evaluation findings and recommendations to other stakeholders. This plan must then be submitted to the Director of PRE for approval.

13. Each program evaluation team will meet with the Director of PRE after the completion of its work to evaluate the processes and product and to make recommendations for future program evaluations. (See Joellen Killion (2002). Evaluate the Evaluation. Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council. 46, 123-124.)

Approved: December 16, 2004

Evaluation Standards

Criteria for Program Evaluations

Policy IL specifies that the evaluations of programs approved in its Board-approved Program Evaluation Agenda will be conducted according to the standards developed by the Joint Committee on Standards for Educational Evaluation. (See Joint Committee on Standards for Educational Evaluation, James R. Sanders, Chair (1994). The Program Evaluation Standards, 2nd Edition: How to Assess Evaluations of Educational Programs. Thousand Oaks, CA: Sage Publications.) They are as follows:

Utility Standards

The utility standards are intended to ensure that an evaluation will serve the information needs of intended users. These standards are as follows:

Stakeholder identification. People involved in or affected by the evaluation should be identified so that their needs can be addressed.

Evaluator credibility.
The people conducting the evaluation should be both trustworthy and competent to perform the evaluation so that the evaluation findings achieve maximum credibility and acceptance.

Information scope and selection. Information collected should be broadly selected to address pertinent questions about the program and should be responsive to the needs and interests of clients and other specified stakeholders.

Values identification. The perspectives, procedures, and rationale used to interpret the findings should be described carefully so that the bases for value judgments are clear.

Report clarity. Evaluation reports should describe clearly the program being evaluated, including its context and the purposes, procedures, and findings of the evaluation, so that essential information is provided and understood easily.

Report timeliness and dissemination. Significant interim findings and evaluation reports should be disseminated to intended users so that they can be used in a timely fashion.

Evaluation impact. Evaluations should be planned, conducted, and reported in ways that encourage follow-through by stakeholders, so that the likelihood that the evaluation will be used is increased.

Feasibility Standards

Feasibility standards are intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.

Practical procedures. Evaluation procedures should be practical so that the disruption is kept to a minimum while needed information is obtained.

Political viability. The evaluation should be planned and conducted with anticipation of the different positions of various interest groups so that their cooperation may be obtained, and so that possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted.

Cost-effectiveness. The evaluation should be efficient and produce information of sufficient value so that the resources expended can be justified.

Propriety Standards

Service orientation.
Evaluations should be designed to assist organizations to address and effectively serve the needs of the full range of targeted participants.

Formal agreements. Obligations of the formal parties to an evaluation (what is to be done, how, by whom, and when) should be agreed to in writing so that these parties are obligated to adhere to all conditions of the agreement or to formally renegotiate it.

Rights of human subjects. Evaluation design and conduct should respect and protect human rights and welfare.

Human interactions. Evaluators should respect human dignity and worth in their interactions with other people associated with an evaluation so that participants are not threatened or harmed.

Complete and fair assessments. The evaluation should be complete and fair in its examination and recording of strengths and weaknesses of the program being evaluated so that strengths can be built upon and problem areas addressed.

Disclosure of findings. The formal parties to an evaluation should ensure that the full set of evaluation findings, along with pertinent limitations, are made accessible to the people affected by the evaluation, as well as any others with expressed legal rights to receive the results.

Conflict of interest. Conflict of interest should be dealt with openly and honestly so that it does not compromise the evaluation processes and results.

Fiscal responsibility. The evaluator's allocation and expenditure of resources should reflect sound accountability procedures and be prudent and ethically responsible so that expenditures are accounted for and appropriate.

Accuracy Standards

Accuracy standards are intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated.

Program documentation.
The program being evaluated should be described and documented clearly and accurately so that it is identified clearly.

Context analysis. The context in which the program exists should be examined in enough detail so that its likely influences on the program can be identified.

Described purposes and procedures. The purposes and procedures of the evaluation should be monitored and described in enough detail so that they can be identified and assessed.

Defensible information sources. The sources of information used in a program evaluation should be described in enough detail so that the adequacy of the information can be assessed.

Valid information. The information-gathering procedures should be chosen or developed and then implemented in a manner that will ensure that the interpretation arrived at is valid for the intended use.

Reliable information. The information-gathering procedures should be chosen or developed and then implemented in a manner that will ensure that the information obtained is sufficiently reliable for the intended use.

Systematic information. The information collected, processed, and reported in an evaluation should be reviewed systematically so that the evaluation questions are answered effectively.

Analysis of quantitative information. Quantitative information in an evaluation should be analyzed appropriately and systematically so that the evaluation questions are answered effectively.

Analysis of qualitative information. Qualitative information in an evaluation should be analyzed appropriately and systematically so that the evaluation questions are answered effectively.

Justified conclusions. The conclusions reached in an evaluation should be justified explicitly so that stakeholders can assess them.

Impartial reporting. Reporting procedures should guard against distortion caused by personal feelings and biases of any party so the evaluation reports reflect the evaluation findings fairly.

Meta-evaluation.
The evaluation itself should be evaluated formatively and summatively against these and other pertinent standards so that its conduct is appropriately guided, and on completion, stakeholders can closely examine its strengths and weaknesses.

Exhibit C
SCOPE OF SERVICES
Evaluation of A+ Program

This states services and products by the Evaluator, who will conduct an Evaluation of the Sponsor's A+ Program and produce reports of that Evaluation.

Evaluation questions

For this Evaluation, the primary questions are:
I. Has A+ as implemented in the Little Rock School District improved the academic achievement of students identified by the Sponsor as African-American (AA)?
II. Has A+ as implemented in the Little Rock School District decreased the differences between AA students and those identified by the Sponsor as white (W)?
III. To what extent does A+ account for changes in student performance?

Secondary (step-two) questions are:
1. What competing events or programs (relative to A+) explain changes in student performance?
2. What traits of each group explain their performance and differences in performance between them?
3. What changes in A+ do these results indicate to improve the effectiveness of the programs?
4. How will these recommendations improve AA student performance?

Evaluation design, data, and products

Prior to Evaluator's commencing the Evaluation, Sponsor will agree with Evaluator regarding
A. theoretical model(s),
B. Evaluation design(s), to conform with summative evaluations of the Comprehensive Program Assessment Process,
C. specific variables for the Evaluation,
D. data adjustments and statistical methods,
E. format(s) of data for use in the Evaluation delivered by the Sponsor to the Evaluator,
F. content of deliverable products (written reports) and their formats, and
G. schedule of services and product delivery.

The following table is the schedule of services and product delivery.
Delivery (2005-06): Service and Products
November 2005: Evaluator and Sponsor will negotiate MOU and agree on design of the Evaluations, their schedules, and instruments.
December 2005-July 2006: Evaluator will observe classes, conduct interviews and surveys, and receive test and other data from Sponsor.
August 1, 2006: Evaluator will submit draft report of results to PRE.
August 2006: Evaluator will discuss draft reports with PRE and alter report accordingly.
September 1, 2006: Evaluator will submit final report to PRE.

For the purpose of invoicing, Evaluator will track his efforts in increments of days or some portion thereof.

Exhibit D
A+ Program Evaluation

Primary Evaluation Question:
1. Has the A+ Program (A+) been effective in improving and remediating the academic achievement of African-American students?

Supplemental (Qualitative/Step 2) Evaluation Questions:
1. What are the quality and level of implementation of intersession instructional strategies?
2. What are the quality and level of implementation of instructional strategies during regular session?
3. What is the level of participation in A+ by African American students relative to other ethnic groups at the school?
4. What are the perceptions of A+ teachers regarding program impacts, strengths, and weaknesses?
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses?
6. What are the perceptions of parents/guardians of A+ students regarding program impacts, strengths, and weaknesses?

Program Description

A+ combines daily arts instruction with academic subjects to boost both self-confidence and achievement.
Committed to the Four Beliefs and Eight Essentials as the guiding philosophy for the program, A+ was built on the principle that every child learns better when his/her whole self engages in learning. Thus, A+ lessons stimulate all eight intelligences. Arts enrichment combines with the rich LRSD curriculum. The magic of the A+ program allows light-bulb experiences to illumine students, whether they learn in classroom groups or in movement, music, or visual arts lessons. The magic of the A+ program energizes students with the simple thrill of learning in ways many of them have never experienced before. Included in this scope of A+ is professional development for the faculty. Currently Woodruff Elementary School implements A+.

School Name: Woodruff
Number of Teachers: 21
Number of Students: 235
Percent African-American: 91
Percent Eligible for Free/Reduced Lunch: 86

Proposed Design

Primary Evaluation Question:
1. Have the A+ lessons been effective in improving and remediating the academic achievement of African-American students?
Whole School: A treatment-control school, pretest-posttest design will be employed. The analysis will control for pretest, gender, ethnicity, and SES.
Subsample: Within Woodruff Elementary School, students who participated in A+ will be identified and their achievement gains compared to predicted scores based on school status and student pretest, gender, ethnicity, and SES.

Supplemental (Qualitative/Step 2) Evaluation Questions:
1. What are the quality and level of implementation of A+ instructional strategies during intersessions?
2. What are the quality and level of implementation of A+ instructional strategies during regular session? A+ teachers will be interviewed by phone. A+ instruction will be observed.
3. What is the level of participation in A+ lessons by African American students relative to other ethnic groups at the school? Student records/archival data for 2004-05 and 2005-06 will be analyzed.
4.
What are the perceptions of A+ teachers regarding program impacts, strengths, and weaknesses? The A+ teacher interviews and the A+ Teacher Survey will address this question via closed-ended and open-ended items.
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses? A survey will be administered to program participants.
6. What are the perceptions of parents/guardians of A+ students regarding program impacts, strengths, and weaknesses? A Parent Survey will address this question via a questionnaire including closed- and open-ended items.

Summary of Data Sources and Participants by Evaluation Question

Primary Question:
1. What are the effects of participation in A+ on student achievement?
   Participants: All grades at Woodruff Elementary School
   Data Sources: Benchmark, ITBS, and school records

Supplemental Questions:
1. What are the quality and level of implementation of instructional strategies?
   Participants: All A+ teachers
   Data Sources: Teacher phone interview; classroom observations
2. What are the quality and level of implementation of instructional strategies?
3. What is the level of participation in A+ Programs by African American students relative to other ethnic groups?
   Participants: All A+ classes
   Data Sources: School records/archival data
4. What are the perceptions of A+ teachers regarding program impacts, strengths, and weaknesses?
   Participants: All A+ teachers
   Data Sources: A+ teacher interview and survey
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses?
   Participants: A+ students
   Data Sources: A+ student survey
6. What are the perceptions of parents/guardians of A+ students regarding program impacts, strengths, and weaknesses?
   Participants: Parents of A+ students
   Data Sources: A+ parent survey

Schedule (2005-2006)
November: Planning, refinement, and consultation with PRE and A+ experts and instrument development
December-March: A+ classroom observations and A+ teacher interviews
March-April: Survey A+ school teachers and complete A+ teacher interviews
May-June: Records/archival data analyses
June-July: Evaluator will receive benchmark test results.
July: Evaluator will analyze data of benchmark tests, surveys, and interviews.
August 1, 2006: Evaluator will submit draft report to PRE.
August: Evaluator will receive feedback from PRE and finish final draft.
September 1, 2006: Evaluator will submit final report to PRE.

RECEIVED OCT 25 2005 Desegregation Monitoring

Memorandum of Understanding

This Memorandum of Understanding (hereinafter MOU), effective the first day of February 2005 (hereinafter the Effective Date), is entered into by and between Education Innovations, LLC (hereinafter Evaluator), a Tennessee limited liability company, and Little Rock School District (hereinafter Sponsor), whose offices are located at 810 West Markham Street, Little Rock, AR 72201. WITNESSETH WHEREAS, Sponsor, to comply with the June 30, 2004 Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division, and Program Evaluation Standards, will hire outside consultants to prepare formal, step-two evaluations and WHEREAS, Evaluator possesses unique knowledge and experience relating to such formal step-two evaluations and Program Evaluation Standards NOW, THEREFORE, in consideration of the premises and the mutual covenants and conditions hereinafter recited, the Sponsor and Evaluator do hereby agree as follows: 1.
Definitions For purposes of this MOU, the following definitions apply: 1.1 Compliance Remedy shall mean the entire June 30, 2004 Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division in Little Rock School District v. Pulaski County Special School District No. 1 et al., Mrs. Lorene Joshua et al. and Katherine Knight et al. Intervenors (Exhibit A), which is incorporated herein by reference. 1.2 MOU Period shall mean the period commencing on the Effective Date of this MOU and terminating on November 1, 2005. The term of this MOU may be extended by the mutual written consent of the duly authorized representatives of Evaluator and Sponsor. 1.3 Principal Investigator shall mean Steven M. Ross, Ph.D., appointed by Evaluator to conduct the step-two evaluations hereunder. 1.4 Formal step-two evaluations (hereinafter Evaluations) shall mean summative evaluations of the three programs conducted by the Evaluator according to the Sponsor's Comprehensive Program Assessment Process and described more fully in Exhibit B, which is incorporated herein by reference. 1.5 Comprehensive Program Assessment Process (Exhibit B) shall mean the process required by the Compliance Remedy, adopted by the Sponsor's Board of Directors December 16, 2004, and incorporated as Appendix B in the first quarterly written update by the Sponsor to the Office of Desegregation Monitoring and Joshua, December 1, 2004. 1.6 Evaluation Funds shall mean those funds paid by the Sponsor to the Evaluator in accordance with this MOU. 1.7 Evaluation Team shall mean the Principal Investigator and the research personnel under the Principal Investigator's direction and control who are supported in whole or in part by the Evaluation Funds. 1.8 Planning, Research, and Evaluation (hereinafter PRE) shall mean the department of the Sponsor who shall represent the Sponsor and oversee the Evaluations.
1.9 Proprietary Information shall mean any data, information, concepts, routines, artwork, design work, advertising copy, specifications, or improvement that is commercially valuable, not generally available to or known in the industry and belonging to Evaluator. Proprietary Information shall not include information which: (a) is or becomes a part of the public domain through no act or omission of the receiving party (b) was in the receiving party's lawful possession prior to the disclosure and had not been obtained by the receiving party either directly or indirectly from the disclosing party (c) is lawfully disclosed to the receiving party by a third party without restriction on disclosure (d) is independently developed by the receiving party or (e) is disclosed by operation of law. 2.0 Confidential Information shall mean data or information related to the identities of individuals such as Sponsor's students, teachers, administrators including PRE, or Board of Directors; guardians or relatives of such students; community members; or any other individuals related to the Evaluations.

2. Evaluations

2.1 During the MOU Period, the Evaluation Team shall conduct three (3) Evaluations on behalf of Sponsor in accordance with the Compliance Remedy (Exhibit A).
2.2 Evaluator agrees to perform the Evaluations within a mutually agreed schedule (Exhibit C) and further agrees to complete Evaluations substantially in accordance with the terms and conditions of this MOU. The three Evaluations are named below.
2.2.1 Reading Recovery (Exhibit D)
2.2.2 Compass Learning (Exhibit E)
2.2.3 Smart/Thrive (Exhibit F)

3. Payments

3.1 Sponsor shall pay Evaluator the Evaluation Funds in the following manner:
3.1.1 Amount: Not to exceed one hundred eighty thousand dollars (US$180,000.00) plus reimbursable expenses as indicated below.
3.1.2 Rate: $1,000 per day for effort. Travel cost to be reimbursed at actual cost. Translation cost to be reimbursed at actual cost.
3.1.3 To be Paid:
Upon invoice for effort (days) expended, stated in invoice, and/or reimbursable expenses, documentation included with invoice.
3.1.4 Invoices: Shall state days of effort for each of the Evaluations.

3.2 Payments under the terms of this MOU shall be made by check payable to:
Payee: Education Innovations, LLC
Tax ID #: 56-2288391
Address: 3161 Campus Postal Station, Memphis, Tennessee 38152-3830

3.3 Anything herein to the contrary notwithstanding, should this MOU terminate early pursuant to Article 8 herein, Principal Investigator and Sponsor shall agree upon the estimate of the percentage of completeness of the Evaluator's services rendered hereunder as of the date such notice is given. The Sponsor shall pay the Evaluator a pro rata fee based upon the agreed estimated percentage of completion such that payment will at least include all project costs incurred by Evaluator prior to the date of early termination.

4. Non-Exclusivity and Disclosure

Nothing in this MOU shall be construed to limit the freedom of the Evaluator to engage in similar research performed independently under other grants, contracts, or agreements with parties other than Sponsor. If the Evaluator undertakes any research or evaluation that uses data from this Evaluation, Evaluator shall disclose such research or evaluation to PRE.

5. Publication and Disclosure

The Evaluator shall have the right to present at symposia and national or regional professional meetings, and to publish in scientific or other publications, the results of the Evaluations conducted under this MOU. Evaluator agrees to make such publication(s) conveniently available to PRE.

6. Confidentiality and Non-Disclosure

The Sponsor and Evaluator expressly acknowledge that Evaluator may need to provide to Sponsor information that Evaluator considers to be Proprietary Information.
Sponsor agrees to hold Proprietary Information in strict confidence during the term of this MOU and for a period of two years after the termination or expiration of this MOU except as required by law. Similarly, the Evaluator shall protect Confidential Information and prevent its disclosure in any manner, except as required by law. Not later than two years after the termination or expiration of this MOU, Evaluator shall destroy all Confidential Information or return it to Sponsor.

7. Ownership and Patents

The Evaluator shall have sole and exclusive ownership rights to any intellectual property, including but not limited to copyrights and/or inventions of a product, device, process, or method, whether patentable or unpatentable (an Invention), deriving from the Evaluator's efforts, exclusive of any data or information, arising out of the Evaluations. Data or information furnished to Evaluator by Sponsor shall remain the property of the Sponsor.

8. Termination

This MOU shall remain in effect for the MOU Period unless extended in accordance with the terms of this MOU, as set forth in Section 1.2. In the event that either Evaluator or Sponsor shall be in default of any of its obligations under this MOU and shall fail to remedy such default within thirty (30) days after receipt of written notice thereof, the party not in default shall have the option of canceling this MOU by giving thirty (30) days written notice of termination to the other party. Termination of this Agreement shall not affect the rights and obligations of the parties, which shall have accrued prior to termination. No termination of this MOU, however effectuated, shall release either party from its rights and obligations under Articles 3, 4, 5, 6, 7, 8, 9, 11 and 18 herein.

9.
Indemnification

Sponsor agrees to defend, indemnify and hold harmless the Evaluator and its officers and employees (all such parties are hereinafter referred to collectively as the Indemnified Parties) from and against any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including reasonable attorneys' fees and court costs) arising directly or indirectly out of the Evaluation or the design, manufacture, sale or use of any embodiment or manifestation of the Evaluation, regardless of whether any and all such liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs) arise in whole or in part from the negligence of any of the Indemnified Parties. Notwithstanding the foregoing, Sponsor will not be responsible for indemnification of Evaluator pursuant to this Article 9 for any liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs) which arise solely from: (a) the gross negligence or intentional misconduct of Evaluator or the Principal Investigator, (b) actions by Evaluator or the Principal Investigator in violation of applicable laws or regulations, or (c) violations of this MOU. The Sponsor agrees to provide a diligent defense against any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs), brought against the Indemnified Parties with respect to the subject of the indemnity contained in this Article 9, whether such claims or actions are rightfully or wrongfully brought or filed.
Evaluator shall be indemnified by Sponsor after Evaluator has completed the following: (a) within a reasonable time after receipt of notice of any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses, or after the commencement of any action, suit, or proceeding giving rise to the right of indemnification, notify Sponsor, in writing, of said liability, claims, lawsuits, losses, demands, damages, costs, and expenses and send to the Sponsor a copy of all papers served on the Indemnified Party and (b) allow Sponsor to retain control of any such liability, claims, lawsuits, losses, demands, damages, costs, and expenses, including the right to make any settlement.

10. Independent Contractors

Sponsor and Evaluator shall act as independent parties, and nothing contained in this MOU shall be construed or implied to create an agency or partnership. Neither Sponsor nor Evaluator shall have the authority to contract or incur expenses on behalf of the other except as may be expressly authorized by collateral written agreements. No member of the Evaluation Team shall be deemed to be an employee of Sponsor.

11. Use of Evaluator Name

The use by either Sponsor or Evaluator of the other's name or any other names, insignia, symbol(s), or logotypes associated with the other party or any variant or variants thereof in advertising, publicity, or other promotional activities is expressly prohibited, unless required by law or the other party provides written consent.

12. Severability

If any one or more of the provisions of this MOU shall be held to be invalid, illegal, or unenforceable, the validity, legality, or enforceability of the remaining provisions of this MOU shall not in any way be affected or impaired thereby.

13. Waiver

The failure of any party hereto to insist upon strict performance of any provision of this MOU or to exercise any right hereunder will not constitute a waiver of that provision or right.
This MOU shall not be effective until approved by Evaluator's President or his official designee. Whenever the consent or approval of the Evaluator is required or permitted hereunder, such consent or approval must be given by the Evaluator's President or his official designee.

14. Notices

Any notice or communication required or permitted to be given or made under this MOU by one of the parties hereto to the other shall be in writing and shall be deemed to have been sufficiently given or made for all purposes if mailed by certified mail, postage prepaid, return receipt requested, addressed to such other party at its respective address as follows:

If to Sponsor:
Karen DeJarnette, Ph.D.
Director, PRE
Little Rock School District
810 West Markham Street
Little Rock, AR 72201-1306
Phone (501) 447-3387 / Fax (501) 447-7609

If to Evaluator with respect to all non-technical matters:
Cindy Hurst
Education Innovations, LLC
3161 Campus Postal Station
Memphis, TN 38152-3830
Phone (901) 678-5063 / Fax (901) 678-4257

If to Evaluator with respect to technical questions:
Steven M. Ross, Ph.D.
Director, Education Innovations, LLC
3161 Campus Postal Station
Memphis, TN 38152-3830
Phone (901) 678-3413 / Fax (901) 678-4257

16. Assignment

Neither Sponsor nor Evaluator shall assign its rights or obligations under this MOU without the prior written consent of the other party.

17. Entirety

This MOU represents the entire agreement of Sponsor and Evaluator, and it expressly supersedes all previous written and oral communications between them. Neither Sponsor nor Evaluator was induced to enter into this Agreement by any statements or representations not contained in this MOU. This MOU may be modified only by written amendment executed by the Sponsor and the Evaluator.

18.
Headings

The headings of sections and subsections, if any, to the extent used herein are for convenience and reference only and in no way define, limit, or describe the scope or intent of any provision hereof, and therefore shall not be used in construing or interpreting the provisions hereof.

IN WITNESS WHEREOF, Sponsor and Evaluator have caused this MOU to be executed in duplicate counterpart original by their duly authorized representatives to be effective as of the Effective Date.

SPONSOR
By: [signature]
Superintendent
Date: [illegible]

EVALUATOR
By: [signature]
Samuel Hurst, Vice President, Business and Finance
Date: 4/8/05

Exhibit A
COMPLIANCE REMEDY

The Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division in Little Rock School District v. Pulaski County Special School District No. 1 et al., Mrs. Lorene Joshua et al. and Katherine Knight et al. Intervenors, is incorporated here by reference. Evaluator has a copy of this document but may request another copy from Sponsor.

Exhibit B
COMPREHENSIVE PROGRAM ASSESSMENT PROCESS

The Comprehensive Program Assessment Process of the Little Rock School District is incorporated here by reference. Evaluator has a copy of this document but may request another copy from Sponsor.

Exhibit C
SCOPE OF SERVICES

This states services and products by the Evaluator, who will perform Evaluations of three programs of the Sponsor, viz., Reading Recovery, Smart/Thrive, and Compass Learning, and produce reports of the Evaluations.

Evaluation questions

For each of the three Evaluations, the primary questions are:
I. Has the program as implemented in the Little Rock School District improved the academic achievement of students identified by the Sponsor as African-American (AA)?
II. Has the program decreased the differences between AA students and those identified by the Sponsor as white (W)?
III. To what extent does each program account for changes in student performance?

Secondary (step-two) questions are:
1. What competing events or programs (relative to each evaluated program) explain changes in student performance?
2. What traits of each group explain their performance and differences in performance between them?
3. What changes in the program do these results indicate to improve the effectiveness of the programs?
4. How will these recommendations improve AA student performance?

Evaluation design, data, and products

Prior to Evaluator's commencing the Evaluations, Sponsor will agree with Evaluator regarding
A. theoretical model(s),
B. Evaluation design(s), to conform with summative evaluations of the Comprehensive Program Assessment Process,
C. specific variables for the summative evaluations,
D. data adjustments and statistical methods,
E. format(s) of data for use in the Evaluations delivered by the Sponsor to the Evaluator,
F. content of deliverable products (written reports) and their formats, and
G. schedule of services and product delivery.

The following table is the schedule of services and product delivery.

Delivery (2004-05): Service and Products
October-February: Evaluator and Sponsor will negotiate MOU and agree on design of the Evaluations, their schedules, and instruments.
October 1: Evaluator will submit draft report of results to PRE.
October: Evaluator will discuss draft reports with PRE and alter report accordingly.
November 1: Evaluator will submit final report to PRE.

For the purpose of invoicing,
Evaluator will track its efforts in increments of days or some portion thereof.

Payments

Little Rock School District Desegregation Court Mandate Support
Project Period: October 2004 to November 2005
Costs:
Consulting @ $1,000 per day (180 days maximum): $180,000*
Travel at actual cost (est. $500)
Translation services at actual cost (est. $500)
Maximum Payment: $180,000*
* Plus actual cost of travel and translation services

Exhibit D
Reading Recovery

Proposal for the Evaluation of Reading Recovery in Little Rock School District: Outline Version

Evaluation Questions

Primary Evaluation Question:
1. Has the Reading Recovery (RR) program been effective in improving and remediating the academic achievement of African-American (AA) students?

Supplemental (Qualitative Level 2) Evaluation Questions:
1. What are the quality and level of implementation of RR at the 18 schools implementing it in 2004-05?
2. What is the level of participation in RR by African American students relative to other ethnic groups at the school?
3. What is the progress demonstrated by African American and other students participating in RR in improving achievement, as demonstrated on program-specific measures? What percentage of students are discontinued or not discontinued?
4. What are the perceptions of RR teachers regarding RR program implementation, impacts, strengths, and weaknesses?
5. What are the perceptions of K-3 classroom teachers in the school regarding RR program implementation, impacts, strengths, and weaknesses?
6. What are the perceptions of parents/guardians of first grade RR students regarding program impacts, strengths, and weaknesses?

Program Description

RR is one of the eight literacy programs, interventions, and/or models used by various LRSD schools. It is restricted to the first grade and involves providing systematically designed individual tutoring to students identified as having the highest need for supplemental support. LRSD funds support the RR Program.
Currently, RR is implemented by 18 elementary schools (whose AA student composition follows their names): Bale: 82%, Booker: 53%, Carver: 53%, Chicot: 73%, Dodd: 54%, Franklin: 96%, Geyer Springs: 88%, Gibbs: 53%, Meadowcliff: 78%, Mitchell: 96%, Otter Creek: 60%, Rightsell: 100%, Stephens: 95%, Terry: 53%, Wakefield: 78%, Watson: 96%, Williams: 52%, and Wilson: 89%.

Proposed Design

A mixed-methods design will address the research questions as follows:

Primary Evaluation Question:
1. Has the RR program been effective in improving and remediating the academic achievement of AA students?
A. Whole School: A treatment-control school, pretest-posttest design will be employed in Grades 1-3. The analysis will control for pretest, gender, ethnicity, and SES. It may be decided to examine (a) all 18 schools relative to the entire district elementary-school database or (b) a stratified random sample of RR schools relative to matched control schools. Pretests: DRA or DIBELS (whichever has the most usable database) administered in kindergarten. Posttests: 2004-05 Iowa Test of Basic Skills (ITBS) Reading and Math Subtests.
B. Reading Recovery Subsample: Within each of the RR schools, first- to third-grade students who participated in RR as first graders will be identified and their achievement gains compared to predicted scores based on school status (RR vs. Control), and student pretest, gender, ethnicity, and SES.

Supplemental (Qualitative Step 2) Evaluation Questions:
1. What is the quality and level of implementation of RR at the 18 schools implementing it in 2004-05? RR teachers will be interviewed by phone. First grade teachers and other grade-level teachers will be surveyed. Observations of RR tutoring sessions will be made at a sample of schools. A minimum of 12 tutoring classroom observations will be conducted. RR Teachers in-training will not be observed.
2.
What is the level of participation in RR by AA students relative to other ethnic groups at the school? Student records/archival data for 2003-04 and 2004-05 will be analyzed.
3. What is the progress demonstrated by AA and other student participants in RR in improving achievement, as demonstrated on program-specific measures? What percentage of students are discontinued or not discontinued? RR Teachers will provide Recommendations for Discontinuing and Statement of Progress for Non-Discontinued Student data information on each 2004-05 RR student.
4. What are the perceptions of RR teachers regarding RR program implementation, impacts, strengths, and weaknesses? The RR teacher survey will directly address this question. RR teachers in-training will be interviewed by phone.
5. What are the perceptions of K-3 classroom teachers in the school regarding RR program implementation, impacts, strengths, and weaknesses? The K-3 classroom teacher survey will address this question via closed-ended and open-ended items. Only experienced RR schools will be surveyed. Respondents will identify their status by grade and role.
6. What are the perceptions of parents/guardians of first grade RR students regarding program impacts, strengths, and weaknesses? An RR Parent Survey will be conducted to address this question via a questionnaire including closed- and open-ended items in experienced RR schools.

Summary of Data Sources and Participants by Evaluation Question: Reading Recovery

Primary Question:
1. What are the effects of participation in RR on AA student achievement?
   Participants: All grades 1-3 students at 18 RR schools and other elementary schools; RR student participants within above samples
   Data Sources: DRA or DIBELS (pretest in K); 2004-05 ITBS Reading and Math subtests (posttest in grades 1-3)

Supplemental (Step 2) Questions:
1. What is the quality and level of implementation of RR at the 18 schools implementing it in 2004-05?
2.
What is the level of participation in RR by African American students relative to other ethnic groups at the school?
3. What is the progress demonstrated by RR students in improving achievement, as demonstrated on program-specific measures? What percentage of students are discontinued or not discontinued?
4. What are the perceptions of RR teachers regarding RR program implementation, impacts, strengths, and weaknesses?
5. What are the perceptions of K-3 classroom teachers regarding RR program implementation, impacts, strengths, and weaknesses?

Participants and data sources for the supplemental questions:
Question 1. Participants: All RR teachers; classroom teachers at experienced RR schools; principals at RR schools. Data Sources: Random sample of principals and teachers-in-training interviews; K-3 classroom Teacher Survey (faculty meeting); RR Tutoring Session Observation (minimum of 12 observations); RR student data.
Question 2. Participants: All RR schools. Data Sources: School records/archival data.
Question 3. Participants: All RR teachers will provide program data for first grade students. Data Sources: RR student program data.
Question 4. Participants: All RR teachers. Data Sources: RR teacher survey; RR teacher in-training interview.
Question 5. Participants: All K-3 classroom teachers in experienced RR schools. Data Sources: K-3 classroom teacher survey (disaggregated by first grade vs. other grades).
6.
What are the perceptions of parents/guardians of first grade RR students regarding program impacts, strengths, and weaknesses' Timelines Februan,' 2005 March: MarchApril: May-June: JulySeptember: October 1 November 1: Parents of RR students in experienced RR schools RR Parent Survee Planning, refinement, and consultation with PRE and RR experts Instrument Development Begin observations, RR Teacher In-Training Phone Interviews Complete observations, RR Teacher and K-3 Classroom Teacher Survey, RR Principal Phone Interviews RR Student Data, Records/Archival data analyses Achievement data analyses/complete survey and interview analyses Submit draft report of finding to PRE Receive feedback from PRE Finalize and submit final repon to PRE 14 4/8/05Exhibit E Compass Learning Proposal for the Evaluation of Compass Learning in Little Rock School District: Outline Version Evaluation Questions Primary Evahialion Question. 1 What are the effects of participation in CL on the achievement of African-American (AA) and other students? Supplemental (Qualitative Step 2) Evaluation Questions: 1. What are the quality, nature, and level of implementation of CL at the 20 elementary schools identified as implementing the program in 2004-05? What is the level of participation in CL by AA students relative to other ethnic groups at the implementing schools J. What are the perceptions of teachers, lab attendants, and Technology Specialists regarding CL program implementation, impacts, strengths, and weaknesses? 4. What are the perceptions of parents/guardians of CL students regarding program impacts, strengths, and weaknesses 5. What are the perceptions of school principals, whose schools no longer use CL, with regard to past use of the CL program and possible adoption of a different program? Program Description Compass Learning is a computer-based program designed to develop students skills in reading, writing, and spelling. 
Additional purposes are to support teacher management of student performance, personalize instruction, and connect communities of learners. The theme-based lessons and activities provided by CL take a cross-curricular approach and offer a real-world context for learning. The Compass Management system assessment is either automatic or customizable. A Technology Specialist assists classroom teachers with any technology question or need. In the 2004-05 school year, 20 LRSD elementary schools utilized CL programs, while 2 middle schools and 1 high school used the program in previous years. The AA student composition follows the individual school names:

Elementary Schools: Bale: 82%, Booker: 53%, Brady: 78%, Carver: 52%, Chicot: 73%, Fair Park: 75%, Forrest Park: 20%, Franklin: 96%, Fulbright: 26%, Geyer Springs: 88%, Gibbs: 53%, Mabelvale: 80%, McDermott: 62%, Mitchell: 96%, Otter Creek: 60%, Rightsell: 100%, Rockefeller: 67%, Stephens: 95%, Wakefield: 78%, Williams: 52%
Middle Schools: Cloverdale: 82%, and Henderson: 82%
High School: Accelerated Learning Center (ALC): 92%

Proposed Design
A mixed-methods design will be employed to address the research questions as follows.

Primary Evaluation Question:
1. What are the effects of participation in CL on the achievement of African American and other students?
A. Quasi-experimental design: Due to the insufficient sample size and unique nature of the high school (n = 1), the quasi-experimental analysis will be conducted with the elementary (n = 20 schools) and middle (n = 2) school samples only. A descriptive examination (see below) of test scores for the high school will also be conducted to determine trends and patterns at that site.
Specifically, the quasi-experimental design will compare CL elementary and middle schools to other schools in the district, most likely using multiple-regression-type analyses in which the dependent variable is posttest (2004-05) scores (Arkansas Benchmarks in grades 3-8, and Iowa Test of Basic Skills in grades K-8) and the covariates are pretest (pre-program) test scores, gender, ethnicity, and SES.

Pretests: Iowa Test of Basic Skills (ITBS) (for grades K-8), Arkansas Benchmarks (for grades 4-8)
Posttests: 2004-05 ITBS Reading and Math subtests (for grades 1-8), Arkansas Benchmarks (for grades 3-8)

Supplemental (Qualitative Step 2) Evaluation Questions:
1. What are the quality, nature, and level of implementation of CL at the 20 elementary schools identified as implementing the program in 2004-05?
Phone interviews will be conducted with (a) the LRSD CL Coordinator, (b) all school Technology Specialists, and (c) the lab attendants at the 7 elementary schools randomly selected for observations. All teachers at the 20 elementary schools will be surveyed so that site-specific data regarding implementation will be available. Observations of CL laboratory sessions will be conducted at a random sample of 7 elementary schools. At four of the observed schools, a brief (20-min.) student focus group (n = 5 to 7 students per school) will be conducted to ascertain students' perspectives on their experiences in using CL (nature of activities, usefulness, enjoyment, etc.).
2. What is the level of participation in CL by AA students relative to other ethnic groups at the schools involved?
Student records/archival data for 2003-04 and 2004-05 and CL observation data will be analyzed. Recent information indicates that the middle and high schools are no longer using CL. The principals at these schools will be interviewed regarding CL usage. If the schools are not using CL, CREP will work with PRE to modify the achievement analysis accordingly.
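The covariate-adjusted comparison just described can be sketched in a few lines. The example below fits an ordinary least squares model on synthetic data, so every variable name and value is hypothetical and merely stands in for the district's actual student records:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400

# Hypothetical student-level data standing in for district records:
# a pretest score, a CL-school indicator, and demographic covariates.
pretest = rng.normal(200, 25, n)
cl_school = rng.integers(0, 2, n)   # 1 = enrolled at a CL school
female = rng.integers(0, 2, n)
low_ses = rng.integers(0, 2, n)     # e.g., a free/reduced-lunch flag
posttest = 50 + 0.8 * pretest + 5 * cl_school + rng.normal(0, 10, n)

# Multiple regression: posttest on an intercept, the pretest,
# the treatment indicator, and the demographic covariates.
X = np.column_stack([np.ones(n), pretest, cl_school, female, low_ses])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)

# beta[2] is the estimated CL effect on posttest scores after
# adjusting for pretest, gender, and SES.
print(f"Estimated CL effect: {beta[2]:.2f} scale-score points")
```

Because schools rather than individual students receive CL, a production analysis would also need to account for school-level clustering; the sketch ignores that for brevity.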
3. What are the perceptions of teachers, lab attendants, and Technology Specialists regarding CL program implementation, impacts, strengths, and weaknesses?
This question will be addressed via the teacher survey and Technology Specialist interviews in schools identified as implementing CL and interviews with lab attendants in the 7 schools randomly selected for CL observations.
4. What are the perceptions of parents/guardians of CL students regarding program impacts, strengths, and weaknesses?
A CL parent survey consisting of closed- and open-ended items will be administered to parents at 5 randomly selected schools.
5. What are the perceptions of school principals, whose schools no longer use CL, with regard to past use of the CL program and possible adoption of a different program?
This question will be addressed via the Principal Interview.

Summary of Instruments and Participants by Evaluation Question: Compass Learning

Evaluation Question | Participants | Data Sources
Primary Question: 1. What are the effects of participation in Compass Learning (CL) on the achievement of African American and other students? | Students at 20 CL elementary and middle schools and comparison schools; whole grade-level means at the Compass Learning high school. (The middle schools and high school may not be included in the achievement analysis depending on CL usage) | ITBS as pretest for grades K-9; Arkansas Benchmarks as posttest for grades 3-8; 2004-05 ITBS Reading and Math subtests (as posttest in grades 1-9); 2004-05 Grade 11 Literacy Exam (as posttest); 2004-05 Algebra I and Geometry End-of-Course Exams (as posttest)
Supplemental Questions: 1. What are the quality, nature, and level of implementation of CL at the 20 elementary schools identified as implementing the program in 2004-05? | All CL school teachers; all Technology Specialists at schools implementing CL; CL Lab Attendants at the 7 schools randomly selected for observations; District CL Program Coordinator; student focus groups at 4 randomly selected elementary schools | Teacher Survey (faculty meeting); Technology Specialist Phone Interview; District CL Program Coordinator Phone Interview; Lab Attendant Phone Interview; two-hour CL Laboratory Observations (7 randomly selected elementary schools); 20-min. Student Focus Groups (n = 5-7 students), one each at 4 schools randomly selected from the 7 observation schools
2. What is the level of participation in CL by African American students relative to other ethnic groups at the schools concerned? | All Compass schools | School records/archival data; two-hour CL Laboratory Observations (7 randomly selected elementary schools)
3. What are the perceptions of teachers, lab attendants, and Technology Specialists regarding CL program implementation, impacts, strengths, and weaknesses? | All CL school teachers; all Technology Specialists at schools implementing CL; CL Lab Attendants at the 7 schools randomly selected for observations | Teacher Survey (faculty meeting); Technology Specialist Phone Interview; Lab Attendant Phone Interview
4. What are the perceptions of parents/guardians of CL students regarding program impacts, strengths, and weaknesses? | Parents of CL students | CL Parent Survey distributed to one class at each grade level at 5 elementary schools
5. What are the perceptions of school principals, whose schools no longer use CL, with regard to past use of the CL program and possible adoption of a different program? | Principals at two middle schools, one high school, and possibly one elementary school | Principal Phone Interview

2005 Timeline
January: Planning/refinement, consultation with PRE and CL experts, and instrument development
February: Complete instrument development and begin observations
March-April: CL Teacher Survey (at faculty meetings); conduct phone interviews with district CL program coordinator, technology specialists, lab attendants, and principals in schools no longer implementing CL
May-June: Complete observations, conduct student focus groups, records/archival data analyses
July-September: Achievement data analyses/complete survey and interview analyses
October 1: Submit draft report of findings to PRE; receive feedback from PRE
November 1: Finalize and submit report to PRE

Exhibit E
Smart/Thrive
Proposal for the Evaluation of the Smart/Thrive (S/T) Programs in the Little Rock School District: Outline Version

Evaluation Questions

Primary Evaluation Question:
1. Have the Smart/Thrive programs been effective in improving and remediating the academic achievement of African-American students?

Supplemental (Qualitative Step 2) Evaluation Questions:
1. What is the level of participation in Smart and Thrive by African American students?
2. What instructional strategies are used during the tutoring sessions?
3. What are the perceptions of Smart/Thrive Tutors regarding program impacts, strengths, and weaknesses?
4.
What are the perceptions of Algebra I teachers regarding program impacts, strengths, and weaknesses?
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses?
6. What are the perceptions of parents/guardians of Smart/Thrive students regarding program impacts, strengths, and weaknesses?

Program Description
The S/T program was designed as an intervention for 8th- and 9th-grade African-American students who lack the knowledge, skills, and/or confidence required for success in Algebra I. This program currently (2004-2005) engages approximately 10 percent of the total African-American student population enrolled in Algebra I classes. During the 2003-2004 academic year, the program served 264 students. Participants were offered pre-algebra instruction for two weeks during the summer (Smart Program) and 10 Saturdays across the school year (Thrive Program). Various local grants have funded this program since 1999. In the current school year, S/T serves students from all eight LRSD middle schools (whose AA student composition follows their names): Cloverdale: 82%, Dunbar: 61%, Forest Heights: 77%, Henderson: 82%, Mabelvale: 81%, Mann: 52%, Pulaski Heights: 57%, and Southwest: 94%.

Proposed Design
A mixed-methods design will be employed to address the research questions as follows.

Primary Evaluation Question:
1. Have the S/T programs been effective in improving and remediating the academic achievement of African-American students?
A treatment (2 levels)-control, pretest-posttest design will be employed. The analysis will control for pretest, gender, ethnicity, and SES. Three types of Algebra I students will be compared depending on their program enrollment:
i. No program
ii. Smart program only
iii. Both Smart and Thrive programs
Pretests: 2003-2004 Math Benchmark Test
Posttests: 2004-05 ITBS Math subtests, Algebra I EOC, Math Benchmark Test
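The three-group comparison described above amounts to a regression with two treatment dummies, one for Smart-only and one for Smart-plus-Thrive, each contrasted against the no-program group while controlling for the pretest and demographic covariates. The sketch below uses synthetic data; every variable name and effect size is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300

# Hypothetical enrollment codes: 0 = no program, 1 = Smart only,
# 2 = both Smart and Thrive.
group = rng.integers(0, 3, n)
pretest = rng.normal(180, 20, n)    # stands in for the 2003-04 Math Benchmark
female = rng.integers(0, 2, n)
low_ses = rng.integers(0, 2, n)
posttest = (40 + 0.9 * pretest
            + 4.0 * (group == 1) + 7.0 * (group == 2)
            + rng.normal(0, 12, n))

# Dummy-code the two treatment levels against the no-program control,
# then regress the posttest on pretest, treatment, and covariates.
smart_only = (group == 1).astype(float)
smart_thrive = (group == 2).astype(float)
X = np.column_stack([np.ones(n), pretest, smart_only, smart_thrive,
                     female, low_ses])
beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)

# beta[2]: Smart-only vs. no program; beta[3]: Smart+Thrive vs. no
# program, each adjusted for pretest, gender, and SES.
print(f"Smart only: {beta[2]:+.2f}  Smart+Thrive: {beta[3]:+.2f}")
```

Coding the two program levels separately, rather than as a single participation flag, lets the analysis ask whether adding the Saturday Thrive sessions yields gains beyond the summer Smart component alone.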
Supplemental (Qualitative/Step 2) Evaluation Questions:
1. What is the level of participation in Smart and Thrive by African American students?
Student records/archival data for 2003-04 and 2004-05 will be analyzed. In addition to descriptive information, the levels of participation will be gathered as a potential variable for the student achievement analyses.
2. What instructional strategies are used during the tutoring sessions?
Random observation visits will be conducted during the Saturday Thrive Program sessions. Approximately five visits will be made. Observations of the Summer Smart Program can be conducted in 2005.
3. What are the perceptions of S/T Tutors regarding program impacts, strengths, and weaknesses?
A questionnaire will be administered to S/T Tutors.
4. What are the perceptions of Algebra I teachers regarding program impacts, strengths, and weaknesses?
A questionnaire will be administered to Algebra I teachers.
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses?
A questionnaire will be administered to program participants. A sample of program participants will also be selected to participate in student focus groups. Approximately 3-5 focus groups will be conducted, with each comprised of approximately 5 students.
6. What are the perceptions of parents/guardians of S/T students regarding program impacts, strengths, and weaknesses?
An S/T Parent Survey will be conducted to address this question via a questionnaire including closed- and open-ended items.

Summary of Data Sources and Participants by Evaluation Question: Smart/Thrive

Evaluation Question | Participants | Data Sources
Primary Question: 1. What are the effects of participation in the Smart and/or Thrive Programs on student achievement? | All 8th and 9th grade Algebra I students | 2003-2004 Math Benchmark; 2004-05 ITBS Math subtests; Math Benchmark; Algebra I EOC
Supplemental Questions: 1. What is the level of participation in Smart and Thrive by African American students? | All program participants | School records/archival data
2. What instructional strategies are used during the tutoring sessions? | S/T teachers and students | Observations of tutoring sessions
3. What are the perceptions of S/T Tutors regarding program impacts, strengths, and weaknesses? | All S/T Tutors | S/T Tutor Questionnaire
4. What are the perceptions of Algebra I teachers regarding program impacts, strengths, and weaknesses? | All Algebra I teachers | Algebra I Teacher Questionnaire
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses? | Program participants | S/T Student Questionnaire; Focus Groups
6. What are the perceptions of parents/guardians of S/T students regarding program impacts, strengths, and weaknesses? | Parents of S/T students | S/T Parent Questionnaire

Timelines
January 2005: Planning, refinement, and consultation with PRE
February: Instrument development
March-April: Begin observations of Thrive sessions
May-June: Administer Teacher, Tutor, and Student Questionnaires and begin focus groups
July-September: Complete focus groups and observations, analyze records and archival data, analyze achievement data, and complete survey and interview analyses
October 1: Submit draft report to PRE
October: Discuss draft report of findings with PRE and write final report
November 1: Submit final report to PRE

RECEIVED OCT 25 2005
OFFICE OF DESEGREGATION MONITORING

Memorandum of Understanding
This Memorandum of Understanding (hereinafter MOU), effective the first day of February 2005 (hereinafter the Effective Date), is entered into by and between James S.
Catterall (hereinafter Evaluator), Graduate School of Education & Information Studies, University of California, Los Angeles, CA 90024, and Little Rock School District (hereinafter Sponsor), whose offices are located at 810 West Markham Street, Little Rock, AR 72201.

WITNESSETH

WHEREAS, Sponsor, to comply with the June 30, 2004 Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division, and Program Evaluation Standards, will hire outside consultants to prepare formal, step-two evaluations; and

WHEREAS, Evaluator possesses unique knowledge and experience relating to such formal step-two evaluations and Program Evaluation Standards;

NOW, THEREFORE, in consideration of the premises and the mutual covenants and conditions hereinafter recited, the Sponsor and Evaluator do hereby agree as follows:

1. Definitions
For purposes of this MOU, the following definitions apply:
1.1 Compliance Remedy shall mean the entire June 30, 2004 Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division in Little Rock School District v. Pulaski County Special School District No. 1 et al., Mrs. Lorene Joshua et al. and Katherine Knight et al. Intervenors (Exhibit A).
1.2 MOU Period shall mean the period commencing on the Effective Date of this MOU and terminating on November 1, 2005. The term of this MOU may be extended by the mutual written consent of the duly authorized representatives of Evaluator and Sponsor.
1.3 Formal step-two evaluation (hereinafter Evaluation) shall mean a summative evaluation of Sponsor's Year-Round Education (hereinafter YRE) program conducted by the Evaluator according to the Sponsor's Comprehensive Program Assessment Process and described more fully in Exhibit B, which is incorporated herein by reference. Evaluation ascertains differences among schools as well as for the LRSD.
1.4 Comprehensive Program Assessment Process (Exhibit B) shall mean the process required by the Compliance Remedy, adopted by Sponsor's Board of Directors on December 16, 2004, and incorporated as Appendix B in the first quarterly written update by the Sponsor to the Office of Desegregation Monitoring and Joshua, December 1, 2004.
1.5 Evaluation Funds shall mean those funds paid by the Sponsor to the Evaluator in accordance with this MOU.
1.6 Evaluation Team shall mean the Evaluator and any personnel under the Evaluator's direction and control who are supported in whole or in part by the Evaluation Funds.
1.7 Planning, Research, and Evaluation (hereinafter PRE) shall mean Sponsor's department that shall represent the Sponsor and oversee the Evaluation.
1.8 Proprietary Information shall mean any data, information, concepts, routines, artwork, design work, advertising copy, specifications, or improvement that is commercially valuable, not generally available to or known in the industry, and belonging to Evaluator. Proprietary Information shall not include information which: (a) is or becomes a part of the public domain through no act or omission of the receiving party; (b) was in the receiving party's lawful possession prior to the disclosure and had not been obtained by the receiving party either directly or indirectly from the disclosing party; (c) is lawfully disclosed to the receiving party by a third party without restriction on disclosure; (d) is independently developed by the receiving party; or (e) is disclosed by operation of law.
2.0 Confidential Information shall mean data or information related to the identities of individuals such as Sponsor's students, teachers, administrators including PRE, or Board of Directors; guardians or relatives of such students; community members; or any other individuals related to the Evaluation.

2.
Evaluation
2.1 During the MOU Period, the Evaluator shall conduct an Evaluation of Sponsor's YRE on behalf of Sponsor in accordance with the Compliance Remedy (Exhibit A), within the mutually agreed schedule (Exhibit C), and substantially in accordance with the terms and conditions of this MOU.
2.2 The Evaluation's name is YRE.

3. Payments
3.1 Sponsor shall pay Evaluator the Evaluation Funds in the following manner:
3.1.1 Amount: Not to exceed Forty-five Thousand dollars (US $45,000.00).
3.1.2 Rate: $1,500 per day for effort, plus travel expenses.
3.1.3 Travel: Travel expenses for travel between Los Angeles, CA and Little Rock, AR, including Little Rock accommodations and meals, not to exceed $6,000.00 (economy class airfare only).
3.1.4 To Be Paid: Upon invoice for effort (days) expended, stated in invoice.
3.1.5 Invoices: Shall state days of effort.
3.2 Payments under the terms of this MOU shall be made by check payable to:
Payee: James S. Catterall
Taxpayer ID: 141-38-3478
Address: 120 N. Topanga Canyon Blvd., Suite 203, Topanga, CA 90290
3.3 Anything herein to the contrary notwithstanding, should this MOU terminate early pursuant to Article 8 herein, Evaluator and Sponsor shall agree upon the estimate of the percentage of completeness of the Evaluator's services rendered hereunder as of the date such notice is given. The Sponsor shall pay the Evaluator a pro rata fee based upon the agreed estimated percentage of completion such that payment will at least include all project costs incurred by Evaluator prior to the date of early termination.

2 3/22/05

4. Non-Exclusivity and Disclosure
Nothing in this MOU shall be construed to limit the freedom of the Evaluator to engage in similar research performed independently under other grants, contracts, or agreements with parties other than Sponsor. If the Evaluator undertakes any research or evaluation that uses data from this Evaluation, Evaluator shall disclose such research or evaluation to PRE.

5.
Publication and Disclosure
The Evaluator shall have the right to present at symposia and national or regional professional meetings, and to publish in scientific or other publications, the results of the Evaluations conducted under this MOU. Evaluator agrees to make such publication(s) conveniently available to PRE.

6. Confidentiality and Non-Disclosure
The Sponsor and Evaluator expressly acknowledge that Evaluator may need to provide to Sponsor information that Evaluator considers to be Proprietary Information. Sponsor agrees to hold Proprietary Information in strict confidence during the term of this MOU and for a period of two years after the termination or expiration of this MOU, except as required by law. Similarly, the Evaluator shall protect Confidential Information and prevent its disclosure in any manner, except as required by law. Not later than two years after the termination or expiration of this MOU, Evaluator shall destroy all Confidential Information or return it to Sponsor.

7. Ownership and Patents
The Evaluator shall have sole and exclusive ownership rights to any intellectual property, including but not limited to copyrights and/or inventions of a product, device, process, or method, whether patentable or unpatentable (an Invention), deriving from the Evaluator's efforts, exclusive of any data or information, arising out of the Evaluation. Data or information furnished to Evaluator by Sponsor shall remain the property of the Sponsor.

8. Termination
This MOU shall remain in effect for the MOU Period unless extended in accordance with the terms of this MOU, as set forth in Section 1.2. In the event that either Evaluator or Sponsor shall be in default of any of its obligations under this MOU and shall fail to remedy such default within thirty (30) days after receipt of written notice thereof, the party not in default shall have the option of canceling this MOU by giving thirty (30) days written notice of termination to the other party.
Termination of this Agreement shall not affect the rights and obligations of the parties, which shall have accrued prior to termination. No termination of this MOU, however effectuated, shall release either party from its rights and obligations under Articles 3 through 17 herein.

9. Indemnification
Sponsor agrees to defend, indemnify, and hold harmless the Evaluator and its officers and employees (all such parties are hereinafter referred to collectively as the Indemnified Parties) from and against any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including reasonable attorneys' fees and court costs) arising directly or indirectly out of the Evaluation or the design, manufacture, sale, or use of any embodiment or manifestation of the Evaluation, regardless of whether any and all such liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs) arise in whole or in part from the negligence of any of the Indemnified Parties. Notwithstanding the foregoing, Sponsor will not be responsible for indemnification of Evaluator pursuant to this Article 9 for any liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs) which arise solely from: (a) the gross negligence or intentional misconduct of Evaluator; (b) actions by Evaluator in violation of applicable laws or regulations; or (c) violations of this MOU. The Sponsor agrees to provide a diligent defense against any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses (including attorneys' fees and court costs) brought against the Indemnified Parties with respect to the subject of the indemnity contained in this Article 9, whether such claims or actions are rightfully or wrongfully brought or filed.
Evaluator shall be indemnified by Sponsor after Evaluator has completed the following: (a) within a reasonable time after receipt of notice of any and all liability, claims, lawsuits, losses, demands, damages, costs, and expenses, or after the commencement of any action, suit, or proceeding giving rise to the right of indemnification, notify Sponsor, in writing, of said liability, claims, lawsuits, losses, demands, damages, costs, and expenses and send to the Sponsor a copy of all papers served on the Indemnified Party; and (b) allow Sponsor to retain control of any such liability, claims, lawsuits, losses, demands, damages, costs, and expenses, including the right to make any settlement.

10. Independent Contractors
Sponsor and Evaluator shall act as independent parties, and nothing contained in this MOU shall be construed or implied to create an agency or partnership. Neither Sponsor nor Evaluator shall have the authority to contract or incur expenses on behalf of the other except as may be expressly authorized by collateral written agreements. No member of the Evaluation Team shall be deemed to be an employee of Sponsor.

11. Use of Evaluator Name
The use by either Sponsor or Evaluator of the other's name or any other names, insignia, symbol(s), or logotypes associated with the other party or any variant or variants thereof in advertising, publicity, or other promotional activities is expressly prohibited, unless required by law or the other party provides written consent.

12. Severability
If any one or more of the provisions of this MOU shall be held to be invalid, illegal, or unenforceable, the validity, legality, or enforceability of the remaining provisions of this MOU shall not in any way be affected or impaired thereby.

13. Waiver
The failure of any party hereto to insist upon strict performance of any provision of this MOU or to exercise any right hereunder will not constitute a waiver of that provision or right.
This MOU shall not be effective until approved by Evaluator's President or his official designee. Whenever the consent or approval of the Evaluator is required or permitted hereunder, such consent or approval must be given by the Evaluator's President or his official designee.

14. Notices
Any notice or communication required or permitted to be given or made under this MOU by one of the parties hereto to the other shall be in writing and shall be deemed to have been sufficiently given or made for all purposes if mailed by certified mail, postage prepaid, return receipt requested, addressed to such other party at its respective address as follows:

If to Sponsor:
Karen DeJarnette, Ph.D.
Director, PRE Department
Little Rock School District
810 West Markham Street
Little Rock, AR 72201-1306
Phone (501) 447-3387, Fax (501) 447-7609

If to Evaluator:
James S. Catterall, Ph.D.
Research and Evaluation Office
120 N. Topanga Canyon Blvd., Suite 203
Topanga, CA 90290

15. Assignment
Neither Sponsor nor Evaluator shall assign its rights or obligations under this MOU without the prior written consent of the other party.

16. Entirety
This MOU represents the entire agreement of Sponsor and Evaluator, and it expressly supersedes all previous written and oral communications between them. Neither Sponsor nor Evaluator was induced to enter into this Agreement by any statements or representations not contained in this MOU. This MOU may be modified only by written amendment executed by the Sponsor and the Evaluator.

17. Headings
The headings of sections and subsections, if any, to the extent used herein are for convenience and reference only and in no way define, limit, or describe the scope or intent of any provision hereof, and therefore shall not be used in construing or interpreting the provisions hereof.
IN WITNESS WHEREOF, Sponsor and Evaluator have caused this MOU to be executed in duplicate counterpart original by their duly authorized representatives to be effective as of the Effective Date.

SPONSOR
By: (Signature)
Darral Paradis, Director
Procurement Department
Little Rock School District
Date:

EVALUATOR
By: (Signature)
James S. Catterall, Ph.D.
Date:

Exhibit A
COMPLIANCE REMEDY
The Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division in Little Rock School District v. Pulaski County Special School District No. 1 et al., Mrs. Lorene Joshua et al. and Katherine Knight et al. Intervenors, is incorporated here by reference. Evaluator has a copy.

Exhibit B
COMPREHENSIVE PROGRAM ASSESSMENT PROCESS
Little Rock School District

LITTLE ROCK SCHOOL DISTRICT
NEPN CODE: IL-R
COMPREHENSIVE PROGRAM ASSESSMENT PROCESS

Comprehensive Program Assessment Process

Purpose
The purpose of these regulations is to provide guidance in the appraisal of programs and to comply with requirements of the US District Court for the Eastern District. They do not necessarily apply to grant-funded programs if the funding source requires other procedures and provides resources for a required evaluation.

Criteria for Program Evaluations
Policy IL specifies that the evaluations of programs approved in its Board-approved Program Evaluation Agenda will be conducted according to the standards developed by the Joint Committee on Standards for Educational Evaluation. (See Joint Committee on Standards for Educational Evaluation, James R. Sanders, Chair (1994). The Program Evaluation Standards, 2nd Edition: How to Assess Evaluations of Educational Programs. Thousand Oaks, CA: Sage Publications.)
There are four attributes of an evaluation:
Utility (U): evaluations are informative, timely, and influential
Feasibility (F): evaluations must be operable in the natural setting and must not consume more resources than necessary
Propriety (P): rights of individuals must be protected
Accuracy (A): evaluations should produce sound information

Prospective, controlled, summative evaluations are at one end of a spectrum of activities that review District operations. Other activities in this continuum include formative and less formal and rigorous evaluations, regular and occasional assessments, and fast or brief snapshots. As rigor and formality diminish along the range of reviews, fewer standards apply. Examples of how the standards apply are found in the following table, adapted from The Program Evaluation Standards, pages 18 and 19:

Checklist for Applying the Standards
The reader should interpret the information provided in this table with reference both to the Standards (cited above) and the peculiar circumstances of given program reviews. Double plus signs (++) indicate that standards are fully addressed. Single pluses (+) mean that the standard is a concern but not necessarily fully addressed, and zeros (0) point to standards not usually applicable. Not all summative evaluations will fully satisfy every standard, and other examples may observe more standards than indicated here. Note, however, that all reviews fully observe human rights and impartial reports.
Checklist of Evaluation Standards for Examples of Program Reviews

Utility: U1 Stakeholder Identification; U2 Evaluator Credibility; U3 Information Scope & Selection; U4 Values Identification; U5 Report Clarity; U6 Report Timeliness & Dissemination; U7 Evaluation Impact
Feasibility: F1 Practical Procedures; F2 Political Viability; F3 Cost Effectiveness
Propriety: P1 Service Orientation; P2 Formal Agreements; P3 Rights of Human Subjects; P4 Human Interaction; P5 Complete & Fair Assessment; P6 Disclosure of Findings; P7 Conflict of Interest; P8 Fiscal Responsibility
Accuracy: A1 Program Documentation; A2 Context Analysis; A3 Described Purposes and Procedures; A4 Defensible Information Sources; A5 Valid Information; A6 Reliable Information; A7 Systematic Information; A8 Analysis of Quantitative Data; A9 Analysis of Qualitative Data; A10 Justified Conclusions; A11 Impartial Reporting; A12 Meta-evaluation

Each standard is rated ++, +, or 0 for four review types: summative evaluations, formative assessments (school portfolios), informal assessments, and snapshots.

Program Evaluation Procedures
The following procedures are established for the evaluation of programs approved by the Board of Education in its annual Program Evaluation Agenda:
1. The Planning, Research, and Evaluation (PRE) Department will recommend to the Superintendent annually, before the budget for the coming year is proposed, the curriculum/instruction programs for comprehensive program evaluation. The recommendation will include a proposed budget, a description of other required resources, and an action plan for the completion of the reports. Criteria for the proposed agenda are as follows:
A. Will the results of the evaluation influence decisions about the program?
B.
Will the evaluation be done in time to be useful?
C. Will the program be significant enough to merit evaluation?
(See Joseph S. Wholey, Harry P. Hatry, and Kathryn Newcomer (1994). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass Publishers. 5-7.)
2. The Superintendent will recommend to the Board of Education for approval the proposed Program Evaluation Agenda, with anticipated costs and an action plan for completion.
3. For each curriculum/instruction program to be evaluated as per the Program Evaluation Agenda, the Director of PRE will establish a staff team with a designated leader to assume responsibility for the production of the report according to the timelines established in the action plan approved by the Board of Education.
4. Each team will include, at a minimum, one or more specialists in the curriculum/instruction program to be evaluated, a statistician, a programmer to assist in data retrieval and disaggregation, and a technical writer. If additional expertise is required, then other staff may be added as necessary.
5. An external consultant with expertise in program evaluation, the program area being evaluated, statistical analysis, and/or technical writing will be retained as a member of the team. The role of the external consultant may vary, depending upon the expertise required for the production of the program evaluation.
6. The team leader will establish a calendar of regularly scheduled meetings for the production of the program evaluation. The first meetings will be devoted to the following tasks:
A. Provide any necessary training on program evaluation that may be required for novice members of the team, including a review of the Board's policy IL and all of the required criteria and procedures in these regulations, IL-R.
B. Assess the expertise of each team member and make recommendations to the Director of PRE related to any additional assistance that may be required.
C.
Write a clear description of the curriculum/instruction program that is to be evaluated, with information about the schedule of its implementation.
D. Agree on any necessary research questions that need to be established in addition to the question, "Has this curriculum/instruction program been effective in improving and remediating the academic achievement of African-American students?"
E. Generate a list of the data required to answer each research question, and assign responsibility for its collection and production. All available and relevant student performance data should be included. (See Judge Wilson's Compliance Remedy.)
F. Decide who will be the chief writer of the program evaluation.
G. Plan ways to provide regular progress reports (e.g., dissemination of meeting minutes, written progress reports, oral reports to the Superintendent's Cabinet) to stakeholders. (See Joellen Killion (2002). Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council (NSDC); Robby Champion (Fall 2002). Map Out Evaluation Goals. Journal of Staff Development. 78-79; Thomas R. Guskey (2000). Evaluating Professional Development. Thousand Oaks, CA: Corwin Press; Blaine R. Worthen, James R. Sanders, and Jody L. Fitzpatrick (1997). Participant-Oriented Evaluation Approaches. Program Evaluation: Alternative Approaches and Practical Guidelines. 153-169; Beverly A. Parsons (2002). Evaluative Inquiry: Using Evaluation to Promote Student Success. Thousand Oaks, CA: Corwin Press; and Joseph S. Wholey, Harry P. Hatry, and Kathryn E. Newcomer (1994). Handbook of Practical Program Evaluation. San Francisco, CA: Jossey-Bass Publishers.)
7. Subsequent meetings of the program evaluation team are required for the following tasks: to monitor the completion of assignments; to collaborate in the interpretation and analysis of data; to pose any necessary new questions to be answered; to review drafts and provide feedback to the writer; to formulate recommendations, as required, for program improvement, especially to decide whether a recommendation is required to modify or abandon the program if the findings reveal that the program is not succeeding in improving African-American achievement; to assist in final proofreading; and to write a brief executive summary highlighting the program evaluation findings and recommendations.
8. A near-final copy of the program evaluation must be submitted to the Director of PRE at least one month before the deadline for placing the report on the Board's agenda for review and approval. This time is required for final approval by staff, for final editing to ensure accuracy, and for submission to the Superintendent.
9. When the program evaluation is approved for submission to the Board of Education for review and approval, copies of the Executive Summary and complete report must be made for the Board and for members of the Cabinet.
11. The program evaluation team will plan its presentation to the Board of Education on the findings and recommendations.
12. The Director of PRE will prepare the cover memorandum to the Board of Education, including all the required background information:
A. If program modifications are suggested, the steps that the staff members have taken or will take to implement those modifications; if abandonment of the program is recommended, the steps that will be taken to replace the program with another with more potential for the improvement and remediation of African-American students.
B. Names of the administrators who were involved in the program evaluation.
C.
Name and qualifications of the external expert who served on the evaluation team.
D. Grade-level descriptions of the teachers who were involved in the assessment process (e.g., all fourth-grade math teachers, all eighth-grade English teachers, etc.).
13. When the program evaluation is approved by the Board of Education, the team must arrange to have the Executive Summary and the full report copied and design a plan for communicating the program evaluation findings and recommendations to other stakeholders. This plan must then be submitted to the Director of PRE for approval.
14. Each program evaluation team will meet with the Director of PRE after the completion of its work to evaluate the processes and product and to make recommendations for future program evaluations. (See Joellen Killion (2002). Evaluate the Evaluation. Assessing Impact: Evaluating Staff Development. Oxford, OH: National Staff Development Council. 46, 123-124.)

Approved: December 16, 2004

Evaluation Standards

Criteria for Program Evaluations
Policy IL specifies that the evaluations of programs approved in its Board-approved Program Evaluation Agenda will be conducted according to the standards developed by the Joint Committee on Standards for Educational Evaluation. (See Joint Committee on Standards for Educational Evaluation, James R. Sanders, Chair (1994). The Program Evaluation Standards, 2nd Edition: How to Assess Evaluations of Educational Programs. Thousand Oaks, CA: Sage Publications.) They are as follows:

Utility Standards
The utility standards are intended to ensure that an evaluation will serve the information needs of intended users. These standards are as follows:
Stakeholder identification. People involved in or affected by the evaluation should be identified so that their needs can be addressed.
Evaluator credibility.
The people conducting the evaluation should be both trustworthy and competent to perform the evaluation so that the evaluation findings achieve maximum credibility and acceptance.
Information scope and selection. Information collected should be broadly selected to address pertinent questions about the program and should be responsive to the needs and interests of clients and other specified stakeholders.
Values identification. The perspectives, procedures, and rationale used to interpret the findings should be described carefully so that the bases for value judgments are clear.
Report clarity. Evaluation reports should describe clearly the program being evaluated, including its context and the purposes, procedures, and findings of the evaluation, so that essential information is provided and understood easily.
Report timeliness and dissemination. Significant interim findings and evaluation reports should be disseminated to intended users so that they can be used in a timely fashion.
Evaluation impact. Evaluations should be planned, conducted, and reported in ways that encourage follow-through by stakeholders, so that the likelihood that the evaluation will be used is increased.

Feasibility Standards
Feasibility standards are intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.
Practical procedures. Evaluation procedures should be practical so that disruption is kept to a minimum while needed information is obtained.
Political viability. The evaluation should be planned and conducted with anticipation of the different positions of various interest groups so that their cooperation may be obtained, and so that possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted.
Cost-effectiveness. The evaluation should be efficient and produce information of sufficient value so that the resources expended can be justified.

Propriety Standards
Propriety standards are intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.
Service orientation.
Evaluations should be designed to assist organizations to address and effectively serve the needs of the full range of targeted participants.
Formal agreements. Obligations of the formal parties to an evaluation (what is to be done, how, by whom, and when) should be agreed to in writing so that these parties are obligated to adhere to all conditions of the agreement or to formally renegotiate it.
Rights of human subjects. Evaluation design and conduct should respect and protect human rights and welfare.
Human interactions. Evaluators should respect human dignity and worth in their interactions with other people associated with an evaluation so that participants are not threatened or harmed.
Complete and fair assessments. The evaluation should be complete and fair in its examination and recording of strengths and weaknesses of the program being evaluated so that strengths can be built upon and problem areas addressed.
Disclosure of findings. The formal parties to an evaluation should ensure that the full set of evaluation findings, along with pertinent limitations, are made accessible to the people affected by the evaluation, as well as any others with expressed legal rights to receive the results.
Conflict of interest. Conflict of interest should be dealt with openly and honestly so that it does not compromise the evaluation processes and results.
Fiscal responsibility. The evaluator's allocation and expenditure of resources should reflect sound accountability procedures and be prudent and ethically responsible so that expenditures are accounted for and appropriate.

Accuracy Standards
Accuracy standards are intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated.
Program documentation.
The program being evaluated should be described and documented clearly and accurately so that it is identified clearly.
Context analysis. The context in which the program exists should be examined in enough detail so that its likely influences on the program can be identified.
Described purposes and procedures. The purposes and procedures of the evaluation should be monitored and described in enough detail so that they can be identified and assessed.
Defensible information sources. The sources of information used in a program evaluation should be described in enough detail so that the adequacy of the information can be assessed.
Valid information. The information-gathering procedures should be chosen or developed and then implemented in a manner that will ensure that the interpretation arrived at is valid for the intended use.
Reliable information. The information-gathering procedures should be chosen or developed and then implemented in a manner that will ensure that the information obtained is sufficiently reliable for the intended use.
Systematic information. The information collected, processed, and reported in an evaluation should be reviewed systematically so that the evaluation questions are answered effectively.
Analysis of quantitative information. Quantitative information in an evaluation should be analyzed appropriately and systematically so that the evaluation questions are answered effectively.
Analysis of qualitative information. Qualitative information in an evaluation should be analyzed appropriately and systematically so that the evaluation questions are answered effectively.
Justified conclusions. The conclusions reached in an evaluation should be justified explicitly so that stakeholders can assess them.
Impartial reporting. Reporting procedures should guard against distortion caused by personal feelings and biases of any party so that the evaluation reports reflect the evaluation findings fairly.
Meta-evaluation.
The evaluation itself should be evaluated formatively and summatively against these and other pertinent standards so that its conduct is appropriately guided and, on completion, stakeholders can closely examine its strengths and weaknesses.

Exhibit C
SCOPE OF SERVICES
Evaluation of Year-Round Education

This exhibit states the services and products of the Evaluator, who will conduct an Evaluation of the Sponsor's YRE Programs and produce reports of that Evaluation.

Evaluation questions
For this Evaluation, the primary questions are:
I. Has YRE as implemented in the Little Rock School District improved the academic achievement of students identified by the Sponsor as African-American (AA)?
II. Has YRE as implemented in the Little Rock School District decreased the differences between AA students and those identified by the Sponsor as white (W)?
III. To what extent does YRE account for changes in student performance?

Secondary (step-two) questions are:
1. What competing events or programs (relative to YRE) explain changes in student performance?
2. What traits of each group explain their performance and differences in performance between them?
3. What changes in YRE do these results indicate to improve the effectiveness of the programs?
4. How will these recommendations improve AA student performance?

Evaluation design, data, and products
Prior to Evaluator's commencing the Evaluation, Sponsor will agree with Evaluator regarding:
A. theoretical model(s),
B. Evaluation design(s), to conform with summative evaluations of the Comprehensive Program Assessment Process,
C. specific variables for the Evaluation,
D. data adjustments and statistical methods,
E. format(s) of data for use in the Evaluation delivered by the Sponsor to the Evaluator,
F. content of deliverable products (written reports) and their formats, and
G. schedule of services and product delivery.

The following table is the schedule of services and product delivery.
Delivery (2005-06)    Service and Products
February-March        Evaluator and Sponsor will negotiate MOU and agree on design of the Evaluations, their schedules, and instruments.
October 1, 2006       Evaluator will submit draft report of results to PRE.
October               Evaluator will discuss draft reports with PRE and alter report accordingly.
November 1, 2006      Evaluator will submit final report to PRE.

For the purpose of invoicing, Evaluator will track his efforts in increments of days or some portion thereof.

Exhibit D
Year-Round Education Programs

Primary Evaluation Question:
1. Have the Year-Round Education (YRE) Programs been effective in improving and remediating the academic achievement of African-American students?

Supplemental (Qualitative/Step 2) Evaluation Questions:
1. What are the quality and level of implementation of intersession instructional strategies?
2. What are the quality and level of implementation of instructional strategies during regular session?
3. What is the level of participation in YRE Programs by African-American students relative to other ethnic groups at the school?
4. What are the perceptions of YRE teachers regarding program impacts, strengths, and weaknesses?
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses?
6. What are the perceptions of parents/guardians of YRE students regarding program impacts, strengths, and weaknesses?

Program Description
Year-Round Education is a concept that reorganizes the school year so that instruction occurs throughout the year with regularly scheduled breaks interspersed. Instruction and vacations are shorter and spaced throughout the year for more continuous learning and more frequent breaks.
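The "Proposed Design" for the primary question (below) is a treatment-control, pretest-posttest analysis that controls for pretest, gender, ethnicity, and SES. As a minimal, purely illustrative sketch of how such covariate adjustment isolates a program effect, the following fits posttest scores on pretest and a treatment indicator by ordinary least squares. All numbers are synthetic, not District data, and the real analysis would include the full covariate set:

```python
# Illustrative sketch only: a covariate-adjusted (ANCOVA-style) estimate of a
# treatment effect, as in a pretest-posttest, treatment-control design.
# Synthetic data; the actual analysis would also control for gender,
# ethnicity, and SES.

def ols(X, y):
    """Solve the normal equations (X'X) beta = X'y by Gaussian elimination."""
    k = len(X[0])
    # Build X'X and X'y
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    # Forward elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic students: pretest score and treatment flag (1 = YRE/intersession).
# Posttests are generated so the true adjusted treatment effect is 5 points.
pretests  = [40, 55, 60, 72, 45, 58, 63, 70]
treated   = [0, 0, 0, 0, 1, 1, 1, 1]
posttests = [2 + 0.9 * p + 5 * t for p, t in zip(pretests, treated)]

X = [[1.0, float(p), float(t)] for p, t in zip(pretests, treated)]
intercept, pretest_coef, treatment_effect = ols(X, posttests)
print(round(treatment_effect, 3))  # prints 5.0
```

The point of the adjustment is that a raw difference in posttest means would confound the program effect with pre-existing differences in pretest scores; regressing on the covariate removes that confound before the treatment coefficient is read off.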
YRE has emerged nationally as a way to offer all students a better education, regardless of their ethnic background, social status, or academic performance. LRSD's design is a single-track, 45-10 calendar in which all students and teachers in the school are in class or on vacation at the same time. The "45-10" refers to 45 days of instruction in a quarter followed by 10 days of intersession/vacation. Intersession is a five-day program, and attendance is voluntary. Currently there are five elementary schools implementing YRE:

Elementary School   Teachers   Students   % African-American   % Free/Reduced Lunch
Cloverdale          26         360        77                   89
Mablevale           25         257        80                   88
Mitchell            22         156        96                   92
Stephens            39         499        95                   91
Woodruff            21         235        91                   86

Primary Evaluation Question Proposed Design
1. Have the YRE Programs been effective in improving and remediating the academic achievement of African-American students?
Whole School: A treatment-control school, pretest-posttest design will be employed. The analysis will control for pretest, gender, ethnicity, and SES.
Subsample: Within each YRE school, students who participated in intersession will be identified and their achievement gains compared to predicted scores based on school status and student pretest, gender, ethnicity, and SES.

Supplemental (Qualitative/Step 2) Evaluation Questions:
1. What are the quality and level of implementation of intersession instructional strategies?
2. What are the quality and level of implementation of instructional strategies during regular sessions?
Year-Round Education teachers will be interviewed by phone. Year-Round Education classrooms, both regular and during intersession, will be observed.
3. What is the level of participation in YRE Programs by African-American students relative to other ethnic groups at the school?
Student records/archival data for 2003-04 and 2004-05 will be analyzed.
4.
What are the perceptions of YRE teachers regarding program impacts, strengths, and weaknesses?
The Year-Round Education teacher interview and the Year-Round Education Teacher Survey will address this question via closed-ended and open-ended items.
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses?
A survey will be administered to program participants.
6. What are the perceptions of parents/guardians of YRE students regarding program impacts, strengths, and weaknesses?
A Parent Survey will be conducted to address this question via a questionnaire including closed- and open-ended items.

Summary of Data Sources and Participants by Evaluation Question

Primary Question:
1. What are the effects of participation in YRE on student achievement?
   Participants: All grades at YRE schools and other elementary schools; Year-Round Education intersession student participants within the above samples.
   Data sources: Benchmark, ITBS, and school records.

Supplemental Questions:
1. What are the quality and level of implementation of intersession instructional strategies?
   Participants: All YRE teachers. Data sources: Teacher phone interview; classroom observations.
2. What are the quality and level of implementation of instructional strategies during regular session?
   Participants: All YRE teachers. Data sources: Teacher phone interview; classroom observations.
3. What is the level of participation in YRE Programs by African-American students relative to other ethnic groups?
   Participants: All YRE schools. Data sources: School records/archival data.
4. What are the perceptions of YRE teachers regarding program impacts, strengths, and weaknesses?
   Participants: All YRE teachers. Data sources: YRE teacher interview and survey.
5. What are the perceptions of participating students regarding program impacts, strengths, and weaknesses?
   Participants: YRE students grades 4 and 5. Data sources: YRE student survey.
6. What are the perceptions of parents/guardians of YRE students regarding program impacts, strengths, and weaknesses?
   Participants: Parents of YRE students. Data sources: YRE parent survey.

Timelines
February-March: Planning, refinement, and consultation with PRE and YRE experts, and instrument development
March: Begin YRE classroom observations and YRE teacher interviews
March-April: Survey YRE school teachers and complete YRE teacher interviews
May-June: Records/archival data analyses
July-September: Analyze achievement data; complete survey and interview analyses
October: Submit draft report of findings to PRE and receive feedback from PRE
November 1: Finalize and submit report to PRE

RECEIVED OCT 25 2005, OFFICE OF DESEGREGATION MONITORING

Memorandum of Understanding
This Memorandum of Understanding (hereinafter MOU), effective the first day of February 2005 (hereinafter the Effective Date), is entered into by and between James S. Catterall (hereinafter Evaluator), Graduate School of Education & Information Studies, University of California, Los Angeles, CA 90024, and Little Rock School District (hereinafter Sponsor), whose offices are located at 810 West Markham Street, Little Rock, AR 72201.

WITNESSETH
WHEREAS, Sponsor, to comply with the June 30, 2004 Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division, and Program Evaluation Standards, will hire outside consultants to prepare formal, step-two evaluations; and
WHEREAS, Evaluator possesses unique knowledge and experience relating to such formal step-two evaluations and Program Evaluation Standards;
NOW, THEREFORE, in consideration of the premises and the mutual covenants and conditions hereinafter recited, the Sponsor and Evaluator do hereby agree as follows:
1.
Definitions
For purposes of this MOU, the following definitions apply:

RECEIVED OCT 20 2005, OFFICE OF DESEGREGATION MONITORING

1.1 Compliance Remedy shall mean the entire June 30, 2004 Memorandum Opinion by the United States District Court for the Eastern District of Arkansas, Little Rock Division in Little Rock School District v. Pulaski County Special School District No. 1 et al., Mrs. Lorene Joshua et al. and Katherine Knight et al. Intervenors (Exhibit A).
1.2 MOU Period shall mean the period commencing on the Effective Date of this MOU and terminating on November 1, 2005. The term of this MOU may be extended by the mutual written consent of the duly authorized representatives of Evaluator and Sponsor.
1.3 Formal step-two evaluation (hereinafter Evaluation) shall mean a summative evaluation of Sponsor's Year-Round Education (hereinafter YRE) program conducted by the Evaluator according to the Sponsor's Comprehensive Program Assessment Process and described more fully in Exhibit B, which is incorporated herein by reference. The Evaluation ascertains differences among schools as well as for the LRSD.
1.4 Comprehensive Program Assessment Process (Exhibit B) shall mean the process required by the Compliance Remedy, adopted by Sponsor's Board of Directors on December 16, 2004, and incorporated as Appendix B in the first quarterly written update by the Sponsor to the Office of Desegregation Monitoring and Joshua, December 1, 2004.
1.5 Evaluation Funds shall mean those funds paid by the Sponsor to the Evaluator in accordance with this MOU.
1.6 Evaluation Team shall mean the Evaluator and any personnel under the Evaluator's direction and control who are supported in whole or in part by the Evaluation Funds.
1.7 Planning, Research, and Evaluation (hereinafter PRE) shall mean Sponsor's department, which shall represent the Sponsor and oversee the Evaluation.
1.8 Proprietary Information shall mean any data, information, concepts, routines, artwork, design work, advertising copy, specifications, or improvement that is commercially valuable, not generally available to or known in the industry, and belonging to Evaluator. Proprietary Information shall not include information which: (a) is or becomes a part of the public domain through no act or omission of the receiving party; (b) was in the receiving party's lawful possession prior to the disclosure and had not been obtained by the receiving party either directly or indirectly from the disclosing party; (c) is lawfully disclosed to the receiving party by a third party without restriction on disclosure; (d) is independently developed by the receiving party; or (e) is disclosed by operation of law.
2.0 Confidential Information shall mean data or information related to the identities of ind

This project was supported in part by a Digitizing Hidden Special Collections and Archives project grant from The Andrew W. Mellon Foundation and Council on Library and Information Resources.