Compliance hearing exhibits, "Planning for Program Evaluation"

PLANNING FOR PROGRAM EVALUATION

1. Memorandum to designated principals from Mona Briggs, Aug. 23, 1999, providing information on standards for accreditation from ADE
2. Memorandum to elementary staff, Jan. 20, 1999, relating to an ADE evaluation of Early Literacy Learning in Arkansas (ELLA)
3. Memorandum to Kathy Lease and Ed Williams, June 29, 1999, on program evaluation, with attached articles on qualitative research and an example of a research report from Austin ISD by Glynn Ligon
4. Memorandum to Division of Instruction, Feb. 1, 2000, with agenda relating to program implementation
5. E-mail to Virginia Johnson and Debbie Milam, Feb. 4, 2000, suggesting a model for the evaluation of ViPS programs
6. Memorandum in March 15, 2000, Learning Link relating to progress made by schools implementing the ALT assessment program
7. Document from Kathy Lease: calendar of meetings with Dr. Steve Ross since March 15, 2000, with attached planning document on program evaluation
8. E-mail to Bonnie Lesley, Mar. 24, 2000, providing information about a meeting with Dr. Steve Ross to discuss the middle school evaluation
9. E-mail to Kathy Lease, May 23, 2000, providing feedback on proposed middle school student survey
10. E-mail to Bonnie Lesley, Marian Lacey, and Sadie Mitchell, June 12, 2000, from Les Carnine requesting information about the middle school evaluation
11. E-mail from Steve Ross to Kathy Lease, June 27, 2000, with attached design notes for Title I/Elementary Literacy Program Evaluation
12. E-mail from Kathy Lease to her staff, Aug. 6, 2000, requesting them to place the memorandum and program evaluations on the Board agenda
13. E-mail from Kathy Lease to Les Carnine, Aug. 10, 2000, providing copies of drafts of the ESL and middle school evaluations, then his questions and her answers
14. Memorandum to Board of Education, Aug. 24, 2000, from Kathy Lease presenting the program evaluations: Title I/Elementary Literacy, CPMSA (mathematics and science), English as a Second Language, and Middle School Transition and Program Implementation. Attached is her PowerPoint presentation: Program Evaluation.
15. E-mail from Steve Ross to Les Carnine, Sept. 7, 2000, giving his feedback to the program evaluation reports
16. E-mail from Debbie Milam to Cabinet members, Sept. 20, 2000, requesting permission to conduct interviews of parents on the subject of parental involvement
17. E-mail from Kathy Lease to staff, Oct. 11, 2000, advising them of an upcoming meeting with Dr. Steve Ross related to program evaluation
18. E-mail from Virginia Johnson to Bonnie Lesley and Vanessa Cleaver, Oct. 20, 2000, relating to our required participation in an evaluation study conducted by the National Science Foundation
19. Memorandum to Gene Jones, ODM, from Kathy Lease, Oct. 27, 2000, inviting him to an intensive work session with Dr. Steve Ross on program evaluation
20. Document prepared by PRE in November 2000 that lists Additional Programs and Strategies Requesting Evaluation
21. E-mail to Cabinet members from Kathy Lease, Nov. 28, 2000, attaching Dr. Steve Ross's planned presentation to the Board of Education on Using Evaluation for Program Improvement: Lessons Learned
22. E-mail from Bonnie Lesley to Virginia Johnson, Jan. 2, 2001, setting up a meeting to finalize CPMSA program evaluation plan
23. E-mail from Virginia Johnson to Bonnie Lesley, Jan. 3, 2001, attaching her tentative plan
24. E-mail from Kathy Lease to Les Carnine and Junious Babbs, Jan. 5, 2001, providing information relating to outsourcing program evaluations to Dr. John Nunnery
25. E-mail from/to Virginia Johnson, Jan. 5-20, 2000, relating to submission of Core Data Elements to the National Science Foundation
26. E-mail from/to Virginia Johnson, Apr. 14-16, 2000, relating to CPMSA program evaluation issues
27. E-mail from Kathy Lease to Les Carnine, Jan. 22, 2001, attaching a draft of the work from Dr. John Nunnery
28. Memorandum (one of several) from Kathy Lease, Jan. 24, 2001, inviting participants to the first meeting of the Research Committee
29. Memorandum from Kathy Lease to John Walker, Jan. 24, 2001, inviting him to participate in first meeting of the Research Committee
30. Agenda for Feb. 5, 2001, meeting of the Research Committee and sign-in sheet
31. E-mail from Bonnie Lesley to Eddie McCoy, Ed Williams, and Karen Broadnax, Feb. 16, 2001, to set up a meeting to discuss ESL program evaluation
32. Memorandum from Kathy Lease to Research Committee setting up Feb. 26, 2001, meeting
33. Agenda for Feb. 26, 2001, Research Committee meeting and sign-in sheet
34. Invoice from Dr. John Nunnery to LRSD for services rendered, February-March 2001
35. E-mail from Bonnie Lesley to CPMSA staff, Feb. 21, 2001, setting up a meeting to discuss the CPMSA program evaluation
36. E-mail from Virginia Johnson to Bonnie Lesley, March 14, 2001, providing updates
37. E-mail to middle school staff from Bonnie Lesley, Mar. 15, 2001, summarizing a meeting to plan for a Middle School Team Leaders Institute, including recognition of the need to train team leaders on assessment and using data
38. E-mail from Bonnie Lesley to CPMSA staff, Mar. 19, 2001, setting up follow-up meeting to discuss CPMSA program evaluation
39. Memorandum to Carnegie Management Team, March 20, 2001, from Bonnie Lesley with information about counseling program and need for a program evaluation
40. Memorandum from Kathy Lease to Research Committee, Apr. 16, 2001, setting up next meeting on summer school evaluation and program evaluation for the National Science Foundation grant
41. Sign-in sheet for Apr. 23, 2001, meeting of the Research Committee
42. E-mail from Bonnie Lesley to Dennis Glasgow, Suzi Davis, and Laura Beth Arnold, April 17, 2001, to discuss program evaluation for Element 5 of the Safe Schools/Healthy Students project
43. E-mail from Virginia Johnson to Bonnie Lesley, Apr. 18, 2001, relating to next steps in providing information about SAT9 item analyses for teachers
44. E-mail from Mona Briggs to Bonnie Lesley, Apr. 25, 2001, relating to survey needs for national evaluation of Safe Schools/Healthy Students project
45. E-mail from Dennis Glasgow to elementary and middle school staff, Apr. 26, 2001, summarizing a large-scale study that links classroom practices to student achievement in mathematics
46. E-mail among team working on CPMSA program evaluation, Apr. 18-May 2, 2001, relating to model for program evaluation and data analysis
47. E-mail from Kathy Lease to Research Committee, May 2, 2001, with attached latest version of the Guidelines for Program Evaluations
48. Agenda for May 7, 2001, meeting of the Research Committee and sign-in sheet
49. E-mail from Don Crary to Bonnie Lesley, May 24, 2001, announcing that a program evaluator had been hired by New Futures to conduct the program evaluation for Safe Schools/Healthy Students
50. E-mail from Kathy Lease to Research Committee with attached memorandum relating to next meeting on June 11, 2001
51. Agenda for June 11, 2001, meeting of the Research Committee and sign-in sheet
52. E-mail from Junious Babbs to Bonnie Lesley, June 12, 2001, relating to information on program evaluation
53. E-mail from Kathy Lease to Compliance Team, June 14, 2001, with an outline of a plan for the completion of the Middle School Evaluation
54. E-mail from Kathy Lease to Research Committee, June 14, 2001, attaching a copy of final draft of Dr. Nunnery's evaluation of the mathematics/science programs
55. E-mail from Dennis Glasgow to Ed Williams, July 3, 2001, requesting additional ALT reports
56. E-mail from Vanessa Cleaver to others working on CPMSA program evaluation, July 10, 2001, requesting help in publishing a three-year progress report on the CPMSA

Planning, Research, and Evaluation
Instructional Resource Center
3001 South Pulaski Street
Little Rock, Arkansas 72206

August 23, 1999

To: Principals Designated for Standards Review 1999-2000 (Carver, Cloverdale E., Geyer Springs, Gibbs, Hall, King, Mabelvale E., Meadowcliff, and Pulaski Heights Middle)
From: Mona Briggs, Technical Assistance Team
Through: Dr. Kathy Lease, Assistant Superintendent for PRE
RE: Standards Compliance Checklist

As you know, I have invited Bettye G. Davis, the specialist at the Department of Education for Standards Assurance, to meet with us on September 29, 1999, at 3:30 p.m. (IRC, room 12). In order to further your understanding of the standards for accreditation, I am providing you with a copy of the compliance checklist that was furnished to us last year. While there may be some minor changes in this year's checklist, this will give you a sense of the documentation you will be collecting for the state's visit.
If you will take a moment and review this document, it may help clarify what is involved and may also serve as a catalyst for formulating questions that you may want Ms. Davis to address during our meeting. If you have any questions, you may e-mail me at mrbrigg@irc.lrsd.k12.ar.us or call me at 324-2120. (If you are not able to attend the September meeting, please notify my office and give me the name of the person who will be your designee at the meeting.)

ARKANSAS PUBLIC SCHOOLS
STANDARDS FOR ACCREDITATION COMPLIANCE CHECKLIST (Rev. 6/98)

LEA # _____  District _____  School _____  Grade Levels _____  Enrollment _____  Field Services Specialist _____  Date _____

Each standard is marked Yes or No, with evidence/comments noted; bracketed citations refer to sections of the Standards.

I. GOALS, POLICIES, AND PROCEDURES

A. Policies and actions are non-discriminatory and in compliance with state and federal laws. Evidence: Equity Compliance Report; Equity Assistance Center Verification. [I]
B. State and National Goals. [II.A]
C. School District Goals. [II.B]
   1. The district's five-year educational plan (all schools' COE plans) has been developed, with staff and community participation. It is reviewed annually, and public meetings (district and school) are held to discuss progress. This information is published annually. Evidence: Newspaper article(s); date and attendees for meeting. [II.B]
D. School District Administration. [II.C]
   1. School board policies. Evidence: Copy of school board policies. [II.C.1]
   2. Reports and records. Evidence: Test results on file with ADE. [II.C.2]
E. The school board held a public meeting to review progress toward accomplishing district goals and accreditation. Evidence: Minutes. [II.C.3]
F. School Goals. [I.D]
   1. The school has an appropriately developed and reviewed school improvement plan. (Reviewed under I-C.) Evidence: COE plan. [I.D.1]
   2. The school has an in-depth five-year curriculum review. (Reviewed under II-A-1.) [I.D.2]
G. The community is actively engaged in the educational program. Evidence: Appropriate documentation. [III]
H. The discipline policies are written and filed according to established guidelines. Evidence: Discipline policies; signed written statements. [V.D]
I. There is a written policy that governs participation in extracurricular activities. Evidence: Extracurricular activities policy. [V.E-F]
J. There is a written homework policy. Evidence: Homework policy. [V.G]
K. The enrollment and attendance policy is consistent with applicable laws and regulations. Evidence: Policy. [VI]
L. Grades assigned to students reflect only educational objectives and are consistent with laws and regulations. Evidence: Grading policy. [VII.B]

II. CURRICULUM
   2. Time is scheduled for instruction in the core curriculum (language arts, math, social studies, science). Evidence: Schedule/Observation. [IV]
   3. Time is scheduled for instruction in the other curriculum areas as specified in the Standards. Evidence: Schedule/Observation. [IV]

C. Grades 5-8 [IV]
   1. Instruction is developmentally appropriate. Evidence: Observation.
   2. Time is scheduled for instruction in the core curriculum (language arts, math, social studies, science). Evidence: Schedule/Observation.
   3. At least one semester of Arkansas history is taught in grade seven or eight (or in grades 9-12). Evidence: Schedule/Observation.
   4. Time is scheduled for instruction in the other curriculum areas as specified in the Standards. Evidence: Schedule/Observation.

D. Grades 9-12 [IV] (* may be taught every other year)
   1. Language Arts - 6 units: 4 units English; 1 unit oral communications, or 1/2 unit oral communications and 1/2 unit drama*; 1 unit journalism*
   8. Health and Safety Education and Physical Education - 1 1/2 units: 1 unit physical education; 1/2 unit health and safety education
   9. Tech Prep and Applied Technology - 9 units (EIGHT UNITS MUST BE TAUGHT EVERY YEAR.)

III. GRADUATION REQUIREMENTS

A. All graduates have completed a minimum of 21 units of credit. [IX.A]
B. A unit of credit is awarded for a minimum of 120 clock hours of instruction. [IX.B]
C. All graduates have completed the following 15 units of credit (Evidence: Transcripts) [IX.C]:
   English - 4 units
   Oral Communications - 1/2 unit
   Social Studies - 3 units, or 2 units of social studies and 1 unit of vocational/technical studies
   Mathematics - 3 units
   Science - 3 units, at least 1 science unit in a life science and 1 in a physical science
   Physical Education - 1/2 unit
   Health and Safety Education - 1/2 unit
   Fine Arts - 1/2 unit
   (An illustrative check of these unit minimums appears in the sketch after this checklist excerpt.)
D. All graduates completing the college preparatory path of study have completed the following units (Evidence: Transcripts) [IX]:
   Science - 1 unit of biology (or equivalent), and 1 unit of chemistry or 1 unit of physics (or equivalent)
   Mathematics - 1 unit of Algebra I (or equivalent), 1 unit of Algebra II, and 1 unit of geometry
   Social Studies - 1 unit of World History or Cultures, 1 unit of American History, 1/2 unit of Civics/American Government, and 1/2 unit elective
   Foreign Language - 2 units of 1 foreign language
E. All graduates completing the technical preparatory path of study have completed the following units (Evidence: Transcripts) [IX]:
   Science - at least 2 units whose content is equivalent to science courses in the college preparatory track
   Mathematics - 2 units whose content is equivalent to mathematics courses in the college preparatory track
   Social Studies - 1 unit of American History, 1 unit of World History or Global Studies, and at least 1/2 unit of Civics/American Government
   Vocational credits - 4 credits in a vocational sequence
F. Honor graduates are selected according to the guidelines established by the Rules and Regulations as related to Act 980 of 1991. Evidence: Transcripts. [IX]

IV. TEACHERS

A. Student-teacher interaction time is a minimum of 178 days. [V.A.1]
B. Teacher contracts are a minimum of 185 days, including 5 days staff development and in-service training. [V.A.2]
C. The planned instructional time in each school day does not average less than six hours per day or thirty hours per week. Evidence: School calendar; daily schedule. [V.A.4]
D. Student/teacher ratio:
   1. Kindergarten - 20/1, or 22/1 with a half-time instructional aide. Evidence: Annual reports/Observation. [V.B.2]
   2. Grades 1-3 - 23/1, with no more than 25 in a classroom. Evidence: Annual reports/Observation. [V.B.3]
   3. Grades 4-6 - 25/1, with no more than 28 in a classroom. Evidence: Annual reports/Observation. [V.B.4]
   4. Grades 7-12 - each class has 30 students or less, and no more than 150 students per day. Evidence: Annual reports/Observation. [V.B.5]
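Because Section III is in effect a rule table, a district records system could check each transcript against it mechanically. The sketch below illustrates that kind of check under simplified assumptions (it ignores the alternate social-studies option and the path-specific rules); the subject labels mirror Section III.C, but the data layout and function are hypothetical, not part of any ADE system.

```python
# Illustrative only: checks earned units against the Section III
# graduation minimums (21 total units; 15 specified units). The data
# layout and function name are hypothetical, not an ADE system.

CORE_MINIMUMS = {
    "English": 4.0,
    "Oral Communications": 0.5,
    "Social Studies": 3.0,   # or 2.0 plus 1.0 vocational/technical
    "Mathematics": 3.0,
    "Science": 3.0,          # incl. 1 life science and 1 physical science
    "Physical Education": 0.5,
    "Health and Safety Education": 0.5,
    "Fine Arts": 0.5,
}
TOTAL_MINIMUM = 21.0

def graduation_deficiencies(units_by_subject):
    """Return a list of deficiency messages; empty means the
    transcript meets the minimums checked here."""
    problems = []
    total = sum(units_by_subject.values())
    if total < TOTAL_MINIMUM:
        problems.append(f"total units {total} < {TOTAL_MINIMUM}")
    for subject, minimum in CORE_MINIMUMS.items():
        earned = units_by_subject.get(subject, 0.0)
        if earned < minimum:
            problems.append(f"{subject}: {earned} < {minimum}")
    return problems

print(graduation_deficiencies({"English": 4, "Mathematics": 3}))
```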
LITTLE ROCK SCHOOL DISTRICT
INSTRUCTIONAL RESOURCE CENTER
3001 PULASKI STREET
LITTLE ROCK, AR 72206
(501) 324-2131

January 20, 1999

TO: Gene Parker, Judy Milam, Judy Teeter, Sadie Mitchell, Kris Huffman, Pat Price, Ann Freeman, Kathy Lease, Ed Williams, Frances Cawthon
FROM: Dr. Bonnie Lesley, Associate Superintendent for Instruction
SUBJECT: Evaluation of ELLA

Please see the attached letter. You may receive questions from some of our schools.

BAL/rcm

DEPARTMENT OF EDUCATION
4 STATE CAPITOL MALL
LITTLE ROCK, ARKANSAS 72201-1071
(501) 682-4475
RAYMOND SIMON, Director

MEMORANDUM

DATE: January 5, 1999
TO: Superintendents
FROM: Dr. Kevin Penix, Assistant Director, School Improvement and Instructional Support
SUBJECT: Initiation of Impact Study for Districts Participating in the Second Year of Early Literacy Learning in Arkansas (ELLA)

The Arkansas Department of Education (ADE) is conducting a study to determine the effectiveness of the Early Literacy Learning in Arkansas (ELLA) training and its impact on student achievement. Teachers who are currently enrolled in the second year of ELLA will provide data on three to five selected students in their classrooms. The data collected will be current information from the Observation Survey and the Developmental Reading Assessment. These assessments are administered as part of the requirement for participating in ELLA and should not utilize any additional class time.

Enclosed is a sample of the form that will be used to record the data on each student. Data compiled on students who are in the classrooms of the targeted ELLA teachers will be assigned an identification number to maintain anonymity and provide for future follow-up studies. The assessments listed will be administered as a pretest during the first twelve weeks of school and as a posttest during the first of May or prior to school year completion. A control group will also be used in reviewing the impact of the ELLA training. Data have already been compiled on these students as part of a random sample study in coordination with the University of Arkansas at Little Rock, Reading Recovery/Early Literacy Training Center.

The study for ELLA is in response to the Smart Start Initiative and the accountability of the staff development being offered by the ADE to meet the needs of participating teachers. The ADE is working with the University of Arkansas Research Center in the compilation and analysis of this study. This is part of an ongoing process to assess this staff development's impact on student achievement. If you need additional information or have questions regarding this study, please contact either the Early Childhood Curriculum Specialist at the Education Service Cooperative in your area or contact the Early Childhood/Reading Unit at 501-682-5615.

STATE BOARD OF EDUCATION
Chairman: BETTY PICKETT, Conway
Vice Chairman: JoNELL CALDWELL, Bryant
Members: EDWIN B. ALDERSON, JR., El Dorado; CARL E. BAGGETT, Rogers; MARTHA DIXON, Arkadelphia; WILLIAM B. FISHER, Paragould; LUKE GORDY, Van Buren; ROBERT HACKLER, Mountain Home; JAMES McLARTY III, Newport; RICHARD C. SMITH, JR., McGehee; LEWIS THOMPSON, JR., Texarkana; ANITA YATES, Bentonville
An Equal Opportunity Employer
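The study described in the ADE memorandum above is a pretest/posttest comparison between students of ELLA-trained teachers and a control group. The sketch below shows the core gain computation only; the record fields, the sample numbers, and the plain mean-gain contrast are assumptions for illustration, since the memorandum does not specify ADE's analysis model.

```python
# Illustrative pretest/posttest gain comparison for an ELLA-style
# impact study. Field names and numbers are hypothetical; the ADE
# memorandum does not specify the actual analysis.

def mean_gain(records):
    """Average posttest-minus-pretest gain across student records."""
    gains = [r["post"] - r["pre"] for r in records]
    return sum(gains) / len(gains)

ella_students = [{"pre": 14, "post": 22}, {"pre": 10, "post": 19}]
control_students = [{"pre": 13, "post": 17}, {"pre": 11, "post": 15}]

difference = mean_gain(ella_students) - mean_gain(control_students)
print(f"Mean gain difference (ELLA minus control): {difference:.1f} points")
```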
Early Literacy Learning in Arkansas Impact Study 1998-99

Cooperative _____  District _____  School _____  LEA # _____
Teacher _____  Phone _____

(The form repeats the following block for each of three students.)

Student's Identification # _____  Birthdate _____  Age _____  Grade _____
Gender (check one): __ Male  __ Female
Ethnicity (check one): __ Black  __ White  __ Hispanic  __ Asian  __ Native Am.  __ Other
This is the __ first year / __ second year the student has been under the instruction of an ELLA-trained teacher.
Date of pretest _____  Date of post-test _____
Observation Survey subtests (pre and post): Letter ID; Word Test; CAP; Writing Vocabulary; Dictation; Text Reading Level. Beavers DRA (pre and post).

LITTLE ROCK SCHOOL DISTRICT
INSTRUCTIONAL RESOURCE CENTER
3001 PULASKI STREET
LITTLE ROCK, AR 72206
(501) 324-2131

June 29, 1999

TO: Kathy Lease, Ed Williams
FROM: Dr. Bonnie Lesley, Associate Superintendent, Instruction
SUBJECT: Program Evaluation

I found in my files the attached documents, which may be helpful:
1. A couple of articles on qualitative research.
2. A copy of the executive summary of a research report from Austin ISD by Glynn Ligon. Please note both the content and the format.

Attachment
BAL/rcm

Spindler, G., and Spindler, L. "Roger Harker and Schonhausen: From Familiar to Strange and Back Again." In Doing the Ethnography of Schooling. Edited by G. Spindler. New York: Holt, Rinehart, and Winston, 1982.
Stevenson, C. "A Phenomenological Study of Perceptions about Open Education Among Graduates of the Fayrweather Street School." Unpublished doctoral dissertation, University of Connecticut, Storrs, 1979.
Updike, John. Rabbit Is Rich. New York: Knopf, 1981.
Varenne, H. "Jocks and Freaks: The Symbolic Structure of the Expression of Social Interaction Among American Senior High School Students." In Doing the Ethnography of Schooling. Edited by G. Spindler. New York: Holt, Rinehart, and Winston, 1982.
Webb, E., Campbell, D., Schwartz, R., and Sechrest, L. Unobtrusive Measures. Chicago: Rand McNally and Co., 1966, p. 9.
Wilcox, K. "Ethnography As a Methodology and Its Implications to the Study of Schooling." In Doing the Ethnography of Schooling. Edited by G. Spindler. New York: Holt, Rinehart, and Winston, 1982.
Wilcox, K. Schooling and Socialization for Work Roles: A Structural Inquiry into Cultural Transmission in an Urban Community. Doctoral dissertation, Harvard University, Cambridge, 1978.
Wigginton, E. Foxfire. New York: Anchor Books/Doubleday, 1972.
Wolcott, H. The Man in the Principal's Office. New York: Holt, Rinehart, and Winston, 1973.

A Response to Rogers
WILLIAM D. CORBETT

"If you don't know where you are going, any road will get you there." (The Talmud)

The direction of education should be based on the proven successes of the past and present. Identification of proven success, however, is not as clear cut as it would appear to be because of the complexity of the educational process and the diversity of the constituencies we serve. It is from the multitude of components that contribute to good education and the variety of efforts made by innovators that we expect researchers to assist us in mapping our course.

Since educational research affects the lives of practitioners as well as the students served, it is valuable to have a lucid description of the two major types of research by a person who is a recognized leader in the field. Vincent Rogers has depicted the strengths and weaknesses of both quantitative and qualitative research and offered cogent examples of each technique. The chapter should be excerpted from ASCD's Yearbook and placed on the required reading list of those who are preparing for teaching and administrative careers. Current practitioners should also read the chapter with care.

Those of us who are public or private school practitioners have been both beneficiaries of sound research and victims of poor research. The very word "research" tends to lend authority to headlines, however outrageous, to meet the public's appetite for news:

Class Size a Factor in Reading Success
Class Size Not Important in Educational Achievement
Open Education Proves Successful in Affective Education
Research Shows Traditional Approaches Best for Basics
Reading Scores Improving
Study Shows High School Graduates Are Illiterate

Headlines like these confuse the public and frustrate educators. They indeed embarrass serious researchers. Much of the questionable research that gains wide attention is "so called" hard data research. It is often dependent upon the results and analysis of multiple choice, fill-in, machine-scored tests. Deductions drawn from this type of research are statistical, with seldom a careful look at the instruments used, not to mention the effects these instruments have on the educational process. The more the multiple choice, fill-in instruments are used to draw educational conclusions, the more the emphasis is placed on them at all levels. Education, at least in the United States, is correspondingly diminished to serve these evaluation procedures.

Qualitative research is much more expensive and requires unusual sensitivity and experience in both process and analysis. Let it be said that meaningful educational research of all kinds is costly and needs talented and perceptive directorship.
Let it also be said that a great disservice to both education in general and to conscientious researchers in particular is done by the several who engage in shallow research. Perhaps it is time for serious researchers like Vincent Rogers, John Goodlad, and others to call for a permanent blue-ribbon research monitoring committee to rank educational research according to its integrity. The committee would be ready to analyze and answer authoritatively the shoddy pieces of research that appear periodically in the news media. In this manner the word "research" would reacquire the respect it deserves, and those affected by research would give it the attention necessary to chart the direction of education.

William D. Corbett is Principal, James Russell Lowell School, Watertown Public Schools, Watertown, Massachusetts.

The Price of Everything, the Value of Nothing
ANNE RONEY

In his essay, Vincent Rogers defends qualitative research. Perhaps his need to do so points more to certain predilections in ourselves than to deficiencies in qualitative methods. We are as entranced by numbers as crows are by shiny objects. How do we explain this attraction?

Our delight with numbers probably goes back to the very moment when, as young children, we first counted six cookies on a plate or 23 cows alongside the highway or 15 days until Christmas. What precision and economy of expression! What power it was to realize that an unknown (that pile of cookies) could be counted and thus controlled, manipulated to divide the pile or to win the game or to sequence time itself. So we began to attack unknowns with numbers, using ever more sophisticated calculations. In a society with competing traditions, populated by people from many nations, and striving to move forward, we encountered many unknowns or, at least, questions for which previous answers no longer sufficed. This faith in the quantitative was reinforced on every hand as our penchant for problem-solving bloomed into technology.

For some of us the faith occasionally dimmed. As a new teacher, I was dismayed when the librarian stopped at my door, form in hand, and inquired as to my circulation total for the month. I had not kept any circulation figures, I told her, searching about in my mind for a way to construct a number. I stammered something about having 32 students and having been to the library two or three times. She said, "Oh, that's all right. I'll just put down '150' for you. That's close enough." She went on her way. With her went my incorrect number, wafting its way through the bureaucratic channels, making wrong every other number it touched. Realizing how often such estimates are entered on forms, I became skeptical about numbers. They are very nearly all ball-park figures, used more for their economy of expression than for their precision.

Of course researchers are not as naive about the precision of numbers as I was. They have devised all sorts of safeguards and hedges: the standard error of measurement, Type I and Type II errors, levels of significance, degrees of freedom, random selection, and so on. Each safeguard fulfills a necessary function and, in so doing, makes the resultant numbers more authoritative than ever. But, transformed in analysis, the original bit of data has been so far removed from its origin as to be unrecognizable even to its mother.

Anne Roney is Elementary Supervisor, Department of Public Instruction, Knox County, Knoxville, Tennessee.

In addition to the seduction of numbers, we must contend with both a predilection for method and the unwise application of quantitative designs. Apparently, pioneer educational researchers came from agriculture and psychology and were constantly glancing with envy at the laboratory experiments of chemists and physicists. Using these models in education, we have applied spelling treatments to classroom groups as if we were applying fertilizers to plots of corn; and we have counted the responses of students in class discussions as if they were rats in a maze. We have thus removed the variables under study from their setting (the school or the social group), such removal being a condition of the quantitative design.

Researchers have not set out to isolate their problems from context. Ideally, each problem worthy of inquiry is derived from both a situation and a review of related research and literature. But in doctoral dissertations, Chapter Twos are often deadly, and the lines of thought connecting them to problem, methodology, and findings are likely to be less than clear and direct. In other research reports, the space devoted to the review of the literature and the rationale for the study is usually much less than the space given to methodology. Preoccupied with design rather than utility, the researcher is compelled to explicate his/her mathematics for the benefit of other researchers; that is, to share the recipe whether or not the pudding is worth eating.

Quantitative designs are often precise and elegant. We get caught up in their tight beauty in the same way that we admire an architect's elevation drawings, whose delineated grace may obscure the clumsiness of the resulting structure. It is lack of attention to context and overemphasis on the means instead of emphasis on the ends that make the use of research discouraging to the practitioner.

Even if early educational researchers had derived their methods from sociology, anthropology, and history, we probably could not have escaped the American romance with quantitative methods. And would we want to? Oh, no. As Rogers pointed out, quantitative methods are effective and useful. The power of numbers is particularly persuasive, as I found on a winter morning when the heater in a portable classroom had been turned off the night before. The teacher had complained to me (her principal) about the cold, but it was only when she sent a note saying "It is 42 in here" that I jumped up and arranged for her class to occupy the cafeteria.
Numbers give substance and specificity to description; they support or fail to support our judgments and our hunches; they enable us to evaluate reported information. Indeed, a school leader would be lost without his/her quantitative litany: How soon? How many? How often? Out of how many chances? At what cost?

Quantitative approaches stem from our logical and analytical ways of knowing. What they do not give us is the context, the setting, the framework of meaning that surrounds each problem and that would enable us to address it. Qualitative methods permit the scrutiny and analysis of individual variables while preserving the setting under study. The reports of qualitative research are written as narratives, which have the advantage of accessibility of meaning to the reader, being full of concrete references and identifiable characters. If we deal only with quantitative data, like Oscar Wilde's cynic, we know "the price of everything and the value of nothing." Numbers cannot tell the whole story. It is qualitative information that arises from and addresses the holistic and intuitive ways of knowing that the true scientist does not fear.

References

Mead, Margaret. Coming of Age in Samoa. Laurel Edition. New York: Dell Publishing Company, 1961.
Wilde, Oscar. Complete Works of Oscar Wilde. London: Collins, 1981, p. 418.

91.29
Austin Independent School District
Department of Management Information
Office of Research and Evaluation

Drug-Free Schools 1991-92 Evaluation Report: Executive Summary
Author: Kristen Bliss

Program Description: The Drug-Free Schools and Communities (DFSC) Act of 1986 provides funding to school districts to combat drug and alcohol abuse on their campuses. In 1991-92, its fifth year of funding, the Austin Independent School District (AISD) received $464,924 from the DFSC grant. An additional $165,745 was carried over from 1990-91, for a total of $630,669. These grant monies fund a wide assortment of District programs aimed at drug abuse prevention and education. Program components funded during the 1991-92 school year included: Student Alcohol and Drug Education and Prevention Program, Peer Assistance and Leadership, Conflict Resolution Project, Student Assistance Program, Drug Abuse Resistance Education, Elementary Curriculum, MegaSkills, Office of Student Intervention Services, Private Schools, Education for Self-Responsibility II, Medicine Education and Safety Program, Parent Involvement, AU Well Health Services Program, AISD Campus Police, and Read Pilot. The grant also provided for both a full-time evaluation associate and a program facilitator.

Major Findings:
1. Students whose parents participated in the MegaSkills workshops had higher test scores than the national average, as well as higher attendance, lower discipline, and lower retention rates than other elementary students districtwide (pp. 26-30).
2. Both staff and PAL students agreed that the PAL program is an effective way for older students to help younger students avoid problems with drugs and alcohol. Dropout rates for secondary students served by the program both semesters were lower than predicted, and GPAs for these students were higher than their GPAs for the previous school year (pp. 10-15).
3. DARE is perceived by both teachers and the DARE officers as an effective way to communicate important information to students about the effects of drugs and alcohol. The officers are satisfied with the fifth-grade curriculum but believe the seventh-grade curriculum is not age appropriate and does not convey the no-use message as effectively as the fifth-grade curriculum (pp. 20-23).
4. Dropout rates for all secondary students participating in the Student Alcohol and Drug Education and Prevention Program were below prediction, and the retention rate of elementary program participants was lower than that of other elementary students districtwide. Two thirds of the students reported that they learned about the dangers of drugs and alcohol, felt more confident, were better able to make decisions, and saw themselves as leaders after participation in the workshops (pp. 6-9).
5. High school students rank the use of drugs and drinking/alcoholism in the top five of the biggest problems with which their school must deal, while teachers at all grade levels, campus professionals, and campus administrators do not consistently rank them in the top 10 (p. 4).
6. There are considerable differences between high school students' perceptions of the prevalence of illegal drugs and alcohol on their campuses and their teachers' perceptions. Districtwide surveys found that the majority of high school teachers, administrators, and campus professionals believe the presence of drugs is staying the same, while most high school students believe it is either increasing or decreasing. More high school students believe the presence of alcohol on their campus is increasing than do their teachers, campus professionals, and administrators (pp. 3-4).
7. A number of program components were not implemented as planned, including the Student Assistance Program, Office of Student Intervention Services, and Education for Self-Responsibility II (pp. 18, 31-32, 37).

Budget Implications:
Mandate: External funding agency, the Drug-Free Schools and Communities Act of 1986 (Public Laws 99-570, 100-297, 101-226, and 101-647).
Funding Amount: 1991-92 Allocation: $464,924
Funding Source: Federal
Implications: Funding of this program has contributed to increasing achievement scores and lowering dropout rates and retention rates of students in the program. Continued funding will assure that more students participate in and benefit from its positive effects. Continued funding and evaluation of results are imperative if AISD is to achieve Goal 6 of the AMERICA 2000 action plan (that by the year 2000, every school in America will be free of drugs), as well as AISD's first strategic objective (that every student will function at his/her optimal level of achievement and will progress successfully through the system). According to PL 99-570, no local education agency shall be eligible to receive funds or any other form of financial assistance under any federal program unless it certifies to its state agency that it has adopted and implemented a program to prevent the use of illicit drugs and alcohol by students and employees.

MegaSkills
1991-92 Allocation: $40,650

Students whose parents participated in the MegaSkills workshops had higher test scores than the national averages, as well as higher attendance and lower discipline and retention rates than other elementary students districtwide. These students also showed improvement in these areas since the 1990-91 school year. Nearly all the parents reported that they would recommend the workshops to others, and nearly all the principals believe it is important to continue providing the workshops.

The MegaSkills program, created by Dr. Dorothy Rich, founder and president of the Home and School Institute, offered parenting skills workshops to parents at 52 District schools. The series of five to eight workshops focuses on such skills as confidence, motivation, effort, responsibility, initiative, perseverance, caring, common sense, teamwork, and problem solving. Each workshop consists of information-sharing, large and small group discussions, and demonstrations of hands-on activities (called "recipes") which can be repeated at home with children. Two MegaSkills facilitators were hired: one from AISD, who was paid from the DFSC grant, and one from the A+ Coalition, paid for by IBM. Additional workshops were offered at five businesses and three neighboring school districts, but the results of this report include only students from AISD schools.

Eight area businesses contributed more than $13,000 in cash, services, or facilities to the MegaSkills project: Advanced Micro Devices, DuRite Duplication, HEB Grocery, IBM, Markborough Texas/Harris Branch, Southwestern Bell Telephone, 3M, and the Southwest Area Council of the Greater Austin Chamber of Commerce. Additional funding in the amount of $21,980 was also provided by the Chapter 1 grant.

The AISD MegaSkills facilitator sent letters to all elementary campuses describing the program and requesting signed letters of intent and leadership nominations from those campuses interested in providing the workshops. Upon completion of 10 hours of training, leaders received certification from the Home and School Institute to become workshop leaders. In 1991-92, a total of 214 District staff, campus staff, and parents received training as workshop leaders. An additional 46 leaders from 1990-91 continued to lead workshops. The schools advertised the workshops to parents through fliers, PTA or school newsletters, AISD cable channel announcements, and advertisements in the city paper.

What information about drug use prevention did the program provide? The MegaSkills facilitator and the Drug-Free Schools project facilitator collaborated on an effort to expand the scope of the workshops to include more information about drug and alcohol use, prevention, and detection. During the course of the year the project facilitator left his position, but the curriculum plan is expected to be in place for the 1992-93 school year.

Evaluation
A number of methods were used to evaluate the MegaSkills programs, including surveys of parents and school staff, and student success measures such as achievement, attendance, discipline, and retention rates. At each workshop, parents were asked to fill in the names of their children on the sign-in form so that ORE could create a database to assess the aforementioned measures of success for the students whose parents were involved in the program. Unfortunately, because all leaders were not firm in insisting that parents fill out the form, many parents neglected to provide their children's names.
Therefore, the database did not contain a complete record of students potentially served by this program. The following results are based on those students included in the database.

MegaSkills Student Characteristics
Of the 1,196 elementary students included in the analysis: 5% were in pre-K, 25% were in kindergarten, 15% were in grade 1, 14% were in grade 2, 12% were in grade 3, 12% were in grade 4, 11% were in grade 5, and 4% were in grade 6 (see Figure 18); 15% were African American, 35% were Hispanic, and 50% were Other; 11% were limited English proficient (LEP); 46% were low income; 30% of the students were identified as at risk; 13% of the students were identified as gifted/talented; and 10% were overage for their grade.

[FIGURE 18: GRADE LEVEL OF MEGASKILLS STUDENTS, 1991-92 (pie chart of the grade-level percentages above)]

The GENESYS program examined achievement, attendance, discipline, and retention rates for the group of students in the ORE database. Figure 19 compares MegaSkills students' 1991-92 attendance, discipline, and retention rates with their 1990-91 rates and with the 1991-92 rates of elementary students districtwide.

Achievement
MegaSkills students' achievement was analyzed in three ways: program students' scores on two standardized tests were compared to national averages, to predicted scores, and to District averages. In a comparison of 1992 ITBS/NAPT achievement scores to 1991 national norms, the MegaSkills students' scores were above the national average in reading in five of six comparisons, and above the national average in mathematics in all six comparisons.

The 1992 ITBS/NAPT scores for these students were also examined using ORE's Report on Program Effectiveness (ROPE). ROPE predicts achievement scores for the group of students who have both 1991 ITBS/TAP scores and 1992 ITBS/NAPT scores. These predictions are then compared to the students' actual scores. The difference between these two scores is called the ROPE residual score, which is based on a grade-equivalent score scale. If students' ROPE residual scores are far enough above or below zero to achieve statistical significance, they are said to have either "exceeded predicted gain" or to be "below predicted gain." Nonsignificant residual scores are classified as "achieved predicted gain." (A sketch of this residual logic appears after Figure 19 below.) MegaSkills students' scores exceeded predicted levels in two comparisons, achieved predicted levels in 11, and were below predicted levels in none.

The Texas Assessment of Academic Skills (TAAS) scores of program students in grades 3 and 5 were also compared to District averages. The percentage of MegaSkills students who mastered the TAAS was higher in eight comparisons, the same in seven, and lower in none.

Attendance
Compared with the attendance rates for elementary students districtwide, the rate for the MegaSkills students was higher in both the fall 1991 and the spring 1992 semesters. When the attendance rates are compared to these same students' rates during the 1990-91 school year, attendance increased from the spring of 1991 to the fall of 1991, and then dropped slightly in the spring of 1992. A decline in attendance between the fall and spring semesters is common districtwide at all grade levels.

Discipline
The rate of discipline incidents for MegaSkills students in 1991-92 was lower than that of elementary students districtwide, and lower than these same students' rate during the 1990-91 school year.

Retention
Compared with the percentage of all AISD elementary students recommended for retention for the 1992-93 school year, the percentage of MegaSkills students recommended for retention was lower.

FIGURE 19: PROGRESS INDICATORS FOR MEGASKILLS STUDENTS AND OTHER ELEMENTARY STUDENTS IN AISD, 1991-92

Indicator         Semester      MegaSkills 1991-92   MegaSkills 1990-91   AISD Elementary 1991-92
Attendance Rate   Fall 1991     97.0%                97.3%                96.5%
Attendance Rate   Spring 1992   96.6%                96.3%                96.0%
Discipline Rate   Fall 1991     0.0%                 0.3%                 0.1%
Discipline Rate   Spring 1992   0.0%                 0.3%                 0.2%
Retention Rate    Spring 1992   0.3%                 NA                   0.4%
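Stated as a computation, the ROPE classification described above reduces to: subtract each student's predicted 1992 score from the actual score, then test whether the group's mean residual differs from zero. The sketch below is a minimal illustration of that logic only; the one-sample t-test and the function name are assumptions for illustration, since the report does not state ORE's exact statistical procedure.

```python
# Minimal sketch of the ROPE residual-score classification: residual =
# actual 1992 score minus predicted 1992 score, on a grade-equivalent
# scale. The one-sample t-test is an assumption for illustration; the
# report does not state ORE's exact procedure.

from scipy import stats

def classify_rope(actual_1992, predicted_1992, alpha=0.05):
    """Classify a group as exceeded / achieved / below predicted gain."""
    residuals = [a - p for a, p in zip(actual_1992, predicted_1992)]
    t_stat, p_value = stats.ttest_1samp(residuals, 0.0)
    if p_value >= alpha:  # mean residual not significantly different from zero
        return "achieved predicted gain"
    return "exceeded predicted gain" if t_stat > 0 else "below predicted gain"

# Example: a group whose actual grade-equivalent scores run above prediction.
print(classify_rope([4.9, 5.2, 5.0, 5.4], [4.5, 4.8, 4.7, 4.9]))
```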
Parent Opinion
At each workshop parents were asked to complete a sign-in sheet and a session feedback form. The sign-in sheet functioned as both an attendance record and a student roster. Because the leaders did not insist that the forms be filled out, the attendance record was not accurate. A total of 1,666 parents from more than 30 different schools completed feedback sheets evaluating the workshops:
- Nearly all (90%) said they gained new information during the workshop (N = 1,646);
- Nearly all (96%) would recommend MegaSkills workshops to others (N = 1,651);
- The vast majority (80%) said the workshops helped them increase their understanding of their role in their children's education (N = 1,372); and
- Almost half (49%) reported that since attending the workshops, they have increased their involvement at their children's school (N = 1,339).

Parents' views were split in these areas:
- A third (33%) agreed that the lessons helped them teach their children about the dangers of drugs and alcohol; about one fourth (26%) selected a neutral response, and over one third (37%) selected "not applicable" (N = 1,315);
- Nearly a third (31%) said their children's grades have improved since using these recipes; another third (35%) selected a neutral response, and another third (33%) selected "not applicable" (N = 1,324); and
- A third (34%) reported that the recipes had a positive impact on their children's attendance in school; less than one third (31%) selected a neutral response, another third (33%) selected "not applicable," one percent disagreed, and one percent selected more than one response (N = 1,325).

Parents reported that they received new information and would recommend the workshops to others. In the 1992-93 school year, MegaSkills funding will be provided by both Drug-Free Schools and Chapter 2. Since the DFSC grant will continue to fund a large portion of the program, more emphasis should be placed on helping parents teach their children about the dangers of drugs and alcohol, as well as helping them identify behaviors that indicate possible drug and alcohol use.

Principal Opinion
A total of 37 principals returned a questionnaire at the end of the year assessing the program at their school. The results indicate that of the children whose parents participated in the workshops, most principals reported:
- Improved or much improved academic work (69%; N = 36),
- Better or much better attitudes (74%; N = 34), and
- Fewer or much fewer behavioral problems (74%; N = 35).

See Figures 20, 21, and 22 for a breakdown of all responses to these questions.

[FIGURE 20: PRINCIPALS' ASSESSMENT OF MEGASKILLS STUDENTS' ACADEMIC WORK (pie chart)]
[FIGURE 21: PRINCIPALS' ASSESSMENT OF MEGASKILLS STUDENTS' ATTITUDE (pie chart)]
[FIGURE 22: PRINCIPALS' ASSESSMENT OF MEGASKILLS STUDENTS' BEHAVIORAL PROBLEMS (pie chart)]

Most principals also agreed or strongly agreed that:
- It is important to continue offering MegaSkills at their school (91%; N = 37);
- The training increased participating parents' involvement in their children's education (86%; N = 36);
- Most of the participating parents improved or increased their communication with their children's teachers (67%; N = 36);
- Participating parents seemed more relaxed in discussing their children, education, and the school (76%; N = 34); and
- They had seen a noticeable difference in the behaviors and attitudes of the students whose parents participated in the training (71%; N = 34).

The DFSC cost per student was $33.99 ($40,650 / 1,196).

LITTLE ROCK SCHOOL DISTRICT
INSTRUCTIONAL RESOURCE CENTER
3001 PULASKI STREET
LITTLE ROCK, AR 72206

February 1, 2000
TO: Division of Instruction
FROM: Dr. Bonnie Lesley, Associate Superintendent for Instruction
SUBJECT: Division Meeting, Wednesday, February 2

Let's use our meeting this month to "take stock" of where we are on our Work Plan for 1999-2000. Please bring your copy. People giving reports need to be brief and talk fast.

1. 2000-2001 Curriculum Catalog - Bonnie Lesley
2. 2000-2001 Proposed Calendar - Bonnie Lesley
3. ESL Update - Karen Broadnax
4. Middle School Publication and Plans - Linda Austin
5. NSF Update - Vanessa Cleaver and Dennis Glasgow
6. Personalized Education Plans - Gary Smith
7. Talent Development Plan - Bonnie Lesley
8. Instructional Standards Update - Mable Donaldson
9. Cultural Diversity and Prejudice Reduction Training - Marion Woods
10. Elementary Literacy Update - Pat Price
11. Curriculum Mapping - Mona Briggs and Eddie McCoy
12. Assessment Plan - Kathy Lease
13. Collaborative Action Team - Debbie Milam and Marion Baldwin
14. Parent-School Compacts - Pat Price and Leon Adams

WHEW!! Some planning we need to do:
1. Schedule of summer training and notification to teachers.
2. What to do about thematic instruction?

BAL/adg

LESLEY, BONNIE
From: LESLEY, BONNIE
Sent: Friday, February 04, 2000 4:05 PM
To: JOHNSON, VIRGINIA; MILAM, DEBBIE
Cc: LEASE, KATHY R.
Subject: RE: partnership evaluation

Thanks for following up, Debbie. I think you are going to be impressed with the work that Virginia has done. I am.

-----Original Message-----
From: JOHNSON, VIRGINIA
Sent: Friday, February 04, 2000 10:22 AM
To: MILAM, DEBBIE
Cc: LESLEY, BONNIE; LEASE, KATHY R.
Subject: RE: partnership evaluation

How about Wednesday 2/9 in the morning or Friday 2/11 in the afternoon?

-----Original Message-----
From: MILAM, DEBBIE
Sent: Thursday, February 03, 2000 3:59 PM
To: JOHNSON, VIRGINIA
Subject: RE: partnership evaluation

Okay. Let me know what is good for you.
Debbie Milam, Volunteers in Public Schools

-----Original Message-----
From: JOHNSON, VIRGINIA
Sent: Thursday, February 03, 2000 2:56 PM
To: MILAM, DEBBIE
Subject: RE: partnership evaluation

I am finalizing it now, in between our orientation sessions on the upcoming NWEA level testing. Let's get together some time next week and you can give me some input on the final draft. There are two sections, per Julio's request: Community Engagement, and Resources.

-----Original Message-----
From: MILAM, DEBBIE
Sent: Thursday, February 03, 2000 1:36 PM
To: JOHNSON, VIRGINIA
Subject: partnership evaluation

Virginia, Dr. Lesley said that you put together a great evaluation piece for the NSF partners. She thought it might be useful for our community programs and suggested I get with you sometime. Let me know when you have time to show it to me.
Debbie Milam, Volunteers in Public Schools

LITTLE ROCK SCHOOL DISTRICT
INSTRUCTIONAL RESOURCE CENTER
3001 PULASKI STREET
LITTLE ROCK, AR 72206
(501) 324-2131

March 14, 2000

TO: Everyone
FROM: Dr. Bonnie Lesley, Associate Superintendent for Instruction
SUBJECT: The Value of Assessments and Data Analysis

Please read carefully the attached article about the results of a school that uses the NWEA Achievement Level Tests that you have just administered. You may also wish to share this information with your staff.

The new assessments created tons of extra work for everyone involved, but if they help us improve and align teaching and learning, they are well worth the effort! Carrie Martin Elementary's story may help you ensure improved results at your school!

Attachment
BAL/rcm
Data-Driven Success
How one elementary school mined assessment data to improve instruction
By Keith Liddle

When fourth-graders at Carrie Martin Elementary School made the second highest gains on the 1998 Colorado State Assessment in reading and writing, state officials wondered how we could have achieved so much so quickly. A few privately joked that we must have cheated, but one look at the data showed our changes were serious and real. Eighty percent of Carrie Martin students passed the state reading test, and 65 percent passed the writing test. This compared to 65 percent passing the reading test and 33 percent passing the writing test the year before. And it compares to statewide averages of 60 percent and 30 percent, respectively.

Our success is all the more remarkable because more than 25 percent of our students qualify for the free and reduced-price lunch program, and 22 percent have special education individualized education plans. (In Colorado, special education students are required to attempt the state tests. If the tests are too difficult for a student, a zero is averaged into the school's score for that student.)

How did we raise our scores so dramatically? We used our assessment program to measure everything that affected student performance. Then we changed or cut anything that didn't improve achievement.

Data-driven instruction

Carrie Martin is one of 18 elementary schools in the Thompson School District, a primarily rural district of 14,325 students in northern Colorado. Frustrated with the limited information that standardized tests gave us, district officials began using Northwest Evaluation Association (NWEA) achievement-level tests 10 years ago, but we didn't get serious about data-driven instruction until four years ago. That's when we started relying on pre-assessment and state content standards to identify student needs and learning styles, then using that information to plan and implement teaching strategies appropriate for each child.

Pivotal to our assessment program are the NWEA achievement-level tests, which have been custom-designed to align with our curriculum and to predict how students will do on the state tests. We used the data from the NWEA tests to measure student progress and the effects of changes in the curriculum. The data also allowed us to predict performance on the state tests, to encourage students to do better, and to point out specific areas where they need to work harder.

Even before we began using NWEA tests, we realized we had been focusing too much on middle or average students. If we were going to challenge all of our students appropriately, we needed to raise our benchmarks and stop teaching to the middle. We now try to teach each child at his or her own achievement level. To measure how we're doing, we test children at their achievement level, which isn't necessarily their grade level. An advanced fourth-grader might take achievement tests at the sixth-grade level, while a classmate might be tested at the third-grade level.

NWEA helped us set up this system by sending representatives to meet with a group of teachers from our district. Together, they drew from NWEA's bank of 15,000 field-tested items to develop math and reading tests that aligned with our curriculum. NWEA helped us develop short assessments, called locator tests or placement tests, to determine at what level each student should be tested. Charting individual students' growth on achievement-level tests allows us to focus on each student's needs and progress.
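The locator test's role, as described above, is simply to route each student to the test level that matches current performance rather than grade placement. A minimal sketch of that routing follows; the cut scores, scale, and function name are invented for illustration and are not NWEA's.

```python
# Hypothetical locator-test routing: a short placement score picks the
# achievement-level form to administer, independent of grade level.
# Cut scores are invented for illustration; NWEA's are not given here.

LEVEL_CUTS = [   # (minimum locator score, test level to administer)
    (0, 3),      # below 10 -> third-grade-level form
    (10, 4),
    (16, 5),
    (22, 6),     # 22 and up -> sixth-grade-level form
]

def assign_test_level(locator_score):
    level = LEVEL_CUTS[0][1]
    for cut, test_level in LEVEL_CUTS:
        if locator_score >= cut:
            level = test_level
    return level

# A fourth-grader scoring 18 would be tested with the fifth-grade form.
print(assign_test_level(18))  # -> 5
```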
Most students take pencil-and-paper tests in the fall and spring. At-risk students, including those who score below the benchmark in the fall, also take a computerized version of the tests as a mid-year assessment. Most students show progress after the fall test, and they can't wait to tell their parents and teachers about their success. The mid-year test provides the positive feedback these kids need, and most are never at risk again.

The achievement-level tests also help us challenge our more advanced students. For example, when last year's fifth-grade students broke the school's previous 10-year record on NWEA math scores, we told them, "We think you can do better." We raised the bar as high as we could, challenging some students to take tests at the highest level. Our students rose to the challenge: Twenty scored above the eighth-grade benchmarks, 17 scored above the seventh-grade benchmarks, and the majority of our special education students scored at or above the fifth- and sixth-grade benchmarks.

Crunching the numbers

NWEA provides ongoing help with test administration, scoring, and data interpretation, which helps us use the test data appropriately to improve learning. Data are collected and analyzed for the whole district, for specific schools, for different grade levels, and for each student. Detailed test data for each student, showing the student's test scores and how they compare to what's expected, are used during parent-teacher conferences. For every single student, we set goals that include what the parents will do, what the student will do, and what the teacher will do. If a student is below the benchmark, the student, the parent, and the teacher develop a personal education plan. Together, they review state assessment scores, achievement-level test scores, classroom activities, and a variety of other factors. Then they decide on a plan that might include tutoring, summer school, or other actions to help the student succeed.

Data from the achievement-level tests also enable us to provide accountability information to our broader school community. Every year, we compile a school profile and an annual report for the district, the state, and our accountability committee. This report includes an action plan, an outline of our goals, and a report on our measured growth. Among other things, it also includes graphs of our test scores, along with breakouts of the data, such as how girls scored versus boys. Members of the accountability committee, which includes staff, parents, student council members, and a community representative, use these results to evaluate what's working and to recommend changes.

The committee also uses the information to develop surveys that are sent to parents and teachers for more input. We analyze the information gathered from these surveys to make changes at the classroom level. Based on our survey results, teachers determined what fifth-graders needed to exit our elementary school. We worked backward, so each grade level was a stepping stone to the exit requirements. For example, we changed our spelling practices to improve daily writing. Each grade level was given about 20 words that are considered no-excuse words. The weekly spelling test was no longer the only criterion for the spelling grade: if a no-excuse word was misspelled in a writing assignment, the student's spelling grade could slip from an A to a C or lower. The no-excuse words are cumulative, so students must be able to spell words required in previous grades, as well as their own. Once students understood the importance of the no-excuse words, most learned them well. (A small sketch of this cumulative rule follows.)
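Mechanically, the no-excuse rule is a cumulative lookup: any writing assignment is checked against the word lists for the student's grade and every earlier grade. A small sketch under those assumptions; the word lists, the grading penalty, and the function name are invented, not Carrie Martin's actual lists.

```python
# Illustrative check of the cumulative "no-excuse" spelling rule: words
# from the student's grade and all earlier grades are in scope. Word
# lists and names are invented for illustration.

NO_EXCUSE_WORDS = {
    3: {"their", "there", "because"},
    4: {"friend", "whether"},
    5: {"necessary", "receive"},
}

def no_excuse_errors(intended_words_misspelled, grade):
    """Given the (correctly spelled) words a student misspelled in an
    assignment, return those covered by the cumulative no-excuse lists."""
    in_scope = set()
    for g, words in NO_EXCUSE_WORDS.items():
        if g <= grade:
            in_scope |= words
    return [w for w in intended_words_misspelled if w.lower() in in_scope]

# A fifth-grader is still accountable for "their" from the grade 3 list.
print(no_excuse_errors(["their", "escapade"], grade=5))  # -> ['their']
```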
if a no-excuse word was misspelled in a writing assignment, the student's spelling grade could slip from an A to a C or lower. The no-excuse words are cumulative, so students must be able to spell words required in previous grades, as well as their own. Once students understood the importance of the no-excuse words, most learned them well.

Making time for tests

No change comes without problems. Initially, some teachers thought the achievement-level tests were just another assessment, and a big waste of time. Many said, "We're not going to have time to teach if we have to administer all these assessments." To address these concerns, the district's assessment director and NWEA representatives explained how achievement-level tests are different: how they would show students' progress over time. We would be able to see whether our students progressed as much in grade five, for instance, as they did in grades three or four. Hearing this caused some additional anxiety among teachers, who feared they would receive poor evaluations if their students didn't progress.

COMPUTERIZED TESTING

By Allan Olson

Assessment experts are just beginning to tap the potential for achievement-level testing. The next step is to leave paper and pencil behind and move on to computerized adaptive tests that measure each individual student's achievement in less time and with more reliability than anything we've seen so far. The Northwest Evaluation Association (NWEA), a nonprofit assessment organization that serves more than 300 member school districts around the country, is in the final stages of developing an Internet-enabled assessment system that adapts questions to the performance of each student. When a student answers questions correctly, the questions become more and more difficult; incorrect answers lead to easier questions. The idea is to help students avoid the frustration caused by too-difficult questions or the boredom resulting from questions that are too easy. These tests can be shorter, and take less class time, while still providing a highly reliable estimate of each student's achievement level. Research shows that scores from an adaptive test are as valid as those from a traditional test of twice the length.

As with NWEA's paper-and-pencil achievement-level tests, the computerized tests can be customized according to a school district's curriculum and state standards. Each test draws from a large, calibrated pool of questions that vary according to each student's answers. No test items will be repeated for a student who takes the test more than once. These adaptive tests can be designed for both PC-based and Mac-based networks, which enables schools to give tests to whole classes of students and transmit results for scoring and analysis. Typically, these computerized adaptive tests cost less to administer than conventional standardized tests and eliminate the cost of test booklets and materials handling. Because test administrators can connect to a testing service and download appropriate testing information for each student as needed, tests can be kept secure. The new computerized system, now being tested in five school districts, will soon be available nationwide. For more information, check out NWEA's web site at http://www.nwea.org.

Allan Olson (allon@nwea.org) is executive director of the Northwest Evaluation Association, a nonprofit assessment organization in Portland, Ore., that serves more than 300 school districts nationwide.
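The adaptive mechanism the sidebar describes can be pictured as a simple staircase procedure. The toy version below is illustrative only; it is not NWEA's algorithm, and the starting difficulty, step size, item count, and scoring rule are all invented.

# Toy sketch of an adaptive test: a harder item follows a correct answer,
# an easier item follows a miss. Not NWEA's algorithm -- difficulties,
# step size, and the final estimate are invented for illustration.
import random

def run_adaptive_test(answer_correctly, n_items=15, start=200, step=10):
    """answer_correctly(difficulty) -> bool simulates one response."""
    difficulty = start
    history = []
    for _ in range(n_items):
        correct = answer_correctly(difficulty)
        history.append((difficulty, correct))
        difficulty += step if correct else -step
    # Crude achievement estimate: mean difficulty of the items seen.
    return sum(d for d, _ in history) / len(history)

# Simulated student whose true level is about 230: answers are more
# likely correct when the item is at or below that level.
student = lambda d: random.random() < (0.9 if d <= 230 else 0.3)
print(round(run_adaptive_test(student)))

In a production system like the one the sidebar describes, item selection and scoring would be driven by a large calibrated item pool rather than a fixed step.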
However, as teachers implemented the tests and saw for themselves how information from the tests could be used to improve student learning, the teachers became less fearful. Still, there's no denying the fact that the achievement-level tests, two one-hour sessions in the fall and another two in the spring, do take some time away from other activities. To make time for the tests (and academics), we have made some sacrifices. For example, we rarely schedule schoolwide assemblies or activities that pull students out of the regular classroom and away from core curriculum. Instead, we focus on what we need to teach to meet our goals. It turns out that this has not been much of a hardship. Our surveys show that students, parents, and teachers all want to stay focused on academics.

We decided to give up something else last year: the Iowa Test of Basic Skills. Initially, the school board thought we needed the Iowa test to measure how schools were doing. But as board members saw how rich the achievement-level test data could be, they realized that we didn't need the Iowa test and that the time would be better spent working on areas identified by the NWEA tests.
Teachers weren't the only ones who were nervous about the tests. Some parents and students also worried, especially when they realized that the district tied these test results to high school graduation requirements. We addressed this concern by educating parents about the tests and the data when their children enter third grade. Most parents are amazed to see that we can predict as early as third grade whether their child is on track for high school graduation. While this idea scares some parents at first, it also prompts them to help their child grow academically.

We try to limit anxiety during testing times. We reassure the students that there's no time limit; we just want to see how much they've grown since the last test. Students who have had experience with these achievement-level tests typically look forward to each testing time. They want to be able to prove what they've learned and what they know. Overall, the test administration is not as grueling for a child as other tests can be. A student can usually complete a test in 45 to 60 minutes. Because reading and math tests are given on separate days, testing takes two days. Contrast this to the six days of state testing that leave students mentally and emotionally drained.

Beyond testing

Of course, no single test can guarantee success. But we've used the achievement-level test results to work with our entire school community (students, parents, classroom teachers, district administrators, and others) to measure the effects of different strategies. By measuring before-and-after test results, for instance, we found that a strict discipline program, coupled with incentives, led to higher student achievement. Students who come to class ill-prepared, for example, or who talk without raising their hands or don't stay on task, get a check. Students who have fewer than three checks each quarter are rewarded through recognition, additional recess time, and other bonuses.

Test scores also improved after we increased homework for all students, even for kindergartners. Every night, our students are expected to write a paragraph and read for 30 minutes. Homework also includes activities such as going to the grocery store to estimate how much selected items will cost and to compare that estimate to the total. As these activities increased parent involvement, parents have requested guidance in monitoring their children's efforts. So we developed a system in which teachers send home a weekly sheet that tells what each child is doing, gives a status on assignments, and notes any problems, discipline or otherwise. Parents don't have to wait until the end of the quarter to know how their child is doing.

Achievement-level testing allows us to measure the success of every initiative. These tests keep us on track and allow us to create higher standards for our students. And, we've found, when you have higher standards, students rise to meet them.

Keith Liddle (liddlek@thompson.k12.co.us) is principal of the Carrie Martin Elementary School in Loveland, Colo.

Editor's Note: For a discussion of the technology of data mining, see "Smart Data: Mining the School District Data Warehouse," Electronic School, September 1999.

Compliance Report Information Section 2.1.1

We have met with Steve Ross on the following dates since March 15, 2000:

May 5, 2000 - Leon Adams and PRE staff to discuss Title I student achievement issues
June 23, 2000 - Ed, Virginia, and Steve; information on schools
August 4, 2000 - Phone conference with Steve Ross re: Program Evaluation
August 25, 2000 - Steve Ross, Program Evaluation
August 31, 2000 - Steve Ross, Title I/Program Evaluation
September 1, 2000 - Conference call with Steve Ross and Dr. Carnine re: $20 million ADE loan
October 18, 2000 - Steve Ross, Program Evaluation
October 20, 2000 - Conference call with Steve Ross re: Program Evaluation
November 2, 2000 - Steve Ross, Program Evaluation
November 17, 2000 - Steve Ross, Kathy Lease, and Compliance Committee
November 30, 2000 - Steve Ross, Program Evaluation
December 1, 2000 - Steve Ross, Program Evaluation
December 15, 2000 - Steve Ross and Compliance Committee; conference call with Kathy Lease because of ice storm

Program Evaluation Agenda

PreK-3 Literacy Plan

Data Collected:
1. Fall 99 and Spring 00 Observation Survey and Developmental Reading Assessment (OS/DRA)
2. Fall 00 Observation Survey and Developmental Reading Assessment (OS/DRA)
3. 99-00 Climate Survey of parents and teachers
4. Promotion rate
5. Attendance
6. Percent of students eligible for free and reduced-cost meals
7. Demographic data: race, gender
8. Special populations: special needs students, limited English proficient

Data In Process:
1. Spring 01 Observation Survey and Developmental Reading Assessment
2. Reviewing impact of PreK-3 Literacy Plan using growth data from the Achievement Level Test (ALT)
3. Impact of summer school on achievement using a comparison of Spring 99 and Fall 00 ALT scores

Future Data Collection:
1. Longitudinal study of impact of PreK-3 Literacy Plan using 4th grade Benchmark scores
2. Longitudinal study of impact of PreK programs on student achievement using OS/DRA, Benchmark, and ALT data
3. Impact of Extended Year schools on achievement using ALT, OS/DRA, and Benchmark scores

National Science Foundation Project Components (K-12)

Data Collected:
1. Attendance
2. Percent of students eligible for free and reduced-cost meals
3. Demographic data: race, gender
4. Special populations: special needs students, limited English proficient
5. Promotion rate
6. Teacher professional development
7. Teacher certification
8. Climate survey (teacher, parent, student, administrator)
9. Middle School Survey: math and science items (teacher and student perceptions)
10. Seventh grade SEPUP survey (Fall 00)
11. Teacher survey, grades 2-8, at end of each math and science module
12. SAT-9 (math and science reasoning), grades 5, 7, 10
13. Math Benchmark exams (grades 4 and 8)
14. End-of-math-module CRT (grades 3-8)
15. End-of-science-unit CRT (grades 3-8)
16. Math ALT (grades 2-8)
17. Science ALT (grades 3-8)
18. Algebra I ALT (grades 7-11)
19. Algebra II ALT (grades 9-11)
20. Geometry ALT (grades 9-11)
21. Biology ALT (grades 9-11)
22. Physics ALT (grades 10-11)
23. Chemistry ALT (grades 9-11)
24. Advanced Placement tests
25. Explore (grade 8)
26. Plan (grade 10)
27. ACT (grade 11)
28. Math course completion and final grades (Algebra I and II, Geometry, Concept Geometry, Trigonometry, Calculus, Statistics)
29. Science course completion and final grades (Biology, Physics, Chemistry)
30. Impact of SMART summer program using pre- and post-test scores
31. Impact of After School Science Clubs using

Data In Process:
1. Annual updates for SY 2001-2002 (attendance, demographics, special populations, promotion, free and reduced meals, teacher professional development and certification)
2. Identifying trends in math achievement utilizing SAT-9, ALT, Benchmark, CRT, Explore, Plan, ACT, and Advanced Placement test scores
3. Identifying trends in science achievement utilizing SAT-9, ALT, CRT, Explore, Plan, ACT, and Advanced Placement test scores
4. Identifying outcomes of SMART using fall and spring ALT scores
5. Identifying outcomes of After School Science Club utilizing attendance rosters and student survey
6. Identifying outcomes of professional development utilizing ALT and end-of-unit math and science CRT scores
7. Identifying outcomes of professional development utilizing teacher survey data from end-of-unit math and science CRTs
8. Identifying teacher/student perceptions of newly implemented science curriculum using middle school survey data

Future Data Collection:
1. Longitudinal study of trends in math achievement by race and gender utilizing SAT-9, ALT, CRT, Benchmark, Explore, Plan, ACT, and Advanced Placement scores
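Several of the "Data In Process" analyses above are simple before-and-after comparisons. A hypothetical sketch of the summer school comparison (Spring 99 versus Fall 00 ALT scores) might look like the following; the column names and scores are invented, and the district's actual analysis was carried out by PRE on real student records.

# Hypothetical sketch of the Spring 99 -> Fall 00 ALT comparison named in
# the agenda above. Data and column names are invented.
import pandas as pd

alt = pd.DataFrame({
    "student":       [1, 2, 3, 4, 5, 6],
    "summer_school": [True, True, True, False, False, False],
    "spring99_alt":  [182, 175, 190, 201, 188, 195],
    "fall00_alt":    [194, 186, 197, 205, 190, 199],
})
alt["change"] = alt["fall00_alt"] - alt["spring99_alt"]

# Mean score change for summer school attendees vs. non-attendees.
print(alt.groupby("summer_school")["change"].agg(["mean", "count"]))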
8

From: LESLEY, BONNIE
Sent: Friday, March 24, 2000 1:16 PM
To: LEASE, KATHY R.
Subject: RE: Steve Ross

I would love to join you, but I am in another meeting in a few minutes on a grant proposal. I wish I had known.

-----Original Message-----
From: LEASE, KATHY R.
Sent: Friday, March 24, 2000 11:14 AM
To: LESLEY, BONNIE
Subject: Steve Ross

Bonnie, Steve Ross is in town for a meeting that his wife is attending. He is coming by about 1:00 to visit with me about the Middle School evaluation. Could you join us? If so, I'll let you know when he gets here. Thanks, Kathy

Kathy Lease, Ed.D.
Assistant Superintendent
Planning, Research, and Evaluation
3001 S. Pulaski
Little Rock, AR 72206
501-324-2122 (VM)
501-324-2126 (Fax)
krlease@irc.lrsd.k12.ar.us

9

From: LESLEY, BONNIE
Sent: Tuesday, May 23, 2000 4:55 PM
To: LEASE, KATHY R.
Subject: Middle School Survey

I do not feel that the questions on the survey forms give us much information about the middle school transition issues (which, I thought, was the reason for the survey) to use in the middle school evaluation. I suggest some of the following be added or used instead of those on the general sheet.

1. I want to know if kids like the way that math (and English and science) are taught this year, as compared to last year.
2. I want to know if kids felt adequately challenged by the instruction they received. Was it too difficult? Too easy? Interesting and engaging?
3. I want to know if they prefer hands-on, group activities or for the teacher to direct the class through lecture and recitation.
4. I want to know if they had adequate amounts of meaningful homework. Was it challenging and interesting?
5. I want to know if they feel that their teachers care about them.
6. Do they like working with a team of teachers?
7. Have they had opportunities to participate in intramural activities or sports?
8. Have they had opportunities to participate in clubs?
9. Is the number of periods in the day about right?
10. Do they feel they are being well prepared for the next grade level? For high school?
11. Do they like their elective classes?
12. If they need extra help, do they get it?
13. Is time used wisely? Too many free periods? Field trips? Videos?
14. Are kids well behaved in the school?
15. Are the principal and assistant principals visible to the students in the halls? Cafeteria?

In other words, is the middle school restructuring working as planned? The questions should be about curriculum, but also about the other components of the middle school plan. Did anyone consult with Linda Austin about these surveys? She's the expert and has sample forms for surveys. I am very worried now that at this late date these surveys cannot be done before the students depart for the summer. I thought they had already been done and the forms had been collected. I only got these drafts to review today. Please keep me informed about where we are.

10

From: LESLEY, BONNIE
Sent: Monday, June 12, 2000 3:30 PM
To: CARNINE, LESLIE V.; LACEY, MARIAN G.; MITCHELL, SADIE
Cc: AUSTIN, LINDA
Subject: RE: Middle School Evaluation Implementation and Evaluation?

Kathy Lease is supposed to present the program evaluation in July. I tried to schedule a meeting with her and Ed last week to see where they are on this, but Kathy left town, and we didn't have it. I'll catch her later this week when she returns.

-----Original Message-----
From: CARNINE, LESLIE V.
Sent: Monday, June 12, 2000 3:22 PM
To: LESLEY, BONNIE; LACEY, MARIAN G.; MITCHELL, SADIE
Cc: AUSTIN, LINDA
Subject: Middle School Evaluation Implementation and Evaluation?

I know you heard the request for the plan and/or information on what we believe is working and what plans we have for an evaluation. I think we are still in good shape on this issue and Baker is not pressing. I also think we should keep this issue in "front of us."

11
From: LEASE, KATHY R.
Sent: Monday, July 31, 2000 9:03 PM
To: ADAMS, LEON; McCOY, EDDIE
Subject: FW: Design Notes

FYI - Here are the notes from Steve Ross. He wants to meet on August 25 (is that Friday?) at 1:00, if that's OK with us. Unless I hear from you, I'll tell him OK. KL

-----Original Message-----
From: Steve Ross [mailto:smross@memphis.edu]
Sent: Tuesday, July ??, 2000 4:5? PM
To: LEASE, KATHY R.
Subject: Design Notes

Hi Kathy, Good to see you last week. Attached are my notes. Please distribute to Ed and Virginia. Turns out that I won't be able to meet on 8/4. However, perhaps you could identify a time for a speaker phone call the following week, and we can determine status of the project. Let me know if there are any questions about the notes. Thanks!

Kathy, here are my notes on the research plan.

Basic design is program-matched control schools, with 9 SFA schools / 9 controls.

Leon Adams will provide qualitative confirmation of the initial matchings and history of implementation in grades within schools. Ed Williams will examine 1996 SAT data to ensure that matched pairs were equivalent at baseline.

Units of analysis: Students nested in schools nested in programs

Dependent measures: All subscales of all available test data

Pretest (covariate): 1996 SAT, Grades 2-5, Reading/Language

Posttests:
1997 SAT (Grades 2, 3*)
1998 SAT (Grade 3)
1999 SAT (Grade 5*)
2000 tests: Observation (K, 1, 2); ALT (2, 3, 4, 5*); Benchmark (4)
* Pretest score from 1996

Analyses:
1. Covariance/longitudinal: Treatment x Year (1997, 1999) on SAT for 1999 fifth graders with pretest
2. Covariance/by year: 1997 third graders with pretest; 1999 fifth graders with pretest
3. No covariate/grade by year: Treatment by Grade (2, 3) in 1997; Treatment (Grade 3) in 1998; Treatment (Grade 5) in 1999
4. Observation x Grade (K, 1, and 2) in 2000
5. ALT by Grade (2, 3, 4, 5) in 2000 (also, separate covariate analysis in Grade 5)
6. Benchmark by Grade (4) in 2000

Special analyses:
Repeat above, disaggregating for ethnicity, LEP, and no mobility (one-grade enrollment at school).
Factor in implementation scores provided by Memphis.

12

From: LEASE, KATHY R.
Sent: Sunday, August 06, 2000 10:04 PM
To: TRUETT, IRMA;
JOHNSON, VIRGINIA
WILLIAMS, ED
Subject: Memo for Board Agenda Meeting
Importance: High

Dear Folks, Attached is the memo I drafted for the Board agenda. Virginia, would you and Irma look it over and make any necessary changes? Irma, will you take it down to Bonnie, get her initials, then deliver it to Bev before noon on Tuesday? If you all have questions, let me know. I can make revisions tomorrow night, or you all can make the revisions; just let me know what your suggestions are. We can wait to turn it in until Tuesday, as long as we get it there by noon. Ed and Virginia, what Dr. Carnine said he wanted for the agenda and board meetings is just the executive summaries, the conclusions, and the recommendations. We need to keep the recommendations very general, such as "request that the curriculum division make additional recommendations on ..." or "study the possibilities of ..." I feel sure that Bonnie won't have enough time to react to all the reports to make specific curriculum and instruction recommendations. She will need to see the reports, but I want to see all final versions first. I can check my email every night. This will really be a tight schedule and may require some midnight oil, but we have to meet this deadline. The only extra time we might buy is to give them the reports in the Friday package, but I would really like to have them ready for Thursday night. Keep me posted!!! Hang in there everybody!! Kathy

Bd Rpt Aug 2000 prog eval.doc

13

From: LEASE, KATHY R.
Sent: Thursday, August 10, 2000 11:06 AM
To: CARNINE, LESLIE V.
Subject: RE: program evaluations

We don't have any national comparison data for ESL. Our consultant from Austin was supposed to send Ed some studies, but he hasn't done so. We'll do some web searches and see what we come up with. We'll also see if we can come up with some national sources on Middle School to add. Ed, Eddie and Virginia stayed late last night. I told them they had to bring their pj's and couldn't go home until these reports were finished. We are still getting corrections. I just found out that Dodd's DRA scores were incomplete. That means all of that data has to be recalculated: never-ending story. That omission impacts the Title I/PreK-3 evaluation and the ESL evaluation. We'll keep you updated on the latest versions. The time of day is printed on the bottom of the reports so you can discard accordingly. We've gotten no feedback from Bonnie, but we are routing the reports for final review to as many of the curriculum folks as are in the building (workshops going on today!). Karen Broadnax has already been in to add her changes. Later, KL

-----Original Message-----
From: CARNINE, LESLIE V.
Sent: Thursday, August 10, 2000 10:09 AM
To: LEASE, KATHY R.
Subject: RE: program evaluations

As you would guess, I had only a very brief look at the report, and my only reaction at this time is the language assumes we may be different. What I'm suggesting is that the Middle School report suggests the change from junior highs was because of.... If you look at national data, the concern about middle level education and middle level youngsters is very consistent. People feel much better about early childhood, elementary, and high school than they do about middle or junior high. Most researchers suggest this is because of the age and maturity issues plaguing these young people. Age of raging hormones...! Obviously your lead statements should be neutral rather than conclusive. I had a similar thought when you wrote about ESL, etc.
Are those youngsters really different from their counterparts nationally, or are we seeing lower achievement than the national data???? Based on what you indicated, the district program may be better than or worse than. Any idea which??? Obviously if I'm correct you may want to color the recommendations differently???

-----Original Message-----
From: LEASE, KATHY R.
Sent: Thursday, August 10, 2000 9:02 AM
To: CARNINE, LESLIE V.
Subject: program evaluations

Dr. Carnine, Here are the first two program evaluation reports. Bonnie is not in, so I wanted you to be able to glance at them to see if they pass muster. The other two are on their way after a few other corrections are made. Each report will have a cover page for the Board. Kathy

File: Executive Summary ESL 99-00.doc
File: Executive Summary Middle Level 99-00.doc

Kathy Lease, Ed.D.
Assistant Superintendent
Planning, Research, and Evaluation
3001 S. Pulaski
Little Rock, AR 72206
501-324-2122 (VM)
501-324-2126 (Fax)
krlease@irc.lrsd.k12.ar.us

14

Little Rock School District
810 W. Markham
Little Rock, AR 72201

August 24, 2000

TO: Board of Directors
FROM: Dr. Kathy Lease, Assistant Superintendent, PRE
THROUGH: Dr. Leslie Carnine, Superintendent of Schools; Dr. Bonnie Lesley, Associate Superintendent, Curriculum and Instruction
SUBJECT: Program Evaluation

In accordance with the research agenda adopted by the Board of Education and recommended by the Superintendent and Associate Superintendent for Curriculum and Instruction, the Planning, Research, and Evaluation Department is presenting its findings from the first year of program evaluation of the four areas designated for the research agenda: Title I/Elementary Literacy, Little Rock Partnerships for Mathematics and Science Achievement (LRPMSA - NSF grant), English as a Second Language Program (ESL), and Middle School Transition and Program Implementation.

In order to carry out the program evaluation plan, data had to be gathered in three categories: participation, perception, and performance. In order to collect data on the performance aspect of the evaluation, it was necessary for the district to implement a new, comprehensive assessment plan. This plan is ready for full implementation with the 2000-2001 school year. Benchmark data is now available, and growth comparisons can begin for the 2000-2001 school year. Also, data was collected in the areas of participation and perception for several of the programs scheduled to be evaluated. Recognizing that program evaluation is an on-going process with continuous refinements, the Planning, Research, and Evaluation Department is presenting an executive summary of each of the four program evaluations, along with conclusions and recommendations.

Program Evaluation
Planning, Research, and Evaluation
Little Rock School District
July 2000

When gathering data for program evaluation, three areas are assessed: Participation, Performance, Perception

Participation consists of...
- Who was involved?
- What is their gender?
- What is their race?
- What school do they go to?
- What choices did they make regarding curricular, co-curricular, or extra-curricular activities?

Performance consists of...
- Test scores: Benchmark exams, SAT-9, ALTs, Explore
- Grades
- Enrollment
- Ratings

Perception consists of...
- Expectation
- Application
- Acquisition
- Attitude

Program Evaluation Agenda 1999-2000
- Middle School Transition
- NSF Grant
- ESL Program
- K-2 Literacy Plan

From: LEASE, KATHY R.
Sent: Tuesday, April 04, 2000 4:13 PM
To: LEASE, KATHY R.
Subject: Projects

Dear Folks: I know that we have a lot on our plates right now. Here is a list of the current projects that I know about and projected due dates. If you have others, please email me.

Mona - Lit review on middle schools (April 24); work with Ed on draft of Middle School evaluation (draft due May 1); final pieces of CM, the plan for June 5, and implementation plan for procedures for credit by examination, to Bonnie and me by April 14 for review
Eddie - Title I/PreK-3 evaluation plan (draft due May 1); lit review on successful programs in low-performing schools, or program evaluation of Title I schools, something similar
Yvette - Benchmark and End-of-Course training for next year (April 28)

Kathy Lease, Ed.D.
Assistant Superintendent
Planning, Research, and Evaluation
3001 S. Pulaski
Little Rock, AR 72206
501-324-2122 (VM)
501-324-2126 (Fax)
krlease@irc.lrsd.k12.ar.us

LRSD Assessment Plan
Using Assessment to Enhance Student Achievement

Essential Purposes of Assessment
- Improvement of student learning
- Improvement of instructional programs
- Public accountability, confidence, and support

The design of our assessment plan is guided by the Revised Desegregation and Education Plan...
2.6 No barriers to participation by qualified African-Americans in extracurricular activities, AP courses, honors and enriched courses, and the gifted and talented program
2.7 Improve and remediate the academic achievement of African-American students
2.7.1 Assess academic programs for effectiveness in improving African-American achievement; if not effective, modify or eliminate

RDEP, continued...
2.8 Promote and encourage parental and community involvement and support in the operation of LRSD and the education of LRSD students
5.2.1.a. By completion of the third grade, all students will be reading independently and show understanding of words on a page

RDEP, continued...
5.2.1 Primary Grades
5.2.1.d. Identify clear objectives for student mastery of all three reading cueing systems and of knowing-how-to-learn skills;
5.2.1.g. Monitor student performance using appropriate assessment devices;
5.2.1.h. Provide parents/guardians with better information about their child's academic achievement in order to help facilitate the academic development of students;

RDEP, continued...
5.2.2 Intermediate Grades
5.2.2.a. By completion of the sixth grade, all students will master and use daily higher-level reading comprehension skills for learning in all subject areas, for making meaning in real-life experiences, and for personal growth and enjoyment;
5.2.2.e. Monitor student performance using appropriate assessment devices;
5.2.2.f. Provide parents/guardians with better information about their child's academic achievement...

RDEP, continued...
5.2.3 Secondary Schools
5.2.3.a. Adopt as a goal that upon graduation all students will read independently with comprehension in all subject areas and be proficient in language arts, as necessary to be successful workers, citizens, and life-long learners;
5.2.3.f. Monitor student progress and achievement using appropriate assessment devices.

RDEP, continued...
5.3 Mathematics
5.3.2 Develop appropriate assessment devices for measuring individual student achievement and the success of the revised curriculum.

Other guiding documents that impact assessment decisions...
- Strategic Plan
- Title I/K-3 Literacy Plan
- NSF Grant
- ACTAAP (State Accountability Plan): Benchmark exams, End-of-Course exams, SAT-9

Proposed Modifications to the LRSD Assessment Plan:
- Individual pre- and post-assessments for kindergarten and 1st grade
- Individual pre- and post-assessments for 2nd grade with G/T screening second semester (CRT and Raven)
- Pre- and post-criterion-referenced tests to measure individual student growth from year to year (grades 3-11)

LRSD Assessment Plan, cont'd...
State Required Assessments
- SAT-9 norm-referenced test for grades 5, 7, and 10
- Primary benchmark exam (grade 4)
- Intermediate benchmark exam (grade 6; not yet developed)
- Middle level benchmark (grade 8)
- End-of-Course tests: Algebra I, Geometry, and Literacy

LRSD Assessment Plan, cont'd...
District Coordinated Classroom Assessments
- Performance assessments aligned with Benchmark assessments and End-of-Course exams
- District-developed CRTs measuring attainment of state standards

Little Rock School District is committed to monitoring the individual academic growth of every student, and our assessment program must meet that need.

Students use tests to answer these questions:
- Am I learning what I'm supposed to learn?
- Can I do what I'm supposed to do?
- Am I trying as hard as I can? Should I try harder?

Teachers use tests to answer these questions:
- Is each child growing in what he or she knows and can do?
- Is my teaching/instruction helping this group of students to be successful?
- Do any of my students need assistance from a special program?
- What changes do I need to make in my instruction?

Parents use tests to answer these questions:
- How is my child doing?
- How is my child doing compared with others?
- Has my child mastered his/her grade-level skills?

The Board uses tests to answer these questions:
- Is the program of instruction working?
- Are our students meeting or exceeding the standards?

Administrators use tests to answer these questions:
- What staff development is needed?
- How and where should we allocate resources?

State and community use tests to answer these questions:
- How well is the district doing its job?
- How do our schools and district compare with others?

What skills does our community expect our students to have?
- Literacy skills
- Problem-solving skills
- Ability to work together

School Report Cards... High Stakes Accountability
- Accountability for individual schools
- Who is not achieving? Identify by name all students who are below proficient level
- Why not? Curriculum, instruction, assessment
- What are we going to do about it?

Paradigm Shifts
- Bell curve: normal distribution continues to fall into predictable patterns unless interventions are made.
- The new paradigm: standards-driven system; Smart Start belief system

What is a standard?
- What we want students to know and be able to do
- Common assessment of students' performance: create tests worth teaching to
- Externally set criteria for passing (a rubric/scoring guide)

Standards-Driven Belief System
- Effort-based achievement
- Clear expectations to students
- Clear content standards
- Alignment of assessment with curriculum and instruction
- Adequate amount of time
- Honest feedback about progress
- Multiple opportunities to demonstrate what students have learned

Teaching Toward Tests Worth Taking...
- Academic content skills: charts, graphs, number line, value of money, fractions, addition, subtraction, estimation, measurement; editing skills, specific content from reading material (3 types of texts), vocabulary, main idea, plot, character, setting, elements of style, using resource material (dictionary)
- Process skills: drawing a conclusion (best answer/most reasonable); probability (most likely; what is missing/what is needed); reading strategies (context clues, drawing conclusions (main idea), inferring information: predicting, understanding why the author wrote the material, and sequencing events)
- Problem-solving skills: organizing information from one or more sources / eliminating unnecessary information / defending a position (specific to material provided) / comparing or contrasting
- Writing process skills: prewriting/editing/revision

"Children's self-esteem gets better when they see themselves getting better." (Heidi Hayes Jacobs)

What are the essential questions about assessment?
- What do we want to accomplish with our assessment plan?
- What is the purpose of the assessment system?
- What do we want to do with the information?
- How do we value the Benchmark exams?

Essential Questions... continued
- What difference will the assessment system make in the educational experience of the students?
- What difference will the assessment system make to the classroom teacher?
- Does the assessment system prepare students for high-stakes exams?
- What skills are required for teacher and student success?

7 Steps to Increase Student Achievement...
1. Acknowledge where you are.
2. Analyze where you are.
3. Align teaching with assessment.
4. Assess in a manner that is the same as on high-stakes testing.
5. Attitude is everything all the time.
6. Accentuate your focus on testing strategies.
7. Activate a plan that will meet the needs of your learners.
(Charity Smith, ADE)

15

From: LEASE, KATHY R.
Sent: Friday, September 08, 2000 3:37 PM
To: 'Steve Ross'
Subject: RE: Program Effects

Thanks for your emails. Both of them were well stated! KL

-----Original Message-----
From: Steve Ross [mailto:smross@memphis.edu]
Sent: Thursday, September 07, 2000
To: lvcarnine@lrsdadm.lrsd.k12.ar.us
Subject: Program Effects

I have read three of the recent program evaluation reports completed by your research department. They appear to be of very good quality: well-written, clear, and comprehensive. There is substantial rationale, both logical and empirical, for giving programs time to impact student achievement. The first stage of impact is program implementation, the next is changing instruction and/or conditions for learning, and after these effects occur, achievement may be impacted. In our Memphis study, it took at least two years for school reform programs to show positive results. In fact, after the first year, achievement scores went down! This same pattern was replicated with three different cohorts of schools. I will be sending you a copy of that report, which should be completed by 10/1/00.

16

From: LESLEY, BONNIE
Sent: Wednesday, September 27, 2000 6:33 PM
To: MILAM, DEBBIE
Subject: RE: parent involvement surveys

Oh, and I love the idea!

-----Original Message-----
From: MILAM, DEBBIE
Sent: Wednesday, September 20, 2000 4:11 PM
To: LESLEY, BONNIE
Cc: CARNINE, LESLIE V.
Subject: FW: parent involvement surveys

Bonnie, I think you're aware of this already. Do you see a problem with it? If not, I'll send the survey to you when we finish the draft tomorrow. Frances says it's okay with her. In fact, she offered to help interview parents.

Debbie Milam
Volunteers in Public Schools

-----Original Message-----
From: MILAM, DEBBIE
Sent: Wednesday, September 20, 2000
To: MITCHELL, SADIE; FRANCES H.; LACEY, MARIAN G.
Subject: parent involvement surveys

Dear Sadie, Frances and Marian, Our Collaborative Action Team (CAT) would like to conduct oral interviews of parents on the subject of parental involvement. We'd like to have volunteers in a few schools on Wednesday, October 4 as parents come in for conferences. We will not have enough volunteers to cover all schools, so we'll want to select at least one elementary, middle, and high school, with some geographic diversity. Would this be alright with you? I can run the survey by you tomorrow if you don't see a problem with this. We want to start collecting information on parental involvement since it is our major focus. Thanks,

Debbie Milam
Volunteers in Public Schools

17

From: LEASE, KATHY R.
Sent: Wednesday, October 11, 2000 5:41 PM
To: WILLIAMS, ED;
JOHNSON, VIRGINIA
McCOY, EDDIE
Cc: CARNINE, LESLIE V.;
LESLEY, BONNIE
Subject: Regular meetings with Steve Ross
Importance: High

We now have our meeting scheduled with Steve Ross. Please plan to work from 9:00 until he decides to leave on October 18th. We will review with him all revisions of the program evaluations and go over any new data (just in case any is available by then). We will meet in Room 12. Thanks, KL

Kathy Lease, Ed.D.
Assistant Superintendent
Planning, Research, and Evaluation
3001 S. Pulaski
Little Rock, AR 72206
501-324-2122 (VM)
501-324-2126 (Fax)
krlease@irc.lrsd.k12.ar.us

From: LESLEY, BONNIE
Sent: Monday, October 23, 2000 5:01 PM
To: JOHNSON, VIRGINIA
Subject: RE: Washington meeting

Thanks, Virginia.

-----Original Message-----
From: JOHNSON, VIRGINIA
Sent: Friday, October 20, 2000 11:46
To: LESLEY, BONNIE; CLEAVER, VANESSA; GLASGOW, DENNIS; CARNINE, LESLIE V.
Subject: FW: Washington meeting

I have continued my email conversations with Dr. Jason Kim of Systemic Research and made arrangements to have all the materials sent directly to me from the December 8th CPMSA Key Indicator and Evaluative Study Workshop for Data Managers and Evaluators. In addition, Jason says I will have access to Linda Crasco for any questions I may have. This is great because she worked with me when I first came on board last year, and we have a history of good communication. Her input will give me access to group discussions where evaluators share "experiences and issues in core data collection, analysis, utilization, and local evaluation." (I took that right off the workshop outline.) In addition, Jason may also be able to send me whatever is collected from the other members of my cohort that includes Beaumont, Dayton, and Montgomery, all of which initiated their programs in 1998. I have conveyed these details to Julio and assured him that (1) he can depend on me to benefit from this workshop, as NSF intends, even though I will not be able to attend in person, and (2) I will make sure that valuable strategies related to core data and local evaluation are implemented here in Little Rock. He has responded that this "indeed reflects your full understanding of the role of the evaluator in the CPMSA undertaking." So, I think it's covered from all aspects. Now if you want to send Mona Briggs to take notes, it's your call, but it may not be necessary. However, you may want to talk to Vanessa, as she also talked to Jason about alternatives related to her needs from the SR workshop.

-----Original Message-----
From: JOHNSON, VIRGINIA
Sent: Thursday, October 19, 2000 7:51 AM
To: LESLEY, BONNIE
Subject: RE: Washington meeting

You see, this is why I said to Vanessa that you would have a plan! This is a good idea and I support it.

-----Original Message-----
From: LESLEY, BONNIE
Sent: Wednesday, October 18, 2000 12:56 PM
To: CLEAVER, VANESSA; JOHNSON, VIRGINIA
Subject: Washington meeting

What if we send Mona Briggs and Ken Savage to the Washington meeting to represent us? Both understand at some level the kind of data that NSF likes to collect, etc., and Mona is great at taking notes. What do you think?

Dr. Bonnie A. Lesley, Associate Superintendent for Instruction
Little Rock School District
3001 S. Pulaski
Little Rock, Arkansas 72206
501/324-2131
501/324-0567 (fax)

19

Planning, Research, and Evaluation
Ish Instructional Resource Center
3001 S. Pulaski
Little Rock, AR 72206

To:
Gene Jones, ODM
From: Kathy Lease, Asst. Supt., PRE
Date: October 27, 2000
Re: Program Evaluation

We are meeting as a department with Dr. Steve Ross on November 2 at about 9:00, depending on his arrival time from Memphis. We are doing intensive work on each program evaluation to begin the revisions based on the new data that has come in. We will be meeting in Room 12 at the IRC. We would love to have you join us for the day. We will be meeting with Dr. Ross every two weeks through December. I will give you a complete schedule so that you can join us whenever possible. Most meetings are on Thursdays, except for November 19th.

Call and leave me a message if you can come. I will be at a meeting in North Carolina until Wednesday. I understand you really left town! We are anxious to hear about your trip. We look forward to seeing you.

C: Dr. Bonnie Lesley, Associate Superintendent
Ms. Sadie Mitchell, Associate Superintendent
Mr. Junious Babbs, Associate Superintendent
Mr. Brady Gadberry, Associate Superintendent
Dr. Don Stewart, Associate Superintendent

20

Additional Programs and Strategies Requesting Evaluation
Planning, Research, and Evaluation
November 2000

SMART
After School Science
SECME
Vital Link
Benchmark Open Response Study for math
Elementary Summer School
Middle School Summer School
High School Summer School
Learning to Cope with Differences
Alternative Learning Environments
Lyceum Scholars High School
Academic progress of ALC and ACC
Charter School
HIPPY (as needed for federal reporting)
CAT
Scottish Rite Reading Program
Voyager
Accelerated Reader
Campus Leadership Team Survey
Climate Survey

21

From: LEASE, KATHY R.
Sent: Tuesday, November 28, 2000 4:31 PM
To: BABBS, JUNIOUS;
FRANCES CAWTHON
Gadberry, Brady L.
Hurley, Richard
LESLEY, BONNIE
Leslie Carnine
LINDA WATSON
MARIAN LACEY
Milhollen, Mark
Sadie Mitchell
STEWART, DONALD M.
Vann, Suellen
Subject: Steve Ross-Program Evaluation.ppt

FYI - Here is a copy of Steve's presentation to the Board. KL

Using Evaluation for Program Improvement: Lessons Learned
Steven M. Ross
Center for Research in Educational Policy
The University of Memphis

Types of Evaluation
- Formative: Improving developing programs. How are we doing?
- Summative: Judging completed programs. How did we do?

The Evaluation Process
Stakeholder Buy-in - Evaluation Questions - Instruments - Data Collection - Analysis - Report

Reporting
- Executive Summary
- Introduction/Purposes
- Evaluation Questions
- Instruments
- Procedures
- Data Analysis
- Results
- Conclusions

Considerations/Suggestions
- Evaluation is not sufficient in many districts/schools.
- Evaluation needs to be ongoing.
- Programs alone do not increase achievement.
- It generally takes more than two years for programs and strategies to increase achievement.

What does increase achievement?
- Improved teaching
- Teacher buy-in
- Improved school climate
- Principal leadership

Additional Suggestions: Form a research committee
- One or more Board members
- Assistant Superintendent for PRE and designated staff
- Selected administrators, parents, students (should have research interests/expertise)

Responsibility of Research Committee
- Committee meets monthly
- Reviews reports
- Initiates and plans new studies
- Focuses on applying research results to decision making
- Board member(s) serve as liaison to full board

Reporting to the Board
- Spread out the presentation of research reports at Board meetings
- One per meeting limit (unless there are special circumstances)

Research Briefs
- Research Briefs should be prepared by PRE for Board members who need information in a short amount of time
- Briefs will be highly readable and focus on major findings and implications

22

From: LESLEY, BONNIE
Sent: Tuesday, January 02, 2001 3:39 PM
To: JOHNSON, VIRGINIA
Cc: CLEAVER, VANESSA;
GLASGOW, DENNIS
GILLIAM, ANITA
Subject: NSF Program Evaluation

Virginia, I know that Julio is going to want us to produce our program evaluation plan very soon. Please plan to meet with me and Vanessa and Dennis very soon to get this plan developed. Send me what you gave him previously, please, so that I can see what is lacking. Anita, please schedule a meeting for all of us asap. Thanks.

Dr. Bonnie A. Lesley, Associate Superintendent for Instruction
Little Rock School District
3001 S. Pulaski
Little Rock, Arkansas 72206
501/324-2131
501/324-0567 (fax)

23

From: JOHNSON, VIRGINIA
Sent: Wednesday, January 03, 2001 2:55 PM
To: LESLEY, BONNIE
Subject: RE: NSF Program Evaluation

prof dev. evaluation plan.doc

-----Original Message-----
From: LESLEY, BONNIE
Sent: Tuesday, January 02, 2001 3:39 PM
To: JOHNSON, VIRGINIA
Cc: CLEAVER, VANESSA;
GLASGOW, DENNIS
GILLIAM, ANITA
Subject: NSF Program Evaluation

Virginia, I know that Julio is going to want us to produce our program evaluation plan very soon. Please plan to meet with me and Vanessa and Dennis very soon to get this plan developed. Send me what you gave him previously, please, so that I can see what is lacking. Anita, please schedule a meeting for all of us asap. Thanks.

Dr. Bonnie A. Lesley, Associate Superintendent for Instruction
Little Rock School District
3001 S. Pulaski
Little Rock, Arkansas 72206
501/324-2131
501/324-0567 (fax)

The following plan was submitted January, 2000, upon request by Julio. Julio's site visit statements indicate he feels we have achieved these goals. While that is a testimonial to our presentations during the site visit, it is not entirely true that we have maxed out on this plan. You may want to ascertain that records developed in preparation for the 2000 site visit and housed in a Math Dept. computer file will be integrated with the district professional development database at some time in the future. This integration would permit analysis by grade level, by school, by individual teacher, etc., in relation to certification and to total hours of professional development, as well as to various program implementation initiatives. You may want to consider how to assess the relationship between outcomes from various program implementation initiatives and professional development using the current data filing system.

As you re-write this plan, it might help you to know that other funded programs have the capacities identified in the following plan, and that is why they were included in ours. If our NSF professional development data were stored in the district's professional development database, we could access it as part of the identification of outcomes for any program initiative by teacher, by grade, by class, by school, etc. At the present time we are not able to do that. We can only offer a global participation report such as we gave during the site visit. While I am not minimizing the importance of doing this for our site visit, it does not permit any analysis in relation to specific outcomes of various initiatives. Julio saw that. What he did not see, and we did not offer, was the electronic inability to go beyond the fact that 238 3rd grade teachers received 3 hours of Investigations training on August 10, 1999. I am painfully aware that folks are real tired of hearing me fuss over this, so I am glad to have your energy and guidance.

Evaluation Component: Professional Development and Certification of Teachers of Math & Science

Procedures have been established to collect relevant quantitative data from (1) the database maintained by the Professional Development Division of the LRSD, and (2) records maintained by Instructional Resource Center personnel responsible for providing professional development related to CPMSA activities. These sources provide incomplete archival data for NSF reporting. Therefore, a rudimentary record-keeping procedure has been implemented to document activities until procedures can be developed to collect the comprehensive data necessary for Core Data Elements reporting and other NSF reporting parameters. The initial process of collecting district-wide demographics has begun to identify data for the baseline school year of 1997-98, the first year of 1998-99, and the second year of 1999-2000 to date. This activity will continue across each succeeding year of the grant.
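The disaggregation the plan goes on to specify in the next paragraph (counts and percentages of teachers by school level and by band of professional-development hours) is mechanical enough to sketch. The example below is hypothetical: the data, column names, and the handling of band edges are assumptions, not the district's actual procedure.

# Hypothetical sketch of banding teachers by professional-development
# hours and school level, per the plan's categories. Data are invented.
import pandas as pd

teachers = pd.DataFrame({
    "level":    ["elem", "elem", "middle", "middle", "high", "high"],
    "pd_hours": [45, 130, 70, 220, 10, 150],
})
bands = pd.cut(
    teachers["pd_hours"],
    bins=[0, 60, 120, 200, float("inf")],
    labels=["<60", "60-120", "120-200", "200+"],
    right=False,  # assumption: exactly 60 hours falls in the 60-120 band
)
counts = pd.crosstab(teachers["level"], bands)
percents = counts.div(counts.sum(axis=1), axis=0).round(2)
print(counts, percents, sep="\n\n")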
Data has been, and will continue to be, disaggregated by elementary, middle, and high school levels. The total number of instructors teaching math and science will be identified, as well as the total number of those certified in math and science areas. Across elementary, middle, and high school categories, total numbers and percentages will be computed to identify the total number and percent (1) teaching math and science, (2) certified to teach in math and/or science areas, (3) completed less than 60 hours of professional development, (4) completed more than 60 but less than 120 hours of professional development, (5) completed more than 120 but less than 200 hours, and (6) completed more than 200 hours of professional development. This information will be displayed in table and figure form (graph with accompanying table) for the baseline year and each succeeding year of the project. In addition, custom-designed figures will identify demographic trends by displaying the percent of change from the baseline year to year five (2003) of the project. Formats for the tables and graphs used to clearly and concisely display data in this category can be viewed in the Program Evaluation Record.

24

From: LEASE, KATHY R.
Sent: Friday, January 05, 2001 2:15 PM
To: CARNINE, LESLIE V.;
BABBS, JUNIOUS
Subject: John Nunnery

We are on board with Dr. Nunnery. He sounds like he will really be great. Having a public school background, he is very familiar with the time constraints, stakeholder issues, and politics. I am FedExing him a box of background materials to get started on. He will only be in Missouri for 2 or 3 weeks; then he will join his wife in Virginia. He is doing some consulting work for Johns Hopkins and the Memphis City Schools, but it sounds like he works hard and fast. Let me know any questions you have. Kathy

Kathy Lease, Ed.D.
Assistant Superintendent
Planning, Research, and Evaluation
3001 S. Pulaski
Little Rock, AR 72206
501-324-2122 (VM)
501-324-2126 (Fax)
krlease@irc.lrsd.k12.ar.us

25

From: LESLEY, BONNIE
Sent: Thursday, January 20, 2000 8:05 AM
To: JOHNSON, VIRGINIA
Subject: RE: CORE DATA ELEMENTS

Good, Virginia. Thanks.

-----Original Message-----
From: JOHNSON, VIRGINIA
Sent: Wednesday, January 19, 2000 4:46 PM
To: CLEAVER, VANESSA
Cc: LEASE, KATHY R.;
LESLEY, BONNIE
GLASGOW, DENNIS
WILLIAMS, ED
Subject: RE: CORE DATA ELEMENTS

I have been in contact with Michael Flynn at QRC about the changes. He and Kevin Greenberg were identified at the NSF conference as contact personnel for evaluators. However, you should forward me whatever you get from NSF just to be sure. At this time, I am using a hard copy of the main menu and each page of the computer spreadsheet to organize required data. I have much of the data now and will have the remainder ready to submit using the electronic format in plenty of time to meet the deadline, which is three months away. The core data elements (CDE) are just that: the core. They constitute a small subset of information currently contained in our existing Program Evaluation Record.

-----Original Message-----
From: LESLEY, BONNIE
Sent: Thursday, January 06, 2000 8:02 AM
To: CLEAVER, VANESSA;
JOHNSON, VIRGINIA Cc: LEASE, KATHY R.
GLASGOW, DENNIS
Subject: RE: CORE DATA ELEMENTS

Let's put on our agenda what we are going to do now that Dr. Johnson is leaving. We have a major problem to get the work done to ensure a good report.

-----Original Message-----
From: CLEAVER, VANESSA
Sent: Wednesday, January 05, 2000 4:31 PM
To: JOHNSON, VIRGINIA
Cc: LESLEY, BONNIE;
LEASE, KATHY R.
GLASGOW, DENNIS
Subject: CORE DATA ELEMENTS

I received, today, updated information on the scope and content of the CDE activity for the 1998-1999 school year. The memo states that several of the items contained in the draft version (which we received in October) have been revised. You may have also received this information. Let me know if you did not, and I'll forward a copy to you.

26

From: LESLEY, BONNIE
Sent: Sunday, April 16, 2000 12:55 PM
To: JOHNSON, VIRGINIA
Subject: RE: This Week-end

Electronic dissemination is great!

-----Original Message-----
From: JOHNSON, VIRGINIA
Sent: Friday, April 14, 2000 6:36 PM
To: LESLEY, BONNIE
Subject: RE: This Week-end

Well, for #2 it's back to basic summative/formative: Participation, Performance, Perceptions. I have a few ideas on #3 related to disseminating A&I electronically in a systematic manner, as well as preparatory to planning activities. Bytes of A&I data as they emerge rather than in huge lumps. These bytes might be more effective if they contained an "implications for practice" attachment directly following A&I, ideally prepared by you so application in the field could occur swiftly and with the appropriate endorsement from administrative personnel. An electronic approach is very likely to be read if it takes this format, while print materials are easy for all to set aside. Both would be good. Given what you have sent, I will not write, just think and send helpful (hopefully) thoughts. Going home to Bob and Sophie now. See you at 2 on Sunday.

-----Original Message-----
From: LESLEY, BONNIE
Sent: Friday, April 14, 2000 5:21 PM
To: JOHNSON, VIRGINIA
Subject: RE: This Week-end

I just looked at my notes. He wants:
1. List of the program components - not just what we are funding from NSF, but what he called the LRSD Agenda for Mathematics and Science.
2. Description of the design to collect data on these for program evaluation.
3. Description of the procedures to feed the leadership with analysis and interpretation of data necessary for decision-making.

-----Original Message-----
From: JOHNSON, VIRGINIA
Sent: Friday, April 14, 2000 6:06 PM
To: LESLEY, BONNIE
Subject: RE: This Week-end

Great! Of course we need something for now, and always in this type of environment you need to evolve as you go. I have in my mental computer sketched out a brief overview of the components addressed in CPMSA program evaluation and the rationale for their inclusion (basically the 8 components (see Status Report table of contents) were identified by NSF as necessary but not sufficient). I noted last night that you had a list in the Compliance Report on page 55 under formative evaluations that might be a good place to work on that "on beyond necessary" into "sufficient" territory. We do have an established record, congruent with NSF requirements, but it does not move on beyond into the area of strategic plan implementation, etc. Given that this is sometimes designated turf, it can be clearly defined who and how this "on beyond Component 8" evaluation is accomplished. It may be that the first segment of this would focus on summative evaluation, the necessary 8 components (I can draft that), while the second and major focus would be on formative evaluation (you could draft this). Is this a reasonable starting point? If it doesn't come together in a way that will influence Julio positively, we can go back to the drawing board.

-----Original Message-----
From: LESLEY, BONNIE
Sent: Friday, April 14, 2000 4:42 PM
To: JOHNSON, VIRGINIA
Subject: RE: This Week-end

I can help you write that last piece.
I'll just make up something for now. We can refine it in real life. How will that be?

-----Original Message-----
From: JOHNSON, VIRGINIA
Sent: Friday, April 14, 2000 4:38 PM
To: LESLEY, BONNIE
Subject: RE: This Week-end

I plan to work at a downtown office where I have privileges and access to all equipment. I also have a proofreader lined up who has worked for me for over ten years. Access to this building is just impossible. My home phone is 221-9750 and my cell is 590-8217. If you do indeed come to the IRC to work, that would be much more facilitative for me. Let me know. I will probably work both days. I want to have the Interpretation of Test Results report Julioized to reduce froth-at-the-mouth syndrome. Sorry, this mechanism really does work to reduce the stress, but it is so politically incorrect. Now, my notes on the 3rd segment you asked me to do read "major components of data, dissemination of findings to others." Do you want to give me any other thoughts you had, to guide me in producing what you need and had envisioned? I plan to work till 6ish.

-----Original Message-----
From: LESLEY, BONNIE
Sent: Friday, April 14, 2000 3:09 PM
To: GLASGOW, DENNIS; CLEAVER, VANESSA; JOHNSON, VIRGINIA
Subject: This Week-end

I am taking stuff home to work this week-end. If any of you are going to be here, call me, and I'll come here. I have my laptop at home and will periodically check e-mail from there. Or you can call me if you need me: 868-4289. I have Cabinet Monday morning and a meeting all afternoon Monday with Sadie on the CLT Institute. If there is stuff that I need to review before we mail it, you HAVE to get it to me now or over the week-end. Otherwise, we have the same situation that we need to avoid: sending stuff as is because

27

LESLEY, BONNIE

From: LEASE, KATHY R.
Sent: Monday, January 22, 2001 11:23 AM
To: CARNINE, LESLIE V.
Cc: WILLIAMS, ED; JOHNSON, VIRGINIA
Subject: FW: achievement gap charts
Importance: High
Attachment: sample_charts.doc

Thought you'd want a preview of what we are getting from Dr. Nunnery, our program evaluator.

Kathy

-----Original Message-----
From: John Nunnery [mailto:john_nunnery@hotmail.com]
Sent: Friday, January 19, 2001 4:05 PM
To: KRLEASE@IRC.LRSD.K12.AR.US
Subject: achievement gap charts

Attached is a Word file with some example achievement gap analyses based upon the Stanford 9. "Standardized Achievement Gap" is the effect size of the black/white difference in scale score means, disaggregated by FAR and Pay (White Mean minus Black Mean, divided by the Population Standard Deviation from the norm manual). The resulting number can be roughly interpreted as the difference in "years of achievement" between black and white students. For example, a value of +1.0 means that white students, on average, perform nearly one full grade level above black students. A negative value indicates that black students outperform white students. As the example charts show, LR school district had a very large achievement gap in 1997, but by 2001 the gap was completely eliminated in math and reading for FAR students!! Modest improvement was evident for Pay students. As we discussed, the analysis for Pay students is problematic because of the wide range of incomes in the Pay category and the likelihood that White Pay students' families have higher incomes than Black Pay families. These charts are very encouraging and compelling. I look forward to receiving the 7th and 10th grade data.
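Dr. Nunnery's gap statistic is a one-line effect-size calculation, and the sketch below restates it for clarity. The formula itself (white mean minus black mean, divided by the population standard deviation from the norm manual) is the one his e-mail gives; the function name and the example scale scores are hypothetical illustrations, not figures from the exhibit.

    # Minimal sketch of the "Standardized Achievement Gap" described above.
    # The formula comes from Dr. Nunnery's e-mail; the numbers below are
    # invented for illustration only.

    def standardized_gap(white_mean: float, black_mean: float,
                         population_sd: float) -> float:
        """Effect size of the black/white difference in scale-score means."""
        return (white_mean - black_mean) / population_sd

    # Hypothetical scale-score means of 660 (white) and 620 (black) against a
    # norm-manual SD of 40 give a gap of +1.0, read roughly as white students
    # performing about one grade level above black students; a negative value
    # would mean black students outperform white students.
    print(standardized_gap(660.0, 620.0, 40.0))  # 1.0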
28

January 24, 2001

Planning, Research, and Evaluation
Ish Instructional Resource Center
3001 S. Pulaski
Little Rock, AR 72206

Mr. Larry Buck, Principal
Henderson Middle School
401 Barrow Road
Little Rock, AR 72205

Dear Mr. Buck:

The first meeting of the Little Rock School District's Research Committee will be held on February 5, 2001, at 4:30 in Room ?? at the Instructional Resource Center. This committee will function to review and discuss the district's research agenda. Your participation and input are vital to the success of this committee. I look forward to seeing you at the meeting. A tentative agenda has been planned for the meeting that will allow us to establish some organizational guidelines and set up our future meeting dates. If you cannot attend, please call me at 324-2121.

Sincerely,

Kathy Lease, Ed.D.
Assistant Superintendent

29

January 24, 2001

Planning, Research, and Evaluation
Ish Instructional Resource Center
3001 S. Pulaski
Little Rock, AR 72206

Mr. John Walker
Attorney at Law
1723 Broadway
Little Rock, AR 72206

Dear Mr. Walker:

The first meeting of the Research Committee for the Little Rock School District will be held on Monday, February 5, 2001, at 4:30 in the Conference Room at the Administration Building. This committee will function to review and discuss the district's research agenda. We would be happy to have you or your representative observe the work of this committee. Our district is committed to improving student achievement, and this committee will work toward that goal. I look forward to seeing you at the meeting. A tentative agenda has been planned for the meeting that will allow us to establish some organizational guidelines and set up our future meeting dates. If you cannot attend, please call me at 324-2121.

Sincerely,

Kathy Lease, Ed.D.
Assistant Superintendent

30

Little Rock School District
Research Committee Agenda
February 5, 2001

Establish Mission/Purpose
- Review reports and research briefs
- Initiate and plan new studies
- Focus on applying research results to decision making
- Board member serves as liaison to full Board
- Review implementation of Section 2.7.1 of the Revised Desegregation and Education Plan

Discuss Committee Decision-Making Process
- Review and discussion of reports
- Acceptance of reports by vote of the committee's voting members (excluding ex officio members)

Confirm Committee Organization
- Standing agenda:
  - Review of prior meeting and unfinished business
  - Review of new research reports
  - Suggestions for new district research
  - Impact of reports on the Revised Desegregation and Education Plan
  - Preparation for Board meeting
- Additional ideas/suggestions
- Assistant Superintendent for PRE organizes and facilitates meetings