Imagine you have just taken on the role of leader of an educational program in your specialization and have been told to implement a change process that is part of a district initiative.
Discussion 2: Strategies for Stakeholder Trust in Change
Reversing the trend of dissatisfaction and disengagement must be at the heart of any serious reform effort. (Fullan, 2016, p. 97)
For this Discussion, you will analyze evidence-based strategies to establish stakeholder trust and buy-in for change and counteract resistance to change.
To prepare:
· Review the assigned chapters in the Fullan (2016) text. Consider the difference between adopting an innovative program and the complexity of actually implementing it, and consider why stakeholders resist change.
· Read the Gurley, Peters, & Collins (2015); Day, Gu, & Sammons (2016); Covey (2009); and Adams & Miskell (2016) articles. Think about the process of initiating and implementing change, the influence of leadership on change, and how to gain buy-in and trust from stakeholders throughout the change process.
· Reflect on experiences you have had in your professional practice where staff were resistant to a change in your specialization. What attempts were made by leadership to establish trust and buy-in for the change? What strategies were (or were not) used when staff members refused or pushed back during implementation? As a leader, what strategies would you have employed?
· Research evidence-based strategies for establishing trust and buy-in from staff prior to implementing change and for supporting staff when they resist changes during implementation.
By Day 5 of Week 6
Post an explanation of the following:
· Background information on an experience from your professional practice where staff were resisting a change in a program or practice in your specialization
· At least two strategies you would have used to establish trust and buy-in from the staff prior to implementing the change. Provide a research-supported rationale for your selected strategies.
· At least two strategies you would have employed when staff members refused or pushed back during implementation of the change process. Provide a research-supported rationale for your selected strategies.
For this Discussion, and for all scholarly writing in this course and throughout your program, you are required to use APA style and provide reference citations.
By Day 7 of Week 6
Read a selection of your colleagues’ posts.
Respond to at least two colleagues by offering an additional strategy for establishing trust and buy-in and an additional strategy for when staff refuse or push back during implementation of the change process. Explain how the strategies would have supported leadership in your colleagues’ experience. Be sure to support your response with reference to the Learning Resources, research, and/or your personal experiences.
REFERENCES
Adams, C. M., & Miskell, R. C. (2016). Teacher trust in district administration: A promising line of inquiry.

Covey, S. M. R. (2009). How the best leaders build trust. LeadershipNow. http://www.leadershipnow.com/CoveyOnTrust.html

Day, C., Gu, Q., & Sammons, P. (2016). The impact of leadership on student outcomes: How successful school leaders use transformational and instructional strategies to make a difference.

Fullan, M. (2016). The new meaning of educational change (5th ed.). New York, NY: Teachers College Press.
· Chapter 4, “Initiation, Implementation, and Continuation” (pp. 54–81)
· Chapter 6, “The Teacher” (pp. 97–122)
· Chapter 10, “The District Administrator” (pp. 177–208)

U.S. Department of Education, Office of Elementary and Secondary Education, School Support and Rural Programs. (2014). Evaluation matters: Getting the information you need from your evaluation. https://www2.ed.gov/about/offices/list/oese/sst/evaluationmatters.pdf

Journal article access (Walden Library): https://go.openathens.net/redirector/waldenu.edu?url=https://journals.sagepub.com/doi/abs/10.1177/0013161X16652202

Grand City multimedia (Walden University): https://cdn-media.waldenu.edu/2dett4d/Walden/EDDD/2015/CH/mm/grand_city/index.html
Evaluation Matters: Getting the Information You Need From Your Evaluation
A Guide for Educators to Build Evaluation Into Program Planning and Decision-Making, Using a Theory-Driven, Embedded Approach to Evaluation
Prepared by: Susan P. Giancola, Giancola Research Associates, Inc.

Prepared for: U.S. Department of Education, Office of Elementary and Secondary Education, School Support and Rural Programs

DRAFT 2014
This publication was prepared for the U.S. Department of Education under Contract Number ED-07-CO-0098 (Contracting Officer’s Representatives: Kenneth Taylor, Sharon Horn, and Vickie Banagan) with Kauffman & Associates, Inc. The views expressed in this publication do not necessarily reflect the positions or policies of the U.S. Department of Education. For the reader’s convenience, this publication contains information about and from outside organizations, including hyperlinks and URLs. Inclusion does not constitute endorsement by the Department of any outside organization or the products or services offered or views expressed. Nor is any endorsement intended or implied of the consulting firm “Evaluation Matters.” In fact, this publication was not prepared with help from or in consultation with, in any manner, that firm.

U.S. Department of Education
Arne Duncan, Secretary
Office of Elementary and Secondary Education
Deb Delisle, Assistant Secretary
School Support and Rural Programs
Jenelle V. Leonard, Director
DRAFT January 2014

This publication is in the public domain, except for the 1:1 Implementation Rubric in Appendix B, which the William & Ida Friday Institute for Educational Innovation at North Carolina State University kindly granted permission to reproduce herein. Authorization to reproduce Evaluation Matters in whole or in part, except for the 1:1 Implementation Rubric, is granted. Any further use of the 1:1 Implementation Rubric in Appendix B is subject to the permission of the William & Ida Friday Institute (for more information, email Jeni Corn, director of program evaluation programs, Friday Institute, at [email protected]).

The citation for Evaluation Matters should be: U.S. Department of Education, Office of Elementary and Secondary Education, School Support and Rural Programs, Evaluation Matters: Getting the Information You Need From Your Evaluation, Washington, D.C., 2014.

To obtain copies of this publication:
Write to ED Pubs, Education Publications Center, U.S. Department of Education, P.O. Box 22207, Alexandria, VA 22304. Or fax your request to 703-605-6794. Or email your request to [email protected]. Or call in your request toll-free to 1-877-433-7827 (1-877-4-ED-PUBS). Those who use a telecommunications device for the deaf (TDD) or a teletypewriter (TTY) should call 1-877-576-7734. If 877 service is not yet available in your area, call 1-800-872-5327 (1-800-USA-LEARN). Or order online at http://edpubs.gov.

On request, this publication is available in alternate formats, such as Braille, large print, audiotape, or compact disk. For more information, please contact the Department’s Alternate Format Center at 202-260-9895 or 202-260-0818. In addition, if you have difficulty understanding English, you may request language assistance services for Department information that is available to the public. These language services are available free of charge. If you need more information about interpretation or translation services, please call 1-800-USA-LEARN (1-800-872-5327) (TTY: 1-800-437-0833), or email the content contact below.

Content Contact: Nancy Loy, Project Officer. Phone: 202-205-5375; Email: [email protected]
Contents

Acknowledgements
Before You Get Started
Introduction
    What Is the Purpose of the Guide?
    Why Evaluate and What Do I Need to Consider?
    Where Do I Start?
    How Is the Guide Organized?
Embedding Evaluation Into the Program
    STEP 1: DEFINE – What Is the Program?
    STEP 2: PLAN – How Do I Plan the Evaluation?
    STEP 3: IMPLEMENT – How Do I Evaluate the Program?
    STEP 4: INTERPRET – How Do I Interpret the Results?
    STEP 5: INFORM and REFINE – How Do I Use the Evaluation Results?
Appendix A: Embedded Evaluation Illustration – READ*
    Program Snapshot
    Step 1: Define the Program
    Step 2: Plan the Evaluation
    Step 3: Implement the Evaluation
    Step 4: Interpret the Results
    Step 5: Inform and Refine – Using the Results
Appendix B: Embedded Evaluation Illustration – NowPLAN*
    Program Snapshot
    Step 1: Define the Program
    Step 2: Plan the Evaluation
    Step 3: Implement the Evaluation
    Step 4: Interpret the Results
    Step 5: Inform and Refine – Using the Results
Appendix C: Evaluation Resources
    Evaluation Approaches
    Program Theory and Logic Modeling
    Research and Evaluation Design, Including Reliability and Validity
    Threats to Validity
    Budgeting Time and Money
    Ethical Issues
    Data Collection, Preparation, and Analysis
    Evaluation Pitfalls
    Interpreting, Reporting, Communicating, and Using Evaluation Results
Appendix D: Evaluation Instruments for Educational Technology Initiatives
Appendix E: Evaluation Templates
Appendix F: Lists of Tables and Figures
    List of Tables
    List of Figures
Acknowledgements
This guide was created with valuable input and advice from many individuals. Some helped shape the initial conceptual framework of the guide, some edited portions of it, and some reviewed draft versions.
Kathleen Barnhart, Principal Education Consultant, Illinois State Board of Education
Barbara DeCarlo, Retired Principal and Teacher
Beverly Funkhouser, Adjunct Professor, University of Delaware
Rick Gaisford, Educational Technology Specialist, Utah State Office of Education
Robert Hampel, Interim Director, School of Education, University of Delaware
Vic Jaras, Education Technology Director, Iowa Department of Education
Karen Kahan, Director of Educational Technology, Texas Education Agency
Tonya Leija, Reading Recovery Teacher Leader, Spokane Public Schools
Melinda Maddox, Director of Technology Initiatives, Alabama Department of Education
Daniel Maguire, District Instructional Technology Coach, Kennett Consolidated School District
Jeff Mao, Learning Technology Policy Director, Maine Department of Education
Jennifer Maxfield, Research Associate, Friday Institute for Educational Innovation, North Carolina State University
Brandy Parker, Graduate Research Assistant, Friday Institute for Educational Innovation, North Carolina State University
Shannon Parks, State Education Administrator, Technology Initiatives, Alabama Department of Education
Barry Tomasetti, Superintendent, Kennett Consolidated School District
Bruce Umpstead, Educational Technology Director, Michigan Department of Education
Carla Wade, Technology Education Specialist, Oregon Department of Education
Brent Williams, Director, Educational Technology Center, Kennesaw State University
Thanks also to Jeni Corn, Director of Evaluation Programs, with the William & Ida Friday Institute for Educational Innovation at North Carolina State University for obtaining approval to use the 1:1 Implementation Rubric in the Evaluation Matters guide.
I would like to extend a special thank you to Jenelle Leonard, Director of School Support and Rural Programs (SSRP), Office of Elementary and Secondary Education (OESE) at the U.S. Department of Education, for being the driving force in the creation of this guide.
In addition, I would like to thank Andy Leija, Kelly Bundy, Kim Blessing, Janelle McCabe, and Anna Morgan with Kauffman & Associates, Inc. for their continued support during the creation of the guide.
And finally, I would like to especially thank Nancy Loy (SSRP/OESE) at the U.S. Department of Education for her constant assistance and support throughout the development and writing of the guide, from brainstorming ideas to reading multiple drafts to facilitating review of the evaluation guide.
Portions of this guide were adapted from An Educator’s Guide to Evaluating the Use of Technology in Schools and Classrooms prepared by Sherri Quiñones and Rita Kirshstein at the American Institutes for Research for the U.S. Department of Education in 1998 (Nancy Loy, Project Officer).
Before You Get Started
Some who use this guide, especially those who are unfamiliar with evaluation or educational program design, may decide to read it cover to cover. However, most readers will likely use it as a reference companion, turning to the portions that are relevant to their current needs. To facilitate this use, several features will aid you in navigating the guide.

Click on the I note icon to go to excerpts from Appendix A: Embedded Evaluation Illustration – READ* that appear throughout the text to illustrate each step of the evaluation process. If you find the excerpts interspersed within the text distracting, you may want to skip them in the main text and instead read the example in its entirety in Appendix A. There you will find a detailed example of a theory-driven, embedded program evaluation from its inception through the use of its first-year results. Appendix B: Embedded Evaluation Illustration – NowPLAN* provides another example. Both examples in this guide are provided solely to illustrate how its principles can be applied in actual situations. The programs, characters, schools, and school districts mentioned in the examples are fictitious.
Click on the R note icon to see additional resources on a topic included in Appendix C: Evaluation Resources.
Introduction

What Is the Purpose of the Guide?
Who Is this Guide For?
This guide is written for educators. The primary intended audience is state- and district-level educators (e.g., curriculum supervisors, district office personnel, and state-level administrators). Teachers, school administrators, and board members also may find the guide useful. It is intended to help you build evaluation into the programs and projects you use in your classrooms, schools, districts, and state. This guide will also provide a foundation for understanding how to be an informed, active partner with an evaluator, to make sure that evaluation provides the information you need to improve the success of your program and to make decisions about whether to continue, expand, or discontinue a program.
No previous evaluation knowledge is needed to understand the material presented. However, this guide may also be useful for experienced evaluators who want to learn more about how to incorporate theory-based evaluation methods into their programs and projects.
In addition to using the guide to embed evaluation within your program, the guide will be useful for

• State education agencies during preparation of program and evaluation guidelines within Requests for Proposals (RFPs), in order to facilitate uniform assessments of proposals and for districts to know how their proposals will be assessed.
• School districts in responding to RFPs or in writing grant proposals, in order to set clear expectations for what a program is intended to accomplish and how the evaluation will be embedded within the program to measure changes as a result of the program.
• Teams of educators to show value added for a program, in order to build program support and provide budget justification.
• Program staff to tell the story of a program using data.
• Organizations for evaluation training and professional development.
How Is this Guide Different From Other Evaluation Guides?
There are many evaluation guidebooks, manuals, and tool kits readily available. So, what makes the material presented in this guide different from other evaluation guides? This guide is written with you, the educator, in mind. It outlines an evaluation approach that can be built into your everyday practice. It recognizes the preciousness of time, the need for information, and the tension between the two. The theory-driven, embedded approach to evaluation is not an additional step to be superimposed upon what you do and the strategies you use but rather a way to weave evaluation into the design, development, and implementation of your programs and projects.

The term program is used broadly in this guide to represent activities, small interventions, classroom-based projects, schoolwide programs, and district or statewide initiatives.
This guide will help you to embed evaluation within your program in order to foster continuous improvement by making information and data the basis upon which your program operates. The step-by-step approach outlined in this guide is not simply a lesson in “how to evaluate” but rather a comprehensive approach to support you in planning and understanding your program, with a rigorous evaluation included as an integral part of your program’s design.
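To make this step-by-step approach easier to picture, here is a brief, hypothetical Python sketch (not part of the guide) showing one way a team might record an embedded evaluation plan alongside its program design. The five step names follow the guide's DEFINE, PLAN, IMPLEMENT, INTERPRET, and INFORM and REFINE sequence; the program, class names, and activities are illustrative assumptions only.

# Hypothetical sketch (not from the guide): recording an embedded evaluation plan
# next to a program design. Step names follow the guide's five steps; the program
# details, class names, and activities below are invented for illustration.
from dataclasses import dataclass, field
from typing import List


@dataclass
class EvaluationStep:
    name: str                  # e.g., "DEFINE"
    guiding_question: str      # the question the guide attaches to the step
    activities: List[str] = field(default_factory=list)


@dataclass
class EmbeddedEvaluationPlan:
    program_name: str
    steps: List[EvaluationStep] = field(default_factory=list)

    def outline(self) -> str:
        """Return the plan as a plain-text outline."""
        lines = [f"Embedded evaluation plan for: {self.program_name}"]
        for number, step in enumerate(self.steps, start=1):
            lines.append(f"  Step {number} ({step.name}): {step.guiding_question}")
            for activity in step.activities:
                lines.append(f"    - {activity}")
        return "\n".join(lines)


plan = EmbeddedEvaluationPlan(
    program_name="Hypothetical district literacy initiative",
    steps=[
        EvaluationStep("DEFINE", "What is the program?",
                       ["Describe goals, participants, and the program theory"]),
        EvaluationStep("PLAN", "How do I plan the evaluation?",
                       ["Choose evaluation questions, measures, and a data-collection timeline"]),
        EvaluationStep("IMPLEMENT", "How do I evaluate the program?",
                       ["Collect implementation and outcome data while the program runs"]),
        EvaluationStep("INTERPRET", "How do I interpret the results?",
                       ["Compare findings against the program theory"]),
        EvaluationStep("INFORM and REFINE", "How do I use the evaluation results?",
                       ["Report to stakeholders and adjust the program"]),
    ],
)

print(plan.outline())

Running the sketch simply prints the plan as an outline; the point is that the evaluation questions are written down with the program design rather than added afterward.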
In Appendices A and B, you will find two examples of educators building evaluation into their everyday practices. Through a narrative about programs, characters, schools, and school districts that are fictitious, each example is designed to illustrate how the principles in this guide can be applied in actual situations. While embedded evaluation can be used for any type of program you may be implementing, these illustrations specifically focus on programs that involve infusing technology into the curriculum in order to meet teaching and learning goals.
Why Evaluate and What Do I Need to Consider?
Why Evaluate?
Evaluation is important so that we can be confident the programs we are using in our schools and classrooms are successful. A common criticism of evaluation is that it takes time and resources that could be dedicated to educating students. However, evaluation, done properly, can actually result in better quality practices being delivered more effectively to enhance student learning.

You would not hire new teachers without regularly monitoring and mentoring them to help them improve their skills and foster student success. Would you adopt and maintain a new curriculum full scale without being sure that student learning improved when you tested the new curriculum? What if student learning declined after implementing a new curriculum? How would you know whether the curriculum did not work well because it was a faulty curriculum, or because teachers were not trained in how to use the curriculum, or because the curriculum was not implemented properly? Building evaluation into your educational programs and strategies enables you to make midcourse corrections and informed decisions regarding whether a program should be continued, expanded, scaled down, or discontinued.

Evaluation enables you to identify and use better quality practices more effectively to improve learning outcomes.
A primary purpose of evaluation is to make summative decisions. You can use summative evaluation results from rigorous evaluations to make final, outcome-related decisions about whether a program should be funded or whether program funding should be changed. Summative decisions include whether to continue, expand, or discontinue a program based on evaluation findings.
Another important purpose of evaluation is to make formative decisions. You can use formative evaluation data from rigorous evaluations to improve your program while it is in operation. Formative evaluation examines the implementation process, as well as outcomes measured throughout program implementation, in order to make decisions about midcourse adjustments, technical assistance, or professional development that may be needed, as well as to document your program’s implementation so that educators in other classrooms, schools, or districts can learn from your program’s evaluation.
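As a rough illustration of the distinction between summative and formative uses of evaluation data (this sketch is not from the guide), the hypothetical Python functions below separate midcourse, formative checks from end-of-cycle, summative decisions. The measures, thresholds, and decision labels are invented for the example.

# Hypothetical illustration (not from the guide) of formative vs. summative use of
# evaluation data. Measures, thresholds, and decision labels are invented.
from typing import List


def formative_review(percent_lessons_implemented: float,
                     pd_hours_completed: float) -> List[str]:
    """Midcourse check: suggest adjustments while the program is still running."""
    adjustments = []
    if percent_lessons_implemented < 80:
        adjustments.append("Provide technical assistance on implementation fidelity")
    if pd_hours_completed < 10:
        adjustments.append("Schedule additional professional development")
    return adjustments or ["No midcourse adjustments indicated"]


def summative_decision(outcome_gain: float, minimum_meaningful_gain: float = 5.0) -> str:
    """End-of-cycle decision about continuing, expanding, or discontinuing the program."""
    if outcome_gain >= 2 * minimum_meaningful_gain:
        return "Recommend expanding the program"
    if outcome_gain >= minimum_meaningful_gain:
        return "Recommend continuing the program"
    return "Recommend revising or discontinuing the program"


# Example usage with made-up data:
print(formative_review(percent_lessons_implemented=72, pd_hours_completed=12))
print(summative_decision(outcome_gain=6.3))

In practice the measures and thresholds would come from your own program theory and evaluation plan; the sketch only shows that formative results feed adjustments during implementation, while summative results feed continue, expand, or discontinue decisions.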
Who Should Do the Evaluation?
Once you have decided to evaluate the implementation and effectiveness of a program, the next step is to determine who should conduct the evaluation. An evaluation can be conducted by someone internal to your organization or someone external to your organization. However, the ideal arrangement is a partnership between the two, i.e., forming an evaluation team that includes both an internal and an external evaluator.
Preferably, evaluation is a partnership between staff internal to your organization assigned to the evaluation and an experienced, external evaluator.
Such a partnership will ensure that the evaluation provides the information you need for program improvement and decision-making. It also can build evaluation capacity within your organization.
An internal evaluator may be someone at the school building, district office, or state level. For evaluations that focus on program improvement and effectiveness, having an internal evaluator on your evaluation team can foster a deeper understanding of the context in which the program operates. Involving people inside your organization also helps to build capacity within your school or district to conduct evaluation.