As a leader in the field of education, you must constantly review program goals and initiatives, collect and analyze program-specific data, and refine strategies for the continuation of your programs to effect educational change. In Module 3, you began your Course Project by identifying a program within your specialization needing improvement or change and designing a plan to evaluate that program.
- ToolsforFormativeEvaluation_GatheringtheInformationNecessa.pdf
- mod4assigntevaluationmatters.pdf
- USW1_EDSD_7900_Module4_ActionPlanTemplate_21.docx
- mod4assignmentdoc.docx
- qualityevaluation.pdf
- RubricDetailBlackboardLearn.html
- USW1_EDSD_7900_Module4_ActionPlanTemplate_22.docx
- USW1_EDSD_7900_Module4_ActionPlanTemplate_12.docx
- Program2.docx
Journal of Extension
Volume 54 Number 1 Article 7
2-1-2016
Tools for Formative Evaluation: Gathering the Information Necessary for Program Improvement
K. S. U. Jayaratne North Carolina State University, [email protected]
Recommended Citation
Jayaratne, K. S. (2016). Tools for Formative Evaluation: Gathering the Information Necessary for Program Improvement. Journal of Extension, 54(1), Article 7. https://tigerprints.clemson.edu/joe/vol54/iss1/7
This Tools of the Trade is brought to you for free and open access by TigerPrints. It has been accepted for inclusion in Journal of Extension by an authorized editor of TigerPrints. For more information, please contact [email protected]
February 2016 Volume 54 Number 1 Article # 1TOT2 Tools of the Trade
Tools for Formative Evaluation: Gathering the Information Necessary for Program Improvement
Abstract

New Extension educators experience a steep learning curve when attempting to develop effective Extension programs. Formative evaluation helps new and experienced Extension educators alike determine the changes necessary for making programs more effective. Formative evaluation is an essential part of program evaluation; however, its use has been overlooked by Extension educators due to an overemphasis on outcome evaluation for accountability. Extension educators must develop evaluation tools with questions appropriate for determining program weaknesses and strengths and identifying changes necessary for improvement. This article describes how to develop formative evaluation questions for program improvement.
Introduction
Program accountability and improvement are two major functions of Extension evaluation. Program improvement is considered one of the most important tasks of evaluation. However, this function of evaluation has been overlooked by some Extension educators due to their overemphasis on accountability. This situation can be attributed to the increased demand for accountability in Extension services (Hachfeld, Bau, Holcomb, & Craig, 2013).
The program improvement function of evaluation is called formative evaluation (Scriven, 1991). The intent of formative evaluation is to "help form or shape the program to perform better" (Rossi, Lipsey, & Freeman, 2004, p. 34). As the name indicates, formative evaluation makes Extension educators more informed about the changes needed to improve their programs. It helps Extension educators determine what worked well and what went wrong in their educational programs, as well as the changes needed to further improve them.

K. S. U. Jayaratne, Associate Professor and State Leader for Program Evaluation, Department of Agricultural and Extension Education, North Carolina State University, Raleigh, North Carolina, [email protected]
Cooperative Extension services in the United States are experiencing more resource limitations (Monroe, McDonell, Hermansen-Báez, Long, & Zipperer, 2007) and are being forced to optimize their program effectiveness. Under current resource limitations, there is no room for programs to fail. For Extension educators operating under this demanding situation, it is critical to determine all necessary improvements at the pilot stage of programs and to make necessary changes before the next round of programming. This approach prevents potential program failures and increases the cost effectiveness of Extension.
Extension educators must develop more effective programs than ever before and should be prepared to improve their programs continuously by determining needed changes. This continuous improvement process contributes to the effectiveness of Extension programs and maximizes program outcomes and impacts. It also improves the professional outlook of Extension educators and helps achieve client satisfaction, which in turn is desirable for gaining public support for Extension programs.
Normally, new Extension educators experience a steep learning curve when developing Extension programming. They have to learn educational programming within a very short period of time and without failing. New Extension educators can accomplish this task if they pay due attention to formative evaluation and use formative evaluation as a helpful tool in determining the changes needed to further improve their educational programs.
The purpose of this article is to discuss how to develop a formative evaluation tool useful for gathering the information needed for program improvement.
Information Needed for Formative Evaluation
All the information needed for formative evaluation can be categorized into three broad groups: (a) negative factors associated with the program, (b) positive factors associated with the program, and (c) the changes needed for further improvement of the program.
Gathering Information About Negative Factors
If an Extension educator is aware of the weaknesses, problems, and barriers associated with a program, he or she can find alternatives to overcome those issues in the next round of programming. For this reason, it is important to ask questions to identify weaknesses, problems, and barriers associated with a program. The following questions can be posed to program participants to gather this information:
What do you consider to be the most significant weakness of the program?
Did you meet your learning expectations from this program? (yes or no)
If you did not meet your learning expectations, why not?
Tools of the Trade Tools for Formative Evaluation: Gathering the Information Necessary for Program Improvement JOE 54(1)
©2016 Extension Journal Inc. 1
Gathering Information About Positive Factors
If an Extension educator is aware of the strengths of an educational program, he or she can capitalize on those strengths when building future programs. The following questions are helpful for gathering information about program strengths:
What do you consider to be the most significant strength of the program?
Which part of the program did you like the most?
Gathering Information for Making Needed Changes
Program participants are the best group from whom to solicit suggestions for further improvement of any educational program. Adult audiences, especially, are capable of identifying what needs to be done to improve a program. For this reason, Extension educators can gather useful information from participants for further improvement of their programs. The following questions can be used to gather such information:
What do you suggest could be done to improve this program?
What would be the best way to spread the word about this program to potential participants in the future?
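As an illustration only (none of this code appears in the article), the open-ended questions above could be grouped by the article's three information categories when assembling an instrument; a minimal sketch:

```python
# Hypothetical sketch: the article's open-ended formative evaluation
# questions, grouped by its three categories of needed information.
QUESTIONS = {
    "negative factors": [
        "What do you consider to be the most significant weakness of the program?",
        "Did you meet your learning expectations from this program? (yes or no)",
        "If you did not meet your learning expectations, why not?",
    ],
    "positive factors": [
        "What do you consider to be the most significant strength of the program?",
        "Which part of the program did you like the most?",
    ],
    "needed changes": [
        "What do you suggest could be done to improve this program?",
        "What would be the best way to spread the word about this program "
        "to potential participants in the future?",
    ],
}

def render(questions):
    """Render the grouped questions as a plain-text questionnaire."""
    lines = []
    for group, items in questions.items():
        lines.append(group.upper())
        lines.extend(f"  {i}. {q}" for i, q in enumerate(items, start=1))
    return "\n".join(lines)

print(render(QUESTIONS))
```

Keeping the questions in one structure like this makes it easy to reuse the same instrument across programming cycles and to add or drop items per program.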
In addition to the aforementioned questions, provide participants with scaled items for assessing their satisfaction with the program (see Figure 1).
Figure 1. Scaled Items for Assessing Participants’ Satisfaction with Program
Please circle the appropriate number for your level of response.
How satisfied are you with | Not Satisfied | Somewhat Satisfied | Satisfied | Very Satisfied
the relevance of information to your needs? | 1 | 2 | 3 | 4
the presentation quality of the instructor(s)? | 1 | 2 | 3 | 4
the subject matter knowledge of the instructor(s)? | 1 | 2 | 3 | 4
the training facilities? | 1 | 2 | 3 | 4
the overall quality of the workshop? | 1 | 2 | 3 | 4
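Responses to scaled items like those above can be tallied simply to flag the weakest-rated aspects of a program. A hypothetical sketch (item names and response data are illustrative, not from the article):

```python
# Illustrative sketch: summarizing circled values for scaled
# satisfaction items (1 = Not Satisfied ... 4 = Very Satisfied).
from statistics import mean

# Each list holds one hypothetical participant response per position.
responses = {
    "relevance of information": [4, 3, 4, 2],
    "presentation quality":     [3, 3, 4, 4],
    "subject matter knowledge": [4, 4, 4, 3],
    "training facilities":      [2, 3, 2, 3],
    "overall quality":          [3, 4, 3, 3],
}

def summarize(items):
    """Return (mean rating, item) pairs, lowest-rated first."""
    return sorted((mean(values), item) for item, values in items.items())

for avg, item in summarize(responses):
    print(f"{item}: {avg:.2f}")
```

Sorting lowest first puts the likeliest candidates for improvement (here, the item with the lowest average rating) at the top of the report.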
Practical Application
To design more useful evaluation tools, Extension educators should include formative evaluation questions in addition to outcome evaluation questions on evaluation instruments. In assessing Extension programs, questions such as those suggested above can be used to gather formative information for making changes necessary for program improvement. Such questions help Extension educators determine the strengths and weaknesses associated with their programs. When Extension educators are aware of the strengths of their programs, they can capitalize on those strengths to overcome weaknesses, thereby building better programs for the future. Additionally, when Extension educators have information that helps them identify changes needed for program improvement, they can make informed decisions about necessary program revisions.
References
Hachfeld, G. A., Bau, D. B., Holcomb, C. R., & Craig, J. W. (2013). Multiple-year Extension program outcomes and impacts through evaluation. Journal of Extension [Online], 51(1) Article 1FEA2. Available at: http://www.joe.org/joe/2013february/a2.php
Monroe, M. C., McDonell, L., Hermansen-Báez, L. A., Long, A. J., & Zipperer, W. (2007). Building successful partnerships for technology transfer. Journal of Extension [Online], 45(3) Article 3TOT6. Available at: http://www.joe.org/joe/2007june/tt6.php
Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage Publications.
Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park, CA: Sage Publications.
Copyright © by Extension Journal, Inc. ISSN 1077-5315. Articles appearing in the Journal become the property of the Journal. Single copies of articles may be reproduced in electronic or print form for use in educational or training activities. Inclusion of articles in other publications, electronic sources, or systematic large-scale distribution may be done only with prior electronic or written permission of the Journal Editorial Office, [email protected]
If you have difficulties viewing or printing this page, please contact JOE Technical Support
EVALUATION MATTERS: GETTING THE INFORMATION YOU NEED FROM YOUR EVALUATION

A GUIDE FOR EDUCATORS TO BUILD EVALUATION INTO PROGRAM PLANNING AND DECISION-MAKING, USING A THEORY-DRIVEN, EMBEDDED APPROACH TO EVALUATION
Prepared by:
Susan P. Giancola Giancola Research Associates, Inc.
Prepared for:
U.S. Department of Education Office of Elementary and Secondary Education
School Support and Rural Programs DRAFT 2014
This publication was prepared for the U.S. Department of Education under Contract Number ED-07-CO-0098 (Contracting Officer's Representatives: Kenneth Taylor, Sharon Horn, and Vickie Banagan) with Kauffman & Associates, Inc. The views expressed in this publication do not necessarily reflect the positions or policies of the U.S. Department of Education. For the reader's convenience, this publication contains information about and from outside organizations, including hyperlinks and URLs. Inclusion does not constitute endorsement by the Department of any outside organization or the products or services offered or views expressed. Nor is any endorsement intended or implied of the consulting firm "Evaluation Matters." In fact, this publication was not prepared with help from or in consultation with, in any manner, that firm.

U.S. Department of Education
Arne Duncan, Secretary
Office of Elementary and Secondary Education
Deb Delisle, Assistant Secretary
School Support and Rural Programs
Jenelle V. Leonard, Director

DRAFT January 2014

This publication is in the public domain, except for the 1:1 Implementation Rubric in Appendix B, for which the William & Ida Friday Institute for Educational Innovation at North Carolina State University kindly granted permission to reproduce herein. Authorization to reproduce Evaluation Matters in whole or in part, except for the 1:1 Implementation Rubric, is granted. Any further use of the 1:1 Implementation Rubric in Appendix B is subject to the permission of the William & Ida Friday Institute (for more information, email Jeni Corn, director of program evaluation programs, Friday Institute, at [email protected]).

The citation for Evaluation Matters should be: U.S. Department of Education, Office of Elementary and Secondary Education, School Support and Rural Programs, Evaluation Matters: Getting the Information You Need From Your Evaluation, Washington, D.C., 2014.

To obtain copies of this publication,
Write to ED Pubs, Education Publications Center, U.S. Department of Education, P.O. Box 22207, Alexandria, VA 22304. Or fax your request to 703-605-6794. Or email your request to [email protected]. Or call in your request toll-free to 1-877-433-7827 (1-877-4-ED-PUBS). Those who use a telecommunications device for the deaf (TDD) or a teletypewriter (TTY) should call 1-877-576-7734. If 877 service is not yet available in your area, call 1-800-872-5327 (1-800-USA-LEARN). Or order online at http://edpubs.gov.
On request, this publication is available in alternate formats, such as Braille, large print, audiotape, or compact disk. For more information, please contact the Department's Alternate Format Center at 202-260-9895 or 202-260-0818. In addition, if you have difficulty understanding English, you may request language assistance services for Department information that is available to the public. These language services are available free of charge. If you need more information about interpretation or translation services, please call 1-800-USA-LEARN (1-800-872-5327) (TTY: 1-800-437-0833), or email the content contact below.

Content Contact: Nancy Loy, Project Officer
Phone: 202-205-5375; Email: [email protected]
Contents

Acknowledgements … iv
Before You Get Started … vi
Introduction … 1
    What Is the Purpose of the Guide? … 1
    Why Evaluate and What Do I Need to Consider? … 2
    Where Do I Start? … 6
    How Is the Guide Organized? … 7
Embedding Evaluation Into the Program … 9
    STEP 1: DEFINE – What Is the Program? … 9
    STEP 2: PLAN – How Do I Plan the Evaluation? … 24
    STEP 3: IMPLEMENT – How Do I Evaluate the Program? … 53
    STEP 4: INTERPRET – How Do I Interpret the Results? … 61
    STEP 5: INFORM and REFINE – How Do I Use the Evaluation Results? … 68
Appendix A: Embedded Evaluation Illustration – READ* … 74
    Program Snapshot … 74
    Step 1: Define the Program … 74
    Step 2: Plan the Evaluation … 80
    Step 3: Implement the Evaluation … 94
    Step 4: Interpret the Results … 107
    Step 5: Inform and Refine – Using the Results … 120
Appendix B: Embedded Evaluation Illustration – NowPLAN* … 124
    Program Snapshot … 124
    Step 1: Define the Program … 124
    Step 2: Plan the Evaluation … 132
    Step 3: Implement the Evaluation … 133
    Step 4: Interpret the Results … 145
    Step 5: Inform and Refine – Using the Results … 145
Appendix C: Evaluation Resources … 162
    Evaluation Approaches … 162
    Program Theory and Logic Modeling … 163
    Research and Evaluation Design, Including Reliability and Validity … 163
    Threats to Validity … 164
    Budgeting Time and Money … 164
    Ethical Issues … 164
    Data Collection, Preparation, and Analysis … 165
    Evaluation Pitfalls … 165
    Interpreting, Reporting, Communicating, and Using Evaluation Results … 166
Appendix D: Evaluation Instruments for Educational Technology Initiatives … 167
Appendix E: Evaluation Templates … 173
Appendix F: Lists of Tables and Figures … 175
    List of Tables … 175
    List of Figures … 176
iii
Acknowledgements

This guide was created with valuable input and advice from many individuals. Some provided input in shaping the guide's initial conceptual framework, some in editing portions of it, and some in reviewing draft versions.
Kathleen Barnhart, Principal Education Consultant, Illinois State Board of Education
Barbara DeCarlo, Retired Principal and Teacher
Beverly Funkhouser, Adjunct Professor, University of Delaware
Rick Gaisford, Educational Technology Specialist, Utah State Office of Education
Robert Hampel, Interim Director, School of Education, University of Delaware
Vic Jaras, Education Technology Director, Iowa Department of Education
Karen Kahan, Director of Educational Technology, Texas Education Agency
Tonya Leija, Reading Recovery Teacher Leader, Spokane Public Schools
Melinda Maddox, Director of Technology Initiatives, Alabama Department of Education
Daniel Maguire, District Instructional Technology Coach, Kennett Consolidated School District
Jeff Mao, Learning Technology Policy Director, Maine Department of Education
Jennifer Maxfield, Research Associate, Friday Institute for Educational Innovation, North Carolina State University
Brandy Parker, Graduate Research Assistant, Friday Institute for Educational Innovation, North Carolina State University
Shannon Parks, State Education Administrator, Technology Initiatives, Alabama Department of Education
Barry Tomasetti, Superintendent, Kennett Consolidated School District
Bruce Umpstead, Educational Technology Director, Michigan Department of Education
Carla Wade, Technology Education Specialist, Oregon Department of Education
Brent Williams, Director, Educational Technology Center, Kennesaw State University
iv
Thanks also to Jeni Corn, Director of Evaluation Programs, with the William & Ida Friday Institute for Educational Innovation at North Carolina State University for obtaining approval to use the 1:1 Implementation Rubric in the Evaluation Matters guide.
I would like to extend a special thank you to Jenelle Leonard, Director of School Support and Rural Programs (SSRP), Office of Elementary and Secondary Education (OESE) at the U.S. Department of Education, for being the driving force in the creation of this guide.
In addition, I would like to thank Andy Leija, Kelly Bundy, Kim Blessing, Janelle McCabe, and Anna Morgan with Kauffman & Associates, Inc. for their continued support during the creation of the guide.
And finally, I would like to especially thank Nancy Loy (SSRP/OESE) at the U.S. Department of Education for her constant assistance and support throughout the development and writing of the guide, from brainstorming ideas to reading multiple drafts to facilitating review of the evaluation guide.
Portions of this guide were adapted from An Educator’s Guide to Evaluating the Use of Technology in Schools and Classrooms prepared by Sherri Quiñones and Rita Kirshstein at the American Institutes for Research for the U.S. Department of Education in 1998 (Nancy Loy, Project Officer).
v
Before You Get Started

Some who use this guide, especially those unfamiliar with evaluation or educational program design, may decide to read it cover to cover. Most readers, however, will likely use it as a companion reference, turning to the portions relevant to their current needs. To facilitate this use, several features will aid your navigation through the guide.
Click on the note icon to go to excerpts from Appendix A: Embedded Evaluation Illustration – READ* that appear throughout the text to illustrate each step of the evaluation process. If you find the excerpts interspersed within the text distracting, you may skip them in the main text and instead read the example in its entirety in Appendix A. There you will find a detailed example of a theory-driven, embedded program evaluation from its inception through the use of its first-year results.