To prepare:
- Review Chapter 9 from your course text listed in this week’s Learning Resources.
- Review the articles in the Learning Resources that provide examples of use of single-group designs.
Post your comprehensive response to each of the following:
- Briefly restate the purpose of your team’s program evaluation RFP.
- Describe a situation for which a single-group design might be considered for an evaluation of the general program situation of your RFP (e.g., school anti-bullying program). Why might it be appropriate in this situation? Explain.
- Given your example, explain the strengths and limitations of a single-group design for use in this evaluation.
- Would you choose a posttest-only or pretest-posttest design? Why? (An illustrative analysis sketch follows this list.)
- How would you try to deal with threats to internal validity as part of your planning?
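If a pretest-posttest design were chosen, one common quantitative approach for a single group is a paired-samples comparison of each participant's scores before and after the program. The sketch below is illustrative only: the attitude scores, the scale, and the use of Python with numpy and scipy are assumptions, not part of the assignment or the RFP.

```python
# Minimal sketch: paired-samples analysis of a single-group
# pretest-posttest design (hypothetical data; numpy/scipy assumed).
import numpy as np
from scipy import stats

# Hypothetical bystander-attitude scores (higher = greater willingness to
# intervene), measured before and after training for the same students.
pretest = np.array([2.8, 3.1, 2.5, 3.4, 2.9, 3.0, 2.7, 3.2])
posttest = np.array([3.4, 3.6, 3.0, 3.9, 3.3, 3.5, 3.1, 3.8])

# Each participant serves as their own comparison, which is the main
# advantage of adding a pretest to a single-group design.
t_stat, p_value = stats.ttest_rel(posttest, pretest)

# Effect size for paired data (Cohen's d on the difference scores).
diff = posttest - pretest
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```

A design like this still cannot rule out threats such as history or maturation; adding the pretest only makes change over time measurable.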
Program Evaluation Proposal Template
(Complete for team evaluation project. See due dates within.)
Introduction
Evaluation Goal
The purpose of this evaluation is to assess the impact of sexual harassment training on the college community at large, with particular attention to intervention by peers. Sexual harassment is a significant problem in academia, particularly for women, who are more likely to experience it. Victims of sexual harassment are often reluctant to report incidents, and peers who witness or hear about such harassment are unlikely to intervene directly or offer support, potentially because of negative responses from other peers (Bonar et al., 2022). Universities are legally required to have policies and procedures in place to protect students and address sexual harassment.
To address sexual harassment at ABC University, the campus has offered voluntary face-to-face training to students over the past 10 years. The second part of the training focuses on bystander intervention. Similar to work by Abrams (2018) and Foster and Fullagar (2018), the training specifically targets the attitudes, beliefs, and norms that influence the actions of victims. The bystander intervention component is based on the Bowes-Sperry and O'Leary-Kelly (2005) model, which focuses on observers of sexual harassment. By training students in intervention strategies for both victims and witnesses, the university aims to encourage positive actions and reduce the negative responses that currently deter victims from reporting incidents or seeking help from their peers. The overall goal is to promote a more supportive environment in which peers are empowered to take positive action to help victims of sexual harassment, creating a safer atmosphere both on campus and in the wider community.
Evaluators
Table 1: Evaluation Team
Individual(s) | Role | Responsibilities
Evaluation advisory or advisory group (optional) | |
Section I: Stakeholder Assessment
Stakeholders
The stakeholders for this program include a broad range of individuals and groups who play a role in shaping campus safety and culture at ABC University. At the center is the Office of the President, which commissioned the evaluation and is responsible for overseeing the university’s compliance with policies related to sexual harassment. The student body is another primary stakeholder group, encompassing undergraduates, graduate students, and doctoral students who may participate in the training, complete the annual Campus Climate Survey, or be directly affected as victims, observers, or peers in incidents of harassment. The Counseling Center is also critical, as it designs and delivers the two-part training sessions that address both victim responses and bystander intervention strategies (Abrams, 2018; Bowes-Sperry & O’Leary-Kelly, 2005). Faculty and staff, though not the primary participants in the training, influence the overall climate through their interactions with students and their adherence to institutional policies (Cantalupo & Kidder, 2018). External stakeholders, including parents and prospective students, indirectly benefit from a safe and supportive campus environment, while community partners and the wider academic community have an interest in the university maintaining its reputation for equity and student well-being.
Roles of Stakeholders
Each of these groups plays a distinct role in the program. The Office of the President coordinates access to records, surveys, and relevant personnel while ensuring confidentiality and ethical standards are upheld. Students serve as both the beneficiaries of training and as key data sources, offering first-hand perspectives through survey responses and potential follow-up interviews. Counseling Center staff act as implementers and informants, providing insight into how training is conducted and perceived (Foster & Fullagar, 2018). Faculty and staff contribute by modeling appropriate responses to harassment and reinforcing the policies introduced in student training (Clancy et al., 2014). External audiences, while less directly engaged, influence the program through expectations for accountability and by shaping how the university is perceived as a safe environment for learning (Wood et al., 2017).
Engagement Plan
Engaging stakeholders is an essential part of this evaluation. The Office of the President will serve as the primary liaison, coordinating approved contacts and supporting requests for data. Students will be engaged through the Campus Climate Survey, which provides anonymous feedback about their experiences and perceptions, as well as through targeted interviews with those who completed the training. Trainers from the Counseling Center will also be interviewed to better understand program implementation and challenges in delivery. Faculty and staff may be consulted to provide additional perspectives on campus norms and peer response (Orchowski & Gidycz, 2015). This approach ensures that engagement captures both leadership perspectives and the lived experiences of students, creating a more complete and balanced evaluation of the program.
Table 2: Stakeholder Assessment and Engagement Plan
Stakeholder Category | Interest or Perspective | Role in the Evaluation | How and When to Engage
Contractor for evaluation services | Prioritizes methodological rigor; takes an objective, evidence-first viewpoint | Independently designs and conducts the program evaluation | From planning through closeout
Program administrators | System-level, risk- and evidence-oriented view focused on accountability and long-term impact | Design, fund, and govern the program; set performance standards | Throughout the program life cycle, from needs assessment to monitoring and audits
Program providers | Focused on client access, service fidelity, and operational realities | Operationalize the model by delivering services, managing staff and logistics, and ensuring day-to-day quality | During implementation: intake, service delivery, data collection, and reporting to administrators
Program recipients | Bring lived experience; want accessible, respectful, and culturally responsive services | The individuals or communities that receive the services | Through enrollment, participation in services, and follow-up assessments
Other interested parties | | |
Introduction and Section I due by end of Week 4.
Section II: Background and Description of the Program; Program Logic Model
General Description
· Provide a general description of the program/project to be evaluated.
Need to Be Addressed
· Identify the specific problem to be addressed by the program activities and provide justification of that need.
Context
· Within what context is the program operating (i.e., setting; any environmental factors that may positively or negatively affect the initiative)?
Target Population
· Who is the target population of this program?
Objectives
· What are the objectives of the program/project? (Answer the questions for each of the S.M.A.R.T. criteria and guidelines as follows):
1. Specific. What is to be done for this project? How will you know if it is being done? How will you know when it is finished? Describe the expected results and end product of the work to be done, overall or in phases. Describe the activities and outcomes in terms of observables, that is, things that can be observed by an external party.
2. Measurable. What is to be measured and how? How will you know if the program activities meet expectations that were preset, such as goals for quantity, quality, frequency, costs, and deadlines? To what extent can the outcome be measured against some standard of comparison? If qualitative, how will you know?
3. Achievable. Is the program plan doable? By the persons indicated, in terms of their knowledge, capability, availability, motivation? In the time planned? With the available resources, etc.?
4. Relevant. Should the program be carried out? Why? What will be the impact?
5. Time-oriented. When will the objective be met—for the program, for the evaluation? Is there one end point, or are there several milestones or checkpoints along the way as the program continues?
For an example, see:
Wayne State University. (n.d.). Wayne LEADS: S.M.A.R.T. objectives worksheet example. Retrieved February 26, 2019, from https://hr.wayne.edu/leads/phase1/smartobjworksheetexample.pdf
Stage of Program Development
· In which stage is this program at present: planning, implementation, or mature phase of program? Provide some details.
Logic Model
Resources/Inputs (What resources are available to the program in terms of staff, money, space, time, and partnerships?)
Activities (What activities are being undertaken or planned to achieve the outcomes?)
Outputs (What services and products from the activities will be produced by your staff?)
Outcomes (What are the program’s intended outcomes in the short-term, mid-term, or long-term?)
Table 3: Program Description
Resources | Activities (Initial) | Activities (Subsequent) | Outputs | Outcomes (Short-term/Mid-term) | Outcomes (Long-term)
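As a rough illustration of how the Table 3 columns might be populated for the ABC University training described in the Evaluation Goal, the sketch below uses hypothetical placeholder entries; it is not the team's actual logic model.

```python
# Hypothetical placeholder entries showing how the Table 3 columns
# (Resources -> Activities -> Outputs -> Outcomes) relate to one another.
logic_model = {
    "resources": ["Counseling Center trainers", "training space and materials",
                  "annual Campus Climate Survey"],
    "activities_initial": ["recruit student volunteers",
                           "deliver victim-response training (Part 1)"],
    "activities_subsequent": ["deliver bystander-intervention training (Part 2)"],
    "outputs": ["number of sessions delivered", "number of students trained"],
    "outcomes_short_mid": ["greater willingness of peers to intervene or offer support"],
    "outcomes_long": ["more supportive campus climate", "increased reporting of incidents"],
}

for component, entries in logic_model.items():
    print(f"{component}: {'; '.join(entries)}")
```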
Section II due by end of Week 6.
Section III: Design of the Evaluation
Stakeholder Needs
· Who will use the evaluation findings?
· How will the findings be used?
· What do stakeholders need to learn from the evaluation?
Evaluation Questions
· What are your evaluation questions? Include process-driven or outcome-driven evaluation questions.
· What do you want to learn from the evaluation?
Evaluation Design
· Specify the evaluation model you are planning (see Chapter 2 in your course text).
· Specify the methodology you are planning (see chapters on various methodologies in your course text).
· Specify procedures for the evaluation. (Use Table 4 to summarize information for the following questions: What is the evaluation question to be asked and answered? What information will you be seeking/using? What measurement tool/method will you be using? When will the information be collected, from whom, how will it be collected, and by whom?)
Table 4: Evaluation Design
Evaluation Question | Data/Information to Be Collected (or Available) | Measure | When Collected | From Whom | Method of Collection | Collected by Whom
Analysis
· What method(s) will you use to analyze your data in order to answer the evaluation questions (quantitative or qualitative techniques)? (A brief illustrative sketch follows this list.)
· Consider Table 4 to summarize if you have several questions and/or sources of data.
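For the quantitative side, one possibility (illustrative only; the counts, item wording, and use of Python with scipy are assumptions) is a McNemar-style exact test on a paired yes/no survey item, with qualitative interview data analyzed separately through thematic coding.

```python
# Minimal sketch: exact McNemar test for a paired yes/no item
# (e.g., "I would intervene if I witnessed harassment") asked of the
# same students before and after training. Counts are hypothetical.
from scipy.stats import binomtest

# Discordant pairs: b = "no" before but "yes" after; c = "yes" before but "no" after.
b, c = 18, 6

# Under the null hypothesis of no change, discordant pairs split 50/50,
# so an exact binomial test on b out of (b + c) gives the McNemar p-value.
result = binomtest(b, b + c, 0.5)
print(f"Exact McNemar p-value: {result.pvalue:.3f}")
```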
Section III due by end of Week 8.
Section IV: Evaluation
Evaluation Standards
· Does your evaluation design address the following standards for effective evaluation?
· Utility
· Feasibility
· Propriety
· Accuracy
After you have collected and analyzed your data:
Interpretation
· Whom will you involve in drawing, interpreting, and justifying conclusions?
· What are your plans to involve them in this process?
Dissemination
· Who is your audience(s)?
· What medium/media do you plan to use to disseminate the evaluation findings to your audience(s)?
· Do you have written permission from your client/stakeholder or other relevant authorizing entities to disseminate information about the evaluation and/or findings to these audiences, through these means?
Use
· Will you also be involved beyond evaluation in use of evaluation findings? If so, what are your plans for using evaluation findings?
· How, where, and when will the findings be used?
· Who will implement these findings?
· How will you monitor your implementation plan?
Section IV due by end of Week 9.
Final Section: Executive Summary PowerPoint
· Executive Summary of the proposed evaluation; include the following key information in this abstract-like narrative:
· Names of evaluation team members (any specific roles for specific members of the team, or all shared?)
· Who requested the evaluation (internal or external evaluation; if external, funding source)
· Program/project type and name
· Problem/need addressed by the program/project
· Population targeted/affected
· Objectives of the program/project to be evaluated
· Key activities of the program/project to be evaluated
· Providers of program/project activities
· Evaluation questions
· Evaluation design to answer the evaluation questions
Create your Executive Summary in PowerPoint.
Executive Summary PowerPoint Assignment due by end of Week 10.
Presentations will take place in Weeks 10 and 11.
References
Abrams, Z. (2018). Sexual harassment on campus. Monitor on Psychology, 49(5), 68. https://www.apa.org/monitor/2018/05/sexual-harassment
Bonar, E. E., DeGue, S., Abbey, A., Coker, A. L., Lindquist, C. H., McCauley, H. L., Miller, E., Senn, C. Y., Thompson, M. P., Ngo, Q. M., Cunningham, R. M., & Walton, M. A. (2022). Prevention of sexual violence among college students: Current challenges and future directions. Journal of American College Health, 70(2), 575–588. https://doi.org/10.1080/07448481.2020.1757681
Bowes-Sperry, L., & O’Leary-Kelly, A. M. (2005). To act or not to act: The dilemma faced by sexual harassment observers. Academy of Management Review, 30(2), 288–306. https://doi.org/10.5465/amr.2005.16387886
Cantalupo, N. C., & Kidder, W. C. (2018). A systematic look at a serial problem: Sexual harassment of students by university faculty. Utah Law Review, 2018(3), 671–786.
Clancy, K. B. H., Nelson, R. G., Rutherford, J. N., & Hinde, K. (2014). Survey of academic field experiences (SAFE): Trainees report harassment and assault. PLoS One, 9(7), e102172. https://doi.org/10.1371/journal.pone.0102172
Foster, P. J., & Fullagar, C. J. (2018). Why don’t we report sexual harassment? An application of planned behavior. Basic and Applied Social Psychology, 40(3), 148–160. https://doi.org/10.1080/01973533.2018.1463226
Orchowski, L. M., & Gidycz, C. A. (2015). Peer responses to sexual harassment: The role of gender, shame, and guilt. Violence Against Women, 21(4), 550–573. https://doi.org/10.1177/1077801215573335
Wood, L., Sulley, C., Kammer-Kerwick, M., Follingstad, D., & Busch-Armendariz, N. (2017). Climate surveys: An inventory of understanding sexual assault and other crimes of interpersonal violence at institutions of higher education. Violence Against Women, 23(10), 1249–1267. https://doi.org/10.1177/1077801216657897
