Identifying Sources and Types of Data
By now, you should have a well-formulated research question, approved by your facilitator. Using your research question as a basis, provide your answers to each of the questions below in 500 words:
- What quantitative/qualitative data would be appropriate to collect for your study? Why?
- Suggest at least three things you might expect to discover in analyzing the research you find.
Be sure to cite at least three references to the resources provided in this session in your response.
- VSCPD_ActionResearch_Wk4_Article_Chapter5ActionResearch_PPT_TC03292022.pptx
- VSCPD_ActionResearch_Wk4_Article2_ComparisonChartQualVsQuan_TC03292022.pdf
- VSCPD_ActionReserch_WK3_Article1_GuidingSchoolImprovementwithActionResearch_Chapter9_TC03292022.pdf
- VSCPD_ActionResearch_WK4_Article3_Chapter11_TC03292022.pdf
- AnnotatedBibliography1.docx
- ActionResearch.docx
Chapter 5: Data Gathering
Qualitative Data Collection: Accuracy, Credibility, Dependability
Qualitative Data Collection: Observations, Interviews, Journals, Existing Documents
Quantitative Data Collection: Validity, Reliability
Quantitative Data Collection: Surveys, Questionnaires, Rating Scales; Checklists; Tests and Other Formal Instruments
Qualitative Data Collection: Observations
Observation: carefully watching and systematically recording what is seen and heard in a particular setting (Schmuck, 1997).
Structured Observation: observer is looking for specific behaviors, reactions, or interactions.
Unstructured, Semi-structured: observer flexible. May engage in brief, intense periods of observation, note-taking.
Qualitative Data Collection: Observations
Field notes: two columns: (1) Observations; (2) Comments, interpretations, meanings.
Observation Limitations: Effects of observer, large volume, variance between observers.
"Write what you See": most effective way to observe. Videotaping?
Later transcription? Patterns?
Qualitative Data Collection: Interviews
Interviews: Teacher/researcher questions posed to study participants: formal, informal.
Individual, Group Interviews, Focus Groups.
Interview Guide: Specific or general questions to be asked prior to interview.
Structured Interview: Pre-determined questions.
Semi-structured Interview: Base questions.
Open-ended Interviews: Few, broad questions.
Qualitative Data Collection: Journals
Data Journals: kept by teachers or students.
Student Journals: daily thoughts, perceptions, experiences.
Teacher Journals: narrative accounts of personal reflections on professional practice.
Class Journal: blank notebook passed around in class or in learning center.
Qualitative Data Collection: Existing Documents and Records
Existing Records: Curriculum materials, textbooks, reports, projects, test scores, grades, discipline records, portfolios.
Data: Retention rates, attendance, graduation rates, socioeconomic data, etc.
Other: Meeting minutes, newspaper stories, standardized test reports, discipline referrals, classroom artifacts.
Caution: Ethical use. Student privacy.
Qualitative Data Collection: Accuracy, Credibility, Dependability
Data Quality: accurate recording, organized processes, match between data gathered and research question.
Triangulation: multiple data sources.
Member Checking: sharing interview transcripts, comments, w/ study participants.
Prolonged Engagement, Persistent Observation: 'thick-description', not 'thin'.
Quantitative Data Collection: Surveys, Questionnaires, Rating Scales
Survey Design: Open-ended questions? Multiple choice? Likert scale?
Age-appropriate?
Design affects data: consider the implications.
Self-assessment.
Focus: each item addresses a single concept.
Brevity: each question is clear and necessary.
Unbiased: no leading questions.
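To make "design affects data" concrete, the sketch below (not taken from the course materials) shows how Likert-scale responses are usually coded as numbers before analysis; the scale labels, item responses, and variable names are hypothetical examples.

```python
# Hypothetical illustration: coding Likert-scale survey responses numerically.
# Scale labels and responses are invented; an open-ended design would instead
# yield narrative data that cannot be averaged this way.
likert_scale = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
                "Agree": 4, "Strongly Agree": 5}

responses = ["Agree", "Strongly Agree", "Neutral", "Agree", "Disagree"]

scores = [likert_scale[r] for r in responses]
print(sum(scores) / len(scores))  # mean rating for this single-concept item -> 3.6
```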
Quantitative Data Collection: Checklists, Surveys, Formal Tests, Existing Records
Checklists, formal tests, and existing records can be used 'quantitatively' too.
Pre-tests, post-tests: specifically designed for the study.
Standardized tests and standardized 'surveys': developed for specific purposes.
Quantitative Data Collection: Validity, Reliability
Validity: degree to which all accumulated evidence matches intended interpretation (AERA, APA, & NCME, 1999, p. 111).
Reliability: consistency of collected data.
Internal Consistency: statistical estimate of data reliability:
Kuder-Richardson formula 21 (KR-21):
r = [K(SD)² − M(K − M)] / [(SD)²(K − 1)]
where K is the number of test items, M is the mean score, and SD is the standard deviation.
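As an illustration only, the KR-21 estimate can be computed directly from a set of whole-test scores. The short Python sketch below assumes total scores on a K-item test; the function and variable names are mine, not Mertler's (2014).

```python
# Illustrative sketch of Kuder-Richardson formula 21 (KR-21):
# r = [K*SD^2 - M*(K - M)] / [SD^2 * (K - 1)]
# where K = number of items, M = mean total score, SD = standard deviation.
from statistics import mean, pstdev

def kr21(total_scores, k_items):
    m = mean(total_scores)
    sd = pstdev(total_scores)  # population SD used here; some texts use the sample SD
    return (k_items * sd**2 - m * (k_items - m)) / (sd**2 * (k_items - 1))

# Hypothetical example: ten students' total scores on a 20-item quiz
print(round(kr21([15, 12, 18, 14, 16, 11, 17, 13, 19, 10], k_items=20), 2))  # -> 0.54
```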
References:
1) Mertler, C. A. (2014). Action Research: Improving Schools and Empowering Educators, 4th ed. Los Angeles, CA: Sage Publishers.
Comparison Chart: Qualitative vs. Quantitative Methods
Action Research for Literacy Coaches
VirtualSC PD
Last Updated Winter 2019
Comparison: Qualitative vs. Quantitative Methods
The chart below outlines the main differences between qualitative and quantitative research methods.
Qualitative: Seeks to explore, explain, and understand phenomena; asks what and why.
Quantitative: Seeks to confirm a hypothesis about a phenomenon; asks how many.

Qualitative: Data is in the form of a narrative, pictures, or objects.
Quantitative: Data is in the form of numbers and statistical results.

Qualitative: Methods are less structured. Data gathered through interviews, observations, content analysis, and more.
Quantitative: Methods are highly structured. Data gathered through the use of tools, equipment, questionnaires, and more.

Qualitative: Asks open-ended questions in an effort to explore.
Quantitative: Asks closed-ended questions to reach quantifiable answers.

Qualitative: Research design has flexibility; can emerge and evolve as the study develops.
Quantitative: Research design is highly structured; laid out in advance of the study.

Qualitative: Results may be presented subjectively. May reveal biases, values, or experiences that impact how the results are interpreted.
Quantitative: Results are documented using objective language.
Data Collection: Building a Valid and Reliable Data Collection Plan
From Guiding School Improvement with Action Research by R. Sagor. © 2000 by ASCD. Reproduced with permission. All rights reserved.
Chapters 7 and 8 introduced a variety of viable data collection techniques. However, employing proven techniques doesn’t guarantee the quality of the findings that emerge. The reality is, action research simply isn’t worth doing unless it is done well. Although that may sound like just an old refrain, it is far more. The imperative for maintaining high standards of quality is a truth learned and sometimes painfully relearned by teacher researchers. There are three fundamental reasons why you as a teacher researcher should hold yourself to the highest quality standards possible:
1. Your obligation to students
2. The need for personal and collective efficacy
3. The need to add to the professional knowledge base
The first reason, your obligation to students, rests on the premise that the education of the community's young is a sacred trust placed upon you as an educator. Therefore, the decisions you make on behalf of students are actions of no small consequence. No one, least of all teachers, would wish to see students victimized by malpractice. When you make teaching decisions on the basis of sloppy research, you place your students at risk.
A second reason to hold your action research to the highest standards of quality is that understanding your influence on educational outcomes can significantly enhance your personal and collective feelings of efficacy. However, before you can take credit for the success reflected in your data, the quality of that data must withstand the scrutiny of the world's most critical jury—your own skeptical mind. Ultimately, if you doubt your own conclusions regarding the contribution you have made to your students, those findings won't have much impact on your feelings of self-worth.
The third factor, adding to the knowledge base, may not seem important if you are a teacher researcher practicing in a one-room school or you find yourself in a school culture that emphasizes individualism. However, it should be extremely relevant to the vast majority of teachers—those of you who tend to share what you've learned with your colleagues. Not infrequently, one of the unspoken reasons for conducting action research is to persuade or entice your skeptical colleagues to consider "your" perspective on an issue. When you present your research to peers who are skeptical about the theory you are following, you should expect a similar skepticism about the research findings you produce concerning those theories. If your pedagogical opponents can find fatal flaws in your action research data, all future efforts at persuasion become that much more difficult.
The criteria used to establish the quality of action research should be no different from those used with other forms of research. Topping any researcher's list of quality criteria are the twin pillars of science: validity and reliability, first introduced in Chapter 1. These concepts are so critical to the quality of action research that it is worth taking some time to discuss and explore each of them.
As you no doubt recall from Education Psychology 101, validity refers to the essential truthfulness of a piece of data. By asserting validity, the researcher is asserting that the data actually measure or reflect the specific phenomenon claimed. Scientific history is full of examples of research findings that were discredited because they were shown to lack validity.
A mercury thermometer is an example of a valid instrument yielding valid data. The height reached by the fluid in an accurate thermometer is a valid and appropriate measurement of air temperature. Similarly, the movement of a membrane in a barometer is an appropriate and valid way to determine barometric pressure. A ruler can be a valid way to measure length, and unfortunately (for those of us who are weight conscious) a bathroom scale can be a valid measure of weight.
Nothing has helped me understand the importance of attending to validity as much as my experience with performance assessment. One of the great accomplishments of the modern assessment movement has been drawing teacher attention to the value of authentic work products. Although bubble-sheet tests can, in many cases, produce valid data, teachers' preference for authentic work products is understandable. It is analogous to historians' preference for "primary source material" over "secondary source material." Intuitively, we all know that words from the horse's mouth are more believable than words related by the horse's trainer. Similarly, a piece of actual student writing has more validity than a score obtained on the language section of a standardized multiple-choice exam. A performance by the school band is a better indicator of students' ability to execute a musical piece than are the students' grades in band.
However, even given the deserved popularity of performance and portfolio assessments, these types of data are not exempt from concerns regarding validity. For example, how should we react to the use of a written lab report as a means to assess student understanding of the scientific method? Should a lab report written in standard English be accepted as a valid indicator of a student's understanding of science?
Suppose you answered yes. Would you still accept that lab report as a valid indicator if you learned that the student lacked fluency in English? Probably not. This is because the English-language proficiency needed to complete the report introduced what scientists call an intervening and confounding variable. In the case of assessing the proficiency in science of a student with limited English proficiency, the written aspect of the report intervenes and thereby confounds the accuracy of the assessment. Intervening and confounding variables are factors that get in the way of valid assessment. This is why when conducting assessments on student learning and collecting data for action research, it is important to ask:
Are there any factors or intervening variables that should cause me to distrust these data?
Reliability is a different but no less important concept. Reliability relates to researchers' claims regarding the accuracy of their data. A few years ago, when a police officer issued me a ticket for speeding, I didn't question the validity of his using an expensive, city-issued speedometer. I was willing to concede to the officer the validity of measuring vehicular speed with a speedometer. However, I urged him to consider my thesis regarding the reliability of his speedometer. I respectfully suggested that although I knew he sincerely believed that his speedometer was accurate, he ought to consider the possibility that it could be damaged. I argued that if it were broken it wouldn't produce an accurate, credible, and reliable measure of my speed. What I was suggesting was that although speedometers are valid measures of speed, they aren't always reliable.
Unfortunately, I lost that argument. I fared no better when I presented the same "reasonable doubt" plea to the judge. Unbeknownst to me, the state police regularly establish the reliability (accuracy) of their speedometers by testing the speedometer on each patrol car every morning. In the end, I had to pay the fine. But in return I learned a memorable lesson on the value of establishing reliability.
Reliability problems in education often arise when researchers overstate the importance of data drawn from too small or too restricted a sample. For example, imagine if when I was a high school principal I claimed to the school board that I had evidence that the parents love our school's programs. When the board chair asked me how I could make such a claim, I responded by defensively asserting it was a conclusion based on "hard data"—specifically, a survey taken at the last winter band banquet. The board chair might respond that because that event was attended by only 5 percent of the school's parents and all the parents who attended had one thing in common—they had children in band—my conclusions were "unreliable." He would be right. Claiming that such a small and select sample accurately represented the views of a total population (all the school's parents) stretches the credibility of my assertion well beyond reasonableness.
To enhance the reliability of your action research data, you need to continually ask yourself these questions when planning data collection:
• Is this information an accurate representation of reality?
• Can I think of any reasons to be suspicious of its accuracy?
To appreciate the concepts of validity and reliability and how you might establish them, consider how you would behave as a juror deliberating in a criminal trial. Lawyers for both sides would argue their cases as persuasively as possible. Your task as a juror is to determine which of the arguments to believe. In deciding if a lawyer had "proved the case," you would probably ask these questions regarding validity: Are these claims credible? Can I truly believe that this evidence means what these witnesses and lawyers say it does? To determine the reliability of the evidence, you would ask questions such as these about the accuracy of the witnesses' recollections and testimony: Can I trust the accuracy of their eyes and ears? Could time or emotions have played a trick on their memories?
So how do legal "researchers"—defense lawyers and prosecutors—convince a jury of the essential truth and accuracy (validity and reliability) of their cases? They do it through the twin processes of corroboration and impeachment. When they want the jury to believe what one of their witnesses said, they bring in other independent witnesses. If an additional witness corroborates everything the first witness said, it increases the confidence a juror will have in the initial testimony. The more independent pieces of evidence a lawyer can place before a jury, the more the jurors will trust the truthfulness and accuracy of the claims. Conversely, if lawyers want the jury to doubt the truth and accuracy (validity and reliability) of the other side, they try to impeach (challenge the credibility of) the testimony of the other side, by, for example, entering into evidence alternative or irreconcilable reports on the same phenomenon from several independent sources.
Action researchers use a similar process to that used by lawyers. It is called triangulation, and, as was discussed in Chapters 1 and 2, it involves the use of multiple independent sources of data to establish the truth and accuracy of a claim.
There are ways to develop valid and reliable instruments without triangulation, but these methods are often problematic. First, they are time-consuming and frequently prohibitive in terms of cost. This is because significant field-testing is required to establish the validity and reliability of a measuring instrument. Just consider the many millions of dollars invested by publishers to support the validity and reliability of their standardized tests. But even if teachers were willing to invest the time, money, and energy required to establish technical validity (construct and content) for their home-grown instruments, they probably wouldn't be happy with what they produced.
For good reason, educators are intuitively unimpressed with "single instrument measures." They tend to question whether any single tool could ever capture the full reality of any meaningful educational outcome. Occasionally I will meet a layperson who believes that SAT scores alone (or another piece of seemingly compelling data, such as college admissions data or discipline referrals) provide an accurate picture of a school's quality. But I have never met a knowledgeable educator who is willing to make a judgment based upon any of those same valid and reliable instruments. This is because educators know that what these "valid and reliable" instruments reveal is simply too narrow to justify conclusions regarding educational quality.
This is not to say that these instruments (SAT scores, college admissions, discipline referrals, and so forth) aren't valuable windows into the larger phenomenon (the quality of a school), but before conclusions can be drawn about the big picture, those findings need to be corroborated by looking at the phenomenon through a variety of other windows.
Figure 9.1 illustrates what a plan for triangulated data collection might look like to answer a question on the quality of a high school.
Although we might be skeptical about drawing conclusions regarding a school's quality from any one of the success indicators in Figure 9.1, if all of these instruments painted a similar picture, we would, no doubt, feel confident in declaring the school "good."
FIGURE 9.1. A Plan for Triangulated Data Collection
Research question: Is Sagor High School a good school? Data sources: parent surveys, student surveys, teacher surveys, graduate follow-ups, college admissions, SAT scores, drop-out rates.
Chapter 6 presented guidelines for producing a written problem statement/research proposal (Implementation Strategy #6). The sample proposal written by Richard and Georgia, although short, contained all the items expected from a formal research proposal except the data collection plan. Chapter 2 described the triangulation matrix as a helpful planning tool (Figure 2.3, p. 21). Figure 9.2 shows the triangulated data collection plan, in the form of a matrix, that Richard and Georgia used to answer their research questions. Implementation Strategy #10 can help you complete a triangulation matrix.
FIGURE 9.2. Richard and Georgia's Triangulated Data Collection Plan
Research question 1: Could we motivate our 8th graders to conduct and complete Real World Advocacy Projects?
Data sources: teacher journals; student surveys; grade book records.
Research question 2: What would be the quality of the projects produced by our students?
Data sources: teacher assessments using a project rubric; student self-assessments using the same rubric; assessment by community members using the rubric.
Research question 3: Would the completion of Real World Advocacy Projects result in enhanced feelings of social efficacy for our students?
Data sources: surveys of students' other teachers; interviews with a random sample of students; interviews with a random sample of parents.
Implementation Strategy #10— Building a Triangulated Data Collection Plan
WHAT: Constructing a data collection plan with high probability of producing valid and reliable answers to your research questions
HOW: 1. Prepare a four-column data collection matrix with separate rows for each research question (see Figure 9.2).
2. Write your research questions in column 1 of your matrix.
3. For each research question, ask yourself the following: What is one source of data that could help answer this question? Write your answer in column 2 next to the research question.
4. Ask the question two more times to determine a second and third source of data, and write your answers in columns 3 and 4, respectively.*
5. Repeat this process for each research question.
6. Review the completed matrix and ask yourself the following question: Are these the best sources of data I/we could collect in answer to each of these questions? When you are satisfied with your answer to this question, you have a completed data collection plan.
*Although this strategy suggests collecting three types of data to answer a research question, it is perfectly permissible to collect more than three types.
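If you would rather keep the matrix electronically than on chart paper, the sketch below shows one possible way (an assumption, not part of Sagor's strategy) to record each research question with its data sources and to check that every question is triangulated; the questions and sources echo Figure 9.2, while the structure and names are illustrative.

```python
# Illustrative sketch of a triangulation matrix: each research question maps
# to its planned data sources (content follows Figure 9.2; structure is assumed).
plan = {
    "Could we motivate our 8th graders to complete Real World Advocacy Projects?":
        ["Teacher journals", "Student surveys", "Grade book records"],
    "What would be the quality of the projects our students produced?":
        ["Teacher assessments (project rubric)", "Student self-assessments (same rubric)",
         "Community member assessments (same rubric)"],
    "Would completing the projects enhance students' feelings of social efficacy?":
        ["Surveys of students' other teachers", "Interviews with a random sample of students",
         "Interviews with a random sample of parents"],
}

# Step 6 of the strategy: confirm every question has at least three data sources.
for question, sources in plan.items():
    assert len(sources) >= 3, f"Add more data sources for: {question}"
    print(f"{question}\n  -> {', '.join(sources)}")
```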
Once you have developed a triangulated data collection plan, you have accomplished much of the hard work of action research. Most doctoral students report that the hardest aspect of completing a doctorate is getting a comprehensive research proposal through their dissertation committee. Once the rationale for their research has been established and a methodology (the data collection plan) for answering their research questions has been put in place, all that is left is to carry out the proposal.
If you, alone or with colleagues, have followed the steps outlined in this book thus far, you are ready to proceed. Now all you have to do is carry out your plan.
Unfortunately, many beginning action researchers stall at this point, usually because completing the next stage, data collection, requires budgeting time from an already packed schedule. To get over this hurdle, it is helpful to commit to a time line and a process for completing the work of data collection. The rationale for formalizing this commitment is to keep the demands of a hectic work life from getting in the way of completing what should prove to be a most satisfying piece of work. Implementation Strategy #11 takes only a few minutes to complete, but doing so will help ensure that you get over the time hurdle and maintain your momentum for completing your research.
Implementation Strategy #11— Data Collection Time Line/To-Do List
WHAT: Making a commitment to a plan for completing the data collection portion of your action research
HOW: 1. Make a four-column list on a sheet of chart paper.
2. Brainstorm (either individually or, if your research is a team effort, with your colleagues) a list of each thing that needs to be accomplished in order to complete your triangulated data collection plan. List these items (roughly in chronological order) in the left-hand column on the chart paper.
3. In the second column, write the date that each should be accomplished. Then ask yourself if it is realistic to complete this item by that date. If the answer is yes, go to the next item. If the answer is no, determine the earliest "realistic" date.
4. If working individually, go on to the next step. If working as a team, go through each item on the list and determine who is willing to be responsible to see that this item is accomplished by the agreed upon date. Write that person's name in column 3.
5. Ask yourself (or ask the team) the following question: What types of support or help might I/we need to complete each of these items? Perhaps you will need some support from your principal or some help from a professor at a local university. Write the name of the person or organization whose help you anticipate needing in the last column and commit to a time for making contact with these "critical friends."
6. One last time, ask yourself or your team if this plan is realistic. If you answer yes, you are ready to proceed. If you answer no, repeat this strategy.
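For teams who prefer tracking this commitment in a file rather than on chart paper, here is a hypothetical sketch of the four-column to-do list as plain Python data; the tasks and dates are invented, and the owner names simply borrow Richard and Georgia from the earlier sample proposal.

```python
# Hypothetical four-column data collection to-do list:
# (task, target date, responsible person, support needed). Entries are invented examples.
todo = [
    ("Draft student survey",       "2024-10-01", "Georgia", "District survey template"),
    ("Schedule parent interviews", "2024-10-15", "Richard", "Principal's approval"),
    ("Collect grade book records", "2024-11-01", "Georgia", "None"),
]

for task, due, owner, support in todo:
    print(f"{due}  {task:<28} owner: {owner:<8} support: {support}")
```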
Chapters 10 and 11 explore the three remaining steps in the action research process: data analysis, reporting, and action planning. Chapter 12 discusses a number of important ethical and methodological issues that will be particularly helpful for beginning researchers. If you intend to conduct your data collection before reading the rest of this book, I strongly recommend that you read Chapter 12 first.
Chapter 11: Conducting Teacher Action Research
This chapter describes a process for conducting a teacher action research study. The suggestions offered here have emanated from my reading in the action research literature and my personal experiences and engagement in a variety of collaborative teacher action research studies during the past 40 years. My pedagogical voice permeates the chapter, but I hope it does so in a way that establishes meaningful contact with you the reader. I have tried to capture in this chapter the realities, complexities, and challenges of conducting teacher action research. In several places in the chapter, I emphasize the importance of the critical process that recursion represents in the conduct of action research, particularly as recursion affects research questions and the processes of data collection and analysis. I hope this chapter will be a meaningful resource and foundation for you as you conduct your own research and that it will give you all the rudiments of practice you need to become a lifelong researcher.
MODEST BEGINNINGS
Action research is demanding, complex, and challenging because the researcher not only assumes responsibilities for doing the research but also for enacting change. Enacting change is not easy—it requires time, patience, and sound planning, communication, and implementation skills. So, in establishing a foundation for conducting action research, I believe that modest beginnings are no disgrace and are in most respects preferable to more ambitious ones. The visibility and impact of early efforts may be small, but it is advisable to consider carefully the relative merits of simple versus more intricate research plans and data analysis procedures. It is likely that by adopting the strategies of a methodological miser, there is more to be gained than lost. In the conduct of action research, just as in the interpretation of its results, the law of parsimony is recommended. Modest beginnings can serve to build step-by-step an action research tradition of dealing with real problems that already have a natural and interested audience.
By selecting and pursuing questions that focus on the immediate and imperative problems of the classroom and the school, teacher action research can attract the greatest attention at the most opportune time (when there is something substantial to report), for the best reason (because some progress has been made, either in terms of increased understanding or approaches to dealing with a problem), and probably for the appropriate audience (those who have a preexisting interest and investment in the problem and its solution). A mounting record of visible accomplishment is an excellent way to dispel the initial anxiety teachers may experience in undertaking action research.
FINDING CRITICAL FRIENDS
As a member of a collaborative action research team, whether pursuing an individual research study or a team study, it is important to engage colleagues in a process of collaborative inquiry to advance the developing research effort. Particular colleagues may be enlisted at the beginning of the research for a variety of reasons—because they are especially sensitive to emerging problems, or are creative and have ideas about how educational issues might be addressed, or are skilled in problem definition, or are greatly interested in a particular issue.
Whatever the reason, it is extremely helpful to have a circle of "critical friends" who will work with you to help define the research problem, formulate the questions, collect and analyze the data, and discuss the data and outcomes of the study (Bambino, 2002; Cushman, 1998). To facilitate critical collegiality, it is helpful to consider the norms developed by the Bay Area Coalition of Essential Schools, which are paraphrased here:
• In collaborating with a group of critical friends, you and the members of the group describe only what you see; you don’t try to describe what you don’t see; you learn to express what you don’t see in the form of questions.
• Together, you resist the urge to work on solutions until you are comfortable with what the data say and don’t say.
• The perspectives and experiences of each member of the group are brought to the analysis.
• Everyone seeks to understand differences of perception before trying to resolve them, recognizing that early consensus can inhibit depth and breadth of analysis.
• In this critical process, members raise questions with each other when they don’t understand ideas or what the data are saying.
• Members surface assumptions and use data to challenge them, actively