It is also important that the moderator or interviewer of the group be familiar with group processes and with the range of possible roles as moderator (Barbour, 2008; Hennink, 2014; Krueger & Casey, 2015; Stewart & Shamdasani, 2015).
Finally, “focus groups work best for topics people could talk about to each other in their everyday lives—but don't” (Macnaghten & Myers, 2004, p. 65). Obviously, a focus group is a poor choice for topics that are sensitive, highly personal, and culturally inappropriate to talk about in the presence of strangers. Of course, it's not always obvious ahead of time how appropriate a topic might be. Crowe (2003) reports successful use of focus groups to create culturally appropriate HIV prevention material for the deaf community. Jowett and O'Toole (2006) report an interesting analysis of two focus groups—one of mature students and their attitude toward participation in higher education, and one of young women and their views of feminism. They found that the mature students' focus group was a failure but the young women's group was a success. The authors had not anticipated “how ingrained the sense of inadequacy is for some people who have felt excluded from education” (p. 462), nor how the power imbalance among members of the mature students' group and between the researcher and the group inhibited participation. Finally, Stewart and Williams (2005) explore the practical and ethical issues of conducting synchronous and asynchronous online focus groups.
Thus, as with any other data collection method, focus groups are appropriate when they are the best way to obtain the data that address your research question. And as with any other method, the advantages need to be weighed against the disadvantages, and the researcher needs to develop the skills necessary for using the technique.
Online Interviews

There is no question that the Internet has changed the world. It has also multiplied the ways one can collect qualitative data online, through various information communication technologies (ICTs) and computer-mediated communication (CMC) tools (Salmons, 2015). Qualitative data are collected from or through email, blogs, online discussion groups, Skype, tweets, texts, and various forms of social media. Here we discuss issues in conducting online interviews.
One can conduct online interviews synchronously (in real time) through various CMC tools such as Skype or Adobe Connect. These are typically verbal interviews with a video component that are more like face-to-face interviews; one can also conduct voice-to-voice real-time interviews over the telephone. One can also conduct interviews asynchronously (where there is a lag time) over email or on an online discussion group; asynchronous interviews tend to be text-based or written interviews. There are strengths and weaknesses to both synchronous and asynchronous venues. As will be discussed in more detail later, in general it is helpful to build rapport with participants when conducting qualitative interviews. Rapport building can be slightly more challenging in text-only asynchronous venues (such as email), when visual cues are missing (James & Busher, 2012). Further, participants may not respond to email queries or not respond to certain questions over
email that they would likely answer in synchronous video or voice-to-voice formats. On the other hand, text-based interviews over email provide the researcher with a ready-made transcript, making it easy to document what was said, though the nonverbal cues and pauses in conversation are missing. Such an email “transcript” can save the researcher time and money in transcription costs.
Given the availability of various information and communications technology (ICT) tools for conducting online interviews in either synchronous or asynchronous formats, Salmons (2015), in her book on online interviews, presents a framework for considering what she refers to as “e-interview research” (p. 4). She invites the researcher to explore key questions in eight interrelated categories: (1) aligning the purpose of the research with the design; considering issues related to (2) choosing data collection methods and (3) one's position as a researcher; determining (4) the e-interview style, (5) the type of ICT tools to use, (6) sampling issues, (7) ethical issues, and (8) actually collecting the data. While qualitative researchers always need to consider similar issues in all qualitative studies, Salmons proposes questions and issues specifically related to the online environment.
There is a growing discussion of the availability of various ICT tools for conducting online interviews, many of which are reviewed by Salmons (2015) and others, mainly in the context of individual interviews. Tuttas (2015) focuses more specifically on lessons learned from using real-time audiovisual web conferencing technology to carry out focus group interviews with nurses from geographically dispersed locations in the United States. While she ultimately chose to use Adobe Connect, she considers the strengths and weaknesses of several web conferencing venues (Skype, ooVoo, GoToMeeting, and Adobe Connect) for her focus group interviews, which can provide some guidance on the available options.
Like any data collection method, conducting online interviews has its strengths and weaknesses. One of the obvious strengths is that the researcher is no longer constrained by geography in considering participants. A researcher could interview participants across the world, and could perhaps even conduct focus group interviews where all parties can see each other. Another strength is that many CMC venues allow video recordings to be made, which can be helpful if one wants to explore or review nonverbal cues later. Some obvious weaknesses are that not everyone has access to various CMC tools or the knowledge of how to use them. Further, technology is always subject to breakdowns. There can be problems with audio recording equipment, as voices sometimes break up on cell phones or over Skype or other computer mediated venues, which can cause frustration for both the interviewer and interviewee. Finally, there is always the chance of confidentiality being compromised when one uses CMC tools over the Internet. While this may be unlikely, it is always a consideration for researchers and institutional review boards in dealing with ethical issues in doing research. In sum, all of the strengths and weaknesses of CMC tools in relationship to qualitative interviewing need to be considered when undertaking a qualitative research study.
Asking Good Questions

The key to getting good data from interviewing is to ask good questions; asking good questions takes practice. Pilot interviews are crucial for trying out your questions. Not only do you get some practice in interviewing, but you also quickly learn which questions are
confusing and need rewording, which questions yield useless data, and which questions, suggested by your respondents, you should have thought to include in the first place.
Different types of questions will yield different information. The questions you ask depend upon the focus of your study. Using the example of mentoring in the career development of master teachers, if you wanted to know the role mentoring played in career development, you would ask questions about teachers' personal experience with mentoring and probably get a descriptive history. Follow-up questions about how they felt about a certain mentoring experience would elicit information that is more affective in nature. You might also want to know their opinion as to how much influence mentoring generally has in a teacher's career.
The way in which questions are worded is a crucial consideration in extracting the type of information desired. An obvious place to begin is by making certain that what is being asked is clear to the person being interviewed. Questions need to be couched in familiar language. “Using words that make sense to the interviewee, words that reflect the respondent's world view, will improve the quality of data obtained during the interview. Without sensitivity to the impact of particular words on the person being interviewed, the answer may make no sense at all—or there may be no answer” (Patton, 2015, p. 454). Avoiding technical jargon and terms and concepts from your particular disciplinary orientation is a good place to begin. In a study of HIV-positive young adults, for example, participants were asked how they made sense of or came to terms with their diagnosis, not how they constructed meaning in the process of perspective transformation (the theoretical framework of the study) (Courtenay, Merriam, & Reeves, 1998).
Types of Questions, Good Questions, and Questions to Avoid

An interviewer can ask several types of questions to stimulate responses from an interviewee. Patton (2015) suggests six types of questions:
1. Experience and behavior questions —This type of question gets at the things a person does or did, his or her behaviors, actions, and activities. For example, in a study of leadership exhibited by administrators, one could ask, “Tell me about a typical day at work; what are you likely to do first thing in the morning?”
2. Opinion and values questions —Here the researcher is interested in a person's beliefs or opinions, what he or she thinks about something. Following the preceding example of a study of administrators and leadership, one could ask, “What is your opinion as to whether administrators should also be leaders?”
3. Feeling questions —These questions “tap the affective dimension of human life. In asking feeling questions—‘how do you feel about that?’—the interviewer is looking for adjective responses: anxious, happy, afraid, intimidated, confident, and so on” (p. 444).
4. Knowledge questions —These questions elicit a participant's actual factual knowledge about a situation.
5. Sensory questions —These are similar to experience and behavior questions but try to elicit more specific data about what is or was seen, heard, touched, and so forth.
6. Background/demographic questions —All interviews contain questions that refer to the particular demographics (age, income, education, number of years on the job, and so
on) of the person being interviewed as relevant to the research study. For example, the age of the respondent may or may not be relevant.
Interestingly, Patton (2015) recommends against asking “why” questions because they tend to lead to speculation about causal relationships and they can lead to dead-end responses. Patton recounts an amusing interview with a child in a study of open classrooms. When a first grader responded that her “favorite time in school” was recess, Patton asked her why she liked recess. Her answer was because she could go outside and play on the swings. When he asked, “why” she went outside, the child responded, “Because that's where the swings are!” (p. 456). Although “why” questions can put an end to a line of questioning, it has been our experience that an occasional “why” question can uncover insights that might be speculative but that might also suggest a new line of questioning.
Another typology of different types of questions that we have found particularly useful in eliciting information, especially from reticent interviewees, is Strauss, Schatzman, Bucher, and Sabshin's (1981) four major categories of questions: hypothetical, devil's advocate, ideal position, and interpretive questions. Each is defined in Table 5.2 and illustrated with examples from a case study of displaced workers participating in a Job Training Partnership Act (JTPA) program.
TABLE 5.2
Four Types of Questions with Examples from a JTPA Training Program Case Study.

1. Hypothetical questions—Ask what the respondent might do, or what it might be like in a particular situation; these usually begin with "what if" or "suppose."
Example: Suppose it were my first day in this training program. What would it be like?

2. Devil's advocate questions—The respondent is challenged to consider an opposing view or explanation to a situation.
Example: Some people would say that employees who lost their job did something to bring about being fired. What would you tell them?

3. Ideal position questions—Ask the respondent to describe an ideal situation.
Example: Would you describe what you think the ideal training program would be like?

4. Interpretive questions—The researcher advances tentative explanations or interpretations of what the respondent has been saying and asks for a reaction.
Example: Are you finding returning to school as an adult a different experience from what you expected?
Hypothetical questions ask respondents to speculate as to what something might be like or what someone might do in a particular situation. Hypothetical questions begin with “What if” or “Suppose.” Responses are usually descriptions of the person's actual experience. In the JTPA study, for example, the hypothetical question, “Suppose it were my first day in this training program—what would it be like?” elicited descriptions of what it was actually like for the participants.
Devil's advocate questions are particularly good to use when the topic is controversial and you want respondents' opinions and feelings. This type of question also avoids embarrassing or antagonizing respondents if they happen to be sensitive about the issue. The wording begins, “Some people would say,” which in effect depersonalizes the issue. The response, however, is almost always the respondent's personal opinion or feeling about the matter. In the JTPA example, the question, “Some people would say that employees who lost their job did something to bring it about. What would you say to them?” usually revealed how the respondent came to be unemployed and thus involved in the training program.
Ideal position questions elicit both information and opinion; these can be used with virtually any phenomenon under study. They are good to use in evaluation studies because they reveal both the positives and the negatives or shortcomings of a program. Asking what the ideal training program would be like in the JTPA example revealed things participants liked and would not want changed, as well as things that could have made it a better program.
Interpretive questions provide a check on what you think you are understanding, as well as offer an opportunity for yet more information, opinions, and feelings to be revealed. In the JTPA example, the interpretive question, “Would you say that returning to school as an adult is different from what you expected?” allowed the investigator to confirm the tentative interpretation of what had been said in the interview.
Overall, good interview questions are those that are open-ended and yield descriptive data, even stories about the phenomenon. The more detailed and descriptive the data, the better. The following questions work well to yield this type of data:
Tell me about a time when…
Give me an example of…
Tell me more about that…
What was it like for you when…
Some types of questions should be avoided in an interview. Table 5.3 outlines three types of questions to avoid and illustrates each from the JTPA study. First, avoid multiple questions—either one question that is actually a multiple question or a series of single questions that does not allow the respondent to answer one by one. An example of a multiple question is, “How do you feel about the instructors, the assignments, and the schedule of classes in the JTPA training program?” A series of questions might be, “What's it like going back to school as an adult? How do instructors respond to you? What kind of assignments do you have?” In both cases the respondent is likely to ask you to repeat the question(s), ask for clarification, or give a response covering only one part of the question—and that response may be uninterpretable. If, for example, an interviewee responded to the question, “How do you feel about the instructors, the assignments, and the schedule of classes?” with “They're OK—some I like, some I don't,” you would not know whether instructors or assignments or the schedule was being referred to.
TABLE 5.3
Questions to Avoid.

Multiple questions
Example: How do you feel about the instructors, the assignments, and the schedule of classes?

Leading questions
Example: What emotional problems have you had since losing your job?

Yes-or-no questions
Example: Do you like the program? Has returning to school been difficult?
Leading questions should also be avoided. Leading questions reveal a bias or an assumption that the researcher is making, which may not be held by the participant. These set the respondent up to accept the researcher's point of view. The question, “What emotional problems have you had since losing your job?” reflects an assumption that anyone losing a job will have emotional problems.
Finally, all researchers warn against asking yes-or-no questions. Any question that can be answered with a simple yes or no may in fact be answered just that way. Yes-or-no responses give you almost no information. For the reluctant, shy, or less verbal respondent, they offer an easy way out; they can also shut down or at least slow the flow of information from the interviewee. In the JTPA example, questions phrased in a yes-or-no manner, although at their core they are seeking good information, can yield nothing. Thus asking, "Do you like the program?" may be answered yes or no; rephrasing it to, "What do you like about the program?" necessitates more of a response. The same is true of the question "Has returning to school been difficult?" Asking, "How have you found the experience of returning to school?" mandates a fuller response.
A ruthless review of your questions to weed out poor ones before you actually conduct an interview is highly recommended. Ask the questions of yourself, challenging yourself to answer as minimally as possible. Also note whether you would feel uncomfortable honestly answering any of the questions. This review, followed by a pilot interview, will go a long way to ensure that you are asking good questions.
Probes

Probes are also questions or comments that follow up on something already asked. It is virtually impossible to specify these ahead of time because they are dependent on how the participant answers the lead question. This is where being the primary instrument of data collection has its advantages, especially if you are a highly sensitive instrument. You make adjustments in your interviewing as you go along. You sense that the respondent is onto something significant or that there is more to be learned. Probing can come in the form of asking for more details, for clarification, for examples. Glesne and Peshkin (1992) point out that "probes may take numerous forms; they range from silence, to sounds, to a single word, to complete sentences" (p. 85). Silence, "used judiciously…is a useful and easy probe—as is the bunched utterance, 'uh huh, uh huh,' sometimes combined with a nodding head. 'Yes, yes' is a good alternative; variety is useful" (p. 86, emphasis in
original). As with all questions, not just probes, the interviewer should avoid pressing too hard and too fast. After all, the participant is being interviewed, not interrogated.
Probes or follow-up questions—or as Seidman (2013) prefers to call it, “exploration”—can be as simple as seeking more information or clarity about what the person has just said. These are typically who, what, when, and where questions, such as Who else was there? What did you do then? When did this happen? or Where were you when this happened? Other probes seek more details or elaboration, such as What do you mean? Tell me more about that. Give me an example of that. “Walk” me through the experience. Would you explain that? and so on.
The following is a short excerpt (Weeks, n.d.) from an interview with a man in midlife who had been retained (held back a grade) in grammar school. The investigator was interested in how being retained had affected the person's life. Note the follow-up questions or probes used to garner a better understanding of his initial reaction to being retained.
Interviewer: How did you feel about yourself the second time you were in first grade?
Respondent: I really don't remember, but I think I didn't like it. It was probably embarrassing to me. I think I may have even had a hard time explaining it to my friends. I probably got teased. I was probably defensive about it. I may even have rebelled in some childlike way. I do know I got more aggressive at this point in my life. But I don't know if being retained had anything to do with it.
Interviewer: How did you feel about your new first grade teacher?
Respondent: She was nice. I was very quiet for a while, until I got to know her.
Interviewer: How did you feel about yourself during this second year?
Respondent: I have to look at it as a follow-up to a period when I was not successful. Strictly speaking, I was not very successful in the first grade—the first time.
Interviewer: Your voice sometimes changes when you talk about that.
Respondent: Well, I guess I'm still a little angry.
Interviewer: Do you feel the retention was justified?
Respondent: (long pause) I don't know how to answer that.
Interviewer: Do you want to think about it for a while?
Respondent: Well, I did not learn anything in the first grade the first time, but the lady was nice. She was my Mom's best friend. So she didn't teach me anything, and she made me repeat. I had to be retained, they said, because I did not learn the material, but (shaking his finger), I could have. I could have learned it well. I was smart.
The best way to increase your skill at probing is to practice. The more you interview, especially on the same topic, the more relaxed you become and the better you can pursue potentially fruitful lines of inquiry. Another good strategy is to scrutinize a verbatim transcript of one of your interviews. Look for places where you could have followed up but did not, and compare them with places where you got a lot of good data. The difference will most likely be from having maximized an opportunity to gain more information through gentle probing.
The Interview Guide

The interview guide, or schedule as it is sometimes called, is nothing more than a list of questions you intend to ask in an interview. Depending on how structured the interview will be, the guide may contain dozens of very specific questions listed in a particular order (highly structured) or a few topical areas jotted down in no particular order (unstructured) or something in between. As we noted earlier, most interviews in qualitative research are semistructured; thus the interview guide will probably contain several specific questions that you want to ask everyone, some more open-ended questions that could be followed up with probes, and perhaps a list of some areas, topics, and issues that you want to know more about but do not have enough information about at the outset of your study to form specific questions.
An investigator new to collecting data through interviews will feel more confident with a structured interview format in which most, if not all, questions are written out ahead of time in the interview guide. Working from an interview schedule allows the new researcher to gain the experience and confidence needed to conduct more open-ended questioning. Most researchers find that they are highly dependent upon the interview guide for the first few interviews but soon can unhook themselves from constant reference to the questions and go with the natural flow of the interview. At that point, an occasional check to see whether all areas or topics are being covered may be all that is needed.
New researchers are often concerned about the order of questions in an interview. No rules determine what should go first and what should come later. Much depends upon the study's objectives, the time allotted for the interview, the person being interviewed, and how sensitive some of the questions are. Factual, sociodemographic-type questions can be asked to get the interview started, but if there are a lot of these, or if some of them are sensitive (for example, if they ask about income, age, or sexual orientation), it might be better to ask them at the end of the interview. By then the respondent has become invested in the interview and is more likely to see it through by answering these questions.
Generally it is a good idea to ask for relatively neutral, descriptive information at the beginning of an interview. Respondents can be asked to provide basic descriptive information about the phenomenon of interest, be it a program, activity, or experience, or to chronicle their history with the phenomenon of interest. This information lays the foundation for questions that access the interviewee's perceptions, opinions, values, emotions, and so on.
Of course, it is not always possible to separate factual information from more subjective, value-laden responses. And again, the best way to tell whether the order of your questions works or not is to try it out in a pilot interview.
In summary, then, questions are at the heart of interviewing, and to collect meaningful data a researcher must ask good questions. In our years of experience doing and supervising qualitative research, we have found that the fewer and more open-ended your questions are, the better. Having fewer, broader questions unhooks you from the interview guide and enables you to really listen to what your participant has to share, which in turn enables you to better follow avenues of inquiry that will yield potentially rich contributions. Exhibit 5.1 is an interview guide for a study of how older adults become self-directed in their health care (Valente, 2005). These open-ended questions, followed up by the skillful use of probes, yielded substantive information about the topic.
Exhibit 5.1. Interview Guide.
1. I understand that you are concerned about your health. Tell me about your health.
2. What motivated you to learn about your health?
3. Tell me, in detail, about the kinds of things you have done to learn more about your health. (What did you do first?)
4. Where do you find information about your health?
5. Tell me about a time when something you learned had a positive impact on your health care.
6. What kinds of things have you changed in your life because of your learning?
7. Whom do you talk to about your health?
8. Tell me about your current interactions with your health care provider.
9. Tell me about what you do to keep track of your health.
10. What other things do you do to manage your health?
11. What kinds of challenges (barriers) do you experience when managing your health care?
12. What else would you like to share about your health-related learning?

Source: Valente (2005). Reprinted with permission.
Beginning the Interview

Collecting data through interviews involves, first of all, determining whom to interview. That depends on what the investigator wants to know and from whose perspective the information is desired. Selecting respondents on the basis of what they can contribute to the researcher's understanding of the phenomenon under study means engaging in purposive or theoretical sampling (discussed in Chapter Four). In a qualitative case study of a community school program, for example, a holistic picture of the program would involve the experiences and perceptions of people having different associations with the program—administrators, teachers, students, community residents. Unlike survey research, in which the number and representativeness of the sample are major considerations, in this type of research the crucial factor is not the number of respondents but the potential of each person to contribute to the development of insight and understanding of the phenomenon.
How can such people be identified? One way is through initial on-site observation of the program, activity, or phenomenon under study. On-site observations often involve informal discussions with participants to discover those who should be interviewed in depth. A second means of locating contacts is to begin with a key person who is considered knowledgeable by others and then ask that person for referrals. Initial informants can be found through the investigator's own personal contacts, community and private organizations, advertisements on bulletin boards, or on the Internet. In some studies a preliminary interview is necessary to determine whether the person meets the criteria for participation in the study.