QUALITATIVE RESEARCH DESIGN

 

(Based on Qualitative Research Design: An Interactive Approach, 2nd edition, by Joseph A. Maxwell, Sage Publications, 2005).

 

 

Qualitative research

 

·        Inductive approach

 

·        Focus on specific situations or people

 

·        Emphasis on words rather than numbers

 

 

 

Research design: the arrangement of elements governing the functioning of a study.

 

Interactive model of research design:

 

The underlying structure and interconnection of the components of the study and the implications of each component for the others.

 

 

5 components: (i) purpose; (ii) conceptual context; (iii) research questions; (iv) methods; and (v) validity.

 

PURPOSES

 

What are the ultimate goals of this study? What issues is it intended to illuminate, and what practices will it influence? Why do you want to conduct it, and why should we care about the results? Why is the study worth doing?

 

Find an unanswered, empirically answerable question to which the answer is worth knowing.

 

Two functions:

 

·        They help guide your other design decisions, ensuring that your study is worth doing.

 

·        They are crucial to justifying your study.

 

Three kinds of purposes:

 

·        Personal purposes

 

·        Practical purposes: accomplishing something.

 

·        Research purposes: understanding something, gaining some insight into what is going on and why this is happening.

 

o   They need to be empirically answerable by your study. You need to frame your research questions in ways that help your study to advance your purposes rather than smuggling these purposes into the research questions themselves.

 

 

5 types of research purposes for which qualitative research studies are specifically suited:

 

·        Understanding the meaning of events, situations, actions, and accounts of lives and experiences.

 

·        Understanding the context within which participants act, and the influence that this context has on their actions.

 

·        Identifying unanticipated phenomena and influences and generating new grounded theories.

 

·        Understanding the process by which events and actions take place.

 

·        Developing causal explanations.

 

 

CONCEPTUAL CONTEXT (THEORETICAL FRAMEWORK)

 

What do you think is going on with the phenomena you plan to study? What theories, findings, and conceptual frameworks relating to these phenomena will guide or inform your study, and what literature, preliminary research, and personal experience will you draw on?

 

The system of concepts, assumptions, beliefs, and theories that supports and informs your research. It explains the main things to be studied and the presumed relationships among them.

 

It is a formulation of what you think is going on with the phenomena you are studying: a tentative theory of what is happening and why.

 

It helps you develop and select realistic and relevant research questions and methods, and identify potential validity threats to your conclusions.

 

This component of the design contains the theory that you already have or are developing about the setting or issues that you are studying.

 

There are four main sources to construct the theoretical framework (conceptual context):

 

(i) your own experience: experiential data, researchers’ technical knowledge, research background, and personal experiences;

(ii) existing theory and research;

(iii) the results of any pilot studies or preliminary research that you have done to test your ideas or methods and explore their implications, or to inductively develop grounded theory;

(iv) thought experiments: speculation and what-if questions.

 

 

 

RESEARCH QUESTIONS

 

What, specifically, do you want to understand by doing this study? What do you not know about the phenomena you are studying that you want to learn? What questions will your research attempt to answer, and how are these questions related to one another?

 

You need to do a significant part of the research before it is clear what specific questions you should try to answer. Specific questions are generally the result of an interactive design process, rather than being the starting point for that process.

 

Functions of research questions

 

 

In a research proposal: to explain specifically what your study will attempt to learn or understand.

 

In research design, two other functions: (i) to help you focus the study (relationship to purposes and conceptual context); and (ii) to give you guidance on how to conduct it (relationship to methods and validity).

 

Hypotheses are generally formulated after the researcher has begun the study; they are grounded in the data and are developed and tested in interaction with it, rather than being prior ideas that are simply tested against the data, as in quantitative research.

 

Proposition: you may state your ideas about what is going on as part of the process of theorizing and data analysis.

 

You need to treat hypotheses critically, continually asking yourself what alternative ways there are of making sense of your data.

 

Generalizing questions: stated in broad, general terms.

Particularizing questions: stated in narrow, particular terms.

 

Instrumentalists: formulate questions in terms of observable or measurable data, out of concern for potential validity threats.

Realists: treat data as fallible evidence about the phenomena, to be used critically to develop or test ideas about the existence and nature of those phenomena.

 

Variance questions: they focus on difference and correlation, e.g., does, how much, to what extent, is there.

 

Process questions: they focus on how things happen, rather than whether there is a particular relationship or how much it is explained by other variables.

 

Types of understanding in qualitative research

 

·        Description: what happened in terms of observable behaviour or events.

·        Interpretation: the meaning of these things for the people involved (their thoughts, feelings, and interpretations).

·        Theory: about how these things happen and how they can be explained.

 

·        Generalization: focus on the generality or wider prevalence of the phenomena studied (not appropriate for qualitative research).

·        Evaluation: how such phenomena should be evaluated (not appropriate for qualitative research).

 

METHODS

 

What will you actually do in conducting this study? What approaches and techniques will you use to collect and analyze your data, and how do these constitute an integrated strategy?

 

It includes:

 

·        Your research relationship with the people you study.

 

·        Your site selection and sampling decisions: you can’t study everyone everywhere doing everything, even in single cases. Purposeful sampling: a strategy in which particular settings, persons, or events are deliberately selected in order to provide important information that cannot be obtained as well from other choices.

 

·        Your data collection methods:

 

o   The relationship between research questions and data collection methods. There is no way to logically or mechanically convert research questions into methods; the methods are the means of answering the research questions. Your research questions formulate what you want to understand; your interview questions are what you ask people in order to gain that understanding.

o   Triangulation of data collection methods. Collecting data using a variety of sources and methods.

 

 

·        The data analysis techniques. Data analysis is part of the design. The initial step is reading the interview transcripts, observational notes, and documents. The main strategies are: (i) memos; (ii) categorizing strategies, such as coding and thematic analysis; and (iii) connecting strategies, such as narrative analysis. Coding, the most important strategy, fractures the data and rearranges them into categories that facilitate comparison between things in the same category and that aid in the development of theoretical concepts, or categorizes the data into broader themes and issues.

 

 

VALIDITY

 

 

How might you be wrong? What are the plausible alternative explanations and validity threats to the potential conclusions of your study, and how will you deal with these? How do the data that you have, or that you could collect, support or challenge your ideas about what is going on? Why should we believe your results?

 

It depends on the relationship of the conclusions to reality. No method can guarantee validity. What is needed is the possibility of testing your conclusions, of giving the phenomena that you are studying the chance to prove you wrong.

 

Validity is a component of the research design and consists of the strategies you use to identify and try to rule out alternative explanations, i.e., validity threats. You therefore need to identify specific validity threats and consider which strategies are best suited to dealing with them.

 

Validity checklist

 

·       Intensive, long-term involvement.

·       Rich data: data that are detailed and varied

·       Respondent validation: participants’ feedback

·       Intervention: informal manipulations

·       Search for discrepant evidence and negative cases

·       Triangulation: collecting information from a diverse range of individuals and settings, and using a variety of methods.

·       Quasi-statistics

·       Comparison

 

RESEARCH PROPOSALS

 

The purpose of a proposal is to explain and justify your proposed study to an audience of nonexperts on your topic.

 

A proposal is an argument for your study. It needs to explain the logic behind the proposed research, rather than simply describe or summarize the study, and to do so in a way that nonspecialists will understand.

 

 

A model for proposal structure

 

·       Abstract: an overview and roadmap of the study itself and the argument of your proposal.

·       Introduction: explain what you want to do and why. It should clearly present the goals of your study and the problems it addresses, and give an overview of your main research questions and of the kind of study you are proposing. It should also explain the structure of the proposal itself.

·       Conceptual framework (literature review): (i) show how your proposed research fits into what is already known, i.e., its relationship to existing theory and research; and (ii) explain the theoretical framework that informs your study. Don’t simply summarize prior theory and research; ground your proposed study in the relevant previous work, and give the reader a clear sense of your theoretical approach to the phenomena that you propose to study. Any pilot studies that you have done must be discussed in the proposal, explaining their implications for your research; this can be done at the end of the conceptual framework, in a separate section after it, or after the presentation of your research questions.

·       Research questions: (i) state your questions; (ii) clarify how they relate to prior research and theory, to your own experience and exploratory research, and to your goals; and (iii) show how they form a coherent whole, rather than a random collection of queries about your topic.

·       Research methods: include a description of the setting or social context of your study, and cover (i) the research design in the typological sense; (ii) the research relationship you establish with those you are studying; (iii) site and participant selection; (iv) data collection, i.e., how you will get the information you need to answer your research questions; and (v) data analysis. Ethical issues also need to be discussed here or in a separate section.

·       Validity: how you will use different methods to address a single validity threat, or how a particular validity issue will be dealt with through selection, data collection, and analysis decisions. You must allow for the examination of competing explanations and discrepant data, showing that your research is not a self-fulfilling prophecy.

·       Preliminary results: discuss what you have learned so far about the practicality of your methods or tentative answers to your research questions. This discussion is valuable in justifying the feasibility of your study and clarifying your methods.

·       Conclusion: You need to pull together what you have said in the previous sections, remind your readers of the goals of your study and what it will contribute, and discuss its potential relevance and implications for the broader field(s) in which it is situated. This section should answer any “So what?” question.

·       References: only the references actually cited.

·       Appendixes: (i) a timetable for the research; (ii) letters of introduction or permission; (iii) questionnaires, interview guides, or other instruments; (iv) a list of possible interviewees; (v) a schedule of observations; (vi) descriptions of analysis techniques or software; (vii) a table of relationships among questions, methods, data, and analysis strategies; and (viii) examples of observation notes or interview transcripts from pilot studies or completed parts of the study.