research.method/MehdiRahbar

American Studies


History of Qualitative Research

Qualitative research has a long and vibrant history in the social sciences, health sciences, and humanities, and its development has been heavily influenced by a variety of subdisciplines. Although the work of the Chicago School in America in the 1920s and 1930s highlighted the central role of qualitative research in social research, a range of other disciplines was also responsible for the rise and continued development of qualitative approaches, including history, medicine, nursing, social work, and communications.

Although some historical accounts have taken as their starting point the development of qualitative research at the beginning of the 20th century, for example, Norman K. Denzin and Yvonna S. Lincoln’s “Seven Moments of Qualitative Research,” other accounts begin their analysis with the development of qualitative approaches in the 17th century.

The beginnings of qualitative research, according to Vidich and Lyman, are located in the work of early ethnographers during the 17th century. Qualitative research during this period involved Western researchers observing the customs, practices, and behaviors of “primitive” societies in order to understand the other. During this period, the other was often regarded as a non-White person living in a society considered less civilized than the society to which the observer belonged. Such interest in “primitive” peoples was heightened by the problems explorers had experienced during the 15th and 16th centuries when attempting to account for the people they encountered in the New World.

Qualitative research during this second phase (17th to the 19th century) was regarded in terms of colonial ethnography. During this period, ethnographic descriptions and analyses, written by Western explorers, missionaries, and colonial administrators, were deposited in church archives and/or local and national archives. Many of these early writings sought to civilize the world.

During the next phase (late 19th to early 20th century), American ethnographers focused on American Indians, who were still regarded as primitive and as representing a specific other. These others were researched to shed light on prehistoric times. This period also saw a shift in the perspective of ethnographers, from ethnographies written by missionaries to those written exclusively by anthropologists, for example, those writing after the creation of the ethnology section of the Smithsonian Institution or for the Bureau of Indian Affairs (BIA).

During the early 20th century and up to the 1960s, the religious beliefs, practices, and customs of the Black, Asian, and European immigrants who had arrived on American soil during the early days of industrialization were a source of worry for White American citizens concerned about the future development of American Protestant society.

Initial efforts to preach a social gospel in the settlement houses were hindered by the sheer number of new urban inhabitants. In order to deal with these increasing numbers and to identify how many belonged to each denomination, nationality, and race, statistical surveys were implemented. The desire to incorporate immigrant groups into existing Protestant communities resulted in the first qualitative community analysis, W. E. B. Du Bois's The Philadelphia Negro.

The fifth of Denzin and Lincoln's moments, the postmodern period (1990–1995), attempted to address the crises that had characterized the previous period. Innovative approaches to ethnographic writing were introduced, and the notion of the distant observer was eroded. Situation-specific and localized theories replaced grand theories and narratives.

The trends of the postmodern period continue in the postexperimental period (1995–2000) through the use of poetry, drama, and multimedia techniques in ethnographic writing. New researchers across a number of disciplines are continuing this more reflexive and interpretive approach to qualitative research.

 

  • Mehdi Rahbar

Postcolonialism

Postcolonialism is a broad theoretical approach that examines the past and present impact of colonialism and racism on social, political, and economic systems. It focuses on the ways in which particular groups of people, because of notions of race or ethnicity, have been excluded, marginalized, and represented in ways that devalued or even dehumanized them.

A number of major postcolonial theorists have had a huge impact on the way the field and its key concepts developed as an intellectual discipline: Frantz Fanon, whose groundbreaking work emphasized the effects of colonialism on the psyche; Edward Said, who developed the notion of “Orientalism”; Gayatri Spivak, whose work on the “subaltern” has been enormously influential; and Homi Bhabha, who has emphasized the value of psychoanalytical concepts such as ambivalence and hybridity in the study of colonialism.

More recently, however, the field of postcolonial studies has been characterized by a commitment to unpacking the complex connections between “race,” ethnicity, gender, sexuality, and many other forms of social stratification. These works tend to move beyond an additive model of identity, instead examining the specific ways in which various forms of inequality intersect in particular discourses and in particular historical locales.

In postcolonial studies, colonialism is not conceived simply in terms of military and economic expansion. It has important social, cultural, and religious dimensions as well. For instance, the export of cricket to colonial outposts by the British is a classic example of the way sport can be an element of colonialism.

The effect of colonialism on the human psyche was the subject of a number of books by Fanon. Whether writing about his own experiences growing up in Martinique, examining the effect of racism on the choice of sexual partners by women of color, or discussing the effects of the Algerian war of independence, Fanon consistently emphasized the damaging effects of racism and colonialism on the self-image and psyche of both colonizers and colonized people. However, he did not believe that people of color were destined to experience the same dehumanization as previous generations. As Fanon comments in Black Skin, White Masks (1991, p. 230), “I am not the slave of the Slavery that dehumanized my ancestors.”

Postcolonialism (the study of the ways in which past and present societies are influenced by a history of colonialism) is a theoretical approach that is gaining in popularity as a result of the need to theorize cross-cultural contact in the context of colonialism and globalization. Although early postcolonial critics, such as Fanon, tended to characterize the power relations involved as those between colonizers and colonized subjects, later work has tended to move away from such simplistic binaries. This later work has emphasized the importance of studying hybridity, ambivalence, and areas of intense contact such as the borderlands in order to develop a more sophisticated and nuanced approach to postcolonialism.

The complex ways in which postcolonialism intersects with categories of gender, sexuality, and other forms of social stratification have also been a topic of growing scholarly interest in recent years. Qualitative research in this field tends to emphasize the situatedness of the researcher and of the nation-state being examined. Overgeneralization beyond one particular postcolonial context is particularly frowned upon in this area of study.

  • Mehdi Rahbar

Positivism

Positivism is the codeword for a package of philosophical ideas that most likely no one has ever accepted in its entirety. These ideas include a distrust of abstraction, a preference for observation unencumbered by too much theory, a commitment to the idea of a social science that is not vastly different from natural science, and a profound respect for quantification.

Positivism is the label for a series of claims rather than any single claim. Moreover, many of these claims are analytically separable and do not entail one another, so it is entirely possible to accept some and not the rest.

The term positivism was coined by Auguste Comte, but even for him it had several different connotations. It refers, in part, to a theory of history according to which every branch of knowledge passes through three stages (the theological, the metaphysical, and finally the positive state, in which explanations that appeal to unobservable entities are abandoned) and which asserts that improvements in knowledge are responsible for historical progress.

However, the most iconic version of positivism is associated with the Vienna Circle and the school of logical positivism that emerged from it along with an affiliated group in Berlin. The circle’s 1929 manifesto emphasizes two fundamental commitments: to empiricism (i.e., there is knowledge only from experience) and to logical analysis, by means of which philosophical problems and paradoxes would be resolved and the structure of scientific theory made clear. It is, of course, the second of these commitments that represents logical positivism’s distinctive contribution to the empiricist tradition.

The logical analysis component of positivism was based on developments in formal logic from the late 19th century onward. Instead of a system of generalizations about psychological processes, logic was now seen as a formal symbolic language, empty of any empirical content, that could be used to define precisely the conceptual relations between sentences. This development provided the logical positivists (or so they believed) with a means of translating theoretical sentences into sets of statements about experience and enabled them to organize the whole of scientific knowledge into an axiomatic system.

Like empiricism, then, positivism is a family of claims and concepts on which different authors have placed varying degrees of emphasis. It shares with empiricism a commitment to making experience the test of all knowledge and is skeptical about the idea of an unobservable reality that includes entities and forces not discoverable in experience, a skepticism that extends even to laws of nature. In its later forms, positivism adds to empiricism an enthusiasm for statistics—indeed, for quantification in general—and the assumption that if a statement is meaningful, then it can, by definition, be subject to scientific testing and verification (an assumption subsequently weakened or dropped).

It also attempts to translate what is known into formal languages, including mathematics, and to organize scientific theory into logical structures. However, if there is an overlap with empiricism, there is also common ground with American pragmatism, which had a similar preference for experience, verifiability, antirealism, and operationalism. This common ground largely explains why the logical positivists were accorded such a favorable reception in the United States following their flight from Nazi Europe in the 1930s.

In contemporary methodological writing, positivism is apparently dead, yet it still receives constant criticism; it is significant that the most influential examples of modern social theory, such as critical realism, constructivism, hermeneutics, and structuration theory, take a critique of positivism as their premise.

  • Mehdi Rahbar

Politics of Qualitative Research

All research is political insomuch as it comes out of a particular view of the world, makes claims about reality, and supports or refutes existing knowledge claims. Qualitative research, which generally makes no pretense of disinterested objectivity, has been more likely than quantitative research to be labeled as political. Qualitative researchers have risen to this challenge and engaged in a variety of discussions regarding the politics of their work.

Although some have characterized the qualitative–quantitative debate as a dialogue, it might be more accurate to describe it as a struggle on the part of qualitative researchers for legitimacy and place.

Qualitative researchers are concerned about what knowledge is being uncovered as a result of their work, and they are concerned with having their work taken seriously by their colleagues and by other policy actors. As a result, they have over the years sought to establish rigor in a variety of ways. Some have discussed the importance of the credibility of the results.

They have encouraged the use of strategies such as the constant comparative method, the use of a variety of data collection methods, and the return of transcripts and analyses to participants to verify conclusions. Others have talked about establishing the trustworthiness of the research by describing all aspects of the research process in sufficient detail.

Research is always and forever a political enterprise. Given the predominant societal perspective, qualitative research, positioned as it is alongside critical and questioning movements, will more often than not be characterized as political, whereas quantitative work within a positivist paradigm will be characterized as scientific and therefore neutral.

What distinguishes the current political debate surrounding qualitative research from previous debates is that it has involved people from outside the research community and that it is so clearly part of a larger movement against all manner of critique and dissent. Qualitative researchers are increasingly becoming activists within and outside of the academy, forming their own links and coalitions with other outsiders who continue to challenge the status quo.

 

  • Mehdi Rahbar

Peer Review

Peer review, also known as expert review, independent scientific review, or auditing, is a method used by administrators, funding officials, journal editors, and researchers to inform decision making and to improve the research process and its outcomes by engaging independent and qualified experts to provide critical and consultative evaluation of the merits of a research proposal, project, or product.

In colleges, universities, and independent research institutions, peer review is often a required internal gatekeeping process through which research proposals must successfully pass before investigators submit proposed research projects to ethics review boards or external funding sources.

These internal peer reviews are conducted by faculty members, researchers, and others with the expertise and knowledge to render decisions of quality and to offer improvements. Outcomes include approval for external submission, guidance for a revision and resubmission, or the project’s dismissal.

Peer review is often the preferred method for judging a proposal’s merits and rigor for research funding and for deciding how best to allocate scarce public or private resources. Whether conducted by an individual or by a group (sometimes called a panel review or review committee), peer review for governmental, private foundation, and trust funding focuses on the proposed project’s significance and methodological integrity.

Peer review is considered the highest and most rigorous form of editorial review in determining the publication merits of papers, chapters, and books. With scholarly or academic journals, editors and their boards of reviewers (referees) serve as the major gatekeepers for judging what texts are deemed to be of the highest quality and significance and therefore worthy of publication.

To ensure the greater independence of the peer reviewing process, some editors also combine peer review with what is called blind review, in which the referees do not know the identities of the authors and the other reviewers and the authors do not know the identities of the reviewers.

In qualitative projects, researchers may call upon peers with relevant methodological and content area expertise and experience to scrutinize and critique a study’s procedures and outcomes. This type of peer review, sometimes called investigator triangulation, provides researchers with an objective source familiar with the research or the phenomenon being explored to review the study’s methodology, to analyze portions of data, and to critique findings.

  • Mehdi Rahbar

NVivo Software

NVivo is a qualitative data analysis (QDA) computer software package produced by QSR International. It has been designed for qualitative researchers working with very rich text-based and/or multimedia information, where deep levels of analysis on small or large volumes of data are required.

NVivo is used predominantly by academic, government, health, and commercial researchers across a diverse range of fields, including social sciences such as anthropology, psychology, communication, and sociology, as well as fields such as forensics, tourism, criminology, and marketing.

The first NVivo software product was developed by Tom Richards in 1999. Originally called NUD*IST Vivo, it contained tools for fine, detailed analysis and qualitative modeling.

NVivo is intended to help users organize and analyze non-numerical or unstructured data. The software allows users to classify, sort and arrange information; examine relationships in the data; and combine analysis with linking, shaping, searching and modeling.

The researcher or analyst can test theories, identify trends and cross-examine information in a multitude of ways using its search engine and query functions.

NVivo accommodates a wide range of research methods, including network and organizational analysis, action or evidence-based research, discourse analysis, grounded theory, conversation analysis, ethnography, literature reviews, phenomenology, mixed methods research and the Framework methodology.

NVivo supports data formats such as audio files, videos, digital photos, Word, PDF, spreadsheets, rich text, plain text and web and social media data. Users can interchange data with applications like Microsoft Excel, Microsoft Word, IBM SPSS Statistics, EndNote, Microsoft OneNote, SurveyMonkey and Evernote; and order transcripts from within NVivo projects, using TranscribeMe.

The program uses a coding system that underpins the generation of relationships between elements in the data (a simplified, hypothetical sketch of such a coding structure follows the list below). In effect, it is a relational database that provides the researcher with the flexibility to

• test tentative theorizing about relationships within the data;

• discover and explore new relationships as data analysis unfolds;

• map relationships;

• track data analysis; and

• log and save search results.
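
NVivo itself is a graphical application, so the sketch below is not its data model or API. It is a minimal, hypothetical Python illustration of the general idea the list above describes: codes attached to passages of source documents, stored so that simple relationship queries, such as where two codes co-occur, can be run over them. All class names, methods, and file names here are invented for illustration only.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified stand-in for a qualitative coding structure.
# It is NOT NVivo's internal model or API; it only illustrates the idea of
# codes linked to segments of source documents, which can then be queried
# for relationships such as code co-occurrence.

@dataclass
class Segment:
    document: str   # source document name
    start: int      # character offset where the coded passage begins
    end: int        # character offset where it ends

@dataclass
class Code:
    name: str
    segments: list = field(default_factory=list)  # passages tagged with this code

class CodingProject:
    def __init__(self):
        self.codes = {}

    def code_segment(self, code_name, document, start, end):
        """Apply a code to a passage of a document."""
        code = self.codes.setdefault(code_name, Code(code_name))
        code.segments.append(Segment(document, start, end))

    def cooccurrence(self, code_a, code_b):
        """Return the documents in which both codes were applied --
        a crude analogue of a relationship or matrix coding query."""
        docs_a = {s.document for s in self.codes.get(code_a, Code(code_a)).segments}
        docs_b = {s.document for s in self.codes.get(code_b, Code(code_b)).segments}
        return docs_a & docs_b

# Usage: tag passages from two interview transcripts, then ask where
# the codes "trust" and "privacy" co-occur.
project = CodingProject()
project.code_segment("trust", "interview_01.txt", 120, 240)
project.code_segment("privacy", "interview_01.txt", 300, 410)
project.code_segment("trust", "interview_02.txt", 55, 140)
print(project.cooccurrence("trust", "privacy"))   # {'interview_01.txt'}
```

In NVivo the analogous operations are carried out through the application's coding and query tools rather than through code such as this.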

 

  • Mehdi Rahbar

Narrative Analysis

Narrative analysis refers to a family of analytic methods for interpreting texts that have in common a storied form. As in all families, there is conflict and disagreement among those holding different perspectives.

The term narrative is elusive; it carries many meanings and is used in a variety of ways by different scholars, often synonymously with story. In the familiar everyday form, a speaker connects events into a sequence that is consequential for later action and for the meanings listeners are supposed to take away from the story. Events perceived as important are selected, organized, connected, and evaluated as meaningful for a particular listener.

Narrative analysts ask the following questions: For whom was the story constructed and for what purpose? How is it composed? What cultural resources does it draw on or take for granted? What storehouse of cultural plots does it call up? What does the story accomplish? Are there gaps and inconsistencies that might suggest preferred, alternative, or counter narratives? There are many ways to narrate an experience: How a speaker, writer, or visual artist chooses to do it is significant, suggesting lines of inquiry that would be missed without focused attention or close reading. Some investigators in the social sciences attend to language, form, and social context (including audience) more than others do.

Elliot Mishler contrasts category-centered approaches in social research, which strip individuals of agency and consciousness, with case-based approaches that can restore agency in research and theory; individuals are respected as subjects with histories and intentions. The study of cases can generate categories or, to put it differently, theoretical generalization; the histories of the physical and social sciences are full of examples where theoretical propositions were derived from close study of individual instances. Narrative analysis joins this long tradition of case-centered inquiry, interrogating stories developed in interviews and fieldwork and in archival documents and visual media.

 

 

  • Mehdi Rahbar

Mixed Methods Research

Mixed methods research is defined as research in which the inquirer or investigator collects and analyzes data, integrates the findings, and draws inferences using both qualitative and quantitative approaches or methods in a single study or program of study.

Regardless of the form of mixing, the reasons for mixing the methods in a study need to be clearly identified by researchers. One reason for using mixed methods research is that the use of both qualitative and quantitative approaches will provide a more complete understanding of the research problem than either approach alone.

Another reason for mixing is to follow up on initial exploratory findings. This reason applies when the researcher seeks to explore qualitatively first and then to test this exploration with a large quantitative sample of a population.

A final reason for using mixed methods research is to enhance a larger data set with a smaller, more focused data set. For example, an investigator might conduct an experiment and within that experiment collect qualitative data that provides information as to how the participants experienced the intervention.

Researchers are concerned about whether qualitative research has been relegated to secondary status in mixed methods experiments that include a small, embedded qualitative component. They are also concerned about integrating incompatible views of reality when researchers combine postpositivist views of a single reality with constructionist views of multiple realities. Some individuals are concerned about the dominance of certain voices in the discussion about mixed methods and whether the discourse is open and accessible to all writers. Others focus on issues of confidentiality in using the same participants in both phases of a sequential two-phase project.

More discussion is needed about the adaptation and acceptance of mixed methods in various social and health science fields. Continued work needs to be done to better understand the procedures of sampling, the ways of merging quantitative and qualitative data, the suitability of current software programs for aiding the mixed methods researcher, how individuals on research teams can effectively coordinate their expertise in quantitative and qualitative research, how to bridge the emerging division between philosophical approaches and method approaches, and the challenge beginning researchers face in understanding three approaches to inquiry: quantitative, qualitative, and mixed. Despite these challenges, the mixed methods movement continues to advance, and its growth is reflected in an enhanced understanding of the approach as described in journals, in books, and at national and international conferences.

  • Mehdi Rahbar

Meta-Analysis

Meta-analysis is generally defined as the analysis of analyses. The term is generally associated with quantitative methodologies, but it does have qualitative analogs. The technique is distinctly different from secondary analysis, in which the original data from a study are reanalyzed.

Quantitative meta-analysis statistically reviews a collection of analyses from related individual studies in order to provide a summary or integration of their results. The core of this review is the calculation of an effect size, which can be based on the difference between two group means divided by their pooled standard deviation or on a correlation between two variables.
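
As a minimal illustration of the between-groups form just described, the sketch below computes a standardized mean difference (Cohen's d): the difference between two group means divided by their pooled standard deviation. The summary statistics in the example are invented, not taken from any actual study.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference: the difference between two group means
    divided by their pooled standard deviation (one common effect size)."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Example with made-up summary statistics from one hypothetical study's two groups:
print(round(cohens_d(mean1=5.2, sd1=1.1, n1=30, mean2=4.6, sd2=1.3, n2=28), 2))  # 0.5
```

Effect sizes of this kind, one per study, are what the meta-analysis then weights and combines into an overall estimate.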

Qualitative meta-analysis also involves the synthesis of evidence from primary studies, but there are numerous forms of synthesis with different goals, though most are interpretive techniques.

Meta-ethnography comprises three techniques for synthesizing qualitative studies: reciprocal translation analysis, the identification of key metaphors or themes across studies; refutational synthesis, in which key metaphors or themes are identified and contradictions between studies are examined; and lines-of-argument synthesis, in which a general interpretation is developed from the observations in the separate studies.

Meta-theory, part of the meta-study group of techniques along with meta-method and meta-data analysis, is a critical analysis of specific theoretical frameworks. Meta-method is an analysis of the methodologies utilized and of how they affect specific research areas. Meta-data analysis is a synthesis of the data presented in articles and reports.

 

  • Mehdi Rahbar

Interviewing

Interviewing is a conversational practice where knowledge is produced through the interaction between an interviewer and an interviewee or a group of interviewees. In most cases, research interviewing involves a “one-way dialogue” with the researcher asking questions and the interviewee being cast in the role of respondent.

Many different forms of interviewing exist. Interviews can be conducted formally, in surveys, through the Internet, over the telephone, or in face-to-face interaction, and they can be conducted informally, for example, as part of ethnographic fieldwork. Research interviews can be more or less structured. In survey research interviewing, standardized questions are posed and the answers are given in forms that are amenable to quantitative procedures.

The interview itself is carried out to enable the researcher to answer one or more of his or her research questions. These are formulated in advance when the researcher thematizes and designs the study. Before deciding to carry out the interview, the researcher should always consider whether interviewing is in fact the most adequate way in which to answer the questions that interest the researcher.

Usually, the interviewer has prepared an interview guide in which the research questions are given a form that renders them suitable to be posed directly as interview questions. Good questions are typically brief, simple, and open, and often the researcher will be interested in concrete descriptions of the respondent’s experiences rather than more abstract reflections.

It is the transcription rather than the original oral interview conversation that serves as the researcher’s primary data source when he or she interprets and analyzes the interview. Transcribing interviews is an interpretive process that demands prolonged practice and sensitivity to the many differences between oral speech and written texts, and the disembodied and decontextualized nature of texts should be kept in mind during the later processes of analysis.

Qualitative interviews also became part of industrial research aimed at maximizing workers’ effectiveness, and since the 1950s, commercial and market interviews, especially in the form of focus groups, have been a growth industry.

Important epistemological discussions concerning the objectivity, validity, reliability, and generalizability of the knowledge produced through interviewing continue in current debates. One aspect of this discussion that is integral to the practitioners of interviewing concerns the issue of whether interviews can provide a more or less direct pipeline to participants’ life worlds, provided that the interviewer engages in nondirectional, unbiased questioning.

  • Mehdi Rahbar