Media Discourse Analysis:
Approaches to Analyzing Media Texts
Nada Mrabet
ISLT
Abstract
This paper discusses possible approaches to analyzing media texts. It intends to cover some of the most important and most developed methods of media discourse analysis, starting with early quantitative content analysis, originally developed by sociologists, social scientists and communication researchers. Critical analysts like Fairclough and van Dijk then came to prove these quantitative methods insufficient. After that, my paper will discuss the approach of other researchers like Schroder, who found gaps in the concepts concerned with the production/consumption processes and suggests, along with van Dijk, an empirical, ethnographic approach to media texts to fill in those gaps.
Keywords: CDA, ethnography, encoding, decoding.
Introduction
Given the importance of media discourse, one of the four main registers of the English language (O'Keeffe, 2006), my paper will cover some of the key approaches, methods and tools of media discourse analysis that analysts can adopt to analyze either small-scale or large-scale corpora. Quantitative content analysis was adopted first, in order to carry out objective observations and interpretations, and many software tools were brought to the table to serve quantitative and statistical needs. However, these quantitative tools were later proved inadequate, which smoothed the path for critical analysts to introduce Critical Discourse Analysis (CDA) to the world of media text analysis. Finally, I will focus on the importance of ethnography in media discourse analysis, an approach that is still developing.
Quantitative Content Analysis of Media Texts
Quantitative Content Analysis and Mass Media Research
Quantitative content analysis first emerged in the 1950s as a major research tool for the analysis of media texts in mass communication studies and the social sciences. Lasswell (1948) describes media content analysis as 'who says what, through which channel, to whom, with what effect.' Quantitative research techniques are used to produce 'objective, systematic and quantitative' descriptions of the manifest content of media texts. On this view, quantitative content analysis is the most scientific and unbiased method that can be used for the analysis of media content.
Mass communication researchers have contributed a great deal to the analysis of media content. Their findings give clear definitions of the content analysis of communication events and provide clear outlines to follow, not only for objective interpretation but also for the gathering of media content samples. Neuendorf (2002) suggests seven elements intended to ensure that a scientific quantitative content analysis of media texts is not undermined by the researchers' subjective orientations: objectivity-intersubjectivity, a priori design, reliability, validity, generalizability, replicability, and hypothesis testing. Berelson (1952) suggests five elements of content analysis that every researcher should focus on: the substance of message content, the form of message content, the producers of content, the audiences of content, and the effects of content on audiences.
Similar conclusions were later reached in the field of applied linguistics. Even van Dijk (1985) admitted that before the 1960s, linguistics had little to offer to those interested in analyzing media discourse, and that it was within the social sciences that mass media research initially emerged.
Application of Quantitative Content Analysis in Applied Linguistics
Quantitative content analysis is used on large-scale corpora to summarize patterns and regularities in texts. In the 1960s, the analysis of media discourse was approached through quantitative methods. In the field of applied linguistics, the importance of the quantitative approach to texts was highlighted by the Gerbner et al. book and by Holsti's introduction, and it was further emphasized in the General Inquirer project, where the help of computers was brought in.
Computational algorithms can help researchers conduct all sorts of quantitative analyses, from the most limited and automatic to the most complicated, e.g. analyzing statistical data and results. Quantitative methods are best suited to large-scale projects, where the researcher's aim is to identify widespread language patterns that could be missed in a small-scale analysis. Large-scale analysis helps researchers highlight patterns of association and thus unveil, for instance, the lexical items that most frequently co-occur with keywords derived from the issues they intend to investigate. Without this quantitative approach, analysts may remain unaware of the existence of some crucial lexical items, because these cannot be observed with the naked eye. Notably, the quantitative approach has been adopted by a good number of researchers, such as Gerbner (1968), Krishnamurthy (1996), Flowerdew (1997), Fairclough (2000), Piper (2000), Teubert (2000) and Baker et al. (2013).
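To illustrate the kind of pattern-of-association search described above, the following minimal Python sketch counts the words that co-occur with a chosen keyword within a fixed window. It is not tied to any tool or study discussed in this paper; the corpus file name, the keyword and the window size are illustrative assumptions.

import re
from collections import Counter

def collocates(text, keyword, window=5):
    # Count the words appearing within `window` tokens of each occurrence of `keyword`.
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for i, token in enumerate(tokens):
        if token == keyword:
            # Neighbours on both sides of the keyword, excluding the keyword itself.
            counts.update(tokens[max(0, i - window):i])
            counts.update(tokens[i + 1:i + 1 + window])
    return counts

if __name__ == "__main__":
    # 'corpus.txt' is a placeholder for any plain-text media corpus.
    with open("corpus.txt", encoding="utf-8") as f:
        corpus = f.read()
    # Print the twenty most frequent collocates of the (hypothetical) keyword 'revolution'.
    for word, freq in collocates(corpus, "revolution").most_common(20):
        print(word, freq)

A frequency-ranked list of this kind is what allows analysts to spot associations that would not be visible to the naked eye in a large corpus.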
Software for Quantitative Content Analysis
Since the role of computational algorithms in conducting a scientific, objective analysis was recognized, a good number of software tools have been created to serve the purposes of text analysts. Software tools for content analysis can be divided into three major categories: dictionary-based content analysis (word counting, sorting, simple statistical tests), development environments (which do not analyze texts themselves but automate the construction of dictionaries, grammars, and other text analysis tools), and annotation aids (an electronic version of the set of marginal notes researchers generate when analyzing texts by hand).
The most commonly used software, acknowledged by many researchers as the most reliable, is WordSmith Tools. It is 'an integrated suite of programs for looking at how words behave in texts.' It 'controls' the programs it contains: Concord (makes a concordance using plain text or web text files), KeyWords (locates and identifies key words in a given corpus), and WordList (generates word lists in alphabetical and frequency order).
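As an illustration of the kind of output a concordancer such as Concord produces, here is a minimal keyword-in-context (KWIC) sketch in Python. It is a simplified stand-in, not WordSmith's actual implementation; the file name and the search term are placeholders.

import re

def kwic(text, keyword, width=40):
    # Print a keyword-in-context (KWIC) line for every match of `keyword`.
    for match in re.finditer(r"\b%s\b" % re.escape(keyword), text, re.IGNORECASE):
        left = text[max(0, match.start() - width):match.start()].replace("\n", " ")
        right = text[match.end():match.end() + width].replace("\n", " ")
        print(f"{left:>{width}} [{match.group(0)}] {right}")

if __name__ == "__main__":
    # 'articles.txt' is a placeholder for any plain-text collection of news articles.
    with open("articles.txt", encoding="utf-8") as f:
        kwic(f.read(), "refugees")

Each printed line centers one occurrence of the search term in its immediate co-text, which is the basic display a concordancer offers the analyst.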
Since there are plenty of software tools to choose from, analysts can follow a set of selection criteria to determine which software will meet their studies' ultimate goals. These criteria include the complexity of the analysis, language constraints, licensing issues and user base, and supported platforms.
Downsides of Quantitative Content Analysis
Content can be divided into two categories: manifest content (explicit information) and latent content (implicit information). Quantitative content analysis can only be used for the manifest content of media texts. Berelson (1952) notes that using a quantitative method to analyze 'what-is-said' forces researchers to turn a blind eye to 'why-the-content-is-like-that' and 'how people react', i.e. the latent content. Therefore, reducing large corpora to quantitative data, looking for keywords, and making concordances is not enough to build a complete picture of the meanings intended by the producers of the text. Drawing conclusions from mere figures and simple statistical data is not sufficient to determine the intentions of the producers of media texts or the impact of these texts on the audience.
Another aspect that a quantitative content analysis of media texts fails to cover is the syntactic analysis of sentences, e.g. the agency of social actions: the use of the passive voice instead of the active voice to draw attention away from the agent of the action. For example, “The man got killed during the revolution” is different from “Police agents killed the man during the revolution.” Instead of looking for the most frequent words that co-occur with the verb ‘kill’ in media texts about the revolution, it seems more important to know the agent of this violent action. The fact that some media text producers choose the passive voice while others choose the active voice calls for different interpretations.
Qualitative Content Analysis of Media Texts
No one can deny the importance of the quantitative method as an ‘objective, replicable and quantitative’ tool for analyzing the manifest content of media texts. Ever since the 1960s, much focus has been put on the ‘classical’, ‘quantitative, American, stimulus-response’ approaches to media texts. Van Dijk (1985) stated that in order to establish an ‘adequate analysis of the relations between media texts and contexts’, we need to go beyond the ‘surface’ level of texts to the investigation of their ‘underlying’ meanings. In the same context, Wodak & Busch (2004) spoke of what some observers, like Jensen & Jankowski (1991), labeled a “qualitative turn” away from quantitative content analysis in the study of media texts. By the second half of the 1970s, different suggestions for a ‘more explicit and systematic account of media discourse’ were brought to light, primarily by the Glasgow University Media Group, which published ‘Bad News’ (1976) and ‘More Bad News’ (1980), and by the Centre for Contemporary Cultural Studies (1980) under the direction of Stuart Hall. Further contributions were made by Schlesinger & Lumley, Downing, Husband & Chouhan, and Hartley & Montgomery.
Discourse Analysis
Richardson (2007) states that there are two main approaches to media texts: the formalistic, or structuralist, approach and the functionalist approach. The formalistic approach deals with the structural level of media texts, including four characteristics: cohesion, narrative, causality and motivation. Here, discourse analysis deals with ‘language above the sentence.’ The functionalist approach deals with ‘language in use’ rather than ‘language above the sentence’; language use and text interpretation cannot be fully and adequately analyzed without the social component. Both the formalistic and the functionalist approaches can contribute to a more adequate analysis of media texts, built upon a consideration of meaning (the assigning of sense) and context (the assigning of reference).
Critical Discourse Analysis (CDA)
CDA was first derived from Systemic Functional Linguistics (SFL), developed by Halliday. It was then enhanced by contributions from Fairclough (1995), Fowler (1991) and Boyd-Barrett (1994). Despite the similarities between their frameworks, the founding fathers and mothers of CDA, van Dijk, Wodak, and Fairclough, each had a lot to offer to this qualitative approach to media texts. CDA follows the functionalist approach, which advocates the analysis of texts as ‘language in use.’ Its aim is to ‘link linguistic analysis to social analysis’ (Wodak & Kroger, 2000). It is concerned with social problems, power relations, how society and culture are shaped by discourse, and the investigation of texts, their interpretation, reception and social effects (Titscher et al., 2000).
Fairclough’s Model of CDA. Fairclough’s approach draws upon SFL. His method of analysis is conducted on three levels: (1) text, (2) discursive practice, and (3) social practice.
(1) A text consists of representations, identities and social relations, cohesion and coherence. There are two levels of textual analysis: the sentence, and what is above the sentence. At the level of the sentence, analysts examine vocabulary, semantics, grammar, and even the sound system and the writing system. Above the sentence, analysts examine cohesion, the organization of turn-taking in talk-show interviews, and the overall structure of newspaper articles.
(2) It is at this stage that the analysis turns from textual analysis to discourse analysis. Texts should be analyzed as the ‘outcome of a discourse practice’ for a more competent assessment of ‘news practice, news values, and audience role’ (Cotter, 2001). Too much focus on the text leaves analysts ignorant of the processes of news gathering, the shaping of belief, encoding and decoding, etc. Analysts also need to know the producers’ level of credibility and the types of relationships they have with the audience they are writing for and with the communities they are covering (Cotter, 2001). This can deeply affect analysts’ examination of the meanings of the texts.
(3) An adequate analysis of media texts must also include the socio-cultural practice that is part of the communicative event being covered. Therefore, the textual analysis and the discourse analysis of media texts must be linked to the socio-cultural setting in which the event took place.
Van Dijk’s Model of CDA.
Van Dijk’s and Fairclough’s approaches to CDA are ‘similar in conception,’ but different in naming. However, the former includes one distinctive conception, the socio-cognitive model. Van Dijk’s method of analysis is conducted according to the structural nature of texts, production processes, and reception processes. His analysis takes place at two levels: the microstructure and the macrostructure. At the micro-structural level, he focuses on the semantic relations between propositions, syntactic and lexical elements, coherence, quotations, and direct/indirect reporting. At the macro-structural level, he focuses on the overall description of media texts, from themes and topics to news schemata (summary, story, and consequences).
Van Dijk’s work also gives a great deal of importance to ideology analysis, which is based on social analysis, cognitive analysis, and discourse analysis. The cognitive analysis consists of mental models intended to mediate between discourse practices and the social component. It helps analysts examine the cognitive processes involved in the encoding and decoding of texts. In order to reveal the implicitly stated ideological dichotomy in media texts, van Dijk (1998b) suggests that analysts must (1) examine the context of the discourse, the participants and their backgrounds, (2) analyze the communities concerned, their power relations, and their conflicts, (3) cover as many opinions as possible about what he calls ‘US versus THEM’, (4) make explicit all that is stated implicitly, and (5) examine the formal structure of the texts.
Wodak’s Method in CDA. Discourse sociolinguistics is one of the directions of CDA developed by Wodak. She developed an approach to analyzing media texts that she called the discourse historical method, in which all the available background information should be included in the analysis and interpretation of written or spoken media texts. There is a similarity between her approach and the steps that van Dijk suggested for unveiling the ideological dichotomy, where he says that analysts must examine the ‘historical, political, and social backgrounds’ of the main participants in the discourse (the text producers, the people involved in the event, and the audience). Through the many research studies she conducted with her colleagues, Wodak attested that the context of the discourse has an important impact on its structure and form.
Ethnographic Discourse Analysis
The search for the most adequate method of media text analysis did not end with CDA or with any other quantitative method or qualitative framework. Many analysts adopted a combination of quantitative and qualitative methods to achieve a holistic analysis, namely Halloran et al. (1970), Hartmann & Husband (1974), Ter Wal (2002), Baker & McEnery (2005), Baker et al. (2013), etc. However, what some researchers, like van Dijk, suggest is to take into consideration the ethnographic observations that need to be made ‘about the production and uses of communicative events […] ‘in’ the media and ‘by’ the media’ (van Dijk, 1985). A general definition of the term ‘ethnography’ is ‘the description of people and their culture’ (Denzin & Lincoln, 1994). The concept in relation to content analysis will be broadened in the following sections.
Qualitative Content Analysis and Ethnographic Discourse Analysis
Schroder (2007) criticized what she called ‘the half-hearted holism of CDA’, because it ‘suffers from a number of self-imposed methodological limitations.’ She states that at the surface level, CDA is holistic, examining all three dimensions of media discourse in relation to each other: text, discourse practice (text production/consumption), and socio-cultural practice. However, pointing to Fairclough’s (1995) statement that ‘[…] the ways in which texts are produced and consumed, which is realized in the feature of texts,’ Schroder (2007) draws our attention to the fact that in CDA, discourse practices are not studied ‘independently or empirically’; they are simply observed through the text. Schroder supports her argument with a study conducted by Swales & Rogers (1995), who state that conducting ethnographic fieldwork among media text producers and consumers increases the validity and reliability of the analysis and minimizes the subjectivity of the researchers’ analyses. Another argument she uses is Cotter’s (2001) suggestion of a ‘holistic and ethnographically oriented approach’ that examines the ‘community of coverage’ as well as the ‘community of practice’. As an example, Schroder mentions the framework of investigation used by David Deacon, Natalie Fenton and Alan Bryman, who argue that media production/reception studies have made it possible for analysts to produce more reliable interpretations and to achieve a more objective view of the power relations between the audiences and producers of media texts. Schroder claims that her approach to media texts is empirical rather than merely critical, and that critical discourse analysts should start analyzing the encoding and decoding processes of media discourse in an empirical manner if they ever want to add more credibility and objectivity to their findings and interpretations.
Conclusion
No researchers from the field of linguistics have approached media discourse directly or developed theories linking media discourse to linguistics. Instead, methods originally developed in sociology, the social sciences, mass communication research, discourse analysis, critical discourse analysis, and ethnographic analysis have been adapted to fit the analysis of media texts. Still, relying on the findings of researchers working in these fields will certainly accelerate the process of producing a purpose-built theory of media discourse analysis by linguists.
References
Baker, P. & McEnery, T. (2005). A corpus-based approach to discourses of refugees and asylum seekers in UN and newspaper texts. Journal of Language and Politics, 197-226.
Cotter, C. (2001). Discourse and media. In D. Schiffrin, D. Tannen & H. E. Hamilton (Eds.), The Handbook of Discourse Analysis (pp. 352-371). London: Blackwell.
Johnson, S. & Ensslin, A. (2006). Language in the news: Some reflections on keyword analysis using WordSmith Tools and the BNC. Leeds Working Papers in Linguistics and Phonetics, 11.
Lowe, W. (n.d.). Software for content analysis: A review. Retrieved from www.ou.edu/cls/online/lstd5913/pdf/rev.pdf
Macnamara, J. (2005). Media content analysis: Its uses, benefits and best practice methodology. Asia Pacific Public Relations Journal, 6(1), 1-34.
Richardson, J. E. (2007). Analysing Newspapers: An Approach from Critical Discourse Analysis. New York: Palgrave Macmillan.
Schroder, K. (2007). Media discourse analysis: Researching cultural meanings from inception to reception. Textual Cultures, 2(2), 77-99.
Van Dijk, T. A. (1985). Introduction: Discourse analysis in (mass) communication research. In T. A. van Dijk (Ed.), Discourse and Communication (pp. 69-93).
Wodak, R. & Busch, B. (2004). Approaches to media texts. In The SAGE Handbook of Media Studies. London: Sage.