Media and Culture
Edited by Matthew A. McIntosh / 03.03.2018
Pop Culture Mania
Figure 1.1: Just as fans could purchase Jenny Lind hats, Beatles fans could purchase Beatle wigs. / Paul Townsend – 1960s Beatlemania Fashion
In 1850, an epidemic swept America—but instead of leaving victims sick with fever or flu, this epidemic involved a rabid craze for the music of Swedish soprano Jenny Lind. American showman P. T. Barnum (who would later go on to found the circus now known as Ringling Bros. and Barnum & Bailey), a shrewd marketer and self-made millionaire, is credited with spreading “Lindomania” through a series of astute show-business moves. Barnum promised Lind an unprecedented $1,000-a-night fee (the equivalent of $28,300 in 2009) for her entire 93-performance tour of the United States. Ever the savvy self-promoter, Barnum turned his huge investment to his advantage by using it to create publicity—and it paid off. When the Swedish soprano’s ship docked on U.S. shores, she was greeted by 40,000 ardent fans; another 20,000 swarmed her hotel (Barnum). Congress was adjourned specifically for Lind’s visit to Washington, DC, where the National Theatre had to be enlarged to accommodate her audiences. A town in California and an island in Canada were named in her honor. Enthusiasts could purchase Jenny Lind hats, chairs, boots, opera glasses, and even pianos. Barnum’s marketing expertise made Lind a household name and created an overwhelming demand for a singer previously unknown to American audiences.
The “Jenny rage” that the savvy Barnum was able to create was not a unique phenomenon, however; a little more than a century later, a new craze transformed some American teenagers into screaming, fainting Beatlemaniacs. Though other performers like Frank Sinatra and Elvis Presley were no strangers to manic crowds, the Beatles attracted an unprecedented amount of attention when they first arrived in the United States. When the British foursome touched down at New York’s Kennedy Airport in 1964, they were met by more than 3,000 frenzied fans. Their performance on The Ed Sullivan Show was seen by 73 million people, or 40 percent of the U.S. population. The crime rate that night dropped to its lowest level in 50 years (Ehrenreich et al., 1992). Beatlemania was at such a fever pitch that Life magazine cautioned that “a Beatle who ventures out unguarded into the streets runs the very real peril of being dismembered or crushed to death by his fans.” The BBC publicized the trend and perhaps added to it by highlighting the paraphernalia for fans to spend their money on: “T-shirts, sweat shirts, turtle-neck sweaters, tight-legged trousers, night shirts, scarves, and jewelry inspired by the Beatles” were all available, as were Beatles-style mop-top wigs.
In the 21st century, rabid fans could turn their attention to a whole swath of pop stars in the making when the reality TV program American Idol hit the airwaves in 2002. The show was the only television program ever to have snagged the top spot in the Nielsen ratings for six seasons in a row, often averaging more than 30 million nightly viewers. Rival television network executives were alarmed, deeming the pop giant “the ultimate schoolyard bully,” “the Death Star,” or even “the most impactful show in the history of television,” according to former NBC Universal CEO Jeff Zucker (Carter, 2007). New cell phone technologies allowed viewers to have a direct role in the program’s star-making enterprise through casting votes, signing up for text alerts, or playing trivia games on their phones. In 2009, AT&T estimated that Idol-related text traffic amounted to 178 million messages (Poniewozik, 2009).
These three crazes all relied on various forms of media to create excitement. Whether through newspaper advertisements, live television broadcasts, or integrated Internet marketing, media industry tastemakers help shape what we care about. For as long as mass media has existed in the United States, it’s helped to create and fuel mass crazes, skyrocketing celebrities, and pop culture manias of all kinds. Even in our era of seemingly limitless entertainment options, mass hits like American Idol still have the ability to dominate the public’s attention.
“Barnum, P. T.” Answers.com, http://www.answers.com/topic/p-t-barnum.
Carter, Bill. “For Fox’s Rivals, ‘American Idol’ Remains a ‘Schoolyard Bully,’” New York Times, February 20, 2007, Arts section.
Ehrenreich, Barbara, Elizabeth Hess, and Gloria Jacobs. “Beatlemania: Girls Just Want to Have Fun,” in The Adoring Audience: Fan Culture and Popular Media, ed. Lisa A. Lewis (New York: Routledge, 1992), 84–106.
Poniewozik, James. “American Idol’s Voting Scandal (Or Not),” Tuned In (blog), Time, May 28, 2009, http://tunedin.blogs.time.com/2009/05/28/american-idols-voting-scandal-or-not/.
Intersection of American Media and Culture
Pop culture and American media are inextricably linked. Consider that Jenny Lind, the Beatles, and American Idol were each promoted using a then-new technology (photography for Lind, television for the Beatles, and the Internet and text messaging for American Idol).
Mass Communication, Mass Media, and Culture
We will provide an in-depth look at many kinds of media, at how media trends are reshaping the United States’ cultural landscape, and at how that culture shapes media in turn. These topics will be explored through an examination of mass media and mass communication both past and present—and speculation about what the future might look like.
First, it is important to distinguish between mass communication and mass media and to attempt a working definition of culture. Mass communication refers to information transmitted to large segments of the population. Mass communication may be transmitted through one or many different kinds of media (singular: medium), the means of transmission, whether print, digital, or electronic. Mass media specifically refers to a means of communication that is designed to reach a wide audience. Mass media platforms are commonly considered to include television, radio, newspapers, magazines, books, video games, and Internet media such as blogs, podcasts, and video sharing. Another way to consider the distinction is that a mass media message may be disseminated through several forms of mass media, such as an ad campaign with television, radio, and Internet components. Culture generally refers to the shared values, attitudes, beliefs, and practices that characterize a social group, organization, or institution. Just as it is difficult to pin down an exact definition of culture, cultures themselves can be hard to draw boundaries around, as they are fluid, diverse, and often overlapping.
Figure 1.2: Advances in media technology allowed for unprecedented voter participation in the 2007 CNN/YouTube® presidential debates. / lafra – The CNN + YouTube Debates
Throughout U.S. history, evolving media technologies have changed the way we relate socially, economically, and politically. In 2007, for example, a joint venture between the 24-hour news network CNN and the video-sharing site YouTube allowed voters to pose questions directly to presidential candidates in two televised debates. Voters could record their questions and upload them to YouTube, and a selection of these videos were then chosen by the debate moderators and played directly to the presidential candidates. This new format opened up the presidential debates to a much wider array of people, allowing for greater voter participation than had been possible in the past, when questions were posed solely by journalists or a few carefully chosen audience members.
In today’s wired world of smartphones and streaming satellite feeds, our expectations of our leaders, celebrities, teachers, and even ourselves are changing in even more drastic ways. This book provides you with the context, tools, and theories to engage with the world of mass media through an examination of the history, theory, and effects of media practices and roles in America. This book also provides you with the framework to consider some of the crucial issues affecting media and culture in today’s world.
The Evolution of Media
In 2010, Americans could turn on their television and find 24-hour news channels as well as music videos, nature documentaries, and reality shows about everything from hoarders to fashion models. That’s not to mention movies available on demand from cable providers or television and video available online for streaming or downloading. Half of U.S. households receive a daily newspaper, and the average person holds 1.9 magazine subscriptions (State of the Media, 2004) (Bilton, 2007). A University of California, San Diego study claimed that U.S. households consumed a total of approximately 3.6 zettabytes of information in 2008—the digital equivalent of a 7-foot-high stack of books covering the entire United States—a 350 percent increase since 1980 (Ramsey, 2009). Americans are exposed to media in taxicabs and buses, in classrooms and doctors’ offices, on highways, and in airplanes. We can begin to orient ourselves in the information cloud by parsing the roles media fills in society, examining its history, and looking at the way technological innovations have helped bring us to where we are today.
What Does Media Do for Us?
Media fulfills several basic roles in our society. One obvious role is entertainment. Media can act as a springboard for our imaginations, a source of fantasy, and an outlet for escapism. In the 19th century, Victorian readers disillusioned by the grimness of the Industrial Revolution found themselves drawn into fantastic worlds of fairies and other fictitious beings. In the first decade of the 21st century, American television viewers could peek in on a conflicted Texas high school football team in Friday Night Lights; the violence-plagued drug trade in Baltimore in The Wire; a 1960s-Manhattan ad agency in Mad Men; or the last surviving band of humans in a distant, miserable future in Battlestar Galactica. Through bringing us stories of all kinds, media has the power to take us away from ourselves.
Media can also provide information and education. Information can come in many forms, and it may sometimes be difficult to separate from entertainment. Today, newspapers and news-oriented television and radio programs make available stories from across the globe, allowing readers or viewers in London to access voices and videos from Baghdad, Tokyo, or Buenos Aires. Books and magazines provide a more in-depth look at a wide range of subjects. The free online encyclopedia Wikipedia has articles on topics from presidential nicknames to child prodigies to tongue twisters in various languages. The Massachusetts Institute of Technology (MIT) has posted free lecture notes, exams, and audio and video recordings of classes on its OpenCourseWare website, allowing anyone with an Internet connection access to world-class professors.
Another useful aspect of media is its ability to act as a public forum for the discussion of important issues. In newspapers or other periodicals, letters to the editor allow readers to respond to journalists or to voice their opinions on the issues of the day. These letters were an important part of U.S. newspapers even when the nation was a British colony, and they have served as a means of public discourse ever since. The Internet is a fundamentally democratic medium that allows everyone who can get online to express their opinions through, for example, blogging or podcasting—though whether anyone will hear is another question.
Similarly, media can be used to monitor government, business, and other institutions. Upton Sinclair’s 1906 novel The Jungle exposed the miserable conditions in the turn-of-the-century meatpacking industry; and in the early 1970s, Washington Post reporters Bob Woodward and Carl Bernstein uncovered evidence of the Watergate break-in and subsequent cover-up, which eventually led to the resignation of President Richard Nixon. But purveyors of mass media may be beholden to particular agendas because of political slant, advertising funds, or ideological bias, thus constraining their ability to act as a watchdog. To summarize, the major roles media fills in society include the following:
- Entertaining and providing an outlet for the imagination
- Educating and informing
- Serving as a public forum for the discussion of important issues
- Acting as a watchdog for government, business, and other institutions
It’s important to remember, though, that not all media are created equal. While some forms of mass communication are better suited to entertainment, others make more sense as a venue for spreading information. In terms of print media, books are durable and able to contain lots of information, but are relatively slow and expensive to produce; in contrast, newspapers are comparatively cheaper and quicker to create, making them a better medium for the quick turnover of daily news. Television provides vastly more visual information than radio and is more dynamic than a static printed page; it can also be used to broadcast live events to a nationwide audience, as in the annual State of the Union address given by the U.S. president. However, it is also a one-way medium—that is, it allows for very little direct person-to-person communication. In contrast, the Internet encourages public discussion of issues and allows nearly everyone who wants a voice to have one. However, the Internet is also largely unmoderated. Users may have to wade through thousands of inane comments or misinformed amateur opinions to find quality information.
The 1960s media theorist Marshall McLuhan took these ideas one step further, famously coining the phrase “the medium is the message” (McLuhan, 1964). By this, McLuhan meant that every medium delivers information in a different way and that content is fundamentally shaped by the medium of transmission. For example, although television news has the advantage of offering video and live coverage, making a story come alive more vividly, it is also a faster-paced medium. That means more stories get covered in less depth. A story told on television will probably be flashier and offer less depth and context than the same story covered in a monthly magazine; therefore, people who get the majority of their news from television may have a particular view of the world shaped not by the content of what they watch but by its medium. Or, as computer scientist Alan Kay put it, “Each medium has a special way of representing ideas that emphasize particular ways of thinking and de-emphasize others” (Kay, 1994). Kay was writing in 1994, when the Internet was just transitioning from an academic research network to an open public system. A decade and a half later, with the Internet firmly ensconced in our daily lives, McLuhan’s intellectual descendants are the media analysts who claim that the Internet is making us better at associative thinking, or more democratic, or shallower. But McLuhan’s claims don’t leave much space for individual autonomy or resistance. In an essay about television’s effects on contemporary fiction, writer David Foster Wallace scoffed at the “reactionaries who regard TV as some malignancy visited on an innocent populace, sapping IQs and compromising SAT scores while we all sit there on ever fatter bottoms with little mesmerized spirals revolving in our eyes…. Treating television as evil is just as reductive and silly as treating it like a toaster with pictures” (Wallace, 1997). Nonetheless, media messages and technologies affect us in countless ways, some of which probably won’t be sorted out until long in the future.
A Brief History of Mass Media and Culture
Until Johannes Gutenberg’s 15th-century invention of the movable type printing press, books were painstakingly handwritten and no two copies were exactly the same. The printing press made the mass production of print media possible. Not only was it much cheaper to produce written material, but new transportation technologies also made it easier for texts to reach a wide audience. It’s hard to overstate the importance of Gutenberg’s invention, which helped usher in massive cultural movements like the European Renaissance and the Protestant Reformation. In 1810, another German printer, Friedrich Koenig, pushed media production even further when he essentially hooked the steam engine up to a printing press, enabling the industrialization of printed media. In 1800, a hand-operated printing press could produce about 480 pages per hour; Koenig’s machine more than doubled this rate. (By the 1930s, many printing presses could publish 3,000 pages an hour.)
This increased efficiency went hand in hand with the rise of the daily newspaper. The newspaper was the perfect medium for the increasingly urbanized Americans of the 19th century, who could no longer get their local news merely through gossip and word of mouth. These Americans were living in unfamiliar territory, and newspapers and other media helped them negotiate the rapidly changing world. The Industrial Revolution meant that some people had more leisure time and more money, and media helped them figure out how to spend both. Media theorist Benedict Anderson has argued that newspapers also helped forge a sense of national identity by treating readers across the country as part of one unified community (Anderson, 1991).
In the 1830s, the major daily newspapers faced a new threat from the rise of penny papers, which were low-priced broadsheets that served as a cheaper, more sensational daily news source. They favored news of murder and adventure over the dry political news of the day. While newspapers catered to a wealthier, more educated audience, the penny press attempted to reach a wide swath of readers through cheap prices and entertaining (often scandalous) stories. The penny press can be seen as the forerunner to today’s gossip-hungry tabloids.
Figure 1.3: The penny press appealed to readers’ desires for lurid tales of murder and scandal. / Wikimedia Commons
In the early decades of the 20th century, the first major nonprint form of mass media—radio—exploded in popularity. Radios, which were less expensive than telephones and widely available by the 1920s, had the unprecedented ability to let huge numbers of people listen to the same event at the same time. In 1924, Calvin Coolidge’s preelection speech reached more than 20 million people. Radio was a boon for advertisers, who now had access to a large and captive audience. An early advertising consultant claimed that the early days of radio were “a glorious opportunity for the advertising man to spread his sales propaganda” because of “a countless audience, sympathetic, pleasure seeking, enthusiastic, curious, interested, approachable in the privacy of their homes” (Briggs & Burke, 2005). The reach of radio also meant that the medium was able to downplay regional differences and encourage a unified sense of the American lifestyle—a lifestyle that was increasingly driven and defined by consumer purchases. “Americans in the 1920s were the first to wear ready-made, exact-size clothing…to play electric phonographs, to use electric vacuum cleaners, to listen to commercial radio broadcasts, and to drink fresh orange juice year round” (Mintz, 2007). This boom in consumerism put its stamp on the 1920s and also helped contribute to the Great Depression of the 1930s (Library of Congress). The consumerist impulse drove production to unprecedented levels, but when the Depression began and consumer demand dropped dramatically, the surplus of production helped further deepen the economic crisis, as more goods were being produced than could be sold.
The post–World War II era in the United States was marked by prosperity, and by the introduction of a seductive new form of mass communication: television. In 1946, about 17,000 televisions existed in the United States; within 7 years, two-thirds of American households owned at least one set. As the United States’ gross national product (GNP) doubled in the 1950s, and again in the 1960s, the American home became firmly ensconced as a consumer unit; along with a television, the typical U.S. household owned a car and a house in the suburbs, all of which contributed to the nation’s thriving consumer-based economy (Briggs & Burke, 2005). Broadcast television was the dominant form of mass media, and the three major networks controlled more than 90 percent of the news programs, live events, and sitcoms viewed by Americans. Some social critics argued that television was fostering a homogenous, conformist culture by reinforcing ideas about what “normal” American life looked like. But television also contributed to the counterculture of the 1960s. The Vietnam War was the nation’s first televised military conflict, and nightly images of war footage and war protesters helped intensify the nation’s internal conflicts.
Broadcast technology, including radio and television, had such a hold on the American imagination that newspapers and other print media found themselves having to adapt to the new media landscape. Print media was more durable and easily archived, and it allowed users more flexibility in terms of time—once a person had purchased a magazine, he or she could read it whenever and wherever. Broadcast media, in contrast, usually aired programs on a fixed schedule, which lent broadcasts both a sense of immediacy and a fleeting quality. Until the advent of digital video recorders in the late 1990s, it was impossible to pause and rewind a live television broadcast.
The media world faced drastic changes once again in the 1980s and 1990s with the spread of cable television. During the early decades of television, viewers had a limited number of channels to choose from—one reason for the charges of homogeneity. In 1975, the three major networks accounted for 93 percent of all television viewing. By 2004, however, this share had dropped to 28.4 percent of total viewing, thanks to the spread of cable television. Cable providers allowed viewers a wide menu of choices, including channels specifically tailored to people who wanted to watch only golf, classic films, sermons, or videos of sharks. Still, until the mid-1990s, television was dominated by the three large networks. The Telecommunications Act of 1996, an attempt to foster competition by deregulating the industry, actually resulted in many mergers and buyouts that left most of the control of the broadcast spectrum in the hands of a few large corporations. In 2003, the Federal Communications Commission (FCC) loosened regulation even further, allowing a single company to own 45 percent of a single market (up from 25 percent in 1982).
Technological Transitions Shape Media Industries
New media technologies both spring from and cause social changes. For this reason, it can be difficult to neatly sort the evolution of media into clear causes and effects. Did radio fuel the consumerist boom of the 1920s, or did the radio become wildly popular because it appealed to a society that was already exploring consumerist tendencies? Probably a little bit of both. Technological innovations such as the steam engine, electricity, wireless communication, and the Internet have all had lasting and significant effects on American culture. As media historians Asa Briggs and Peter Burke note, every crucial invention came with “a change in historical perspectives.” Electricity altered the way people thought about time because work and play were no longer dependent on the daily rhythms of sunrise and sunset; wireless communication collapsed distance; the Internet revolutionized the way we store and retrieve information.
Figure 1.4: The transatlantic telegraph cable made nearly instantaneous communication between the United States and Europe possible for the first time in 1858. / Amber Case – 1858 trans-Atlantic telegraph cable route
The contemporary media age can trace its origins back to the electrical telegraph, patented in the United States by Samuel Morse in 1837. Thanks to the telegraph, communication was no longer linked to the physical transportation of messages; it didn’t matter whether a message needed to travel 5 or 500 miles. Suddenly, information from distant places was nearly as accessible as local news, as telegraph lines began to stretch across the globe, making their own kind of World Wide Web. In this way, the telegraph acted as the precursor to much of the technology that followed, including the telephone, radio, television, and Internet. When the first transatlantic cable was laid in 1858, allowing nearly instantaneous communication from the United States to Europe, the London Times described it as “the greatest discovery since that of Columbus, a vast enlargement…given to the sphere of human activity.”
Not long afterward, wireless communication (which eventually led to the development of radio, television, and other broadcast media) emerged as an extension of telegraph technology. Although many 19th-century inventors, including Nikola Tesla, were involved in early wireless experiments, it was Italian-born Guglielmo Marconi who is recognized as the developer of the first practical wireless radio system. Many people were fascinated by this new invention. Early radio was used for military communication, but soon the technology entered the home. The burgeoning interest in radio inspired hundreds of applications for broadcasting licenses from newspapers and other news outlets, retail stores, schools, and even cities. In the 1920s, large media networks—including the National Broadcasting Company (NBC) and the Columbia Broadcasting System (CBS)—were launched, and they soon began to dominate the airwaves. In 1926, they owned 6.4 percent of U.S. broadcasting stations; by 1931, that number had risen to 30 percent.
Figure 1.5: Gone With the Wind defeated The Wizard of Oz to become the first color film ever to win the Academy Award for Best Picture in 1939. / Wikimedia Commons
In addition to the breakthroughs in audio broadcasting, inventors in the 1800s made significant advances in visual media. The 19th-century development of photographic technologies would lead to the later innovations of cinema and television. As with wireless technology, several inventors independently created a form of photography at the same time, among them the French inventors Joseph Niépce and Louis Daguerre and the British scientist William Henry Fox Talbot. In the United States, George Eastman developed the Kodak camera in 1888, anticipating that Americans would welcome an inexpensive, easy-to-use camera into their homes as they had with the radio and telephone. Moving pictures were first seen around the turn of the century, with the first U.S. projection-hall opening in Pittsburgh in 1905. By the 1920s, Hollywood had already created its first stars, most notably Charlie Chaplin; by the end of the 1930s, Americans were watching color films with full sound, including Gone With the Wind and The Wizard of Oz.
Television—which consists of an image being converted to electrical impulses, transmitted through wires or radio waves, and then reconverted into images—existed before World War II, but gained mainstream popularity in the 1950s. In 1947, there were 178,000 television sets made in the United States; 5 years later, 15 million were made. Radio, cinema, and live theater declined because the new medium allowed viewers to be entertained with sound and moving pictures in their homes. In the United States, competing commercial stations (including the radio powerhouses of CBS and NBC) meant that commercial-driven programming dominated. In Great Britain, the government managed broadcasting through the British Broadcasting Corporation (BBC). Funding was driven by licensing fees instead of advertisements. In contrast to the U.S. system, the BBC strictly regulated the length and character of commercials that could be aired. However, U.S. television (and its increasingly powerful networks) still dominated. By the beginning of 1955, there were around 36 million television sets in the United States, but only 4.8 million in all of Europe. Important national events, broadcast live for the first time, were an impetus for consumers to buy sets so they could witness the spectacle; both England and Japan saw a boom in sales before important royal weddings in the 1950s.
Figure 1.6: In the 1960s, the concept of a useful portable computer was still a dream; huge mainframes were required to run a basic operating system. / Wikimedia Commons
In 1969, management consultant Peter Drucker predicted that the next major technological innovation would be an electronic appliance that would revolutionize the way people lived just as thoroughly as Thomas Edison’s light bulb had. This appliance would sell for less than a television set and be “capable of being plugged in wherever there is electricity and giving immediate access to all the information needed for school work from first grade through college.” Although Drucker may have underestimated the cost of this hypothetical machine, he was prescient about the effect these machines—personal computers—and the Internet would have on education, social relationships, and the culture at large. The inventions of random access memory (RAM) chips and microprocessors in the 1970s were important steps to the Internet age. As Briggs and Burke note, these advances meant that “hundreds of thousands of components could be carried on a microprocessor.” The reduction of many different kinds of content to digitally stored information meant that “print, film, recording, radio and television and all forms of telecommunications [were] now being thought of increasingly as part of one complex.” This process, also known as convergence, is a force that’s affecting media today.
Anderson, Benedict. Imagined Communities: Reflections on the Origin and Spread of Nationalism (London: Verso, 1991).
Bilton, Jim. “The Loyalty Challenge: How Magazine Subscriptions Work,” In Circulation, January/February 2007.
Briggs, Asa and Peter Burke, A Social History of the Media: From Gutenberg to the Internet (Malden, MA: Polity Press, 2005).
Kay, Alan. “The Infobahn Is Not the Answer,” Wired, May 1994.
Library of Congress, “Radio: A Consumer Product and a Producer of Consumption,” Coolidge-Consumerism Collection, http://lcweb2.loc.gov:8081/ammem/amrlhtml/inradio.html.
McLuhan, Marshall. Understanding Media: The Extensions of Man (New York: McGraw-Hill, 1964).
Mintz, Steven. “The Jazz Age: The American 1920s: The Formation of Modern American Mass Culture,” Digital History, 2007, http://www.digitalhistory.uh.edu/database/article_display.cfm?hhid=454.
Ramsey, Doug. “UC San Diego Experts Calculate How Much Information Americans Consume,” UC San Diego News Center, December 9, 2009, http://ucsdnews.ucsd.edu/newsrel/general/12-09Information.asp.
State of the Media, Project for Excellence in Journalism, The State of the News Media 2004, http://www.stateofthemedia.org/2004/.
Wallace, David Foster. “E Unibus Pluram: Television and U.S. Fiction,” in A Supposedly Fun Thing I’ll Never Do Again (New York: Little, Brown, 1997).
It’s important to keep in mind that the implementation of new technologies doesn’t mean that the old ones simply vanish into dusty museums. Today’s media consumers still watch television, listen to radio, read newspapers, and become immersed in movies. The difference is that it’s now possible to do all those things through one device—be it a personal computer or a smartphone—and through the Internet. Such actions are enabled by media convergence, the process by which previously distinct technologies come to share tasks and resources. A cell phone that also takes pictures and video is an example of the convergence of digital photography, digital video, and cellular telephone technologies. An extreme, and currently nonexistent, example of technological convergence would be the so-called black box, which would combine all the functions of previously distinct technology and would be the device through which we’d receive all our news, information, entertainment, and social interaction.
Kinds of Convergence
Figure 1.7: Nigeria’s Nollywood produces more films annually than any other country besides India. / Paul Keller – nigerian VCDs at kwakoe
But convergence isn’t just limited to technology. Media theorist Henry Jenkins argues that convergence isn’t an end result (as is the hypothetical black box), but instead a process that changes how media is both consumed and produced. Jenkins breaks convergence down into five categories:
- Economic convergence occurs when a single company controls several products or services within the same industry. In the entertainment industry, for instance, one company may hold interests across many kinds of media: Rupert Murdoch’s News Corporation is involved in book publishing (HarperCollins), newspapers (New York Post, The Wall Street Journal), sports (Colorado Rockies), broadcast television (Fox), cable television (FX, National Geographic Channel), film (20th Century Fox), the Internet (MySpace), and many other media.
- Organic convergence is what happens when someone is watching a television show online while exchanging text messages with a friend and also listening to music in the background—the “natural” outcome of a diverse media world.
- Cultural convergence has several aspects. One is the flow of stories across multiple media platforms—for example, novels that become television series (True Blood), radio dramas that become comic strips (The Shadow), even amusement park rides that become film franchises (Pirates of the Caribbean). The character Harry Potter exists in books, films, toys, and amusement park rides. Another aspect is participatory culture—that is, the way media consumers are able to annotate, comment on, remix, and otherwise influence culture in unprecedented ways. The video-sharing website YouTube is a prime example of participatory culture: it gives anyone with a video camera and an Internet connection the opportunity to communicate with people around the world and to create and shape cultural trends.
- Global convergence is the process by which geographically distant cultures influence one another despite their physical separation. Nigeria’s cinema industry, nicknamed Nollywood, takes its cues from India’s Bollywood, which is in turn inspired by Hollywood in the United States. Tom and Jerry cartoons are popular on Arab satellite television channels. Successful American horror movies such as The Ring and The Grudge are remakes of Japanese hits. The advantage of global convergence is access to a wealth of cultural influence; its downside, some critics posit, is the threat of cultural imperialism, defined by Herbert Schiller as the way developing countries are “attracted, pressured, forced, and sometimes bribed into shaping social institutions to correspond to, or even promote, the values and structures of the dominating centre of the system (White, 2001).” Cultural imperialism can be a formal policy or can happen more subtly, as with the spread of outside influence through television, movies, and other cultural projects.
- Technological convergence is the merging of technologies such as the ability to watch TV shows online on sites like Hulu or to play video games on mobile phones like the Apple iPhone. When more and more different kinds of media are transformed into digital content, as Jenkins notes, “we expand the potential relationships between them and enable them to flow across platforms (Jenkins, 2001).”
Effects of Convergence
Jenkins’s concept of organic convergence is perhaps the most telling. To many people, especially those who grew up in a world dominated by so-called old media, there is nothing organic about today’s media-dominated world. As a New York Times editorial recently opined, “Few objects on the planet are farther removed from nature—less, say, like a rock or an insect—than a glass and stainless steel smartphone (New York Times, 2010).” But modern American culture is plugged in as never before, and today’s high school students have never known a world where the Internet didn’t exist. Such a cultural sea change causes a significant generation gap between those who grew up with new media and those who didn’t.
A 2010 study by the Kaiser Family Foundation found that Americans aged 8 to 18 spend more than 7.5 hours with electronic devices each day—and, thanks to multitasking, they’re able to pack an average of 11 hours of media content into that 7.5 hours (Lewin, 2010). These statistics highlight some of the aspects of the new digital model of media consumption: participation and multitasking. Today’s teenagers aren’t passively sitting in front of screens, quietly absorbing information. Instead, they are sending text messages to friends, linking news articles on Facebook, commenting on YouTube videos, writing reviews of television episodes to post online, and generally engaging with the culture they consume. Convergence has also made multitasking much easier, as many devices allow users to surf the Internet, listen to music, watch videos, play games, and reply to e-mails on the same machine.
However, it’s still difficult to predict how media convergence and immersion are affecting culture, society, and individual brains. In his 2005 book Everything Bad Is Good for You, Steven Johnson argues that today’s television and video games are mentally stimulating, in that they pose a cognitive challenge and invite active engagement and problem solving. Poking fun at alarmists who see every new technology as making children stupider, Johnson jokingly cautions readers against the dangers of book reading: It “chronically understimulates the senses” and is “tragically isolating.” Even worse, books “follow a fixed linear path. You can’t control their narratives in any fashion—you simply sit back and have the story dictated to you…. This risks instilling a general passivity in our children, making them feel as though they’re powerless to change their circumstances. Reading is not an active, participatory process; it’s a submissive one (Johnson, 2005).”
Nicholas Carr’s 2010 book, The Shallows: What the Internet Is Doing to Our Brains, is more pessimistic. Carr worries that the vast array of interlinked information available through the Internet is eroding attention spans and making contemporary minds distracted and less capable of deep, thoughtful engagement with complex ideas and arguments. “Once I was a scuba diver in a sea of words,” Carr reflects ruefully. “Now I zip along the surface like a guy on a Jet Ski (Carr, 2010).” Carr cites neuroscience studies showing that when people try to do two things at once, they give less attention to each and perform the tasks less carefully. In other words, multitasking makes us do a greater number of things poorly. Whatever the ultimate cognitive, social, or technological results, convergence is changing the way we relate to media today.
Video Killed the Radio Star: Convergence Kills Off Obsolete Technology—or Does It?
When was the last time you used a rotary phone? How about a street-side pay phone? Or a library’s card catalog? When you need brief, factual information, when was the last time you reached for a volume of Encyclopedia Britannica? Odds are it’s been a while. All of these habits, formerly common parts of daily life, have been rendered essentially obsolete through the progression of convergence.
But convergence hasn’t erased old technologies; instead, it may have just altered the way we use them. Take cassette tapes and Polaroid film, for example. Influential musician Thurston Moore of the band Sonic Youth recently claimed that he only listens to music on cassette. Polaroid Corporation, creators of the once-popular instant-film cameras, was driven out of business by digital photography in 2008, only to be revived 2 years later—with pop star Lady Gaga as the brand’s creative director. Several Apple iPhone apps allow users to apply effects to photos to make them look more like a Polaroid photo.
Cassettes, Polaroid cameras, and other seemingly obsolete technologies have been able to thrive—albeit in niche markets—both despite and because of Internet culture. Instead of being slick and digitized, cassette tapes and Polaroid photos are physical objects that are made more accessible and more human, according to enthusiasts, because of their flaws. “I think there’s a group of people—fans and artists alike—out there to whom music is more than just a file on your computer, more than just a folder of MP3s,” says Brad Rose, founder of a Tulsa, Oklahoma-based cassette label (Hogan, 2010). The distinctive Polaroid look—caused by uneven color saturation, underdevelopment or overdevelopment, or just daily atmospheric effects on the developing photograph—is emphatically analog. In an age of high resolution, portable printers, and camera phones, the Polaroid’s appeal to some has something to do with ideas of nostalgia and authenticity. Convergence has transformed who uses these media and for what purposes, but it hasn’t eliminated these media.
Carr, Nicholas. The Shallows: What the Internet Is Doing to Our Brains (New York: Norton, 2010).
Hogan, Marc. “This Is Not a Mixtape,” Pitchfork, February 22, 2010, http://pitchfork.com/features/articles/7764-this-is-not-a-mixtape/2/.
Jenkins, Henry. “Convergence? I Diverge,” Technology Review, June 2001, 93.
Johnson, Steven. Everything Bad Is Good for You (New York: Riverhead Books, 2005).
Lewin, Tamar. “If Your Kids Are Awake, They’re Probably Online,” New York Times, January 20, 2010, http://www.nytimes.com/2010/01/20/education/20wired.html.
New York Times, editorial, “The Half-Life of Phones,” New York Times, June 18, 2010, http://www.nytimes.com/2010/06/20/opinion/20sun4.html1.
White, Livingston A. “Reconsidering Cultural Imperialism Theory,” TBS Journal 6 (2001), http://www.tbsjournal.com/Archives/Spring01/white.html.
The Role of Social Values in Communication
Figure 1.8: Thomas Paine is regarded by some as the “moral father of the Internet” because his independent spirit is reflected in the democratization of mass communication via the Internet. / Marion Doss – Thomas Paine, Engraving
Free Speech and Its Limitations
The value of free speech is central to American mass communication and has been since the nation’s revolutionary founding. The U.S. Constitution’s very first amendment guarantees the freedom of the press. Because of the First Amendment and subsequent statutes, the United States has some of the broadest protections on speech of any industrialized nation. However, there are limits to what kinds of speech are legally protected—limits that have changed over time, reflecting shifts in U.S. social values.
Figure 1.9: Artist Shepard Fairey, creator of the iconic Obama HOPE image, was sued by the Associated Press for copyright infringement; Fairey argued that his work was protected by the fair use exception. / Wikimedia Commons
Definitions of obscenity, which is not protected by the First Amendment, have shifted with the nation’s changing social attitudes. James Joyce’s Ulysses, ranked by the Modern Library as the best English-language novel of the 20th century, was illegal to publish in the United States between 1922 and 1934 after the U.S. Customs Court declared the book obscene because of its sexual content. The 1957 Supreme Court case Roth v. United States defined obscenity more narrowly, allowing for differences depending on community standards. The sexual revolution and social changes of the 1960s made it even more difficult to pin down just what was meant by community standards—a question that is still under debate to this day. The mainstreaming of sexually explicit content like Playboy magazine, which is available in nearly every U.S. airport, is another indication that obscenity is still open to interpretation.
Regulations related to obscene content are not the only restrictions on First Amendment rights; copyright law also puts limits on free speech. Intellectual property law was originally intended to protect just that—the proprietary rights, both economic and intellectual, of the originator of a creative work. Works under copyright can’t be reproduced without the authorization of the creator, nor can anyone else use them to make a profit. Inventions, novels, musical tunes, and even phrases are all covered by copyright law. The first copyright statute in the United States set 14 years as the maximum term of protection. That term grew dramatically over the 20th century; some works are now protected for up to 120 years. In recent years, an Internet culture that enables file sharing, musical mash-ups, and YouTube video parodies has raised questions about the fair use exception to copyright law. The exact line between which types of expression are protected and which are prohibited is still being drawn by the courts, and as the values of the U.S. public evolve, copyright law—like obscenity law—will continue to change as well.
Propaganda and Other Ulterior Motives
Figure 1.10: World War I propaganda posters were sometimes styled to resemble movie posters in an attempt to glamorize the war effort. / Wikimedia Commons
Sometimes social values enter mass media messages in a more overt way. Producers of media content may have vested interests in particular social goals, which, in turn, may cause them to promote or refute particular viewpoints. In its most heavy-handed form, this type of media influence can become propaganda, communication that intentionally attempts to persuade its audience for ideological, political, or commercial purposes. Propaganda often (but not always) distorts the truth, selectively presents facts, or uses emotional appeals. During wartime, propaganda often includes caricatures of the enemy. Even in peacetime, however, propaganda is frequent. Political campaign commercials in which one candidate openly criticizes the other are common around election time, and some negative ads deliberately twist the truth or present outright falsehoods to attack an opposing candidate.
Other types of influence are less blatant or sinister. Advertisers want viewers to buy their products; some news sources, such as Fox News or The Huffington Post, have an explicit political slant. Still, people who want to exert media influence often use the tricks and techniques of propaganda. During World War I, the U.S. government created the Creel Commission as a sort of public relations firm for the United States’ entry into the war. The Creel Commission used radio, movies, posters, and in-person speakers to present a positive slant on the U.S. war effort and to demonize the opposing Germans. Chairman George Creel acknowledged the commission’s attempt to influence the public but shied away from calling their work propaganda:
In no degree was the Committee an agency of censorship, a machinery of concealment or repression…. In all things, from first to last, without halt or change, it was a plain publicity proposition, a vast enterprise in salesmanship, the world’s greatest adventures in advertising…. We did not call it propaganda, for that word, in German hands, had come to be associated with deceit and corruption. Our effort was educational and informative throughout, for we had such confidence in our case as to feel that no other argument was needed than the simple, straightforward presentation of the facts (Creel, 1920).
Of course, the line between the selective (but “straightforward”) presentation of the truth and the manipulation of propaganda is not an obvious or distinct one. (Another of the Creel Commission’s members was later deemed the father of public relations and authored a book titled Propaganda.) In general, however, public relations is open about presenting one side of the truth, while propaganda seeks to invent a new truth.
In 1960, journalist A. J. Liebling wryly observed that “freedom of the press is guaranteed only to those who own one.” Liebling was referring to the role of gatekeepers in the media industry, another way in which social values influence mass communication. Gatekeepers are the people who help determine which stories make it to the public, including reporters who decide what sources to use and editors who decide what gets reported on and which stories make it to the front page. Media gatekeepers are part of society and thus are saddled with their own cultural biases, whether consciously or unconsciously. In deciding what counts as newsworthy, entertaining, or relevant, gatekeepers pass on their own values to the wider public. In contrast, stories deemed unimportant or uninteresting to consumers can linger forgotten in the back pages of the newspaper—or never get covered at all.
In one striking example of the power of gatekeeping, journalist Allan Thompson lays blame on the news media for its sluggishness in covering the Rwandan genocide in 1994. According to Thompson, there weren’t many outside reporters in Rwanda at the height of the genocide, so the world wasn’t forced to confront the atrocities happening there. Instead, the nightly news in the United States was preoccupied by the O. J. Simpson trial, Tonya Harding’s attack on a fellow figure skater, and the less bloody conflict in Bosnia (where more reporters were stationed). Thompson went on to argue that the lack of international media attention allowed politicians to remain complacent (Thompson, 2007). With little media coverage, there was little outrage about the Rwandan atrocities, which contributed to a lack of political will to invest time and troops in a faraway conflict. Richard Dowden, Africa editor for the British newspaper The Independent during the Rwandan genocide, bluntly explained the news media’s larger reluctance to focus on African issues: “Africa was simply not important. It didn’t sell newspapers. Newspapers have to make profits. So it wasn’t important (Thompson, 2007).” Bias on the individual and institutional level downplayed the genocide at a time of great crisis and potentially contributed to the deaths of hundreds of thousands of people.
Gatekeepers had an especially strong influence in old media, in which space and time were limited. A news broadcast could only last for its allotted half hour, while a newspaper had a set number of pages to print. The Internet, in contrast, theoretically has room for infinite news reports. The interactive nature of the medium also minimizes the gatekeeper function of the media by allowing media consumers to have a voice as well. News aggregators like Digg allow readers to decide what makes it on to the front page. That’s not to say that the crowd is always wise—recent top stories on Digg have featured headlines like “Top 5 Hot Girls Playing Video Games” and “The girl who must eat every 15 minutes to stay alive.” Media expert Mark Glaser noted that the digital age hasn’t eliminated gatekeepers; it’s just shifted who they are: “the editors who pick featured artists and apps at the Apple iTunes store, who choose videos to spotlight on YouTube, and who highlight Suggested Users on Twitter,” among others (Glaser, 2009). And unlike traditional media, these new gatekeepers rarely have public bylines, making it difficult to figure out who makes such decisions and on what basis they are made.
Observing how distinct cultures and subcultures present the same story can be indicative of those cultures’ various social values. Another way to look critically at today’s media messages is to examine how the media has functioned in the world and in the United States during different cultural periods.
Creel, George. How We Advertised America (New York: Harper & Brothers, 1920).
Glaser, Mark. “New Gatekeepers Twitter, Apple, YouTube Need Transparency in Editorial Picks,” PBS Mediashift, March 26, 2009, http://www.pbs.org/mediashift/2009/03/new-gatekeepers-twitter-apple-youtube-need-transparency-in-editorial-picks085.html.
Thompson, Allan. “The Media and the Rwanda Genocide” (lecture, Crisis States Research Centre and POLIS at the London School of Economics, January 17, 2007), http://www2.lse.ac.uk/publicEvents/pdf/20070117_PolisRwanda.pdf.
Table 1.1: Cultural Periods

- Early Modern Period (late 1400s–1700s): Began with Johannes Gutenberg’s invention of the movable type printing press; characterized by improved transportation, educational reform, and scientific inquiry.
- Late Modern Period (1700s–1900s): Sparked by the Industrial Revolution; characterized by technical innovations, increasingly secular politics, and urbanization.
- Postmodern Age (1950s–present): Marked by skepticism, self-consciousness, celebration of differences, and the digitalization of culture.
After exploring the ways technology, culture, and mass media have affected one another over the years, it may also be helpful to look at recent cultural eras more broadly. A cultural period is a time marked by a particular way of understanding the world through culture and technology. Changes in cultural periods are marked by fundamental shifts in the way people perceive and understand the world. In the Middle Ages, truth was dictated by authorities like the king and the church. During the Renaissance, people turned to the scientific method as a way to reach truth through reason. And, in 2008, Wired magazine’s editor in chief proclaimed that Google was about to render the scientific method obsolete (Anderson, 2008). In each of these cases, it wasn’t the nature of truth that changed but the way humans attempted to make sense of a radically changing world. For the purpose of studying culture and mass media, the post-Gutenberg modern and postmodern ages are the most relevant to explore.
The Modern Age
The Modern Age, or modernity, is the postmedieval era, a wide span of time marked in part by technological innovations, urbanization, scientific discoveries, and globalization. The Modern Age is generally split into two parts: the early and the late modern periods.
The early modern period began with Gutenberg’s invention of the movable type printing press in the late 15th century and ended in the late 18th century. Thanks to Gutenberg’s press, the European population of the early modern period saw rising literacy rates, which led to educational reform. As noted, Gutenberg’s machine also greatly enabled the spread of knowledge and, in turn, spurred the Renaissance and the Protestant Reformation. During the early modern period, transportation improved, politics became more secularized, capitalism spread, nation-states grew more powerful, and information became more widely accessible. Enlightenment ideals of reason, rationalism, and faith in scientific inquiry slowly began to replace the previously dominant authorities of king and church.
Huge political, social, and economic changes marked the end of the 18th century and the beginning of the late modern period. The Industrial Revolution, which began in England around 1750, combined with the American Revolution in 1776 and the French Revolution in 1789, marked the beginning of massive changes in the world.
The French and American revolutions were inspired by a rejection of monarchy in favor of national sovereignty and representative democracy. Both revolutions also heralded the rise of secular society as opposed to church-based authority systems. Democracy was well suited to the so-called Age of Reason, with its ideals of individual rights and progress.
Though less political, the Industrial Revolution had equally far-reaching consequences. It did not merely change the way goods were produced—it also fundamentally changed the economic, social, and cultural framework of its time. The Industrial Revolution doesn’t have clear start or end dates. However, during the 19th century, several crucial inventions—the internal combustion engine, steam-powered ships, and railways, among others—led to innovations in various industries. Steam power and machine tools increased production dramatically. But some of the biggest changes coming out of the Industrial Revolution were social in character. An economy based on manufacturing instead of agriculture meant that more people moved to cities, where techniques of mass production led people to value efficiency both in and out of the factory. Newly urbanized factory laborers could no longer produce their own food, clothing, or supplies, and instead turned to consumer goods. Increased production led to increases in wealth, though income inequalities between classes also started to grow.
These overwhelming changes affected (and were affected by) the media. As noted, the fusing of steam power and the printing press enabled the explosive expansion of books and newspapers. Literacy rates rose, as did support for public participation in politics. More and more people lived in the city, had an education, got their news from the newspaper, spent their wages on consumer goods, and identified as citizens of an industrialized nation. Urbanization, mass literacy, and new forms of mass media contributed to a sense of mass culture that united people across regional, social, and cultural boundaries.
Modernity and the Modern Age, it should be noted, are distinct from (but related to) the cultural movement of modernism. The Modern Era lasted from the end of the Middle Ages to the middle of the 20th century; modernism, however, refers to the artistic movement of the late 19th and early 20th centuries that arose from the widespread changes sweeping the world during that period. Most notably, modernism questioned the limitations of traditional forms of art and culture. Modernist art was in part a reaction against the Enlightenment’s certainty of progress and rationality. It celebrated subjectivity through abstraction, experimentalism, surrealism, and sometimes pessimism or even nihilism. Prominent examples of modernist works include James Joyce’s stream-of-consciousness novels, cubist paintings by Pablo Picasso, harmonically experimental compositions by Claude Debussy, and absurdist plays by Luigi Pirandello.
The Postmodern Age
Modernism can also be seen as a transitional phase between the modern and postmodern eras. While the exact definition and dates of the Postmodern Age are still being debated by cultural theorists and philosophers, the general consensus is that the Postmodern Age began during the second half of the 20th century and was marked by skepticism, self-consciousness, celebration of difference, and the reappraisal of modern conventions. The Modern Age took for granted scientific rationalism, the autonomous self, and the inevitability of progress; the Postmodern Age questioned or dismissed many of these assumptions. If the Modern Age valued order, reason, stability, and absolute truth, the Postmodern Age reveled in contingency, fragmentation, and instability. The effects of technology on culture, the rise of the Internet, and the Cold War all contributed to the emergence of the Postmodern Age.
The belief in objective truth that characterized the Modern Age is one of the major assumptions overturned in the Postmodern Age. Postmodernists instead took their cues from Erwin Schrödinger, the quantum physicist who famously devised a thought experiment in which a cat is sealed inside a box with a radioactive atom rigged to release a poison if it decays. Until the box is opened, Schrödinger proposed, the cat exists simultaneously in both states, dead and alive; both potential states are equally true. Although the thought experiment was devised to explore issues in quantum physics, it appealed to postmodernists for its assertion of radical uncertainty. Rather than an absolute, objective truth accessible through rational experimentation, the status of reality was contingent and depended on the observer.
This privileging of the relative over the absolute found its literary equivalent in the movement of deconstruction. While Victorian novelists took pains to make their books seem more realistic, postmodern narratives distrusted professions of realism and constantly reminded readers of the artificial nature of the story they were reading. The emphasis was not on the all-knowing author but on the reader: for the postmodernists, meaning was not injected into a work by its creator but depended on the reader’s subjective experience of the work. The poetry of Sylvia Plath and Allen Ginsberg exemplifies this shift; much of their work is emotionally charged and designed to create a dialogue with the reader, often forcing the reader to confront controversial issues such as mental illness or homosexuality.
Another way the Postmodern Age differed from the Modern Age was in the rejection of what philosopher Jean-François Lyotard deemed “grand narratives.” The Modern Age was marked by different large-scale theories that attempted to explain the totality of human experience, such as capitalism, Marxism, rationalism, Freudianism, Darwinism, fascism, and so on. However, increasing globalization and the rise of subcultures called into question the sorts of theories that claimed to explain everything at once. Totalitarian regimes during the 20th century, such as Adolf Hitler’s Third Reich and the USSR under Joseph Stalin, led to a mistrust of power and the systems held up by power. The Postmodern Age, Lyotard theorized, was one of micronarratives instead of grand narratives—that is, a multiplicity of small, localized understandings of the world, none of which can claim an ultimate or absolute truth. An older man in Kenya, for example, does not view the world in the same way as a young woman from New York. Even people from the same cultural backgrounds have different views of the world—when you were a teenager, did your parents understand your way of thinking? The diversity of human experience is a marked feature of the postmodern world. As Lyotard noted, “Eclecticism is the degree zero of contemporary general culture; one listens to reggae, watches a Western, eats McDonald’s food for lunch and local cuisine for dinner, wears Paris perfume in Tokyo and retro clothes in Hong Kong; knowledge is a matter for TV games (Lyotard, 1984).”
Postmodernists also mistrusted the idea of originality and freely borrowed across cultures and genres. William S. Burroughs gleefully proclaimed a sort of call to arms for his generation of writers in 1985: “Out of the closets and into the museums, libraries, architectural monuments, concert halls, bookstores, recording studios and film studios of the world. Everything belongs to the inspired and dedicated thief (Burroughs, 1993).” The feminist artist Barbara Kruger, for example, created works of art from old advertisements, and writers such as Kathy Acker reconstructed existing texts to form new stories. This rejection of traditional forms of art and expression embodies the Postmodern Age.
From the early Modern Age through the Postmodern Age, people have experienced the world in vastly different ways. Not only has technology rapidly become more complex, but culture itself has changed with the times. When reading further, it’s important to remember that forms of media and culture are hallmarks of different eras, and the different ways in which media are presented often tell us a lot about the culture and times.
Anderson, Chris. “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete,” Wired, June 23, 2008, http://www.wired.com/science/discoveries/magazine/16-07/pb_theory.
Burroughs, William S. “Les Velours,” The Adding Machine (New York: Arcade Publishing, 1993), 19–21.
Lyotard, Jean-François. The Postmodern Condition: A Report on Knowledge, trans. Geoff Bennington and Brian Massumi (Minneapolis: University of Minnesota Press, 1984).
Mass Media and Popular Culture
Burroughs’s jubilant call to bring art “out of the closets and into the museums” spoke to postmodernism’s willingness to meld high and low culture (Leonard, 1997). And although the Postmodern Age specifically embraced popular culture, mass media and pop culture have been entwined from their very beginnings. In fact, mass media often determines what does and does not make up the pop culture scene.
Historically, mass pop culture has been fostered by an active, tastemaking mass media that introduces and encourages the adoption of certain trends. Although tastemakers are similar in some ways to media gatekeepers, they differ in that they are most influential when the mass media is relatively small and concentrated. When only a few publications or programs reach millions of people, their writers and editors wield enormous influence. The New York Times’s restaurant reviews, for example, could once make or break a restaurant by granting (or withholding) a favorable rating.
Or take the example of Ed Sullivan’s variety show, which ran from 1948 to 1971, and is most famous for hosting the first U.S. appearance of the Beatles—a television event that was at the time the most-watched TV program ever. Sullivan hosted musical acts, comedians, actors, and dancers and had the reputation of being able to turn an act on the cusp of fame into full-fledged stars. Comedian Jackie Mason compared being on The Ed Sullivan Show to “an opera singer being at the Met. Or if a guy is an architect that makes the Empire State Building.…This was the biggest (Leonard, 1997).” Sullivan was a classic example of an influential tastemaker of his time. A more modern example is Oprah Winfrey, whose book club endorsements often send literature, including old classics like Leo Tolstoy’s Anna Karenina, skyrocketing to the top of The New York Times Best Sellers list.
Figure 1.11: For Elvis Presley’s third appearance on The Ed Sullivan show, he was shown only from the waist up; Sullivan considered his dancing too scandalous for family viewing. / Wikimedia Commons
Along with encouraging a mass audience to see (or skip) certain movies, television shows, video games, books, or fashion trends, tastemakers can create demand for entirely new products. Companies often turn to advertising firms to help create public hunger for an object that may not have existed even 6 months before. In the 1880s, when George Eastman developed the Kodak camera for personal use, photography was practiced mostly by professionals. “Though the Kodak was relatively cheap and easy to use, most Americans didn’t see the need for a camera; they had no sense that there was any value in visually documenting their lives,” noted New Yorker writer James Surowiecki (Surowiecki, 2003). Kodak became a wildly successful company not because Eastman was good at selling cameras, but because he understood that what he really had to sell was photography. Apple Inc. is a modern master of this technique: by leaking just enough information about a new product to stir curiosity, the company ensures that people will be waiting excitedly for the official release.
Tastemakers help keep culture vital by introducing the public to new ideas, music, programs, or products, but tastemakers are not immune to outside influence. In the traditional media model, large media companies set aside large advertising budgets to promote their most promising projects; tastemakers buzz about “the next big thing,” and obscure or niche works can get lost in the shuffle.
A Changing System for the Internet Age
In retrospect, the 20th century was a tastemaker’s dream. Advertisers, critics, and other cultural influencers had access to huge audiences through a number of mass-communication platforms. However, by the end of the century, the rise of cable television and the Internet had begun to make tastemaking a more complicated enterprise. While The Ed Sullivan Show regularly reached 50 million people in the 1960s, the most popular television series of 2009—American Idol—averaged around 25.5 million viewers per night, despite the fact that the 21st-century United States could claim more people and more television sets than ever before (Wikipedia, 2012). The proliferation of TV channels and other competing forms of entertainment meant that no single program or channel could dominate the attention of the American public as in Sullivan’s day.
Table 1.2: Viewings of Popular Television Broadcasts

| Program / Broadcast | Number of Viewers | Percent of Households |
|---|---|---|
| The Ed Sullivan Show / The Beatles’ first appearance | | |
| The Ed Sullivan Show / Elvis Presley’s first appearance | | |
| I Love Lucy / “Lucy Goes to the Hospital” | | |
| M*A*S*H / Series finale | | |
| Seinfeld / Series finale | | |
| American Idol / Season 5 finale | | |
Meanwhile, a low-tech home recording of a little boy acting loopy after a visit to the dentist (“David After Dentist”) garnered more than 37 million YouTube views in 2009 alone. The Internet appears to be eroding some of the tastemaking power of traditional media outlets. No longer is traditional mass media the dominant force in creating and promoting trends; instead, information spreads across the globe without the active involvement of traditional mass media. Websites made by nonprofessionals can reach more people daily than a major newspaper. Music review sites such as Pitchfork keep an eye out for the next big thing, while review aggregators like Rotten Tomatoes let readers browse hundreds of movie reviews by amateurs and professionals alike. Blogs make it possible for anyone with Internet access to potentially reach an audience of millions. Some popular bloggers have transitioned from the traditional media world to the digital world, but others have become well known without formal institutional support. The celebrity-gossip chronicler Perez Hilton had no formal training in journalism when he started his blog, PerezHilton.com, in 2005; within a few years, he was reaching millions of readers a month.
E-mail and text messages allow people to transmit messages almost instantly across vast geographic expanses. Although personal communications continue to dominate, e-mail and text messages are increasingly used to directly transmit information about important news events. When Barack Obama wanted to announce his selection of Joe Biden as his vice-presidential running mate in the 2008 election, he bypassed the traditional televised press conference and instead sent the news to his supporters directly via text message—2.9 million text messages, to be exact (Covey). Social networking sites, such as Facebook, and microblogging services, such as Twitter, are another source of late-breaking information. When Michael Jackson died of cardiac arrest in 2009, “RIP Michael Jackson” was a top trending topic on Twitter before mainstream media outlets first reported the news.
Thanks to these and other digital-age media, the Internet has become a pop culture force, both a source of amateur talent and a source of amateur promotion. However, traditional media outlets still maintain a large amount of control and influence over U.S. pop culture. One key indicator is the fact that many singers or writers who first make their mark on the Internet quickly transition to more traditional media—YouTube star Justin Bieber was signed by a mainstream record company, and blogger Perez Hilton is regularly featured on MTV and VH1. New-media stars are quickly absorbed into the old-media landscape.
Getting Around the Gatekeepers
Not only does the Internet give untrained individuals access to a huge audience for their art or opinions, but it also allows content creators to reach fans directly. Projects that may not have succeeded through traditional mass media may get a second chance through newer media. The profit-driven media establishment has been surprised by the success of some self-published books. For example, dozens of literary agents rejected first-time author Daniel Suarez’s novel Daemon before he decided to self-publish in 2006. Through savvy self-promotion and the support of influential bloggers, Suarez garnered enough attention to land a contract with a major publishing house.
Figure 1.12: E-readers offer authors a way to get around the traditional publishing industry, but their thousands of options can make choosing hard on readers. / Edvvc – eReader Comparison
Suarez’s story, though certainly exceptional, raises some of the questions facing creators and consumers of pop culture in the Internet age. Without the influence of an agent, editor, or PR company, self-published content may be able to hew closer to the creator’s intention. However, much of the detailed marketing work must be performed by the work’s creator instead of by a specialized public relations team. And with so many self-published, self-promoted works uploaded to the Internet every day, it’s easy for things—even good things—to get lost in the shuffle.
Critic Laura Miller spells out some of the ways in which writers in particular can take control of their own publishing: “Writers can upload their works to services run by Amazon, Apple and… Barnes & Noble, transforming them into e-books that are instantly available in high-profile online stores. Or they can post them on services like Urbis.com, Quillp.com, or CompletelyNovel.com and coax reviews from other hopeful users (Miller, 2010).” Miller also points out that many of these companies can produce hard copies of books as well. While such a system may be a boon for writers who haven’t had success with the traditional media establishment, Miller notes that it may not be the best option for readers, who “rarely complain that there isn’t enough of a selection on Amazon or in their local superstore; they’re more likely to ask for help in narrowing down their choices (Miller, 2010).”
The question remains: Will the Internet era be marked by a huge and diffuse pop culture, where the power of traditional mass media declines and, along with it, the power of the universalizing blockbuster hit? Or will the Internet create a new set of tastemakers—influential bloggers—or even serve as a platform for the old tastemakers to take on new forms?
In 1993, The New York Times restaurant critic Ruth Reichl wrote a review about her experiences at the upscale Manhattan restaurant Le Cirque. She detailed the poor service she received when the restaurant staff did not know her and the excellent service she received when they realized she was a professional food critic. Her article illustrated how the power to publish reviews could affect a person’s experience at a restaurant. The Internet, which turned everyone with the time and interest into a potential reviewer, allowed those ordinary people to have their voices heard. In the mid-2000s, websites such as Yelp and TripAdvisor boasted hundreds of reviews of restaurants, hotels, and salons provided by users. Amazon allows users to review any product it sells, from textbooks to bathing suits. The era of the democratized review had come, and tastemaking was now everyone’s job.
By crowdsourcing (harnessing the efforts of a number of individuals online to solve a problem) the review process, the idea was, these sites would arrive at a more accurate description of the service in question. One powerful reviewer would no longer be able to wield disproportionate power; instead, the wisdom of the crowd would make or break restaurants, movies, and everything else. Anyone who felt treated badly or scammed now had recourse to tell the world about it. By 2008, Yelp had accumulated 4 million reviews.
However, mass tastemaking isn’t as perfect as some had promised. Certain reviewers can overly influence a product’s overall rating by contributing multiple votes. One study found that a handful of Amazon users were casting hundreds of votes, while most users rarely wrote reviews at all. Online reviews also tend to skew to extremes—more reviews are written by the ecstatic and the furious, while the moderately pleased aren’t riled up enough to post online about their experiences. And while traditional critics are supposed to adhere to ethical standards, there’s no such standard for online reviews. Savvy authors or restaurant owners have been known to slyly insert positive reviews or attempt to skew ratings systems. To get an accurate picture, potential buyers may find themselves wading through 20 or 30 online reviews, most of them from nonprofessionals. And sometimes those people aren’t professionals for a reason. Consider these user reviews on Amazon of William Shakespeare’s Hamlet: “There is really no point and it’s really long,” “I really didn’t enjoy reading this book and I wish that our English teacher wouldn’t force my class to read this play,” and “don’t know what Willy Shakespeare was thinking when he wrote this one play tragedy, but I thought this sure was boring! Hamlet does too much talking and not enough stuff.” While some may argue that these are valid criticisms of the play, these comments are certainly a far cry from the thoughtful critique of a professional literary critic.
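The arithmetic behind this skew is easy to see. The short sketch below uses invented numbers (purely for illustration; no real product or site data) to show how a vocal minority of extreme reviewers, plus a little vote-stuffing, can drag an average star rating well away from the sentiment of the typical customer:

```python
# Hypothetical illustration: how extreme and repeated reviews skew a mean rating.

def average_rating(ratings):
    """Mean of a list of 1-5 star ratings."""
    return sum(ratings) / len(ratings)

# 20 moderately pleased customers who never bother to post: sentiment ~4 stars.
silent_majority = [4] * 20

# Only the ecstatic and the furious actually write reviews...
posted = [5, 5, 5, 1, 1]

# ...and one disgruntled user casts five extra 1-star votes.
ballot_stuffed = posted + [1] * 5

print(average_rating(silent_majority))  # 4.0 (true sentiment)
print(average_rating(posted))           # 3.4 (extremes only)
print(average_rating(ballot_stuffed))   # 2.2 (after vote-stuffing)
```

The displayed rating ends up reflecting who chose to post, not how the average customer actually felt—which is exactly the sampling problem the paragraph above describes.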
These and other issues underscore the point of having reviews in the first place—that it’s an advantage to have certain places, products, or ideas examined and critiqued by a trusted and knowledgeable source. In an article about Yelp, The New York Times noted that one of the site’s elite reviewers had racked up more than 300 reviews in 3 years, and then pointed out that “By contrast, a New York Times restaurant critic might take six years to amass 300 reviews. The critic visits a restaurant several times, strives for anonymity and tries to sample every dish on the menu (McNeil, 2008).” Whatever your vantage point, it’s clear that old-style tastemaking is still around and still valuable—but the democratic review is here to stay.
Covey, Nic. “Flying Fingers,” Nielsen, http://en-us.nielsen.com/main/insights/consumer_insight/issue_12/flying_fingers.
Leonard, John. “The Ed Sullivan Age,” American Heritage, May/June 1997.
McNeil, Donald G. “Eat and Tell,” New York Times, November 4, 2008, Dining & Wine section.
Miller, Laura. “When Anyone Can Be a Published Author,” Salon, June 22, 2010, http://www.salon.com/books/laura_miller/2010/06/22/slush.
Surowiecki, James. “The Tastemakers,” New Yorker, January 13, 2003.
Wikipedia, s.v. “The Ed Sullivan Show,” last modified June 26, 2012, http://en.wikipedia.org/wiki/The_Ed_Sullivan_Show; Wikipedia, s.v. “American Idol,” last modified June 26, 2012, http://en.wikipedia.org/wiki/American_Idol.
In Gutenberg’s age and the subsequent modern era, literacy—the ability to read and write—was a concern not only of educators, but also of politicians, social reformers, and philosophers. A literate population, many reasoned, would be able to seek out information, stay informed about the news of the day, communicate effectively, and make informed decisions in many spheres of life. Because of this, literate people made better citizens, parents, and workers. Several centuries later, as global literacy rates continued to grow, there was a new sense that merely being able to read and write was not enough. In a media-saturated world, individuals needed to be able to sort through and analyze the information they were bombarded with every day. In the second half of the 20th century, the skill of being able to decode and process the messages and symbols transmitted via media was named media literacy. According to the nonprofit National Association for Media Literacy Education (NAMLE), a person who is media literate can access, analyze, evaluate, and communicate information. Put another way by John Culkin, a pioneering advocate for media literacy education, “The new mass media—film, radio, TV—are new languages, their grammar as yet unknown (Moody, 1993).” Media literacy seeks to give media consumers the ability to understand this new language. The following are questions asked by those that are media literate:
- Who created the message?
- What are the author’s credentials?
- Why was the message created?
- Is the message trying to get me to act or think in a certain way?
- Is someone making money for creating this message?
- Who is the intended audience?
- How do I know this information is accurate?
Why Be Media Literate?
Culkin called the pervasiveness of media “the unnoticed fact of our present,” noting that media information was as omnipresent and easy to overlook as the air we breathe (and, he noted, “some would add that it is just as polluted”) (Moody, 1993). Our exposure to media starts early—a study by the Kaiser Family Foundation found that 68 percent of children ages 2 and younger spend an average of 2 hours in front of a screen (either computer or television) each day, while children under 6 spend as much time in front of a screen as they do playing outside (Lewin). U.S. teenagers are spending an average of 7.5 hours with media daily, nearly as long as they spend in school. Media literacy isn’t merely a skill for young people, however. Today’s Americans get much of their information from various media sources—but not all that information is created equal. One crucial role of media literacy education is to enable us to skeptically examine the often-conflicting media messages we receive every day.
Many of the hours people spend with media are with commercial-sponsored content. The Federal Trade Commission (FTC) estimated that each child aged 2 to 11 saw, on average, 25,629 television commercials in 2004 alone, or more than 10,700 minutes of ads. Each adult saw, on average, 52,469 ads, or about 15.5 days’ worth of television advertising (Holt, 2007). Children (and adults) are bombarded with contradictory messages—newspaper articles about the obesity epidemic run side by side with ads touting soda, candy, and fast food. The American Academy of Pediatrics maintains that advertising directed to children under 8 is “inherently deceptive” and exploitative because young children can’t tell the difference between programs and commercials (Shifrin, 2005). Advertising often uses techniques of psychological pressure to influence decision making. Ads may appeal to vanity, insecurity, prejudice, fear, or the desire for adventure. This is not always done to sell a product—antismoking public service announcements may rely on disgusting images of blackened lungs to shock viewers. Nonetheless, media literacy involves teaching people to be guarded consumers and to evaluate claims with a critical eye.
Bias, Spin, and Misinformation
Advertisements may have the explicit goal of selling a product or idea, but they’re not the only kind of media message with an agenda. A politician may hope to persuade potential voters that he has their best interests at heart. An ostensibly objective journalist may allow her political leanings to subtly slant her articles. Magazine writers might avoid criticizing companies that advertise heavily in their pages. News reporters may sensationalize stories to boost ratings—and advertising rates.
Mass-communication messages are created by individuals, and each individual has his or her own set of values, assumptions, and priorities. Accepting media messages at face value could lead to confusion because of all the contradictory information available. For example, in 2010, a highly contested governor’s race in New Mexico led to conflicting ads from both candidates, Diane Denish and Susana Martinez, each claiming that the other agreed to policies that benefited sex offenders. According to media watchdog site FactCheck.org, the Denish team’s ad “shows a preteen girl—seemingly about 9 years old—going down a playground slide in slow-motion, while ominous music plays in the background and an announcer discusses two sex crime cases. It ends with an empty swing, as the announcer says: ‘Today we don’t know where these sex offenders are lurking, because Susana Martinez didn’t do her job.’” The opposing ad proclaims that “a department in Denish’s cabinet gave sanctuary to criminal illegals, like child molester Juan Gonzalez (Robertson & Kiely, 2010).” Both claims are highly inflammatory, play on fear, and distort the reality behind each situation. Media literacy involves educating people to look critically at these and other media messages and to sift through various messages and make sense of the conflicting information we face every day.
New Skills for a New World
In the past, one goal of education was to provide students with the information deemed necessary to successfully engage with the world. Students memorized multiplication tables, state capitals, famous poems, and notable dates. Today, however, vast amounts of information are available at the click of a mouse. Even before the advent of the Internet, noted communications scholar David Berlo foresaw the consequences of expanding information technology: “Most of what we have called formal education has been intended to imprint on the human mind all of the information that we might need for a lifetime.” Changes in technology necessitate changes in how we learn, Berlo noted, and these days “education needs to be geared toward the handling of data rather than the accumulation of data (Shaw, 2003).”
Wikipedia, a hugely popular Internet encyclopedia, is at the forefront of the debate on the proper use of online sources. In 2007, Middlebury College banned the use of Wikipedia as a source in history papers and exams. One of the school’s librarians noted that the online encyclopedia “symbolizes the best and worst of the Internet. It’s the best because everyone gets his/her say and can state their views. It’s the worst because people who use it uncritically take for truth what is only opinion (Byers, 2007).” Or, as comedian and satirist Stephen Colbert put it, “Any user can change any entry, and if enough other users agree with them, it becomes true (Colbert, 2006).” A computer registered to the U.S. Democratic Party changed the Wikipedia page for Rush Limbaugh to proclaim that he was “racist” and a “bigot,” and a person working for the electronic voting machine manufacturer Diebold was found to have erased paragraphs connecting the company to Republican campaign funds (Fildes, 2007). Media literacy teaches today’s students how to sort through the Internet’s cloud of data, locate reliable sources, and identify bias and unreliable sources.
Individual Accountability and Popular Culture
Ultimately, media literacy involves teaching that images are constructed with various aims in mind and that it falls to the individual to evaluate and interpret these media messages. Mass communication may be created and disseminated by individuals, businesses, governments, or organizations, but it is always received by an individual. Education, life experience, and a host of other factors make each person interpret constructed media in different ways; there is no correct way to interpret a media message. But on the whole, better media literacy skills help us function better in our media-rich environment, enabling us to be better democratic citizens, smarter shoppers, and more skeptical media consumers. When analyzing media messages, consider the following:
- Author: Consider who is presenting the information. Is it a news organization, a corporation, or an individual? What links do they have to the information they are providing? A news station might be owned by the company it is reporting on; likewise, an individual might have financial reasons for supporting a certain message.
- Format: Television and print media often use images to grab people’s attention. Do the visuals only present one side of the story? Is the footage overly graphic or designed to provoke a specific reaction? Which celebrities or professionals are endorsing this message?
- Audience: Imagine yourself in another’s shoes. Would someone of the opposite gender feel the same way as you do about this message? How might someone of a different race or nationality feel about it? How might an older or younger person interpret this information differently? Was this message made to appeal to a specific audience?
- Content: Even content providers that try to present information objectively can have an unconscious slant. Analyze who is presenting this message. Does he or she have any clear political affiliations? Is he or she being paid to speak or write this information? What unconscious influences might be at work?
- Purpose: Nothing is communicated by mass media without a reason. What reaction is the message trying to provoke? Are you being told to feel or act a certain way? Examine the information closely and look for possible hidden agendas.
With these considerations as a jumping-off place, we can ensure that we’re staying informed about where our information comes from and why it is being sent—important steps in any media literacy education (Center for Media Literacy).
Byers, Meredith. “Controversy Over Use of Wikipedia in Academic Papers Arrives at Smith,” Smith College Sophian, News section, March 8, 2007.
Center for Media Literacy, “Five Key Questions Form Foundation for Media Inquiry,” http://www.medialit.org/reading-room/five-key-questions-form-foundation-media-inquiry.
Colbert, Stephen. “The Word: Wikiality,” The Colbert Report, July 31, 2006.
Fildes, Jonathan. “Wikipedia ‘Shows CIA Page Edits,’” BBC News, Science and Technology section, August 15, 2007.
Holt, Debra, and others. Children’s Exposure to TV Advertising in 1977 and 2004, Federal Trade Commission Bureau of Economics staff report, June 1, 2007.
Lewin. “If Your Kids Are Awake.”
Moody, Kate. “John Culkin, SJ: The Man Who Invented Media Literacy: 1928–1993,” Center for Media Literacy, http://www.medialit.org/reading_room/article408.html.
Robertson, Lori and Eugene Kiely, “Mudslinging in New Mexico: Gubernatorial Candidates Launch Willie Horton-Style Ads, Each Accusing the Other of Enabling Sex Offenders to Strike Again,” FactCheck.org, June 24, 2010, http://factcheck.org/2010/06/mudslinging-in-new-mexico/.
Shaw, David. “A Plea for Media Literacy in our Nation’s Schools,” Los Angeles Times, November 30, 2003.
Shifrin, Donald. “Perspectives on Marketing, Self-Regulation and Childhood Obesity” (remarks, Federal Trade Commission Workshop, Washington, DC, July 14–15, 2005).
From Understanding Media and Culture: An Introduction to Mass Communication, originally published by the University of Minnesota Libraries under a Creative Commons Attribution-NonCommercial 4.0 International license.