This content is copyright Flat World Knowledge and the author(s). This content was captured in November 2012,
from http://catalog.flatworldknowledge.com/bookhub/reader/3833.
A Creative Commons Attribution-NonCommercial-ShareAlike license was clearly declared and displayed.

This content is being redistributed in accordance with the permissions of that license.
Figure 8.1
In 2009, many moviegoers were amazed by the three-dimensional (3-D) film Avatar. Avatar grossed over $1.8 billion in theaters worldwide, $1.35 billion from 3-D sales alone.Brandon Gray, “‘Avatar’ is New King of the World,” Box Office Mojo, January 26, 2010, http://boxofficemojo.com/news/?id=2657. Following in that vein, dozens of other movie studios released 3-D films, resulting in lesser box office successes such as Alice in Wonderland, Clash of the Titans, and Shrek Forever After. Many film reviewers and audiences seemed adamant—3-D movies were the wave of the future.
However, could this eye-popping technology actually ruin our moviegoing experience? Brian Moylan, a critic for Gawker.com, argues that it already has. The problem with 3-D, he says, is that “It is so mind-numbingly amazing that narrative storytelling hasn’t caught up with the technology. The corporate screenwriting borgs are so busy trying to come up with plot devices to highlight all the newfangled whoosiwhatsits—objects being hurled at the audience, flying sequences, falling leaves, glowing Venus Flytraps—that no one is really bothering to tell a tale.”Brian Moylan, “3D is Going to Ruin Movies for a Long Time to Come,” Gawker, http://gawker.com/#!5484085/3d-is-going-to-ruin-movies-for-a-long-time-to-come.
James Cameron, director of Avatar, agrees. “[Studios] think, ‘what was [sic] the takeaway lessons from Avatar? Oh you should make more money with 3-D.’ They ignore the fact that we natively authored the film in 3-D, and [they] decide that what we accomplished in several years of production could be done in an eight week (post-production 3-D) conversion [such as] with Clash of the Titans.”Edward Baig, “‘Avatar’ Director James Cameron: 3D Promising, but Caution Needed,” USA Today, March 11, 2010, http://content.usatoday.com/communities/technologylive/post/2010/03/james-cameron/1. Cameron makes the following point: While recent films such as Avatar (2009) and Beowulf (2007) were created exclusively for 3-D, many other filmmakers have converted their movies to 3-D after filming was already complete. Clash of the Titans is widely criticized because its 3-D effects were quickly added in postproduction. Edward Baig, “‘Avatar’ Director James Cameron: 3D Promising, but Caution Needed,” USA Today, March 11, 2010, http://content.usatoday.com/communities/technologylive/post/2010/03/james-cameron/1.
What effect does this have on audiences? Aside from the complaints of headaches and nausea (and the fact that some who wear glasses regularly can find it uncomfortable or even impossible to wear 3-D glasses on top of their own), many say that the new technology simply makes movies look worse. The film critic Roger Ebert has repeatedly denounced the technology, noting that movies such as The Last Airbender look like they’re “filmed with a dirty sheet over the lens.”Roger Ebert, review of The Last Airbender, directed by M. Night Shyamalan, Chicago Sun Times, June 30, 2010, http://rogerebert.suntimes.com/apps/pbcs.dll/article?AID=/20100630/REVIEWS/100639999. 3-D technology can cause a movie to look fuzzier, darker, and generally less cinematically attractive. However, movie studios are finding 3-D films attractive for another reason.
Because seeing a movie in 3-D is considered a “premium” experience, consumers are expected to pay higher prices. And with the increasing popularity of IMAX 3-D films, tickets may surpass $20 per person.Andrew Stewart and Pamela McClintock, “Big Ticket Price Increase for 3D Pics,” Variety, March 24, 2010, http://www.variety.com/article/VR1118016878.html?categoryid=13&cs=1. This gives 3-D films a clear advantage over 2-D ones: audiences are willing to pay more to see them.
The recent 3-D boom has often been compared to the rise of color film in the early 1950s. However, some maintain that it’s just a fad. Will 3-D technology affect the future of filmmaking? With a host of new 3-D technologies for the home theater being released in 2010, many are banking on the fact that it will. Director James Cameron, however, is unsure of the technology’s continuing popularity, arguing that “If people put bad 3-D in the marketplace they’re going to hold back or even threaten the emerging of 3-D.”Edward Baig, “‘Avatar’ Director James Cameron: 3D Promising, but Caution Needed,” USA Today, March 11, 2010, http://content.usatoday.com/communities/technologylive/post/2010/03/james-cameron/1. What is important, he maintains, is the creative aspect of moviemaking—no technology can replace good filmmaking. In the end, audiences will determine the medium’s popularity. Throughout the history of film, Technicolor dyes, enhanced sound systems, and computer-generated graphics have boasted huge box-office revenues; however, it’s ultimately the viewers who determine what a good movie is and who set the standard for future films.
The movie industry as we know it today originated in the early 19th century through a series of technological developments: the creation of photography, the discovery of the illusion of motion by combining individual still images, and the study of human and animal locomotion. The history presented here begins at the culmination of these technological developments, where the idea of the motion picture as an entertainment industry first emerged. Since then, the industry has seen extraordinary transformations, some driven by the artistic visions of individual participants, some by commercial necessity, and still others by accident. The history of the cinema is complex, and for every important innovator and movement listed here, others have been left out. Nonetheless, after reading this section you will understand the broad arc of the development of a medium that has captured the imaginations of audiences worldwide for over a century.
While the experience of watching movies on smartphones may seem like a drastic departure from the communal nature of film viewing as we think of it today, in some ways the small-format, single-viewer display is a return to film’s early roots. In 1891, the inventor Thomas Edison, together with William Dickson, a young laboratory assistant, came out with what they called the kinetoscope, a device that would become the predecessor to the motion picture projector. The kinetoscope was a cabinet with a window through which individual viewers could experience the illusion of a moving image.Europe 1789–1914: Encyclopedia of the Age of Industry and Empire, vol. 1, s.v. “Cinema,” by Alan Williams, Gale Virtual Reference Library.“The Kinetoscope,” British Movie Classics, http://www.britishmovieclassics.com/thekinetoscope.php. A perforated celluloid film strip (a thin, transparent film coated with light-sensitive chemicals to record images) with a sequence of images on it was rapidly spooled between a lightbulb and a lens, creating the illusion of motion.Britannica Online, s.v. “Kinetoscope,” http://www.britannica.com/EBchecked/topic/318211/Kinetoscope/318211main/Article. The images viewers could see in the kinetoscope captured events and performances that had been staged at Edison’s film studio in West Orange, New Jersey, especially for the Edison kinetograph (the camera that produced kinetoscope film sequences): circus performances, dancing women, cockfights, boxing matches, and even a tooth extraction by a dentist.David Robinson, From Peep Show to Palace: The Birth of American Film (New York: Columbia University Press, 1994), 43–44.
Figure 8.2

The Edison kinetoscope.
As the kinetoscope gained popularity, the Edison Company began installing machines in hotel lobbies, amusement parks, and penny arcades, and soon kinetoscope parlors—where customers could pay around 25 cents for admission to a bank of machines—had opened around the country. However, when friends and collaborators suggested that Edison find a way to project his kinetoscope images for audience viewing, he apparently refused, claiming that such an invention would be a less profitable venture.Britannica Online. s.v. “History of the Motion Picture.” http://www.britannica.com/EBchecked/topic/394161/history-of-the-motion-picture; Robinson, From Peep Show to Palace, 45, 53.
Because Edison hadn’t secured an international patent for his invention, variations of the kinetoscope were soon being copied and distributed throughout Europe. This new form of entertainment was an instant success, and a number of mechanics and inventors, seeing an opportunity, began toying with methods of projecting the moving images onto a larger screen. However, it was the invention of two brothers, Auguste and Louis Lumière—photographic goods manufacturers in Lyon, France—that saw the most commercial success. In 1895, the brothers patented the Cinématographe (from which we get the term cinema), a lightweight film projector that also functioned as a camera and printer. Unlike the Edison kinetograph, the Cinématographe was lightweight enough for easy outdoor filming, and over the years the brothers used the camera to take well over 1,000 short films, most of which depicted scenes from everyday life. In December 1895, in the basement lounge of the Grand Café, Rue des Capucines in Paris, the Lumières held the world’s first ever commercial film screening, a sequence of about 10 short scenes, including the brothers’ first film, Workers Leaving the Lumière Factory, a segment lasting less than a minute and depicting workers leaving the family’s photographic instrument factory at the end of the day, as shown in the still frame in Figure 8.3.Encyclopedia of the Age of Industry and Empire, s.v. “Cinema.”
Believing that audiences would get bored watching scenes that they could just as easily observe on a casual walk around the city, Louis Lumière claimed that the cinema was “an invention without a future,”Louis Menand, “Gross Points,” New Yorker, February 7, 2005, http://www.newyorker.com/archive/2005/02/07/050207crat_atlarge. but a demand for motion pictures grew at such a rapid rate that soon representatives of the Lumière company were traveling throughout Europe and the world, showing half-hour screenings of the company’s films. While cinema initially competed with other popular forms of entertainment—circuses, vaudeville acts, theater troupes, magic shows, and many others—eventually it would supplant these various entertainments as the main commercial attraction.Louis Menand, “Gross Points,” New Yorker, February 7, 2005, http://www.newyorker.com/archive/2005/02/07/050207crat_atlarge. Within a year of the Lumières’ first commercial screening, competing film companies were offering moving-picture acts in music halls and vaudeville theaters across Great Britain. In the United States, the Edison Company, having purchased the rights to an improved projector that they called the Vitascope, held their first film screening in April 1896 at Koster and Bial’s Music Hall in Herald Square, New York City.
Figure 8.3

Workers Leaving the Lumière Factory: One of the first films viewed by an audience.
Film’s profound impact on its earliest viewers is difficult to imagine today, inundated as many are by video images. However, the sheer volume of reports about early audiences’ disbelief, delight, and even fear at what they were seeing suggests that viewing a film was an overwhelming experience for many. Spectators gasped at the realistic details in films such as Robert Paul’s Rough Sea at Dover, and at times people panicked and tried to flee the theater during films in which trains or moving carriages sped toward the audience.David Robinson, From Peep Show to Palace: The Birth of American Film (New York: Columbia University Press, 1994), 63. Even the public’s perception of film as a medium was considerably different from the contemporary understanding; the moving image was an improvement upon the photograph—a medium with which viewers were already familiar—and this is perhaps why the earliest films documented events in brief segments but didn’t tell stories. During this “novelty period” of cinema, audiences were more interested in the phenomenon of the film projector itself, so vaudeville halls advertised the kind of projector they were using (for example, “The Vitascope—Edison’s Latest Marvel”)Andrei Ionut Balcanasu, Sergey V. Smagin, and Stephanie K. Thrift, “Edison and the Lumiere Brothers,” Cartoons and Cinema of the 20th Century, http://library.thinkquest.org/C0118600/index.phtml?menu=en%3B1%3Bci1001.html., rather than the names of the films.Britannica Online. s.v. “History of the Motion Picture.” http://www.britannica.com/EBchecked/topic/394161/history-of-the-motion-picture
By the close of the 19th century, as public excitement over the moving picture’s novelty gradually wore off, filmmakers were also beginning to experiment with film’s possibilities as a medium in itself (not simply, as it had been regarded up until then, as a tool for documentation, analogous to the camera or the phonograph). Technical innovations allowed filmmakers like Parisian cinema owner Georges Méliès to experiment with special effects that produced seemingly magical transformations on screen: flowers turned into women, people disappeared with puffs of smoke, a man appeared where a woman had just been standing, and other similar tricks.David Robinson, From Peep Show to Palace: The Birth of American Film (New York: Columbia University Press, 1994), 74–75; Encyclopedia of the Age of Industry and Empire, s.v. “Cinema.”
Méliès, a former magician, invented the “trick film,” which used techniques such as stop-motion photography to make objects disappear, reappear, and transform; producers in England and the United States soon began to imitate these films. Méliès was also the one to transform cinema into the narrative medium it is today. Whereas before, filmmakers had only ever created single-shot films that lasted a minute or less, Méliès began joining these short films together to create stories. His 30-scene Trip to the Moon (1902), a film based on a Jules Verne novel, may have been the most widely seen production in cinema’s first decade.David Robinson, From Peep Show to Palace: The Birth of American Film (New York: Columbia University Press, 1994), 441. However, Méliès never developed his technique beyond treating the narrative film as a staged theatrical performance; his camera, representing the vantage point of an audience facing a stage, never moved during the filming of a scene. In 1912, Méliès released his last commercially successful production, The Conquest of the Pole, and from then on, he lost audiences to filmmakers who were experimenting with more sophisticated techniques.Encyclopedia of Communication and Information (New York: MacMillan Reference USA, 2002), s.v. “Méliès, Georges,” by Ted C. Jones, Gale Virtual Reference Library.
Figure 8.4

Georges Méliès’ Trip to the Moon was one of the first films to incorporate fantasy elements and to use “trick” filming techniques, both of which heavily influenced future filmmakers.
One of these innovative filmmakers was Edwin S. Porter, a projectionist and engineer for the Edison Company. Porter’s 12-minute film, The Great Train Robbery (1903), broke with the stagelike compositions of Méliès-style films through its use of editing, camera pans, rear projections, and diagonally composed shots that produced a continuity of action. Not only did The Great Train Robbery establish the realistic narrative as a standard in cinema, it was also the first major box-office hit. Its success paved the way for the growth of the film industry, as investors, recognizing the motion picture’s great moneymaking potential, began opening the first permanent film theaters around the country.
Known as nickelodeons because of their 5-cent admission charge, these early motion picture theaters, often housed in converted storefronts, were especially popular among the working class of the time, who couldn’t afford live theater. Between 1904 and 1908, around 9,000 nickelodeons appeared in the United States. It was the nickelodeon’s popularity that established film as a mass entertainment medium.Dictionary of American History, 3rd ed., s.v. “Nickelodeon,” by Ryan F. Holznagel, Gale Virtual Reference Library.
As the demand for motion pictures grew, production companies were created to meet it. At the peak of nickelodeon popularity in 1910,Britannica Online, s.v. “nickelodeon.” there were 20 or so major motion picture companies in the United States. However, heated disputes often broke out among these companies over patent rights and industry control, leading even the most powerful among them to fear fragmentation that would loosen their hold on the market.Raymond Fielding, A Technological History of Motion Pictures and Television (Berkeley: California Univ. Press, 1967) 21. Because of these concerns, the 10 leading companies—including Edison, Biograph, Vitagraph, and others—formed the Motion Picture Patents Company (MPPC) in 1908. The MPPC was a trade group that pooled the most significant motion picture patents and established an exclusive contract between these companies and the Eastman Kodak Company as a supplier of film stock. Also known as the Trust, the MPPC aimed to standardize the industry and shut out competition through monopolistic control. Under the Trust’s licensing system, only certain licensed companies could participate in the exchange, distribution, and production of film at different levels of the industry—a shut-out tactic that eventually backfired, leading the excluded, independent distributors to organize in opposition to the Trust.Raymond Fielding, A Technological History of Motion Pictures and Television (Berkeley: California Univ. Press, 1967) 21; David Robinson, From Peep Show to Palace: The Birth of American Film (New York: Columbia University Press, 1994), 101–102.
In these early years, theaters were still running single-reel films, which came at a standard length of 1,000 feet, allowing for about 16 minutes of playing time. However, companies began to import multiple-reel films from European producers around 1907, and the format gained popular acceptance in the United States in 1912 with Louis Mercanton’s highly successful Queen Elizabeth, a three-and-a-half reel “feature,” starring the French actress Sarah Bernhardt. As exhibitors began to show more features—as the multiple-reel film came to be called—they discovered a number of advantages over the single-reel short. For one thing, audiences saw these longer films as special events and were willing to pay more for admission, and because of the popularity of these feature narratives, features generally experienced longer runs in theaters than their single-reel predecessors.“Pre World-War I US Cinema,” Motion Pictures: The Silent Feature: 1910-27, http://www.uv.es/EBRIT/macro/macro_5004_39_4.html#0009. Additionally, the feature film gained popularity among the middle classes, who saw its length as analogous to the more “respectable” entertainment of live theater.“Pre World-War I US Cinema,” Motion Pictures: The Silent Feature: 1910-27, http://www.uv.es/EBRIT/macro/macro_5004_39_4.html#0009. Following the example of the French film d’art, U.S. feature producers often took their material from sources that would appeal to a wealthier and better educated audience, such as histories, literature, and stage productions.David Robinson, From Peep Show to Palace: The Birth of American Film (New York: Columbia University Press, 1994), 135, 144.
As it turns out, the feature film was one factor that brought about the eventual downfall of the MPPC. The inflexible structuring of the Trust’s exhibition and distribution system made the organization resistant to change. When Vitagraph, a movie studio and Trust member, began to release features like A Tale of Two Cities (1911) and Uncle Tom’s Cabin (1910), the Trust forced it to exhibit the films serially in single-reel showings to keep with industry standards. The MPPC also underestimated the appeal of the star system, a trend that began when producers chose famous stage actors like Mary Pickford and James O’Neill to play the leading roles in their productions and to grace their advertising posters.David Robinson, From Peep Show to Palace: The Birth of American Film (New York: Columbia University Press, 1994), 140. Because of the MPPC’s inflexibility, independent companies were the only ones able to capitalize on two important trends that were to become film’s future: multiple-reel features and star power. Today, few people would recognize names like Vitagraph or Biograph, but the independents that outlasted them—Universal, Goldwyn (which would later merge with Metro and Mayer), Fox (later 20th Century Fox), and Paramount (the later version of the Lasky Corporation)—have become household names.
As moviegoing increased in popularity among the middle class, and as feature films began keeping audiences in their seats for longer periods of time, exhibitors found a need to create more comfortable and richly decorated theater spaces to attract their audiences. These “dream palaces,” so called because of their often lavish embellishments of marble, brass, gilding, and cut glass, not only came to replace the nickelodeon theater, but also created the demand that would lead to the Hollywood studio system. Some producers realized that the growing demand for new work could only be met if films were produced on a regular, year-round schedule. However, this was impractical with the existing system, which often relied on outdoor filming and was predominantly based in Chicago and New York—two cities whose weather conditions prevented outdoor filming for a significant portion of the year. Different companies attempted filming in warmer locations such as Florida, Texas, and Cuba, but the place where producers eventually found the most success was a small, industrial suburb of Los Angeles called Hollywood.
Hollywood proved to be an ideal location for a number of reasons. Not only was the climate temperate and sunny year-round, but land was plentiful and cheap, and the location allowed close access to a number of diverse topographies: mountains, lakes, desert, coasts, and forests. By 1915, more than 60 percent of U.S. film production was centered in Hollywood.Britannica Online. s.v. “History of the Motion Picture.” http://www.britannica.com/EBchecked/topic/394161/history-of-the-motion-picture
While the development of narrative film was largely driven by commercial factors, it is also important to acknowledge the role of individual artists who turned it into a medium of personal expression. The motion picture of the silent era was generally simplistic in nature; acted in overly animated movements to engage the eye; and accompanied by live music, played by musicians in the theater, and written titles to create a mood and to narrate a story. Within the confines of this medium, one filmmaker in particular emerged to transform the silent film into an art and to unlock its potential as a medium of serious expression and persuasion. D. W. Griffith, who entered the film industry as an actor in 1907, quickly moved to a directing role in which he worked closely with his camera crew to experiment with shots, angles, and editing techniques that could heighten the emotional intensity of his scenes. He found that by practicing parallel editing, in which a film alternates between two or more scenes of action, he could create an illusion of simultaneity. He could then heighten the tension of the film’s drama by alternating between cuts more and more rapidly until the scenes of action converged. Griffith used this technique to great effect in his controversial film The Birth of a Nation, which will be discussed in greater detail later on in this chapter. Other techniques that Griffith employed to new effect included panning shots, through which he was able to establish a sense of scene and to engage his audience more fully in the experience of the film, and tracking shots, or shots that traveled with the movement of a scene,“Griffith,” Motion Pictures, http://www.uv.es/EBRIT/macro/macro_5004_39_6.html#0011. which allowed the audience—through the eye of the camera—to participate in the film’s action.
As film became an increasingly lucrative U.S. industry, prominent industry figures like D. W. Griffith, slapstick comedian/director Charlie Chaplin, and actors Mary Pickford and Douglas Fairbanks grew extremely wealthy and influential. Public attitudes toward stars and toward some stars’ extravagant lifestyles were divided, much as they are today: On the one hand, these celebrities were idolized and imitated in popular culture, yet at the same time, they were criticized for representing a threat, on and off screen, to traditional morals and social order. And much as it does today, the news media liked to sensationalize the lives of celebrities to sell stories. Comedian Roscoe “Fatty” Arbuckle, who worked alongside future icons Charlie Chaplin and Buster Keaton, was at the center of one of the biggest scandals of the silent era. When Arbuckle hosted a marathon party over Labor Day weekend in 1921, one of his guests, model Virginia Rappe, was rushed to the hospital, where she later died. Reports of a drunken orgy, rape, and murder surfaced. Following World War I, the United States was in the middle of significant social reforms, such as Prohibition. Many feared that movies and their stars could threaten the moral order of the country. Because of the nature of the crime and the celebrity involved, these fears became inextricably tied to the Arbuckle case.“Post World War I US Cinema,” Motion Pictures, http://www.uv.es/EBRIT/macro/macro_5004_39_10.html#0015. Even though autopsy reports ruled that Rappe had died from causes for which Arbuckle could not be blamed, the comedian was tried (and acquitted) for manslaughter, and his career was ruined.
The Arbuckle affair and a series of other scandals only increased public fears about Hollywood’s impact. In response to this perceived threat, state and local governments increasingly tried to censor the content of films that depicted crime, violence, and sexually explicit material. Deciding that they needed to protect themselves from government censorship and to foster a more favorable public image, the major Hollywood studios organized in 1922 to form an association they called the Motion Picture Producers and Distributors of America (later renamed the Motion Picture Association of America, or MPAA). Among other things, the MPAA instituted a code of self-censorship for the motion picture industry. Today, the MPAA operates a voluntary rating system: producers can submit a film for review, and the resulting rating is designed to alert viewers to the age-appropriateness of a film while still protecting the filmmakers’ artistic freedom.Motion Picture Association of America, “History of the MPAA,” http://www.mpaa.org/about/history.
In 1925, Warner Bros. was just a small Hollywood studio looking for opportunities to expand. When representatives from Western Electric offered to sell the studio the rights to a new technology they called Vitaphone, a sound-on-disc system that had failed to capture the interest of any of the industry giants, Warner Bros. executives took a chance, predicting that the novelty of talking films might be a way to make a quick, short-term profit. Little did they anticipate that their gamble would not only establish them as a major Hollywood presence but also change the industry forever.
The pairing of sound with motion pictures was nothing new in itself. Edison, after all, had commissioned the kinetoscope to create a visual accompaniment to the phonograph, and many early theaters had orchestra pits to provide musical accompaniment to their films. Even the smaller picture houses with lower budgets almost always had an organ or piano. When Warner Bros. purchased Vitaphone technology, it planned to use it to provide prerecorded orchestral accompaniment for its films, thereby increasing their marketability to the smaller theaters that didn’t have their own orchestra pits.Phil Gochenour, “Birth of the ‘Talkies’: The Development of Synchronized Sound for Motion Pictures,” in Science and Its Times, vol. 6, 1900–1950, ed. Neil Schlager and Josh Lauer (Detroit: Gale, 2000), 577. In 1926, Warner debuted the system with the release of Don Juan, a costume drama accompanied by a recording of the New York Philharmonic Orchestra; the public responded enthusiastically.“Pre World War II Sound Era: Introduction of Sound,” Motion Pictures, http://www.uv.es/EBRIT/macro/macro_5004_39_11.html#0017. By 1927, after a $3 million campaign, Warner Bros. had wired more than 150 theaters in the United States, and it released its second sound film, The Jazz Singer, in which the actor Al Jolson improvised a few lines of synchronized dialogue and sang six songs. The film was a major breakthrough. Audiences, hearing an actor speak on screen for the first time, were enchanted.Phil Gochenour, “Birth of the ‘Talkies’: The Development of Synchronized Sound for Motion Pictures,” in Science and Its Times, vol. 6, 1900–1950, ed. Neil Schlager and Josh Lauer (Detroit: Gale, 2000), 578.
While radio, a new and popular entertainment, had been drawing audiences away from the picture houses for some time, with the birth of the “talkie,” or talking film, audiences once again returned to the cinema in large numbers, lured by the promise of seeing and hearing their idols perform.Charles Higham. The Art of the American Film: 1900–1971. (Garden City: Doubleday & Company, 1973), 85. By 1929, three-fourths of Hollywood films had some form of sound accompaniment, and by 1930, the silent film was a thing of the past.Phil Gochenour, “Birth of the ‘Talkies’: The Development of Synchronized Sound for Motion Pictures,” in Science and Its Times, vol. 6, 1900–1950, ed. Neil Schlager and Josh Lauer (Detroit: Gale, 2000), 578.
Although the techniques of tinting and hand painting had been available methods for adding color to films for some time (Georges Méliès, for instance, employed a crew to hand-paint many of his films), neither method ever caught on. The hand-painting technique became impractical with the advent of mass-produced film, and the tinting process, which filmmakers discovered interfered with the transmission of sound in films, was abandoned with the rise of the talkie. However, in 1922, Herbert Kalmus’ Technicolor company introduced a dye-transfer technique that allowed it to produce a full-length film, The Toll of the Sea, in two primary colors.“Motion Pictures in Color,” in American Decades, ed. Judith S. Baughman and others, vol. 3, Gale Virtual Reference Library. Because only two colors were used, however, the appearance of The Toll of the Sea (1922), The Ten Commandments (1923), and other early Technicolor films was not very lifelike. By 1932, Technicolor had designed a three-color system with more realistic results, and for the next 25 years, all color films were produced with this improved system. Disney’s Three Little Pigs (1933) and Snow White and the Seven Dwarfs (1937) and films with live actors, like MGM’s The Wizard of Oz (1939) and Gone With the Wind (1939), experienced early success using Technicolor’s three-color method.
Despite the success of certain color films in the 1930s, Hollywood, like the rest of the United States, was feeling the impact of the Great Depression, and the expenses of special cameras, crews, and Technicolor lab processing made color films impractical for studios trying to cut costs. Therefore, it wasn’t until the end of the 1940s that Technicolor would largely displace the black-and-white film.“Motion Pictures in Color,” in American Decades, ed. Judith S. Baughman and others, vol. 3, Gale Virtual Reference Library.
The spike in theater attendance that followed the introduction of talking films changed the economic structure of the motion picture industry, bringing about some of the largest mergers in industry history. By 1930, eight studios produced 95 percent of all American films, and they continued to experience growth even during the Depression. The five most influential of these studios—Warner Bros., Metro-Goldwyn-Mayer, RKO, 20th Century Fox, and Paramount—were vertically integrated; that is, they controlled every part of the system as it related to their films, from production to release, distribution, and even viewing. Because they owned theater chains worldwide, these studios controlled which movies exhibitors ran, and because they “owned” a stock of directors, actors, writers, and technical assistants by contract, each studio produced films of a particular character.
The late 1930s and early 1940s are sometimes known as the “Golden Age” of cinema, a time of unparalleled success for the movie industry; by 1939, film was the 11th-largest industry in the United States, and during World War II, when the U.S. economy was once again flourishing, two-thirds of Americans were attending the theater at least once a week.Britannica Online. s.v. “History of the Motion Picture.” http://www.britannica.com/EBchecked/topic/394161/history-of-the-motion picture Some of the most acclaimed movies in history were released during this period, including Citizen Kane and The Grapes of Wrath. However, postwar inflation, a temporary loss of key foreign markets, the advent of the television, and other factors combined to bring that rapid growth to an end. In 1948, the case of United States v. Paramount Pictures—mandating competition and forcing the studios to relinquish control over theater chains—dealt the final devastating blow from which the studio system would never recover. Control of the major studios reverted to Wall Street, where the studios were eventually absorbed by multinational corporations, and the powerful studio heads lost the influence they had held for nearly 30 years.Michael Baers, “Studio System,” in St. James Encyclopedia of Popular Culture, ed. Sara Pendergast and Tom Pendergast (Detroit: St. James Press, 2000), vol. 4, 565.
Figure 8.5

Rise and Decline of Movie Viewing During Hollywood’s “Golden Age”
While economic factors and antitrust legislation played key roles in the decline of the studio system, perhaps the most important factor in that decline was the advent of the television. Given the opportunity to watch “movies” from the comfort of their own homes, the millions of Americans who owned a television by the early 1950s were attending the cinema far less regularly than they had only several years earlier.“The War Years and Post World War II Trends: Decline of the Hollywood Studios,” Motion Pictures, http://www.uv.es/EBRIT/macro/macro_5004_39_24.html#0030. In an attempt to win back diminishing audiences, studios did their best to exploit the greatest advantages film held over television. For one thing, television broadcasting in the 1950s was all in black and white, whereas the film industry had the advantage of color. While producing a color film was still an expensive undertaking in the late 1940s, a couple of changes occurred in the industry in the early 1950s to make color not only more affordable but also more realistic in its appearance. In 1950, as the result of antitrust legislation, Technicolor lost its monopoly on the color film industry, allowing other providers to offer more competitive pricing on filming and processing services. At the same time, Kodak came out with a multilayer film stock that made it possible to use more affordable cameras and to produce a higher quality image. Kodak’s Eastmancolor option was an integral component in converting the industry to color. In the late 1940s, only 12 percent of features were in color; however, by 1954 (after the release of Kodak Eastmancolor) more than 50 percent of movies were in color.Britannica Online. s.v. “History of the Motion Picture.” http://www.britannica.com/EBchecked/topic/394161/history-of-the-motion picture
Another clear advantage on which filmmakers tried to capitalize was the sheer size of the cinema experience. With the release of the epic biblical film The Robe in 1953, 20th Century Fox introduced the method that would soon be adopted by nearly every studio in Hollywood: a technology that allowed filmmakers to squeeze a wide-angle image onto conventional 35-mm film stock, thereby increasing the aspect ratio (the ratio of a screen’s width to its height) of their images. This wide-screen format increased the immersive quality of the theater experience. Nonetheless, even with these advancements, movie attendance never again reached the record numbers it experienced in 1946, at the peak of the Golden Age of Hollywood.Britannica Online. s.v. “History of the Motion Picture.” http://www.britannica.com/EBchecked/topic/394161/history-of-the-motion picture; David Robinson, From Peep Show to Palace: The Birth of American Film (New York: Columbia University Press, 1994), 45, 53.
The Cold War with the Soviet Union began in 1947, and with it came the widespread fear of communism, not only from the outside, but equally from within. To counter this perceived threat, the House Un-American Activities Committee (HUAC) commenced investigations to locate communist sympathizers in America, who were suspected of conducting espionage for the Soviet Union. In the highly conservative and paranoid atmosphere of the time, Hollywood, the source of a mass-cultural medium, came under fire in response to fears that subversive, communist messages were being embedded in films. In November 1947, more than 100 people in the movie business were called to testify before the HUAC about their and their colleagues’ involvement with communist affairs. Of those investigated, 10 in particular refused to answer the committee’s questions. These 10, later known as the Hollywood Ten, were fired from their jobs and sentenced to serve up to a year in prison. The studios, already slipping in influence and profit, were eager to cooperate in order to save themselves, and a number of producers signed an agreement stating that no communists would work in Hollywood.
The hearings, which recommenced in 1951 with the rise of Senator Joseph McCarthy’s influence, turned into a kind of witch hunt as witnesses were asked to testify against their associates, and a blacklist of suspected communists evolved. Over 324 individuals lost their jobs in the film industry as a result of blacklisting (the denial of work in a certain field or industry) and HUAC investigations.Dan Georgakas, “Hollywood Blacklist,” in Encyclopedia of the American Left, ed. Mari Jo Buhle, Paul Buhle, and Dan Georgakas, 2004, http://writing.upenn.edu/~afilreis/50s/blacklist.html; Michael Mills, “Blacklist: A Different Look at the 1947 HUAC Hearings,” Modern Times, 2007, http://www.moderntimes.com/blacklist/; Kathleen Dresler, Kari Lewis, Tiffany Schoser and Cathy Nordine, “The Hollywood Ten,” Dalton Trumbo, 2005, http://www.mcpld.org/trumbo/WebPages/hollywoodten.htm.
Movies of the late 1960s began attracting a younger demographic, as a growing number of young people were drawn in by films like Sam Peckinpah’s The Wild Bunch (1969), Stanley Kubrick’s 2001: A Space Odyssey (1968), Arthur Penn’s Bonnie and Clyde (1967), and Dennis Hopper’s Easy Rider (1969)—all revolutionary in their genres—that displayed a sentiment of unrest toward conventional social orders and included some of the earliest instances of realistic and brutal violence in film. These four films in particular grossed so much money at the box offices that producers began churning out low-budget copycats to draw in a new, profitable market.“Recent Trends in US Cinema,” Motion Pictures, http://www.uv.es/EBRIT/macro/macro_5004_39_37.html#0045. While this led to a rise in youth-culture films, few of them saw great success. However, the new liberal attitudes toward depictions of sex and violence in these films represented a sea change in the movie industry that manifested in many movies of the 1970s, including Francis Ford Coppola’s The Godfather (1972), William Friedkin’s The Exorcist (1973), and Steven Spielberg’s Jaws (1975), all three of which saw great financial success.Britannica Online. s.v. “History of the Motion Picture.” http://www.britannica.com/EBchecked/topic/394161/history-of-the-motion picture; John Belton, American Cinema/American Culture. (New York: McGraw-Hill, 1994), 284–290.
In the 1970s, with the rise of work by Coppola, Spielberg, George Lucas, Martin Scorsese, and others, a new breed of director emerged. These directors were young and film-school educated, and they contributed a sense of professionalism, sophistication, and technical mastery to their work, leading to a wave of blockbuster productions, including Close Encounters of the Third Kind (1977), Star Wars (1977), Raiders of the Lost Ark (1981), and E.T.: The Extra-Terrestrial (1982). The computer-generated special effects that were available at this time also contributed to the success of a number of large-budget productions. In response to these and several earlier blockbusters, movie production and marketing techniques also began to shift, with studios investing more money in fewer films in the hopes of producing more big successes. For the first time, the hefty sums producers and distributors invested didn’t go to production costs alone; distributors were discovering the benefits of TV and radio advertising and finding that doubling their advertising costs could increase profits as much as three or four times over. With the opening of Jaws, one of the five top-grossing films of the decade (and the highest grossing film of all time until the release of Star Wars in 1977), Hollywood embraced the wide-release method of movie distribution, abandoning the release methods of earlier decades, in which a film would debut in only a handful of select theaters in major cities before it became gradually available to mass audiences. Jaws was released in 600 theaters simultaneously, and the big-budget films that followed came out in anywhere from 800 to 2,000 theaters nationwide on their opening weekends.John Belton, American Cinema/American Culture. (New York: McGraw-Hill, 1994), 305; Steve Hanson and Sandra Garcia-Myers, “Blockbusters,” in St. James Encyclopedia of Popular Culture, ed. Sara Pendergast and Tom Pendergast (Detroit: St. James Press, 2000), vol. 1, 282.
The major Hollywood studios of the late 1970s and early 1980s, now run by international corporations, tended to favor the conservative gamble of the tried and true, and as a result, the period saw an unprecedented number of high-budget sequels—as in the Star Wars, Indiana Jones, and Godfather films—as well as imitations and adaptations of earlier successful material, such as the plethora of “slasher” films that followed the success of the 1978 thriller Halloween. Additionally, corporations sought revenue sources beyond the movie theater, looking to the video and cable releases of their films. Introduced in 1975, the VCR became nearly ubiquitous in American homes by 1998, when 88.9 million households owned the appliance.Karen Rosen and Alan Meier, “Power Measurements and National Energy Consumption of Televisions and Video Cassette Recorders in the USA,” Energy, 25, no. 3 (2000), 220. Cable television’s growth was slower, but ownership of VCRs gave people a new reason to subscribe, and cable subsequently expanded as well.Everett Rogers, “Video is Here to Stay,” Center for Media Literacy, http://www.medialit.org/reading-room/video-here-stay. And the newly introduced concept of film-based merchandise (toys, games, books, etc.) allowed companies to increase profits even more.
The 1990s saw the rise of two divergent strands of cinema: the technically spectacular blockbuster with special, computer-generated effects and the independent, low-budget film. The capabilities of special effects were enhanced when studios began manipulating film digitally. Early examples of this technology can be seen in Terminator 2: Judgment Day (1991) and Jurassic Park (1993). Films with an epic scope—Independence Day (1996), Titanic (1997), and The Matrix (1999)—also employed a range of computer-animation techniques and special effects to wow audiences and to draw more viewers to the big screen. Toy Story (1995), the first fully computer-animated film, and those that came after it, such as Antz (1998), A Bug’s Life (1998), and Toy Story 2 (1999), displayed the improved capabilities of computer-generated animation. David Sedman, “Film Industry, Technology of,” in Encyclopedia of Communication and Information, ed. Jorge Reina Schement (New York: MacMillan Reference, 2000), vol. 1, 340. At the same time, independent directors and producers, such as the Coen brothers and Spike Jonze, enjoyed increased popularity, often for lower-budget films that audiences were more likely to watch on video at home.Britannica Online. s.v. “History of the Motion Picture.” http://www.britannica.com/EBchecked/topic/394161/history-of-the-motion picture A prime example of this is the Academy Awards program honoring the films of 1996, when independent films dominated the Best Picture category. Only one movie from a big film studio was nominated—Jerry Maguire—and the rest were independent films. The growth of both independent movies and special-effects-laden blockbusters continues to the present day. You will read more about current issues and trends and the future of the movie industry later on in this chapter.
Identify four films that you consider representative of the major developments in the industry, and in film as a medium, outlined in this section. Imagine you are using these films to explain movie history to a friend. Provide a detailed explanation of why each of these films represents significant changes in attitudes, technology, or trends, and situate each in the overall context of film’s development. Consider the following questions:
The relationship between movies and culture involves a complicated dynamic; while American movies certainly influence the mass culture that consumes them, they are also an integral part of that culture, a product of it, and therefore a reflection of prevailing concerns, attitudes, and beliefs. In considering the relationship between film and culture, it is important to keep in mind that, while certain ideologies may be prevalent in a given era, not only is American culture as diverse as the populations that form it, but it is also constantly changing from one period to the next. Mainstream films produced in the late 1940s and into the 1950s, for example, reflected the conservatism that dominated the sociopolitical arenas of the time. However, by the 1960s, a reactionary youth culture began to emerge in opposition to the dominant institutions, and these antiestablishment views soon found their way onto screen—a far cry from the attitudes most commonly represented only a few years earlier.
In one sense, movies could be characterized as America’s storytellers. Not only do Hollywood films reflect certain commonly held attitudes and beliefs about what it means to be American, but they also portray contemporary trends, issues, and events, serving as records of the eras in which they were produced. Consider, for example, films about the September 11, 2001, terrorist attacks: Fahrenheit 9/11, World Trade Center, United 93, and others. These films grew out of a seminal event of the time, one that preoccupied the consciousness of Americans for years after it occurred.
In 1915, director D. W. Griffith established his reputation with the highly successful film The Birth of a Nation, based on Thomas Dixon’s novel The Clansman, a prosegregation narrative about the American South during and after the Civil War. At the time, The Birth of a Nation was the longest feature film ever made, at almost 3 hours, and contained huge battle scenes that amazed and delighted audiences. Griffith’s storytelling ability helped solidify the narrative style that would go on to dominate feature films. He also experimented with editing techniques such as close-ups, jump cuts, and parallel editing that helped make the film an artistic achievement.
Griffith’s film found success largely because it captured the social and cultural tensions of the era. As American studies specialist Lary May has argued, “[Griffith’s] films dramatized every major concern of the day.”Lary May, “Apocalyptic Cinema: D. W. Griffith and the Aesthetics of Reform,” in Movies and Mass Culture, ed. John Belton (New Brunswick, NJ: Rutgers University Press, 1997), 26. In the early 20th century, fears about recent waves of immigrants had led to certain racist attitudes in mass culture, with “scientific” theories of the time purporting to link race with inborn traits like intelligence and other capabilities. Additionally, the dominant political climate, largely a reaction against populist labor movements, was one of conservative elitism, eager to attribute social inequalities to natural human differences.“Birth of a Nation,” Encyclopedia of the Social Sciences, 2nd ed., ed. William A. Darity, Jr., Gale Virtual Reference Library, 1:305–306. According to a report by the New York Evening Post after the film’s release, even some Northern audiences “clapped when the masked riders took vengeance on Negroes.”Charles Higham. The Art of the American Film: 1900–1971. (Garden City: Doubleday & Company, 1973), 13. However, the outrage many groups expressed about the film is a good reminder that American culture is not monolithic, that there are always strong contingents in opposition to dominant ideologies.
While critics praised the film for its narrative complexity and epic scope, many others were outraged and even started riots at several screenings because of its highly controversial, openly racist attitudes, which glorified the Ku Klux Klan and blamed Southern blacks for the destruction of the war.Charles Higham. The Art of the American Film: 1900–1971. (Garden City: Doubleday & Company, 1973), 10–11. Many Americans joined the National Association for the Advancement of Colored People (NAACP) in denouncing the film, and the National Board of Review eventually cut a number of the film’s racist sections.Lary May, “Apocalyptic Cinema: D. W. Griffith and the Aesthetics of Reform,” in Movies and Mass Culture, ed. John Belton (New Brunswick, NJ: Rutgers University Press, 1997), 46. However, it’s important to keep in mind the attitudes of the early 1900s. At the time the nation was divided, and Jim Crow laws and segregation were enforced. Nonetheless, The Birth of a Nation was the highest grossing movie of its era. In 1992, the film was classified by the Library of Congress among the “culturally, historically, or aesthetically significant films” in U.S. history.
Figure 8.6

The Birth of a Nation expressed racial tensions of the early 20th century.
Until the bombing of Pearl Harbor in 1941, American films after World War I generally reflected the neutral, isolationist stance that prevailed in politics and culture. However, after the United States was drawn into the war in Europe, the government enlisted Hollywood to help with the war effort, opening the federal Bureau of Motion Picture Affairs in Los Angeles. Bureau officials served in an advisory capacity on the production of war-related films, an effort with which the studios cooperated. As a result, films tended toward the patriotic and were produced to inspire feelings of pride and confidence in being American and to clearly establish that America and its allies were forces of good. For instance, the critically acclaimed Casablanca paints a picture of the ill effects of fascism, illustrates the values that heroes like Victor Laszlo hold, and depicts America as a place for refugees to find democracy and freedom.Review of Casablanca, directed by Michael Curtiz, Digital History, http://www.digitalhistory.uh.edu/historyonline/bureau_casablanca.cfm.
These early World War II films were sometimes overtly propagandist, intended to influence American attitudes rather than present a genuine reflection of American sentiments toward the war. Frank Capra’s Why We Fight films, for example, the first of which was produced in 1942, were developed for the U.S. Army and were later shown to general audiences; they delivered a war message through narrative.Clayton R. Koppes and Gregory D. Black, Hollywood Goes to War: How Politics, Profits and Propaganda Shaped World War II Movies (Los Angeles: The Free Press, 1987), 122. As the war continued, however, filmmakers opted to forego patriotic themes for a more serious reflection of American sentiments, as exemplified by films like Alfred Hitchcock’s Lifeboat.
In Mike Nichols’s 1967 film The Graduate, Dustin Hoffman, as the film’s protagonist, enters into a romantic affair with the wife of his father’s business partner. However, Mrs. Robinson and the other adults in the film fail to understand the young, alienated hero, who eventually rebels against them. The Graduate, which brought in more than $44 million at the box office, reflected the attitudes of many members of a young generation growing increasingly dissatisfied with what they perceived to be the repressive social codes established by their more conservative elders.Tim Dirks, review of The Graduate, directed by Mike Nichols, Filmsite, http://www.filmsite.org/grad.html.
This baby boomer generation came of age during the Korean and Vietnam wars. Not only did the youth culture express a cynicism toward the patriotic, prowar stance of their World War II–era elders, but they displayed a fierce resistance toward institutional authority in general, an antiestablishmentism epitomized in the 1967 hit film Bonnie and Clyde. In the film, a young, outlaw couple sets out on a cross-country bank-robbing spree until they’re killed in a violent police ambush at the film’s close.John Belton, American Cinema/American Culture. (New York: McGraw-Hill, 1994), 286.
Figure 8.7

Bonnie and Clyde reflected the attitudes of a rising youth culture.
Bonnie and Clyde’s violence provides one example of the ways films at the time were testing the limits of permissible on-screen material. The youth culture’s liberal attitudes toward formerly taboo subjects like sexuality and drugs began to emerge in film during the late 1960s. Like Bonnie and Clyde, Sam Peckinpah’s 1969 Western The Wild Bunch displays an early example of aestheticized violence in film. The wildly popular Easy Rider (1969)—containing drugs, sex, and violence—may owe a good deal of its initial success to liberalized audiences. And in the same year, Midnight Cowboy, one of the first Hollywood films to receive an X rating (in this case for its sexual content), won three Academy Awards, including Best Picture.John Belton, American Cinema/American Culture. (New York: McGraw-Hill, 1994), 288–89. As the release and successful reception of these films attest, what at the decade’s outset had been countercultural had, by the decade’s close, become mainstream.
When the MPAA (originally MPPDA) first banded together in 1922 to combat government censorship and to promote artistic freedom, the association attempted a system of self-regulation. However, by 1930—in part because of the transition to talking pictures—renewed criticism and calls for censorship from conservative groups made it clear to the MPPDA that the loose system of self-regulation was not enough protection. As a result, the MPPDA instituted the Production Code, or Hays Code (after MPPDA director William H. Hays), which remained in place until 1967. The code, which according to motion picture producers concerned itself with ensuring that movies were “directly responsible for spiritual or moral progress, for higher types of social life, and for much correct thinking,”“Complete Nudity is Never Permitted: The Motion Picture Code of 1930,” http://historymatters.gmu.edu/d/5099/. was strictly enforced starting in 1934, putting an end to most public complaints. However, many people in Hollywood resented its restrictiveness. After a series of Supreme Court cases in the 1950s regarding the code’s restrictions on freedom of speech, the Production Code grew weaker until it was finally replaced in 1967 with the MPAA rating system.“The Production Code of the Motion Picture Producers and Distributers of America, Inc.—1930–1934,” American Decades Primary Sources, ed. Cynthia Rose (Detroit: Gale, 2004), vol. 4, 12–15.
As films like Bonnie and Clyde and Who’s Afraid of Virginia Woolf? (1966) tested the limits on violence and language, it became clear that the Production Code was in need of replacement. In 1968, the MPAA adopted a rating system to identify films in terms of potentially objectionable content. By providing officially designated categories for films that would not have passed Production Code standards of the past, the MPAA opened a way for films to deal openly with mature content. The rating system originally included four categories: G (suitable for general audiences), M (equivalent to the PG rating of today), R (viewers under age 16 admitted only with an adult), and X (equivalent to today’s NC-17).
The MPAA rating system, with some modifications, is still in place today. Before release in theaters, films are submitted to the MPAA board for a screening, during which advisers decide on the most appropriate rating based on the film’s content. However, studios are not required to have the MPAA screen releases ahead of time—some studios release films without an MPAA rating at all. Commercially, less restrictive ratings are generally more beneficial, particularly in the case of adult-themed films that have the potential to earn the most restrictive rating, the NC-17. Some movie theaters will not screen a movie that is rated NC-17. When filmmakers get a more restrictive rating than they were hoping for, they may resubmit the film for review after editing out objectionable scenes.Kirby Dick, interview by Terry Gross, Fresh Air, NPR, September 13, 2006, http://www.npr.org/templates/story/story.php?storyId=6068009.
Unlike the patriotic war films of the World War II era, many of the films about U.S. involvement in Vietnam reflected strong antiwar sentiment, criticizing American political policy and portraying war’s damaging effects on those who survived it. Films like Dr. Strangelove (1964), M*A*S*H (1970), The Deer Hunter (1978), and Apocalypse Now (1979) portray the military establishment in a negative light and dissolve clear-cut distinctions, such as the “us versus them” mentality, of earlier war films. These, and the dozens of Vietnam War films that were produced in the 1970s and 1980s—Oliver Stone’s Platoon (1986) and Born on the Fourth of July (1989) and Stanley Kubrick’s Full Metal Jacket (1987), for example—reflect the sense of defeat and lack of closure Americans felt after the Vietnam War and the emotional and psychological scars it left on the nation’s psyche.Tim Dirks, “1980s Film History,” Filmsite, 2010, http://www.filmsite.org; Michael Anderegg, introduction to Inventing Vietnam: The War in Film and Television, ed. Michael Anderegg (Philadelphia: Temple University Press, 1991), 6–8. A spate of military and politically themed films emerged during the 1980s as America recovered from defeat in Vietnam, while at the same time facing anxieties about the ongoing Cold War with the Soviet Union.
Fears about the possibility of nuclear war were very real during the 1980s, and some film critics argue that these anxieties were reflected not only in overtly political films of the time but also in the popularity of horror films, like Halloween and Friday the 13th, which feature a mysterious and unkillable monster, and in the popularity of the fantastic in films like E.T.: The Extra-Terrestrial, Raiders of the Lost Ark, and Star Wars, which offer imaginative escapes.Robin Wood, Hollywood from Vietnam to Reagan (New York: Columbia University Press, 1986), 168.
Just as movies reflect the anxieties, beliefs, and values of the cultures that produce them, they also help to shape and solidify a culture’s beliefs. Sometimes the influence is trivial, as in the case of fashion trends or figures of speech. After the release of Flashdance in 1983, for instance, torn T-shirts and leg warmers became hallmarks of the fashion of the 1980s.Diana Pemberton-Sikes, “15 Movies That Inspired Fashion Trends,” The Clothing Chronicles, March 3, 2006, http://www.theclothingchronicles.com/archives/217-03032006.htm. However, sometimes the impact can be profound, leading to social or political reform, or the shaping of ideologies.
During the 1890s and up until about 1920, American culture experienced a period of rapid industrialization. As people moved from farms to centers of industrial production, urban areas began to hold larger and larger concentrations of the population. At the same time, film and other methods of mass communication (advertising and radio) developed, whose messages concerning tastes, desires, customs, speech, and behavior spread from these population centers to outlying areas across the country. The effect of early mass-communication media was to wear away regional differences and create a more homogenized, standardized culture.
Film played a key role in this development, as viewers began to imitate the speech, dress, and behavior of their common heroes on the silver screen.Steven Mintz, “The Formation of Modern American Mass Culture,” Digital History, 2007, http://www.digitalhistory.uh.edu/database/article_display.cfm?HHID=455. In 1911, the Vitagraph company began publishing The Motion Picture Magazine, America’s first fan magazine. Originally conceived as a marketing tool to keep audiences interested in Vitagraph’s pictures and major actors, The Motion Picture Magazine helped create the concept of the film star in the American imagination. Fans became obsessed with the off-screen lives of their favorite celebrities, like Pearl White, Florence Lawrence, and Mary Pickford.Jack Doyle, “A Star is Born: 1910s,” Pop History Dig, 2008, http://www.pophistorydig.com/?tag=film-stars-mass-culture.
American identity in mass society is built around certain commonly held beliefs, or myths about shared experiences, and these American myths are often disseminated through or reinforced by film. One example of a popular American myth, one that dates back to the writings of Thomas Jefferson and other founders, is an emphasis on individualism—a celebration of the common man or woman as a hero or reformer. With the rise of mass culture, the myth of the individual became increasingly appealing because it provided people with a sense of autonomy and individuality in the face of an increasingly homogenized culture. The hero myth finds embodiment in the Western, a film genre that was popular from the silent era through the 1960s, in which the lone cowboy, a seminomadic wanderer, makes his way through a lawless, and often dangerous, frontier. An example is 1952’s High Noon. From 1926 until 1967, Westerns accounted for nearly a quarter of all films produced. In other films, like Frank Capra’s 1946 movie It’s a Wonderful Life, the individual triumphs by standing up to injustice, reinforcing the belief that one person can make a difference in the world.John Belton, introduction to Movies and Mass Culture, ed. John Belton, 12. And in more recent films, hero figures such as Indiana Jones, Luke Skywalker (Star Wars), and Neo (The Matrix) have continued to emphasize individualism.
As D. W. Griffith recognized nearly a century ago, film has enormous power as a medium to influence public opinion. Ever since Griffith’s The Birth of a Nation sparked strong public reactions in 1915, filmmakers have been producing movies that address social issues, sometimes subtly and sometimes very directly. More recently, films like Hotel Rwanda (2004), about the 1994 Rwandan genocide, or The Kite Runner (2007), a story set in war-torn Afghanistan, have captured audience imaginations by telling stories that raise social awareness about world events. And a number of documentary films directed at social issues have had a strong influence on cultural attitudes and have brought about significant change.
In the 2000s, documentaries, particularly those of an activist nature, were met with greater interest than ever before. Films like Super Size Me (2004), which documents the effects of excessive fast-food consumption and criticizes the fast-food industry for promoting unhealthy eating habits for profit, and Food, Inc. (2009), which examines corporate farming practices and points to the negative impact these practices can have on human health and the environment, have brought about important changes in American food culture.Kim Severson, “Eat, Drink, Think, Change,” New York Times, June 3, 2009, http://www.nytimes.com/2009/06/07/movies/07seve.html. Just 6 weeks after the release of Super Size Me, McDonald’s took the supersize option off its menu and since 2004 has introduced a number of healthy food options in its restaurants.Suemedha Sood, “Weighing the Impact of ‘Super Size Me,’” Wiretap, June 29, 2004. http://www.wiretapmag.org/stories/19059/. Other fast-food chains have made similar changes.Suemedha Sood, “Weighing the Impact of ‘Super Size Me,’” Wiretap, June 29, 2004. http://www.wiretapmag.org/stories/19059/.
Other documentaries intended to influence cultural attitudes and inspire change include those made by director Michael Moore. Moore’s films present a liberal stance on social and political issues such as health care, globalization, and gun control. His 2002 film Bowling for Columbine, for example, addressed the Columbine High School shootings of 1999, presenting a critical examination of American gun culture. While some critics have accused Moore of producing propagandistic material under the label of documentary because of his films’ strong biases, his films have been popular with audiences, with four of his documentaries ranking among the highest grossing documentaries of all time. Fahrenheit 9/11 (2004), which criticized the second Bush administration and its involvement in the Iraq War, earned $119 million at the box office, making it the most successful documentary of all time.Tim Dirks, “Film History of the 2000s,” Filmsite; Washington Post, “The 10 Highest-Grossing Documentaries,” July 31, 2006, http://www.washingtonpost.com/wp-dyn/content/graphic/2006/07/31/GR2006073100027.html.
Filmmaking is both a commercial and artistic venture. The current economic situation in the film industry, with increased production and marketing costs and lower audience turnouts in theaters, often sets the standard for the films big studios are willing to invest in. If you wonder why theaters have released so many remakes and sequels in recent years, this section may help you to understand the motivating factors behind those decisions.
In the movie industry today, publicity and product are two sides of the same coin. Even films that get a lousy critical reception can do extremely well in ticket sales if their marketing campaigns manage to create enough hype. Similarly, two comparable films can produce very different results at the box office if they have been given different levels of publicity. This explains why the film What Women Want, starring Mel Gibson, brought in $33.6 million in its opening weekend in 2000, while a few months later, The Million Dollar Hotel, also starring Gibson, brought in only $29,483 during its opening weekend.Nash Information Services, “What Women Want,” The Numbers: Box Office Data, Movie Stars, Idle Speculation, http://www.the-numbers.com/2000/WWWNT.php; Nash Information Services, “The Million Dollar Hotel,” The Numbers: Box Office Data, Movie Stars, Idle Speculation, http://www.the-numbers.com/2001/BHOTL.php. Unlike in the days of the Hollywood studio system, actors alone no longer draw audiences to a movie. The owners of the nation’s major movie theater chains are keenly aware that a film’s success at the box office has everything to do with studio-generated marketing and publicity. What Women Want was produced by Paramount, one of the film industry’s six leading studios, and widely released (on 3,000 screens) after an extensive marketing effort, while The Million Dollar Hotel was produced by Lionsgate, an independent studio without the necessary marketing budget to fill enough seats for a wide release on opening weekend.Edward Jay Epstein, “Neither the Power nor the Glory: Why Hollywood Leaves Originality to the Indies,” Slate, October 17, 2005, http://www.slate.com/id/2128200.
The Hollywood “dream factory,” as Hortense Powdermaker labeled it in her 1950 book on the movie industry,Hortense Powdermaker, “Hollywood, the Dream Factory,” http://astro.temple.edu/~ruby/wava/powder/intro.html. manufactures an experience that is part art and part commercial product.Caryn James, “Critic’s Notebook: Romanticizing Hollywood’s Dream Factory,” New York Times, November 7, 1989, http://www.nytimes.com/1989/11/07/movies/critic-s-notebook-romanticizing-hollywood-s-dream-factory.html. While the studios of today are less factory-like than they were in the vertically integrated studio system era, the coordinated efforts of a film’s production team can still be likened to a machine calibrated for mass production. The films the studios churn out are the result of a capitalist enterprise that ultimately looks to the “bottom line” to guide most major decisions. Hollywood is an industry, and as in any other industry in a mass market, its success relies on control of production resources and “raw materials” and on its access to mass distribution and marketing strategies to maximize the product’s reach and minimize competition.John Belton, American Cinema/American Culture. (New York: McGraw-Hill, 1994), 61–62. In this way, Hollywood has an enormous influence on the films to which the public has access.
Ever since the rise of the studio system in the 1930s, the majority of films have originated with the leading Hollywood studios. Today, the six big studios control 95 percent of the film business.Kirby Dick, interview by Terry Gross, Fresh Air, NPR, September 13, 2006, http://www.npr.org/templates/story/story.php?storyId=6068009. In the early years, audiences were familiar with the major studios, their collections of actors and directors, and the types of films that each studio was likely to release. All of that changed with the decline of the studio system; screenwriters, directors, scripts, and cinematographers no longer worked exclusively with one studio, so these days, while moviegoers are likely to know the name of a film’s director and major actors, it’s unusual for them to identify a film with the studio that distributes it. However, studios are no less influential. The previews of coming attractions that play before a movie begins are controlled by the studios.Hillary Busis, “How Do Movie Theaters Decide Which Trailers to Show?” Slate, April 15, 2010, http://www.slate.com/id/2246166/. Online marketing, TV commercials, and advertising partnerships with other industries—the name of an upcoming film, for instance, appearing on some Coke cans—are available tools for the big-budget studios that have the resources to commit millions to prerelease advertising. Even though studios no longer own the country’s movie theater chains, the films produced by the big six studios are the ones the multiplexes invariably show. Unlike independent films, big-studio movies are a safe bet to sell tickets.
While it may seem like the major studios are making heavy profits, moviemaking today is a much riskier, less profitable enterprise than it was in the studio system era. The massive budgets required for the global marketing of a film are huge financial gambles. In fact, most movies cost the studios much more to market and produce—upward of $100 million—than their box-office returns ever generate. With such high stakes, studios have come to rely on the handful of blockbuster films that keep them afloat,New World Encyclopedia, s.v. “Film Industry,” www.newworldencyclopedia.org/entry/Hollywood. movies like Titanic, Pirates of the Caribbean, and Avatar.New World Encyclopedia, s.v. “Film Industry,” www.newworldencyclopedia.org/entry/Hollywood. The blockbuster film becomes a touchstone, not only for production values and story lines, but also for moviegoers’ expectations. Because studios know they can rely on certain predictable elements to draw audiences, they tend to invest the majority of their budgets in movies that fit the blockbuster mold. Remakes, movies with sequel setups, or films based on best-selling novels or comic books are safer bets than original screenplays or movies with experimental or edgy themes.
James Cameron’s Titanic (1997), the second highest grossing movie of all time, saw such success largely because it was based on a well-known story, contained predictable plot elements, and was designed to appeal to the widest possible range of audience demographics with romance, action, expensive special effects, and an epic scope—meeting the blockbuster standard on several levels. The film’s astronomical $200 million production cost was a gamble indeed, requiring the backing of two studios, Paramount and 20th Century Fox.Steve Hanson and Sandra Garcia-Myers, “Blockbusters,” in St. James Encyclopedia of Popular Culture, ed. Sara Pendergast and Tom Pendergast (Detroit: St. James Press, 2000), vol. 1. However, the rash of high-budget, and high-grossing, films that have appeared since—Harry Potter and the Sorcerer’s Stone and its sequels (2001–2011), Avatar (2009), Alice in Wonderland (2010), The Lord of the Rings films (2001–2003), The Dark Knight (2008), and others—are an indication that, for the time being, the blockbuster standard will drive Hollywood production.
While the blockbuster still drives the industry, the formulaic nature of most Hollywood films of the 1980s, 1990s, and into the 2000s has opened a door for independent films to make their mark on the industry. Audiences have welcomed movies like Fight Club (1999), Lost in Translation (2003), and Juno (2007) as a change from standard Hollywood blockbusters. Few independent films reached the mainstream audience during the 1980s, but a number of developments in that decade paved the way for their increased popularity in the coming years. The Sundance Film Festival (originally the U.S. Film Festival) began in Park City, Utah, in 1980 as a way for independent filmmakers to showcase their work. Since then, the festival has grown to garner more public attention, and now often represents an opportunity for independents to find market backing from larger studios. In 1989, Steven Soderbergh’s sex, lies, and videotape, released by Miramax, was the first independent to break out of the art-house circuit and find its way into the multiplexes.
In the 1990s and 2000s, independent directors like the Coen brothers, Wes Anderson, Sofia Coppola, and Quentin Tarantino made significant contributions to contemporary cinema. Tarantino’s 1994 film, Pulp Fiction, garnered attention for its experimental narrative structure, witty dialogue, and nonchalant approach to violence. It was the first independent film to break $100 million at the box office, proving that there is still room in the market for movies produced outside of the big six studios.Ronald Bergan, Film (New York: Dorling Kindersley, 2006), 84.
English-born Michael Apted, former president of the Directors Guild of America, once said, “Europeans gave me the inspiration to make movies … but it was the Americans who showed me how to do it.”Michael Apted, “Film’s New Anxiety of Influence,” Newsweek, December 28, 2007. http://www.newsweek.com/2007/12/27/film-s-new-anxiety-of-influence.html. Major Hollywood studio films have dominated the movie industry worldwide since Hollywood’s golden age, yet American films have always been in a relationship of mutual influence with films from foreign markets. From the 1940s through the 1960s, for example, American filmmakers admired and were influenced by the work of overseas auteurs—directors like Ingmar Bergman (Sweden), Federico Fellini (Italy), François Truffaut (France), and Akira Kurosawa (Japan), whose personal, creative visions were reflected in their work.Richard Pells, “Is American Culture ‘American’?” eJournal USA, February 1, 2006, http://www.america.gov/st/econ-english/2008/June/20080608102136xjyrreP0.3622858.html. The concept of the auteur was particularly important in France in the late 1950s and early 1960s, when French filmmaking underwent a rebirth in the form of the New Wave movement, characterized by an independent production style that showcased the personal authorship of its young directors.Ronald Bergan, Film (New York: Dorling Kindersley, 2006), 60. The influence of the New Wave was, and continues to be, felt in the United States. The generation of young, film school-educated directors that became prominent in American cinema in the late 1960s and early 1970s owes a good deal of its stylistic techniques to the work of French New Wave directors.
Figure 8.8

The French New Wave movement of the 1950s and 1960s showed that films could be both artistically and commercially successful. Jean-Luc Godard’s Breathless is well known for its improvisatory techniques and use of jump cuts.
In the current era of globalization, the influence of foreign films remains strong. The rapid growth of the entertainment industry in Asia, for instance, has led to an exchange of style and influence with U.S. cinema. Remakes of a number of popular Japanese horror films, including The Ring (2002), Dark Water (2005), and The Grudge (2004), have fared well in the United States, as have Chinese martial arts films like Crouching Tiger, Hidden Dragon (2000), Hero (2002), and House of Flying Daggers (2004). At the same time, U.S. studios have recently tried to expand into the growing Asian market by purchasing the rights to films from South Korea, Japan, and Hong Kong for remakes with Hollywood actors.Diana Lee, “Hollywood’s Interest in Asian Films Leads to Globalization,” UniOrb.com, December 1, 2005, http://uniorb.com/ATREND/movie.htm.
With the growth of Internet technology worldwide and the expansion of markets in rapidly developing countries, American films are increasingly finding their way into movie theaters and home DVD players around the world. In the eyes of many people, the problem is not the export of a U.S. product to outside markets, but the export of American culture that comes with that product. Just as films of the 1920s helped to shape a standardized, mass culture as moviegoers learned to imitate the dress and behavior of their favorite celebrities, contemporary film is now helping to form a mass culture on the global scale, as the youth of foreign nations acquire the American speech, tastes, and attitudes reflected in film.Jessica C. E. Gienow-Hecht, “A European Considers the Influence of American Culture,” eJournal USA, February 1, 2006, http://www.america.gov/st/econenglish/2008/June/20080608094132xjyrreP0.2717859.html.
Staunch critics, feeling helpless to stop the erosion of their national cultures, accuse the United States of cultural imperialism—the deliberate conquest of one culture by another to spread capitalism—through flashy Hollywood movies and commercialism. At the same time, others argue that the worldwide impact of Hollywood films is an inevitable part of globalization, a process that erodes national borders, opening the way for a free flow of ideas between cultures.Jessica C. E. Gienow-Hecht, “A European Considers the Influence of American Culture,” eJournal USA, February 1, 2006, http://www.america.gov/st/econenglish/2008/June/20080608094132xjyrreP0.2717859.html.
With control of over 95 percent of U.S. film production, the big six Hollywood studios—Warner Bros., Paramount, 20th Century Fox, Universal, Columbia, and Disney—are at the forefront of the American film industry, setting the standards for distribution, release, marketing, and production values. However, the high costs of moviemaking today are such that even successful studios must find moneymaking potential in crossover media—computer games, network TV rights, spin-off TV series, DVD and Blu-ray Disc releases, toys and other merchandise, books, and other aftermarket products—to help recoup their costs. The drive for aftermarket marketability in turn dictates the kinds of films studios are willing to invest in.Steve Hanson and Sandra Garcia-Myers, “Blockbusters,” in St. James Encyclopedia of Popular Culture, ed. Sara Pendergast and Tom Pendergast (Detroit: St. James Press, 2000), vol. 1, 283.
In the days of the vertically integrated studio system, filmmaking was a streamlined process, neither as risky nor as expensive as it is today. When producers, directors, screenwriters, art directors, actors, cinematographers, and other technical staff were all under contract with one studio, turnaround time for the casting and production of a film was often as little as 3 to 4 months. Beginning in the 1970s, after the decline of the studio system, the production costs for films increased dramatically, forcing the studios to invest more of their budgets in marketing efforts that could generate presales—that is, sales of distribution rights for a film in different sectors before the movie’s release.Steve Hanson and Sandra Garcia-Myers, “Blockbusters,” in St. James Encyclopedia of Popular Culture, ed. Sara Pendergast and Tom Pendergast (Detroit: St. James Press, 2000), vol. 1, 282. This is still true of filmmaking today. With contracts that must be negotiated with actors, directors, and screenwriters, and with extended production times, costs are dramatically higher than they were in the 1930s—when a film could be made for around $300,000.Eric Schaefer, “Bold! Daring! Shocking! True!”: A History of Exploitation Films, 1919–1959 (Durham, NC: Duke University Press, 1999), 50. By contrast, today’s average production budget, not including marketing expenses, is close to $65 million.Nash Information Services, “Glossary of Movie Business Terms,” The Numbers, http://www.the-numbers.com/glossary.php
Consider James Cameron’s Avatar, released in 2009, which cost close to $340 million, making it one of the most expensive films of all time. Where does such an astronomical budget go? When weighing the total costs of producing and releasing a film, about half of the money goes to advertising. In the case of Avatar, the film cost $190 million to make and around $150 million to market.Mojgan Sherkat-Massoom, “10 Most Expensive Movies Ever Made,” Access Hollywood, 2010, http://today.msnbc.msn.com/id/34368822/ns/entertainment-access_hollywood/?pg=2#ENT_AH_MostExpensiveMovies; Rebecca Keegan, “How Much Did Avatar Really Cost?” Vanity Fair, December 2009, http://www.vanityfair.com/online/oscars/2009/12/how-much-did-avatar-really-cost.html. Of that $190 million production budget, part goes toward above-the-line costs, those that are negotiated before filming begins, and part toward below-the-line costs, those that are generally fixed. Above-the-line costs include screenplay rights; salaries for the writer, producer, director, and leading actors; and salaries for directors’, actors’, and producers’ assistants. Below-the-line costs include the salaries for nonstarring cast members and technical crew, use of technical equipment, travel, locations, studio rental, and catering.Aldo-Vincenzo Tirelli, “Production Budget Breakdown: The Scoop on Film Financing,” Helium, http://www.helium.com/items/936661-production-budget-breakdown-the-scoop-on-film-financing. For Avatar, the reported $190 million doesn’t include money for research and development of the 3-D filming and computer-modeling technologies required to put the film together.
If these costs are factored in, the total movie budget may be closer to $500 million.Rebecca Keegan, “How Much Did Avatar Really Cost?” Vanity Fair, December 2009, http://www.vanityfair.com/online/oscars/2009/12/how-much-did-avatar-really-cost.html. Fortunately for 20th Century Fox, Avatar made a profit over these expenses in box-office sales alone, raking in $750 million domestically (to make it the highest-grossing movie of all time) in the first 6 months after its release.Box Office Mojo, “Avatar,” May 31, 2010, http://boxofficemojo.com/movies/?id=avatar.htm. However, keep in mind that Avatar was released in both 2-D and 3-D. Because 3-D tickets cost more than 2-D tickets, the box-office returns are somewhat inflated.
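The budget arithmetic above can be tallied in a few lines. This is only an illustrative sketch using the approximate figures reported in this section; the $500 million all-in estimate is Vanity Fair’s, not an official studio number.

```python
# Approximate Avatar cost figures as reported above, in millions of USD.
production_budget = 190   # above-the-line plus below-the-line costs
marketing_budget = 150    # advertising and promotion

# Reported all-in cost of producing and releasing the film.
reported_total = production_budget + marketing_budget
print(reported_total)  # 340 -> the "close to $340 million" figure

# Marketing's share of the total: roughly half, as the text notes.
print(round(marketing_budget / reported_total, 2))  # 0.44

# Vanity Fair's ~$500 million estimate implies a further outlay on
# 3-D and computer-modeling R&D of roughly $160 million.
print(500 - reported_total)  # 160
```

The gap between the reported total and the $500 million estimate is what makes "how much did Avatar really cost?" a genuinely contested question.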
However, for every expensive film that has made out well at the box office, there are a handful of others that have tanked. Back in 1980, when United Artists (UA) was a major Hollywood studio, its epic western Heaven’s Gate cost nearly six times its original budget: $44 million instead of the proposed $7.6 million. The movie, which bombed at the box office, was the largest failure in film history at the time, losing at least $40 million, and forcing the studio to be bought out by MGM.Sheldon Hall and Stephen Neale, “Super Blockbusters: 1976–1985,” in Epics, Spectacles, and Blockbusters: A Hollywood History (Detroit: Wayne State Univ. Press, 2010), 231. Since then, Heaven’s Gate has become synonymous with commercial failure in the film industry.Tim Dirks, “1980s Film History,” Filmsite, 2010, http://www.filmsite.org
More recently, the 2005 movie Sahara lost $78 million, making it one of the biggest financial flops in film history. The film’s initial production budget of $80 million eventually doubled to $160 million, due to complications with filming in Morocco and to numerous problems with the script.Glenn F. Bunting, “$78 Million of Red Ink?” Los Angeles Times, April 15, 2007, http://articles.latimes.com/2007/apr/15/business/fi-movie15/4.
Movie piracy used to be perpetrated in two ways: Either someone snuck into a theater with a video camera, turning out blurred, wobbly, off-color copies of the original film, or somebody close to the film leaked a private copy intended for reviewers. In the digital age, however, crystal-clear bootleg copies of films increasingly appear illegally on DVD and the Internet, posing a much greater threat to a film’s profitability. Even safeguard techniques like digital watermarks are frequently sidestepped by tech-savvy pirates.Lisa Respers France, “In Digital Age, Can Movie Piracy Be Stopped?” CNN, May 2, 2009, http://www.cnn.com/2009/TECH/05/01/wolverine.movie.piracy/.
In 2009, an unfinished copy of 20th Century Fox’s X-Men Origins: Wolverine appeared online 1 month before the movie’s release date in theaters. Within a week, more than 1 million people had downloaded the pirated film. Similar situations have occurred in recent years with other major movies, including The Hulk (2003) and Star Wars Episode III: Revenge of the Sith (2005).Lisa Respers France, “In Digital Age, Can Movie Piracy Be Stopped?” CNN, May 2, 2009, http://www.cnn.com/2009/TECH/05/01/wolverine.movie.piracy/. According to a 2006 study sponsored by the MPAA, Internet piracy and other methods of illegal copying cost major Hollywood studios $6.1 billion in the previous year.Jesse Hiestand, “MPAA Study: ’05 Piracy Cost $6.1 Bil.,” Hollywood Reporter, May 3, 2006. http://business.highbeam.com/2012/article-1G1-146544812/mpaa-study-05-piracy-cost-61-bil. The findings of this report have since been called into question, with investigators claiming that there was no clear methodology for how researchers estimated those figures.Greg Sandoval, “Feds Hampered by Incomplete MPAA Piracy Data,” CNET News, April 19, 2010, http://news.cnet.com/8301-31001_3-20002837-261.html. Nonetheless, the ease of theft made possible by the digitization of film and improved file-sharing technologies like BitTorrent, a peer-to-peer protocol for transferring large files between users, have put increased financial strain on the movie industry.
Earlier in this chapter, you learned that blockbuster films rely on certain predictable elements to attract audiences. Think about recent blockbusters like Alice in Wonderland, Avatar, and Pirates of the Caribbean and consider the following:
New technologies have a profound impact, not only on the way films are made, but also on the economic structure of the film industry. When VCR technology made on-demand home movie viewing possible for the first time, filmmakers had to adapt to a changing market. The recent switch to digital technology also represents a turning point for film. In this section, you will learn how these and other technologies have changed the face of cinema.
The first technology for home video recording, Sony’s Betamax cassettes, hit the market in 1975. The device, a combined television set and videocassette recorder (VCR), came with the high price tag of $2,495, making it a luxury still too expensive for the average American home. Two years later, RCA released the vertical helical scan (VHS) system of recording, which would eventually outsell Betamax, though neither device was yet a popular consumer product. Within several years, however, the concept of home movie recording and viewing was beginning to catch on. In 1979, Columbia Pictures released 20 films for home viewing, and a year later Disney entered the market with the first authorized video rental plan for retail stores. By 1983, VCRs were still relatively uncommon, found in just 10 percent of American homes, but within 2 years the device had found a place in nearly one-third of U.S. households.Entertainment Merchant Association, “A History of Home Video and Video Game Retailing,” http://www.entmerch.org/industry_history.html.
At the same time, video rental stores began to spring up across the country. In 1985, three major video rental chains—Blockbuster, Hastings, and Movie Gallery—opened their doors. The video rental market took off between 1983 and 1986, reaching $3.37 billion in 1986. Video sales that year came to $1 billion, for total revenue of more than $4 billion, marking the first time in history that video would eclipse box-office revenues ($3.78 billion that year).Entertainment Merchant Association, “A History of Home Video and Video Game Retailing,” http://www.entmerch.org/industry_history.html.
Video sales and rentals opened a new mass market in the entertainment industry—the home movie viewer—and offered Hollywood an extended source of income from its films. On the other hand, the VCR also introduced the problem of piracy.
In an age when Hollywood was already struggling financially because of increased production costs, Sony’s release of home video recording technology became a major source of anxiety for Hollywood studios. If people could watch movies in their own homes, would they stop going to the movies altogether? In 1976, Universal Studios and the Walt Disney Company sued Sony in the U.S. District Court for the Central District of California, in the case that became Sony Corp. of America v. Universal City Studios. The suit argued that because Sony was manufacturing a technology that could potentially be used to break copyright law, the company was liable for any copyright infringement committed by VCR purchasers. The District Court ruled in Sony’s favor, but the Court of Appeals for the Ninth Circuit reversed that decision. Sony then appealed to the Supreme Court, where the case was again highly debated. Part of the struggle was the recognition that the case had wider implications: Does a device with recording capabilities conflict with copyright law? Is an individual guilty of copyright infringement if she records a single movie in her own home for her own private use?
Eventually the Supreme Court ruled that Sony and other VCR manufacturers could not be held liable for copyright infringement. This case represented an important milestone for two reasons. It opened up a new market in the entertainment sector, enabling video rental and home movie sales. Additionally, the case set a standard for determining whether a device with copying or recording capability violated copyright law. The court ruled that because nonprofit, noncommercial home recording did not constitute copyright violation, VCR technology did have legitimate legal uses, and Sony and other companies could not be held liable for any misuse of their devices. Recently, this case has posed interpretive challenges in legal battles and in debates over file sharing through the Internet.Willie Spruill and Derek Adler, “Sony Corp. of America v. Universal City Studios,” Downloading & Piracy project for Laura N. Gasaway’s Cyberspace Law Seminar, University of North Carolina School of Law, 2009, http://www.unc.edu/courses/2009spring/law/357c/001/Piracy/cases.htm.
In 1980, around the time when consumers were just beginning to purchase VCRs for home use, Pioneer Electronics introduced another technology, the LaserDisc, an optical storage disc that produced higher quality images than did VHS tapes. Nonetheless, because of its large size (12 inches in diameter) and lack of recording capabilities, this early disc system never became popular in the U.S. market. However, the LaserDisc’s successor, the digital versatile disc (DVD) was a different story. Like LaserDisc, the DVD is an optical storage disc—that is, a device whose encoded information follows a spiral pattern on the disc’s surface and can be read when illuminated by a laser diode. However, unlike the analog-formatted LaserDisc, the DVD’s information storage is entirely digital, allowing for a smaller, lighter, more compressed medium.
The first DVDs were released in stores in 1997, impressing consumers and distributors with their numerous advantages over the VHS tape: sharper-resolution images, compactness, higher durability, interactive special features, and better copy protection. In only a few years, sales of DVD players and discs surpassed those of VCRs and videos, making the DVD the most rapidly adopted consumer electronics product of all time.Entertainment Merchant Association, “History of Home Video”; Tim Dirks, “1990s Film History,” Filmsite, 2010, http://www.filmsite.org
In 1999, the movie rental market was revolutionized by Netflix. Netflix began in 1997 as a video rental store in California. In 1999, the company began offering a subscription service online. Subscribers would select movies that they wanted to see on Netflix’s website, and the movies would arrive in their mailbox a few days later, along with a prepaid return envelope. This allowed users to select from thousands of movies and television shows in the privacy of their own home.
More recently, DVD technology has been surpassed by the Blu-ray Disc format, intended for storing and producing high-definition video. Released in 2006, Blu-ray Discs have the same physical dimensions as DVDs, but because they are encoded to be read by a laser with a shorter wavelength, they have more than five times the storage capacity of a DVD.Blu-Ray.com, “Blu-Ray Disc,” http://www.blu-ray.com/info/. By 2009 there were 10.9 million Blu-ray Disc players in U.S. homes.Henning Molbaek, “10.7 Million Blu-Ray Players in U.S. Homes,” DVDTown.com, Jan 9, 2009, http://www.dvdtown.com/news/107-million-blu-ray-players-in-us-homes/6288. However, the technology has yet to replace the DVD in rental stores and among the majority of U.S. consumers.
DVD rentals and sales make up a major source of revenue for the movie industry, accounting for nearly half of the returns on feature films. In fact, for some time the industry has been exploiting the profitability of releasing some films directly to DVD without ever premiering them in theaters, or of releasing films on DVD simultaneously with their theatrical releases. According to one estimate, for every movie that appears in theaters, three go straight to DVD.Robert W. Cort, "Straight to DVD," New York Times, May 6, 2006, Opinion section, http://www.nytimes.com/2006/05/06/opinion/06cort.html. While direct-to-DVD has become synonymous with poor production values and ill-conceived sequels, there are a number of reasons why a studio might bypass the multiplexes. Prequels and sequels of box-office hits, shot on lower production budgets, are often released this way and can generate considerable income from the niche market of hard-core fans. The fourth American Pie film, Bring It On: In It to Win It, and Ace Ventura Jr.: Pet Detective are all examples of successful direct-to-DVD films. In other cases, the costs of theatrical promotion and release may simply be too high for a studio to back, especially for independently produced films that lack big-studio marketing budgets. Slumdog Millionaire (2008) was almost one of these cases; the film did make it to theaters, however, going on to win eight Academy Awards in 2009, including Best Picture.Tom Charity, "Review: Why Some Films Go Straight to DVD," CNN, February 27, 2009, http://www.cnn.com/2009/SHOWBIZ/Movies/02/27/review.humboldt/index.html. Finally, a film may go straight to DVD when its content is too controversial for theatrical release; almost all pornographic films, for example, are direct-to-DVD releases.
Between 2005 and 2008, the number of direct-to-DVD releases grew 36 percent as studios began to see the profitability of the strategy.Brooks Barnes, “Direct-to-DVD Releases Shed Their Loser Label,” New York Times, January 28, 2008, http://www.nytimes.com/2008/01/28/business/media/28dvd.html. After a movie’s success at the box office, a prequel, sequel, or related movie might earn the same profit pound-for-pound at the rental store if filmmakers slash the production budget, often replacing the original celebrity actors with less expensive talent. In 2008, direct-to-DVD brought in around $1 billion in sales.Brooks Barnes, “Direct-to-DVD Releases Shed Their Loser Label,” New York Times, January 28, 2008, http://www.nytimes.com/2008/01/28/business/media/28dvd.html.
Despite the profitability of the DVD market, the economic downturn that began in 2007, along with the concurrent release of Blu-ray Disc technology and the rise of online digital downloads, brought about a decline in DVD sales among U.S. consumers.Dianne Garrett, "DVD Sales Down 3.6% in '07," Variety, January 7, 2008, www.variety.com/article/VR1117978576?refCatId=20. With the rise in digital downloads, Netflix broadened its appeal in 2007 by offering subscribers streaming movies and TV shows, allowing viewers to watch programs on their computers, handheld devices, and game systems such as the Nintendo Wii, Sony PlayStation 3, and Microsoft Xbox 360 without ever handling a disc.
Additionally, by late 2007 film studios had become anxious over another trend: the Redbox rental system. Redbox, an American company that places DVD rental vending machines in pharmacies, grocery stores, and fast-food chains around the country, had kiosks in approximately 22,000 locations by 2009.Brooks Barnes, "Movie Studios See a Threat in Growth of Redbox," New York Times, September 6, 2009, http://www.nytimes.com/2009/09/07/business/media/07redbox.html. For the movie industry, the trouble isn't the widespread availability of Redbox rentals; it's the price. As of March 2010, customers could rent DVDs from a Redbox kiosk for only $1 per day, which has already led to a severe decline in rental revenue for the film industry.Carl DiOrio, "$1 DVD Rentals Costing Biz $1 Bil: Study," Hollywood Reporter, December 7, 2009, http://www.hollywoodreporter.com/news/1-dvd-rentals-costing-biz-92098. According to the traditional pricing model, rental prices are based on a release window: newly released films cost more to rent for a specified period after their release. When customers can rent both older and newly released movies at the same low price, rentals don't produce the same returns.Jesse Hiestand, "MPAA Study: '05 Piracy Cost $6.1 Bil.," Hollywood Reporter, May 3, 2006, http://business.highbeam.com/2012/article-1G1-146544812/mpaa-study-05-piracy-cost-61-bil.
Hollywood has also suffered major losses from online piracy. Since 2007, studios have been teaming up to turn this potential threat into a source of income. Now, instead of illegally downloading their favorite movies from file-sharing sites, fans can go to legal, ad-supported sites like Hulu.com, where they can watch a selection of popular movies and TV shows at the same price as broadcast programming on NBC, ABC, and CBS: free. In June 2010, Hulu launched a paid subscription service, Hulu Plus, alongside its free service for users who want access to even more programs, such as Glee.Reuters, "Hulu Launches Paid Subscription TV Service," Fox News, June 30, 2010, http://www.foxnews.com/scitech/2010/06/30/hulu-starts-paid-subscription-tv-service/. Hulu doesn't allow viewers to download films to their home computers, but it does provide a home-viewing experience through online streaming of content.Hulu, "Media Info," 2010, http://www.hulu.com/about.
In an industry where technological innovations can transform production or distribution methods over the course of a few years, it’s incredible to think that most movies are still captured on celluloid film, the same material that Thomas Edison used to capture his kinetoscope images well over a century ago. In 2002, George Lucas’s Star Wars Episode II: Attack of the Clones became the first major Hollywood movie filmed on high-definition digital video. However, the move to digitally filmed movies has been gradual; much of the movie industry—including directors, producers, studios, and major movie theater chains—has been slow to embrace this major change in filming technology. At the time that Lucas filmed Attack of the Clones, only 18 theaters in the country were equipped with digital projectors.Scott Kirsner, “Studios Shift to Digital Movies, but Not Without Resistance,” New York Times, July 4, 2006, http://www.nytimes.com/2005/05/22/technology/22iht-movies23.html?scp=15&sq=digital%20movie&st=cse.
However, digital cinematography has become an increasingly attractive, and increasingly popular, option for a number of reasons. For one thing, during production it eliminates the need to reload film. A scene that would traditionally require multiple takes because of film reloads can now be shot in one continuous take, since no raw film stock is consumed in the process.Scott Kirsner, "Studios Shift to Digital Movies, but Not Without Resistance," New York Times, July 4, 2006, http://www.nytimes.com/2005/05/22/technology/22iht-movies23.html?scp=15&sq=digital%20movie&st=cse. The digital format streamlines the editing process as well: rather than scanning images into a computer before adding digital special effects and color adjustments, companies can send digitally filmed material electronically to the editing suite. Additionally, digital film files aren't susceptible to scratching or wear over time, and they are capable of producing crystal-clear, high-resolution images.Eric A. Taub, "More Digital Projectors, Coming to a Theater Near You," Gadgetwise (blog), New York Times, June 18, 2009, http://gadgetwise.blogs.nytimes.com/2009/06/18/its-a-4k-world-after-all/.
Figure 8.9

Attack of the Clones: First Film to Be Made with Digital Cinematography.
For distributors and production companies, digitally recorded images eliminate the costs of purchasing, developing, and printing film. Studios spend around $800 million each year making prints of the films they distribute to theaters, plus additional money to ship the heavy reels.Ty Burr, "Will the 'Star Wars' Digital Gamble Pay Off?" Entertainment Weekly, April 19, 2002, http://archives.cnn.com/2002/SHOWBIZ/Movies/04/19/ew.hot.star.wars/. For a film like Attack of the Clones, widely released in 3,000 theaters, printing and shipping costs for 35-mm film would be around $20 million.Ty Burr, "Will the 'Star Wars' Digital Gamble Pay Off?" Entertainment Weekly, April 19, 2002, http://archives.cnn.com/2002/SHOWBIZ/Movies/04/19/ew.hot.star.wars/. The digital format, by contrast, requires no printing and can be sent to theaters on a single hard drive or, as the system develops, over cable or satellite, virtually eliminating these costs.Doreen Carvajal, "Nurturing Digital Cinema," New York Times, May 23, 2005, http://www.nytimes.com/2005/05/22/technology/22iht-movies23.html; Ty Burr, "Will the 'Star Wars' Digital Gamble Pay Off?" Entertainment Weekly, April 19, 2002, http://archives.cnn.com/2002/SHOWBIZ/Movies/04/19/ew.hot.star.wars/.
In part, the change has been gradual because, for theaters, the cost of making the digital switch (around $125,000 for a high-quality digital projectorReuters, "Movie Theaters Going Digital," CNN, December 24, 2003, http://www.cnn.com/2003/TECH/ptech/12/24/digital.movietheater.reut/index.html.) is high, and the transformation offers them fewer short-term incentives than it does distributors, who stand to save a significant amount of money with digital technology. Furthermore, theaters have already invested heavily in their current projection equipment for 35-mm film.Doreen Carvajal, "Nurturing Digital Cinema," New York Times, May 23, 2005, http://www.nytimes.com/2005/05/22/technology/22iht-movies23.html. In the long run, the high-definition picture capabilities of digital movies might boost profits as more moviegoers turn out at the theaters, but there are no guarantees. In the meantime, the major studios are negotiating with leading theater chains to underwrite some of the conversion expenses.Erin McCarthy, "The Tech Behind 3D's Big Revival," Popular Mechanics, April 1, 2009, http://www.popularmechanics.com/technology/digital/3d/4310810.
Another financial pitfall of digital film is, surprisingly, the cost of storage once a film is out of major circulation. For major studios, a significant portion of revenues—around one-third—comes from the rerelease of old films. Studios spend just over $1,000 per film each year to keep their 35-millimeter masters in archival storage.Michael Cieply, "The Afterlife is Expensive for Digital Movies," New York Times, December 23, 2007, http://www.nytimes.com/2007/12/23/business/media/23steal.html. Keeping the film stock at controlled temperature and moisture levels prevents degradation, so masters are often stored in mines, where these conditions can best be maintained.Michael Cieply, "The Afterlife is Expensive for Digital Movies," New York Times, December 23, 2007, http://www.nytimes.com/2007/12/23/business/media/23steal.html.
Digital data, however, for all its sophistication, is actually less likely to last than traditional film: DVDs can degrade rapidly, with only a 50 percent chance of lasting 15 years,Michael Cieply, "The Afterlife is Expensive for Digital Movies," New York Times, December 23, 2007, http://www.nytimes.com/2007/12/23/business/media/23steal.html. while hard drives must be operated occasionally to prevent them from locking up. As a result, the storage cost for digital originals comes closer to $12,500 per film per year.Michael Cieply, "The Afterlife is Expensive for Digital Movies," New York Times, December 23, 2007, http://www.nytimes.com/2007/12/23/business/media/23steal.html. Moreover, as one generation of digital technology gives way to another, files must be migrated to newer formats to keep the originals readable.
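The archival-cost gap described above compounds over a film's shelf life. The sketch below multiplies out the per-year figures cited in the text ($1,000 for a 35-mm master versus $12,500 for a digital original); the 20-year horizon is a hypothetical choice for illustration, not a figure from the text:

```python
# Cumulative archival-cost comparison using the per-year figures
# cited in the text; the 20-year horizon is illustrative only.

FILM_COST_PER_YEAR = 1_000      # 35-mm master in climate-controlled storage
DIGITAL_COST_PER_YEAR = 12_500  # digital original (maintenance, format migration)

def cumulative_cost(annual_cost, years):
    """Total archival spend for one film over the given number of years."""
    return annual_cost * years

years = 20
film_total = cumulative_cost(FILM_COST_PER_YEAR, years)
digital_total = cumulative_cost(DIGITAL_COST_PER_YEAR, years)
print(film_total, digital_total)  # 20000 250000
```

Over two decades, the digital original costs a studio $250,000 against $20,000 for the film master, which is why the "afterlife" of digital movies worries archivists.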
After World War II, as movie attendance began to decline, the motion picture industry experimented with new technologies to entice audiences back into increasingly empty theaters. One such gimmick, the 3-D picture, offered the novel experience of increased audience “participation” as monsters, flying objects, and obstacles appeared to invade the theater space, threatening to collide with spectators. The effect was achieved by manipulating filming equipment to work like a pair of human eyes, mimicking the depth of field produced through binocular vision. By joining two cameras together and spacing them slightly apart with their lenses angled fractionally toward one another, filmmakers could achieve an effect similar to that created by the overlapping fields of vision of the right and left eye. In theaters, the resulting images were played simultaneously on two separate projectors. The 3-D glasses spectators wore were polarized to filter the images so that the left eye received only “left eye” projections and the right eye received only “right eye” projections.Matt Buchanan, “Giz Explains 3D Technologies,” Gizmodo (blog), November 12, 2008, http://gizmodo.com/5084121/giz-explains-3d-technologies.
3-D was an instant sensation. House of Wax, the first big-budget 3-D movie, released in 1953, brought in over $1 million during its first 3 weeks in theaters, making it one of the most successful films of the year. Best of all for investors, 3-D could be created with fairly inexpensive equipment. For this reason, a boom of 3-D development soon occurred nationwide. Forty-six 3-D movies were filmed in a span of 2 years. However, 3-D proved to be a brief success, with its popularity already beginning to wane by the end of 1953.John Hayes, “‘You See Them WITH Glasses!’ A Short History of 3D Movies,” Wide Screen Movies Magazine, 2009, http://widescreenmovies.org/wsm11/3D.htm.
Figure 8.10

Resurgence of 3-D.
3-D soon migrated from the realm of common popular entertainment to novelty attraction, appearing in IMAX cinemas, as an occasional marketing draw for kids’ movies, and in theme-park classics like Captain Eo and Honey, I Shrunk the Audience. Captain Eo, a Disneyland attraction from 1986 to 1993, featured pop sensation Michael Jackson in his heyday. Following Jackson’s death, the film was rereleased for a limited time in 2010.Heather Hust Rivera, “Captain EO Returns to Disneyland Resort.” Disney Parks Blog, December 18, 2009. http://disneyparks.disney.go.com/blog/2009/12/captain-eo-returns-to-disneyland-resort/.
Despite the marginal role 3-D has played since the midcentury fad died out, new technologies have brought about a resurgence in the trend, and the contemporary 3-D experience seems less like a gimmick and more like a serious development in the industry. DreamWorks Animation CEO Jeffrey Katzenberg, for one, likened the new 3-D to the introduction of color.Erin McCarthy, "The Tech Behind 3D's Big Revival," Popular Mechanics, April 1, 2009, http://www.popularmechanics.com/technology/digital/3d/4310810. One of the downfalls that led to the decline of 3-D in the 1950s was the "3-D headache" phenomenon audiences began to experience as a result of technical problems with filming.John Hayes, "'You See Them WITH Glasses!' A Short History of 3D Movies," Wide Screen Movies Magazine, 2009, http://widescreenmovies.org/wsm11/3D.htm. To create the 3-D effect, filmmakers need to calculate the point where the overlapping images converge, an alignment that had to be performed by hand in those early years. And for the resulting image to come through clearly, the parallel cameras must run in perfect sync with one another—another impossibility with 35-millimeter film, which causes some distortion by the very fact of its motion through the filming camera.
Today the 3-D headache is a thing of the past, as computerized calibration makes perfect camera alignment a reality and as the digital recording format eliminates the celluloid-produced distortion. Finally, a single digital projector equipped with a photo-optical device can now perform the work of the two synchronized projectors of the past. For the theater chains, 3-D provides the first real incentive to make the conversion to digital. Not only do audiences turn out in greater numbers for an experience they can’t reproduce at home, even on their HD television sets, but theaters are also able to charge more for tickets to see 3-D films. In 2008, for example, Journey to the Center of the Earth, which grossed $102 million, earned 60 percent of that money through 3-D ticket sales, even though it played in 3-D on only 30 percent of its screens.Erin McCarthy, “The Tech Behind 3D’s Big Revival,” Popular Mechanics, April 1, 2009, http://www.popularmechanics.com/technology/digital/3d/4310810. Two of the top-grossing movies of all time, Avatar (2009) and Alice in Wonderland (2010), were both released in 3-D.
Imagine you work for a major Hollywood studio and you are negotiating a contract with a large theater chain to switch to a digital projection system. Consider the following:
Review Questions
Questions for Section 8.1 "The History of Movies"
Questions for Section 8.2 "Movies and Culture"
Questions for Section 8.3 "Issues and Trends in Film"
Questions for Section 8.4 "The Influence of New Technology"
Research the career of a Hollywood producer. In this career, identify the different types of producers involved in a production. What tasks are these producers expected to perform? Do people in this career specialize in a certain genre of film? If so, which genre would you specialize in and why?
In November 2007, more than 12,000 film, television, and radio writers working in the United States went on strike for fairer pay. Lasting for more than 3 months, the strike disrupted many hit shows in 2008 and cost the Los Angeles economy $2.5 billion.Associated Press, “Writers Strike Cost L.A. Economy $2.5 billion,” MSNBC, February 19, 2008, http://www.msnbc.msn.com/id/23244509/. While production for many television shows was on hiatus, several industry executives were thinking of new ways to reach their audience. Declaring that it was “time to change the face of Show Business as we know it,” Buffy the Vampire Slayer creator Joss Whedon started writing a three-part musical series specifically designed to be released on the Internet.Dr. Horrible’s Sing-Along Blog: Official Fan Site, http://doctorhorrible.net/about/. Starring Neil Patrick Harris as an aspiring supervillain eager to join the Evil League of Evil, Dr. Horrible’s Sing-Along Blog featured three 14-minute acts that told the story of Dr. Horrible; his nemesis, Captain Hammer (Nathan Fillion); and their mutual love interest, Penny (Felicia Day). Whedon cowrote the musical with his brothers and funded the $200,000 project, filming the series in a small studio he set up in his Los Angeles loft and at several outdoor locations in the city. None of the cast or crew initially received compensation for the project.
The first act of Dr. Horrible was released on its official website on July 15, 2008, hosted by free online video service Hulu. Act II followed on July 17 and Act III appeared 2 days later. Viewers could watch all three episodes for free online anywhere in the world. Shortly after the free viewing, the tragicomic musical was made available for purchase on iTunes, where it reached No. 1 on the video chart, totaling 2.2 million downloads per week. The soundtrack was also released via iTunes, obtaining the No. 1 spot on the first day of its release and debuting at No. 39 on the Billboard 200 chart. Capitalizing on the musical’s online success, Whedon greenlit, or authorized, the creation of a DVD, which was released exclusively through Amazon.com on December 19, 2008, and reached the No. 3 position for Amazon’s movies and television sales. Illustrating the participatory nature of the new medium, the DVD extras featured several video submissions from fans. The 3-minute video clips were winning entries from a competition announced at Comic-Con—an annual comic book and fan convention held in San Diego—in which fans explained why they should be inducted into the Evil League of Evil. The DVD also featured a singing commentary from the cast and crew, performed both in and out of character. Internet downloads, DVD sales, and soundtrack sales enabled Whedon to pay back the cast and crew of Dr. Horrible and confirmed the online musical as a viable model for future independent productions.
Although Dr. Horrible was an unlikely commercial success, it quickly became a media sensation. Named one of the best inventions of 2008 by Time magazine and awarded an Emmy for best short-format live-action entertainment program, the online supervillain musical challenged traditional notions that a big-budget studio is necessary to create a hit television series. Along with other successful web television series such as The Guild, Dorm Life, and Lonelygirl15, Dr. Horrible has helped to pave the way for smaller independent companies to create popular entertainment. An understanding of how television evolved and how it is beginning to merge with the Internet will provide insight into the future of content delivery and viewer patterns.
Since replacing radio as the most popular mass medium in the 1950s, television has played such an integral role in modern life that, for some, it is difficult to imagine being without it. Both reflecting and shaping cultural values, television has at times been criticized for its alleged negative influences on children and young people and at other times lauded for its ability to create a common experience for all its viewers. Major world events such as the John F. Kennedy and Martin Luther King assassinations and the Vietnam War in the 1960s, the Challenger shuttle explosion in 1986, the 2001 terrorist attacks on the World Trade Center, and the impact and aftermath of Hurricane Katrina in 2005 have all played out on television, uniting millions of people in shared tragedy and hope. Today, as Internet technology and satellite broadcasting change the way people watch television, the medium continues to evolve, solidifying its position as one of the most important inventions of the 20th century.
Inventors conceived the idea of television long before the technology to create it appeared. Early pioneers speculated that if audio waves could be separated from the electromagnetic spectrum to create radio, so too could television waves be separated to transmit visual images. As early as 1876, Boston civil servant George Carey envisioned complete television systems, and a year later he put forward drawings for a "selenium camera" that would enable people to "see by electricity.""Visionary Period, 1880's Through 1920's," Federal Communications Commission, November 21, 2005, http://www.fcc.gov/omd/history/tv/1880-1929.html.
During the late 1800s, several technological developments set the stage for television. The invention of the cathode ray tube (CRT)—an electronic display device in which a beam of electrons is focused on a glass viewing screen to create an image—by German physicist Karl Ferdinand Braun in 1897 played a vital role as the forerunner of the television picture tube. Initially created as a scanning device known as the cathode ray oscilloscope, the CRT effectively combined the principles of the camera and electricity. It had a fluorescent screen that emitted visible light (in the form of images) when struck by a beam of electrons. The other key invention during the 1880s was the mechanical scanner system. Created by German inventor Paul Nipkow, the scanning disk was a large, flat metal disk with a series of small perforations arranged in a spiral pattern. As the disk rotated, light passed through the holes, separating pictures into pinpoints of light that could be transmitted as a series of electronic lines. The number of scanned lines equaled the number of perforations, and each rotation of the disk produced a television frame. Nipkow's mechanical disk served as the foundation for experiments on the transmission of visual images for several decades.
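The scanning relationship described above (one line per perforation, one frame per rotation) can be expressed as simple arithmetic. The numbers below are hypothetical illustrations in the spirit of early mechanical systems, not values from the text:

```python
# Illustrative model of Nipkow-disk scanning: each perforation traces
# one scan line, and each full rotation of the disk yields one frame.
# The specific numbers here are hypothetical, chosen for illustration.

def nipkow_scan(perforations, rotations_per_second):
    """Return (lines per frame, frames per second) for a Nipkow disk."""
    lines_per_frame = perforations            # one scan line per hole
    frames_per_second = rotations_per_second  # one frame per rotation
    return lines_per_frame, frames_per_second

# A disk with 30 holes spinning 12.5 times per second would yield a
# 30-line image at 12.5 frames per second:
lines, fps = nipkow_scan(perforations=30, rotations_per_second=12.5)
print(lines, fps)
```

The arithmetic makes the technology's limits concrete: sharper pictures demand more holes (a physically larger disk), and less flicker demands faster rotation, which is exactly why mechanical systems stalled at around 240 lines.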
In 1907, Russian scientist Boris Rosing used both the CRT and the mechanical scanner system in an experimental television system. With the CRT in the receiver, he used focused electron beams to display images, transmitting crude geometrical patterns onto the television screen. The mechanical disk system was used as a camera, creating a primitive television system.
Figure 9.1

Two key inventions in the 1880s paved the way for television to emerge: the cathode ray tube and the mechanical disk system.
From the early experiments with visual transmissions, two types of television systems came into existence: mechanical television and electronic television. Mechanical television, a system that used mechanical moving parts to capture and display images, developed out of Nipkow's disk system and was pioneered by British inventor John Logie Baird. In 1926, Baird gave the world's first public demonstration of a television system at Selfridge's department store in London. He used mechanical rotating disks to scan moving images into electrical impulses, which were transmitted by cable to a screen. Here they showed up as a low-resolution pattern of light and dark. Baird's first television program showed the heads of two ventriloquist dummies, which he operated in front of the camera apparatus out of the audience's sight. In 1928, Baird extended his system by transmitting a signal between London and New York. The following year, the British Broadcasting Corporation (BBC) adopted his mechanical system, and by 1932, Baird had developed the first commercially viable television system and sold 10,000 sets. Despite its initial success, mechanical television had several technical limitations. Engineers could get no more than about 240 lines of resolution, meaning images would always be slightly fuzzy (most modern televisions produce images of more than 600 lines of resolution). The use of a spinning disk also limited the number of new pictures that could be seen per second, resulting in excessive flickering. These mechanical limitations had to be overcome for the technology to move forward.
At the same time Baird (and, separately, American inventor Charles Jenkins) was developing the mechanical model, other inventors were working on an electronic television system based on the CRT, in which images were scanned by an electronic camera and received by a cathode ray tube. While working on his father's farm, Idaho teenager Philo Farnsworth realized that an electron beam could scan a picture in horizontal lines, reproducing the image almost instantaneously. In 1927, Farnsworth transmitted the first all-electronic television picture by rotating a single straight line scratched onto a square piece of painted glass by 90 degrees.
Farnsworth barely profited from his invention; during World War II, the government suspended sales of television sets, and by the time the war ended, Farnsworth’s original patents were close to expiring. However, following the war, many of his key patents were modified by RCA and were widely applied in broadcasting to improve television picture quality.
Figure 9.2

The low image resolution of John Logie Baird’s mechanical television was a major disadvantage that led to the technology’s replacement by electronic television systems.
After the two systems coexisted for several years, electronic television sets eventually began to replace mechanical ones. With better picture quality, no noise, a more compact size, and fewer visual limitations, the electronic system was far superior to its predecessor and rapidly improving. By 1939, the last mechanical television broadcasts in the United States had been replaced with electronic broadcasts.
Television broadcasting began as early as 1928, when the Federal Radio Commission authorized inventor Charles Jenkins to broadcast from W3XK, an experimental station in the Maryland suburbs of Washington, DC. Silhouette images from motion picture films were broadcast to the general public on a regular basis, at a resolution of just 48 lines. Similar experimental stations ran broadcasts throughout the early 1930s. In 1939, RCA subsidiary NBC (National Broadcasting Company) became the first network to introduce regular television broadcasts, transmitting its inaugural telecast of the opening ceremonies at the New York World’s Fair. The station’s initial broadcasts transmitted to just 400 television sets in the New York area, with an audience of 5,000 to 8,000 people.Lenox Lohr, Television Broadcasting (New York: McGraw Hill, 1940).
Television was initially available only to the privileged few, with sets ranging from $200 to $600—a hefty sum in the 1930s, when the average annual salary was $1,368.Library, Lone Star College–Kingwood, "American Cultural History 1930–1939," http://kclibrary.lonestar.edu/decade30.html. RCA offered four types of television receivers, which received channels 1 through 5 and were sold in high-end department stores such as Macy's and Bloomingdale's. Early receivers were a fraction of the size of modern television sets, featuring 5-, 9-, or 12-inch screens. Television sales prior to World War II were disappointing—an uncertain economic climate, the threat of war, the high cost of a television receiver, and the limited number of programs on offer deterred numerous prospective buyers. Many unsold television sets were put into storage and sold after the war.
NBC was not the only commercial network to emerge in the 1930s. RCA's radio rival CBS (Columbia Broadcasting System) also began broadcasting regular programs. So that viewers would not need a separate television set for each individual network, the Federal Communications Commission (FCC) outlined a single technical standard. In 1941, the panel recommended a 525-line system and an image rate of 30 frames per second. It also recommended that all U.S. television sets operate using analog signals (broadcast signals made of varying radio waves). Analog signals were eventually replaced by digital signals (signals transmitted as binary code) in 2009.
With the outbreak of World War II, many companies, including RCA and General Electric, turned their attention to military production. Instead of commercial television sets, they began to churn out military electronic equipment. In addition, the war halted nearly all television broadcasting; many television stations reduced their schedules to around 4 hours per week or went off the air altogether.
Although it did not become available until the 1950s or popular until the 1960s, the technology for producing color television was proposed as early as 1904, and was demonstrated by John Logie Baird in 1928. As with his black-and-white television system, Baird adopted the mechanical method, using a Nipkow scanning disk with three spirals, one for each primary color (red, green, and blue). In 1940, CBS researchers, led by Hungarian television engineer Peter Goldmark, used Baird’s 1928 designs to develop a concept of mechanical color television that could reproduce the color seen by a camera lens.
Following World War II, the National Television System Committee (NTSC) worked to develop an all-electronic color system that was compatible with black-and-white television sets, gaining FCC approval in 1953. A year later, NBC made the first national color broadcast when it telecast the Tournament of Roses Parade. Despite the television industry’s support for the new technology, it would be another 10 years before color television gained widespread popularity in the United States, and black-and-white television sets outnumbered color television sets until 1972.John Klooster, Icons of Invention: The Makers of the Modern World from Gutenberg to Gates (Santa Barbara, CA: ABC-CLIO, 2009), 442.
Figure 9.3

During the so-called “golden age” of television, the percentage of U.S. households that owned a television set rose from 9 percent in 1950 to 95.3 percent in 1970.
The 1950s proved to be the golden age of television, during which the medium experienced massive growth in popularity. Mass-production advances made during World War II substantially lowered the cost of purchasing a set, making television accessible to the masses. In 1945, there were fewer than 10,000 television sets in the United States. By 1950, this figure had soared to around 6 million, and by 1960 more than 60 million television sets had been sold.World Book Encyclopedia (2003), s.v. “Television.” Many of the early television program formats were based on network radio shows and did not take advantage of the potential offered by the new medium. For example, newscasters simply read the news as they would have during a radio broadcast, and the network relied on newsreel companies to provide footage of news events. However, during the early 1950s, television programming began to branch out from radio broadcasting, borrowing from theater to create acclaimed dramatic anthologies such as Playhouse 90 (1956) and The U.S. Steel Hour (1953) and producing quality news film to accompany coverage of daily events.
Two new types of programs—the magazine format and the television spectacular—played an important role in helping the networks gain control over the content of their broadcasts. Early television programs were developed and produced by a single sponsor, which gave the sponsor a large amount of control over the content of the show. By increasing program length from the standard 15-minute radio show to 30 minutes or longer, the networks substantially increased advertising costs for program sponsors, making the expense prohibitive for any single sponsor. Magazine programs such as the Today show and The Tonight Show, which premiered in the early 1950s, featured multiple segments and ran for several hours. They were also screened on a daily, rather than weekly, basis, drastically increasing advertising costs. As a result, the networks began to sell spot advertisements that ran for 30 or 60 seconds. Similarly, the television spectacular (now known as the television special) featured lengthy music-variety shows that were sponsored by multiple advertisers.
Figure 9.4

ABC’s Who Wants to Be a Millionaire brought the quiz show back to prime-time television after a 40-year absence.
In the mid-1950s, the networks brought back the radio quiz-show genre. Inexpensive and easy to produce, the trend caught on, and by the end of the 1957–1958 season, 22 quiz shows were being aired on network television, including CBS’s The $64,000 Question. Shorter than some of the new types of programs, quiz shows enabled single corporate sponsors to have their names displayed on the set throughout the show. The popularity of the quiz-show genre plunged at the end of the decade, however, when it was discovered that most of the shows were rigged. Producers provided some contestants with the answers to the questions in order to pick and choose the most likable or controversial candidates. When a slew of contestants accused the show Dotto of being fixed in 1958, the networks rapidly dropped 20 quiz shows. A New York grand jury probe and a 1959 congressional investigation effectively ended prime-time quiz shows for 40 years, until ABC revived the genre with its launch of Who Wants to Be a Millionaire in 1999.William Boddy, “The Seven Dwarfs and the Money Grubbers,” in Logics of Television: Essays in Cultural Criticism, ed. Patricia Mellencamp (Bloomington, IN: Indiana University Press, 1990), 98–116.
Formerly known as Community Antenna Television, or CATV, cable television (a system of providing television and other media services to consumers via coaxial cable, with subscribers connected through a central community antenna) was originally developed in the 1940s in remote or mountainous areas—including in Arkansas, Oregon, and Pennsylvania—to enhance poor reception of regular television signals. Cable antennas were erected on mountains or other high points, and homes connected to the towers would receive broadcast signals.
In the late 1950s, cable operators began to experiment with microwave relays to bring in signals from distant cities. Taking advantage of their ability to receive long-distance broadcast signals, operators branched out from providing a local community service and began focusing on offering consumers more extensive programming choices. Rural parts of Pennsylvania, which had only three channels (one for each network), soon had more than double the original number of channels as operators began to import programs from independent stations in New York and Philadelphia. The wider variety of channels and clearer reception the service offered soon attracted viewers from urban areas. By 1962, nearly 800 cable systems were operational, serving 850,000 subscribers.
Figure 9.5

The Evolution of Television
Cable’s exponential growth was viewed as competition by local television stations, and broadcasters campaigned for the FCC to step in. The FCC responded by placing restrictions on the ability of cable systems to import signals from distant stations, which froze the development of cable television in major markets until the early 1970s. When gradual deregulation began to loosen the restrictions, cable operator Service Electric launched the service that would change the face of the cable television industry—pay TV (a subscription-based television service in which consumers pay a fee to the program provider). The 1972 Home Box Office (HBO) venture, in which customers paid a subscription fee to access premium cable television shows and video-on-demand products, was the nation’s first successful pay cable service. HBO’s use of a satellite to distribute its programming made the network available throughout the United States. This gave it an advantage over the microwave-distributed services, and other cable providers quickly followed suit. Further deregulation provided by the 1984 Cable Act enabled the industry to expand even further, and by the end of the 1980s, nearly 53 million households subscribed to cable television (see Section 6.3 "Current Popular Trends in the Music Industry"). In the 1990s, cable operators upgraded their systems by building higher-capacity hybrid networks of fiber-optic and coaxial cable. These broadband networks (high-speed connections that can carry data, voice, television, and video at higher speeds and in greater quantities than traditional connections) provide a multichannel television service, along with telephone, high-speed Internet, and advanced digital video services, using a single wire.
Following the FCC standards set out during the early 1940s, television sets received programs via analog signals made of radio waves. The analog signal reached television sets through three different methods: over the airwaves, through a cable wire, or by satellite transmission. Although the system remained in place for more than 60 years, it had several disadvantages. Analog systems were prone to static and distortion, resulting in far poorer picture quality than films shown in movie theaters. As television sets grew increasingly larger, the limited resolution made scan lines painfully obvious, reducing the clarity of the image. Companies around the world, most notably in Japan, began to develop technology that provided newer, better-quality television formats, and the broadcasting industry began to lobby the FCC to create a committee to study the desirability and impact of switching to digital television (television that uses signals that translate images and sounds into binary code). A more efficient and flexible form of broadcast technology, digital television works in much the same way as a computer; its binary signals require much less frequency space and also provide a far higher-quality picture. In 1987, the Advisory Committee on Advanced Television Services began meeting to test various television systems, both analog and digital. The committee ultimately agreed to switch from analog to digital format in 2009, allowing a transition period in which broadcasters could send their signal on both an analog and a digital channel. Once the switch took place, many older analog television sets were unusable without a cable or satellite service or a digital converter.
To retain consumers’ access to free over-the-air television, the federal government offered $40 gift cards to people who needed to buy a digital converter, expecting to recoup its costs by auctioning off the old analog broadcast spectrum to wireless companies.Jacques Steinberg, “Converters Signal a New Era for TVs,” New York Times, June 7, 2007, http://www.nytimes.com/2007/06/07/technology/07digital.html. These companies were eager to gain access to the analog spectrum for mobile broadband projects because this frequency band allows signals to travel greater distances and penetrate buildings more easily.
Around the same time the U.S. government was reviewing the options for analog and digital television systems, companies in Japan were developing technology that worked in conjunction with digital signals to create crystal-clear pictures in a wide-screen format. High-definition television, or HDTV (a wide-screen television system that creates a cinematic experience for the viewer), attempts to create a heightened sense of realism by providing the viewer with an almost three-dimensional experience. It has a much higher resolution than standard television systems, using around five times as many pixels per frame. First available in 1998, HDTV products were initially extremely expensive, priced between $5,000 and $10,000 per set. However, as with most new technology, prices dropped considerably over the next few years, making HDTV affordable for mainstream shoppers.
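The pixel-count comparison above can be sanity-checked with simple arithmetic. The exact ratio depends on which formats are compared; the sketch below assumes a 1920×1080 full-HD frame and a 640×480 standard-definition frame, which are common reference figures rather than the only possibilities:

```python
# Rough pixel-count comparison. The HD and SD resolutions below are
# illustrative assumptions; actual broadcast formats vary.
hd_pixels = 1920 * 1080   # one full-HD frame
sd_pixels = 640 * 480     # one standard-definition frame

print(hd_pixels)              # 2073600
print(sd_pixels)              # 307200
print(hd_pixels / sd_pixels)  # 6.75
```

Depending on the formats chosen, the ratio works out to roughly five to seven times as many pixels per frame, consistent with the “around five times” figure cited in the text.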
Figure 9.6

HDTV uses a wide-screen format with a different aspect ratio (the ratio of the width of the image to its height) than standard-definition television. The wide-screen format of HDTV is similar to that of movies, allowing for a more authentic film-viewing experience at home.
© Thinkstock
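The caption’s definition of aspect ratio can be made concrete with a small worked example (the 16:9 wide-screen and 4:3 standard-definition figures are the conventional ratios, added here purely for illustration):

```python
# Aspect ratio = width of the image divided by its height.
hd_ratio = 16 / 9   # wide-screen HDTV
sd_ratio = 4 / 3    # traditional standard-definition screen

print(round(hd_ratio, 2))  # 1.78
print(round(sd_ratio, 2))  # 1.33
```

The wider 16:9 frame is what gives HDTV its movie-like proportions compared with the squarer 4:3 screen.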
As of 2010, nearly half of American viewers were watching television in high definition, the fastest adoption of television technology since the introduction of the VCR in the 1980s.Brian Stelter, “Crystal-Clear, Maybe Mesmerizing,” New York Times, May 23, 2010, http://www.nytimes.com/2010/05/24/business/media/24def.html. The new technology is attracting viewers to watch television for longer periods of time. According to the Nielsen Company, which measures television viewership, households with HDTV watch 3 percent more prime-time television—programming screened between 7 and 11 p.m., when the largest audience is available—than their standard-definition counterparts.Brian Stelter, “Crystal-Clear, Maybe Mesmerizing,” New York Times, May 23, 2010, http://www.nytimes.com/2010/05/24/business/media/24def.html. The same report claims that the cinematic experience of HDTV is bringing families back together in the living room in front of the large wide-screen television and out of the kitchen and bedroom, where individuals tend to watch television alone on smaller screens. However, these viewing patterns may change again soon as the Internet plays an increasingly larger role in how people view television programs. The impact of new technologies on television is discussed in much greater detail in Section 9.4 "Influence of New Technologies" of this chapter.
Figure 9.7

Since 1950, the amount of time the average household spends watching television has almost doubled.
Since its inception as an integral part of American life in the 1950s, television has both reflected and nurtured cultural mores and values. From the escapist dramas of the 1960s, which consciously avoided controversial issues and glossed over life’s harsher realities in favor of an idealized portrayal, to the copious reality television shows in recent years, on which participants discuss even the most personal and taboo issues, television has held up a mirror to society. But the relationship between social attitudes and television is reciprocal; broadcasters have often demonstrated their power to influence viewers, either consciously through slanted political commentary, or subtly, by portraying controversial relationships (such as single parenthood, same-sex marriages, or interracial couplings) as socially acceptable. The symbiotic nature of television and culture is exemplified in every broadcast, from family sitcoms to serious news reports.
In the 1950s, most television entertainment programs ignored current events and political issues. Instead, the three major networks (ABC, NBC, and CBS) developed prime-time shows that would appeal to a general family audience. Chief among these was the domestic comedy—a generic family comedy identified by its character-based humor and usually set within the home. Seminal examples included popular 1950s shows such as Leave It to Beaver, The Donna Reed Show, and The Adventures of Ozzie and Harriet. Presenting a standardized version of the white middle-class suburban family, domestic comedies portrayed the conservative values of an idealized American life. Studiously avoiding prevalent social issues such as racial discrimination and civil rights, the shows focused on mostly white middle-class families with traditional nuclear roles (mother in the home, father in the office) and implied that most domestic problems could be solved within a 30-minute time slot, always ending with a strong moral lesson.
Although these shows depicted an idealized version of American family life, many families in the 1950s were traditional nuclear families. Following the widespread poverty, political uncertainty, and physical separation of the war years, many Americans wanted to settle down, have children, and enjoy the peace and security that family life appeared to offer. During the booming postwar era, a period of optimism and prosperity, the traditional nuclear family flourished. However, the families and lifestyles presented in domestic comedies did not encompass the overall American experience by any stretch of the imagination. As historian Stephanie Coontz points out, “the June Cleaver or Donna Stone homemaker role was not available to the more than 40 percent of black women with small children who worked outside the home.”Stephanie Coontz, “‘Leave It to Beaver’ and ‘Ozzie and Harriet’: American Families in the 1950s,” in The Way We Never Were: American Families and the Nostalgia Trip (New York: BasicBooks, 1992), 28. Although nearly 60 percent of the U.S. population was labeled middle class by the mid-1950s, 25 percent of all families and more than 50 percent of two-parent black families were poor. Migrant workers suffered horrific deprivations, and racial tensions were rife. None of this was reflected in the world of domestic comedies, where even the Hispanic gardener in Father Knows Best was named Frank Smith.Stephanie Coontz, “‘Leave It to Beaver’ and ‘Ozzie and Harriet’: American Families in the 1950s,” in The Way We Never Were: American Families and the Nostalgia Trip (New York: BasicBooks, 1992), 28.
Figure 9.8

Most domestic comedies in the 1950s portrayed an idealized version of family life and ignored social and political events.
Not all programs in the 1950s shied away from controversial social or political issues. In March 1954, journalist Edward R. Murrow broadcast an unflattering portrait of U.S. Senator Joseph McCarthy on his show See It Now. McCarthy, a member of the Senate Investigation Committee, had launched inquiries into potential Communist infiltration of U.S. institutions. Murrow believed that McCarthy’s aggressive tactics posed a potential threat to civil liberties. The broadcast cast the senator from Wisconsin in a harsh light by pointing out contradictions in his speeches, and it caused such an uproar that McCarthy was formally reprimanded by the U.S. Senate.Michael J. Friedman, “‘See It Now’: Murrow vs. McCarthy,” in Edward R. Murrow: Journalism at Its Best, publication of U.S. Department of State, June 1, 2008, http://www.america.gov/st/democracyhr-english/2008/June/20080601110244eaifas8.602542e-02.html.
Entertainment programs also tackled controversial issues. The long-running television western Gunsmoke, which aired on CBS from 1955 to 1975, flourished in a Cold War society, where U.S. Marshal Matt Dillon (James Arness) stood up to lawlessness in defense of civilization. The characters and community in Gunsmoke faced relevant social issues, including the treatment of minority groups, the meaning of family, the legitimacy of violence, and the strength of religious belief. During the 1960s, the show adapted to the desires of its viewing audience, becoming increasingly aware of and sympathetic to ethnic minorities, in tune with the national mood during the civil rights era. This adaptability helped the show to become the longest-running western in television history.
During the 1960s, television news broadcasts brought the realities of real-world events into people’s living rooms in vivid detail. The CBS Evening News with Walter Cronkite, which debuted in 1962, quickly became the country’s most popular newscast, and by the end of the decade, journalist Walter Cronkite was known as the most trusted man in America. Following John F. Kennedy’s election to the presidency at the beginning of the decade, the 1960s took an ominous turn. Shocked viewers tuned into Cronkite’s broadcast on November 22, 1963, to learn about the assassination of their president. During the next few days, viewers followed every aspect of the tragedy on television, from the tremor in Cronkite’s voice as he removed his glasses and announced the news of Kennedy’s death, to the frantic scenes from Dallas police headquarters where the assassin, Lee Harvey Oswald, was gunned down by nightclub owner Jack Ruby, to the thousands of mourners lining up next to the president’s flag-draped coffin.
Figure 9.9

Television began to play a major role in U.S. politics during the presidency of John F. Kennedy.
Around the same time as Kennedy’s assassination, horrific images from Vietnam were streaming into people’s living rooms during the nation’s first televised war. With five camera crews on duty in the Saigon bureau, news crews captured vivid details of the war in progress. Although graphic images were rarely shown on network television, several instances of violence reached the screen, including a CBS report in 1965 that showed Marines lighting the thatched roofs of the village of Cam Ne with Zippo lighters and an NBC news report in 1968 that aired a shot of South Vietnamese General Nguyen Ngoc Loan executing a captive on a Saigon street. Further images, of children being burned and scarred by napalm and prisoners being tortured, fueled the antiwar sentiments of many Americans. In addition to the devastation caused by the president’s death and the Vietnam War, Americans were also feeling the pressure of the Cold War—the clash between the United States and the Soviet Union in the years following World War II. This pressure was especially great during periods of tension throughout the 1950s and 1960s, such as the 1962 Cuban Missile Crisis, a confrontation that caused many people to fear nuclear war.
As a result of the intense stress faced by many Americans during the 1960s, broadcasters and viewers turned to escapist programs such as I Dream of Jeannie, a fantasy show about a 2,000-year-old genie who marries an astronaut, and Bewitched, a supernatural-themed show about a witch who tries to live as a suburban housewife. Both shows typified the situation comedy, or sitcom—a comedy genre featuring a recurring cast of characters who resolve zany situations based on their everyday lives. Other popular sitcoms in the 1960s included The Beverly Hillbillies, a show about a poor backwoods family who move to Beverly Hills, California, after finding oil on their land, and Gilligan’s Island, the ultimate escapist comedy about seven characters shipwrecked on an uncharted island. None of the 1960s sitcoms mentioned the political unease taking place in the outside world, providing audiences with a welcome diversion from real life. Other than an occasional documentary, television programming in the 1960s consisted of a sharp dichotomy between prime-time escapist comedy and hard news.
Figure 9.10

Escapist sitcoms like I Dream of Jeannie provided Americans with a much-needed diversion from the stressful events of the 1960s.
During the 1970s, broadcasters began to diversify families on their shows to reflect changing social attitudes toward formerly controversial issues such as single parenthood and divorce. Feminist groups including the National Organization for Women (NOW), the National Women’s Political Caucus, and the Coalition of Labor Union Women pushed for equality on issues such as pay and encouraged women to enter the workforce. In 1973, the U.S. Supreme Court sanctioned women’s right to abortion, giving them control over their reproductive rights. Divorce rates skyrocketed during the 1970s, as states adopted no-fault divorce laws, and the change in family dynamics was reflected on television. Between 1972 and 1978, CBS aired the socially controversial sitcom Maude. Featuring a middle-aged feminist living with her fourth husband and divorced daughter, the show exploded the dominant values of the white middle-class domestic sitcom and its traditional gender roles. Throughout its six-season run, Maude tackled social and political issues such as abortion, menopause, birth control, alcoholism, and depression. During its first four seasons, the show was in the top 10 in Nielsen ratings, illustrating the changing tastes of the viewing audience, who had come of age during the era of civil rights and Vietnam protests and developed a taste for socially conscious television. Other 1970s sitcoms took the same approach, including Maude’s CBS predecessor, All in the Family, which covered issues ranging from racism and homophobia to rape and miscarriage, and The Mary Tyler Moore Show, which reflected changing attitudes toward women’s rights by featuring television’s first never-married independent career woman as the central character. Even wholesome family favorite The Brady Bunch, which ran from 1969 to 1974, featured a non-nuclear family, reflecting the rising rates of blended families in American society.
Figure 9.11

The popularity of controversial shows like Maude reflected the changing cultural and social values of the 1970s.
In addition to changing family dynamics on sitcoms and other prime-time shows, variety and comedy sketch shows developed a political awareness in the 1970s that reflected audiences’ growing appetite for social and political commentary. Sketch comedy—a series of short comedy scenes or vignettes, often featured on variety shows, talk shows, or comedy shows—grew in popularity during the decade as sketches parodying American popular culture and politics found an audience. The sketch comedy show Saturday Night Live (SNL) premiered on NBC in 1975 and has remained on the air ever since. Featuring a different celebrity guest host every week and relatively unknown comedy regulars, the show parodies contemporary popular culture and politics, lambasting presidential candidates and pop stars alike. The earlier NBC sketch comedy show Laugh-In, which ran from 1968 to 1973, also featured politically charged material, though it lacked the satirical bite of later series such as SNL. By the end of the decade, television broadcasting reflected a far more politically conscious and socially aware viewing audience.
Until the mid-1980s, the top three networks (ABC, NBC, and CBS) dominated television broadcasting in the United States. However, as cable services gained popularity following the deregulation of the industry in 1984, viewers found themselves with a multitude of options. Services such as Cable News Network (CNN), Entertainment and Sports Programming Network (ESPN), and Music Television (MTV) profoundly altered the television landscape in the world of news, sports, and music. New markets opened up for these innovative program types, as well as for older genres such as the sitcom. During the 1980s, a revival of family sitcoms took place with two enormous hits: The Cosby Show and Family Ties. Both featured a new take on modern family life, with the mothers working outside of the home and the fathers pitching in with housework and parental duties. Despite their success on network television, sitcoms faced stiff competition from cable’s variety of choices. Between 1983 and 1994, weekly broadcast audience shares (a measure of the number of televisions in use that are tuned to a particular show) for network television dropped from 69 to 52, while cable networks’ shares rose from 9 to 26.Horace Newcomb, ed., Encyclopedia of Television (New York: Fitzroy Dearborn, 2004), 389.
With a growing number of households subscribing to cable television, concern began to grow about the levels of violence to which children were being exposed. In addition to regularly broadcast network programs, cable offered viewers the chance to watch films and adult-themed shows at all hours, many of which had far more violent content than normal network programming. One study found that by the time an average child leaves elementary school, he or she has witnessed 8,000 murders and more than 100,000 other acts of violence on television.Rea Blakey, “Study Links TV Viewing Among Kids to Later Violence,” CNN Health, March 28, 2002, http://archives.cnn.com/2002/HEALTH/parenting/03/28/kids.tv.violence/index.html. Although no conclusive links have been drawn between witnessing violence on television and carrying out violence in real life, the loosening boundaries regarding sexual and violent content on television are a persistent cause for concern for many parents. The social effects of violence in the media are discussed in more detail elsewhere in this book.
Although television viewership is growing, the vast number of cable channels and other, newer content delivery platforms means that audiences are thinly stretched. In recent years, broadcasters have been narrowing the focus of their programming to meet the needs and interests of an increasingly fragmented audience. Entire cable channels devoted to cooking, music, news, African American interests (see sidebar below), weather, and courtroom drama enable viewers to choose exactly what type of show they want to watch, and many news channels are further specialized according to viewers’ political opinions. This trend toward specialization reflects a more general shift within society, as companies cater increasingly to smaller, more targeted consumer bases. Business magazine editor Chris Anderson explains, “We’re leaving the watercooler era, when most of us listened, watched and read from the same relatively small pool of mostly hit content. And we’re entering the microculture era, when we are all into different things.”Marc Gunther, “The Extinction of Mass Culture,” CNN Money, July 12, 2006, http://money.cnn.com/2006/07/11/news/economy/pluggedin_gunther.fortune/index.htm. Just as cable broadcasters are catering to niche markets, Internet-based companies such as Amazon.com and Netflix are taking advantage of this concept by selling large numbers of books, DVDs, and music albums with narrow appeal. Later sections of this chapter will cover the recent trends and issues of this era in television.
Launched in 1980, Black Entertainment Television (BET) was the first television network in the United States dedicated to the interests of African American viewers. The basic-cable franchise was created in Washington, DC, by media entrepreneur Robert Johnson, who initially invested $15,000 in the venture. Within a decade, he had turned the company into a multimillion-dollar enterprise, and in 1991 it became the first black-controlled company on the New York Stock Exchange. The company was sold to Viacom in 2001 for approximately $3 billion.
Predating MTV by a year, BET initially focused on black-oriented music videos but soon diversified into original urban-oriented programs and public affairs shows. Although BET compensated somewhat for the underrepresentation of blacks on television (African Americans made up 8 percent of the prime-time characters on television in 1980 but made up 12 percent of the population), viewers complained about the portrayal of stereotypical images and inappropriate violent or sexual behavior in many of the rap videos shown by the network. In a 2004 interview with BET vice president of communications Michael Lewellen, former BET talk show host Bev Smith said, “We had videos on BET in those days that were graphic but didn’t proliferate as they seem to be doing now. That’s all you do seem to see are scantily dressed women who a lot of African American women are upset about in those videos.”The O’Reilly Factor, “Is Black Entertainment Television Taking a Disturbing Turn?” Fox News, May 26, 2004, http://www.foxnews.com/story/0,2933,120993,00.html. Despite the criticisms, BET remained the No. 1 cable network among blacks 18 to 34 in 2010 and retained an average audience of 524,000 total viewers during the first quarter of the year.Forbes, “BET Networks Unveils New African American Consumer Market Research and New Programming at 2010 Upfront Presentation,” April 14, 2010, http://www.forbes.com/feeds/prnewswire/2010/04/14/prnewswire201004141601PR_NEWS_USPR_____NE86679.html.
Despite entering a microculture era with a variety of niche markets, television remains the most important unifying cultural presence in the United States. During times of national crises, television news broadcasts have galvanized the country by providing real-time coverage of major events. When terrorists crashed planes into the World Trade Center towers in 2001, 24-hour television news crews provided stunned viewers around the world with continuous updates about the attack and its aftermath. Meanwhile, network blockbusters such as Lost and 24 have united viewers in shared anticipation, launching numerous blogs, fan sites, and speculative workplace discussions about characters’ fates.
Televised coverage of the news has had several cultural effects since the 1950s. Providing viewers with footage of the most intense human experiences, televised news has been able to reach people in a way that radio and newspapers cannot. The images themselves have played an important role in influencing viewer opinion. During the coverage of the civil rights movement, for example, footage of a 1963 attack on civil rights protesters in Birmingham, Alabama, showed police blasting African American demonstrators—many of them children—with fire hoses. Coupled with images of angry white segregationist mobs squaring off against black students, the news footage did much to sway public opinion in favor of liberal legislation such as the 1965 Voting Rights Act. Conversely, when volatile pictures of the race riots in Detroit and other cities in the late 1960s hit the airwaves, horrified viewers saw the need for a return to law and order. The footage helped create an anti-civil-rights backlash that encouraged many viewers to vote for conservative Republican Richard Nixon during the 1968 presidential election.
During the past few decades, mass-media news coverage has gone beyond swaying public opinion through mere imagery. Trusted centrist voices such as that of Walter Cronkite, who was known for his impartial reporting of some of the biggest news stories in the 1960s, have been replaced by highly politicized news coverage on cable channels such as conservative Fox News and liberal MSNBC. As broadcasters narrow their focus to cater to more specialized audiences, viewers choose to watch the networks that suit their political bias. Middle-of-the-road network CNN, which aims for nonpartisanship, frequently loses out in the ratings wars against Fox and MSNBC, both of which have fierce groups of supporters. As one reporter put it, “A small partisan base is enough for big ratings; the mildly interested middle might rather watch Grey’s Anatomy.”James Poniewozik, “CNN: Can a Mainstream News Outlet Survive?” Time, May 3, 2010, http://www.time.com/time/magazine/article/0,9171,1983901,00.html. Critics argue that partisan news networks (channels that cater to niche political audiences by offering a right-wing or left-wing viewpoint rather than attempting to remain impartial) cause viewers to have less understanding of opposing political opinions, making them more polarized.
Table 9.1 Partisan Profile of Television News Audiences 2008
| News Channel | Republican (%) | Democratic (%) | Independent (%) |
|---|---|---|---|
| Fox News | 39 | 33 | 22 |
| Nightly Network | 22 | 45 | 26 |
| MSNBC | 18 | 45 | 27 |
| CNN | 18 | 51 | 23 |
| NewsHour | 21 | 46 | 23 |
Source: “Partisanship and Cable News Audiences,” Oct. 30, 2009, Pew Research Center for the People & the Press, a project of the Pew Research Center.
The issue of whether television producers have a responsibility to promote particular social values continues to generate heated discussion. When the unmarried title character in the CBS series Murphy Brown—a comedy show about a divorced anchorwoman—got pregnant and chose to have the baby without any involvement from the father, then–Vice President Dan Quayle referenced the show as an example of degenerating family values. Linking the 1992 Los Angeles riots to a breakdown of family structure and social order, Quayle lambasted producers’ poor judgment, saying, “It doesn’t help matters when prime-time TV has Murphy Brown, a character who supposedly epitomizes today’s intelligent, highly paid professional woman, mocking the importance of fathers by bearing a child alone, and calling it just another ‘lifestyle choice.’”Time, “Dan Quayle vs. Murphy Brown,” June 1, 1992, http://www.time.com/time/magazine/article/0,9171,975627,00.html. Quayle’s outburst sparked lively debate between supporters and opponents of his viewpoint, with some praising his outspoken social commentary and others dismissing him as out of touch with America and its growing number of single mothers.
Similar controversy arose with the portrayal of openly gay characters on prime-time television shows. When the lead character on the ABC sitcom Ellen came out in 1997 (2 weeks after Ellen DeGeneres, the actress who played the role, announced that she was gay), she became the first leading gay character on both broadcast and cable networks. The show proved to be a test case for the nation’s tolerance of openly gay characters on prime-time television and became the subject of much debate. Embraced by liberal supporters and lambasted by conservative objectors (evangelical Baptist minister Jerry Falwell infamously dubbed her “Ellen DeGenerate”), both the actress and the show furthered the quest to make homosexuality acceptable to mainstream audiences. Although Ellen was canceled the following year (amid disagreements with producers about whether it should contain a parental advisory warning), DeGeneres successfully returned to television in 2003 with her own talk show. Subsequent shows with prominent gay characters were quick to follow in Ellen’s footsteps. According to the Gay & Lesbian Alliance Against Defamation (GLAAD), 18 lesbian, gay, bisexual, or transgender characters accounted for 3 percent of scripted series regulars in the 2009–2010 broadcast television schedule, up from 1.3 percent in 2006.Wendy Mitchell, “GLAAD Report: Gay Characters on Network TV Still on the Rise,” Entertainment Weekly, September 30, 2009, http://hollywoodinsider.ew.com/2009/09/30/glaad-report-gay-characters-on-rise/.
Emerging out of the 1948 television series Candid Camera, in which people were secretly filmed responding to elaborate practical jokes, reality television aimed to capture real, unscripted life on camera (though many reality shows are contrived or deliberately manufactured by producers). The genre developed in several different directions, from home-video clip shows (America’s Funniest Home Videos, America’s Funniest People) to true-crime reenactment shows (America’s Most Wanted, Unsolved Mysteries) to thematic shows based on professions of interest (Project Runway, Police Women of Broward County, Top Chef). Near the turn of the millennium, the genre began to lean toward more voyeuristic shows, such as MTV’s The Real World, an unscripted “documentary” that followed the lives of seven strangers selected to live together in a large house or apartment in a major city. The show drew criticisms for glamorizing bad behavior and encouraging excessive drinking and casual sex, although its ratings soared with each successive controversy (a trend that critics claim encouraged producers to actively stage ratings-grabbing scenarios). During the late 1990s and 2000s, a wave of copycat reality television shows emerged, including the voyeuristic series Big Brother, which filmed a group of strangers living together in an isolated house full of cameras in an attempt to win large amounts of cash, and Survivor, a game show in which participants competed against each other by performing endurance challenges on an uninhabited island. Survivor’s success as the most popular show on television in the summer of 2000 ensured the continued growth of the reality television genre, and producers turned their attention to reality dating shows such as The Bachelor, Temptation Island, and Dating in the Dark.
Cheap to produce, with a seemingly never-ending supply of willing contestants and eager advertising sponsors, reality television shows continue to bring in big ratings. As of 2010, singing talent competition American Idol is television’s biggest revenue generator, pulling in $8.1 million in advertising sales every 30 minutes it is on the air.Paul Bond, “‘Idol’ Listed as TV’s Biggest Revenue Generator,” Hollywood Reporter, May 5, 2010, http://www.hollywoodreporter.com/hr/content_display/news/e3i8f1f42046a622bda2d602430b16d3ed9.
Figure 9.12

The stress of appearing on reality television shows has proved detrimental to some contestants’ health. Britain’s Got Talent star Susan Boyle suffered a nervous breakdown in 2009.
Reality television has created the cultural phenomenon of the instant celebrity. Famous for simply being on the air, reality show contestants are extending their 15 minutes in the spotlight. Kate Gosselin, star of Jon & Kate Plus 8, a cable television show about a couple who have eight children, has since appeared in numerous magazine articles, and in 2010 she starred on celebrity reality dance show Dancing with the Stars. Survivor contestant Elisabeth Hasselbeck became a co-host on television talk show The View, and several American Idol contestants (including Kelly Clarkson and Carrie Underwood) have become household names. The genre has drawn criticism for creating a generation that expects to achieve instant wealth without having to try very hard and also for preying on vulnerable people whom critics call “disposable.” When Britain’s Got Talent star Susan Boyle suffered a public meltdown in 2009 after the stress of transitioning from obscurity to stardom in an extremely short time period, the media began to point out the dangers of reality television. In 2009, TheWrap.com investigated the current lives of former stars of reality shows such as The Contender, Paradise Hotel, Wife Swap, and Extreme Makeover and found that at least 11 participants had committed suicide as an apparent result of their appearances on screen.Guy Adams, “Lessons From America on the Dangers of Reality Television,” Independent (London), June 6, 2009, http://www.independent.co.uk/news/world/americas/lessons-from-america-on-the-dangers-of-reality-television-1698165.html; Frank Feldlinger, “TheWrap Investigates: 11 Players Have Committed Suicide,” TheWrap, http://www.thewrap.com/television/article/thewrap-investigates-11-players-have-committed-suicide-3409.
When television was in its infancy, producers modeled the new medium on radio. Popular radio shows such as police drama Dragnet and western cowboy series Gunsmoke were adapted for television, and new television shows were sponsored by single advertisers, just as radio shows had been. Television was dominated by three major networks—NBC, ABC, and CBS—and these networks accounted for more than 95 percent of all prime-time viewing until the late 1970s. Today, the television industry is far more complex. Programs are sponsored by multiple advertisers; programming is controlled by major media conglomerates; and the three major networks no longer dominate the airwaves but instead share their viewers with numerous cable channels. Several factors account for these trends within the industry, including technological developments, government regulations, and the creation of new networks.
Early television programs were often developed, produced, and supported by a single sponsor, which sometimes reaped the benefits of having its name inserted into the program’s title—Colgate Comedy Hour, Camel Newsreel, Goodyear TV Playhouse. However, as production costs soared during the 1950s (a single one-hour television show cost a sponsor about $35,000 in 1952 compared with $90,000 at the end of the decade), sponsors became increasingly unable to bear the financial burden of promoting a show single-handedly. This suited the broadcast networks, which disliked the influence sponsors exerted over program content. Television executives, in particular NBC’s Sylvester L. “Pat” Weaver, advocated the magazine concept, in which advertisers purchased one- or two-minute blocks rather than the entire program, just as magazines contained multiple advertisements from different sponsors. The presence of multiple sponsors meant that no one advertiser controlled the entire program.
Figure 9.13

Many sponsors believed that if viewers identified their favorite shows, such as the Colgate Comedy Hour, with a sponsor, they would be more likely to purchase the product being advertised.
Although advertising agencies relinquished control of production to the networks, they retained some influence over the content of the programs they sponsored. As one executive commented, “If my client sells peanut butter and the script calls for a guy to be poisoned eating a peanut butter sandwich, you can bet we’re going to switch that poison to a martini.”Horace Newcomb, ed., Encyclopedia of Television (New York: Fitzroy Dearborn, 2004), 2170. Sponsors continue to influence program content indirectly by financially backing shows they approve of and pulling funding from those they do not. For example, in 1995, consumer goods giant Procter & Gamble, the largest television advertiser, announced it would no longer sponsor salacious daytime talk shows. The company provided producers with details about its guidelines, pulling out of shows it deemed offensive and supporting shows that dealt with controversial subject matter responsibly. Communications heavyweight AT&T took a similar path, reviewing shows after they were taped but before they aired in order to make decisions about corporate sponsorship on an individual basis.Advertising Age, “Speak Up About Talk Shows,” November 27, 1995, http://adage.com/article?article_id=84233. In 2009, advertisers used their financial might to take a stand against Fox News host Glenn Beck, who offended viewers and sponsors alike with his incendiary comments that President Obama was a “racist” and had a “deep-seated hatred for white people.” Sponsors of the Glenn Beck television talk show began to remove advertising spots from the program in protest of Beck’s comments. A spokeswoman for Progressive car insurance said, “We place advertising on a variety of programming with the goal of reaching a broad range of insurance consumers who might be interested in our products.
We also seek to avoid advertising on programming that our customers or potential customers may find extremely offensive.”William Spain, “Advertisers Deserting Fox News’ Glenn Beck,” MarketWatch, August 14, 2009, http://www.marketwatch.com/story/advertisers-deserting-fox-news-glenn-beck-2009-08-14. Advertisers have similarly pulled ads from other shows, including NBC’s long-running sketch comedy show Saturday Night Live, BET’s Hot Ghetto Mess, and ABC’s sitcom Ellen.
Corporate sponsorship does not just affect network television. Even public television has become subject to the influence of advertising. Established in 1969, the Public Broadcasting Service (PBS) developed out of a report by the Carnegie Commission on Educational Television, which examined the role of educational, noncommercial television in society. The report recommended that the government finance public television in order to provide diversity of programming during the network era—a service created “not to sell products” but to “enhance citizenship and public service.”Michael P. McCauley, Public Broadcasting and the Public Interest (Armonk, NY: M.E. Sharpe, 2003), 239. Public television was also intended to provide universal access to television for viewers in rural areas or viewers who could not afford to pay for private television services. PBS focused on educational program content, targeting viewers who were less appealing to the commercial networks and advertisers, such as the over-50 age demographic and children under 12.
The original Carnegie Commission report recommended that Congress create a federal trust fund based on a manufacturer’s excise tax on the sale of television sets to finance public television. Following intense lobbying by the National Association of Broadcasters, the proposal was removed from the legislation that established the service. As a result, public television subsists on viewer contributions and federal funding, and the latter has been drastically reduced in recent years. Although a 2007 proposal by President George W. Bush to eliminate more than half of the federal allocation to public broadcasting ($420 million out of $820 million) was overturned, PBS has become increasingly dependent on corporate sponsorship to stay afloat. By 2006, corporate sponsors funded more than 25 percent of all public television. Sponsorship has saved many programs that would otherwise have been lost, but critics have bemoaned the creeping commercialism of public television. When PBS began selling banner advertisements on its website in 2006, Gary Ruskin, executive director of consumer group Commercial Alert, commented, “It’s just one more intrusion of the commercial ethos into an organization that was supposed to be firmly noncommercial. The line between them and the commercial networks is getting fuzzier and fuzzier.”Matea Gold, “Marketing Tie-ins Finding Their Way to PBS Sponsors,” Baltimore Sun, October 23, 2006, http://articles.baltimoresun.com/2006-10-23/features/0610230151_1_pbs-corporate-underwriters-public-television. Despite such criticisms, the drop in federal funding has forced public television executives to seek more creative ways of obtaining financial backing—for example, through online banner ads.
In 2009, PBS shortened the length of time companies were required to sponsor some programs in an effort to encourage advertisers.Brian Stelter, “PBS to Shorten Time Commitments for Sponsorships,” New York Times, May 7, 2009, http://www.nytimes.com/2009/05/08/business/media/08adco.html. As of 2010, the future of PBS remained uncertain. With better-funded cable channels offering niche-interest shows that were traditionally public television’s domain (BBC nature series Planet Earth was shown on the Discovery Channel, while historical dramas John Adams and The Tudors are shown on premium cable channels HBO and Showtime), PBS is left to rely on shows that have been around for decades, such as Nova and Nature, to attract audiences.Charles McGrath, “Is PBS Still Necessary?” New York Times, February 17, 2008, http://www.nytimes.com/2008/02/17/arts/television/17mcgr.html. Only time will tell how PBS fares in the face of competition.
The period between 1950 and 1970 is historically recognized as the network era. Aside from a small portion of airtime controlled by public television, the three major networks (known as the Big Three) dominated the television industry, collectively accounting for more than 95 percent of prime-time viewing. In 1986, Rupert Murdoch, the head of multinational company News Corp, launched the Fox network, challenging the dominance of the Big Three. In its infancy, Fox was at best a minor irritation to the other networks. With fewer than 100 affiliated stations (local television stations that carry some or all of a particular broadcast network’s programming) compared with more than 200 affiliates for each of the other networks, reaching just 80 percent of the nation’s households (compared with the Big Three’s 97 percent coverage rate), and broadcasting just one show (The Late Show Starring Joan Rivers), Fox was barely a consideration in the ratings war. During the early 1990s, these dynamics began to change. Targeting young viewers and black audiences with shows such as Beverly Hills 90210, Melrose Place, In Living Color, and The Simpsons, Fox began to establish itself as an edgy, youth-oriented network. Luring affiliates away from other networks to increase its viewership, Fox also extended its programming schedule beyond the initial 2-night-a-week broadcasts. By the time the fledgling network acquired the rights to four years of National Football League (NFL) games in a $1.58 billion deal in 1994, Fox was a worthy rival to the other three broadcast networks. Its success turned the Big Three into the Big Four. In the 1994–1995 television season, 43 percent of U.S. households were watching the Big Four at any given moment during prime time.James Poniewozik, “Here’s to the Death of Broadcast,” Time, March 26, 2009, http://www.time.com/time/magazine/article/0,9171,1887840,00.html.
Fox’s success prompted the launch of several smaller networks in the mid-1990s. UPN (owned by Paramount, recently acquired by Viacom) and WB (owned by media giant Time Warner) both debuted in January 1995. Using strategies similar to Fox’s, the networks initially began broadcasting programs 2 nights a week, expanding to a 6-day schedule by 2000. Targeting young and minority audiences with shows such as Buffy the Vampire Slayer, Moesha, Dawson’s Creek, and The Wayans Bros., the new networks hoped to draw stations away from their old network affiliations. However, rather than repeating the success of Fox, UPN and WB struggled to make an impact. Unable to attract many affiliate stations, the two fledgling networks reached fewer households than their larger rivals because they were unobtainable in some smaller cities. High start-up costs, relatively low audience ratings, and increasing production expenses spelled the end of the “netlets,” a term coined by Variety magazine for minor-league networks that lacked a full week’s worth of programming. After losing $1 billion each, parent companies CBS (having split from Viacom) and Time Warner agreed to merge UPN and WB, resulting in the creation of the CW network in 2006. Targeting the desirable 18–34 age group, the network retained the most popular shows from before the merger—America’s Next Top Model and Veronica Mars from UPN and Beauty and the Geek and Smallville from WB—and launched new shows such as Gossip Girl and The Vampire Diaries. Despite its cofounders’ claims that the CW would be the “fifth great broadcast network,” the collaboration got off to a shaky start.
After the CW was frequently outperformed by Spanish-language television network Univision in 2008 and saw declining ratings among its target audience, critics began to question the network’s future.Melissa Grego, “How The CW Stays Undead,” Broadcasting and Cable, February 1, 2010, http://www.broadcastingcable.com/article/446733-How_The_CW_Stays_Undead.php. However, the relative success of shows such as Gossip Girl and 90210 in 2009 gave the network a foothold on its intended demographic, quashing rumors that co-owners CBS Corporation and Warner Bros. might disband the network. Warner Bros. Television Group President Bruce Rosenblum said, “I think the built-in assumption and the expectation is that the CW is here to stay.”Scott Collins, “With Ratings Comeback, has CW Finally Turned the Corner?” Los Angeles Times, April 7, 2009, http://latimesblogs.latimes.com/showtracker/2009/04/last-week-the-cw-scored-its-best-ratings-in-nearly-five-months-ordinarily-this-might-not-sound-like-huge-news-but-cw-is-a.html.
Figure 9.14

Despite launching several new shows geared toward its target demographic, the CW remains fifth in the network rankings.
A far greater challenge to network television than the emergence of smaller competitors was the increasing dominance of cable television. Between 1994 and 2009, the percentage of U.S. households watching the Big Four networks during prime time plummeted from 43 percent to 27 percent.James Poniewozik, “Here’s to the Death of Broadcast,” Time, March 26, 2009, http://www.time.com/time/magazine/article/0,9171,1887840,00.html. Two key factors influenced the rapid growth of cable television networks: industry deregulation (the removal of government regulations, which during the 1980s enabled cable’s extensive growth over the next two decades) and the use of satellites to distribute local television stations around the country.
During the 1970s, the growth of cable television was restricted by FCC regulations, which protected broadcasters by establishing franchising standards and enforcing anti-siphoning rules that prevented cable from taking sports and movie programming away from the networks. However, during the late 1970s, a court ruled that the FCC had exceeded its authority, and the anti-siphoning rules were repealed. This decision paved the way for the development of cable movie channels, contributing to the exponential growth of cable in the 1980s and 1990s. Further deregulation of cable in the 1984 Cable Communications Policy Act removed restrictions on cable rates, enabling operators to charge what they wanted for cable services as long as there was effective competition to the service (a standard that over 90 percent of all cable markets could meet). Other deregulatory policies during the 1980s included the eradication of public-service requirements and the elimination of regulated amounts of advertising in children’s programming, expanding the scope of cable channel stations. Deregulation was intended to encourage competition within the industry but instead enabled local cable companies to establish monopolies all over the country. In 1989, U.S. Senator Al Gore of Tennessee commented, “Precipitous rate hikes of 100 percent or more in one year have not been unusual since cable was given total freedom to charge whatever the market will bear…. Since cable was deregulated, we have also witnessed an extraordinary concentration of control and integration by cable operators and program services, manifesting itself in blatantly anticompetitive behavior toward those who would compete with existing cable operators for the right to distribute services.”Adam M. Zaretsky, “The Cable TV Industry and Regulation,” Regional Economist, July 1995, http://research.stlouisfed.org/publications/regional/95/07/CableTV.pdf. 
The FCC reintroduced regulations for basic cable rates in 1992, by which time more than 56 million households (over 60 percent of the households with televisions) subscribed to a cable service.
The growth of cable television was also assisted by a national satellite distribution system, pioneered by Time Inc., founder of cable network company HBO. In 1975, the corporation used satellite transmission to beam the “Thrilla in Manila”—the historic heavyweight boxing match between Muhammad Ali and Joe Frazier—into people’s homes. Shortly afterward, entrepreneur Ted Turner, owner of independent Atlanta-based station WTBS, uplinked his station’s signal onto the same satellite as HBO, enabling cable operators to downlink the station on one of their channels. Initially provided free to subscribers to encourage interest, the station offered television reruns, wrestling, and live sports from Atlanta. Having created the first “superstation,” Turner expanded his realm by founding 24-hour news network CNN in 1980. At the end of the year, 28 national programming services were available, and the cable revolution had begun. Over the next decade, the industry underwent a period of rapid growth and popularity, and by 1994 viewers could choose from 94 basic and 20 premium cable services.
Figure 9.15

The 1975 “Thrilla in Manila” was one of the first offerings by HBO.
Because the proliferation of cable channels provided viewers with so many choices, broadcasters began to move away from mass-oriented programming in favor of more targeted shows. Whereas the broadcast networks sought to obtain the widest audience possible by avoiding programs that might only appeal to a small minority of viewers, cable channels sought out niche audiences within specific demographic groups—a process known as narrowcasting. In much the same way that specialist magazines target readers interested in a particular sport or hobby, cable channels emphasize one topic, or group of related topics, that appeal to specific viewers (often those who have been neglected by broadcast television). People interested in current affairs can tune into CNN, MSNBC, Fox News, or any number of other news channels, while those interested in sports can switch on ESPN or TSN (The Sports Network). Other channels focus on music, shopping, comedy, science fiction, or programs aimed at specific cultural or gender groups. Narrowcasting has proved beneficial for advertisers and marketers, who no longer need to time their communications based on the groups of people who are most likely to watch television at certain times of the day. Instead, they concentrate their approach on subscription channels that appeal directly to their target consumers.
The popularity of cable television has forced the Big Four networks to rethink their approach to programming over the past three decades. Because of the narrowcasting mode of distribution and exhibition, cable television has offered more explicit sexual and violent content than broadcast television does. To compete for cable channels’ viewing audience, broadcast networks have loosened restrictions on graphic material and now frequently feature partial nudity, violence, and coarse language. This has increased viewership of mildly controversial shows such as CSI, NCIS, Grey’s Anatomy, and Private Practice, while opening the networks to attacks from conservative advocacy groups that object to extreme content.
The broadcast networks are increasingly adopting narrowcasting as a programming strategy. Newer networks, such as the CW, deliberately target the 18–34 age group (women in particular). Since its inception, the CW has replaced urban comedies such as Everybody Hates Chris with female-oriented series such as Gossip Girl and The Vampire Diaries. Older networks group similar programs that appeal to specific groups in adjacent time slots to retain viewers for as long as possible. For example, ABC sitcoms Modern Family and Cougar Town run back to back, while Fox follows reality police series Cops with crime-fighting show America’s Most Wanted.
Despite responding to challenges from cable, the broadcast networks’ share of the total audience has declined each year. Between 2000 and 2009, the networks saw their numbers drop by around 8 million viewers.Robert Bianco, “The Decade in Television: Cable, the Internet Become Players,” USA Today, December 29, 2009, http://www.usatoday.com/life/television/news/2009-12-28-decadeTV28_CV_N.htm.
Figure 9.16

Increased competition from cable channels has caused a steady decline in the networks’ audience ratings.
Please respond to the following short-answer writing prompts. Each response should be a minimum of one paragraph.
Choose one of the Big Four networks and print out its weekly programming schedule. Watch the network’s prime-time programs over the course of a week, noting the target demographic for each show. Observe the advertising sponsors that support each show and compare how the products and services fit with the intended audience.
The experience of watching television is rapidly changing with the progression of technology. No longer restricted to a limited number of channels on network television, or even to a television schedule, viewers are now able to watch exactly what they want to watch, when they want to watch it. Non-television delivery systems such as the Internet, which enables viewers to download traditional television shows onto a computer, laptop, iPod, or smartphone, are changing the way people watch television. Meanwhile, cable and satellite providers are enabling viewers to purchase television shows to watch at their convenience through the use of video-on-demand services, changing the concept of prime-time viewing. Digital video recording (DVR) systems such as TiVo, which enable users to record particular shows onto the system’s computer memory, are having a similar effect.
Although television audiences are becoming increasingly fragmented, they are also growing because of the convenience and availability of new technology. In 2009, Nielsen’s Three Screen Report, which encompassed television, cell phone, and computer usage, reported that the average viewer watched more than 151 hours of television per month, up 3.6 percent from the previous year.Alana Semuels, “Television Viewing at All-Time High,” Los Angeles Times, February 24, 2009, http://articles.latimes.com/2009/feb/24/business/fi-tvwatching24. Viewers might not all be sitting together in the family room watching prime-time shows on network television between 7 and 11 p.m., but they are watching.
The origins of satellite television (a delivery system in which television signals are transmitted via communications satellite and received by a satellite dish and set-top box) can be traced to the space race of the 1950s, when the United States and the Soviet Union were competing to put the first satellite into space. Soviet scientists accomplished the goal first with the launch of Sputnik in 1957, galvanizing Americans (who were fearful of falling behind in space technology during the Cold War era) into intensifying their efforts and resulting in the creation of the National Aeronautics and Space Administration (NASA) in 1958. AT&T launched Telstar, the first active communications satellite, on July 10, 1962, and the first transatlantic television signal—a black-and-white image of a U.S. flag waving in front of the Andover Earth Station in western Maine—was transmitted that same day. However, the television industry did not utilize satellites for broadcasting purposes until the late 1970s, when PBS introduced its Public Television Satellite Service. Satellite communication technology caught on and was used as a distribution method between 1978 and 1984 by pioneering cable channels such as HBO, TBS (Turner Broadcasting System), and CBN (Christian Broadcasting Network, later the Family Channel).
The trouble with early satellite television systems was that once people purchased a satellite system, they had free access to every basic and premium cable service that was broadcasting via satellite signals. The FCC had an “open skies” policy, under which users had as much right to receive signals as broadcasters had the right to transmit them. Initially, the satellite receiver systems were prohibitively expensive for most families, costing more than $10,000. However, as the price of a satellite dish dropped toward the $3,000 mark in the mid-1980s, consumers began to view satellite television as a cheaper, higher-quality alternative to cable. Following the initial purchase of a dish system, the actual programming—consisting of more than 100 cable channels—was free. Cable broadcasters lobbied the government for legal assistance and, under the 1984 Cable Act, were allowed to encrypt their satellite feeds so that only people who purchased a decoder from a satellite provider could receive the channel.
Following the passage of the Cable Act, the satellite industry took a dramatic hit. Sales of the popular direct-to-home (DTH) systems—in which subscribers received television signals directly from geostationary satellites, and which were precursors to the smaller, more powerful direct broadcast satellite systems introduced in the 1990s—slumped from 735,000 units in 1985 to 225,000 units a year later, and around 60 percent of satellite retailers went out of business. The industry's sudden drop in popularity was exacerbated by large-scale anti-dish advertising campaigns by cable operators, which depicted satellite dishes as unsightly. Although sales picked up in the late 1980s with the introduction of integrated receiving and decoding units and the arrival of program packages, which saved consumers the time and effort of signing up for individual programming services, the growth of the satellite industry was stunted by piracy—the theft of satellite signals. Of the 1.9 million units manufactured between 1986 and 1990, fewer than 500,000 were receiving signals legally.Harry W. Thibedeau, “DTH Satellite TV: Timelines to the Future,” Satellite Broadcasting & Communications Association, 2000, http://satelliteretailers.com/dish_installation.html. The problem was ultimately solved by the actions of the Satellite Broadcasting and Communications Association (SBCA), a trade association created in 1986 by the merger of two trade organizations: the Society of Private and Commercial Earth Stations (SPACE) and the Direct Broadcast Satellite Association (DBSA). SPACE was composed of manufacturers, distributors, and retailers of direct-to-home systems, and DBSA represented companies interested in direct broadcast satellite systems.
The SBCA set up an antipiracy task force, aggressively pursuing illegal hackers with the FBI’s help.
Once the piracy problem was under control, the satellite industry could move forward. In 1994, four major cable companies launched a first-generation direct broadcast satellite (DBS) system called PrimeStar. The system, a small-dish satellite-delivered program service specifically intended for home reception, was the first of its kind to successfully enter the U.S. market. Within a year, PrimeStar was beaming 67 channels into 70,000 homes for a monthly fee of $25 to $35 (in addition to a hardware installation fee of $100 to $200). By 1996, competing companies DirecTV and the EchoStar Dish Network had entered the industry, and Dish Network’s cheaper prices were forcing its competitors to drop their fees. DirecTV acquired PrimeStar’s assets in 1999 for around $1.82 billion, absorbing its rival’s 2.3 million subscribers.Sandeep Junnarker, “DirecTV to Buy Rival PrimeStar’s Assets,” CNET, January 22, 1999, http://news.cnet.com/DirecTV-to-buy-rival-Primestars-assets/2100-1033_3-220509.html.
Figure 9.17

Subscribers of DBS receive signals from geostationary satellites that are broadcast in digital format at microwave frequency and intercepted by a satellite dish. A converter next to the television produces output that can be viewed on the television receiver.
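The geostationary satellites described above orbit roughly 35,786 km over the equator, so every DBS signal travels a long path before reaching the dish. As a rough illustration of what that means for signal delay (not from the text; the altitude and speed-of-light figures are standard physical values, and the straight-down path is a simplifying assumption, since real slant paths are somewhat longer):

```python
# Back-of-envelope: signal delay from a geostationary satellite.
# Assumes the shortest possible path (dish directly below the satellite).
GEO_ALTITUDE_KM = 35_786          # standard geostationary orbit altitude
SPEED_OF_LIGHT_KM_S = 299_792.458 # radio waves travel at the speed of light

one_way_s = GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S
round_trip_s = 2 * one_way_s      # broadcaster uplink plus downlink to the dish

print(f"one-way delay:    {one_way_s * 1000:.0f} ms")
print(f"round-trip delay: {round_trip_s * 1000:.0f} ms")
```

The round trip of well over 200 milliseconds is imperceptible for one-way broadcasting, which is part of why satellites suited television distribution so well.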
As of 2010, the two biggest players in the satellite television industry are DirecTV and Dish Network. Assisted by the passage of the Satellite Home Viewer Improvement Act in 1999, which enabled satellite providers to carry local television stations (putting them on equal footing with cable television), both companies have grown rapidly over the past decade. In the first quarter of 2010, DirecTV boasted 18.6 million subscribers, placing it ahead of its rival, Dish Network, which reported a total of 14.3 million subscribers.Franklin Paul, “Dish Network Subscriber Gain, Profit Beat Street,” Reuters, May 10, 2010, http://www.reuters.com/article/idUSTRE6492MW20100510. Dish courts customers who have been hit by the economic downturn, aggressively cutting its prices and emphasizing its low rates. Conversely, DirecTV targets affluent consumers, emphasizing quality and choice in its advertising campaigns and investing in advanced services and products such as multiroom viewing (which enables a subscriber to watch a show in one room, pause it, and continue watching it in another room) to differentiate itself from rival satellite and cable companies.
Since the 1999 legislation put satellite television in direct competition with cable, the major satellite companies have increasingly pitted themselves against cable broadcasters, offering consumers numerous incentives to switch providers. One of these incentives is the addition of premium networks for satellite subscribers, in the same vein as premium cable channel HBO. In 2005, DirecTV expanded its 101 Network channel to include original shows, and in 2007 it became the first satellite station to air first-run episodes of a broadcast television series, the NBC daytime soap opera Passions. The station aired first-run episodes of the football drama series Friday Night Lights in 2008 and set its sights on the male over-35 demographic by obtaining syndication rights to the popular HBO series Oz and Deadwood a year later. Commenting on the satellite company’s programming plans, DirecTV’s executive vice president of entertainment, Eric Shanks, said, “We’d like to become a pre-cable window for these premium channels.”Bill Carter, “DirecTV Raises Its Sights for a Channel,” New York Times, January 23, 2009, http://www.nytimes.com/2009/01/24/business/media/24direct.html. In other words, the company hopes to purchase HBO shows such as Sex and the City before HBO sells the series to basic-cable channels like TBS.
In another overt bid to lure cable customers over to satellite television, both DirecTV and Dish Network offer a number of comprehensive movie and sports packages, benefiting from their additional channel capacity (satellite television providers typically offer around 350 channels, compared with 180 channels on cable) and their ability to receive international channels often unavailable on cable. In the mid-2000s, the satellite companies also began encroaching on cable television’s domination of bundled packages by offering all-in-one phone, Internet, and television services. Despite being ideally suited to offering such packages through their single telecommunications pipe into the house, cable companies such as Comcast, Cox, and Time Warner had developed a reputation for offering poor service at extortionate prices. In the first three quarters of 2004, the eight largest cable providers (with the exception of bankrupt Adelphia) lost 552,000 basic-cable subscribers. Between 2000 and 2004, cable’s share of the television market fell from 66 percent to 62 percent, while the percentage of U.S. households with satellite television increased from 12 percent to 19 percent.Ken Belson, “Cable’s Rivals Lure Customers With Packages,” New York Times, November 22, 2004, http://www.nytimes.com/2004/11/22/technology/22satellite.html. Despite reports that cash-strapped consumers are switching off pay-TV services to save money during strained economic times, satellite industry revenues have risen steadily over the past decade.
Over the past two decades, the viewing public has become increasingly fragmented as a result of growing competition between cable and satellite channels and traditional network television stations. Now, television audiences are being presented with even more options. Digital video recorders (DVRs) like TiVo allow viewers to select and record shows they can watch at a later time. For example, viewers can set their DVRs to record all new (or old) episodes of the show Deadliest Catch and then watch the recorded episodes whenever they have free time.
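The series-recording feature described above boils down to matching each upcoming broadcast against a viewer's list of subscribed shows. A minimal sketch of that logic (the class and method names here are hypothetical, not any real DVR's API):

```python
# Toy model of a DVR "season pass": record every airing of a subscribed show.
from dataclasses import dataclass, field

@dataclass
class DVR:
    season_passes: set = field(default_factory=set)
    recordings: list = field(default_factory=list)

    def add_season_pass(self, show: str) -> None:
        """Subscribe to all future episodes of a show."""
        self.season_passes.add(show)

    def on_broadcast(self, show: str, episode: str) -> None:
        """Called when a program airs; record it only if subscribed."""
        if show in self.season_passes:
            self.recordings.append((show, episode))

dvr = DVR()
dvr.add_season_pass("Deadliest Catch")
dvr.on_broadcast("Deadliest Catch", "S05E01")  # recorded
dvr.on_broadcast("Evening News", "2010-05-03") # ignored, no season pass
print(dvr.recordings)
```

Real DVRs layer scheduling, storage limits, and new-episode-only filters on top of this core subscribe-and-match idea.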
DVRs can be used by advertisers to track which shows are being viewed. DVRs are even capable of targeting viewers with specific ads when they decide to watch their recorded program. In 2008, consumer groups battled with cable companies and lawmakers to protect the privacy of viewers who did not wish to be tracked this way, causing Nielsen to make tracking optional.
Non-television delivery systems such as the Internet allow viewers to download their favorite shows at any time, on several different media. The Internet has typically been bad news for traditional forms of media; newspapers, magazines, the music industry, video rental companies, and bookstores have all suffered from the introduction of the Internet. However, unlike other media, television has so far survived the Internet’s effects. Television remains the dominant source of entertainment for most Americans, who are using new media in conjunction with traditional television viewing, watching vast quantities of television in addition to streaming numerous YouTube videos and catching up on missed episodes via the networks’ web pages. In the third quarter of 2008, the average American watched 142 hours of television per month, an increase of five hours per month from the same quarter the previous year. Internet use averaged 27 hours per month, an increase of an hour and a half between 2007 and 2008.Randall Stross, “Why Television Still Shines in a World of Screens,” New York Times, February 7, 2009, http://www.nytimes.com/2009/02/08/business/media/08digi.html.
Of the many recent Internet phenomena, few have made as big an impact as the video-sharing website YouTube. Created by three PayPal engineers in 2005, the site enables users to upload personal videos, television clips, music videos, and snippets of movies that can be watched by other users worldwide. Although it initially drew unfavorable comparisons with the original music-sharing site Napster, which was buried under an avalanche of copyright infringement lawsuits, YouTube managed to survive the controversy by forming agreements with media corporations, such as NBC Universal Television, to legally broadcast video clips from shows such as The Office. In 2006, the company, which showed more than 100 million video clips per day, was purchased by Google for $1.65 billion.Associated Press, “Google Buys YouTube for $1.65 Billion,” MSNBC, October 10, 2006, http://www.msnbc.msn.com/id/15196982/ns/business-us_business/. Correctly predicting that the site was the “next step in the evolution of the Internet,” Google CEO Eric Schmidt has watched YouTube’s popularity explode since the takeover. As of 2010, YouTube shows more than 2 billion clips per day and allows people to upload 24 hours of video every single minute.YouTube, “YouTube Fact Sheet,” http://www.youtube.com/t/fact_sheet. To secure its place as the go-to entertainment website, YouTube is expanding its boundaries by developing a movie rental service and showing live music concerts and sporting events in real time.
In January 2010, Google signed a deal with the Indian Premier League, making 60 league cricket matches available on YouTube’s IPL channel and attracting 50 million viewers worldwide.Heather Timmons, “Google Sees a New Role for YouTube: An Outlet for Live Sports,” New York Times, May 2, 2010, http://www.nytimes.com/2010/05/03/business/media/03cricket.html.
Figure 9.18

Agreements between YouTube and media corporations allow viewers to watch clips of their favorite shows on YouTube for free.
While YouTube remains focused on user-generated material, viewers looking for commercial videos of movies and television shows are increasingly turning to Hulu. Established in 2007 following a deal between NBC Universal, News Corporation, and a number of leading Internet companies (including Yahoo!, AOL, MSN, and MySpace), the site gives users access to an entire library of video clips without charge and syndicates its material to partner distribution sites such as MySpace and Facebook. The videos include full episodes of current hit shows such as House, Saturday Night Live, and The Simpsons, as well as older hits from the studios’ television libraries. Supported through advertising, the venture, which is only available to viewers in the United States, became the premier video broadcast site on the web within two years. In July 2009, the site received more than 38 million viewers and delivered more videos than any site except YouTube.Chuck Salter, “Can Hulu Save Traditional TV?” Fast Company, November 1, 2009, http://www.fastcompany.com/magazine/140/the-unlikely-mogul.html. Over the entire year, Hulu generated an estimated $120 million in revenue and increased its advertiser base to 250 sponsors.Chuck Salter, “Can Hulu Save Traditional TV?” Fast Company, November 1, 2009, http://www.fastcompany.com/magazine/140/the-unlikely-mogul.html. Its advertising model appeals to viewers, who need only watch two minutes of advertising per 22 minutes of programming, compared with eight minutes on television. Limiting sponsorship to one advertiser per show has helped make recall rates twice as high as those for the same advertisements on television, benefiting the sponsors as well as the viewers.
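The ad-load figures quoted above can be put on a comparable footing with a little arithmetic (assuming both numbers mean minutes of ads accompanying 22 minutes of programming, which is an assumption about how the source counts them):

```python
# Rough ad-load comparison from the figures quoted above.
# Assumption: each value is minutes of ads per 22 minutes of programming.
hulu_ad_min, tv_ad_min, program_min = 2, 8, 22

# Ads as a share of total viewing time (ads + programming).
hulu_share = hulu_ad_min / (hulu_ad_min + program_min)
tv_share = tv_ad_min / (tv_ad_min + program_min)

print(f"Hulu: {hulu_share:.0%} of viewing time is ads")
print(f"TV:   {tv_share:.0%} of viewing time is ads")
```

Under that assumption, ads fill roughly a twelfth of a Hulu viewing session versus more than a quarter of a broadcast half hour, which helps explain the higher recall rates the text mentions.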
Some critics and television executives claim that the Hulu model has been too successful for its own good, threatening the financial underpinnings of cable television by reducing DVD sales and bypassing carriage fees (the fees cable operators pay broadcasters for the right to carry their channels); in 2009, Fox pulled most of the episodes of It’s Always Sunny in Philadelphia from Hulu’s site. At the networks’ request, Hulu also shut off access to its programming from Boxee, a fledgling service that enabled viewers to stream online video to their television sets. “We have to find ways to advance the business rather than cannibalize it,” stated the distribution chief at TNT, a network that refused to stream episodes of shows such as The Closer on Hulu’s site.Frank Rose, “Hulu, a Victim of Its Own Success?” Wired, May 12, 2009, http://www.wired.com/epicenter/2009/05/hulu-victim-success/. However, many television executives realize that if they do not cannibalize their own material, others will. When a viral video of the Saturday Night Live short “Lazy Sunday” hit the web in 2005, generating millions of hits on YouTube, NBC did not earn a dime. Broadcast networks—the Big Four and the CW—have also begun streaming shows for free in an effort to stop viewers from watching episodes on other websites.
Hulu executives are considering introducing paid content on the site to supplement advertising revenue, a blow to consumers that would likely be softened by perks such as early access to content, ad-free shows, and more comprehensive archives.
Originally introduced in the early 1990s, the concept of video on demand (VOD)—a pay-per-view system that allows viewers to order or download a film via television or the Internet and watch it at their convenience—was not immediately successful because of the prohibitive cost of ordering a movie compared to buying or renting it from a store. Another early complaint about the service was that studios withheld movies until long after they were available on DVD, by which time most people who wanted to view the film had already seen it. Both of these disadvantages have since been remedied: movies are now released on VOD at the same time as on DVD, at competitive rental prices. Currently, most cable and satellite television providers offer some form of on-demand service, either VOD, which provides movies 24 hours a day and gives viewers all the functionality of a DVD player (such as the ability to pause, rewind, or fast-forward films), or near video on demand (NVOD), which broadcasts multiple copies of a film or program over short time intervals but does not allow viewers to control the video by pausing or rewinding it.
As an alternative to cable or satellite VOD, viewers can also readily obtain movies and television shows over the Internet, via free services such as YouTube and Hulu or through paid subscriptions to sites that stream movies to a computer. Online DVD rental service Netflix started giving subscribers instant access to its catalog of older television programs and films in 2007, while Internet giant Amazon.com set up a rival service resembling the pay-per-view model in 2008. Viewers can also stream free episodes of their favorite shows via cable and broadcast networks’ websites. With the increasing popularity of smartphones—cell phones that provide services similar to those of a personal computer, most with built-in applications and Internet access—viewers are using VOD as a way of watching television while they are out of the house. Having discovered that consumers are willing to watch entire television episodes or even films on their smartphones, industry executives are looking for ways to capitalize on smartphone technology. In 2010, News Corporation’s Fox Mobile Group was planning to launch Bitbop, a service that would stream television episodes to smartphones for $9.99 a month. Discussing the project, Bitbop architect Joe Bilman said that “the marriage of on-demand content and mobility has the power to light a fire in the smartphone space.”Brian Stelter, “Audiences, and Hollywood, Flock to Smartphones,” New York Times, May 2, 2010, http://www.nytimes.com/2010/05/03/business/media/03mobile.html. The shift from traditional television viewing to online viewing is making a small but noticeable dent in the $84 billion cable and satellite industry. Between the beginning of 2008 and the end of 2009, an estimated 800,000 U.S. households cut the cable cord in favor of web viewing.Erick Schonfeld, “Estimate: 800,000 U.S. Households Abandoned Their TVs for the Web,” TechCrunch, April 13, 2010, http://techcrunch.com/2010/04/13/800000-households-abandoned-tvs-web/.
Moving a step beyond VOD, cable and satellite television providers are combining aspects of traditional television viewing with online content to create an entirely new way of watching shows: interactive television (iTV), programming that enables viewers to engage with and control aspects of their viewing experience—for example, by playing a quiz using a remote control. Using an additional set-top box and their remote control, viewers can utilize several features that go beyond simply watching a television show. For example, interactive television enables users to take part in quiz shows, vote for a favorite contestant on a game show, view highlights or look up statistics during sports matches, create a music playlist or photo slideshow, and view local information such as weather and traffic through a ticker (an electronic banner, usually carrying up-to-date news or weather information, that scrolls across the bottom of the screen) under the current television program. Software such as Microsoft’s UltimateTV, released in 2001, even brought interactivity to individual television shows. For example, a viewer watching the CBS crime series CSI can click on the interactive icon in the corner of the screen and obtain instant information about forensic analysis techniques, along with an episode guide, character biographies, and a map of the show’s Las Vegas setting.
Interactive television is beginning to take on the social format of the web, linking viewers with online communities who use communication tools such as Twitter and Skype IM to discuss what they just saw on television in real time. When the popular musical comedy show Glee hit the screens in 2009, marketing experts at Fox pushed for a strong online presence, airing the pilot episode well in advance of the actual season debut and generating buzz on social networking sites such as Twitter and Facebook. Once the show gained widespread popularity, Fox launched an interactive hypertrailer on its website, allowing viewers to click on and “like” the show’s cast members on Facebook. The Glee cast also participates in weekly “tweet-peats,” which feature live Twitter feeds that scroll across the bottom of the screen during reruns of the show, providing behind-the-scenes details and answering fan questions. The CW network uses a similar technique with its “TV to Talk About” campaign, a tagline that changes from ad to ad to include iterations such as “TV to text about,” “blog about,” or “tweet about.” Its website offers forums where viewers can discuss episodes and interact with video extras, photos, and background clips about various shows. Online television forum Television Without Pity provides viewers with an alternative place for discussion that is not affiliated with any one network.
Figure 9.19

A Nielsen report found that during the fourth quarter of 2009, 60 percent of Americans spent up to 3.5 hours every month going online and watching television simultaneously.
Despite the shift toward interactive television, one barrier that manufacturers seem unwilling to cross is the addition of full Internet access to television sets. Although Internet-enabled televisions began trickling into the market in 2008 and 2009, many industry executives remained skeptical of their potential. In February 2009, Sony spokesman Greg Belloni said, “Sony’s stance is that consumers don’t want an Internet-like experience with their TVs, and we’re really not focused on bringing anything other than Internet video or widgets to our sets right now.”Matt Richtel, “What Convergence? TV’s Hesitant March to the Net,” New York Times, February 15, 2009, http://www.nytimes.com/2009/02/16/technology/internet/16chip.html. Although some analysts predict that up to 20 percent of televisions will be Internet-enabled by 2012, consulting firm Deloitte anticipates the continued concurrent use of television sets with laptops, MP3 players, and other browser-enabled devices.Deloitte, “Deloitte Analyses Top Trends for the Media Industry for 2010,” news release, http://www.deloitte.com/view/en_GB/uk/industries/tmt/press-release/37df818581646210VgnVCM100000ba42f00aRCRD.htm.
Review Questions
Please respond to the following short-answer writing prompts. Each response should be a minimum of one paragraph.
Whether online viewing outlets continue to grow in popularity or viewers return to more traditional methods of watching television, broadcasters are likely to remain dependent on advertising sponsors to fund their programming. Advertising sales executives work for a specific network and sell television time to agencies and companies, working within budgets to ensure that clients make effective use of their advertising time.
Read through the U.S. Bureau of Labor Statistics overview of a career in advertising sales. You can find it at: http://www.bls.gov/oco/ocos297.htm.
Then, read BNET’s analysis of the television advertising industry at http://industry.bnet.com/media/10008136/truth-in-network-tv-advertising-and-what-to-do-about-it/. Once you have looked at both sites, use the information to answer these questions:
Figure 10.1
Video games have come a long way from using a simple joystick to guide Pac-Man on his mission to find food and avoid ghosts. This is illustrated by a 2007 Southwest Airlines commercial in which two friends are playing a baseball video game on a Nintendo Wii–like console. The batting friend tells the other to throw his pitch—and he does, excitedly firing his controller into the middle of the plasma television, which then falls off the wall. Both friends stare in shock at the shattered flat screen as the narrator asks, “Want to get away?”
Such a scene is unlikely to have taken place in the early days of video games when Atari reigned supreme and the action of playing solely centered on hand-eye coordination. The learning curve was relatively nonexistent; players maneuvered a joystick to shoot lines of aliens in the sky, or they turned the wheel on a paddle to play a virtual game of table tennis.
But as video games became increasingly popular, they also became increasingly complex. Consoles upgraded and evolved on a regular basis, and the games kept up. Players called each other with loopholes and tips on how to get Mario and Luigi onto the next level, and now they exchange their tricks on gaming blogs. Games like The Legend of Zelda and Final Fantasy created alternate worlds and intricate story lines, providing multiple-hour epic adventures.
Long criticized for taking kids out of the backyard and into a sedentary spot in front of the television, many video games have circled back to their simpler origins and, in doing so, have made players more active. Casual gamers who could quickly figure out how to put together puzzle pieces in Tetris can now just as easily figure out how to “swing” a tennis racket with a Wiimote. Video games are no longer a convenient scapegoat for America’s obesity problems; Wii Fit offers everything from yoga to boxing, and Dance Dance Revolution estimates calories burned while players dance.
The mechanics of video games continue to change, and as they do, gaming has begun to intersect with every other part of culture. Players can learn how to “perform” their favorite songs with Guitar Hero and Rock Band. Product placement akin to what is seen in movies and on television is equally prevalent in video games such as the popular Forza Motorsport and FIFA series. As the Internet allows players across the world to participate simultaneously, video games have the potential to one day look like competitive reality shows.Michael Dolan, “The Video Game Revolution: The Future of Video Gaming,” PBS, http://www.pbs.org/kcts/videogamerevolution/impact/future.html. Arguably, video games even hold a place in the art world, given the increasing complexity of their animation and story lines.Jona Tres Kap, “The Video Game Revolution: But is it Art?” PBS, http://www.pbs.org/kcts/videogamerevolution/impact/art.html.
And now, with endless possibilities for the future, video games are attracting new and different demographics. Avid players who grew up with video games may be the first ones to purchase 3-D televisions for the 3-D games of the future.M. H. Williams, “Study Shows Casual and Core Gamers Are Ready for 3-D Gaming,” Industry Gamers, June 15, 2010, http://www.industrygamers.com/news/study-shows-casual-and-core-gamers-are-ready-for-3d-gaming/. But casual players, perhaps of an older demographic, will be drawn to the simplicity of a game like Wii Bowling. Video games have become more accessible than ever. Social media websites like Facebook offer free video game applications, and smartphone users can download apps for as little as a dollar, literally putting video games in one’s back pocket. Who needs a cumbersome Scrabble board when it’s available on a touch screen anytime, anywhere?
Video games have become ubiquitous in modern culture. Understanding them as a medium allows a fuller understanding of their implications in the realms of entertainment, information, and communication. Studying their history reveals new perspectives on the ways video games have affected mainstream culture.
Pong, the electronic table-tennis simulation game, was the first video game for many people who grew up in the 1970s and is now a famous symbol of early video games. However, the precursors to modern video games were created as early as the 1950s. In 1952 a computer simulation of tic-tac-toe was developed for EDSAC, one of the first stored-information computers, and in 1958 a game called Tennis for Two was developed at Brookhaven National Laboratory as a way to entertain people coming through the laboratory on tours.Simon Egenfeldt-Nielsen, Understanding Video Games: The Essential Introduction (New York: Taylor & Francis, 2008), 50.
Figure 10.2

Tennis for Two was a rudimentary game designed to entertain visitors to the Brookhaven National Laboratory.
These games would generate little interest among the modern game-playing public, but at the time they enthralled their users and introduced the basic elements of the cultural video game experience. In a time before personal computers, these games allowed the general public to access technology that had been restricted to the realm of abstract science. Tennis for Two created an interface where anyone with basic motor skills could use a complex machine. The first video games functioned early on as a form of media by essentially disseminating the experience of computer technology to those who did not have access to it.
As video games evolved, their role as a form of media grew as well. Video games have grown from simple tools that made computing technology understandable to forms of media that can communicate cultural values and human relationships.
The 1970s saw the rise of video games as a cultural phenomenon. A 1972 article in Rolling Stone describes the early days of computer gaming:
Reliably, at any nighttime moment (i.e. non-business hours) in North America hundreds of computer technicians are effectively out of their bodies, locked in life-or-death space combat computer-projected onto cathode ray tube display screens, for hours at a time, ruining their eyes, numbing their fingers in frenzied mashing of control buttons, joyously slaying their friend and wasting their employers’ valuable computer time. Something basic is going on.Stewart Brand, “Space War,” Rolling Stone, December 7, 1972.
This scene was describing Spacewar!, a game developed in the 1960s at the Massachusetts Institute of Technology (MIT) that spread to other college campuses and computing centers. In the early 1970s, very few people owned computers. Most computer users worked or studied at university, business, or government facilities. Those with access to computers were quick to utilize them for gaming purposes.
The first coin-operated arcade game (a video game placed in public establishments) was modeled on Spacewar! It was called Computer Space, and it fared poorly among the general public because of its difficult controls. In 1972, Pong, the table-tennis simulator that has come to symbolize early computer games, was created by Atari, a fledgling company that would go on to lead the home console market in the 1970s and 1980s, and it was immediately successful. Pong was initially placed in bars with pinball machines and other games of chance, but as video games grew in popularity, they were placed in any establishment that would take them. By the end of the 1970s, so many video arcades were being built that some towns passed zoning laws limiting them.Steven Kent, “Super Mario Nation,” American Heritage, September 1997, http://www.americanheritage.com/articles/magazine/ah/1997/5/1997_5_65.shtml.
The end of the 1970s ushered in a new era—what some call the golden age of video games—with the game Space Invaders, an international phenomenon that exceeded all expectations. In Japan, the game was so popular that it caused a national coin shortage. Games like Space Invaders illustrate both the effect of arcade games and their influence on international culture. In two different countries on opposite sides of the globe, Japanese and American teenagers, although they could not speak to one another, were having the same experiences thanks to a video game.
The first video game console (a system designed to be attached to a television to simulate arcade video games) for the home began selling in 1972. It was the Magnavox Odyssey, and it was based on prototypes built by Ralph Baer in the late 1960s. This system included a Pong-type game, and when the arcade version of Pong became popular, the Odyssey began to sell well. Atari, which was making arcade games at the time, decided to produce a home version of Pong and released it in 1974. Although this system could play only one game, its graphics and controls were superior to the Odyssey’s, and it was sold through a major department store, Sears. Because of these advantages, the Atari home version of Pong sold well, and a host of other companies began producing and selling their own versions of Pong.Leonard Herman, “Early Home Video Game Systems,” in The Video Game Explosion: From Pong to PlayStation and Beyond, ed. Mark Wolf (Westport, CT: Greenwood Press, 2008), 54.
A major step forward in the evolution of video games was the development of game cartridges: interchangeable cartridges that stored the games and could be swapped in the console. With this technology, users were no longer limited to a set number of games, leading many console makers to shift their emphasis to producing games. Several companies, such as Magnavox, Coleco, and Fairchild, released cartridge-based consoles, but Atari’s 2600 console had the upper hand because of the company’s work on arcade games. Atari capitalized on its arcade successes by releasing games that were already well known to a public that was frequenting arcades. The popularity of games such as Space Invaders and Pac-Man made the Atari 2600 a successful system. The late 1970s also saw the birth of companies such as Activision, which developed third-party games for the Atari 2600.Mark J. P. Wolf, “Arcade Games of the 1970s,” in The Video Game Explosion (see note 7), 41.
The birth of the home computer market in the 1970s paralleled the emergence of video game consoles. The first computer designed and sold for the home consumer was the Altair. It first went on sale in 1975, several years after video game consoles had begun selling, and it appealed mainly to a hobbyist market. During this period, people such as Steve Jobs, a cofounder of Apple, were building computers by hand and selling them to get their start-up businesses going. In 1977, three important computers—Radio Shack’s TRS-80, the Commodore PET, and the Apple II—were produced and began selling to the home market.Jeremy Reimer, “Total share: 30 years of personal computer market share figures,” Ars Technica (blog), December 14, 2005, http://arstechnica.com/old/content/2005/12/total-share.ars/2.
The rise of personal computers allowed for the development of more complex games. Designers of games such as Mystery House, developed in 1979 for the Apple II, and Rogue, developed in 1980 and played on IBM PCs, used the processing power of early home computers to develop video games that had extended plots and story lines. In these games, players moved through landscapes composed of basic graphics, solving problems and working through an involved narrative. The development of video games for the personal computer platform expanded the ability of video games to act as media by allowing complex stories to be told and new forms of interaction to take place between players.
Atari’s success in the home console market was due in large part to its ownership of already-popular arcade games and the large number of game cartridges available for the system. These strengths, however, eventually proved detrimental to the company and led to what is now known as the video game crash of 1983: the economic collapse of the video game industry, caused by an oversupply of games and waning demand. Atari bet heavily on its past arcade successes by releasing Pac-Man for the Atari 2600. Pac-Man was a successful arcade game that did not translate well to the home console, leading to disappointed consumers and lower-than-expected sales. Additionally, Atari produced 10 million copies of the lackluster Pac-Man port on its first run, even though only an estimated 10 million consoles were in use, a bet that nearly every console owner would buy the game. Similar mistakes were made with a game based on the movie E.T.: The Extra-Terrestrial, which has gained notoriety as one of the worst games in Atari’s history. Despite the success of the movie, the game was not well received, and Atari had again bet heavily on it. Piles of unsold E.T. game cartridges were reportedly buried in the New Mexico desert under a veil of secrecy.Nick Montfort and Ian Bogost, Racing the Beam: The Atari Video Computer System (Cambridge, MA: MIT Press, 2009), 127.
As retail outlets became increasingly wary of home console failures, they began stocking fewer games on their shelves. This caution, combined with an increasing number of companies producing games, led to overproduction and a resulting collapse of the video game market in 1983. Many smaller game developers did not have the capacity to withstand this downturn and went out of business. Although Coleco and Atari were able to survive the crash, neither company regained its former share of the video game market. The market did not pick up again until 1985.Steven Kent, “Super Mario Nation,” American Heritage, September 1997, http://www.americanheritage.com/articles/magazine/ah/1997/5/1997_5_65.shtml.
Nintendo, a Japanese card and novelty producer that had begun to produce electronic games in the 1970s, was responsible for arcade games such as Donkey Kong in the early 1980s. Its first home console, developed in 1984 for sale in Japan, tried to succeed where Atari had failed. The Nintendo system used newer, better microchips, bought in large quantities, to ensure high-quality graphics at a price consumers could afford. Keeping console prices low meant that Nintendo had to rely on games for most of its profits, which required maintaining control of game production. Atari had failed to do this, and the resulting glut of low-priced games had contributed to the crash of 1983. Nintendo avoided this problem with proprietary circuits that would not allow unlicensed games to be played on the console. This allowed Nintendo to dominate the home video game market through the end of the decade, when one-third of homes in the United States had a Nintendo system.Gary Cross and Gregory Smits, “Japan, the U.S. and the Globalization of Children’s Consumer Culture,” Journal of Social History 38, no. 4 (2005).
Nintendo introduced its Nintendo Entertainment System (NES) in the United States in 1985. The game Super Mario Brothers, released with the system, was also a landmark in video game development. The game employed a narrative in the same manner as more complicated computer games, but its controls were accessible and its objectives simple. The game appealed to a younger demographic than the one targeted by Atari, generally boys ages 8 to 14.Stephen Kline, Nick Dyer-Witheford, and Greig De Peuter, Digital Play: The Interaction of Technology, Culture, and Marketing (Montreal: McGill-Queen’s University Press, 2003), 119. Its designer, Shigeru Miyamoto, tried to mimic the experiences of childhood adventures, creating a fantasy world not based on previous models of science fiction or other literary genres.Rus McLaughlin, “IGN Presents the History of Super Mario Bros.,” IGN Retro, November 8, 2007, http://games.ign.com/articles/833/833615p1.html. Super Mario Brothers also gave Nintendo an iconic character who has been used in numerous other games, television shows, and even a movie. The development of this type of character and fantasy world became the norm for video game makers. Games such as The Legend of Zelda became franchises with film and television possibilities rather than simply one-off games.
As video games developed as a form of media, the public struggled to come to grips with the kind of messages this medium was passing on to children. These were no longer simple games of reflex that could be compared to similar non-video games or sports; these were forms of media that included stories and messages that concerned parents and children’s advocates. Arguments about the larger meaning of the games became common, with some seeing the games as driven by ideas of conquest and gender stereotypes, whereas others saw basic stories about traveling and exploration.Mary Fuller and Henry Jenkins, “Nintendo and New World Travel Writing: A Dialogue,” Cybersociety: Computer-Mediated Communication and Community, ed. Steven G. Jones (Thousand Oaks, CA: Sage Publications, 1995), 57–72.
Other companies were still interested in the home console market in the mid-1980s. Atari released the 2600jr and the 7800 in 1986 after Nintendo’s success, but the consoles could not compete with Nintendo’s. The Sega Corporation, which had been involved with arcade video game production, released its Sega Master System in 1986. Although the system had greater graphics capabilities than the NES, Sega failed to make a dent in Nintendo’s market share until the early 1990s, with the release of the Sega Genesis.Aphra Kerr, “Spilling Hot Coffee? Grand Theft Auto as Contested Cultural Product,” in The Meaning and Culture of Grand Theft Auto: Critical Essays, ed. Nate Garrelts (Jefferson, NC: McFarland, 2005), 17.
The enormous number of games available for Atari consoles in the early 1980s took its toll on video arcades. In 1983, arcade revenues had fallen to a 3-year low, leading game makers to turn to newer technologies that home consoles could not replicate. These included arcade games powered by laser discs, such as Dragon’s Lair and Space Ace, but their novelty soon wore off, and laser-disc games became museum pieces.Aljean Harmetz, “Video Arcades Turn to Laser Technology as Queues Dwindle,” Morning Herald (Sydney), February 2, 1984. By 1989, museums were already putting on exhibitions of early arcade games, including titles from the early 1980s. Although newer games continued to come out on arcade platforms, they could not compete with the home console market and never matched their successes of the early 1980s. Increasingly, arcade gamers chose to stay at home to play games on computers and consoles. Today, dedicated arcades are a dying breed; most that remain, like the Dave & Buster’s and Chuck E. Cheese’s chains, offer full-service restaurants and other entertainment attractions to draw in business.
Home games fared better than arcades because they could ride the wave of personal computer purchases that occurred in the 1980s. Some important developments in video games occurred in the mid-1980s with the advent of online games. Multiuser dungeons, or MUDs, were role-playing games played online by multiple users at once. The games were generally text-based, describing the world of the MUD through text rather than illustrating it through graphics. The games allowed users to create a character and move through different worlds, accomplishing goals that rewarded them with new skills. If characters attained a certain level of proficiency, they could then design their own area of the world. Habitat, a game developed in 1986 for the Commodore 64, was a graphic version of this type of game. Users dialed in to a central host server via modem and then controlled characters on screen, interacting with other users.Jeremy Reimer, “The Evolution of Gaming: Computers, Consoles, and Arcade,” Ars Technica (blog), October 10, 2005, http://arstechnica.com/old/content/2005/10/gaming-evolution.ars/4.
During the mid-1980s, a demographic shift occurred. Between 1985 and 1987, games designed to run on business computers rose from 15 percent to 40 percent of games sold.Philip Elmer-Dewitt and others, “Computers: Games that Grownups Play,” Time, July 27, 1987, http://www.time.com/time/magazine/article/0,9171,965090,00.html. This trend meant that game makers could use the increased processing power of business computers to create more complex games. It also meant adults were interested in computer games and could become a profitable market.
Video games evolved at a rapid rate throughout the 1990s, moving from the first 16-bit systems (named for the amount of data they could process and store) in the early ’90s to the first Internet-enabled home console in 1999. As companies focused on new marketing strategies, wider audiences were targeted, and video games’ influence on culture began to be felt.
Nintendo’s dominance of the home console market throughout the late 1980s allowed it to build a large library of games for use on the NES. This also proved to be a weakness, however, because Nintendo was reluctant to improve or change its system for fear of making its game library obsolete. Technology had changed in the years since the introduction of the NES, and companies such as NEC and Sega were ready to challenge Nintendo with 16-bit systems.Andy Slaven, Video Game Bible, 1985–2002, (Victoria, BC: Trafford), 70–71.
Figure 10.3

Sega’s commercials suggested that it was a more violent version of Nintendo.
The Sega Master System had failed to challenge the NES, but with the release of its 16-bit system, the Sega Genesis, the company pursued a new marketing strategy. Whereas Nintendo targeted 8- to 14-year-olds, Sega’s marketing plan targeted 15- to 17-year-olds, making games that were more mature and advertising during programs such as the MTV Video Music Awards. The campaign successfully branded Sega as a cooler version of Nintendo and moved mainstream video games into a more mature arena. Nintendo responded to the Sega Genesis with its own 16-bit system, the Super NES, and began creating more mature games as well. Games such as Mortal Kombat and Street Fighter competed to raise the level of violence possible in a video game, and Sega’s advertisements even suggested that its versions were better because of their more violent content.Gamespot, “When Two Tribes Go to War: A History of Video Game Controversy,” http://www.gamespot.com/features/6090892/p-5.html.
By 1994, companies such as 3DO, with its 32-bit system, and Atari, with its allegedly 64-bit Jaguar, had attempted to get in on the home console market but failed to use effective marketing strategies to back up their products. Both systems fell out of production before the end of the decade. Sega, fearing that its system would become obsolete, released the 32-bit Saturn system in 1995. The system was rushed into production and did not have enough games available to ensure its success.CyberiaPC.com, “Sega Saturn (History, Specs, Pictures),” http://www.cyberiapc.com/vgg/sega_saturn.htm. Sony stepped in with its PlayStation console at a time when Sega’s Saturn was floundering and before Nintendo’s 64-bit system had been released. This system targeted an even older demographic of 14- to 24-year-olds and had a large effect on the market; by March 2007, Sony had sold 102 million PlayStations.Edge staff, “The Making Of: Playstation,” Edge, April 24, 2009, http://www.next-gen.biz/features/the-making-of-playstation.
Computer games had avid players, but they were still a niche market in the early 1990s. An important step in the mainstream acceptance of personal computer games was the development of the first-person shooter, a genre that puts the player in the perspective of a character who primarily uses guns to defeat enemies. First popularized by the 1992 game Wolfenstein 3D, these games make it seem as if the player is firing weapons and being attacked. Doom, released in 1993, and Quake, released in 1996, used the increased processing power of personal computers to create vivid three-dimensional worlds that were impossible to fully replicate on video game consoles of the era. These games pushed realism to new heights and began attracting public attention for their graphic violence.
Figure 10.4

Myst challenged the notion that only violent games could be successful.
Another trend was reaching out to audiences outside of the video-game-playing community. Myst, an adventure game where the player walked around an island solving a mystery, drove sales of CD-ROM drives for computers. Myst, its sequel Riven, and other nonviolent games such as SimCity actually outsold Doom and Quake in the 1990s.Stephen C. Miller, “News Watch; Most-Violent Video Games Are Not Biggest Sellers,” New York Times, July 29, 1999, http://www.nytimes.com/1999/07/29/technology/news-watch-most-violent-video-games-are-not-biggest-sellers.html. These nonviolent games appealed to people who did not generally play video games, increasing the form’s audience and expanding the types of information that video games put across.
A major advance in game technology came with the increase in Internet use by the general public in the 1990s. A notable feature of Doom was its support for multiplayer gaming over the Internet. Strategy games such as Command and Conquer and Total Annihilation also included options for players to play each other over the Internet. Other fantasy-inspired role-playing games, such as Ultima Online, used the Internet to launch the massively multiplayer online role-playing game (MMORPG) genre, in which a large number of players simultaneously engage in a role-playing game.Jeremy Reimer, “The Evolution of Gaming: Computers, Consoles, and Arcade,” Ars Technica (blog), October 10, 2005, http://arstechnica.com/old/content/2005/10/gaming-evolution.ars/4. These games used the Internet as their platform, much like the text-based MUDs, creating a space where individuals could play the game while socially interacting with one another.
The development of portable game systems was another important aspect of video games during the 1990s. Handheld games had been in use since the 1970s, and a system with interchangeable cartridges had even been sold in the early 1980s. Nintendo released the Game Boy in 1989, applying the same principles that had made the NES successful, and the Game Boy went on to dominate the handheld market throughout the 1990s. The Game Boy was released with the game Tetris, using the game’s popularity to drive purchases of the unit. The unit’s simple design meant users could get 20 hours of playing time on a set of batteries, and this basic design was left essentially unaltered for most of the decade. More advanced handheld systems, such as the Atari Lynx and Sega Game Gear, could not compete with the Game Boy despite their superior graphics and color displays.Joe Hutsko, “88 Million and Counting; Nintendo Remains King of the Handheld Game Players,” New York Times, March 25, 2000, http://www.nytimes.com/2000/03/25/business/88-million-and-counting-nintendo-remains-king-of-the-handheld-game-players.html.
The decade-long success of the Game Boy belies the conventional wisdom of the console wars that more advanced technology makes for a more popular system. The Game Boy’s static, simple design was readily accessible, and its stability allowed for a large library of games to be developed for it. Despite using technology almost a decade old, the Game Boy accounted for 30 percent of Nintendo of America’s overall revenues at the end of the 1990s. Joe Hutsko, “88 Million and Counting; Nintendo Remains King of the Handheld Game Players,” New York Times, March 25, 2000, http://www.nytimes.com/2000/03/25/business/88-million-and-counting-nintendo-remains-king-of-the-handheld-game-players.html.
Sega made its last effort in the console wars in 1999 with the Sega Dreamcast. The console could connect to the Internet, emulating the online capabilities of the sophisticated computer games of the 1990s. The new features of the Sega Dreamcast were not enough to save the brand, however, and Sega discontinued production in 2001, leaving the console market entirely.“Sega Dreamcast,” slide in “A Brief History of Game Console Warfare,” Business Week, http://images.businessweek.com/ss/06/10/game_consoles/.
A major problem for Sega’s Dreamcast was Sony’s release of the PlayStation 2 (PS2) in 2000. The PS2 could function as a DVD player, expanding the role of the console into an entertainment device that did more than play video games. This console was incredibly successful, enjoying a long production run, with more than 106 million units sold worldwide by the end of the decade.“PlayStation 2,” slide in “A Brief History of Game Console Warfare.” Business Week, http://images.businessweek.com/ss/06/10/game_consoles/
In 2001, two major consoles were released to compete with the PS2: the Xbox and the Nintendo GameCube. The Xbox was an attempt by Microsoft to enter the market with a console that expanded on the functions of other game consoles. The unit had features similar to a PC, including a hard drive and an Ethernet port for online play through Microsoft’s service, Xbox Live. The popularity of the first-person shooter game Halo, an Xbox-exclusive release, boosted sales as well. Nintendo’s GameCube did not offer DVD playback capabilities, choosing instead to focus on gaming functions. Both of these consoles sold millions of units but did not come close to the sales of the PS2.
As consoles developed to rival the capabilities of personal computers, game developers began to focus more on games for consoles. From 2000 to the end of the decade, the popularity of personal computer games gradually declined. The computer gaming community, while still significant, centers on players who are willing to spend a lot of money on personal computers designed specifically for gaming, often featuring multiple monitors and user modifications that allow the machines to play newer games. This market, though profitable, is not large enough to compete with the audience for the much cheaper game consoles.Kristin Kalning, “Is PC Gaming Dying? Or Thriving?” MSNBC, March 26, 2008, http://www.msnbc.msn.com/id/23800152/wid/11915773/.
Nintendo continued its control of the handheld game market into the 2000s with the 2001 release of the Game Boy Advance, a redesigned Game Boy that offered 32-bit processing and compatibility with older Game Boy games. In 2004, anticipating Sony’s upcoming handheld console, Nintendo released the Nintendo DS, a handheld console that featured two screens and Wi-Fi capabilities for online gaming. Sony’s PlayStation Portable (PSP) was released the following year and featured Wi-Fi capabilities as well as a flexible platform that could be used to play other media such as MP3s.Penelope Patsuris, “Sony PSP vs. Nintendo DS,” Forbes, June 7, 2004, http://www.forbes.com/2004/06/07/cx_pp_0607mondaymatchup.html. These two consoles, along with their newer versions, continue to dominate the handheld market.
One interesting innovation in mobile gaming occurred in 2003 with the release of the Nokia N-Gage. The N-Gage was a combination of a game console and mobile phone that, according to consumers, did not fill either role very well. The product line was discontinued in 2005, but the idea of playing games on phones persisted and has been developed on other platforms.Brad Stone, “Play It Again, Nokia. For the Third Time,” New York Times, August 27, 2007, http://www.nytimes.com/2007/08/27/technology/27nokia.html. Apple currently dominates the mobile phone game industry; in 2008 and 2009 alone, iPhone games generated $615 million in revenue.Peter Farago, “Apple iPhone and iPod Touch Capture U.S. Video Game Market Share,” Flurry (blog), March 22, 2010, http://blog.flurry.com/bid/31566/Apple-iPhone-and-iPod-touch-Capture-U-S-Video-Game-Market-Share. As mobile phone gaming grows in popularity and the supporting technology becomes increasingly advanced, traditional portable gaming platforms like the DS and the PSP will need to evolve to compete. Nintendo is already planning a successor to the DS that features 3-D graphics without the need for 3-D glasses, which it hopes will help the company retain and grow its share of the portable gaming market.
The trends of the late 2000s have shown a steadily increasing market for video games. Newer control systems and family-oriented games have made it common for many families to engage in video game play as a group. Online games have continued to develop, gaining unprecedented numbers of players. The overall effect of these innovations has been the increasing acceptance of video game culture by the mainstream.
The current state of the home console market still involves the three major companies of the past 10 years: Nintendo, Sony, and Microsoft. The release of Microsoft’s Xbox 360 led this generation of consoles in 2005. The Xbox 360 featured expanded media capabilities and integrated access to Xbox Live, an online gaming service. Sony’s PlayStation 3 (PS3) was released in 2006. It also featured enhanced online access as well as expanded multimedia functions, with the additional capacity to play Blu-ray discs. Nintendo released the Wii at the same time. This console featured a motion-sensitive controller that departed from previous controllers and focused on accessible, often family-oriented games. This combination successfully brought in large numbers of new game players, including many older adults. By June 2010, in the United States, the Wii had sold 71.9 million units, the Xbox 360 had sold 40.3 million, and the PS3 trailed at 35.4 million.VGChartz, “Weekly Hardware Chart: 19th June 2010,” http://www.vgchartz.com. In the wake of the Wii’s success, Microsoft and Sony have introduced their own motion-sensitive systems.J. P. Mangalindan, “Is Casual Gaming Destroying the Traditional Gaming Market?” Fortune, March 18, 2010, http://tech.fortune.cnn.com/2010/03/18/is-casual-gaming-destroying-the-traditional-gaming-market/.
Video game marketing has changed to bring more and more people into the video game audience. Think about the influence video games have had on you or people you know. If you have never played video games, then think about the ways your conceptions of video games have changed. Sketch out a timeline indicating the different occurrences that marked your experiences related to video games. Now compare this timeline to the history of video games from this section, and consider how your experiences fit into that larger history.
With such a short history, the place of video games in culture is constantly changing and being redefined. Are video games entertainment or art? Should they focus on fostering real-life skills or developing virtual realities? Certain games have come to prominence in recent years for their innovations and genre-expanding attributes. These games are notable for not only great economic success and popularity but also for having a visible influence on culture.
The musical series Guitar Hero, based on a Japanese arcade game of the late 1990s, was first launched in North America in 2005. In the game, the player uses a guitar-shaped controller to match the rhythms and notes of famous rock songs. The closer the player approximates the song, the better the score. This game introduced a new genre in which players simulate playing musical instruments. Rock Band, released in 2007, uses a similar format, including a microphone for singing, a drum set, and rhythm and bass guitars. These games are based on a premise similar to that of earlier rhythm-based games such as Dance Dance Revolution, in which players keep the rhythm on a dance pad. Dance Dance Revolution, which was introduced to North American audiences in 1999, was successful, but not to the extent of the later band-oriented games. In 2008, music-based games brought in an estimated $1.9 billion.
Figure 10.5

Rock Band includes a microphone and a drum set along with a guitar.
Guitar Hero and Rock Band brought new means of marketing and a kind of cross-media stimulus with them. The songs featured in the games experienced increased downloads and sales—as much as an 840 percent increase in some cases.Matt Peckham, “Music Sales Rejuvenated by Rock Band, Guitar Hero,” PC World, December 22, 2008, http://www.washingtonpost.com/wp-dyn/content/article/2008/12/22/AR2008122200798.html. The potential of this type of game did not escape its developers or the music industry. Games dedicated solely to one band were developed, such as Guitar Hero: Aerosmith and The Beatles: Rock Band. These games were a mix of music documentary, greatest hits album, and game. They included footage from early concerts, interviews with band members, and, of course, songs that allowed users to play along. When Guitar Hero: Aerosmith was released, the band’s catalog experienced a 40 percent increase in sales.Denise Quan, “Is ‘Guitar Hero’ Saving Rock ’n’ Roll?” CNN, August 28, 2008, http://www.cnn.com/2008/SHOWBIZ/Music/08/20/videol.games.music/.
The rock band Metallica made its album Death Magnetic available for Guitar Hero III on the same day it was released as an album.Denise Quan, “Is ‘Guitar Hero’ Saving Rock ’n’ Roll?” CNN, August 28, 2008, http://www.cnn.com/2008/SHOWBIZ/Music/08/20/videol.games.music/. Other innovations include Rock Band Network, a service that lets bands and individuals create versions of their own songs for Rock Band that can be downloaded for a fee. The video game industry’s volatile history makes it unclear whether this type of game will maintain its market share or even its popularity, but it has clearly opened new avenues of expression for the medium.
The first game in the Grand Theft Auto (GTA) series was released in 1997 for the PC and Sony PlayStation. The game had players stealing cars—not surprising given its title—and committing a variety of crimes to achieve specific goals. The game’s extreme violence made it popular with players of the late 1990s, but its true draw was the variety of options that players could employ in the game. Specific narratives and goals could be pursued, but if players wanted to drive around and explore the city, they could do that as well. A large variety of cars, from sports cars to tractor trailers, were available depending on the player’s goals. The violence could likewise be taken to any extreme the player wished, including stealing cars, killing pedestrians, and engaging the police in a shoot-out. This type of game is known as a sandbox game, or open-world game, and it is defined by the ability of players to freely pursue their own objectives.Ryan Donald, review of Grand Theft Auto (PlayStation), CNET, April 28, 2000, http://reviews.cnet.com/legacy-game-platforms/grand-theft-auto-playstation/4505-9882_7-30971409-2.html.
The GTA series has evolved over the past decade by increasing the realism, options, and explicit content of the first game. GTA III and GTA IV, as well as a number of spin-off games, such as the recent addition The Ballad of Gay Tony, have made the franchise more profitable and more controversial. These newer games have expanded on the idea of an open video game world, allowing players to have their characters buy and manage businesses, play unrelated mini-games (such as bowling and darts), and listen to a wide variety of in-game music, talk shows, and even television programs. However, increasing freedom has also brought increasing controversy, as players can choose to solicit prostitutes, visit strip clubs, go on murder sprees, and assault law enforcement agents. Lawsuits have attempted to tie the games to real-life instances of violence, and GTA games are routinely the target of political investigations into video game violence.Tatiana Morales, “Grand Theft Auto Under Fire,” CBS News, July 14, 2005, http://www.cbsnews.com/stories/2005/07/13/earlyshow/living/parenting/main708794.shtml.
World of Warcraft (WoW), released in 2004, is a massively multiplayer online role-playing game (MMORPG) loosely based on the Warcraft strategy franchise of the 1990s. The game is conducted entirely online, though it is accessed through purchased software, and players purchase playing time. Each player chooses an avatar, or character, that belongs to one of several races, such as orcs, elves, and humans. These characters can spend their time in the game completing quests, learning trades, or simply interacting with other characters. As characters gain experience, they obtain skills and earn virtual money. Players also choose the kind of server (a computer in a network that provides a service to other computers linked to it) on which they play. On a PvP (player versus player) server, players can attack one another without prior agreement; on a normal server, players can fight each other only if both consent. A third type of server is reserved for players who want to role-play, or act in character.
Figure 10.6

World of Warcraft allows players to team up with their avatars to go on quests or just socialize.
Image used by permission. © 2011 Blizzard Entertainment, Inc.
Various organizations have sprung up within the WoW universe. Guilds are groups that subscribe to specific codes of conduct and work together to complete tasks that cannot be accomplished by a lone individual. The guilds are organized by the players; they are not maintained by WoW developers. Each has its own unique identity and social rules, much like a college fraternity or social club. Voice communication technology allows players to speak to each other as they complete missions and increases the social bonding that holds such organizations together.Carson Barker, “Team Players: Guilds Take the Lonesome Gamer Out of Seclusion … Kind Of,” Austin Chronicle, July 28, 2006, http://www.austinchronicle.com/gyrobase/Issue/story?oid=oid%3A390551.
WoW has taken the medium of video games to unprecedented levels. Although series such as Grand Theft Auto allow players a great deal of freedom, everything done in those games was accounted for at some point by their designers. WoW, which depends on the actions of millions of players to drive the game, allows people to live parts of their lives through a game. In the game, players can earn virtual gold by mining it, killing enemies, and killing other players. It takes a great deal of time to accumulate gold in this manner, so many wealthy players choose to buy this gold with actual dollars. This is technically against the rules of the game, but the rules are largely unenforceable. Entire real-world industries have developed from this trade in gold. Chinese companies employ workers, or “gold farmers,” who work 10-hour shifts earning virtual gold in WoW so that the company can sell it to clients. Other players make money by finding deals on virtual goods and then selling them for a profit. One WoW player even “traveled” to Asian servers to take advantage of cheap prices, conducting a virtual import–export business.Rowenna Davis, “Welcome to the New Gold Mines,” Guardian (London), March 5, 2009, http://www.guardian.co.uk/technology/2009/mar/05/virtual-world-china?intcmp=239.
The unlimited possibilities in such a game expand the idea of what a game is. It is obvious that an individual who buys a video game, takes it home, and plays it during his or her leisure time is, in fact, playing a game. But if that person is a “gold farmer” doing repetitious tasks in a virtual world to make a real-world living, the situation is not as clear. WoW challenges conventional notions of what a game is by allowing players to create their own goals. To some players, the goal may be gaining a high level for their character; others may be interested in role-playing; still others are focused on making a profit. This kind of flexibility leads to the development of scenarios never before encountered in game-play, such as the emergence of economic classes.
The Call of Duty series of first-person shooter games is notable for its record-breaking success in the video game market, generating more than $3 billion in retail sales through late 2009.Tom Ivan, “Call of Duty Series Tops 55 Million Sales,” Edge, November 27, 2009, http://www.edge-online.com/news/call-of-duty-series-tops-55-million-sales. Call of Duty: Modern Warfare 2 was released in 2009 to critical acclaim and a great deal of controversy. The game included a 5-minute sequence in which the player, as a CIA agent infiltrating a terrorist cell, takes part in a massacre of innocent civilians. The player was not required to shoot civilians and could skip the sequence if desired, but these options did not stop international attention and calls to ban the game.Games Radar, “The Decade in Gaming: The 10 Most Shocking Moments of the Decade,” December 29, 2009, http://www.gamesradar.com/f/the-10-most-shocking-game-moments-of-the-decade/a-20091221122845427051/p-2. Proponents of the series argue that Call of Duty has a Mature rating and is not meant to be played by minors. They also point out that the games are less violent than many modern movies. However, the debate has continued, escalating as far as the United Kingdom’s House of Commons.Games Radar, “The Decade in Gaming: The 10 Most Shocking Moments of the Decade,” December 29, 2009, http://www.gamesradar.com/f/the-10-most-shocking-game-moments-of-the-decade/a-20091221122845427051/p-2.
The Nintendo Wii, with its dedicated motion-sensitive controller, went on sale in 2006. The company had attempted to implement similar controllers in the past, including the Power Glove in 1989, but it had never based an entire console around such a device. The Wii’s simple design was combined with basic games such as Wii Sports to appeal to previously untapped audiences. Wii Sports was included with the purchase of the Wii console and served as a means to demonstrate the new technology. It included five games: baseball, bowling, boxing, tennis, and golf. Wii Sports made group play possible without the need for familiarity with video games; it was closer to outdoor social games such as horseshoes or croquet than it was to Doom. There was also nothing objectionable about it: no violence, no in-your-face intensity—just a game that even older people could access and enjoy. Retirement communities sometimes organized Wii Bowling tournaments, and many people found the game to be a new way to socialize with their friends and families.Dave Wischnowsky, “Wii Bowling Knocks Over Retirement Home,” Chicago Tribune, February 16, 2007, http://www.chicagotribune.com/news/local/chi-070216nintendo,0,2755896.story.
Wii Fit combined the previously incompatible terms “fitness” and “video games.” Using a touch-sensitive platform, players could do aerobics, strength training, and yoga. The game kept track of players’ weights, acting as a kind of virtual trainer.Matt Vella, “Wii Fit Puts the Fun in Fitness,” Business Week, May 21, 2008, http://www.businessweek.com/innovate/content/may2008/id20080520_180427.htm. Wii Fit used the potential of video games to create an interactive version of an exercise machine, replacing workout videos and other forms of fitness that had never before considered Nintendo a competitor. This kind of design used the inherent strengths of video games to create a new kind of experience.
Nintendo found most of its past success marketing to younger demographics with games that were less controversial than the 1990s first-person shooters. Wii Sports and Wii Fit saw Nintendo playing to its strengths and expanding on them with family-friendly games that encouraged multiple generations to use video games as a social platform. This campaign was so successful that it is being imitated by rival companies Sony and Microsoft, which have released the Sony PlayStation Move and the Microsoft Kinect.
Think about the ways in which the games from Section 10.2 "Influential Contemporary Games" were innovative and groundbreaking. Consider the following questions:
An NPD poll conducted in 2007 found that 72 percent of the U.S. population had played a video game that year.Chris Faylor, “NPD: 72% of U.S. Population Played Games in 2007; PC Named ‘Driving Force in Online Gaming,’” Shack News, April 2, 2008, http://www.shacknews.com/onearticle.x/52025. The increasing number of people playing video games means that video games are having an undeniable effect on culture. This effect is clearly visible in the growing mainstream acceptance of gaming culture. Video games have changed the way that many other forms of media, from music to film, are produced and consumed. They have also changed education, as new technologies help teachers and students communicate in new ways through educational games such as Brain Age. As video games have an increasing influence on our culture, many have voiced opinions on whether this form of media should be considered an art.
To fully understand the effects of video games on mainstream culture, it is important to understand the development of gaming culture, the unique set of aesthetics and principles that characterizes the culture surrounding video games. Video games, like books or movies, have avid users who have made this form of media central to their lives. In the early 1970s, programmers got together in groups to play Spacewar!, spending a great deal of time competing in a game that was rudimentary compared to modern games.Stewart Brand, “Space War,” Rolling Stone, December 7, 1972. As video arcades and home video game consoles gained in popularity, youth culture quickly adapted to this type of media, engaging in competitions to gain high scores and spending hours at the arcade or with the home console.
In the 1980s, an increasing number of kids were spending time playing games on consoles and, more importantly, increasingly identifying with the characters and products associated with the games. Saturday morning cartoons were based on the Pac-Man and Super Mario Bros. games, and an array of nongame merchandise was sold bearing video game logos and characters. The public recognition of some of these characters has made them into cultural icons. A poll taken in 2007 found that more Canadians surveyed could identify a photo of Mario, from Super Mario Bros., than a photo of the Canadian prime minister at the time.Cohn & Wolfe Toronto, “Italian Plumber More Memorable Than Harper, Dion,” news release, November 13, 2007, http://www.newswire.ca/en/releases/mmnr/Super_Mario_Galaxy/index.html.
As the kids who first played Super Mario Bros. began to outgrow video games, companies such as Sega, and later Sony and Microsoft, began making games to appeal to older demographics. This has increased the average age of video game players, which was 35 in 2009.Entertainment Software Association, Essential Facts About the Computer and Video Game Industry: 2009 Sales, Demographic, and Usage Data, 2009, http://www.theesa.com/facts/pdfs/ESA_EF_2009.pdf. The Nintendo Wii has even found a new demographic in retirement communities, where Wii Bowling has become a popular form of entertainment for the residents.Dave Wischnowsky, “Wii Bowling Knocks Over Retirement Home,” Chicago Tribune, February 16, 2007, http://www.chicagotribune.com/news/local/chi-070216nintendo,0,2755896.story. The gradual increase in gaming age has led to the acceptance of video games as a mainstream form of entertainment.
The acceptance of video games in mainstream culture has consequently changed the way that the culture views certain people. “Geek” was the name given to people who were adept at technology but lacking in the skills that tended to make one popular, like fashion sense or athletic ability. Many of these people, because they often did not fare well in society, favored imaginary worlds such as those found in the fantasy and science fiction genres. Video games were appealing because they were both a fantasy world and a means to excel at something. Jim Rossignol, in his 2008 book This Gaming Life: Travels in Three Cities, explained part of the lure of playing Quake III online:
Cold mornings, adolescent disinterest, and a nagging hip injury had meant that I was banished from the sports field for many years. I wasn’t going to be able to indulge in the camaraderie that sports teams felt or in the extended buzz of victory through dedication and cooperation. That entire swathe of experience had been cut off from me by cruel circumstance and a good dose of self-defeating apathy. Now, however, there was a possibility for some kind of redemption: a sport for the quick-fingered and the computer-bound; a space of possibility in which I could mold friends and strangers into a proficient gaming team.Jim Rossignol, This Gaming Life: Travels in Three Cities (Ann Arbor, MI: University of Michigan Press, 2008), 17.
Video games gave a group of excluded people a way to gain proficiency in the social realm. As video games became more of a mainstream phenomenon and video game skills began to be desired by a large number of people, the popular idea of geeks changed. It is now common to see the term “geek” used to mean a person who understands computers and technology. This former slur is also prominent in the media, with headlines in 2010 such as “Geeks in Vogue: Top Ten Cinematic Nerds.”Craig Sharp, “Geeks in Vogue: Top Ten Cinematic Nerds,” Film Shaft, April 26, 2010, http://www.filmshaft.com/geeks-in-vogue-top-ten-cinematic-nerds/.
Many media stories focusing on geeks examine the ways in which this subculture has been accepted by the mainstream. Geeks may have become “cooler,” but mainstream culture has also become “geekier.” The acceptance of geek culture has led to acceptance of geek aesthetics. The mainstreaming of video games has led to acceptance of fantasy or virtual worlds. This is evident in the popularity of film/book series such as The Lord of the Rings and Harry Potter. Comic book characters, emblems of geek culture, have become the vehicles for blockbuster movies such as Spider-Man and The Dark Knight. The idea of a fantasy or virtual world has come to appeal to greater numbers of people. Virtual worlds such as those represented in the Grand Theft Auto and Halo series and online games such as World of Warcraft have expanded the idea of virtual worlds so that they are not mere means of escape but new ways to interact.Lars Konzack, “Geek Culture: The 3rd Counter-Culture,” (paper, FNG2006, Preston, England, June 26–28, 2006), http://www.scribd.com/doc/270364/Geek-Culture-The-3rd-CounterCulture.
Video games during the 1970s and 1980s were often derivatives of other forms of media, using the narratives and aesthetics of existing works. E.T., Star Wars, and a number of other games took their cues from movies, television shows, and books. This began to change in the 1980s with the development of cartoons based on video games, and in the 1990s and 2000s with live-action feature films based on video games.
Television programs based on video games were an early phenomenon. Pac-Man, Pole Position, and Q*bert were among the animated programs that aired in the early 1980s. In the later 1980s, shows such as The Super Mario Bros. Super Show! and The Legend of Zelda promoted Nintendo games. In the 1990s, Pokémon, originally a game developed for the Nintendo Game Boy, was turned into a television series, a card game, several movies, and even a musical.Internet Movie Database, “Pokémon,” http://www.imdb.com/. Recently, several programs have been developed that revolve entirely around video games—the web series The Guild, for instance, tells the story of a group of friends who interact through an unspecified MMORPG.
Nielsen, the company that tabulates television ratings, has begun rating video games in a similar fashion. In 2010, this information showed that video games, as a whole, could be considered a kind of fifth network, rivaling the top four television networks: NBC, ABC, CBS, and Fox.Mike Shields, “Nielsen: Video Games Approach 5th Network Status,” Adweek, March 25, 2009, http://www.adweek.com/aw/content_display/news/agency/e3i4f087b1aeac6f008d0ecadfeffe4a191. Advertisers use Nielsen ratings to decide which programs to support. The use of this system is changing public perceptions, framing video game playing as a habit similar to television watching.
Video games have also influenced the way that television is produced. The Rocket Racing League, scheduled to be launched in 2011, will feature a “virtual racetrack.” Racing jets will travel along a virtual track that can only be seen by pilots and spectators with enabled equipment. Applications for mobile devices are being developed that will allow spectators to race virtual jets alongside the ones flying in real time.Adam Hadhazy, “‘NASCAR of the Skies’ to Feature Video Game-Like Interactivity,” TechNewsDaily, April 26, 2010, http://www.technewsdaily.com/nascar-of-the-skies-to-feature-video-game-like-interactivity–0475/. This type of innovation is only possible with a public that has come to demand and rely on the kind of interactivity that video games provide.
The rise in film adaptations of video games accompanies the increased age of video game users. In 1995, Mortal Kombat, a live-action movie based on the video game, grossed over $70 million at the box office, placing it 22nd in the rankings for that year.Box Office Mojo, “Mortal Kombat,” http://boxofficemojo.com/movies/?id=mortalkombat.htm. Lara Croft: Tomb Raider, released in 2001, starred well-known actress Angelina Jolie and ranked No. 1 at the box office when it was released, and 15th overall for the year.Box Office Mojo, “Lara Croft: Tomb Raider,” http://www.boxofficemojo.com/movies/?id=tombraider.htm. Films based on video games are an increasingly common sight at the box office, such as producer Jerry Bruckheimer’s Prince of Persia, or the recent sequel to Tron, based on the idea of a virtual gaming arena.
Another aspect of video games’ influence on films is how video game releases are marketed and perceived. The release date for the much-anticipated Grand Theft Auto IV was announced and marketed to compete with the release of the film Iron Man. Grand Theft Auto IV reportedly beat Iron Man by $300 million in sales. This kind of comparison is, in some ways, misleading. Video games cost much more than a ticket to a movie, so higher sales do not mean that more people bought the game than saw the movie. Also, the distribution apparatus for the two media is totally different: movies can only be released in theaters, whereas video games can be sold at any retail outlet.Associated Press, “‘Grand Theft Auto IV’ Beats ‘Iron Man’ by $300 Million,” Fox News, May 9, 2008, http://www.foxnews.com/story/0,2933,354711,00.html. What this kind of news story proves, however, is that the general public considers video games as something akin to films. It is also important to realize that the scale of production and profit for video games is similar to that of films. Video games include music scores, actors, and directors in addition to the game designers, and the budgets for major games reflect this. Grand Theft Auto IV cost an estimated $100 million to produce.Gillian Bowditch, “Grand Theft Auto Producer is Godfather of Gaming,” Times (London), April 27, 2008, http://www.timesonline.co.uk/tol/news/uk/scotland/article3821838.ece.
Video games have been accompanied by music ever since the days of the arcade. Video game music was originally limited to computer beeps turned into theme songs. The design of the Nintendo 64, Sega Saturn, and Sony PlayStation made it possible to use sampled audio in new games, meaning songs played on physical instruments could be recorded and used in video games. Beginning with the music of the Final Fantasy series, scored by famed composer Nobuo Uematsu, video game music took on film-score quality, complete with full orchestral and vocal tracks. This innovation proved beneficial to the music industry. Well-known musicians such as Trent Reznor, Thomas Dolby, Steve Vai, and Joe Satriani were able to create the soundtracks for popular games, giving these artists exposure to new generations of potential fans.“Video Games Music Big Hit,” Wilmington (NC) Morning Star, February 1, 1997, 36. Composing music for video games has turned into a profitable means of employment for many musicians. Schools such as Berklee College of Music, Yale, and New York University have programs that focus on composing music for video games. The students are taught many of the same principles that are involved in film scoring.Joseph P. Khan, “Berklee is Teaching Its Students to Compose Scores for Video Games,” Boston Globe, January 19, 2010, http://www.boston.com/news/education/higher/articles/2010/01/19/berklee_is_teaching_students_to_compose_scores_for_video_games/.
Many rock bands have allowed their previously recorded songs to be used in video games, similar to a hit song being used on a movie soundtrack. The bands are paid for the rights to use the song, and their music is exposed to an audience that otherwise might not hear it. As mentioned earlier, games like Rock Band and Guitar Hero have been used to promote bands. The release of The Beatles: Rock Band was timed to coincide with the release of digitally remastered reissues of the Beatles’ albums.
Another phenomenon relating to music and video games involves musicians covering video game music. A number of bands perform only video game covers in a variety of styles, such as the popular Japanese group the Black Mages, which performs rock versions of Final Fantasy music. Playing video game themes is not limited to rock bands, however. An orchestra and chorus called Video Games Live started a tour in 2005 dedicated to playing well-known video game music. Their performances are often accompanied by graphics projected onto a screen showing relevant sequences from the video games.Jason Michael Paul Productions, “About,” Play! A Video Game Symphony, http://www.play-symphony.com/about.php.
Recently, the connection between video games and other media has increased with the popularity of machinima, animated films and series created by recording character actions inside video games. Beginning with the short film “Diary of a Camper,” filmed inside the game Quake in 1996, fans of video games have adopted the technique of machinima to tell their own stories. Although these early movies were released only online and targeted a select niche of gamers, professional filmmakers have since adopted the process, using machinima to storyboard scenes and to add a sense of individuality to computer-generated shots. This new form of media is increasingly becoming mainstream, as television shows such as South Park and channels such as MTV2 have introduced machinima to a larger audience.Jonathan Strickland, “How Machinima Works,” HowStuffWorks.com, http://entertainment.howstuffworks.com/machinima3.htm.
Figure 10.7

Educational video games have proven to be useful tools for educators.
One sign of the mainstreaming of video games is the increasing number of educational institutions that embrace them. As early as the 1980s, games such as Number Munchers and Word Munchers were designed to help children develop basic math and grammar skills. In 2006, the Federation of American Scientists completed a study that approved of video game use in education. The study cited the fact that video game systems were present in most households, that kids favored learning through video games, and that games could be used to develop analytical skills.Ben Feller, “Group: Video Games Can Reshape Education,” MSNBC, October 18, 2006, http://www.msnbc.msn.com/id/15309615/from/ET/. Another study, published in the science journal Nature in 2003, found that regular video game players had better-developed visual-processing skills than people who did not play video games. Participants in the test were asked to play a first-person shooter game for one hour a day for 10 days and were then tested for specific visual attention skills. The playing improved these skills in all participants, but the regular video game players reached a greater skill level than the non–game players. According to the study, “Although video-game playing may seem to be rather mindless, it is capable of radically altering visual attention processing.”C. Shawn Green and Daphne Bavelier, “Action Video Game Modifies Visual Selective Attention,” Nature 423, no. 6939 (2003): 534–537.
Other educational institutions have begun to embrace video games as well. The Boy Scouts of America have created a “belt loop,” something akin to a merit badge, for tasks including learning to play a parent-approved game and developing a schedule to balance video game time with homework.David Murphy, “Boy Scouts Develop ‘Video Game’ Merit Badge,” PC Magazine, May 2, 2010, http://www.pcmag.com/article2/0,2817,2363331,00.asp. The federal government has also seen the educational potential of video games. A commission on balancing the federal budget suggested a video game that would educate Americans about the costs involved in balancing the federal budget.Richard Wolf, “Nation’s Soaring Deficit Calls for Painful Choices,” USA Today, April 14, 2010, http://www.usatoday.com/news/washington/2010-04-12-deficit_N.htm. The military has similarly embraced video games as training simulators for new soldiers. These simulators, built on newer game technologies, present realistic scenarios that soldiers could face in the field. The games have also been used as recruiting tools by the U.S. Army and the Army National Guard.Associated Press, “Military Training Is Just a Game,” Wired, October 3, 2003, http://www.wired.com/gaming/gamingreviews/news/2003/10/60688.
The use of video games for education, whether in schools or in the public arena, means that video games have been validated by established cultural authorities. Many individuals still resist the idea that video games can be beneficial or have a positive cultural influence, but the embrace of video games by educational institutions has lent them legitimacy.
While video games are universally accepted as a form of media, a debate has recently arisen over whether they can be considered a form of art. Roger Ebert, the well-known film critic, has historically argued that “video games can never be art,” citing the fact that video games are meant to be won, whereas art is meant to be experienced.Roger Ebert, “Video Games Can Never Be Art,” Chicago Sun-Times, April 16, 2010, http://blogs.suntimes.com/ebert/2010/04/video_games_can_never_be_art.html.
His remarks have generated an outcry from both video gamers and developers. Many point to games such as 2009’s Flower, in which players control the flow of flower petals in the wind, as examples of video games developing into art. Flower avoids specific plot and characters to allow the player to focus on interaction with the landscape and the emotion of the game-play.That Game Company, “Flower,” http://thatgamecompany.com/games/flower/. Likewise, more mainstream games such as the popular Katamari series, released in 2004, are built around the idea of creation, requiring players to pull together a massive clump of objects in order to create a star.
Video games, once viewed as a mindless source of entertainment, are now being featured in publications such as The New Yorker magazine and The New York Times.Max Fisher, “Are Video Games Art?” Atlantic Wire, April 19, 2010, http://www.theatlanticwire.com/features/view/feature/Are-Video-Games-Art-1085/. With the development of increasingly complex musical scores and the advent of machinima, the boundaries between video games and other forms of media are slowly blurring. While they may not be considered art by everyone, video games have contributed significantly to modern artistic culture.
Think about the ways in which video games have influenced and affected other forms of media. Then consider the following questions:
The increasing realism and expanded possibilities of video games have inspired a great deal of controversy. However, even early games, though rudimentary and seemingly laughable nowadays, raised controversy over their depiction of adult themes. Although the increased realism and graphics capabilities of contemporary video games have increased the shock value of in-game violence, international culture has been struggling to come to terms with video game violence since the dawn of the medium.
Violence in video games has been controversial from their earliest days. Death Race, an arcade game released in 1976, encouraged drivers to run over stick figures, which then turned into Xs. Although the programmers claimed that the stick figures were not human, the game drew protest over its depiction of violence, making national news on the television talk show Donahue and the television news magazine 60 Minutes. Video games, regardless of their realism or lack thereof, had added a new potential to the world of games and entertainment: the ability to simulate murder.
The enhanced realism of video games in the 1990s accompanied a rise in violent games as companies expanded the market to target older demographics. A great deal of controversy exists over the influence of this kind of violence on children, and also over the rating system that is applied to video games. There are many stories of real-life violent acts involving video games. The 1999 Columbine High School massacre was quickly linked to the teenage perpetrators’ enthusiasm for video games, particularly first-person shooters. The families of Columbine victims brought a lawsuit against 25 video game companies, claiming that if the games had not existed, the massacre would not have happened.Mark Ward, “Columbine Families Sue Computer Game Makers,” BBC News, May 1, 2001, http://news.bbc.co.uk/2/hi/science/nature/1295920.stm. In 2008, a 17-year-old boy shot his parents, killing his mother, after they took away his video game system.Mike Harvey, “Teenager Daniel Petric Shot Parents Who Took Away Xbox,” Times (London), January 13, 2009, http://www.timesonline.co.uk/tol/news/world/us_and_americas/article5512446.ece. Also in 2008, when six teens were arrested for attempted carjacking and robbery, they stated that they were reenacting scenes from Grand Theft Auto.Lee Cochran, “Teens Say: Video Game Made Them Do It,” ABC News, June 27, 2008, http://abcnews.go.com/TheLaw/story?id=5262689.
There is no shortage of news stories involving young men committing crimes related to an obsession with video games. The controversy over the influences behind these crimes has not been resolved. Many studies have linked aggression to video games; however, critics take issue with using the results of these studies to claim that video games caused the aggression. They point out that people who commit video-game–related crimes often already have psychopathic tendencies, and that the results of such research studies are correlational rather than causal: a naturally violent person is drawn to play violent video games.Jill U. Adams, “A Closer Look: Effects of Violent Video Games,” Los Angeles Times, May 3, 2010, http://www.latimes.com/news/health/la-he-closer-20100503,0,5586471.story. Other critics point out that violent games are designed for adults, just as violent movies are, and that parents should enforce stricter standards for their children.
The problem of children’s access to violent games is a large and complex one. Video games present difficult issues for those who create the ratings. One problem is the inconsistency that seems to exist between the rating of video games and the rating of movies. Movies with violence or sexual themes are rated either R or NC-17. Filmmakers prefer the R rating over the NC-17 rating because NC-17 ratings hurt box office sales, and they will often heavily edit films to remove overly graphic content. The Entertainment Software Rating Board (ESRB) is the organization that rates video games. The two most restrictive ratings the ESRB has put forth are “M” (for Mature; 17 and older; “may contain mature sexual themes, more intense violence, and/or strong language”) and “AO” (for Adults Only; 18 and up; “may include graphic depictions of sex and/or violence”). If this rating system were applied to movies, many movies now rated R would be labeled AO. An AO label can have a devastating effect on game sales; in fact, many retail outlets will not sell games with an AO rating.Paul Hyman, “Video Game Rating Board Don’t Get No Respect,” Hollywood Reporter, April 8, 2005, http://www.hollywoodreporter.com/hr/search/article_display.jsp?vnu_content_id=1000874859. This creates a situation in which a video game with a sexual or violent scene as graphic as the ones seen in R-rated movies is difficult to purchase, whereas a pornographic magazine can be bought at many convenience stores. This issue reveals a unique aspect of video games: although many of them are designed for adults, the distribution system and culture surrounding video games are still largely youth-oriented.
Another controversial issue is the problem of video game addiction. As of the print date, the American Medical Association (AMA) has not created an official diagnosis of video game addiction, citing the lack of long-term research. However, the AMA uses the term “video game overuse” (in lieu of “video game addiction” while further research determines the correct classification) to describe video game use that begins to affect other aspects of an individual’s life, such as relationships and health. Studies have found that socially marginalized people have more of a tendency to overuse games, especially online role-playing games like World of Warcraft. Other studies have found that patterns of time usage and social dysfunction in players who overuse games are similar to those of other addictive disorders.Mohamed Khan, Emotional and Behavioral Effects of Video Games and Internet Overuse, American Medical Association, Council on Science and Public Health, 2007, http://www.ama-assn.org/ama1/pub/upload/mm/443/csaph12a07-fulltext.pdf.
Figure 10.8

Video game use can become an obsession with some people.
Groups such as Online Gamers Anonymous have developed a 12-step program similar to that of Alcoholics Anonymous to help gamers deal with problems relating to game overuse. This group is run by former online gamers and family members of those affected by heavy game use.On-line Gamers Anonymous, “About OLGA & OLG-Anon,” http://www.olganon.org. This problem is not new, but it has become more prevalent. In the early 1990s, many stories surfaced of individuals dropping out of college or getting divorced because of addiction to MUDs.R. W. Greene, “Is Internet Addiction for Worrywarts or a Genuine Problem?” CNN, September 23, 1998, http://www.cnn.com/TECH/computing/9809/23/netaddict.idg/index.html. In addition, heavy video gaming, much like heavy computer use in an office setting, can result in painful repetitive stress injuries. Even worse are the rare, but serious, cases of death resulting from video game overuse. In the early 1980s, two deaths were linked to the video game Berzerk. The players, both in their late teens, suffered fatal heart attacks while struggling to achieve top scores.Arcade History, “Berzerk, the Video Game,” http://www.arcade-history.com/?n=berzerk&page=detail&id=236. The issue of video game addiction has become a larger one because of the ubiquity of video games and Internet technology. In countries that have a heavily wired infrastructure, such as South Korea, the problem is even bigger. In 2010, excessive game use was problematic enough that the South Korean government imposed an online gaming curfew for people under the age of 18 that would block certain sites after midnight. This decision followed the death of a 3-month-old baby from starvation while her parents played an online game at an Internet café.Geoffrey Cain, “South Korea Cracks Down on Gaming Addiction,” Time, April 20, 2010, http://www.time.com/time/world/article/0,8599,1983234,00.html.
Another side of video game addiction is told well by Jim Rossignol in his book This Gaming Life: Travels in Three Cities. The book describes Rossignol’s job as a journalist for a financial company and his increasing involvement with Quake III. Rossignol trained a team of players to compete in virtual online tournaments, scheduling practices and spending the hours afterward analyzing strategies with his teammates. His intense involvement in the game led to poor performance at his job, and he was eventually fired. After being fired, he spent even more time on the game, not caring about his lack of a job or shrinking savings. The story up to this point sounds like a testimonial about the dangers of game addiction. However, because of his expertise in the game, he was hired by a games magazine and enjoyed full-time employment writing about what he loved doing. Rossignol does not gloss over the fact that games can have a negative influence, but his book speaks to the ways in which gaming—often what would be described as obsessive gaming—can cause positive change in people’s lives.Jim Rossignol, This Gaming Life: Travels in Three Cities (Ann Arbor, MI: University of Michigan Press, 2008), 4–11.
Figure 10.9

Games such as Tomb Raider and Dead or Alive Xtreme have been criticized for their demeaning depiction of women.
It is no secret that young adult men make up the majority of video gamers. A study in 2009 found that 60 percent of gamers were male, and the average age of players was 35.Entertainment Software Association, Essential Facts About the Computer and Video Game Industry: 2009. While the gender gap has certainly narrowed in the past 30 years, video gaming is still in many ways a male-dominated medium.
Male influence can be seen throughout the industry. Women make up less than 12 percent of game designers and programmers, and those who do enter the field often find themselves facing subtle—and not so subtle—sexism.Media Awareness Network, “Gender Stereotyping,” http://www.media-awareness.ca/english/parents/video_games/concerns/gender_videogames.cfm. When Game Developer magazine released its list of top 50 people in the video game industry for 2010, bloggers were quick to note that no female developers appeared in the list.Cory Doctorow, “Gamasutra’s Most Important Gamers List Is a Boy’s Club,” BoingBoing (blog), April 14, 2010, http://boingboing.net/2010/04/14/gamasutras-most-impo.html. In 2007, scandal erupted over a pornographic comic featuring Jade Raymond, current managing director of the French video game publisher Ubisoft, that surfaced on an online forum. The comic alleged that Raymond did not deserve her position at Ubisoft, having earned it on her looks rather than on her abilities and experience.
Sexism in video games has existed since the early days of the medium. The plot of the infamous Custer’s Revenge, released for the Atari 2600 in 1982, centered on the rape of a Native American woman. Popular NES games such as Super Mario Bros. and The Legend of Zelda featured a male figure rescuing a damsel in distress. Both the protagonist and antagonist in the original Tomb Raider game had hourglass figures with prominent busts and nonexistent waists, a trend that continues in the franchise today. In 2003, the fighting series Dead or Alive released a spin-off game titled Dead or Alive Xtreme Beach Volleyball that existed to showcase the well-endowed female characters in swim attire.Michael Strauss, “A Look at Female Characters in Video Games,” Associated Content, July 16, 2010, http://www.associatedcontent.com/article/5487226/a_look_at_female_characters_in_video_pg2.html?cat=19. The spin-off was so popular that two more similar games were released.
Some note that video games are not unique in their demeaning portrayal of women. Like movies, television, and other media forms, video games often fall back on gender stereotyping in order to engage consumers. Defenders point out that many male video game characters are also depicted lewdly. Games such as God of War and Mortal Kombat feature hypersexualized men with bulging muscles and aggressive personalities who rely on their brawn rather than their brains. How are men affected by these stereotypes? Laboratory studies have shown that violence and aggression in video games affect men more than women, leading to higher levels of male aggression.Bruce D. Bartholow and Craig A. Anderson, “Effects of Violent Video Games on Aggressive Behavior: Potential Sex Differences,” Journal of Experimental Social Psychology 38, no. 3 (2002): 283–290. While sexism is certainly present in video games, it seems sexual stereotyping affects both genders negatively.
In recent years, game designers have sought to move away from these clichéd representations of gender. The popular game Portal, released in 2007, features a female protagonist clad in a simple orange jumpsuit who relies on her wits to solve logic puzzles. Series such as Half-Life and Phoenix Wright: Ace Attorney star male heroes who are intellectuals instead of warriors. Other games, like the Mass Effect series and Halo: Reach, allow gamers to choose the gender of the main character without altering elements of the plot or game-play. However, despite recent strides forward, there is no doubt that as the video game industry continues to evolve, so too will the issue of gender.
Choose video game violence or video game addiction and search for it on the Internet. Examine the research that is associated with the issue you chose. Then consider the following questions:
Early video games were easily identifiable as games. They had basic goals and set rules of play. As games have developed in complexity, they have allowed for newer options and an expanded idea of what constitutes a game. At the same time, the aesthetics and design of game culture have been applied to tools and entertainment phenomena that are not technically games. The result is a blurring of the lines between games and other forms of communication, and the creation of new means of mixing information, entertainment, and communication.
Figure 10.10

Team video games can provide many of the benefits of physical sports.
Image used by permission. © 2011 Blizzard Entertainment, Inc.
Video games have developed in complexity and flexibility to the point that they are providing new kinds of social interactions. This is perhaps clearest when games are compared to physical sports. Most of the abstract reasons for involvement in sports—things like teamwork, problem solving, and leadership—are also reasons to play video games that allow online team interaction. Sports are often portrayed as a social platform where people can learn new skills. Amateur sports teams provide an important means of socialization and communication for many people, allowing individuals from different gender, ethnic, and class backgrounds to find a common forum for communication. Video games have similar capacities; players who engage in online team battles must communicate with one another effectively and learn to collectively solve problems. In fact, video games could arguably provide a more inclusive platform for socializing because they do not exclude most socially or physically challenged individuals. The social platform of online games is certainly no utopia, as competitive endeavors of any sort inevitably create opportunities for negative communication, but it allows people of different genders, races, and nationalities to play on equal footing, a feat that is difficult, if not impossible, in the physical realm.
Massively multiplayer online role-playing games (MMORPGs) such as World of Warcraft, EverQuest, and EVE Online allow for an unprecedented variety of goals to be pursued. According to one study on online communities,
[t]hat means games are no longer meant to be a solitary activity played by a single individual. Instead, the player is expected to join a virtual community that is parallel with the physical world, in which societal, cultural, and economical systems arise.Panayiotis Zaphiris, Chee Siang Ang, and Andrew Laghos, “Online Communities,” in The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications, ed. Andrew Sears and Julie Jacko (New York: Taylor & Francis Group, 2008), 607.
MMORPGs can function as a kind of hybrid of social media and video games in this respect. In some instances, these games could function entirely as social media, and not as games. Consider a player who has no desire to achieve any of the prescribed goals of a game, such as going on quests or accomplishing tasks. This individual could simply walk around in social areas and talk to new people. Is the player still playing a game at this point, or has the game turned into a type of social media?
Virtual worlds (digitized spaces in which an online community can interact), such as Second Life, are good examples of the thin line between games and social media. In this virtual world, users create avatars to represent themselves. They can then explore the world and take part in the culture that characterizes it. The Second Life website FAQ (frequently asked questions) tries to illustrate the differences between a virtual world and a video game:
While the Second Life interface and display are similar to most popular massively multiplayer online role playing games (or MMORPGs), there are two key, unique differences:
Creativity: The Second Life virtual world provides almost unlimited freedom to its Residents. This world really is whatever you make it. If you want to hang out with your friends in a garden or nightclub, you can. If you want to go shopping or fight dragons, you can. If you want to start a business, create a game or build a skyscraper you can. It’s up to you.
Ownership: Instead of paying a monthly subscription fee, Residents can start a Basic account for FREE. Additional Basic accounts cost a one-time flat fee of just $9.95. If you choose to get land to live, work and build on, you pay a monthly lease fee based on the amount of land you have. You also own anything you create—Residents retain intellectual property rights over their in-world creations.Second Life, “Frequently Asked Questions: Is Second Life a game?” http://secondlife.com/whatis/faq.php#02.
Virtual worlds may differ from traditional video games in key ways, but these differences will likely be further blurred as games develop. Regardless of the way these worlds are classified, they are based heavily on the aesthetics and design of video games.
Virtual worlds are societies in which communication can be expanded to relay information in ways that cannot be done with other forms of media. Universities have set up virtual classrooms in Second Life that employ techniques for teaching that are otherwise impossible. Curriculum can be set up to take advantage of the virtual world’s creative capacity.Stacy Kluge and Liz Riley, “Teaching in Virtual Worlds: Issues and Challenges,” Issues in Informing Science and Information Technology 5 (2008): 128. For example, a class on architecture could lead students through a three-dimensional recreation of a Gothic cathedral, a history class could rely on an interactive battlefield simulation to explain the mechanics of a famous battle, or a physics class could use simulations to demonstrate important principles.
Figure 10.11

Virtual classrooms have been used in Second Life as new teaching tools.
Virtual worlds have been used to relay new kinds of information as well. In 2007, specialists developed virtual-reality simulations to help patients with Asperger’s syndrome navigate through social situations. Individuals with Asperger’s syndrome have difficulty recognizing nonverbal cues and adapting to change. The simulations allowed these patients to create avatars that then played out social situations such as job interviews. The simulations could be adjusted to allow for more complicated interactions, and the users could replay their own performances to better understand the simulation.“Avatars Help Asperger Syndrome Patients Learn to Play the Game of Life,” news release, University of Texas at Dallas News Center, November 18, 2007, http://www.utdallas.edu/news/2007/11/18-003.html. The creation of avatars and the ability to replay scenes until the user improves performance is a direct influence of video games.
Figure 10.12

FarmVille is integrated with Facebook and lets players grow and harvest crops to expand their farms.
Social media outlets such as Facebook have also used video games to expand their communication platforms. FarmVille, a game in which players plant, harvest, and sell crops and expand their farm plots with the profits, is integrated with Facebook so that all users can connect with their real-life friends in the game. The game adds another aspect of communication—one centered on competition, strategy, and the aesthetics of game-play—to the Facebook platform. In 2010, FarmVille had 80 million users, an unprecedented number of people playing a single game. Other games for Facebook include Lexulous (formerly Scrabulous), a game that mimicked the board game Scrabble closely enough that it had to be changed, and Pet Society, a game that allows users to raise virtual animals as pets. These games are unique in their demographic scope; they seek to engage literally anyone. By coupling games with social media sites, game designers have pushed the influence of video games to unprecedented levels.Tim Walker, “Welcome to FarmVille: Population 80 million,” Independent (London), February 22, 2010, http://www.independent.co.uk/life-style/gadgets-and-tech/features/welcome-to-farmville-population-80-million-1906260.html.
Due to the increasing popularity of video games, many social networking websites have emerged in recent years that target hard-core gamers and casual gamers alike. The website GamingPassions.com describes itself as a “free online dating and social networking site for the Video Gaming community” that “allows you to meet other video game lovers who ‘get it.’”Gaming Passions, “Welcome to Gaming Passions,” http://www.gamingpassions.com/. Others, such as DateCraft, target players who share a love of specific games, such as World of Warcraft. These websites allow players to socialize outside of the confines of a video game and provide a way for fans of any game to connect.Guy Fawkes, “Online Dating for Video Gamers,” Associated Content, March 19, 2010, http://www.associatedcontent.com/article/2806686/online_dating_for_video_game_players.html?cat=41.
The maker of FarmVille, a company called Zynga, has released a number of social networking games that can be played via Internet-enabled mobile phones. Mafia Wars, a simulated version of organized crime that originated as an online game, lets users manage their money and complete jobs from their mobile phones. This innovation is only a change of platform from the computer to the phone, but it promises to make gaming a common means of social interaction.
Phones are used to communicate directly with another person, whether through text messages, Facebook posts, or phone calls. Adding games to phones creates a new language that people can use to communicate with each other. One can imagine the ways in which a spat between friends or a couple hooking up could be communicated through this medium. During the week of Valentine’s Day in 2010, the creators of FarmVille reported that users sent more than 500 million copies of virtual gifts to each other.Mathew Ingram, “Farmville Users Send 500M Valentines in 48 Hours,” GigaOM, February 10, 2010, http://gigaom.com/2010/02/10/farmville-users-send-500m-valentines-in-48-hours/. Competition, aggression, hostility, as well as generosity and general friendliness are given new means of expression with the addition of video games to the mobile platform.
An important genre of games is often overlooked when focusing on the overall history and effect of video games. These games are created to get a specific point across to the player, and are generally low-budget efforts by individuals or small groups. The group Molleindustria puts out a host of these games, including Oligarchy, a game in which the player is an oil-drilling business executive who pursues corruption to gain profits, and Faith Fighter, in which religious icons fight each other.Molle Industria, http://www.molleindustria.org/en/home. Downing Street Fighter is a similar game, designed as a satirical representation of the United Kingdom’s 2010 elections.Zain Verjee, “Will There Be a Knockout Blow?” CNN, May 3, 2010, http://ukelection.blogs.cnn.com/2010/05/03/will-there-be-a-knock-out-blow/.
Other games in this genre include a great deal of content that informs players about current events. One example is Cutthroat Capitalism, where users play the part of Somali pirates carrying out kidnapping and piracy missions. Players must weigh the risks and rewards of their exploits given a variety of economic factors.Scott Carney, “An Economic Analysis of the Somali Pirate Business Model,” Wired, July 13, 2009, http://www.wired.com/politics/security/magazine/17-07/ff_somali_pirates. This game allows players to understand the forces at work behind international phenomena such as piracy. Other games include the Redistricting Game, a game aimed at encouraging congressional redistricting reform, and Planet Green Game, in which players find ways to conserve energy.William McGeveran, “Video Games With a Message,” Info/Law (blog), June 18, 2007, http://blogs.law.harvard.edu/infolaw/2007/06/18/video-games-with-a-message/. These games’ publishers have used their medium to transmit information in new ways. Political satire in the medium of the video game allows the player to, in effect, play a satirical joke out to its logical conclusion, exploring the entire scope of the joke. Advocacy games engage players with an interactive political message, rather than sending out traditional media ads. Mainstream groups such as PETA have also turned to video games to convey messages. During the 2008 Thanksgiving season, the animal-rights group released a parody of the popular Cooking Mama games to demonstrate the cruelty of eating meat.Alexander Sliwinski, “PETA Parody Grills Cooking Mama,” Joystiq, November 17 2008, http://www.joystiq.com/2008/11/17/peta-parody-grills-cooking-mama/.
Figure 10.13

The unauthorized PETA edition of Cooking Mama tackles the issue of animal cruelty.
Although games of this type are not in the same league as game series like Halo—either in design or in popularity—they are pushing the boundaries of the video game as a form of media. Political pamphlets were a major source of information and political discourse during the American Revolution. Video games may well come to serve a similar informational function.
Several examples of the ways in which virtual worlds could be used for education were mentioned in this section.
Review Questions
Questions for Section 10.1 "The Evolution of Electronic Games"
Questions for Section 10.2 "Influential Contemporary Games"
Questions for Section 10.3 "The Impact of Video Games on Culture"
Questions for Section 10.4 "Controversial Issues"
Questions for Section 10.5 "Blurring the Boundaries Between Video Games, Information, Entertainment, and Communication"
Video games are a growing industry, and the budgets to create them are increasing every year. Video games require large production teams. The following jobs are important aspects of video game production:
Choose one of the jobs listed here or find a different job associated with the games industry and research the requirements for it online. When you have researched your job, answer the following questions:
Figure 11.1
It used to be that applying for a job was fairly simple: send in a résumé, write a cover letter, and call a few references to make sure they would say positive things. The hiring manager understood that this was a biased view, designed to make the applicant look good, but that was all forgivable. After all, everyone applying for a particular job went through this same process, and barring great disasters, the chances of something particularly negative reaching the desk of a hiring manager were not that great.
However, there is a new step that is now an integral part of this application process—hiding (or at least cleaning up) the applicants’ virtual selves. This could entail “Googling”—shorthand for searching on Google—their own name to see the search results. If the first thing that comes up is a Flickr album (an online photo album from the photo-sharing site Flickr) from last month’s Olympian-themed cocktail party, it may be a good idea to make that album private to ensure that only friends can view the album.
The ubiquity of Web 2.0 social media like Facebook and Twitter allows anyone to easily start developing an online persona from as early as birth (depending on the openness of one’s parents)—and although this online persona may not accurately reflect the individual, it may be one of the first things a stranger sees. Those online photos may not look bad to friends and family, but one’s online persona may be a hiring manager’s first impression of a prospective employee. Someone in charge of hiring could search the Internet for information on potential new hires even before calling references.
First impressions are an important thing to keep in mind when making an online persona professionally acceptable. Your presence online can be the equivalent of your first words to a brand-new acquaintance. Instead of showing a complete stranger your pictures from a recent party, it might be a better idea to hide those pictures and replace them with a well-written blog—or a professional-looking website.
The content on social networking sites like Facebook, where people use the Internet to meet new people and maintain old friendships, is nearly indestructible and may not actually belong to the user. In 2008, as Facebook was quickly gaining momentum, The New York Times ran an article, “How Sticky Is Membership on Facebook? Just Try Breaking Free”—a title that seems at once like a warning and a big-brother taunt. The website does allow the option of deactivating one’s account, but “Facebook servers keep copies of the information in those accounts indefinitely.”Maria Aspan, “How Sticky Is Membership on Facebook? Just Try Breaking Free,” New York Times, February 11, 2008, http://www.nytimes.com/2008/02/11/technology/11facebook.html. It is a double-edged sword: On one hand, users who become disillusioned and quit Facebook can come back at any time and resume their activity; on the other, one’s information is never fully deleted. If a job application might be compromised by the presence of a Facebook profile, clearing the slate is possible, albeit with some hard labor. The user must delete, item by item, every individual wall post, every group membership, every photo, and everything else.
Not all social networks are like this—MySpace and Friendster still require users who want to delete their accounts to confirm this several times, but they offer a clear-cut “delete” option—but the sticky nature of Facebook information is nothing new.Maria Aspan, “How Sticky Is Membership on Facebook? Just Try Breaking Free,” New York Times, February 11, 2008, http://www.nytimes.com/2008/02/11/technology/11facebook.html. Google even keeps a cache of deleted web pages, and the Internet Archive keeps decades-old historical records. This transition from ephemeral media—television and radio, practically over as quickly as they are broadcast—to the enduring permanence of the Internet may seem strange, but in some ways it is built into the very structure of the system. Understanding how the Internet was conceived may help elucidate the ways in which the Internet functions today—from the difficulties of deleting an online persona to the speedy and near-universal access to the world’s information.
From its early days as a military-only network to its current status as one of the developed world’s primary sources of information and communication, the Internet (a web of interconnected computers and all of the information publicly available on those computers) has come a long way in a short period of time. Yet there are a few elements that have stayed constant and that provide a coherent thread for examining the origins of the now-pervasive medium. The first is the persistence of the Internet—its Cold War beginnings necessarily influencing its design as a decentralized, indestructible communication network.
The second element is the development of rules of communication for computers that enable the machines to turn raw data into useful information. These rules, or protocols (agreed-upon sets of rules that allow two or more entities to communicate over a network), have been developed through consensus by computer scientists to facilitate and control online communication and have shaped the way the Internet works. Facebook is a simple example of a protocol: Users can easily communicate with one another, but only through acceptance of protocols that include wall posts, comments, and messages. Facebook’s protocols make communication possible and control that communication.
These two elements connect the Internet’s origins to its present-day incarnation. Keeping them in mind as you read will help you comprehend the history of the Internet, from the Cold War to the Facebook era.
The near indestructibility of information on the Internet derives from a military principle used in secure voice transmission: decentralization, the principle that there should be no central hub controlling information flow; instead, information is transferred via protocols that allow any computer to communicate directly with any other computer. In the early 1960s, the RAND Corporation developed a technology (later called “packet switching”) that allowed users to send secure voice messages. In contrast to a system known as the hub-and-spoke model, where the telephone operator (the “hub”) would patch two people (the “spokes”) through directly, this new system allowed for a voice message to be sent through an entire network, or web, of carrier lines, without the need to travel through a central hub, allowing for many different possible paths to the destination.
During the Cold War, the U.S. military was concerned about a nuclear attack destroying the hub in its hub-and-spoke model; with this new web-like model, a secure voice transmission would be more likely to endure a large-scale attack. A web of data pathways would still be able to transmit secure voice “packets,” even if a few of the nodes—places where the web of connections intersected—were destroyed. Only through the destruction of all the nodes in the web could the data traveling along it be completely wiped out—an unlikely event in the case of a highly decentralized network.
This decentralized network could only function through common communication protocols. Just as we use certain protocols when communicating over a telephone—“hello,” “goodbye,” and “hold on for a minute” are three examples—any sort of machine-to-machine communication must also use protocols. These protocols constitute a shared language enabling computers to understand each other clearly and easily.
In 1973, the U.S. Defense Advanced Research Projects Agency (DARPA) began research on protocols to allow computers to communicate over a distributed network (a web of computers connected to one another, allowing intercomputer communication). This work paralleled work done by the RAND Corporation, particularly in the realm of a web-based network model of communication. Instead of using electronic signals to send an unending stream of ones and zeros over a line (the equivalent of a direct voice connection), DARPA used this new packet-switching technology to send small bundles of data. This way, a message that would have been an unbroken stream of binary data—extremely vulnerable to errors and corruption—could be packaged as only a few hundred numbers.
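The idea of breaking one long message into small, numbered bundles can be sketched in a few lines of Python. This is only a toy illustration of the concept; real packets are binary structures defined by network protocols, and the function and field names here are invented:

```python
import random

def packetize(message, size=8):
    """Split a message into small packets, each tagged with a sequence number
    so the receiver can reassemble them even if they arrive out of order."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"seq": n, "data": chunk} for n, chunk in enumerate(chunks)]

def reassemble(packets):
    """Rebuild the original message from packets arriving in any order."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return "".join(p["data"] for p in ordered)

packets = packetize("Meet me at the restaurant at 8:30")
random.shuffle(packets)  # packets may take different paths and arrive out of order
assert reassemble(packets) == "Meet me at the restaurant at 8:30"
```

Because each small bundle travels independently, any individual packet can be resent or rerouted without retransmitting the whole message.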
Figure 11.2

Centralized versus distributed communication networks
Imagine a telephone conversation in which any static in the signal would make the message incomprehensible. Whereas humans can infer meaning from “Meet me [static] the restaurant at 8:30” (we replace the static with the word at), computers do not necessarily have that logical linguistic capability. To a computer, this constant stream of data is incomplete—or “corrupted,” in technological terminology—and confusing. Considering the susceptibility of electronic communication to noise or other forms of disruption, it would seem like computer-to-computer transmission would be nearly impossible.
However, each packet in this packet-switching technology carries a checksum (a short number computed from the packet's contents) that allows the receiving computer to verify the packet has arrived uncorrupted. Because of this new technology and the shared protocols that made computer-to-computer transmission possible, a single large message could be broken into many pieces and sent through an entire web of connections, speeding up transmission and making that transmission more secure.
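The core idea can be sketched in a few lines of Python. This is an illustrative toy, not any real protocol: the packet fields and the use of a CRC-32 checksum here are invented for the example.

```python
import zlib

def make_packets(message: bytes, size: int = 8):
    """Split a message into numbered packets, each carrying a checksum."""
    packets = []
    for seq, start in enumerate(range(0, len(message), size)):
        chunk = message[start:start + size]
        packets.append({"seq": seq, "data": chunk, "checksum": zlib.crc32(chunk)})
    return packets

def reassemble(packets):
    """Verify every packet's checksum, then rebuild the original message."""
    for p in packets:
        if zlib.crc32(p["data"]) != p["checksum"]:
            raise ValueError(f"packet {p['seq']} arrived corrupted")
    return b"".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = make_packets(b"Meet me at the restaurant at 8:30")
print(reassemble(packets))  # b'Meet me at the restaurant at 8:30'
```

Because each packet is numbered, the pieces can travel along different paths through the web of connections and still be put back in order; because each carries a checksum, a corrupted packet can be detected and resent rather than silently garbling the whole message.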
One of the necessary parts of a network is a host. A host is a physical node that is directly connected to the Internet and “directs traffic” by routing packets of data to and from other computers connected to it. In a normal network, a specific computer is usually not directly connected to the Internet; it is connected through a host. A host in this case is identified by an Internet Protocol, or IP, address (a concept that is explained in greater detail later). Each unique IP address refers to a single location on the global Internet, but that IP address can serve as a gateway for many different computers. For example, a college campus may have one global IP address for all of its students’ computers, and each student’s computer might then have its own local IP address on the school’s network. This nested structure allows billions of different global hosts, each with any number of computers connected within their internal networks. Think of a campus postal system: All students share the same global address (1000 College Drive, Anywhere, VT 08759, for example), but they each have an internal mailbox within that system.
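The campus analogy can be sketched as a simple lookup table, with one public address acting as a gateway for many internal machines. The addresses and machine names below are hypothetical (203.0.113.x is a reserved documentation range and 192.168.x.x a private one), and the routing logic is deliberately simplified.

```python
# One public address serves as a gateway for many internal machines,
# much like a campus mailroom with one street address.
campus_gateway = {
    "public_ip": "203.0.113.10",  # the campus's single global address
    "internal": {
        "192.168.1.11": "alice-laptop",
        "192.168.1.12": "bob-desktop",
    },
}

def deliver(dest_public_ip, internal_addr, gateway):
    """Route a packet addressed to the campus to the right internal machine.

    Returns the machine's name, or None if the packet is not for this
    gateway or names an unknown internal address.
    """
    if dest_public_ip != gateway["public_ip"]:
        return None
    return gateway["internal"].get(internal_addr)

print(deliver("203.0.113.10", "192.168.1.11", campus_gateway))  # alice-laptop
```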
The early Internet was called ARPANET, after the U.S. Advanced Research Projects Agency (which added “Defense” to its name and became DARPA in 1973), and consisted of just four hosts: UCLA, Stanford, UC Santa Barbara, and the University of Utah. Now there are over half a million hosts, and each of those hosts likely serves thousands of people.Central Intelligence Agency, “Country Comparison: Internet Hosts,” World Factbook, https://www.cia.gov/library/publications/the-world-factbook/rankorder/2184rank.html. Each host uses protocols to connect to an ever-growing network of computers. Because of this, the Internet does not exist in any one place in particular; rather, it is the name we give to the huge network of interconnected computers that collectively form the entity that we think of as the Internet. The Internet is not a physical structure; it is the protocols that make this communication possible.
Figure 11.3

A TCP gateway is like a post office because of the way that it directs information to the correct location.
One of the other core components of the Internet is the Transmission Control Protocol (TCP) gateway. Proposed in a 1974 paper, the TCP gateway acts “like a postal service.”Vinton Cerf, Yogen Dalal, and Carl Sunshine, “Specification of Internet Transmission Control Program,” December 1974, http://tools.ietf.org/html/rfc675. Without knowing a specific physical address, any computer on the network can ask for the owner of any IP address, and the TCP gateway will consult its directory of IP address listings to determine exactly which computer the requester is trying to contact. The development of this technology was an essential building block in the interlinking of networks, as computers could now communicate with each other without knowing the specific address of a recipient; the TCP gateway would figure it all out. In addition, the TCP gateway checks for errors and ensures that data reaches its destination uncorrupted. Today, this combination of TCP gateways and IP addresses is called TCP/IP and is essentially a worldwide phone book for every host on the Internet.
Email has, in one sense or another, been around for quite a while. Originally, electronic messages were recorded within a single mainframe computer system. Each person working on the computer would have a personal folder, so sending that person a message required nothing more than creating a new document in that person’s folder. It was just like leaving a note on someone’s desk,Ian Peter, “The History of Email,” The Internet History Project, 2004, http://www.nethistory.info/History%20of%20the%20Internet/email.html. so that the person would see it when he or she logged onto the computer.
However, once networks began to develop, things became slightly more complicated. Computer programmer Ray Tomlinson is credited with inventing the naming system we have today, using the @ symbol to denote the server (or host, from the previous section). In other words, name@gmail.com tells the host “gmail.com” (Google’s email server) to drop the message into the folder belonging to “name.” Tomlinson is credited with writing the first network email using his program SNDMSG in 1971. This invention of a simple standard for email is often cited as one of the most important factors in the rapid spread of the Internet, and is still one of the most widely used Internet services.
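Tomlinson's convention amounts to a simple rule: everything after the @ names the server, and everything before it names the mailbox. A minimal sketch in Python (`route_message` is an invented helper name, not part of any mail software):

```python
def route_message(address: str):
    """Split an address at the @ into a mailbox name and a host."""
    name, _, host = address.partition("@")
    if not name or not host:
        raise ValueError(f"not a valid address: {address!r}")
    return {"host": host, "mailbox": name}

print(route_message("name@gmail.com"))
# {'host': 'gmail.com', 'mailbox': 'name'}
```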
The use of email grew in large part because of later commercial developments, especially America Online, that made connecting to email much easier than it had been at its inception. Internet service providers (ISPs) packaged email accounts with Internet access, and almost all web browsers (such as Netscape, discussed later in the section) included a form of email service. In addition to the ISPs, email services like Hotmail and Yahoo! Mail provided free email addresses paid for by small text ads at the bottom of every email message sent. These free “webmail” services soon expanded to comprise a large part of the email services that are available today. Far from the original maximum inbox sizes of a few megabytes, today’s email services, like Google’s Gmail service, generally provide gigabytes of free storage space.
Email has revolutionized written communication. The speed and relatively inexpensive nature of email makes it a prime competitor of postal and delivery services—including FedEx and UPS—that pride themselves on speed. Communicating via email with someone on the other side of the world is just as quick and inexpensive as communicating with a next-door neighbor. However, the growth of Internet shopping and online companies such as Amazon.com has in many ways made the postal service and shipping companies more prominent—not necessarily for communication, but for delivery and remote business operations.
In 1989, Tim Berners-Lee, a graduate of Oxford University and software engineer at CERN (the European particle physics laboratory), had the idea of using a new kind of protocol to share documents and information throughout the local CERN network. Instead of transferring regular text-based documents, he created a new language called hypertext markup language (HTML). Hypertext was a new word for text that goes beyond the boundaries of a single document. Hypertext can include links to other documents (hyperlinks), text-style formatting, images, and a wide variety of other components. The basic idea is that documents can be constructed out of a variety of links and can be viewed just as if they were on the user’s computer.
This new language required a new communication protocol so that computers could interpret it, and Berners-Lee decided on the name hypertext transfer protocol (HTTP). Through HTTP, hypertext documents can be sent from computer to computer and can then be interpreted by a browser, which turns the HTML files into readable web pages. The browser that Berners-Lee created, called World Wide Web, was a combination browser-editor, allowing users to view other HTML documents and create their own.Tim Berners-Lee, “The WorldWideWeb Browser,” 2009, http://www.w3.org/People/Berners-Lee/WorldWideWeb.
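The essence of hypertext, documents containing links to other documents, can be demonstrated with Python's standard `html.parser` module, which walks an HTML document tag by tag. The sample page below is invented for illustration, though http://info.cern.ch/ really was the address of the first website.

```python
from html.parser import HTMLParser

# A tiny hypertext document: ordinary text plus a link to another document.
page = ('<html><body><p>Read the '
        '<a href="http://info.cern.ch/">first website</a>.</p></body></html>')

class LinkFinder(HTMLParser):
    """Collect the target of every hyperlink (<a href=...>) in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

finder = LinkFinder()
finder.feed(page)
print(finder.links)  # ['http://info.cern.ch/']
```

A browser does exactly this kind of work at a larger scale: it parses the HTML it receives over HTTP and turns tags like `<a>` into clickable links to other documents.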
Figure 11.4

Tim Berners-Lee’s first web browser was also a web page editor.
Modern browsers, like Microsoft Internet Explorer and Mozilla Firefox, only allow for the viewing of web pages; other increasingly complicated tools are now marketed for creating web pages, although even the most complicated page can be written entirely from a program like Windows Notepad. The reason web pages can be created with the simplest tools is the adoption of certain protocols by the most common browsers. Because Internet Explorer, Firefox, Apple Safari, Google Chrome, and other browsers all interpret the same code in more or less the same way, creating web pages is as simple as learning how to speak the language of these browsers.
In 1991, the same year that Berners-Lee created his web browser, the Internet connection service Q-Link was renamed America Online, or AOL for short. This service would eventually grow to employ over 20,000 people, on the basis of making Internet access available (and, critically, simple) for anyone with a telephone line. Although the web in 1991 was not what it is today, AOL’s software allowed its users to create communities based on just about any subject, and it only required a dial-up modem—a device that connects any computer to the Internet via a telephone line—and the telephone line itself.
In addition, AOL incorporated two technologies—chat rooms and Instant Messenger—into a single program (along with a web browser). Chat rooms allowed many users to type live messages to a “room” full of people, while Instant Messenger allowed two users to communicate privately via text-based messages. The most important aspect of AOL was its encapsulation of all these once-disparate programs into a single user-friendly bundle. Although AOL was later disparaged for customer service issues like its users’ inability to deactivate their service, its role in bringing the Internet to mainstream users was instrumental.Tom Zeller, Jr., “Canceling AOL? Just Offer Your Firstborn,” New York Times, August 29, 2005, http://www.nytimes.com/2005/08/29/technology/29link.html.
In contrast to AOL’s proprietary services, the World Wide Web had to be viewed through a standalone web browser. The first of these browsers to make its mark was the program Mosaic, released by the National Center for Supercomputing Applications at the University of Illinois. Mosaic was offered for free and grew very quickly in popularity due to features that now seem integral to the web. Things like bookmarks, which allow users to save the location of particular pages without having to remember them, and images, now an integral part of the web, were all inventions that made the web more usable for many people.National Center for Supercomputing Applications, “About NCSA Mosaic,” 2010, http://www.ncsa.illinois.edu/Projects/mosaic.html.
Although the web browser Mosaic has not been updated since 1997, developers who worked on it went on to create Netscape Navigator, an extremely popular browser during the 1990s. AOL later bought the Netscape company, and the Navigator browser was discontinued in 2008, largely because Netscape Navigator had lost the market to Microsoft’s Internet Explorer web browser, which came preloaded on Microsoft’s ubiquitous Windows operating system. However, Netscape had long been converting its Navigator software into an open-source program called Mozilla Firefox, which is now the second-most-used web browser on the Internet (detailed in Table 11.1).NetMarketShare, “Browser Market Share,” http://marketshare.hitslink.com/browser-market-share.aspx?qprid=0&qpcal=1&qptimeframe=M&qpsp=132. Firefox represents about a quarter of the market—not bad, considering its lack of advertising and Microsoft’s natural advantage of packaging Internet Explorer with the majority of personal computers.
Table 11.1 Browser Market Share (as of February 2010)
| Browser | Total Market Share |
|---|---|
| Microsoft Internet Explorer | 62.12% |
| Firefox | 24.43% |
| Chrome | 5.22% |
| Safari | 4.53% |
| Opera | 2.38% |
As web browsers became more available as a less-moderated alternative to AOL’s proprietary service, the web became something like a free-for-all of startup companies. The web of this period, often referred to as Web 1.0, featured many specialty sites that used the Internet’s capacity for global, instantaneous communication to create new types of business. Another name for this free-for-all of the 1990s is the “dot-com boom.” During the boom, it seemed as if almost anyone could build a website and sell it for millions of dollars; however, the “dot-com crash” later that decade proved otherwise, as quite a few of these Internet startup companies went bankrupt, taking their shareholders down with them. Alan Greenspan, then the chairman of the U.S. Federal Reserve, called this phenomenon “irrational exuberance,”Alan Greenspan, “The Challenge of Central Banking in a Democratic Society,” (lecture, American Enterprise Institute for Public Policy Research, Washington, DC, December 5, 1996), http://www.federalreserve.gov/boarddocs/speeches/1996/19961205.htm. in large part because investors did not know how to analyze these new business plans: companies that had never turned a profit could be sold for millions, and once investors realized their mistake (and the companies went bankrupt), much of the recent market growth evaporated. The new business models of the Internet may have done well in the stock market, but they were not necessarily sustainable. The invention of new technologies can bring with it the belief that old business tenets no longer apply, but this dangerous belief—the “irrational exuberance” Greenspan spoke of—is not conducive to long-term growth.
Some lucky dot-com businesses formed during the boom survived the crash and are still around today. For example, eBay, with its online auctions, turned what seemed like a dangerous practice (sending money to a stranger you met over the Internet) into a daily occurrence. A less-fortunate company, eToys.com, got off to a promising start—its stock quadrupled on the day it went public in 1999—but then filed for bankruptcy in 2001.Cecily Barnes, “eToys files for Chapter 11,” CNET, March 7, 2001, http://news.cnet.com/2100-1017-253706.html.
One of these startups, theGlobe.com, provided one of the earliest social networking services that exploded in popularity. When theGlobe.com went public, its stock shot from a target price of $9 to a close of $63.50 a share.Dawn Kawamoto, “TheGlobe.com’s IPO one for the books,” CNET, November 13, 1998, http://news.cnet.com/2100-1023-217913.html. The site itself was started in 1995, building its business on advertising. As skepticism about the dot-com boom grew and advertisers became increasingly skittish about the value of online ads, theGlobe.com ceased to be profitable and shut its doors as a social networking site.theglobe.com, “About Us,” 2009, http://www.theglobe.com/. Although advertising is pervasive on the Internet today, the current model—largely based on the highly targeted Google AdSense service—did not come around until much later. In the earlier dot-com years, the same ad might be shown on thousands of different web pages, whereas now advertising is often specifically targeted to the content of an individual page.
However, that did not spell the end of social networking on the Internet. Social networking had been going on since at least the invention of Usenet in 1979 (detailed later in the chapter), but the recurring problem was always the same: profitability. This model of free access to user-generated content departed from almost anything previously seen in media, and revenue streams would have to be just as radical.
The shared, generalized protocols of the Internet have allowed it to be easily adapted and extended into many different facets of our lives. The Internet shapes everything, from our day-to-day routine—the ability to read newspapers from around the world, for example—to the way research and collaboration are conducted. There are three important aspects of communication that the Internet has changed, and these have instigated profound changes in the way we connect with one another socially: the speed of information, the volume of information, and the “democratization” of publishing, or the ability of anyone to publish ideas on the web.
One of the Internet’s largest and most revolutionary changes has come about through social networking. Because of Twitter, we can now see what all our friends are doing in real time; because of blogs, we can consider the opinions of complete strangers who may never write in traditional print; and because of Facebook, we can find people we haven’t talked to for decades, all without making a single awkward telephone call.
Recent years have seen an explosion of new content and services; although the phrase “social media” now seems to be synonymous with websites like Facebook and Twitter, it is worthwhile to consider all the ways a social media platform affects the Internet experience.
Almost as soon as TCP stitched the various networks together, a former DARPA scientist named Larry Roberts founded Telenet, the first commercial packet-switching company. Two years later, in 1977, the invention of the dial-up modem (in combination with the wider availability of personal computers like the Apple II) made it possible for anyone around the world to access the Internet. With availability extended beyond purely academic and military circles, the Internet quickly became a staple for computer hobbyists.
One of the consequences of the spread of the Internet to hobbyists was the founding of Usenet. In 1979, University of North Carolina graduate students Tom Truscott and Jim Ellis connected three computers in a small network and used a series of programming scripts to post and receive messages. In a very short span of time, this system spread all over the burgeoning Internet. Much like an electronic version of community bulletin boards, anyone with a computer could post a topic or reply on Usenet.
The group was fundamentally and explicitly anarchic, as outlined by the posting “What is Usenet?” This document says, “Usenet is not a democracy … there is no person or group in charge of Usenet … Usenet cannot be a democracy, autocracy, or any other kind of ‘-acy.’”Mark Moraes, Chip Salzenberg, and Gene Spafford, “What is Usenet?” December 28, 1999, http://www.faqs.org/faqs/usenet/what-is/part1/. Usenet was not used only for socializing, however, but also for collaboration. In some ways, the service allowed a new kind of collaboration that seemed like the start of a revolution: “I was able to join rec.kites and collectively people in Australia and New Zealand helped me solve a problem and get a circular two-line kite to fly,” one user told the United Kingdom’s Guardian.Simon Jeffery and others, “A People’s History of the Internet: From Arpanet in 1969 to Today,” Guardian (London), October 23, 2009, http://www.guardian.co.uk/technology/interactive/2009/oct/23/internet-arpanet.
Fast-forward to 1995: The president and founder of Beverly Hills Internet, David Bohnett, announces that the name of his company is now “GeoCities.” GeoCities built its business by allowing users (“homesteaders”) to create web pages in “communities” for free, with the stipulation that the company placed a small advertising banner at the top of each page. Anyone could register a GeoCities site and subsequently build a web page about a topic. Almost all of the community names, like Broadway (live theater) and Athens (philosophy and education), were centered on specific topics.While GeoCities is no longer in business, the Internet Archive maintains the site at http://www.archive.org/web/geocities.php. Information taken from December 21, 1996.
This idea of centering communities on specific topics may have come from Usenet. In Usenet, the domain alt.rec.kites refers to a specific topic (kites) within a category (recreation) within a larger community (alternative topics). This hierarchical model allowed users to organize themselves across the vastness of the Internet, even on a large site like GeoCities. The difference with GeoCities was that it allowed users to do much more than post only text (the limitation of Usenet), while constraining them to a relatively small pool of resources. Although each GeoCities user had only a few megabytes of web space, standardized pictures—like mailbox icons and back buttons—were hosted on GeoCities’s main server. GeoCities was such a large part of the Internet, and these standard icons were so ubiquitous, that they have now become a veritable part of the Internet’s cultural history. The Web Elements category of the site Internet Archaeology is a good example of how pervasive GeoCities graphics became.Internet Archaeology, 2010, http://www.internetarchaeology.org/swebelements.htm.
GeoCities built its business on a freemium model, where basic services are free but subscribers pay extra for things like commercial pages or shopping carts. Other Internet businesses, like Skype and Flickr, use the same model to keep a vast user base while still profiting from frequent users. Since loss of online advertising revenue was seen as one of the main causes of the dot-com crash, many current web startups are turning toward this freemium model to diversify their income streams.Claire Cain Miller, “Ad Revenue on the Web? No Sure Bet,” New York Times, May 24, 2009, http://www.nytimes.com/2009/05/25/technology/start-ups/25startup.html.
GeoCities’s model was so successful that the company Yahoo! bought it for $3.6 billion at its peak in 1999. At the time, GeoCities was the third-most-visited site on the web (behind Yahoo! and AOL), so it seemed like a sure bet. A decade later, on October 26, 2009, Yahoo! closed GeoCities for good in every country except Japan.
Diversification of revenue has become one of the most crucial elements of Internet businesses; from the Wall Street Journal online to YouTube, almost every website is now looking for multiple income streams to support its services.
Websites have many different ways of paying for themselves, and this can say a lot about both the site and its audience. The business models of today’s websites may also directly reflect the lessons learned during the early days of the Internet. Start this exercise by reviewing a list of common ways that websites pay for themselves, how they arrived at these methods, and what it might say about them:
Choose a website that you visit often, and list which of these revenue streams the site might have. How might this affect the content on the site? Is there a visible effect, or does the site try to hide it? Consider how events during the early history of the Internet may have affected the way the site operates now. Write down a revenue stream that the site does not currently have and how the site designers might implement such a revenue stream.
Although GeoCities lost market share, and although theGlobe.com never really made it to the 21st century, social networking (a website that provides a way for people to interact with each other, often through “friends” and some method of communication, whether photos, videos, or text) has persisted. There are many different types of social media (a blanket term for any social networking service—person-to-person connections on the Internet, unlike television, radio, or newspaper) available today, from social networking sites like Facebook to blogging services like Blogger and WordPress.com. All these sites bring something different to the table, and a few of them even try to bring just about everything to the table at once.
Social networking services—like Facebook, Twitter, LinkedIn (a social networking service that caters to business professionals looking for networking opportunities), Google Buzz (Google’s social networking service, built into its Gmail service), and MySpace—provide a limited but public platform for users to create a “profile.” This can range anywhere from the 140-character (that’s letters and spaces, not words) “tweets” on Twitter, to the highly customizable MySpace, which allows users to blog, customize color schemes, add background images, and play music. Each of these services has its key demographic—MySpace, for example, is particularly geared toward younger users. Its huge array of features made it attractive to this demographic at first, but eventually it was overrun with corporate marketing and solicitations for pornographic websites, leading many users to abandon the service. In addition, competing social networking sites like Facebook offer superior interfaces that have lured away many of MySpace’s users. MySpace has attempted to catch up by upgrading its own interface, but it now faces the almost insurmountable obstacle of already-satisfied users of competing social networking services. As Internet technology evolves rapidly, most users have few qualms about moving to whichever site offers the better experience; most users have profiles and accounts on many services at once. But as relational networks become more established and concentrated on a few social media sites, it becomes increasingly difficult for newcomers and lagging challengers to offer the same rich networking experience. For a Facebook user with hundreds of friends in his or her social network, switching to MySpace and bringing along his or her entire network of friends is a daunting and infeasible prospect.
Google has attempted to circumvent the problem of luring users to create new social networks by building its Buzz service into its popular Gmail, ensuring that Buzz has a built-in user base and lowering the social costs of joining a new social network by leveraging users’ Gmail contact lists. It remains to be seen if Google will be truly successful in establishing a vital new social networking service, but its tactic of integrating Buzz into Gmail underscores how difficult it has become to compete with established social networks like Twitter and Facebook.
Whereas MySpace initially catered to a younger demographic, LinkedIn caters to business professionals looking for networking opportunities. LinkedIn is free to join and allows users to post resumes and job qualifications (rather than astrological signs and favorite television shows). Its tagline, “Relationships matter,” emphasizes the role of an increasingly networked world in business; just as a musician might use MySpace to promote a new band, a LinkedIn user can use the site to promote professional services. While these two sites have basically the same structure, they fulfill different purposes for different social groups; the character of social networking is highly dependent on the type of social circle.
Twitter offers a different approach to social networking, allowing users to “tweet” 140-character messages to their “followers,” making it something of a hybrid of instant messaging and blogging. Twitter is openly searchable, meaning that anyone can visit the site and quickly find out what other Twitter users are saying about any subject. Twitter has proved useful for journalists reporting on breaking news, as well as highlighting the “best of” the Internet. Twitter has also been useful for marketers looking for a free public forum to disseminate marketing messages. It became profitable in December 2009 through a $25 million deal allowing Google and Microsoft to display its users’ 140-character messages in their search results.Eliot Van Buskirk, “Twitter Earns First Profit Selling Search to Google, Microsoft,” Wired, December 21, 2009, http://www.wired.com/epicenter/2009/12/twitter-earns-first-profit-selling-search-to-google-microsoft. Facebook, originally deployed exclusively to Ivy League schools, has since opened its doors to anyone over 13 with an email account. With the explosion of the service and its huge growth among older demographics, “My parents joined Facebook” has become a common complaint.See the blog at http://myparentsjoinedfacebook.com/ for examples on the subject.
Blogs, another category of social media, began as an online, public version of a diary or journal. Short for “web logs,” these personal sites give anyone a platform to write about anything they want to. Posting tweets on the Twitter service is considered micro-blogging (because of the extremely short length of the posts). Some services, like LiveJournal, highlight their ability to provide up-to-date reports on personal feelings, even going so far as to add a “mood” shorthand at the end of every post. The Blogger service (now owned by Google) allows users with Google accounts to follow friends’ blogs and post comments. WordPress.com, the company that created the open-source blogging platform WordPress.org, and LiveJournal both follow the freemium model by allowing a basic selection of settings for free, with the option to pay for things like custom styles and photo hosting space. What these all have in common, however, is their bundling of social networking (such as the ability to easily link to and comment on friends’ blogs) with an expanded platform for self-expression. At this point, most traditional media companies have incorporated blogs, Twitter, and other social media as a way to allow their reporters to update instantly and often. This form of media convergence, discussed in detail elsewhere in this chapter, is now a necessary part of doing business.
There are many other types of social media out there, many of which can be called to mind with a single name: YouTube (video sharing), Wikipedia (open-source encyclopedia composed of “wikis” editable by any user), Flickr (photo sharing), and Digg (content sharing). Traditional media outlets have begun referring to these social media services and others like them as “Web 2.0.” Web 2.0 is not a new version of the web; rather, the term is a reference to the increased focus on user-generated content and social interaction on the web, as well as the evolution of online tools to facilitate that focus. Instead of relying on professional reporters to get information about a protest in Iran, a person could just search for “Iran” on Twitter and likely end up with hundreds of tweets linking to everything from blogs to CNN.com to YouTube videos from Iranian citizens themselves. In addition, many of these tweets may actually be instant updates from people using Twitter in Iran. This allows people to receive information straight from the source, without being filtered through news organizations or censored by governments.
In 2009, Susan Boyle, an unemployed middle-aged Scottish woman, appeared on Britain’s Got Talent and sang “I Dreamed a Dream” from the musical Les Misérables, becoming an international star almost overnight. It was not the broadcast performance itself that catapulted her to fame, sent her subsequently released album to the top of the UK charts, and kept it there for six weeks; it was the YouTube video of her performance, viewed by 87 million people and counting.BritainsSoTalented, “Susan Boyle - Singer - Britains Got Talent 2009,” 2009, http://www.youtube.com/watch?v=9lp0IWv8QZY.
Figure 11.5

Susan Boyle turned from a successful television contestant into an international celebrity when the YouTube video of her performance went viral.
Media that is spread from person to person—when, for example, a friend sends you a link saying “You’ve got to see this!”—is said to have “gone viral” (that is, emailed or sent from person to person without any direction from a mainstream source). Marketing and advertising agencies call advertising that makes use of this phenomenon “viral marketing” (an attempt by marketers to produce things that “go viral” in order to build hype for a product). Yet many YouTube sensations have not come from large marketing firms. For instance, the four-piece pop-punk band OK Go filmed a music video on a tiny budget for their song “Here It Goes Again” and released it exclusively on YouTube in 2006. Featuring a choreographed dance done on eight separate treadmills, the video quickly became a viral sensation and, as of May 2011, had over 7 million views. The video helped OK Go attract millions of new fans and earned them a Grammy Award in 2007, making it one of the most notable successes of viral Internet marketing. Viral marketing is, however, notoriously unpredictable and liable to spawn remixes, spin-offs, and spoofs that can dilute or damage the messages that marketers intend to spread. Yet, when it is successful, viral marketing can reach millions of people for very little money and can even make it into the mainstream news.
Recent successes and failures in viral marketing demonstrate how difficult it is for marketers to control their message as it is unleashed virally. In 2007, the band Radiohead released their album In Rainbows online, allowing fans to download it for any amount of money they chose—including for free. Despite practically giving the album away, the digital release of In Rainbows still pulled in more money than Radiohead’s previous album, Hail to the Thief, while the band simultaneously sold a huge number of $80 collector editions and still sold physical CDs months after the digital release became available.New Musical Express, “Radiohead Reveal How Successful ‘In Rainbows’ Download Really Was,” October 15, 2008, http://www.nme.com/news/radiohead/40444. In contrast, the food giant Healthy Choice enlisted Classymommy.com blogger Colleen Padilla to write a sponsored review of its product, leading to a featured New York Times article on the blogger (not the product), which gave the product only a passing mention.Pradnya Joshi, “Approval by a Blogger May Please a Sponsor,” New York Times, July 12, 2009, http://www.nytimes.com/2009/07/13/technology/internet/13blog.html. Often, a successfully marketed product will reach some people through the Internet and then break through into the mainstream media. Yet as the article about Padilla shows, sometimes the person writing about the product overshadows the product itself.
Not all viral media is marketing, however. In 2007, someone posted a link to a new trailer for Grand Theft Auto IV on the video games message board of the web forum 4chan.org. When users followed the link, they were greeted not with a video game trailer but with Rick Astley singing his 1987 hit “Never Gonna Give You Up.” This technique—redirecting someone to that particular music video—became known as Rickrolling and quickly became one of the most well-known Internet memes (any catchphrase, video, or piece of media that becomes a cultural symbol on the Internet) of all time.Fox News, “The Biggest Little Internet Hoax on Wheels Hits Mainstream,” April 22, 2008, http://www.foxnews.com/story/0,2933,352010,00.html. An Internet meme is a concept that quickly replicates itself throughout the Internet, and it is often nonsensical and absurd. Another meme, “Lolcats,” consists of misspelled captions—“I can has cheezburger?” is a classic example—over pictures of cats. Often, these memes take on a metatextual quality, such as the meme “Milhouse is not a meme,” in which the character Milhouse (from the television show The Simpsons) is told that he is not a meme. Chronicling memes is notoriously difficult, because they typically spring into existence seemingly overnight, propagate rapidly, and disappear before ever making it onto the radar of mainstream media—or even the mainstream Internet user.
Social media allows an unprecedented volume of personal, informal communication in real time from anywhere in the world. It allows users to keep in touch with friends on other continents, yet keeps the conversation as casual as a Facebook wall post. In addition, blogs allow us to gauge a wide variety of opinions and have given “breaking news” a whole new meaning. Now, news can be distributed through many major outlets almost instantaneously, and different perspectives on any one event can be aired concurrently. In addition, news organizations can harness bloggers as sources of real-time news, in effect outsourcing some of their news-gathering efforts to bystanders on the scene. This practice of harnessing the efforts of several individuals online to solve a problem is known as crowdsourcing.
The downside of the seemingly infinite breadth of online information is that there is often not much depth to the coverage of any given topic. The superficiality of information on the Internet is a common gripe among many journalists who are now rushed to file news reports several times a day in an effort to compete with the “blogosphere,” or the crowd of bloggers who post both original news stories and aggregate previously published news from other sources. Whereas traditional print organizations at least had the “luxury” of the daily print deadline, now journalists are expected to blog or tweet every story and file reports with little or no analysis, often without adequate time to confirm the reliability of their sources.Ken Auletta, “Non-Stop News,” Annals of Communications, New Yorker, January 25, 2010, http://www.newyorker.com/reporting/2010/01/25/100125fa_fact_auletta.
Additionally, news aggregators like Google News (services that aggregate stories from major professional news sources and present them in a streamlined format) profit from linking to journalists’ stories at major newspapers and selling advertising, but these profits are not shared with the news organizations and journalists who created the stories. It is often difficult for journalists to keep up with the immediacy of the nonstop news cycle, and with revenues for their efforts being diverted to news aggregators, journalists and news organizations increasingly lack the resources to keep up this fast pace. Twitter presents a similar problem: Instead of getting news from a specific newspaper, many people simply read the articles that are linked from a Twitter feed. As a result, the news cycle leaves journalists no time for analysis or cross-examination. Increasingly, they will simply report, for example, on what a politician or public relations representative says without following up on these comments or fact-checking them. This further shortens the news cycle and makes it much easier for journalists to be exploited as the mouthpieces of propaganda.
Consequently, the very presence of blogs and their seeming importance even among mainstream media has made some critics wary. Internet entrepreneur Andrew Keen is one such critic, and his book The Cult of the Amateur follows up on the famous thought experiment suggesting that infinite monkeys, given infinite typewriters, will one day randomly produce a great work of literature:Proposed by T. H. Huxley (the grandfather of Aldous Huxley), this thought experiment suggests that infinite monkeys given infinite typewriters would, given infinite time, eventually write Hamlet. “In our Web 2.0 world, the typewriters aren’t quite typewriters, but rather networked personal computers, and the monkeys aren’t quite monkeys, but rather Internet users.”Andrew Keen, The Cult of the Amateur: How Today’s Internet Is Killing Our Culture (New York: Doubleday, 2007). Keen also suggests that the Internet is really just a case of my-word-against-yours, where bloggers are not required to back up their arguments with credible sources (generally, any source with authorial or editorial backing; anonymous sources, or people who do not give their real names, are often not credible unless they are vouched for by a known credible source). “These days, kids can’t tell the difference between credible news by objective professional journalists and what they read on [a random website].”Andrew Keen, The Cult of the Amateur: How Today’s Internet Is Killing Our Culture (New York: Doubleday, 2007). Follow Keen on Twitter: http://twitter.com/ajkeen. Commentators like Keen worry that this trend will lead to young people’s inability to distinguish credible information from a mass of sources, eventually leading to a sharp decrease in credible sources of information.
For defenders of the Internet, this argument seems a bit overwrought: “A legitimate interest in the possible effects of significant technological change in our daily lives can inadvertently dovetail seamlessly into a ‘kids these days’ curmudgeonly sense of generational degeneration, which is hardly new.”Greg Downey, “Is Facebook Rotting Our Children’s Brains?” Neuroanthropology.net, March 2, 2009, http://neuroanthropology.net/2009/03/02/is-facebook-rotting-our-childrens-brains/. Greg Downey, who runs the collaborative blog Neuroanthropology, says that fear of kids on the Internet—and on social media in particular—can slip into “a ‘one-paranoia-fits-all’ approach to technological change.” For the argument that online experiences are “devoid of cohesive narrative and long-term significance,” Downey offers that, on the contrary, “far from evacuating narrative, some social networking sites might be said to cause users to ‘narrativize’ their experience, engaging with everyday life already with an eye toward how they will represent it on their personal pages.”
Another argument in favor of social media counters the warning that time spent on social networking sites is destroying the social skills of young people. “The debasement of the word ‘friend’ by [Facebook’s] use of it should not make us assume that users can’t tell the difference between friends and Facebook ‘friends,’” writes Downey. On the contrary, social networks (like the Usenet of the past) can even provide a place for people with more obscure interests to meet one another and share commonalities. In addition, marketing through social media is essentially free, making it a valuable tool for small businesses with tight marketing budgets. A community theater can invite all of its “fans” to a new play for less money than putting an ad in the newspaper, and this direct invitation is far more personal and specific. Many people see services like Twitter, with its “followers,” as more semantically appropriate than the “friends” found on Facebook and MySpace, and because of this Twitter has, in many ways, changed yet again the way social media is conceived. Rather than connecting with “friends,” Twitter allows social media to be purely a source of information, thereby making it far more appealing to adults. In addition, while 140 characters may seem like a constraint to some, it can be remarkably useful to the time-strapped user looking to catch up on recent news.
Social media’s detractors also point to the sheer banality of much of the conversation on the Internet. Again, Downey keeps this in perspective: “The banality of most conversation is also pretty frustrating,” he says. Downey suggests that many of the young people using social networking tools see them as just another aspect of communication. However, Downey warns that online bullying has the potential to pervade larger social networks while shielding perpetrators through anonymity.
Another downside of many of the Internet’s segmented communities is that users tend to be exposed only to information they are interested in and opinions they agree with. This lack of exposure to novel ideas and contrary opinions can create or reinforce a lack of understanding among people with different beliefs, and make political and social compromise more difficult to come by.
While the situation may not be as dire as Keen suggests in his book, there are clearly some important arguments to consider regarding the effects of the web and social media in particular. The main concerns come down to two things: the possibility that the volume of amateur, user-generated content online is overshadowing better-researched sources, and the questionable ability of users to tell the difference between the two.
Although Facebook began at Harvard University and quickly became popular among the Ivy League colleges, the social network has since been lambasted as a distraction for students. Instead of studying, the argument goes, students will sit in the library and browse Facebook, messaging their friends and getting nothing done. Two doctoral candidates, Aryn Karpinski (Ohio State University) and Adam Duberstein (Ohio Dominican University), studied the effects of Facebook use on college students and found that students who use Facebook generally earn grades a half point lower on the GPA scale than students who do not.Anita Hamilton, “What Facebook Users Share: Lower Grades,” Time, April 14, 2009, http://www.time.com/time/business/article/0,8599,1891111,00.html. Correlation does not imply causation, though, as Karpinski noted that Facebook users may just be “prone to distraction.”
On the other hand, students’ access to technology and the Internet may allow them to pursue their education to a greater degree than they could otherwise. At a school in Arizona, students are issued laptops instead of textbooks, and some of their school buses have Wi-Fi Internet access. As a result, bus rides, including the long trips that are often a requirement of high school sports, are spent studying. Of course, the students had laptops long before their bus rides were connected to the Internet, but the Wi-Fi technology has “transformed what was often a boisterous bus ride into a rolling study hall.”Sam Dillon, “Wi-Fi Turns Rowdy Bus Into Rolling Study Hall,” New York Times, February 11, 2010, http://www.nytimes.com/2010/02/12/education/12bus.html. Even though not all students studied all the time, enabling students to work on bus rides fulfilled the school’s goal of extending the educational hours beyond the usual 8 to 3.
Social networking provides unprecedented ways to keep in touch with friends, but that ability can sometimes be a double-edged sword. Users can update friends with every latest achievement—“[your name here] just won three straight games of solitaire!”—but may also unwittingly be updating bosses and others from whom particular bits of information should be hidden.
The erosion of privacy online has been accelerated by social networks, and for a surprising reason: conscious decisions made by participants. Putting personal information online—even if it is set to be viewed by only select friends—has become fairly standard. Dr. Kieron O’Hara studies privacy in social media and calls this era “intimacy 2.0,”Zoe Kleinman, “How Online Life Distorts Privacy Rights for All,” BBC News, January 8, 2010, http://news.bbc.co.uk/2/hi/technology/8446649.stm. a riff on the buzzword “Web 2.0.” One of O’Hara’s arguments is that legal issues of privacy are based on what is called a “reasonable standard.” According to O’Hara, the excessive sharing of personal information on the Internet by some constitutes an offense to the privacy of all, because it lowers the “reasonable standard” that can be legally enforced. In other words, as cultural tendencies toward privacy degrade on the Internet, it affects not only the privacy of those who choose to share their information, but also the privacy of those who do not.
With over 500 million users, it is no surprise that Facebook is one of the principal battlegrounds for privacy on the Internet. When Facebook updated its privacy settings in 2009, “privacy groups including the American Civil Liberties Union … [called] the developments ‘flawed’ and ‘worrisome,’” reported The Guardian in late 2009.Bobbie Johnson, “Facebook Privacy Change Angers Campaigners,” Guardian (London), December 10, 2009, http://www.guardian.co.uk/technology/2009/dec/10/facebook-privacy.
Mark Zuckerberg, the founder of Facebook, discusses privacy issues on a regular basis in forums ranging from his official Facebook blog to conferences. At the Crunchies Awards in San Francisco in early 2010, Zuckerberg claimed that privacy was no longer a “social norm.”Bobbie Johnson, “Privacy No Longer a Social Norm, Says Facebook Founder,” Guardian (London), January 11, 2010, http://www.guardian.co.uk/technology/2010/jan/11/facebook-privacy. This statement follows from his company’s late-2009 decision to make public information sharing the default setting on Facebook. Whereas users were previously able to restrict public access to basic profile information like their names and friends, the new settings make this information publicly available with no option to make it private. Although Facebook publicly announced the changes, many outraged users first learned of the updates to the default privacy settings when they discovered—too late—that they had inadvertently broadcast private information. Facebook argues that the added complexity of the privacy settings gives users more control over their information. However, opponents counter that adding more complex privacy controls while simultaneously making public sharing the default setting for those controls is a blatant ploy to push casual users into sharing more of their information publicly—information that Facebook will then use to offer more targeted advertising.Kevin Bankston, “Facebook’s New Privacy Changes: The Good, the Bad, and the Ugly,” Deeplinks Blog, Electronic Frontier Foundation, December 9, 2009, http://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly.
Many users formed their own grassroots protest groups within Facebook, and in response to these critiques, Facebook changed its privacy policy again in May 2010, making three primary changes. First, privacy controls are simpler: instead of various controls on multiple pages, there is now one main control users can use to determine who can see their information. Second, Facebook made less information publicly available; public information is now limited to basic information, such as a user’s name and profile picture. Finally, it is now easier to block applications and third-party websites from accessing user information.Maggie Lake, “Facebook’s privacy changes,” CNN, June 2, 2010, http://www.cnn.com/video/#/video/tech/2010/05/27/lake.facebook.pr.
In a controversy similar to Facebook’s, Google’s social networking add-on to Gmail, called Buzz, automatically signed up Gmail users to “follow” the contacts they emailed most frequently. Because all of these lists were public by default, users’ most emailed contacts were made available for anyone to see. This was especially alarming for people like journalists who potentially had confidential sources exposed to a public audience. However, even though this mistake—which Google quickly corrected—created a lot of controversy around Buzz, it did not stop users from creating over 9 million posts in the first 2 days of the service.Todd Jackson, “Millions of Buzz users, and improvements based on your feedback,” Official Gmail Blog, February 11, 2010, http://gmailblog.blogspot.com/2010/02/millions-of-buzz-users-and-improvements.html. Google’s integration of Buzz into its Gmail service may have been upsetting to users not accustomed to the pitfalls of social networking, but Google’s misstep has not discouraged millions of others from trying the service, perhaps due to their experience dealing with Facebook’s ongoing issues with privacy infringement.
For example, Facebook’s old privacy settings integrated a collection of applications (written by third-party developers) that included everything from “Which American Idol Contestant Are You?” to an “Honesty Box” that allows friends to send anonymous criticism. “Allowing Honesty Box access will let it pull your profile information, photos, your friends’ info, and other content that it requires to work,” reads the disclaimer on the application installation page. The ACLU drew particular attention to the “app gap” that allowed “any quiz or application run by you to access information about you and your friends.”Nicole Ozer, “Facebook Privacy in Transition - But Where Is It Heading?” ACLU of Northern California, December 9, 2009, http://www.aclunc.org/issues/technology/blog/facebook_privacy_in_transition_-_but_where_is_it_heading.shtml. In other words, merely using someone else’s Honesty Box gave the program information about your “religion, sexual orientation, political affiliation, pictures, and groups.”Nicole Ozer, “Facebook Privacy in Transition - But Where Is It Heading?” ACLU of Northern California, December 9, 2009, http://www.aclunc.org/issues/technology/blog/facebook_privacy_in_transition_-_but_where_is_it_heading.shtml. There are many reasons that unrelated applications may want to collect this information, but one of the most prominent is, by now, a very old story: selling products. The more information a marketer has, the better he or she can target a message, and the more likely it is that the recipient will buy something.
Figure 11.6

Zynga, one of the top social game developers on Facebook, created the game FarmVille. Because FarmVille is ad-supported and gives users the option to purchase FarmVille virtual currency with actual money, the game itself is free for everyone to play.
Social media on the Internet has been around for a while, and it has always been of some interest to marketers. The ability to target advertising based on demographic information given willingly to the service—age, political preference, gender, and location—allows marketers to target advertising extremely efficiently. However, by the time Facebook’s population passed the 350-million mark, marketers were scrambling to harness social media. The increasingly difficult-to-reach younger demographic has been abandoning radio for Apple’s iPod and television for YouTube. Increasingly, marketers are turning to social networks as a way to reach these consumers. Culturally, these developments indicate a mistrust among consumers of traditional marketing techniques; marketers must now use new and more personalized ways of reaching consumers if they are going to sell their products.
The attempts of marketers to harness the viral spread of media on the Internet were discussed earlier in the chapter. Marketers try to anticipate what will “go viral,” with the goal of getting millions of YouTube views; becoming a hot topic on Google Trends, a website that measures the most frequently searched topics on the web; or even just being the subject of a post on a well-known blog. For example, Procter & Gamble sent free samples of its Swiffer dust mop to stay-at-home-mom bloggers with a large online audience. And in 2008, the movie College (or College: The Movie) used its tagline “Best.Weekend.Ever.” as the prompt for a YouTube video contest. Contestants were invited to submit videos of their best college weekend ever, and the winner received a monetary prize.Jon Hickey, “Best Weekend Ever,” 2008, http://www.youtube.com/watch?v=pldG8MdEIOA.
What these two instances of marketing have in common is that they approach people who are already doing something they enjoy doing—blogging or making movies—and give them a relatively small amount of compensation for providing advertising. This differs from traditional advertising in how marketers seek to bridge a credibility gap with consumers. Marketers have sought to bridge that gap since long before breakfast cereal slogans like “Kid Tested, Mother Approved” or “Mikey likes it” ever hit the airwaves. The difference is that now the people pushing the products can be friends or family members, all via social networks.
For instance, in 2007, a program called Beacon was launched as part of Facebook. With Beacon, a Facebook user was given the option to “share” an online purchase from partnering sites. For example, a user might buy a book from Amazon.com and check the corresponding “share” box in the checkout process, and all of his or her friends would receive a message notifying them that this person purchased and recommends this particular product. Explaining the reason for this shift in a New York Times article, Mark Zuckerberg said, “Nothing influences a person more than a trusted friend.”Louise Story, “Facebook Is Marketing Your Brand Preferences (With Your Permission),” New York Times, November 7, 2007, http://www.nytimes.com/2007/11/07/technology/07adco.html. However, many Facebook users did not want their purchasing information shared with other Facebookers, and the service was shut down in 2009 and subsequently became the subject of a class action lawsuit. Facebook’s troubles with Beacon illustrate the thin line between taking advantage of the tremendous marketing potential of social media and violating the privacy of users.
Facebook’s questionable alliance with marketers through Beacon was driven by a need to create reliable revenue streams. One of the most crucial aspects of social media is the profitability factor. In the 1990s, theGlobe.com was one of the most promising new startups, but it went under almost as quickly as it had risen, due to lack of funds. The lesson of theGlobe.com has not gone unheeded by today’s social media services. For example, Twitter sold Google and Microsoft access to its content for $25 million, making users’ tweets searchable.
Google’s Buzz is one of the most interesting services in this respect, because Google’s main business is advertising—and it is a highly successful business. Google’s search algorithms allow it to target advertising to a user’s specific tastes. As Google enters the social media world, its advertising capabilities will only be compounded as users reveal more information about themselves via Buzz. Although it does not seem that users choose their social media services based on how the services generate their revenue streams, the issue of privacy in social media is in large part an issue of how much information users are willing to share with advertisers. For example, using Google’s search engine, Buzz, Gmail, and Blogger gives that single company an immense amount of information and a historically unsurpassed ability to market to specific groups. At this relatively early stage of the fledgling online social media business—both Twitter and Facebook have only very recently turned a profit, so commerce has only just come into play—it is impossible to say whether the commerce side of things will transform the way people use the services. If the uproar over Facebook’s Beacon is any lesson, however, the relationship between social media and advertising is ripe for controversy.
The use of Facebook and Twitter in the recent political uprisings in the Middle East has brought to the fore the question of whether social media can be an effective tool for social change.
On January 14, 2011, after month-long protests against fraud, economic crisis, and lack of political freedom, the Tunisian public ousted President Zine El Abidine Ben Ali. Soon after the Tunisian rebellion, the Egyptian public expelled President Hosni Mubarak, who had ruled the country for 30 years. Nearly immediately, other Middle Eastern countries such as Algeria, Libya, Yemen, and Bahrain also erupted against their oppressive governments in the hopes of obtaining political freedom.Grace Gamba, “Facebook Topples Governments in Middle East,” Brimstone Online, March 18, 2011, http://www.gshsbrimstone.com/news/2011/03/18/facebook-topples-governments-in-middle-east.
What is common among all these uprisings is the role played by social media. In nearly all of these countries, restrictions were imposed on the media and resistance to the government was brutally discouraged.Peter Beaumont, “Can Social Networking Overthrow a Government?” Morning Herald (Sydney), February 25, 2011, http://www.smh.com.au/technology/technology-news/can-social-networking-overthrow-a-government-20110225-1b7u6.html. This seems to have inspired people across the Middle East to organize online to rebel against tyrannical rule.Chris Taylor, “Why Not Call It a Facebook Revolution?” CNN, February 24, 2011, http://edition.cnn.com/2011/TECH/social.media/02/24/facebook.revolution. Protesters used social media not only to organize against their governments but also to share their struggles with the rest of the world.Grace Gamba, “Facebook Topples Governments in Middle East,” Brimstone Online, March 18, 2011, http://www.gshsbrimstone.com/news/2011/03/18/facebook-topples-governments-in-middle-east.
In Tunisia, protesters filled the streets by sharing information on Twitter.Chris Taylor, “Why Not Call It a Facebook Revolution?” CNN, February 24, 2011, http://edition.cnn.com/2011/TECH/social.media/02/24/facebook.revolution. Egypt’s protests were organized on Facebook pages. Details of the demonstrations were circulated by both Facebook and Twitter, and email was used to distribute the activists’ guide to challenging the regime.Peter Beaumont, “Can Social Networking Overthrow a Government?” Morning Herald (Sydney), February 25, 2011, http://www.smh.com.au/technology/technology-news/can-social-networking-overthrow-a-government-20110225-1b7u6.html. Libyan dissenters spread the word about their demonstrations in similar ways.Chris Taylor, “Why Not Call It a Facebook Revolution?” CNN, February 24, 2011, http://edition.cnn.com/2011/TECH/social.media/02/24/facebook.revolution.
Owing to the role played by Twitter and Facebook in helping protesters organize and communicate with each other, many have termed these rebellions “Twitter Revolutions”Evgeny Morozov, “How Much Did Social Media Contribute to Revolution in the Middle East?” Bookforum, April/May 2011, http://www.bookforum.com/inprint/018_01/7222. or “Facebook Revolutions”Eric Davis, “Social Media: A Force for Political Change in Egypt,” April 13, 2011, http://new-middle-east.blogspot.com/2011/04/social-media-force-for-political-change.html. and have credited social media with helping to bring down these regimes.Eleanor Beardsley, “Social Media Gets Credit for Tunisian Overthrow,” NPR, January 16, 2011, http://www.npr.org/2011/01/16/132975274/Social-Media-Gets-Credit-For-Tunisian-Overthrow.
During the unrest, social media outlets such as Facebook and Twitter helped protesters share information by communicating ideas continuously and instantaneously. Users took advantage of these unrestricted vehicles to share the most graphic details and images of the attacks on protesters, and to rally demonstrators.Peter Beaumont, “Can Social Networking Overthrow a Government?” Morning Herald (Sydney), February 25, 2011, http://www.smh.com.au/technology/technology-news/can-social-networking-overthrow-a-government-20110225-1b7u6.html. In other words, use of social media was about the ability to communicate across borders and barriers. It gave common people a voice and an opportunity to express their opinions.
Critics of social media, however, say that those calling the Middle East movements Facebook or Twitter revolutions are not giving credit where it is due.Alex Villarreal, “Social Media A Critical Tool for Middle East Protesters,” Voice of America, March 1, 2011, http://www.voanews.com/english/news/middle-east/Social-Media-a-Critical-Tool-for-Middle-East-Protesters-117202583.html It is true that social media provided vital assistance during the unrest in the Middle East. But technology alone could not have brought about the revolutions. The resolve of the people to bring about change was most important, and this fact should be recognized, say the critics.Chris Taylor, “Why Not Call It a Facebook Revolution?” CNN, February 24, 2011, http://edition.cnn.com/2011/TECH/social.media/02/24/facebook.revolution.
It’s in the name: World Wide Web. The Internet has broken down communication barriers between cultures in a way that could only be dreamed of in earlier generations. Now, almost any news service across the globe can be accessed on the Internet and, with the various translation services available (like Babelfish and Google Translate), rendered relatively understandable. Not only does American culture continue to spread throughout the world, but smaller countries are now able to cheaply export culture, news, entertainment, and even propaganda.
The Internet has been a key factor in driving globalizationThe lowering of economic and cultural impediments to communication and commerce between countries. in recent years. Many jobs can now be outsourced entirely via the Internet. Teams of software programmers in India can have a website up and running in very little time, for far less money than it would take to hire American counterparts. Communicating with these teams is now as simple as sending emails and instant messages back and forth, and often the most difficult aspect of setting up an international video conference online is figuring out the time difference. Especially for electronic services such as software, outsourcing over the Internet has greatly reduced the cost to develop a professionally coded site.
The increase of globalization has been an economic force throughout the last century, but economic interdependency is not its only by-product. At its core, globalization is the lowering of economic and cultural impediments to communication between countries all over the globe. Globalization in the sphere of culture and communication can take the form of access to foreign newspapers (without the difficulty of procuring a printed copy) or, conversely, the ability of people living in previously closed countries to communicate experiences to the outside world relatively cheaply.
Television, especially satellite television, has been one of the primary ways for American entertainment to reach foreign shores. This trend has been going on for some time now, for example, with the launch of MTV Arabia.Tim Arango, “World Falls for American Media, Even as It Sours on America,” New York Times, November 30, 2008, http://www.nytimes.com/2008/12/01/business/media/01soft.html. American popular culture is, and has been, a crucial export.
At the Eisenhower Fellowship Conference in Singapore in 2005, U.S. ambassador Frank Lavin gave a defense of American culture that differed somewhat from previous arguments. It would not be all Starbucks, MTV, or Baywatch, he said, because American culture is more diverse than that. Instead, he said that “America is a nation of immigrants,” and asked, “When Mel Gibson or Jackie Chan come to the United States to produce a movie, whose culture is being exported?”Frank Lavin, “‘Globalization and Culture’: Remarks by Ambassador Frank Lavin at the Eisenhower Fellowship Conference in Singapore,” U.S. Embassy in Singapore, June 28, 2005, http://singapore.usembassy.gov/062805.html. This idea of a truly globalized culture—one in which content can be distributed as easily as it can be received—now has the potential to be realized through the Internet. While some political and social barriers still remain, from a technological standpoint there is nothing to stop the two-way flow of information and culture across the globe.
The scarcity of artistic resources, the time lag of transmission to a foreign country, and censorship by the host government are a few of the possible impediments to transmission of entertainment and culture. China provides a valuable example of the ways the Internet has helped to overcome (or highlight) all three of these hurdles.
China, as the world’s most populous country and one of its leading economic powers, has considerable clout when it comes to the Internet. In addition, the country is ruled by a single political party that uses censorship extensively in an effort to maintain control. Because the Internet is an open resource by nature, and because China is an extremely well-connected country—with 22.5 percent (roughly 300 million people, or the population of the entire United States) of the country online as of 2008Google, “Internet users as percentage of population: China,” February 19, 2010, http://www.google.com/publicdata?ds=wb-wdi&met=it_net_user_p2&idim=country:CHN&dl=en&hl=en&q= china+internet+users.—China has been a case study in how the Internet makes resistance to globalization increasingly difficult.
Figure 11.7

China has more Internet users than any other country.
On January 21, 2010, Hillary Clinton gave a speech in front of the Newseum in Washington, DC, where she said, “We stand for a single Internet where all of humanity has equal access to knowledge and ideas.”Johnny Ryan and Stefan Halper, “Google vs China: Capitalist Model, Virtual Wall,” OpenDemocracy, January 22, 2010, http://www.opendemocracy.net/johnny-ryan-stefan-halper/google-vs-china-capitalist-model-virtual-wall. That same month, Google decided it would stop censoring search results on Google.cn, its Chinese-language search engine, as a result of a serious cyber-attack on the company originating in China. In addition, Google stated that if an agreement with the Chinese government could not be reached over the censorship of search results, Google would pull out of China completely. Because Google has complied (albeit uneasily) with the Chinese government in the past, this change in policy was a major reversal.
Withdrawing from one of the largest expanding markets in the world is a shocking move for a company that has been aggressively moving into foreign markets. It highlights the fundamental tension between China’s censorship policy and Google’s core values. Google’s company motto, “Don’t be evil,” had long been at odds with its decision to censor search results in China. Google’s compliance with the Chinese government did not help it make inroads into the Chinese Internet search market—although Google held about a quarter of the market in China, most of the search traffic went to the tightly controlled Chinese search engine Baidu. However, Google’s departure from China would be a blow to anti-government forces in the country. Since Baidu has a closer relationship with the Chinese government, political dissidents tend to use Google’s Gmail, which runs on encrypted servers based in the United States (encryption is a process in which information is encoded to protect it from being stolen; online “shopping carts,” for example, are on encrypted web pages resistant to hackers). Google’s threat to withdraw from China raises the possibility that globalization could indeed hit roadblocks due to the ways that foreign governments may choose to censor the Internet.
One only needs to go to CNN’s official Twitter feed and begin to click random faces in the “Following” column to see the effect of media convergence through the Internet. Hundreds of different options abound, many of them individual journalists’ Twitter feeds, and many of those journalists following other journalists. Considering CNN’s motto, “The most trusted name in news,” its presence on Twitter might seem at odds with providing in-depth, reliable coverage. After all, how in-depth can 140 characters get?
The truth is that many of these traditional media outlets (television, radio, newspapers, magazines, and books) use Twitter not as a communication tool in itself but as a way to allow viewers to aggregate a large amount of information they may have missed. Instead of visiting multiple home pages to see the day’s top stories from multiple viewpoints, Twitter users only have to check their own Twitter pages to get updates from all the organizations they “follow.” Media conglomerates (the name for all the media outlets owned by a single company) then use Twitter as part of an overall integration of media outlets; the Twitter feed is there to support the news content, not to report the content itself.
The threshold was crossed in 2008: The Internet overtook print media as a primary source of information for national and international news in the United States. Television is still far in the lead, but especially among younger demographics, the Internet is quickly catching up as a way to learn about the day’s news. With 40 percent of the public receiving their news from the Internet (see Figure 11.8),Pew Research Center for the People & the Press, “Internet Overtakes Newspapers as News Outlet,” December 23, 2008, http://people-press.org/report/479/internet-overtakes-newspapers-as-news-source. media outlets have been scrambling to set up large presences on the web. Yet one of the most remarkable shifts has been the establishment of online-only news sources.
Figure 11.8

Americans now receive more national and international news from the Internet than they do from newspapers.
The conventional argument claims that the anonymity and the echo chamber of the Internet undermine worthwhile news reporting, especially for topics that are expensive to report on. The ability of large news organizations to put reporters in the field is one of their most important contributions and (because of its cost) is often one of the first things to be cut back during times of budget problems. However, as the Internet has become a primary news source for more and more people, new media outlets—publications existing entirely online—have begun to appear.
In 2006, two reporters for the Washington Post, John F. Harris and Jim VandeHei, left the newspaper to start a politically centered website called Politico. Rather than simply repeating the day’s news in a blog, they were determined to start a journalistically viable news organization on the web. Four years later, the site has over 6,000,000 unique monthly visitors and about a hundred staff members, and there is now a Politico reporter on almost every White House trip.Michael Wolff, “Politico’s Washington Coup,” Vanity Fair, August 2009, http://www.vanityfair.com/politics/features/2009/08/wolff200908.
Politico is far from a collection of amateurs trying to make it big on the Internet: its senior White House correspondent is Mike Allen, who previously wrote for The New York Times, Washington Post, and Time. His daily Playbook column appears at around 7 a.m. and is read by much of the politically centered media. The different ways that Politico reaches out to its supporters—blogs, Twitter feeds, regular news articles, and now even a print edition—show how media convergence has occurred even within the Internet itself. The interactive nature of its services and the active comment boards on the site also show how the media have become a two-way street: more of a public forum than a straight news service.
Top-notch political content is not the only programming moving to the Internet, however. Saturday Night Live (SNL) has built an entire entertainment model around its broadcast time slot. Every weekend, around 11:40 p.m. on Saturday, someone interrupts a skit, turns toward the camera, shouts “Live from New York, it’s Saturday Night!” and the band starts playing. Yet the show’s sketch comedy style also lends itself to the watch-anytime convenience of the Internet. In fact, the online television service Hulu carries a full eight episodes of SNL at any given time, with the regular 3.5-minute commercial breaks replaced by Hulu-specific minute-long advertisements. The time listed for an SNL episode on Hulu is just over an hour—a full half-hour less than the time it takes to watch it live on Saturday night.
Hulu calls its product “online premium video,” primarily because of its desire to attract not the YouTube amateur but rather a partnership of large media organizations. Although many networks, like NBC and Comedy Central, stream video on their websites, Hulu builds its business by offering a legal way to see all these shows on the same site; a user can switch from South Park to SNL with a single click, rather than having to move to a different website.
Hulu’s success points to a high demand among Internet users for a wide variety of content collected and packaged in one easy-to-use interface. Hulu was rated the Website of the Year by the Associated PressJake Coyle, “On the Net: Hulu Is Web Site of the Year,” Seattle Times, December 19, 2008, http://seattletimes.nwsource.com/html/entertainment/2008539776_aponthenetsiteoftheyear.html. and even received an Emmy nomination for a commercial featuring Alec Baldwin and Tina Fey, the stars of the NBC comedy 30 Rock.Dan Neil, “‘30 Rock’ Gets a Wink and a Nod From Two Emmy-Nominated Spots,” Los Angeles Times, July 21, 2009, http://articles.latimes.com/2009/jul/21/business/fi-ct-neil21. Hulu is not the usual dot-com underdog startup, however. Its two parent companies, News Corporation and NBC Universal, are two of the world’s media giants. In many ways, this was a logical step for these companies to take after fighting online video for so long. In December 2005, the video “Lazy Sunday,” an SNL digital short featuring Andy Samberg and Chris Parnell, went viral with over 5,000,000 views on YouTube before February 2006, when NBC demanded that YouTube take down the video.John Biggs, “A Video Clip Goes Viral, and a TV Network Wants to Control It,” New York Times, February 20, 2006, http://www.nytimes.com/2006/02/20/business/media/20youtube.html. NBC later posted the video on Hulu, where it could sell advertising for it.
Hulu allows users to break out of programming models controlled by broadcast and cable television providers and choose freely what shows to watch and when to watch them. This seems to work especially well for cult programs that are no longer available on television. In 2008, the show Arrested Development, which was canceled in 2006 after repeated time slot shifts, was Hulu’s second-most-popular program.
Hulu certainly seems to have leveled the playing field for some shows that have had difficulty finding an audience through traditional means. 30 Rock, much like Arrested Development, suffered from a lack of viewers in its early years. In 2008, New York Magazine described the show as a “fragile suckling that critics coddle but that America never quite warms up to.”Adam Sternbergh, “‘The Office’ vs. ‘30 Rock’: Comedy Goes Back to Work,” New York Magazine, April 10, 2008, http://nymag.com/daily/entertainment/2008/04/the_office_vs_30_rock_comedy_g.html. However, even as 30 Rock shifted time slots mid-season, its viewer base continued to grow through NBC’s partner site Hulu. The nontraditional media approach of NBC’s programming culminated in October 2008, when NBC decided to launch the new season of 30 Rock on Hulu a full week before it was broadcast over the airwaves.Jenna Wortham, “Hulu Airs Season Premiere of 30 Rock a Week Early,” Wired, October 23, 2008, http://www.wired.com/underwire/2008/10/hulu-airs-seaso/. Hulu’s strategy of providing premium online content seems to have paid off: As of March 2011, Hulu provided 143,673,000 viewing sessions to more than 27 million unique visitors, according to comScore.“ComScore release March 2011 US Online Video Rankings,” April 12, 2011, http://www.comscore.com/Press_Events/Press_Releases/2011/4/comScore_Releases_March_2011_U.S._Online_Video_Rankings.
Unlike other “premium” services, Hulu does not charge for its content; rather, the word premium in its slogan seems to imply that it could charge for content if it wanted to. Other platforms, like Sony’s PlayStation 3, block Hulu for this very reason—Sony’s online store sells the products that Hulu gives away for free. However, Hulu has been considering moving to a paid subscription model that would allow users to access its entire back catalog of shows. Like many other fledgling web enterprises, Hulu seeks to create reliable revenue streams to avoid the fate of many of the companies that folded during the dot-com crash.Greg Sandoval, “More Signs Hulu Subscription Service Is Coming,” CNET, October 22, 2009, http://news.cnet.com/8301-31001_3-10381622-261.html.
Like Politico, Hulu has packaged professionally produced content into an on-demand web service that can be used without the normal constraints of traditional media. Just as users can comment on Politico articles (and now, on most newspapers’ articles), they can rate Hulu videos, and Hulu will take this into account. Even when users do not produce the content themselves, they still want this same “two-way street” service.
Table 11.2 Top 10 U.S. Online Video Brands, Home and Work

| Rank | Parent | Total Streams (in thousands) | Unique Viewers (in thousands) |
|---|---|---|---|
| 1 | YouTube | 6,622,374 | 112,642 |
| 2 | Hulu | 635,546 | 15,256 |
| 3 | Yahoo! | 221,355 | 26,081 |
| 4 | MSN | 179,741 | 15,645 |
| 5 | Turner | 137,311 | 5,343 |
| 6 | MTV Networks | 131,077 | 5,949 |
| 7 | ABC TV | 128,510 | 5,049 |
| 8 | Fox Interactive | 124,513 | 11,450 |
| 9 | Nickelodeon | 117,057 | 5,004 |
| 10 | Megavideo | 115,089 | 3,654 |

Source: The Nielsen Company
In the early years, the Internet was stigmatized as a tool for introverts to avoid “real” social interactions, thereby increasing their alienation from society. Yet the Internet was also seen as the potentially great connecting force between cultures all over the world. The idea that something that allowed communication across the globe could breed social alienation seemed counterintuitive. The American Psychological Association (APA) coined this concept the “Internet paradox”: the contradictory proposition that the supposedly social Internet is actually making children antisocial. The idea has been more or less disproved, but it still exists as a cultural stigma.
Studies like the APA’s “Internet paradox: A social technology that reduces social involvement and psychological well-being?”Robert Kraut and others, “Internet Paradox: A Social Technology That Reduces Social Involvement and Psychological Well-Being?” American Psychologist, September 1998, http://psycnet.apa.org/index.cfm?fa=buy.optionToBuy&id=1998-10886-001. which came out in 1998, suggested that teens who spent lots of time on the Internet showed much greater rates of self-reported loneliness and other signs of psychological distress. Even though the Internet had been around for a while by 1998, the increasing concern among parents was that teenagers were spending all their time in chat rooms and online. The fact was that teenagers spent much more time on the Internet than adults, due to their increased free time, curiosity, and familiarity with technology.
However, this did not necessarily mean that “kids these days” were antisocial or that the Internet caused depression and loneliness. In his critical analysis “Deconstructing the Internet Paradox,” Joseph M. Newcomer, a computer scientist, writer, and PhD recipient from Carnegie Mellon University, points out that the APA study did not include a control group to adjust for what may be normal “lonely” feelings in teenagers. He also suggests that “involvement in any new, self-absorbing activity which has opportunity for failure can increase depression,” seeing Internet use as just another time-consuming hobby, much like learning a musical instrument or playing chess.Joseph M. Newcomer, “Deconstructing the Internet Paradox,” Ubiquity, Association for Computing Machinery, April 2000, http://ubiquity.acm.org/article.cfm?id=334533. (Originally published as an op-ed in the Pittsburgh Post-Gazette, September 27, 1998.)
The general concern that teenagers were spending all their time in chat rooms and online forums instead of hanging out with flesh-and-blood friends was not especially new; much the same had been thought of the computer hobbyists who pioneered the esoteric Usenet. However, those concerns were amplified as a wider range of young people began using the Internet, and the trend was especially strong in the younger demographics.
As the Internet generation developed, it quickly became apparent that its members did not suffer from perpetual loneliness as a rule. After all, the generation that was raised on instant messaging invented Facebook and still makes up most of Facebook’s audience. As detailed earlier in the chapter, Facebook began as a service limited to college students—a requirement that practically excluded older participants. As a social tool and as a reflection of the way younger people now connect with each other over the Internet, Facebook has provided a comprehensive model for the Internet’s effect on social skills and especially on education.
A study by the Michigan State University Department of Telecommunication, Information Studies, and Media has shown that college-age Facebook users connect with offline friends twice as often as they connect with purely online “friends.”Nicole B. Ellison, Charles Steinfield, and Cliff Lampe, “The Benefits of Facebook ‘Friends’: Social Capital and College Students’ Use of Online Social Network Sites,” Journal of Computer-Mediated Communication 14, no. 4 (2007). In fact, 90 percent of the participants in the study reported that high school friends, classmates, and other friends were the top three groups that their Facebook profiles were directed toward.
At the time of this study, one of Facebook’s most remarkable tools for studying the ways that young people connect was its “networks” feature. Originally, a Facebook user’s network consisted of all the people at his or her college email domain: the “mycollege” portion of “me@mycollege.edu.” The MSU study, performed in April 2006, just six months after Facebook opened its doors to high school students, found that first-year students met new people on Facebook 36 percent more often than seniors did. These freshmen were not yet as active on Facebook as high schoolers (Facebook began allowing high schoolers on its site during these students’ first semester in school).Ellen Rosen, “THE INTERNET; Facebook.com Goes to High School,” New York Times, October 16, 2005, http://query.nytimes.com/gst/fullpage.html?res=9C05EEDA173FF935A25753C1A9639C8B63&scp=5&sq=facebook &st=nyt. The study concluded that researchers could “definitively state that there is a positive relationship between certain kinds of Facebook use and the maintenance and creation of social capital.”Nicole B. Ellison, Charles Steinfield, and Cliff Lampe, “The Benefits of Facebook ‘Friends’: Social Capital and College Students’ Use of Online Social Network Sites,” Journal of Computer-Mediated Communication 14, no. 4 (2007). In other words, even though the study cannot show whether Facebook use causes or results from social connections, it can say that Facebook plays both an important and a nondestructive role in the forming of social bonds.
Although this study provides a complete and balanced picture of the role that Facebook played for college students in early 2006, there have since been many changes in Facebook’s design and popularity. In 2006, many of a user’s “friends” were from the same college, and the whole college network might be mapped as a “friend-of-a-friend” web. If users allowed all people within a single network access to their profiles, it created a voluntary school-wide directory of students. Since a university email address was required for signup, there was a certain level of trust. The results of this Facebook study, still relatively current in showing the Internet’s effects on social capital, demonstrate not only that social networking tools do not lead to more isolation but that they have actually become integral to some types of networking.
However, as Facebook began to grow and as high school and regional networks (such as “New York City” or “Ireland”) were incorporated, users’ networks of friends grew exponentially, and the networking feature became increasingly unwieldy for privacy purposes. In 2009, Facebook discontinued regional networks over concerns that networks consisting of millions of people were “no longer the best way for you to control your privacy.”Mark Zuckerberg, “An Open Letter from Facebook Founder Mark Zuckerberg,” Facebook, December 1, 2009, http://blog.facebook.com/blog.php?post=190423927130. Where privacy controls once consisted of allowing everyone at one’s college access to specific information, Facebook now allows only three levels: friends, friends of friends, and everyone.
Of course, not everyone on teenagers’ online friends lists is actually a friend outside of the virtual world. In the parlance of the early days of the Internet, meeting up “IRL” (shorthand for “in real life”) was one of the main reasons that many people got online. This practice was often looked at with suspicion by those not familiar with it, especially because of the anonymity of the Internet. The fear among many was that children would go into chat rooms and agree to meet up in person with a total stranger, and that stranger would turn out to have less-than-friendly motives. This fear led to law enforcement officers posing as underage girls in chat rooms, agreeing to meet for sex with older men (after the men brought up the topic—the other way around could be considered entrapment), and then arresting the men at the agreed-upon meeting spot.
In recent years, however, the Internet has become a hub of activity for all sorts of people. In 2002, Scott Heiferman started Meetup.com based on the “simple idea of using the Internet to get people off the Internet.”Scott Heiferman, “The Pursuit of Community,” New York Times, September 5, 2009, http://www.nytimes.com/2009/09/06/jobs/06boss.html. The entire purpose of Meetup.com is not to foster global interaction and collaboration (as is the purpose of something like Usenet) but rather to allow people to organize locally. There are Meetups for politics (popular during Barack Obama’s presidential campaign), for New Yorkers who own Boston terriers,Amanda M. Fairbanks, “Funny Thing Happened at the Dog Run,” New York Times, August 23, 2008, http://www.nytimes.com/2008/08/24/nyregion/24meetup.html. for vegan cooking, for board games, and for practically everything else. Essentially, the service (which charges a small fee to Meetup organizers) separates itself from other social networking sites by encouraging real-life interaction. Whereas a member of a Facebook group may never see or interact with fellow members, Meetup.com actually keeps track of the (self-reported) real-life activity of its groups—ideally, groups with more activity are more desirable to join. However much time these groups spend together on or off the Internet, one group of people undoubtedly has the upper hand when it comes to online interaction: World of Warcraft players.
A writer for Time states the reasons for the massive popularity of online role-playing games quite well: “[My generation’s] assumptions were based on the idea that video games would never grow up. But no genre has worked harder to disprove that maxim than MMORPGs—massively multiplayer online role-playing games.”Ta-Nehisi Paul Coates, “Confessions of a 30-Year-Old Gamer,” Time, January 12, 2007, http://www.time.com/time/arts/article/0,8599,1577502,00.html. World of Warcraft (WoW, for short) is the most popular MMORPG of all time, with over 12 million subscriptions and counting. The game is inherently social; players must complete “quests” in order to advance in the game, and many of the quests are significantly easier with multiple people. Players often form small, four- to five-person groups in the beginning of the game, but by the end of the game these groups can grow into larger “raiding parties” of up to 40 players.
In addition, WoW provides a highly developed social networking feature called “guilds.” Players create or join a guild, which can then band together with other guilds to complete some of the toughest quests. “But once you’ve got a posse, the social dynamic just makes the game more addictive and time-consuming,” writes Clive Thompson for Slate.Clive Thompson, “An Elf’s Progress: Finally, Online Role-Playing Games That Won’t Destroy Your Life,” Slate, March 7, 2005, http://www.slate.com/id/2114354. Although these guilds do occasionally meet up in real life, most of their time together is spent online, often for hours per day, and some guild leaders profess to seeing real-life improvements. Joi Ito, an Internet business and investment guru, joined WoW long after he had worked with some of the most successful Internet companies; he says he “definitely”Jane Pinckard, “Is World of Warcraft the New Golf?” 1UP.com, February 8, 2006, http://www.1up.com/news/world-warcraft-golf. learned new lessons about leadership from playing the game. Writer Jane Pinckard, for the video game blog 1UP, lists some of Ito’s favorite activities as “looking after newbs [lower-level players] and pleasing the veterans,” which he calls a “delicate balancing act,”Jane Pinckard, “Is World of Warcraft the New Golf?” 1UP.com, February 8, 2006, http://www.1up.com/news/world-warcraft-golf. even for an ex-CEO.
Figure 11.9

Guilds often go on “raiding parties”—just one of the many semisocial activities in World of Warcraft.
Image used by permission. © 2011 Blizzard Entertainment, Inc.
With over 12 million subscribers, WoW necessarily breaks the boundaries of previous MMORPGs. The social nature of the game has attracted unprecedented numbers of female players (although men still make up the vast majority of players), and its players cannot easily be pegged as antisocial video game addicts. On the contrary, they may even be called social video game players, judging from the general responses given by players as to why they enjoy the game. This type of play certainly points to a new way of online interaction that may continue to grow in coming years.
In 2006, the journal Developmental Psychology published a study looking at the educational benefits of the Internet for teenagers in low-income households. It found that “children who used the Internet more had higher grade point averages (GPA) after one year and higher scores on standardized tests of reading achievement after six months than did children who used it less” and that continuing to use the Internet more as the study went on led to an even greater increase in GPA and standardized test scores in reading (there was no change in mathematics test scores).Linda A. Jackson and others, “Does Home Internet Use Influence the Academic Performance of Low-Income Children?” Developmental Psychology 42, no. 3 (2006): 433–434.
One of the most interesting aspects of the study’s results is the suggestion that the academic benefits may exclude low-performing children in low-income households. The reason for this, the study suggests, is that children in low-income households likely have a social circle consisting of other children from low-income households who are also unlikely to be connected to the Internet. As a result, after 16 months of Internet usage, only 16 percent of the participants were using email and only 25 percent were using instant messaging services. Another reason researchers suggested was that because “African-American culture is historically an ‘oral culture,’” and 83 percent of the participants were African American, the “impersonal nature of the Internet’s typical communication tools” may have led participants to continue to prefer face-to-face contact. In other words, social interaction on the Internet can only happen if your friends are also on the Internet.
On February 15, 2010, the firm Compete, which analyzes Internet traffic, reported that Facebook surpassed Google as the No. 1 site to drive traffic toward news and entertainment media on both Yahoo! and MSN.Mathew Ingram, “Facebook Driving More Traffic Than Google,” New York Times, February 15, 2010, http://www.nytimes.com/external/gigaom/2010/02/15/15gigaom-facebook-driving-more-traffic-than-google-42970.html. This statistic is a strong indicator that social networks are quickly becoming one of the most effective ways for people to sift through the ever-increasing amount of information on the Internet. It also suggests that people are content to get their news the way they did before the Internet or most other forms of mass media were invented—by word of mouth.
Many companies now use the Internet to leverage word-of-mouth social networking. The expansion of corporations onto Facebook has given the service a big publicity boost, which has no doubt contributed to the growth of its user base, which in turn helps the corporations that put marketing efforts into the service. Putting a corporation on Facebook is not without risk, however: anything a corporation posts can be commented on by over 500 million users, and of course there is no way to ensure that those users will say positive things about the corporation. Good or bad, communicating with corporations is now a two-way street.
By 1994, the promise of the “information superhighway” (the idea that the Internet would make the transfer of information very fast and, much as Eisenhower’s national highway system led to more people buying cars, would lead to more people buying computers) had become so potent that it was given its own summit on the University of California, Los Angeles campus. The country was quickly realizing that the spread of the web could be harnessed for educational purposes; more than just the diversion of computer hobbyists, this new vision of the web would be a constant learning resource that anyone could use.
The pioneering American video artist Nam June Paik takes credit for the term information superhighway, which he used in a study for the Rockefeller Foundation in 1974, long before the existence of Usenet. In 2001, he said, “If you create a highway, then people are going to invent cars. That’s dialectics. If you create electronic highways, something has to happen.”“Video and the Information Superhighway: An Artist’s Perspective,” The Biz Media, May 3, 2010, http://blog.thebizmedia.com/video-and-the-information-superhighway/. Paik’s prediction proved to be startlingly prescient.
Al Gore’s use of the term in the House of Representatives (and later as vice president) had a slightly different meaning and context. To Gore, the promise of the Interstate Highway System during the Eisenhower era was that the government would work to allow communication across natural barriers, and that citizens could then utilize these channels to conduct business and communicate with one another. Gore saw the government as playing an essential role in maintaining the pathways of electronic communication. Allowing business interests to get involved would compromise what he saw as a necessarily neutral purpose; a freeway doesn’t judge or demand tolls—it is a public service—and neither should the Internet. During his 2000 presidential campaign, Gore was wrongly ridiculed for supposedly saying that he “invented the Internet,” but in reality his work in the House of Representatives played a crucial part in developing the infrastructure required for Internet access.
Figure 11.10

Although Al Gore did not invent the Internet, he did popularize the term information superhighway in an effort to build support for Internet infrastructure and neutrality.
However, a certain amount of money was necessary to get connected to the web. In this respect, AOL was like the Model T of the Internet—it put access to the information superhighway within reach of the average person. But despite the affordability of AOL and the services that succeeded it, certain demographics continued to go without access to the Internet, a problem known as the “digital divide,” which you will learn more about in this section.
From speed of transportation, to credibility of information (don’t trust the stranger at the roadside diner), to security of information (keep the car doors locked), to net neutrality (toll-free roads), to the possibility of piracy, the metaphor of the information superhighway has proved to be remarkably apt. All of these issues have played out in different ways, both positive and negative, and they continue to develop to this day.
In December 2002, a survey by the Pew Internet & American Life Project found that 84 percent of Americans believed they could find information on health care, government, news, or shopping on the Internet.Anick Jesdanun, “High Expectations for the Internet,” December 30, 2002, http://www.crn.com/it-channel/18822182;jsessionid=3Z2ILJNFKM1FZQE1GHPCKH4ATMY32JVN. Such faith in a decade-old system of interconnected web pages would be remarkable in itself, but it is even more striking given that 37 percent of respondents were not connected to the Internet at all. In other words, of the Americans without Internet connections, 64 percent still believed that the Internet could be a source of information about these crucial topics. Moreover, of those who expected to find such information, at least 70 percent succeeded; news and shopping were the most successful topics, and government was the least. The survey shows that most Americans believed the Internet was indeed an effective source of information. Again, the role of the Internet in education was heralded as a new future, and technology was seen to level the playing field for all students.
Nowhere was this more apparent than in the Bush administration’s 2004 report, “Toward a New Golden Age in American Education: How the Internet, the Law, and Today’s Students Are Revolutionizing Expectations.” By this time, the term digital divide was already widely used, and the goal of “bridging” it involved everything from putting computers in classrooms to giving some high-need students personal computers to use at home.
The report stated that an “explosive growth” in sectors such as e-learning and virtual schools allowed each student “individual online instruction.”U.S. Department of Education, Toward a New Golden Age in American Education: How the Internet, the Law and Today’s Students Are Revolutionizing Expectations, National Education Technology Plan, 2004, http://www2.ed.gov/about/offices/list/os/technology/plan/2004/site/theplan/edlite-intro.html. More than just being able to find information online, people expected the Internet to provide virtually unlimited access to educational opportunities. To make this expectation a reality, one of the main investments the report called for was increased broadband Internet access (higher-speed connections to the Internet that make things like live video, audio, and file sharing possible). As Nam June Paik predicted, stringing fiber optics around the world would allow for seamless video communication, a development that the Department of Education saw as integral to its vision of educating through technology. The report called for broadband access “24 hours a day, seven days a week, 365 days a year,” saying that it could “help teachers and students realize the full potential of this technology.”U.S. Department of Education, Toward a New Golden Age in American Education: How the Internet, the Law and Today’s Students Are Revolutionizing Expectations, National Education Technology Plan, 2004, http://www2.ed.gov/about/offices/list/os/technology/plan/2004/site/theplan/edlite-intro.html.
One of the founding principles of many public library systems is to allow for free and open access to information. Historically, one of the major roadblocks to achieving this goal has been a simple one: location. Those living in rural areas or those with limited access to transportation simply could not get to a library. But with the spread of the Internet, the hope was that a global library would be created—an essential prospect for rural areas.
One of the most remarkable educational success stories in the Department of Education’s study is that of the Chugach School District in Alaska. In 1994, this district was the lowest performing in the state: over 50 percent staff turnover, the lowest standardized test scores, and only one student in 26 years graduating from college.U.S. Department of Education, Toward a New Golden Age in American Education: How the Internet, the Law and Today’s Students Are Revolutionizing Expectations, National Education Technology Plan, 2004, http://www2.ed.gov/about/offices/list/os/technology/plan/2004/site/theplan/edlite-intro.html. The school board instituted drastic measures, amounting to a complete overhaul of the system. They abolished grade levels, focusing instead on achievement, and by 2001 had increased Internet usage from 5 percent to 93 percent.
The Department of Education study emphasizes these numbers, and with good reason: in a period of four years, the district’s standardized test scores rose from the 20th percentile range to the 70th percentile range in both math and language arts. Yet these advances were not exclusive to low-performing rural students. In Florida, the Florida Virtual School system allowed rural school districts to offer advanced-placement coursework. Students excelling in rural areas could now study topics that were previously limited to districts able to fill (and fund) an entire classroom. Just as the Interstate Highway System commercially connected the most remote rural communities to large cities, the Internet has brought rural areas further into the global world, especially in regard to the sharing of information and knowledge.
As technology has improved, it has become possible to provide software as a service that resides entirely online rather than on a user’s personal computer. Since people can now be connected to the Internet constantly, they can use online programs to do all of their computing. It is no longer strictly necessary to own a program like Microsoft Word to compose documents; the same work can be done through an online service like Google Docs or Zoho Writer.
“Cloud computing” (running a program on a remote server via a web browser, so that the computer viewing the program is not actually doing the processing but merely sending and receiving information, acting essentially as a display) is the process of outsourcing common computing tasks to a remote server. The actual work is not done by the computer attached to the user’s monitor, but by other (maybe many other) computers in the “cloud.” As a result, the computer itself does not need much processing power; instead of calculating “1 + 1 = 2,” the user’s computer asks the cloud, “What does 1 + 1 equal?” and receives the answer. Meanwhile, the system resources that the computer would normally devote to such tasks are freed up for other things. An additional advantage of cloud computing is that data can be stored in the cloud and retrieved from any computer, making a user’s files more conveniently portable and less vulnerable to hardware failures like a hard drive crash. Of course, sending these messages back and forth to a remote server can require quite a bit of bandwidth (the maximum amount of data that can be sent per second, equivalent to the number of lanes on a highway), and in the absence of a reliable, always-on Internet connection, the usefulness of these services can be somewhat limited.
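The ask-the-cloud pattern described above can be sketched in a few lines of Python. This is a hypothetical illustration: the “remote server” here is simulated by a local function, whereas a real service would involve an HTTP round trip over the network.

```python
# A minimal sketch of the cloud-computing pattern: the local machine
# serializes a question, a remote server (simulated here by a local
# function) does the work, and the local machine only displays the answer.
import json

def remote_server(request_json):
    """Stands in for the machine 'in the cloud' that does the computing."""
    request = json.loads(request_json)
    if request["operation"] == "add":
        result = sum(request["operands"])
    else:
        raise ValueError("unsupported operation")
    return json.dumps({"result": result})

def thin_client(a, b):
    """The user's computer: no arithmetic happens locally; it only
    sends the question and receives the answer."""
    request = json.dumps({"operation": "add", "operands": [a, b]})
    response = remote_server(request)  # in reality, a network round trip
    return json.loads(response)["result"]

print(thin_client(1, 1))  # the cloud answers: 2
```

The design point is that the client needs almost no processing power of its own, which is exactly why low-powered netbooks and phones benefit most from this model.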
Figure 11.11

Cloud computing allows a computer to contain very little actual information. Many of the programs used by the now-popular “netbooks” are stored online.
The concept of the cloud takes into account all the applications that are hosted on external machines and viewed on a user’s computer. Google Docs, which provides word processors, spreadsheets, and other tools, and Microsoft’s Hotmail, which provides email access, both constitute aspects of the “cloud.” These services are becoming even more popular with the onset of mobile applications and netbooks, which are small laptops with relatively little processing power and storage space that rely on cloud computing. A netbook does not need the processing power required to run Microsoft Word; as long as it has a web browser, it can run the Google Docs word processor and leave (almost) all of the processing to the cloud. Because of this evolution of the Internet, computers can be built less like stand-alone machines and more like interfaces for interacting with the larger system in the cloud.
One result of cloud computing has been the rise in web applications for mobile devices, such as the iPhone, BlackBerry, and devices that use Google’s Android operating system. 3G networks (third-generation cell phone networks capable of high-speed transfer of both voice and data) can augment the computing power of phones simply by giving the phones the ability to send data somewhere else to be processed. For example, a Google Maps application does not actually calculate the shortest route between two places (taking into account how highways are quicker than side roads, among numerous other computational difficulties) but rather asks Google to do the calculation and send over the result. 3G networks have made this possible in large part because the speed of data transfer has now surpassed the speed of cell phones’ calculation abilities. As cellular transmission technology continues to improve with the rollout of 4G networks (fourth-generation networks, the successors to 3G, which improve on its connectivity speeds and provide more comprehensive access to multimedia), connectivity speeds will further increase and allow for ever-more-comprehensive multimedia services.
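For illustration, the kind of route calculation a mapping service performs on its own servers can be sketched with Dijkstra’s shortest-path algorithm. The road network, place names, and travel times below are invented for the example; a real service would also weigh road types and live traffic.

```python
# Dijkstra's algorithm: the computation a phone offloads to a server
# instead of performing locally. Edge weights are travel times in minutes.
import heapq

def shortest_route(graph, start, end):
    """Return (total_cost, path) for the cheapest path from start to end."""
    queue = [(0, start, [start])]   # (cost so far, node, path taken)
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == end:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []          # no route exists

# Hypothetical road graph: the highway is longer to reach but faster overall.
roads = {
    "home": {"highway_onramp": 5, "side_street": 2},
    "highway_onramp": {"downtown": 10},
    "side_street": {"downtown": 25},
    "downtown": {},
}
print(shortest_route(roads, "home", "downtown"))
# (15, ['home', 'highway_onramp', 'downtown'])
```

The phone sends only the start and end points and receives only the resulting path, which is why a fast network matters more to it than a fast processor.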
The Internet has undoubtedly been a boon for researchers and writers everywhere. Online services range from up-to-date news and media to vast archives of past writing and scholarship. However, since the Internet is open to any user, anyone with a few dollars can set up a credible-sounding website and begin to disseminate false information.
This is not necessarily a problem with the Internet specifically; any traditional medium can—knowingly or unknowingly—publish unreliable or outright false information. But the explosion of available sources on the Internet has caused a bit of a dilemma for information seekers. The difference is that much of the information on the Internet is not the work of professional authors, but of amateurs who have questionable expertise. On the Internet, anyone can self-publish, so the vetting that usually occurs in a traditional medium—for example, by a magazine’s editorial department—rarely happens online.
That said, a recognizable author’s byline on an online article can point to more reliable information.Elizabeth E. Kirk, “Evaluating Information Found on the Internet,” Sheridan Libraries, Johns Hopkins University, 1996, http://www.library.jhu.edu/researchhelp/general/evaluating/. Likewise, a trusted name elsewhere on a website offers some assurance of reliability. For example, the site krugmanonline.com, the official site of Princeton economist Paul Krugman, does not list any authorial data. Even statements like “Nobel Prize Winner and Op-Ed Columnist for The New York Times” say nothing about who wrote the website itself, and much of its content is aggregated from elsewhere on the web. However, the bottom-left corner of the page carries the mark “© 2009 W. W. Norton & Company, Inc.” (Krugman’s publisher). A visitor might therefore pick and choose which information to trust. The author is clearly concerned with selling Krugman’s books, so the glowing reviews may need to be verified elsewhere; on the other hand, the author biography is probably fairly accurate, since the publishing company has direct access to Krugman, and Krugman himself probably looked it over to make sure it was valid. Taking the authorship of a site into account is a necessary step when judging information; more than just hunting down untrue statements, it can reveal subtle bias and point to further research that needs to be done.
One noticeable thing about Paul Krugman’s site is that all of his book reviews are positive. Although these are probably real reviews, they may not be representative of his critical reception at large. Mainstream journalistic sources usually attempt to achieve some balance in their reporting; given reasonable access, they will interview opposing viewpoints and reserve judgment for the editorial page. Corporate sources, like Krugman’s site, will instead tilt the information toward their product.
Often, the web is viewed as a source of entertainment, even in its informational capacity. Because of this, sites that rely on advertising may choose to publish something more inflammatory that will be linked to and forwarded more for its entertainment value than for its informational qualities.
On the other hand, a website might present itself as a credible source of information about a particular product or topic when its real goal is to sell something. A site that gives advice on protecting against bedbugs while linking directly to its own product may not be the best source of information on the topic. Since so much on the web is free, it is worth asking how a website actually pays for its service: if it is giving something away, the information may be biased, because the money must be coming from somewhere. The online archive of Consumer Reports, by contrast, requires a subscription. Ostensibly, this subscription revenue allows the service to exist as an impartial judge, serving its users rather than advertisers.
Occasionally, corporations may set up “credible” fronts to disseminate information. Because sources may look reliable, it is always important to investigate further. Global warming is a contentious topic, and websites about the issue often represent the bias of their owners. For example, the Cato Institute publishes anti-global-warming theory columns in many newspapers, including well-respected ones such as the Washington Times. Patrick Basham, an adjunct scholar at the Cato Institute, published the article “Live Earth’s Inconvenient Truths” in the Washington Times on July 11, 2007. Basham writes, “Using normal scientific standards, there is no proof we are causing the Earth to warm, let alone that such warming will cause an environmental catastrophe.”Patrick Basham, “Live Earth’s Inconvenient Truths,” Cato Institute, July 11, 2007, http://www.cato.org/pub_display.php?pub_id=8497.
However, the website ExxposeExxon.com states that the Cato Institute received $125,000 from the oil giant ExxonMobil, possibly tainting its data with bias.Exxpose Exxon, “Global Warming Deniers and ExxonMobil,” 2006, http://www.exxposeexxon.com/facts/gwdeniers.html. In addition, ExxposeExxon.com is run as a side project of the international environmental nonprofit Greenpeace, which may have its own reasons for producing this particular report. The document available on Greenpeace’s site (a scanned version of Exxon’s printout) states that in 2006, the corporation gave $20,000 to the Cato InstituteGreenpeace, ExxonMobil 2006 Contributions and Community Investments, October 5, 2007, http://research.greenpeaceusa.org/?a=view&d=4381. (the other $105,000 was given over the previous decade).
This back-and-forth highlights the difficulty of finding credible information online, especially when money is at stake. In addition, it shows how conflicting sources may go to great lengths—sorting through a company’s corporate financial reports—in order to expose what they see as falsehoods. What is the upside to all of this required fact-checking and cross-examination? Before the Internet, this probably would have required multiple telephone calls and plenty of time waiting on hold. While the Internet has made false information more widely available, it has also made checking that information incredibly easy.
Nowhere has this cross-examination and cross-listing of sources been more widespread than with Wikipedia. Information free and available to all? That sounds like a dream come true—a dream that Wikipedia founder Jimmy Wales was ready to pursue. Since the site began in 2001, the Wikimedia Foundation (which hosts all of the Wikipedia pages) has become the sixth-most-visited site on the web, barely behind eBay in terms of its unique page views.
Table 11.3 Top 10 Global Web Parent Companies, Home and Work
| Rank | Parent | Unique Audience (millions) | Active Reach % | Time |
|---|---|---|---|---|
| 1 | | 362,006 | 84.29 | 2:27:15 |
| 2 | Microsoft | 322,352 | 75.06 | 2:53:48 |
| 3 | Yahoo! | 238,035 | 55.43 | 1:57:26 |
| 4 | | 218,861 | 50.96 | 6:22:24 |
| 5 | eBay | 163,325 | 38.03 | 1:42:46 |
| 6 | Wikimedia | 154,905 | 36.07 | 0:15:14 |
| 7 | AOL LLC | 128,147 | 29.84 | 2:08:32 |
| 8 | Amazon | 128,071 | 29.82 | 0:23:24 |
| 9 | News Corp. | 125,898 | 29.31 | 0:53:53 |
| 10 | InterActiveCorp | 122,029 | 28.41 | 0:10:52 |
Source: The Nielsen Company
Organizations had long been trying to develop factual content for the web, but Wikipedia went for something else: verifiability (the standard for a usable source on Wikipedia; verifiable information is credited to a specific author or a reliable news organization and has been previously vetted in some way, and “truth” does not enter into the picture). The guidelines for editing Wikipedia state, “What counts is whether readers can verify that material added to Wikipedia has already been published by a reliable source, not whether editors think it is true.”Wikipedia, s.v. “Wikipedia:Neutral point of view,” http://en.wikipedia.org/wiki/Wikipedia:Neutral_point_of_view. The benchmark for inclusion on Wikipedia includes outside citations for any content “likely to be challenged” and for “all quotations.”
While this may seem like it’s a step ahead of many other sources on the Internet, there is a catch: Anyone can edit Wikipedia. This has a positive and negative side—though anyone can vandalize the site, anyone can also fix it. In addition, calling a particularly contentious page to attention can result in one of the site’s administrators placing a warning at the top of the page stating that the information is not necessarily verified. Other warnings include notices on articles about living persons, which are given special attention, and articles that may violate Wikipedia’s neutrality policy. This neutrality policy is a way to mitigate the extreme views that may be posted on a page with open access, allowing the community to decide what constitutes a “significant” view that should be represented.Wikipedia, s.v. “Wikipedia:Verifiability,” http://en.wikipedia.org/wiki/Wikipedia:Verifiability.
As long as users do not take the facts on Wikipedia at face value and make sure to follow up on the relevant sources linked in the articles they read, the site is an extremely useful reference tool that gives users quick access to a wide range of subjects. However, articles on esoteric subjects can be especially prone to vandalism or poorly researched information. Since every reader is a potential editor, a lack of readers can lead to a poorly edited page because errors, whether deliberate or not, go uncorrected. In short, the lack of authorial credit can lead to problems with judging bias and relevance of information, so the same precautions must be taken with Wikipedia as with any other online source, primarily in checking references. The advantage of Wikipedia is its openness and freedom—if you find a problem, you can either fix it (with your own verifiable sources) or flag it on the message boards. Culturally, there has been a shift from valuing a few reliable sources to valuing a multiplicity of competing sources. However, weighing these sources against one another has become easier than ever before.
As the Internet has grown in scope and the amount of personal information online has proliferated, securing this information has become a major issue. The Internet now houses everything from online banking systems to highly personal email messages, and even though security is constantly improving, this information is not invulnerable.
An example of this vulnerability is the “Climategate” scandal of late 2009, in which a collection of private email messages was stolen from a server at the University of East Anglia, where much of the Intergovernmental Panel on Climate Change research takes place. The emails showed internal debates among the scientists over which pieces of data should be released and which were not relevant (or helpful) to their case.Andrew C. Revkin, “Hacked email Is New Fodder for Climate Dispute,” New York Times, November 20, 2009, http://www.nytimes.com/2009/11/21/science/earth/21climate.html. In the emails, the scientists sometimes talked about colleagues—especially those skeptical of climate change—in a derisive way. Of course, these emails were never meant to become public.
This scandal demonstrates how easy it can be to lose control of private information on the Internet. In previous decades, hard copies of these letters would have to be found, and the theft could probably be traced back to a specific culprit. With the Internet, it is much more difficult to tell who is doing the snooping, especially if it is done on a public network. The same protocols that allow for open access and communication also allow for possible exploitation. Like the Interstate Highway System, the Internet is impartial to its users. In other words: If you’re going to ride, lock your doors.
Another explosive email-hacking scandal occurred in late 2009, when Google’s Gmail service was attacked from IP addresses originating in China. Gmail was one of the primary services used by human rights activists because its servers were located in the United States and it offered extra encryption. To appreciate the magnitude of this attack, it helps to know the history of email hacking and the importance of physical server location and local laws.
In 2000, a student in the Philippines unleashed a computer virus (a program that spreads itself from computer to computer, often with adverse effects) that arrived as a message with the subject line “I Love You.” The email had a file attached, called LOVE-LETTER-FOR-YOU.TXT.vbs. The suffix “.txt” is generally used for text files and was meant, in this case, as a distraction; the file’s real suffix was “.vbs,” marking it as a script. When opened, the script emailed itself to every contact in the user’s address book and then sent any available passwords to an email address in the Philippines. A key aspect of this case, however, was a matter of simple jurisdiction: the student was not prosecuted, because the Philippines had no computer crime laws at the time.Kim Zetter, “Nov. 10, 1983: Computer ‘Virus’ Is Born,” Wired, November 10, 2009, http://www.wired.com/thisdayintech/2009/11/1110fred-cohen-first-computer-virus/.
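The double-extension trick can be demonstrated in a short Python sketch. The list of dangerous suffixes below is illustrative, not exhaustive, and the check shown is a simplification of what a cautious mail client would do.

```python
# Only the final suffix determines how a file is treated; the ".TXT"
# in the middle of the name is just part of the name, there to mislead.
import os

filename = "LOVE-LETTER-FOR-YOU.TXT.vbs"

base, real_suffix = os.path.splitext(filename)
print(real_suffix)       # '.vbs' -- a script, not a text file
print(".TXT" in base)    # True: the harmless-looking suffix is mere decoration

EXECUTABLE_SUFFIXES = {".vbs", ".exe", ".bat", ".scr"}  # illustrative list
if real_suffix.lower() in EXECUTABLE_SUFFIXES:
    print("warning: attachment is executable despite its name")
```

Mail providers now routinely block or flag attachments whose final suffix is executable, which is one reason this particular attack no longer spreads so easily.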
The encryption that Gmail uses resulted in only two of the accounts being successfully hacked, and hackers were only able to see email subject lines and timestamps—no message content was available.Kim Zetter, “Google to Stop Censoring Search Results in China After Hack Attack,” Wired, January 12, 2010, http://www.wired.com/threatlevel/2010/01/google-censorship-china/. Since the chaos that ensued after the “I Love You” virus, email users and service providers have become more vigilant in their defensive techniques. However, the increased reliance on email for daily communication makes it an attractive target for hackers. The development of cloud computing will likely lead to entirely new problems with Internet security; just as a highway brings two communities together, it can also cause these communities to share problems.
Although many people increasingly rely on the Internet for communication and access to information, this reliance has come with a hefty price. Most critically, a simple exploit can cause massive roadblocks to Internet traffic, leading to disruptions in commerce, communication, and, as the military continues to rely on the Internet, national security.
Distributed denial-of-service (DDoS) attacks work like cloud computing, but in reverse. Instead of a single computer going out to retrieve data from many different sources, DDoS is a coordinated effort by many different computers to bring down (or overwhelm) a specific website. Essentially, any web server can only handle a certain amount of information at once. While the largest and most stable web servers can talk to a huge number of computers simultaneously, even these can be overwhelmed.
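The capacity limit described above can be modeled in a few lines of Python. The numbers and the assumption that requests are served in proportion to their share of traffic are illustrative only; real servers queue, drop, and time out in messier ways.

```python
# A toy model of a DDoS: a server that can answer at most `capacity`
# requests per second serves a shrinking fraction of legitimate traffic
# as attack traffic floods in.
def serve_second(capacity, legitimate, attack):
    """How many legitimate requests get through in one second."""
    total = legitimate + attack
    if total <= capacity:
        return legitimate
    # Under overload, each request has a capacity/total chance of service.
    return int(legitimate * capacity / total)

print(serve_second(capacity=1000, legitimate=800, attack=0))       # 800: normal load
print(serve_second(capacity=1000, legitimate=800, attack=99_200))  # 8: under attack
```

The model makes the attackers’ strategy plain: they do not need to break anything, only to outnumber everyone else.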
During a DDoS attack on government servers belonging to both the United States and South Korea in July 2009, many U.S. government sites were rendered unavailable to users in Asia for a short time.Siobhan Gorman and Evan Ramstad, “Cyber Blitz Hits U.S., Korea,” Wall Street Journal, July 9, 2009, http://online.wsj.com/article/SB124701806176209691.html. Although this did not have a major effect on U.S. cyber-security, the ease with which these servers could be exploited was troubling. In this case, the DDoS attacks were perpetrated by an email virus known as MyDoom, which essentially turned users’ computers into server-attacking “zombies.” This exploit—coupling an email scam with a larger attack—is difficult to trace, partly because the culprit is likely not one of the original attackers, but rather the victim of a virus used to turn vulnerable computers into an automated hacker army. Since the attack, President Barack Obama has committed to creating a new post for a head of cyber-security in the government.
Most Internet users in the United States connect through a commercial Internet service provider (ISP). The major players—Comcast, Verizon, Time Warner Cable, AT&T, and others—are portals to the larger Internet, serving as a way for anyone with a cable line or phone line to receive broadband Internet access through a dedicated data line.
Ideally, ISPs treat all content impartially; any two websites will load at the same speed if they have adequate server capabilities. Service providers are not entirely happy with this arrangement. ISPs have proposed a new service model that would allow corporations to pay for a “higher tier” service. For example, this would allow AOL Time Warner to deliver its Hulu service (which Time Warner co-owns with NBC) faster than all other video services, leading to partnerships between Internet content providers and Internet service providers. The service providers also often foot the bill for expanding high-speed Internet access, and they see this new two-tiered service as a way to cash in on some of that investment (and, presumably, to reinvest the funds received).
The main fear—and the reason the FCC introduced net neutrality rules (the concept that service providers should treat all online content equally, rather than developing a tiered system that allows companies to pay for faster delivery of information)—is that such a service would hamper the ability of an Internet startup to grow its business. Defenders of net neutrality contend that small businesses (those without the ability to forge partnerships with the service providers) would be forced onto a “second-tier” Internet service, and their content would naturally suffer, decreasing inventiveness and competition among Internet content providers.
One of the key roadblocks to Internet legislation is the difficulty of describing the Internet and its place among the communication bills of the past. First, it is important to realize that legislation mandating the impartiality of service providers is not unheard of. Before the 1960s, AT&T was allowed to restrict its customers to using only its own telephones on its networks. In the 1960s, the FCC launched a series of “Computer Inquiries,” stating, in effect, that any customer could use any device on the network, as long as it did not actually harm the network. This opened the door to inventions such as the fax machine, which would not have been possible under AT&T’s previous rules.
A key point today is that these proto–net neutrality rules protected innovation even when they “threatened to be a substitute for regulated services.”Robert Cannon, “The Legacy of the Federal Communications Commission’s Computer Inquiries,” Federal Communication Law Journal 55, no. 2 (2003): 170. This is directly relevant to a controversy involving Apple’s iPhone that culminated in October 2009 when AT&T agreed to allow VoIP (voice over Internet Protocol) on its 3G data networks. VoIP services, like the program Skype, allow a user to place a telephone call from an Internet data line to a traditional telephone line. In the case of the iPhone, AT&T did not actually block the transmission of data—it just had Apple block the app from its App Store. Since AT&T runs the phone service as well as the data lines, and since many users have plans with unlimited data connections, AT&T could see its phone profits cut drastically if all its users suddenly switched to using Skype to place all their telephone calls.
Senator Ted Stevens, the former head of the committee in charge of regulating the Internet, said on the floor of the Senate that the Internet is “not a big truck … it’s a series of tubes.”Alex Curtis, “Senator Stevens Speaks on Net Neutrality,” Public Knowledge, June 28, 2006, http://www.publicknowledge.org/node/497. According to this metaphor, an email can get “stuck in the tubes” for days behind someone else’s material, leading to poorer service for the customer. In reality, service providers sell plans that cap the rate at which someone can send data over the Internet, measured in bits per second (the most common measure of Internet bandwidth; a bit, the smallest unit of data, is a single one or zero). If a service is rated at 1.5 million bits per second (1.5 megabits per second, or 1.5 Mbps), it may only reach that speed occasionally—no one can “clog the tubes” without paying massive amounts of money for the service. Theoretically, the company will then invest this service fee in building more robust “tubes.”
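The arithmetic behind such ratings is straightforward. The sketch below, using an invented 3-megabyte file, shows how a bandwidth cap translates into transfer time; real transfers also carry protocol overhead, which is ignored here.

```python
# Bandwidth caps throughput, so transfer time scales with file size.
def transfer_seconds(file_bytes, mbps):
    """Seconds to move `file_bytes` over a link rated at `mbps` megabits/s."""
    bits = file_bytes * 8                 # 8 bits per byte
    bits_per_second = mbps * 1_000_000    # Mbps -> bits per second
    return bits / bits_per_second

# A 3-megabyte file over a 1.5 Mbps line:
print(round(transfer_seconds(3_000_000, 1.5), 1))  # 16.0 seconds
```

The point of the calculation is that the cap is a rate, not a queue: a slow transfer finishes later, but it does not leave anyone else’s email “stuck in the tubes.”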
Net neutrality is difficult to legislate in part because it can be confusing: It relies on understanding how the Internet works and how communications are regulated. Stevens’s metaphor is misleading because it assumes that Internet capacity is not already regulated in some natural way. To use the superhighway analogy, Stevens is suggesting that the highways are congested, and his solution is to allow companies to dedicate express lanes for high-paying customers (it should be noted that the revenue would go to the service providers, even though the government has chipped in quite a bit for information superhighway construction). The danger is that a small business or personal site could scarcely afford express-lane access. Worse yet, the pro–net neutrality organization Save the Internet says that a lack of legislation would allow companies to “discriminate in favor of their own search engines” and “leave the rest of us on a winding dirt road.”Save the Internet, “FAQs,” 2010, http://www.savetheinternet.com/faq. For areas with access to only one Internet service provider, this could amount to losing access to disfavored content altogether.
Content on the Internet competes with content from other media outlets. Unlimited and cheap digital duplication of content removes the concept of scarcity from the economic model of media; it is no longer necessary to buy a physical CD fabricated by a company in order to play music, and digital words on a screen convey the news just as well as words printed in a physical newspaper. Media companies have been forced to reinvent themselves as listeners, readers, and watchers have divided into smaller and smaller subcategories.
Traditional media companies have had to evolve to adapt to the changes wrought by the Internet revolution, but these media are far from obsolete in an online world. For example, social media can provide a very inexpensive and reliable model for maintaining a band’s following. A record company (or the band itself) can start a Facebook page, through which it can notify all its fans about new albums and tour dates—or even just remind fans that it still exists. MySpace has been (and still is, to an extent) one of the main musical outlets on the Internet. This free service comes with a small web-based music player that allows people interested in the band to listen to samples of its music. Coupling free samples with social networking allows anyone to discover a band from anywhere in the world, leading to the possibility of varying and eclectic tastes not bound by geography.
There is a constantly growing market for people who know how to use social media effectively. Often, companies will hire someone specifically to manage their Facebook and Twitter feeds as another aspect of public relations and traditional marketing.
Read the article “5 True Things Social Media Experts Do Online,” written by social media writer Glen Allsopp. You can find it at http://www.techipedia.com/2010/social-media-expert-skills/.
Then, explore the site of Jonathan Fields, located at http://www.jonathanfields.com/blog/. After exploring for a bit, read the “About” section (the link is at the top). These two sites will help you answer the following questions:
Typing www.subservient-chicken.com into a web browser will lead the user to a site featuring video footage of a person dressed in a chicken suit. The user can then type commands, causing the chicken to perform a variety of actions. Although the chicken does not actively promote a particular product, it does appear on the Burger King website—the URL connects to a Burger King–hosted site. In fact, the Subservient Chicken was created for Burger King in 2005 to advertise a chicken sandwich. In only a year, more than 13 million unique visitors came to the site just to see what the chicken would do.Nat Ives, “Interactive Viral Campaigns Ask Users to Spread the Word,” New York Times, February 18, 2005, http://query.nytimes.com/gst/fullpage.html?res=9807EEDF113AF93BA25751C0A9639C8B63.
Given advertising’s somewhat scandalous history, the phenomenon of viral advertising is nothing short of incredible. During the 1800s, public opinion ranked advertising as an immoral, lowbrow profession, full of liars and outlandish publicists.William M. O’Barr, “A Brief History of Advertising in America,” Advertising & Society Review, 2005, http://muse.jhu.edu/journals/asr/v006/6.3unit02.html#O%27Barr. Early magazines such as Harper’s Weekly initially refused to carry ads. Even after most forms of media had begun accepting advertisements, the spots annoyed hosts as well as readers, listeners, and viewers. In both its radio and television formats, the popular Jack Benny Show regularly mocked the sponsor mentions for Jell-O that announcers had to make during the show. During the 1960s, advertisements actually capitalized on the American public’s fatigue with traditional advertising techniques. Nearly 40 years later, the arrival of digital video recorder (DVR) technology was heralded as a means for television watchers to eliminate obnoxious advertisements altogether. Then, in 2005, more than 13 million people sought out a commercial of their own volition—even repeatedly going back to it—to see what it would do next.
The Subservient Chicken campaign was truly a mix of advertising and public relations. The website was covered widely in the press, received commentary from bloggers, and was shared among millions of users who had discovered new commands for the chicken. Advertisements for the associated chicken sandwich on television and in print ads reinforced all this free publicity. This campaign demonstrates the ways that advertising and public relations are related and even complement each other. Given the popularity of such campaigns, it is not difficult to see the cultural effects of advertising and public relations.
Advertising is defined as promoting a product or service through the use of paid announcements.Dictionary.com, s.v. “Advertising,” http://dictionary.reference.com/browse/advertising. These announcements have had an enormous effect on modern culture, and thus deserve a great deal of attention in any treatment of the media’s influence on culture.
Figure 12.1

History of Advertising
Advertising dates back to ancient Rome’s public markets and forums and continues into the modern era in most homes around the world. Contemporary consumers relate to and identify with brands and products. Advertising has inspired an independent press and conspired to encourage carcinogenic addictions. An exceedingly human invention, advertising is an unavoidable aspect of the shared modern experience.
In 79 CE, the eruption of Italy’s Mount Vesuvius destroyed and, ultimately, preserved the ancient city of Pompeii. Historians have used the city’s archaeological evidence to piece together many aspects of ancient life. Pompeii’s ruins reveal a world in which the fundamental tenets of commerce and advertising were already in place. Merchants offered different brands of fish sauces identified by various names such as “Scaurus’ tunny jelly.” Wines were branded as well, and their manufacturers sought to position them by making claims about their prestige and quality. Toys and other merchandise found in the city bear the names of famous athletes, providing, perhaps, the first example of endorsement techniques.John Hood, Selling the Dream: Why Advertising Is Good Business (Westport, CT: Praeger, 2005), 12–13.
The invention of the printing press in 1440 made it possible to print advertisements that could be put up on walls and handed out to individuals. By the 1600s, newspapers had begun to include advertisements on their pages. Advertising revenue allowed newspapers to print independently of secular or clerical authority, eventually achieving daily circulation. By the end of the 17th century, most newspapers contained at least some advertisements.William M. O’Barr, “A Brief History of Advertising in America,” Advertising & Society Review 6, no. 3 (2005), http://muse.jhu.edu/journals/asr/v006/6.3unit02.html.
European colonization of the Americas during the 1600s brought about one of the first large-scale advertising campaigns. When European trading companies realized that the Americas held economic potential as a source of natural resources such as timber, fur, and tobacco, they attempted to convince others to cross the Atlantic Ocean and work to harvest this bounty. The advertisements for this venture described a paradise without beggars and with plenty of land for those who made the trip. The advertisements convinced many poor Europeans to become indentured servants to pay for the voyage.Christina B. Mierau, Accept No Substitutes: The History of American Advertising (Minneapolis, MN: Lerner, 2000), 7–8.
Figure 12.2

Early penny press papers such as the New York Sun took advantage of advertisements, which allowed them to sell their issues for a penny or two.
The rise of the penny press during the 1800s had a profound effect on advertising. The New York Sun embraced a novel advertising model in 1833 that allowed it to sell issues of the paper for a trifling amount of money, ensuring a higher circulation and a wider audience. This larger audience in turn justified greater prices for advertisements, allowing the paper to make a profit from its ads rather than from direct sales.Jennifer Vance, “Extra, Extra, Read All About It!” Penny Press, http://iml.jou.efl.edu/projects/Spring04/vance/pennypress.html.
The career of P. T. Barnum, cofounder of the famed Barnum & Bailey circus, gives a sense of the uncontrolled nature of advertising during the 1800s. He began his career in the 1840s writing ads for a theater, and soon after, he began promoting his own shows. He advertised these shows any way he could, using not only interesting newspaper ads but also bands of musicians, paintings on the outside of his buildings, and street-spanning banners.
Barnum also learned the effectiveness of using the media to gain attention. In an early publicity stunt, Barnum hired a man to wordlessly stack bricks at various corners near his museum during the hours preceding a show. When this activity drew a crowd, the man went to the museum and bought a ticket for the show. This stunt drew such large crowds over the next two days that the police made Barnum put a halt to it, which gained it even wider media attention. Barnum was also sued for fraud over a bearded woman featured in one of his shows; the plaintiffs claimed that she was, in fact, a man. Rather than trying to keep the trial quiet, Barnum drew attention to it by parading a crowd of witnesses attesting to the bearded woman’s gender, drawing more media attention—and more customers.
Figure 12.3

P. T. Barnum used the press to spark interest in his shows.
Barnum aimed to make his audience think about what they had seen for an extended time. His Feejee mermaid—actually a mummified monkey and fish sewn together—was not necessarily interesting because viewers thought the creation was really a mermaid, but because they weren’t sure if it was or not. Such marketing tactics brought Barnum’s shows out of his establishments and into social conversations and newspapers.Edd Applegate, Personalities and Products: A Historical Perspective on Advertising in America (Westport, CT: Greenwood Press, 1998), 57–64. Although most companies today would eschew Barnum’s outrageous style, many have used the media and a similar sense of mystery to promote their products. Apple, for example, famously keeps its products such as the iPhone and iPad under wraps, building media anticipation and coverage.
In 1843, a salesman named Volney Palmer founded the first U.S. advertising agency in Philadelphia. The agency made money by linking potential advertisers with newspapers. By 1867, other agencies had formed, and advertisements were being marketed at the national level. During this time, George Rowell, who made a living buying bulk advertising space in newspapers to subdivide and sell to advertisers, began conducting market research in its modern recognizable form. He used surveys and circulation counts to estimate numbers of readers and anticipate effective advertising techniques. His agency gained an advantage over other agencies by offering advertising space most suited for a particular product. This trend quickly caught on with other agencies. In 1888, Rowell started the first advertising trade magazine, Printers’ Ink.Ellen Gartrell, “More About Early Advertising Publications,” Digital Collections, Duke University Libraries, http://library.duke.edu/digitalcollections/eaa/printlit.html.
In an earlier chapter, you read about McClure’s success in 1893 thanks to an advertising model: selling issues for nearly half the price of other magazines and depending on advertising revenues to make up the difference between cost and sales price. Magazines such as Ladies’ Home Journal focused on specific audiences, so they allowed advertisers to market products designed for a specific demographic. By 1900, Harper’s Weekly, once known for refusing advertising, featured ads on half of its pages.All Classic Ads, “Advertising Timeline,” Vintage Collection, All Classic Ads, http://www.allclassicads.com/advertising-timeline.html.
Figure 12.4

In the early 1900s, brand-name food items, such as this one, began to become household names.
Another ubiquitous aspect of advertising developed around this time: brands. During most of the 19th century, consumers purchased goods in bulk, weighing out scoops of flour or sugar from large store barrels and paying for them by the pound. Innovations in industrial packaging allowed companies to mass-produce bags, tins, and cartons with brand names on them. Although brands existed before this time, they were generally reserved for goods that were inherently recognizable, such as china or furniture. Advertising a particular kind of honey or flour made it possible for customers to ask for that product by name, giving it an edge over the unnamed competition.Christina B. Mierau, Accept No Substitutes: The History of American Advertising (Minneapolis, MN: Lerner, 2000), 42.
Figure 12.5

Department stores such as Sears, Roebuck and Co. reached consumers outside of the city through mail-order catalogs.
The rise of department stores during the late 1800s also gave brands a push. Nationwide outlets such as Sears, Roebuck & Company and Montgomery Ward sold many of the same items to consumers all over the country. A particular item spotted in a big-city storefront could come to a small-town shopper’s home thanks to mail-order catalogs. Customers made associations with the stores, trusting them to have a particular kind of item and to provide quality wares. Essentially, consumers came to trust the store’s name rather than its specific products.John Hood, Selling the Dream: Why Advertising Is Good Business (Westport, CT: Praeger, 2005), 28–51.
Although advertising was becoming increasingly accepted as an element of mass media, many still regarded it as an unseemly occupation. This attitude began to change during the early 20th century. As magazines—widely considered a highbrow medium—began using more advertising, the advertising profession began attracting more artists and writers. Writers used verse and artists produced illustrations to embellish advertisements. Not surprisingly, this era gave rise to commercial jingles and iconic brand characters such as the Jolly Green Giant and the Pillsbury Doughboy.
The household cleaner Sapolio produced advertisements that made the most of the artistic advertising trend. Sapolio’s ads featured various drawings of the residents of “Spotless Town” along with a rhymed verse celebrating the virtues of this fictional haven of cleanliness. The public anticipated each new ad in much the same way people today anticipate new television episodes. In fact, the ads became so popular that citizens passed “Spotless Town” resolutions to clean up their own jurisdictions. Advertising trends later moved away from flowery writing and artistry, but the lessons of those memorable campaigns continued to influence the advertising profession for years to come.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 41–46.
World War I fueled an advertising and propaganda boom. Corporations that had switched to manufacturing wartime goods wanted to stay in the public eye by advertising their patriotism. Equally, the government needed to encourage public support for the war, employing such techniques as the famous Uncle Sam recruiting poster. President Woodrow Wilson established the advertiser-run Committee on Public Information to make movies and posters, write speeches, and generally sell the war to the public. Advertising helped popularize World War I on the homefront, and the war in turn gave advertising a much-needed boost in stature. The postwar return to regular manufacturing initiated the 1920s as an era of unprecedented advertising.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 74–77.
The rising film industry made celebrity testimonials, or product endorsements (public support from a celebrity or other well-known person for a particular product or service), an important aspect of advertising during the 1920s. Film stars including Clara Bow and Joan Crawford endorsed products such as Lux toilet soap. In these early days of mass-media consumer culture, film actors and actresses gave the public figures to emulate as they began participating in popular culture.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 89.
As discussed in an earlier chapter, radio became an accepted commercial medium during the 1920s. Although many initially thought radio too intrusive a medium for advertising because it entered people’s homes, by the end of the decade advertising had become an integral aspect of programming. Advertising agencies often created their own programs that networks then distributed. As advertisers conducted surveys and researched prime time slots, radio programming changed to appeal to their target demographics. The famous Lux Radio Theater, for example, was named for and sponsored by a brand of soap. Product placement was an important part of these early radio programs. Ads for Jell-O appeared during the course of the Jack Benny Show,JackBennyShow.com, “Jell-O,” Jack Benny Show, http://jackbennyshow.com/index_090.htm. and Fibber McGee and Molly scripts often involved their sponsor’s floor wax.Read G. Burgan, “Radio Fun with Fibber McGee and Molly,” RGB Digital Audio, January 24, 1996, http://www.rgbdigitalaudio.com/OTR_Reviews/Fibber_McGee_OTRArticle.htm. The relationship between a sponsor and a show’s producers was not always harmonious; the producers of radio programs were constrained from broadcasting any content that might reflect badly on their sponsor.
Unsurprisingly, the Great Depression, with its widespread decreases in levels of income and buying power, had a negative effect on advertising. Spending on ads dropped to a mere 38 percent of its previous level. Social reformers added to revenue woes by again questioning the moral standing of the advertising profession. Books such as Through Many Windows and Our Master’s Voice portrayed advertisers as dishonest and cynical, willing to say anything to make a profit and unconcerned about their influence on society. Humorists also questioned advertising’s authority. The Depression-era magazine Ballyhoo regularly featured parodies of ads, similar to those seen later on Saturday Night Live or in The Onion. These ads mocked the claims that had been made throughout the 1920s, further reducing advertising’s public standing.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 121–124.
This advertising downturn lasted only as long as the Depression. As the United States entered World War II, advertising again returned to encourage public support and improve the image of businesses.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 168. However, there was one lasting effect of the Depression. The rising consumer movement made false and misleading advertising a major public policy issue. At the time, companies such as Fleischmann’s (which claimed its yeast could cure crooked teeth) were using advertisements to pitch misleading assertions. Only business owners’ personal morals stood in the way of such claims until 1938, when the federal government gave the Federal Trade Commission (FTC), established in 1914, the authority to halt false advertising.
Figure 12.6

Shows such as Kraft Television Theatre were created by single sponsors— in this case, Kraft Foods Inc.
In 1955, television outpaced all other media for advertising. Television provided advertisers with unique, geographically oriented mass markets that could be targeted with regionally appropriate ads.Lawrence Samuel, Brought to You By: Postwar Television Advertising and the American Dream (Austin, TX: University of Texas Press, 2001), 88–94. The 1950s saw a 75 percent increase in advertising spending, growth faster than that of any other economic indicator at the time.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 173.
Single sponsors created early television programs. These sponsors had total control over programs such as Goodyear TV Playhouse and Kraft Television Theatre. Some sponsors went as far as to manipulate various aspects of the programs. In one instance, a program run by the DeSoto car company asked a contestant to use a false name rather than his given name, Ford. The present-day network model of television advertising took hold during the 1950s as the costs of television production made sole sponsorship of a show prohibitive for most companies. Rather than having a single sponsor, the networks began producing their own shows, paying for them through ads sold to a number of different sponsors.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 210–215. Under the new model of advertising, television producers had much more creative control than they had under the sole-sponsorship model.
The quiz shows of the 1950s were the last of the single-sponsor–produced programs. In 1958, when allegations of quiz show fraud became national news, advertisers moved out of programming entirely. The quiz show scandals also added to an increasing skepticism of ads and consumer culture.William Boddy, “The Seven Dwarfs and the Money Grubbers: The Public Relations Crisis of US Television in the Late 1950s,” in Logics of Television: Essays in Cultural Criticism, ed. Patricia Mellencamp (Bloomington, IN: Indiana University Press, 1990), 110.
Advertising research during the 1950s had used scientifically driven techniques to attempt to influence consumer opinion. Although the effectiveness of this type of advertising is questionable, the idea of consumer manipulation through scientific methods became an issue for many Americans. Vance Packard’s best-selling 1957 book The Hidden Persuaders targeted this style of advertising. The Hidden Persuaders and other books like it were part of a growing critique of 1950s consumer culture. The U.S. public was becoming increasingly wary of advertising claims, not to mention increasingly weary of ads themselves. A few adventurous ad agencies used this consumer fatigue to usher in a new era of advertising and American culture.Thomas Frank, The Conquest of Cool (Chicago: University of Chicago Press, 1998), 41.
Burdened by association with Nazi Germany, where the company had originated, Volkswagen took a daring risk during the 1950s. In 1959, the Doyle Dane Bernbach (DDB) agency initiated an ad campaign for the company that targeted skeptics of contemporary culture. Using a frank personal tone with the audience and making fun of the planned obsolescence that was the hallmark of Detroit automakers, the campaign stood apart from other advertisements of the time. It used many of the consumer icons of the 1950s, such as suburbia and game shows, in a satirical way, pitting Volkswagen against mainstream conformity and placing it strongly on the side of the consumer. By the end of the 1960s, the campaign had become an icon of American anticonformity. In fact, it was such a success that other automakers quickly emulated it. Ads for the Dodge Fever, for example, mocked corporate values and championed rebellion.Thomas Frank, The Conquest of Cool (Chicago: University of Chicago Press, 1998), 60–67, 159.
This era of advertising became known as the “creative revolution” for its emphasis on creativity over straight salesmanship. The creative revolution reflected the values of the growing anticonformist movement that culminated in the countercultural revolution of the 1960s. The creativity and anticonformity of 1960s advertising quickly gave way to more product-oriented conventional ads during the 1970s. Agency conglomeration, a recession, and cultural fallout were all factors in the recycling of older ad techniques. Major television networks dropped their long-standing ban on comparative advertising early in the decade, leading to a new trend in positioning ads that compared products. Advertising wars such as Coke versus Pepsi and, later, Microsoft versus Apple were products of this trend.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 324–325.
Innovations in the 1980s stemmed from a new television channel: MTV. Producers of youth-oriented products created ads featuring music and focusing on stylistic effects, mirroring the look and feel of music videos. By the end of the decade, this style had extended to more mainstream products. Campaigns for the pain reliever Nuprin featured black-and-white footage with bright yellow pills, whereas ads for Michelob used grainy atmospheric effects.New York Times, “How MTV Has Rocked Television Commercials,” October 9, 1989, http://www.nytimes.com/1989/10/09/business/the-media-business-how-mtv-has-rocked-television-commercials.html.
During the late 1980s, studies showed that consumers were trending away from brands and brand loyalty. A recession coupled with general consumer fatigue led to an increase in generic brand purchases and a decrease in advertising. In 1983, marketing budgets allocated 70 percent of their expenditures to ads and the remaining 30 percent to other forms of promotion. By 1993, only 25 percent of marketing budgets were dedicated to ads.Naomi Klein, No Logo (New York: Picador, 2002), 14.
These developments resulted in the rise of big-box stores such as Walmart that focused on low prices rather than expensive name brands. Large brands remade themselves during this period to focus less on their products and more on the ideas behind the brand. Nike’s “Just Do It” campaign, endorsed by basketball star Michael Jordan, gave the company a new direction and a new means of promotion. Nike representatives have stated they have become more of a “marketing-oriented company” as opposed to a product manufacturer.Naomi Klein, No Logo (New York: Picador, 2002), 12–22.
Figure 12.7

In the 1990s, Nike was the target of protests due to its questionable labor practices.
As large brands became more popular, they also attracted the attention of reformers. Companies such as Starbucks and Nike bore the brunt of late 1990s sweatshop and labor protests. As these brands attempted to incorporate ideas outside of the scope of their products, they also came to represent larger global commerce forces.Margot Hornblower, “Wake Up and Smell the Protest,” Time, April 17, 2000. This type of branding (the association of a particular brand with cultural values or lifestyles) increasingly incorporated public relations techniques that will be discussed later in this chapter.
Twenty-first-century advertising has adapted to new forms of digital media. Internet outlets such as blogs, social media forums, and other online spaces have created new possibilities for advertisers, and shifts in broadcasting toward Internet formats have threatened older forms of advertising. Video games, smartphones, and other technologies also present new possibilities. Specific new media advertising techniques will be covered in the next section.
Despite the rise of digital media, many types of traditional advertising have proven their enduring effectiveness. Local advertisers and large corporations continue to rely on billboards and direct-mail fliers. In 2009, Google initiated a billboard campaign for its Google Apps products that targeted business commuters. The billboards featured a different message every day for an entire month, using simple computer text messages portraying a fictitious executive learning about the product. Although this campaign was integrated with social media sites such as Twitter, its main thrust employed the basic billboard.Daniel Ionescu, “Google Billboard Ads Gun for Microsoft and Promote Google Apps,” PC World, August 3, 2009, http://www.pcworld.com/article/169475/google_billboard_ads_gun_for_microsoft_and_promote_google_apps.html.
Figure 12.8

Google billboards targeted commuters, creating a story that spanned the course of a month.
Although print ads have been around for centuries, Internet growth has hit newspaper advertising hard. A 45 percent drop in ad revenue between 2007 and 2010 signaled a catastrophic decline for the newspaper industry.Bruce Sterling, “More Newspaper Calamity,” Wired, March 15, 2010, http://www.wired.com/beyond_the_beyond/2010/03/more-newspaper-calamity/. Traditionally, newspapers have made money through commercial and classified advertising. Commercial advertisers, however, have moved to electronic media forms, and classified ad websites such as Craigslist offer greater geographic coverage for free. The future of newspaper advertising—and of the newspaper industry as a whole—is up in the air.
Print magazines have suffered from many of the same difficulties as newspapers. Declining advertising revenue has contributed to the end of popular magazines such as Gourmet and to the introduction of new magazines that cross over into other media formats, such as Food Network Magazine. Until a new, effective model is developed, the future of magazine advertising will continue to be in doubt.
Compared to newspapers and magazines, radio’s advertising revenue has done well. Radio’s easy adaptation to new forms of communication has made it an easy sell to advertisers. Unlike newspapers, radio ads target specific consumers. Advertisers can also pay to have radio personalities read their ads live in the studio, adding a sense of personal endorsement to the business or product. Because newer forms of radio such as satellite and Internet stations have continued to use this model, the industry has not had as much trouble adapting as print media have.
Television advertisement relies on verbal as well as visual cues to sell items. Promotional ad time is purchased by the advertiser, and a spot usually runs 15 to 30 seconds. Longer ads, known as infomercials, run like a television show and usually aim for direct viewer response. New technologies such as DVR allow television watchers to skip through commercials; however, studies have shown that these technologies do not have a negative effect on advertising.James Gallagher, “Duke Study: TiVo Doesn’t Hurt TV Advertising,” Triangle Business Journal, May 3, 2010, http://www.bizjournals.com/triangle/stories/2010/05/03/daily6.html. This is partly due to product placement. Product placement is an important aspect of television advertising because it incorporates products into the plots of shows. Although product placement has been around since the 1890s, when the Lumière brothers first placed Lever soap in their movies, the big boom in product placement began with the reality television show Survivor in 2000.Nate Anderson, “Product placement in the DVR era,” Ars Technica (blog), March 19, 2006, http://arstechnica.com/gadgets/news/2006/03/productplacement.ars. Since then, product placement has been a staple of prime-time entertainment. Reality television shows such as Project Runway and American Idol are known for exhibiting products on screen, and talk-show host Oprah Winfrey made news in 2004 when she gave away new Pontiacs to her audience members.Tanner Stansky, “14 Milestones in TV Product Placement,” Entertainment Weekly, July 28, 2008, http://www.ew.com/ew/article/0,,20215225,00.html. Even children’s shows are known to hawk products; a new cartoon series recently began on Nickelodeon featuring characters that represent different Skechers sneakers.Wayne Friedman, “Product Placement in Kids’ TV Programs: Stuff Your Footwear Can Slip On,” TV Watch, September 16, 2010, http://www.mediapost.com/publications/?fa=Articles.showArticle&art_aid=135873.
Emerging digital media platforms such as the Internet and mobile phones have created many new advertising possibilities. The Internet, like television and radio, offers free services in exchange for advertising exposure. However, unlike radio or television, the Internet is a highly personalized medium, one in which users’ private information can be shared with advertisers.
As you read earlier in this book, new advertising techniques have become popular on the Internet. Advertisers have tried to capitalize on the shared-media phenomenon by creating viral ads (advertisements that attain spontaneous and widespread popularity through the Internet). Fewer than one in six ads intended to go viral actually succeeds, so marketers have developed strategies to encourage an advertisement’s viral potential. Successful spots focus on creativity rather than a hard-selling strategy and generally target a specific audience.Fox Business, “Old Spice and E*TRADE Ads Provide Lessons in Viral Marketing,” March 17, 2010, http://www.foxbusiness.com/story/markets/industries/finance/old-spice-etrade-ads-provide-lessons-viral-marketing/. Recent Old Spice ads featured former NFL player Isaiah Mustafa in a set of continuous scenes, from a shower room to a yacht. The commercial ends with the actor on horseback, a theatrical trick that left viewers wondering how the stunt was pulled off. As of July 2010, the ad was the most popular video on YouTube with more than 94 million views, and Old Spice sales had risen 106 percent.Jack Neff, “How Much Old Spice Body Wash Has the Old Spice Guy Sold?” AdvertisingAge, July 26, 2010, http://adage.com/article?article_id=145096.
Social media sites such as Facebook use the information users provide on their profiles to generate targeted advertisements. For instance, if a person is a fan of Mariah Carey or joined a group associated with the singer, he or she might see announcements advertising her new CD or a local concert. While this may seem harmless, clicking on an ad sends user data to the advertising company, including name and user ID. Many people have raised privacy concerns over this practice, yet it remains in use. Free email services such as Gmail also depend on targeted advertising for their survival; indeed, without advertising, such services could not remain free. Given the ongoing privacy debates concerning targeted Internet advertising, a balance between users’ privacy and the accessibility of services will have to be struck in the near future.
Mobile phones provide several different avenues for advertisers. The growing use of Internet radio through mobile-phone platforms has created a market for advertisements tapped by radio advertising networks such as TargetSpot. By using the radio advertising model for mobile phones, users receive increased radio broadcast options and advertisers reach new targeted markets.Marketwire, “TargetSpot Enters the Mobile Advertising Market,” news release, SmartBrief, February 23, 2010, http://www.smartbrief.com/news/aaaa/industryMW-detail.jsp?id=4217DD5E-932F-460E-BE30-4988E17DEFEC.
Another development in the mobile-phone market is the use of advertising in smartphone apps. Free versions of mobile-phone applications often include advertising to pay for the service. Popular apps such as WeatherBug and Angry Birds offer free versions with ads in the margins; however, users can avoid these ads by paying a few dollars to upgrade to “Pro” versions. Other apps such as Foursquare access a user’s geographic location and offer ads for businesses within walking distance.Rik Fairlee, “Smartphone Users Go for Location-Based Apps,” PC Magazine, May 18, 2010, http://www.pcmag.com/article2/0,2817,2363899,00.asp.
Figure 12.9

Free smartphone apps often contain ads to help pay for the service.
Advertising regulation has played an important role in advertising’s history and cultural influence. One of the earliest federal laws addressing advertising was the Pure Food and Drug Law of 1906. A reaction to public outcry over the false claims of patent medicines, this law required informational labels to be placed on these products. It did not, however, address the questionable aspects of the advertisements themselves, so it did not truly delve into the issue of false advertising.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 65–66.
Founded in 1914, the Federal Trade Commission became responsible for regulating false advertising claims. Although federal laws concerning these practices made plaintiffs prove that actual harm was done by the advertisement, state laws passed during the early 1920s allowed prosecution of misleading advertisements regardless of harm done.John Hood, Selling the Dream: Why Advertising Is Good Business (Westport, CT: Praeger, 2005), 74–75. The National Association of Attorneys General has helped states remain an important part of advertising regulation. In 1995, 13 states passed laws that required sweepstakes companies to provide full disclosure of rules and details of contests.Thomas O’Guinn, Chris Allen, and Richard Semenik, Advertising and Integrated Brand Promotion (Mason, OH: Cengage Learning, 2009), 133.
During the Great Depression, New Deal legislation threatened to outlaw any misleading advertising, a result of the burgeoning consumer movement and the public outcry against advertising during the period.Time, “The Press: Advertising v. New Deal,” September 1, 1941, http://www.time.com/time/magazine/article/0,9171,850703,00.html. The reformers did not fully achieve their goals, but they did make a permanent mark on advertising history. The 1938 Wheeler-Lea Amendment expanded the FTC’s role to protect consumers from deceptive advertising. Until this point, the FTC was responsible for addressing false advertising complaints from competitors. With this legislation, the agency also became an important resource for the consumer movement.
In 1971, the FTC began the Advertising Substantiation Program to force advertisers to provide evidence for the claims in their advertisements. Under this program, the FTC gained the power to issue cease-and-desist orders to advertisers regarding specific ads in question and to order corrective advertising. Under this provision, the FTC can force a company to issue an advertisement acknowledging and correcting an earlier misleading ad. Regulations under this program established that supposed experts used in advertisements must be qualified experts in their field, and celebrities must actually use the products they endorse.Thomas O’Guinn, Chris Allen, and Richard Semenik, Advertising and Integrated Brand Promotion (Mason, OH: Cengage Learning, 2009), 131–137. In 2006, Sunny Health Nutrition was brought to court for advertising height-enhancing pills called HeightMax. The FTC found the company had hired an actor to appear as an expert in its ads and that the pills did not live up to their claim. Sunny Health Nutrition was forced to pay $375,000 to consumers for misrepresenting its product.ConsumerAffairs.com, “Feds Slam ‘Height-Enhancing’ Pills,” November 29, 2006, http://www.consumeraffairs.com/news04/2006/11/ftc_chitosan.html.
In 1992, the FTC introduced guidelines defining terms such as biodegradable and recyclable. The growth of the environmental movement in the early 1990s had led to an upsurge in environmental claims by manufacturers and advertisers. For example, Mobil Oil claimed its Hefty trash bags were biodegradable. While technically this statement is true, a 500- to 1,000-year decomposition cycle does not meet most people’s definitions of the term.Juliet Lapidos, “Will My Plastic Bag Still Be Here in 2507?” Slate, June 27, 2007, http://www.slate.com/id/2169287. The FTC guidelines made such claims illegal.Keith Schneider, “Guides on Environmental Ad Claims,” New York Times, July 29, 1992, http://www.nytimes.com/1992/07/29/business/guides-on-environmental-ad-claims.html.
The FTC has also turned its attention to online advertising. The Children’s Online Privacy Protection Act of 1998 was passed to prohibit companies from obtaining the personal information of children who access websites or other online resources. Because of the youth orientation of the Internet, newer advertising techniques have drawn increasing criticism. Alcohol companies in particular have come under scrutiny. Beer manufacturer Heineken’s online presence includes a virtual city in which users can own an apartment and use services such as email. This practice mirrors that of children’s advertising, in which companies often create virtual worlds to immerse children in their products. However, the age-verification requirements for participating in this type of environment are easily falsified and can expose young children to mature content.Amanda Gardner, “Alcohol Companies Use New Media to Lure Young Drinkers: Report,” Bloomberg BusinessWeek, May 19, 2010, http://www.businessweek.com/lifestyle/content/healthday/639266.html.
Consumer and privacy advocates who are concerned over privacy intrusions by advertisers have also called for Internet ad regulation. In 2009, the FTC acted on complaints against Sears that resulted in an injunction against the company for not providing sufficient disclosure. Sears offered $10 to consumers to download a program that tracked their Internet browsing. The FTC came down on Sears because the downloaded software tracked sensitive information that was not fully disclosed to the consumer. Similar consumer complaints against Facebook and Google for their consumer tracking have, at present, not resulted in FTC actions; however, the growing outcry makes new regulation of Internet advertising likely.Mike Shields, “Pitching Self-Regulation,” Adweek, February 15, 2010.
Discussing advertising’s influence on culture raises a long-standing debate. One opinion holds that advertising simply reflects the trends inherent in a culture; the other claims that advertising takes an active role in shaping culture. Both ideas have merit, and both are most likely true to varying degrees.
George Babbitt, the protagonist of Sinclair Lewis’s 1922 novel Babbitt, was a true believer in the growing American consumer culture:
Just as the priests of the Presbyterian Church determined his every religious belief … so did the national advertisers fix the surface of his life, fix what he believed to be his individuality. These standard advertised wares—toothpastes, socks, tires, cameras, instantaneous hot-water heaters—were his symbols and proofs of excellence; at first the signs, and then the substitutes, for joy and passion and wisdom.Sinclair Lewis, Babbitt (New York: Harcourt, Brace, and Co., 1922), 95.
Although Lewis’s Babbitt is a fictional representation of a 1920s-era consumer rather than an actual person, his attitudes reflect the national consumer culture that was taking shape at the time. As it had always done, advertising sought to attach products to larger ideas and symbols of worth and cultural values. However, the rise of mass media, and of the advertising models those media embraced, gave advertising an increasingly influential cultural role.
Automobile ads of the 1920s portrayed cars as a new, free way of life rather than simply a means of transportation. Advertisers used new ideas about personal hygiene to sell products and ended up breaking taboos about public discussion of the body. The newly acknowledged epidemics of halitosis and body odor brought about products such as mouthwash and deodorant. A Listerine campaign of the era transformed bad breath from a nuisance into the mark of a sociopath.Katherine Ashenburg, The Dirt on Clean: An Unsanitized History (Toronto: Vintage Canada, 2008), 245–247. Women’s underwear and menstruation went from being topics unsuitable for most family conversations to being fodder for the pages of national magazines.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 95–96.
Figure 12.10

Advertisements for deodorants and other hygiene products broke social taboos about public discussion of hygiene.
World War I bond campaigns had made it clear that advertising could be used to influence public beliefs and values. Advertising focused on the new—making new products and ideas seem better than older ones and ushering in a sense of modernity. In an address to the American Association of Advertising Agencies in 1926, President Coolidge went as far as to hold advertisers responsible for the “regeneration and redemption of mankind.”Roland Marchand, Advertising the American Dream: Making Way for Modernity, 1920–1940 (Berkeley: University of California Press, 1985), 7–9.
Up through the 1960s, most advertising agencies were owned and staffed by affluent white men, and advertising’s portrayals of typical American families reflected this status quo. Mainstream culture as propagated by magazine, radio, and newspaper advertising was that of middle- or upper-class white suburban families.Roland Marchand, Advertising the American Dream: Making Way for Modernity, 1920–1940 (Berkeley: University of California Press, 1985), 77–79. This sanitized image of the suburban family, popularized in such TV programs as Leave It to Beaver, has been mercilessly satirized since the cultural backlash of the 1960s.
A great deal of that era’s cultural criticism targeted the image of the advertiser as a manipulator and promulgator of superficial consumerism. When advertisers for Volkswagen picked up on this criticism, turned it to their advantage, and created a new set of consumer symbols that would come to represent an age of rebellion, they neatly co-opted the arguments against advertising for their own purposes. In many instances, advertising has functioned as a codifier of its own ideals by taking new cultural values and turning them into symbols of a new phase of consumerism. This is the goal of effective advertising.
Apple’s 1984 campaign is one of the most well-known examples of defining a product in terms of new cultural trends. A fledgling company compared with computer giants IBM and Xerox, Apple spent nearly $2 million on a commercial that would air only once.Curt McAloney, “The 1984 Apple Commercial: The Making of a Legend,” Curt’s Media, http://www.curtsmedia.com/cine/1984.html. During the third quarter of the 1984 Super Bowl, viewers across the United States watched in amazement as an ad unlike any other at the time appeared on their TV screens. The commercial showed a drab gray auditorium where identical individuals sat in front of a large screen. On the screen was a man, addressing the audience with an eerily captivating voice. “We are one people, with one will,” he droned. “Our enemies shall talk themselves to death. And we will bury them with their own confusion. We shall prevail!”Curt McAloney, “The 1984 Apple Commercial: The Making of a Legend,” Curt’s Media, http://www.curtsmedia.com/cine/1984.html. While the audience sat motionless, one woman ran forward with a sledgehammer and threw it at the screen, causing it to explode in a flash of light and smoke. As the scene faded out, a narrator announced the product: “On January 24, Apple Computer will introduce the Macintosh. And you’ll see why 1984 won’t be like 1984.”Ted Friedman, “Apple’s 1984: The Introduction of the Macintosh in the Cultural History of Personal Computers,” http://www.duke.edu/~tlove/mac.htm. With this commercial, Apple defined itself as a pioneer of the new generation. Instead of marketing its products as utilitarian tools, it advertised them as devices for combating conformity.Ted Friedman, “Apple’s 1984: The Introduction of the Macintosh in the Cultural History of Personal Computers,” http://www.duke.edu/~tlove/mac.htm. Over the next few decades, other companies imitated this approach, presenting their products as symbols of cultural values.
In his study of advertising’s cultural impact, The Conquest of Cool, Thomas Frank compares the advertising of the 1960s with that of the early 1990s:
How [advertisers] must have rejoiced when the leading minds of the culture industry announced the discovery of an all-new angry generation, the “Twenty-Somethings,” complete with a panoply of musical styles, hairdos, and verbal signifiers ready-made to rejuvenate advertising’s sagging credibility…. The strangest aspect of what followed wasn’t the immediate onslaught of even hipper advertising, but that the entire “Generation X” discourse repeated … the discussions of youth culture that had appeared in Advertising Age, Madison Avenue, and on all those youth-market panel discussions back in the sixties.Thomas Frank, The Conquest of Cool (Chicago: University of Chicago Press, 1998), 233–235.
To be clear, advertisers have not set out to consciously manipulate the public in the name of consumer culture. Rather, advertisers are simply doing their job—one that has had an enormous influence on culture.
The white, middle-class composition of ad agencies contributed to advertisements’ rare depictions of minority populations. DDB—the agency responsible for the Volkswagen ads of the 1960s—was an anomaly in this regard. One of its more popular ads was for Levy’s rye bread. Most conventional advertisers would have ignored the ethnic aspects of this product and simply marketed it to a mainstream white audience. Instead, the innovative agency created an ad campaign that made ethnic diversity a selling point, with spots featuring individuals from a variety of racial backgrounds eating the bread with the headline “You don’t have to be Jewish to love Levy’s.”
Figure 12.11

Unusual for the time, Levy’s rye bread ads made diversity a selling point.
During the 1950s, stereotypical images of African Americans promulgated by advertisers began to draw criticism from civil rights leaders. Icons such as Aunt Jemima, the Cream of Wheat chef, and the Hiram Walker butler were some of the most recognizable black figures in U.S. culture. Unlike the African Americans who had gained fame through their artistry, scholarship, and athleticism, however, these advertising characters were famous for being domestic servants.
During the 1960s, meetings of the American Association of Advertising Agencies (AAAA) hosted civil rights leaders, and agencies began to respond to the criticisms of bias. A New York survey in the mid-1960s discovered that blacks were underrepresented at advertising agencies. Many agencies responded by hiring new African American employees, and a number of black-owned agencies started in the 1970s.Stephen Fox, The Mirror Makers (New York: William Morrow, 1984), 278–284.
Early advertising frequently reached out to women because they made approximately 80 percent of all consumer purchases. Thus, women were well represented in advertising. However, those depictions presented women in extremely narrow roles. Through the 1960s, ads targeting women generally showed them performing domestic duties such as cooking or cleaning, whereas ads targeting men often placed women in a submissive sexual role even if the product lacked any overt sexual connotation. A National Car Rental ad from the early 1970s featured a disheveled female employee in a chair with the headline “Go Ahead, Take Advantage of Us.” Another ad from the 1970s pictured a man with new Dacron slacks standing on top of a woman, proclaiming, “It’s nice to have a girl around the house.”Mark Frauenfelder, “Creepy Slacks Ad From 1970,” Boing Boing, (blog), May 12, 2008, http://boingboing.net/2008/05/12/creepy-slacks-ad-fro.html.
An advertising profile printed in Advertising Age magazine gave a typical advertiser’s understanding of the housewife at the time:
She likes to watch TV and she does not enjoy reading a great deal. She is most easily reached through TV and the simple down-to-earth magazines…. Mental activity is arduous for her…. She is a person who wants to have things she can believe in rather than things she can think about.Jerome Rodnitzky, Feminist Phoenix: The Rise and Fall of a Feminist Counterculture (Westport, CT: Praeger, 1999), 114–115.
The National Organization for Women (NOW) created a campaign during the early 1970s targeting the role of women in advertisements. Participants complained about the ads to networks and companies and even spray-painted slogans on offensive billboards in protest.
Representation of minorities and women in advertising has improved since the 1960s and 1970s, but it remains a problem. The 2010 Super Bowl drew one of the most diverse audiences ever recorded for the event, including a 45 percent female audience. Yet the commercials remained focused strictly on men, and of the 67 ads shown during the game, only 4 featured minority actors in a lead role. Despite the obvious economic benefit of diversity in marketing, advertising practices have resisted change.Sam Ali, “New Study: Super Bowl Ads Created by White Men,” DiversityInc.com, May 10, 2010, http://www.diversityinc.com/article/7566/New-Study-Super-Bowl-Ads-Created-by-White-Men/.
The majority of advertisements that target children feature either toys or junk food. Children under the age of 8 typically lack the ability to distinguish between fantasy and reality, and many advertisers use this to their advantage. Studies have shown that most child-focused food advertisements feature high-calorie, low-nutrition foods such as sugary cereals. Although the government regulates advertising to children to a degree, the Internet has introduced new means of marketing to youth that have not been addressed. Online video games called advergames (games marketed to children that feature particular products) immerse young players in famous child-oriented brands. These games differ from traditional advertising, however, because the children playing them experience a much longer period of product exposure than they do from the typical 30-second television commercial. Child advocacy groups have been pushing for increased regulation of advertising to children, but it remains to be seen whether this will take place.Sandra Calvert, “Children as Consumers: Advertising and Marketing,” The Future of Children 18, no. 1 (Spring 2008): 205–211.
Although many people focus on advertising’s negative outcomes, the medium has provided unique benefits over time. Early newspaper advertising allowed newspapers to become independent of church and government control, encouraging the development of a free press with the ability to criticize powerful interests. When newspapers and magazines moved to an advertising model, these publications became accessible to large groups of people who previously could not afford them. Advertising also contributed to radio’s and television’s most successful eras. Radio’s golden age in the 1940s and television’s golden age in the 1950s both took place when advertisers were creating or heavily involved with the production of most of the programs.
Advertising also makes newer forms of media both useful and accessible. Many Internet services, such as email and smartphone applications, are only free because they feature advertising. Advertising allows promoters and service providers to reduce and sometimes eliminate the upfront purchase price, making these services available to a greater number of people and allowing lower economic classes to take part in mass culture.
Advertising has also been a longtime promoter of the arts. During the Renaissance, painters and composers often relied on wealthy patrons or governments to promote their work. Corporate advertising has given artists new means to fund their creative efforts. In addition, many artists and writers have been able to support themselves by working for advertisers. The use of music in commercials, particularly in recent years, has provided musicians with exposure and income. Indeed, it is hard to imagine the cultural landscape of the United States without advertising.
Whereas advertising is the paid use of media space to sell something, public relations (PR)—the actions an organization uses to communicate with its constituents—is the attempt to establish and maintain good relations between an organization and those constituents.Alison Theaker, The Public Relations Handbook (Oxfordshire, England: Routledge, 2004), 4. Practically, PR campaigns strive to use the free press to encourage favorable coverage. In their book The Fall of Advertising and the Rise of PR, Al and Laura Ries make the point that the public trusts the press far more than it trusts advertisements. Because of this, PR efforts that get products and brands into the press are far more valuable than a simple advertisement. Their book details the ways in which modern companies use public relations to far greater benefit than they use advertising.Al Ries and Laura Ries, The Fall of Advertising and the Rise of PR (New York: HarperBusiness, 2004), 90. Regardless of the fate of advertising, PR has clearly come to play an increasing role in marketing and ad campaigns.
Table 12.1 Grunig and Hunt’s Four PR Models
| Type of Model | Description | Example |
|---|---|---|
| Traditional publicity model (the press agentry model) | Professional agents seek media coverage for a client, product, or event. | Thong-clad actor Sacha Baron Cohen promotes Bruno by landing in Eminem’s lap at the 2009 MTV Video Music Awards. |
| Public information model | Businesses communicate information to gain desired results. | Colleges send informational brochures to potential students; a company includes an “about” section on its website. |
| Persuasive communication model (the two-way asymmetric model) | Organizations attempt to persuade an audience to take a certain point of view. | Public service announcements like the one that shows “your brain” and “your brain on drugs.” |
| Two-way symmetric model | Both parties make use of a back-and-forth discussion. | A company sends out customer satisfaction surveys; company Facebook groups and message boards. |
Source: James E. Grunig and Todd Hunt, Managing Public Relations (Belmont, CA: Wadsworth Publishing, 1984).
Todd Hunt and James Grunig developed a theory of four models of PR. These models have held up in the years since their development and are a good introduction to PR concepts.James E. Grunig and Todd Hunt, Managing Public Relations (Belmont, CA: Wadsworth Publishing, 1984).
Under the traditional publicity model (a PR model that aims to gain media attention), PR professionals seek to create media coverage for a client, product, or event. These efforts can range from wild publicity stunts to simple news conferences to celebrity interviews in fashion magazines. P. T. Barnum was an early American practitioner of this kind of PR. His outrageous attempts at publicity worked because he was not worried about receiving negative press; instead, he believed that any coverage was a valuable asset. More recent examples of this style of extreme publicity include controversy-courting musicians such as Lady Gaga and Marilyn Manson. More restrained examples of this type of PR include the modern phenomenon of faded celebrities appearing on television shows, such as Paula Abdul’s long-running appearances on American Idol.
The goal of the public information model (a PR model that attempts to pass information on to the public) is to release information to a constituency. This model is less concerned with obtaining dramatic, extensive media coverage than with disseminating information in a way that ensures adequate reception. For example, utility companies often include fliers about energy efficiency with customers’ bills, and government agencies such as the IRS issue press releases to explain changes to existing codes. In addition, public interest groups release the results of research studies for use by policy makers and the public.
The persuasive communication model (a PR model that uses persuasive techniques to elicit a particular response from the target group), also known as the two-way asymmetric model, works to persuade a specific audience to adopt a certain behavior or point of view. To be considered effective, this model requires a measured response from its intended audience.
Figure 12.12

Edward Bernays created campaigns using the persuasive communication model.
Government propaganda is a good example of this model. Propaganda is the organized spreading of information to assist or weaken a cause.Dictionary.com, s.v. “Propaganda,” http://dictionary.reference.com/browse/propaganda. Edward Bernays has been called the founder of modern PR for his work during World War I promoting the sale of war bonds. One of the first professional PR experts, Bernays made the two-way asymmetric model his early hallmark. In a famous campaign for Lucky Strike cigarettes, he convinced a group of well-known celebrities to walk in the New York Easter parade smoking Lucky Strikes. Most modern corporations employ the persuasive communication model.
The two-way symmetric model (a PR model that seeks to achieve consensus between two groups) requires true communication between the parties involved. It facilitates a back-and-forth discussion that results in mutual understanding and an agreement respecting the wishes of both parties. This model is often practiced in town hall meetings and other public forums in which the public has a real effect on the results. In an ideal republic, congressional representatives would strictly employ this model. Many nonprofit groups that are run by boards and have public service mandates use this model to ensure continued public support.
Commercial ventures also rely on this model. PR can generate media attention or attract customers, and it can also ease communication between a company and its investors, partners, and employees. The two-way symmetric model is useful in communicating within an organization because it helps employees feel they are an important part of the company. Investor relations are also often carried out under this model.
Either private PR companies or in-house communications staffers carry out PR functions. A PR group generally handles all aspects of an organization’s or individual’s media presence, including company publications and press releases. Such a group can range from just one person to dozens of employees depending on the size and scope of the organization.
PR groups carry out a wide range of such functions on behalf of the organizations they represent.
Figure 12.13

Anatomy of a PR campaign
PR campaigns occur for any number of reasons. They can be a quick response to a crisis or emerging issue, or they can stem from a long-term strategy tied in with other marketing efforts. Regardless of its purpose, a typical campaign often involves four phases.
The first step of many PR campaigns is the initial research phase. First, practitioners identify and qualify the issue to be addressed. Then, they research the organization itself to clarify issues of public perception, positioning, and internal dynamics. Strategists can also research the potential audience of the campaign, which may include media outlets, constituents, consumers, and competitors. Finally, the context of the campaign is often researched, including the possible consequences of the campaign and its potential effects on the organization. After considering all of these factors, practitioners are better equipped to select the best type of campaign.
During the strategy phase, PR professionals usually determine objectives focused on the desired goal of the campaign and formulate strategies to meet those objectives. Broad strategies such as deciding on the overall message of a campaign and the best way to communicate the message can be finalized at this time.
During the tactics phase, the PR group decides on the means to implement the strategies they formulated during the strategy phase. This process can involve devising specific communication techniques and selecting the forms of media that suit the message best. This phase may also address budgetary restrictions and possibilities.
After the overall campaign has been determined, PR practitioners enter the evaluation phase. The group can review their campaign plan and evaluate its potential effectiveness. They may also conduct research on the potential results to better understand the costs and benefits of the campaign. Specific criteria for evaluating the campaign when it is completed are also established at this time.Ronald Smith, Strategic Planning for Public Relations (Mahwah, NJ: Erlbaum Associates, 2002), 9–11.
Since its modern inception in the early 20th century, PR has turned out countless campaigns—some highly successful, others dismal failures. Some of these campaigns have become particularly significant for their lasting influence or creative execution. This section describes a few notable PR campaigns over the years.
During the 1930s, the De Beers company held an enormous supply of diamonds but faced a relatively small market of luxury buyers. It launched a PR campaign to change the image of diamonds from a luxury good into an accessible and essential aspect of American life. The campaign began by giving diamonds to famous movie stars, using their built-in publicity networks to promote De Beers. The company created stories about celebrity proposals and gifts between lovers that stressed the size of the diamonds given. These stories were then placed in selected fashion magazines. The result of this campaign was the popularization of diamonds as a necessary part of a marriage proposal.Stuart Reid, “The Diamond Myth,” Atlantic, http://www.theatlantic.com/magazine/archive/2006/12/the-diamond-myth/5491/.
Figure 12.14

In response to the increasing number of health concerns surrounding smoking, tobacco companies began running ads that argued the benefits of smoking their brand.
In 1953, studies showing the detrimental health effects of smoking caused a drop in cigarette sales. An alliance of tobacco manufacturers hired the PR group Hill & Knowlton to develop a campaign to deal with this problem. The first step of the campaign Hill & Knowlton devised was the creation of the Tobacco Industry Research Committee (TIRC) to promote studies that questioned the health effects of tobacco use. The TIRC ran advertisements featuring the results of these studies, giving journalists who were addressing the subject an easy source to quote. The groups working against smoking were not familiar with media relations, making it harder for journalists to quote them and use their arguments.
The campaign was effective, however, not because it denied the harmful effects of smoking but because it stressed the disagreements between researchers. By providing the press with information favorable to the tobacco manufacturers and publicly promoting new filtered cigarettes, the campaign aimed to replace the idea that smoking was undeniably bad with the idea that there was disagreement over the effects of smoking. This strategy served tobacco companies well up through the 1980s.
When the Russian space station Mir was set to crash-land in the Pacific Ocean in 2001, Taco Bell created a floating vinyl target that the company placed in the Pacific. Taco Bell promised to give every American a free taco if the space station hit the target. This simple PR stunt gave all the journalists covering the Mir crash landing a few lines to add to their stories. Scientists even speculated on the chances of the station hitting the target—slim to none. Ultimately, the stunt earned Taco Bell global publicity.BBC World, “Taco Bell Cashes in on Mir,” March 20, 2001, http://news.bbc.co.uk/2/hi/americas/1231447.stm.
Figure 12.15

Taco Bell floated a target in the Pacific Ocean as part of a PR campaign.
In some cases, PR has begun overtaking advertising as the preferred way of promoting a particular company or product. For example, the tobacco industry offers a good case study of the migration from advertising to PR. Regulations prohibiting radio and television cigarette advertisements had an enormous effect on sales. In response, the tobacco industry began using PR techniques to increase brand presence.
Tobacco company Philip Morris started underwriting cultural institutions and causes as diverse as the Joffrey Ballet, the Smithsonian, environmental awareness, and health concerns. Marlboro sponsored events that brought a great deal of media attention to the brand. For example, during the 1980s, the Marlboro Country Music Tour took famous country stars to major coliseums throughout the country and featured talent contests that brought local bands up on stage, increasing the audience even further. Favorable reviews of the shows generated positive press for Marlboro. Later interviews with country artists and books on country music history have also mentioned this tour.
On the fifth anniversary of the Vietnam Veterans Memorial in 1987, Marlboro’s PR groups organized a celebration hosted by comedian Bob Hope. Country music legends the Judds and Alabama headlined the show, and Marlboro paid for new names to be inscribed on the memorial. By attaching the Marlboro brand to such an important cultural event, the company gained an enormous amount of publicity. Just as importantly, these efforts at least partially restored the stature that the brand had lost due to health concerns.Leonard Saffir, Power Public Relations: How to Master the New PR (Lincolnwood, IL: NTC Contemporary, 2000), 77–88.
While advertising is an essential aspect of initial brand creation, PR campaigns are vital to developing the more abstract aspects of a brand. These campaigns work to position a brand in the public arena in order to give it a sense of cultural importance.
Pioneered by such companies as Procter & Gamble during the 1930s, the older, advertising-centric model of branding focused on the product, using advertisements to associate a particular branded good with quality or some other positive cultural value. Yet, as consumers became exposed to ever-increasing numbers of advertisements, traditional advertising’s effectiveness dwindled. The ubiquity of modern advertising means the public is skeptical of—or even ignores—claims advertisers make about their products. PR professionals with good promotional strategies, however, can help overcome this credibility gap.
The new PR-oriented model of branding focuses on the overall image of the company rather than on the specific merits of the product. This branding model seeks to associate a company with specific personal and cultural values that hold meaning for consumers. In the early 1990s, for example, car company Saturn marketed its automobiles not as a means of transportation but as a form of culture. PR campaigns promoted the image of the Saturn family, associating the company with powerful American values and giving Saturn owners a sense of community. Events such as the 1994 Saturn homecoming sought to encourage this sense of belonging. Some 45,000 people turned out for this event; families gave up their beach holidays simply to come to a Saturn manufacturing plant in Tennessee to socialize with other Saturn owners and tour the facility.
Recently, Toyota faced a marketing crisis when it instituted a massive recall based on safety issues. To counter the bad press, the company launched a series of commercials featuring top Toyota executives, urging the public to keep their faith in the brand.Sharon Bernstein, “Toyota faces a massive marketing challenge,” Los Angeles Times, February 9, 2010, http://articles.latimes.com/2010/feb/09/business/la-fi-toyota-marketing10-2010feb10. Much like the Volkswagen ads half a century before, Toyota adopted a self-aware style to market its automobiles. The positive PR campaign presented Toyotas as cars with a high standard of excellence, backed by a company striving to meet customers’ needs.
Apple has also employed this type of branding with great effectiveness. By focusing on a consistent design style in which every product reinforces the Apple experience, the computer company has managed to position itself as a mark of individuality. Despite the cynical outlook of many Americans regarding commercial claims, the notion that Apple is a symbol of individualism has been adopted with very little irony. Douglas Atkin, who has written about brands as a form of cult, readily admits and embraces his own brand loyalty to Apple:
I’m a self-confessed Apple loyalist. I go to a cafe around the corner to do some thinking and writing, away from the hurly-burly of the office, and everyone in that cafe has a Mac. We never mention the fact that we all have Macs. The other people in the cafe are writers and professors and in the media, and the feeling of cohesion and community in that cafe becomes very apparent if someone comes in with a PC. There’s almost an observable shiver of consternation in the cafe, and it must be discernable to the person with the PC, because they never come back.
Brand managers that once focused on the product now find themselves in the role of community leaders, responsible for the well-being of a cultural image.Douglas Atkin, interview, Frontline, PBS, February 2, 2004, http://www.pbs.org/wgbh/pages/frontline/shows/persuaders/interviews/atkin.html.
Kevin Roberts, the current CEO of Saatchi & Saatchi Worldwide, a branding-focused creative organization, has used the term “lovemark” as an alternative to trademark. This term encompasses brands that have created “loyalty beyond reason,” meaning that consumers feel loyal to a brand in much the same way they would toward friends or family members. Creating a sense of mystery around a brand generates an aura that bypasses the usual cynical take on commercial icons. A great deal of Apple’s success comes from the company’s mystique. Apple has successfully developed PR campaigns surrounding product releases that leak selected rumors to various press outlets but maintain secrecy over essential details, encouraging speculation by bloggers and mainstream journalists on the next product. All this combines to create a sense of mystery and an emotional anticipation for the product’s release.
Emotional connections are crucial to building a brand or lovemark. An early example of this kind of branding was Nike’s product endorsement deal with Michael Jordan during the 1990s. Jordan’s amazing, seemingly magical performances on the basketball court created his immense popularity, which was then further built up by a host of press outlets and fans who developed an emotional attachment to Jordan. As this connection spread throughout the country, Nike associated itself with Jordan and also with the emotional reaction he inspired in people. Essentially, the company inherited a PR machine that had been built around Jordan and that continued to function until his retirement.Kevin Roberts, interview, Frontline, PBS, December 15, 2003, http://www.pbs.org/wgbh/pages/frontline/shows/persuaders/interviews/roberts.html.
An important part of maintaining a consistent brand is preserving the emotional attachment consumers have to that brand. Just as PR campaigns build brands, PR crises can damage them. For example, the massive Gulf of Mexico oil spill in 2010 became a PR nightmare for BP, an oil company that had been using PR to rebrand itself as an environmentally friendly energy company.
In 2000, BP began a campaign presenting itself as “Beyond Petroleum,” rather than British Petroleum, the company’s original name. By acquiring a major solar company, BP became the world leader in solar production, and in 2005 it announced it would invest $8 billion in alternative energy over the following 10 years. BP’s marketing firm developed a PR campaign that, at least on the surface, emulated the forward-looking two-way symmetric PR model. The campaign conducted interviews with consumers, giving them an opportunity to air their grievances and publicize energy policy issues. BP’s website featured a carbon footprint calculator consumers could use to calculate the size of their environmental impact.Gregory Solman, “BP: Coloring Public Opinion?” Adweek, January 14, 2008, http://www.adweek.com/aw/content_display/news/strategy/e3i9ec32f006d17a91cd72d6192b9f7599a. A single explosion on BP’s deep-water oil rig in the Gulf of Mexico essentially nullified the PR work of the previous 10 years, immediately putting BP at the bottom of the list of environmentally concerned companies.
A company’s control over what its brand symbolizes can also lead to branding issues. The Body Shop, a cosmetics company that gained popularity during the 1980s and early 1990s, used PR to build its image as a company that created natural products and took a stand on issues of corporate ethics. The company teamed up with Greenpeace and other environmental groups to promote green issues and increase its natural image.
By the mid-1990s, however, revelations about the unethical treatment of franchise owners called this image into serious question. The Body Shop had spent a great deal of time and money creating its progressive, spontaneous image. Stories of travels to exotic locations to research and develop cosmetics were completely fabricated, as was the company’s reputation for charitable contributions. Even the origins of the company had been made up as a PR tool: the idea, name, and even product list had been ripped off from a small California chain called the Body Shop that was later given a settlement to keep quiet. The PR campaign of the Body Shop made it one of the great success stories of the early 1990s, but the unfounded nature of its PR claims undermined its image dramatically. Competitor L’Oréal eventually bought the Body Shop for a fraction of its previous value.Jon Entine, “Queen of Green Roddick’s ‘Unfair Trade’ Started When She Copied Body Shop Formula,” Daily Mail (London), September 15, 2007, http://www.dailymail.co.uk/femail/article-482012/Queen-Green-Roddicks-unfair-trade-started-copied-Body-Shop-formula.html.
Other branding backlashes have plagued companies such as Nike and Starbucks. By building their brands into global symbols, both companies also came to represent unfettered capitalist greed to those who opposed them. During the 1999 World Trade Organization protests in Seattle, activists targeted Starbucks and Nike stores for physical attacks such as window smashing. Labor activists have also condemned Nike over the company’s use of sweatshops to manufacture shoes. Eventually, Nike created a vice president for corporate responsibility to deal with sweatshop issues.Naomi Klein, No Logo (New York: Picador, 2002), 366.
Adbusters, a publication devoted to reducing advertising’s influence on global culture, added action to its criticisms of Nike by creating its own shoe. Manufactured in union shops, Blackspot shoes contain recycled tire rubber and hemp fabric. The Blackspot logo is a simple round dot that looks like it has been scribbled with white paint, as if a typical logo had been covered over. The shoes also include a symbolic red dot on the toe with which to kick Nike. Blackspot shoes use the Nike brand to create their own anti-brand, symbolizing progressive labor reform and environmentally sustainable business practices.Nat Ives, “Anti-Ad Group Tries Advertising,” New York Times, September 21, 2004, http://www.nytimes.com/2004/09/21/business/media/21adco.html.
Figure 12.16

Blackspot shoes developed as an anti-brand alternative to regular sneakers.
Politics and PR have gone hand in hand since the dawn of political activity. Politicians communicate with their constituents and make their message known using PR strategies. Benjamin Franklin’s trip as ambassador to France during the American Revolution stands as an early example of political PR that followed the publicity model. At the time of his trip, Franklin was an international celebrity, and the fashionable society of Paris celebrated his arrival; his choice of a symbolic American-style fur cap immediately inspired a new style of women’s wigs. Franklin also took a printing press with him to produce leaflets and publicity notices that circulated through Paris’s intellectual and fashionable circles. Such PR efforts eventually led to a treaty with France that helped the colonists win their freedom from Great Britain.Walter Isaacson, Benjamin Franklin: An American Life (New York: Simon & Schuster, 2003), 325–349.
Famous 20th-century PR campaigns include President Franklin D. Roosevelt’s Fireside Chats, a series of radio addresses that explained aspects of the New Deal. Roosevelt’s personal tone and his familiarity with the medium of radio helped the Fireside Chats become an important promotional tool for his administration and its programs. These chats aimed to justify many New Deal policies, and they helped the president bypass the press and speak directly to the people. More recently, Blackwater Worldwide, a private military company, dealt with criticisms of its actions in Iraq by changing its name. The new name, Xe Services, was the result of a large-scale PR campaign to distance the company from associations with civilian violence.Associated Press, “Blackwater Ditches Tarnished Brand Name,” USA Today, February 13, 2009, http://www.usatoday.com/news/military/2009-02-13-blackwater_N.htm.
The proliferation of media outlets and the 24-hour news cycle have led to changes in the way politicians handle PR. The gap between old PR methods and new ones became evident in 2006, when then–Vice President Dick Cheney accidentally shot a friend during a hunting trip. Cheney, who had been criticized in the past for being secretive, did not make a statement about the accident for three days. Republican consultant Rich Galen explained Cheney’s silence as an older PR tactic that tries to keep the discussion out of the media. However, the old trick is less effective in the modern digital world.
That entire doctrine has come and gone. Now the doctrine is you respond instantaneously, and where possible with a strong counterattack. A lot of that is because of the Internet, a lot of that is because of cable television news.Associated Press, “Cheney Hunting Accident Seen as P.R. Disaster,” MSNBC, February 16, 2006, http://www.msnbc.msn.com/id/11396608/ns/politics/.
PR techniques have been used in propaganda efforts throughout the 20th century. During the 1990s, the country of Kuwait employed Hill & Knowlton to encourage U.S. involvement in the Persian Gulf region. One of the more infamous examples of this campaign was a heavily reported account by a Kuwaiti girl testifying that Iraqi troops had dumped babies out of incubators in Kuwaiti hospitals. Outrage over this testimony helped galvanize opinion in favor of U.S. involvement. As it turned out, the Kuwaiti girl was in fact the daughter of the Kuwaiti ambassador and had not witnessed any of the alleged atrocities.Patricia Parsons, Ethics in Public Relations (Sterling, VA: Chartered Institute of Public Relations, 2005), 7.
Lobbyists also attempt to influence public policy using PR campaigns. The Water Environment Federation, a lobbying group representing the sewage industry, initiated a campaign to promote the application of sewage on farms during the early 1990s. The campaign coined the word biosolids to replace the term sludge, then worked to popularize the new term—and sewage as a fertilizer—by providing information to public officials and representatives. In 1992, the U.S. Environmental Protection Agency adopted the new term and reclassified biosolids from a hazardous waste to a fertilizer. This renaming helped New York City eliminate tons of sewage by shipping it to states that allowed biosolids.John Stauber and Sheldon Rampton, Toxic Sludge is Good for You! (Monroe, ME: Common Courage Press, 1995), 105–119.
Politics has also embraced branding. Former President Bill Clinton described his political battles in terms of a brand war:
[The Republicans] were brilliant at branding. They said they were about values…. Everybody is a values voter, but they got the brand … they said they were against the death tax … what a great brand…. I did a disservice to the American people not by putting forth a bad plan, but by not being a better brander, not being able to explain it better.David Kiley, “How Will Bill Clinton Manage His Brand?” BusinessWeek, June 10, 2008, http://www.businessweek.com/bwdaily/dnflash/content/jun2008/db2008069_046398.htm.
Branding has been used to great effect in recent elections. A consistently popular political brand is that of the outsider, or reform-minded politician. Despite his many years of service in the U.S. Senate, John McCain famously adopted this brand during the 2008 presidential election. McCain’s competitor, Barack Obama, also employed branding strategies. The Obama campaign featured several iconic portraits and slogans that made for a consistent brand and encouraged his victory in 2008. Before Obama’s inauguration in January 2009, an unprecedented amount of merchandise was sold, a further testament to the power of branding.Sheldon Alberts, “Brand Obama,” Financial Post, January 17, 2009, http://www.financialpost.com/m/story.html?id=1191405.
Figure 12.17

The 2008 Obama campaign used logos as a way to publicize Obama’s brand.
That so many different groups have adopted branding as a means of communication is a testament to its ubiquity. Even anti-commercial, anti-brand groups such as Adbusters have created brands to send messages. Social media sites have also encouraged branding techniques by allowing users to create profiles of themselves that they use to communicate their core values. This personal application is perhaps the greatest evidence of the impact of advertising and PR on modern culture. Branding, once a technique used by companies to sell their products, has become an everyday means of communication.
Please answer the following short-answer questions. Each response should be a minimum of one paragraph.
Review Questions
Questions for Section 12.1 "Advertising"
Questions for Section 12.2 "Public Relations"
Advertising has had an enormous influence on the ways that people present and imagine themselves. Personal branding has become an industry, with consultants and coaches ready to help anyone find his or her own brand. Creating a personal brand is a useful way to assess your skills and feelings about the advertising or PR professions.
Research the term personal brand using a search engine. Look for strategies that would help you construct your own brand. Imagine that you are a brand and describe what that brand offers. This does not need to be limited to professional capacities, but should represent your personal philosophy and life experiences. In 15 words or less, write a description of your brand.
Answer the following questions about your brand description:
Figure 13.1
In the late 19th century, Andrew Carnegie had a brilliant idea. Instead of buying materials and manufacturing steel, Carnegie bought up mines, railways, and all other aspects of the industry, pioneering a business model that later became known as vertical integration, in which a company owns both its suppliers and buyers. Gathering, manufacturing, and delivering raw materials and finished goods all under the control of a single corporation cut out the middlemen and let Carnegie drive the competition out of certain markets, sending his profits soaring. A century later, this same strategy still works; it may not drive industrialization, but its effects are just as powerful.
In late 2009, cable company Comcast announced a plan to purchase a controlling ownership stake in NBC Universal.Tim Arango, “G.E. Makes It Official: NBC Will Go to Comcast,” New York Times, December 4, 2009, http://www.nytimes.com/2009/12/04/business/media/04nbc.html. This multibillion-dollar deal would give Comcast a 51 percent stake in the company, with current owner General Electric (GE) retaining the other 49 percent. The proposed venture brought together all NBC Universal content—including Universal Pictures and Focus Features; the Spanish-language network Telemundo; and the cable networks USA, Bravo, CNBC, and MSNBC—with Comcast’s cable channels, which include E! Entertainment, the Golf Channel, and the sports network Versus. Already one of the nation’s largest cable and broadband Internet providers, Comcast would then conceivably have the power to restrict these hugely popular NBC-owned networks to its own cable service, thus forcing consumers to adopt Comcast in order to watch them, or to charge competitors’ cable subscribers huge premiums for the channels, thereby making its own cable service more desirable.
The most concerning—or beneficial, for Comcast—aspect of this merger is how it may integrate online content with traditional cable media. NBC Universal cofounded Hulu, the second-largest online video channel in the United States. If Comcast sees ad-driven sites such as Hulu as a threat to its cable business, then ownership over the online video portal would allow Comcast to restrict the site and all of NBC’s online content to its own cable subscribers. In effect, Comcast would be allowed to create a subscription model for Internet content, just as it sells subscriptions for cable content. For years, viewers have been able to pick and choose from a wide variety of sources, selecting only the online content that they want; now, some fear that Comcast could bring the problems of a cable subscription—hundreds of channels but only some worth watching—to the Internet.Alex Chasick, “Why a Comcast/NBC Merger Is Bad News,” Consumerist (blog), December 3, 2009, http://consumerist.com/2009/12/why-a-comcastnbc-merger-is-bad-news.html.
This merger has the potential to reshape the way that mass media is produced and distributed to consumers. When most Internet users subscribed to America Online (AOL), the company set up its own site simply as a portal to other companies’ content. The proposed integration of content producers and service providers, however, allows for unprecedented control of Internet content. Net neutrality poses another problem; Comcast could potentially grant its own content channels—such as a subscription-only version of Hulu—privileges over competing channels. While this does not necessarily pose a problem when there is healthy competition, in many regions Comcast is the only provider of broadband Internet, thus raising concerns of a potential monopoly. No matter what happens with this particular merger, it seems that the economics of mass media are becoming even more tangled as the rapid rise of new technology threatens to transform or replace traditional media outlets.
The merger of Comcast and NBC is just one example of the myriad ways media companies do business. Television, print publishing, radio broadcasting, music, and film all have their own economic nuances and distinct models. However, these business models fall into three general categories: monopoly, oligopoly, and monopolistic competition.
Of these three basic media business models, monopoly is probably the most familiar. A monopoly occurs when a single company controls a product or service—for example, a small town with only one major newspaper. Oligopoly, the control of a product or service by just a few companies, commonly occurs in publishing; a few major publishers put out most best-selling books, and relatively few companies control many of the nation’s highest-circulating magazines. Television is much the same, as the major broadcast networks—Comcast and GE’s NBC, Disney’s ABC, National Amusements’s CBS, and News Corporation’s Fox—own nearly all broadcast and cable outlets. Finally, monopolistic competition takes place when numerous companies offer essentially the same product or service. For example, Ticketmaster and Live Nation were longtime competitors until they merged in 2010, with both providing basically the same set of event-management services for the music and other live entertainment industries.
The last few decades have seen increasing conglomeration of media ownership, allowing for economies of scale that previously could not be achieved. Instead of individual local radio stations competing for advertising revenue among a range of local companies, for example, large corporations can now buy wholesale advertising for any or all of their brands on a dozen different radio stations in a single media market all owned by a conglomerate such as Clear Channel. The economics of mass media has become a matter of macroeconomic proportions: GE now makes everything from jet engines to cable news. The implications of this go beyond advertising. Because major corporations now own nearly every media outlet, ongoing fears of corporate control of media messaging have intensified.
However, these fears are often channeled into productive enterprises. In many media industries, an ongoing countercurrent exists to provide diversity not found in many corporate-owned models. Independent radio stations such as those affiliated with nonprofit organizations and colleges provide news and in-depth analysis as well as a variety of musical and entertainment programs that are not found on corporate stations. Likewise, small music labels have had recent success promoting and distributing music through online CD sales or digital distribution services such as iTunes. YouTube makes it easier for videographers to reach a surprisingly large market, often surpassing even professional sites such as Hulu.
Companies employ many different methods to raise revenue for their services, but all boil down to two fundamental sources: The money comes either from consumers or from advertisers. In practice, many outlets combine the two to give themselves a flexible stream of income; consumers may be willing to pay slightly more for fewer ads, or to sit through more advertising in exchange for free content.
Traditional book publishers, which make practically all of their money by selling their products directly to consumers, lie on one extreme end of the spectrum. In some respects, cable companies use a related model under which they directly sell consumers the delivery and subscription of a bundled package of programming channels. However, cable channels primarily rely on a mix of media revenue models, receiving funding from advertising along with subscription fees. Magazines and newspapers may fall into this middle-ground category as well, although online classified advertising has caused print publications to lose this important revenue stream in recent years. Broadcast television is the clearest example of advertising-driven income, as there are no subscription fees for these channels. Because this lack of direct fees increases the potential audience for the network, networks can sell their advertising time at a premium, as opposed to a cable channel with a more limited and likely more narrow viewership.
Print media fall into three basic categories: books, newspapers, and magazines. The book publishing industry is basically an oligopoly; the top 10 trade publishers made up 72 percent of the total market in 2009, with the top five alone comprising 58 percent of this.Michael Hyatt, “Top Ten U.S. Book Publishers for 2009,” January 15, 2010, http://michaelhyatt.com/2010/01/top-ten-u-s-book-publishers-for-2009.html. Newspapers tend toward local monopolies and oligopolies, as there are generally few local news sources. In the past, classified advertising made up a substantial portion of newspaper revenue. However, the advent of the Internet—particularly free classified services such as Craigslist—has weakened the newspaper industry through dwindling classified advertising revenues.
The newspaper industry also entails a mix of high initial, or first copy, costs (the cost of producing the first unique copy of a good, such as the initial copy of a print newspaper) and relatively low marginal costs (the cost per additional unit, such as each additional printed paper). Journalistic and editorial costs are relatively high, whereas the costs of newsprint and distribution are fairly low. The transition from the labor-intensive process of mechanical typesetting to modern electronic printing greatly reduced the marginal costs of producing newspapers. However, the price of newsprint still goes through cyclical ups and downs, making it difficult to price a newspaper in the long run.
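The first copy/marginal cost distinction can be made concrete with a short calculation. The figures below are purely illustrative (not drawn from any real paper's accounts): a fixed first copy cost spread over a growing print run pulls the average cost per copy down toward the marginal cost, which is why scale matters so much in this business.

```python
def average_cost(first_copy_cost, marginal_cost, copies):
    """Average cost per copy: the fixed first copy cost spread across
    the whole print run, plus the constant per-copy (marginal) cost."""
    return first_copy_cost / copies + marginal_cost

# Hypothetical figures: $50,000 in editorial and first copy costs,
# $0.25 of newsprint and distribution per additional paper.
print(average_cost(50_000, 0.25, 10_000))   # → 5.25 per copy on a small run
print(average_cost(50_000, 0.25, 500_000))  # about $0.35 per copy on a large run
```

A paper that spreads its newsroom costs over half a million copies can profitably charge a price that would bankrupt a 10,000-copy competitor, which is one reason the industry tends toward local monopoly.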
The highest costs of publishing a paper remain editorial and administrative overhead. Back-office activities such as administration and finance can often be combined if a company owns more than one paper. Unlike broadcast media, which historically faced restrictions limiting the number of stations a single network could own, print media have faced no such ownership limits. Because of this, a company such as Gannett has come to own USA Today as well as local newspapers in 33 states, Guam, and the United Kingdom.Columbia Journalism Review, “Who Owns What,” August 13, 2008, http://www.cjr.org/resources/index.php. Other companies, such as McClatchy, also run their own wire services, partly as a way of reducing the costs of providing national journalism to many local markets.
Like newspapers, magazines are largely owned by just a few companies. However, unlike newspapers, many magazine chains are themselves owned by much larger media conglomerates. Time Warner—the highest-ranking media company in 2003—owns numerous magazines, including Time, Fortune, and Sports Illustrated. Taking all of its publications into account, Time Warner controls a 20 percent share of all magazine advertising in the United States. However, many smaller publishers produce niche publications, many of which do not aspire to a wider market. In all, magazines seem to be undergoing a period of economic decline, with a net loss of some 120 publications in 2009 alone.Matthew Flamm, “367 Magazines Shuttered in 2009,” Crain’s New York Business, December 11, 2009, http://www.crainsnewyork.com/article/20091211/FREE/912119988.
As discussed earlier, large media conglomerates own nearly all television networks. Both national networks and local affiliates are typically owned by conglomerates; however, stations such as Fox-owned WNYW in New York or CBS-owned KCNC in Denver are able to mix local content with national reporting and programming, much as large newspaper companies do.
In a local market, a single cable company usually dominates cable service. In many places, one company, such as Comcast—the largest of the cable companies—is the only option. Over the past several years, however, satellite companies such as Dish Network and DirecTV, which can reach consumers almost anywhere with little local infrastructure, have introduced a measure of competition.
Even as cable is expanding, radio has become heavily consolidated. Since the 1990s, massive radio networks such as Clear Channel Communications have bought up many local stations in an effort to control every radio station in a given media market. However, the FCC has designated the lower part of the FM radio band for noncommercial purposes, including nonprofit programming such as educational, religious, or public radio stations—and continues to hold public discussion on frequency allocations. These practices help retain a certain level of programming diversity in the face of increased homogenization, largely because such stations are not supported through advertising. Because they are funded by donations or nonprofit institutions, these stations benefit economically from catering to a minority of listeners who may support the station directly, rather than a larger majority that has other options for entertainment.
Because both the music and film industries face unique business opportunities and challenges, each operates on an economic model unlike either print or broadcast media. Just like those forms of media, however, music and film have undergone significant changes due to consolidation and technological and consumer shifts in recent years.
The music industry is closely related to the radio industry, and the two have a high degree of codependence. Without music, radio would not be quite as lively or nearly as popular; without radio, music would be more difficult for listeners to discover, and perhaps be limited to a local consumer base.
As radio companies have consolidated, so has the music industry. A total of four record companies, popularly called the “Big Four” within the industry, dominate the recorded music business and thus most mainstream radio airwaves. Because a conglomerate such as Clear Channel is ill-equipped to handle local tastes and musical acts—and because it tends to be easier to manage programming across a large regional area than on a station-by-station basis—the Big Four record companies tend to focus on national and international acts. After all, if a label can convince a single radio conglomerate to play a particular act’s music, that performer instantly gains access to a broad national market.
Music is therefore widely considered an oligopoly, despite the presence of countless small, independent companies. A handful of major record labels dominate the market, and they are all basically structured the same way. Universal is owned by NBC, which was in turn owned by GE and now Comcast; Sony Music is owned by the eponymous Japanese technology giant; Warner Music Group, although now its own entity, was previously under the umbrella of Time Warner; and the EMI Group is owned by a private investment firm.
Although the Big Four dominate the recorded music industry, they have surprisingly little to do with live performances. Traditionally, musicians have toured to promote their albums—and to sell enough copies to pay off their advances—so the live show was a combination of self-promotion and income. An artist’s record company provided financial support, but a concert ticket generated significantly more income per sale than a CD. Since the merger of ticketing companies Ticketmaster and Live Nation, ticketing services for large venues have been practically monopolized. For example, Madison Square Garden, one of the largest venues in New York City, does not handle its booking in-house, and with good reason: the demands of tens of thousands of fans trying to buy tickets to a soon-to-be-sold-out concert the day they go on sale would likely overwhelm an in-house system. Instead, Ticketmaster handles all of the ticketing for Madison Square Garden, adding a 10 percent to 20 percent fee to the face value of the ticket for its exclusive service, depending on the venue and price of the show.
Because of the nature of film, the economics of the medium are slightly different from those of music. The absence of film in broadcasting, the lack of a live performance, and the exponentially higher budgets are just some of its unique facets. As with music, however, large companies tend to dominate the market. These massive studios are now connected corporately with other media outlets. For example, Sony and Universal both have partners in the music industry, while Fox and Disney control major television broadcast and cable networks as well as film studios.
Just as record labels do with radio conglomerates, film distribution companies tend to sell to large chains, such as the over 6,000-screens-strong Regal Entertainment Group and the over 4,000-screens-strong AMC Entertainment, which have national reach.National Association of Theater Owners, “Top Ten Circuits,” July 1, 2009, http://www.natoonline.org/statisticscircuits.htm. However, independent filmmakers still provide limited competition to these larger studios.
Figure 13.2

The founders of Miramax, brothers Bob and Harvey Weinstein, had a messy breakup with major studio Disney.
Brothers Bob and Harvey Weinstein founded Miramax in 1979 with the intention of independence. Over the ensuing years, they released films that were off-limits to major distributors, such as Quentin Tarantino’s violent Reservoir Dogs and Steven Soderbergh’s controversial Sex, Lies, and Videotape. After Disney bought the smaller studio in 1993, Miramax gained access to even larger financial backing, albeit somewhat begrudgingly. Miramax had cultivated relationships with the now-blockbuster directors Tarantino and Kevin Smith—the director of Clerks, Dogma, and Jay and Silent Bob Strike Back—and when Tarantino’s Pulp Fiction made more than $100 million at the box office within 2 years of Disney’s purchase of Miramax, it seemed like a good deal. As a result, Disney signed the Weinsteins to a new contract, giving them an annual budget of $700 million, and in 2003 Disney gave the Weinsteins permission to raise additional hundreds of millions of dollars from Goldman Sachs in order to make even more expensive movies.“Significant Events in Disney’s Ownership of Miramax,” New York Times, March 5, 2005, http://www.nytimes.com/imagepages/2005/03/06/movies/20050306_MIRAMAX.html.
By 2004, however, relations between Miramax and Disney were turning sour. In May of that year, Disney would not allow Miramax to release Michael Moore’s incendiary documentary Fahrenheit 9/11. In response, the Weinsteins sought outside funding and released it themselves to great success; the film became the highest-grossing documentary of all time, with revenue of $222 million on a mere $6 million budget.Box Office Mojo, “Fahrenheit 9/11,” http://boxofficemojo.com/movies/?id=fahrenheit911.htm. A year later, the Weinsteins dissolved their relationship with Disney. Disney, however, kept the Miramax brand and the entire Miramax library of films.
Yet this fissure did not end the Weinsteins’ careers. In 2005, the brothers founded a new independent film company, the Weinstein Co., which has had some success with films including Vicky Cristina Barcelona and The Queen, as well as the Michael Moore documentaries Sicko and Capitalism: A Love Story. However, when even independent film legends such as the Weinsteins achieve only limited success, it is clear how difficult the independent film business has become. The A.V. Club—a companion to the satirical newspaper The Onion—asked in January 2010, just after Disney closed Miramax for good, “How much longer will the studio ‘indie’ model be viable at all?”Scott Tobias, “R.I.P. (Companies Are People, Too, Division): Miramax 1979–2010,” A.V. Club, January 28, 2010, http://www.avclub.com/articles/rip-companies-are-people-too-division-miramax-1979,37639/. Today, there are few true “indie” studios left, and several major studios have closed their boutique divisions, such as Warner Independent and Paramount Vantage. But even if some are questioning the economics of the indie-studio models of the 1980s and 1990s, it seems that there will always be an artistic drive for independent film—and, eventually, someone is bound to make the economics of it work again.
In many ways, the Internet has been a game-changer throughout the media industry. However, a few things have stayed the same; major media companies own popular media content sites such as Hulu and YouTube and control access to a great deal of online information. Even bloggers, who have found a new role as drivers of the media cycle, are at a disadvantage when it comes to the ability to generate original content. They tend to drive much of their traffic by reposting and adding commentary to news stories from established media outlets. One large and relatively influential outlet, the Drudge Report, is mainly composed of links to outside news organizations rather than original journalism. It gained fame during the late 1990s for breaking the Bill Clinton and Monica Lewinsky scandal—albeit by posting about how Newsweek killed reporter Michael Isikoff’s story on the matter.BBC News, “Scandalous Scoop Breaks Online,” January 25, 1998, http://news.bbc.co.uk/2/hi/special_report/1998/clinton_scandal/50031.stm. Still, the economic complications of the Internet have changed the calculus of media permanently, a status made clear by the drastic increase in free content over the past decade.
Choose a media outlet such as the Washington Post or CNN and visit its website to determine its parent company. Often this will be in the “Corporate” or “About Us” sections. Then visit the Columbia Journalism Review’s resource “Who Owns What?” at http://www.cjr.org/resources/index.php. Consider and respond to the following questions:
The challenge of media economics is one of production. When print was the only widely available medium, the concept was simple: Sell newspapers, magazines, and books. Sales of these goods could be gauged like any other product, although in media’s case the good was intangible—information—rather than the physical paper and ink. The transition from physical media to broadcast media presented a new challenge, because consumers did not pay money for radio and, later, television programming; instead, the price was an interruption every so often by a “word from our sponsors.” Even this practice, however, hearkened back to the world of print media; just as newspapers and magazines sell advertising space, radio and television networks sell space on their airwaves.
The fundamental shift in Internet economics has been the minuscule price of online space compared to that of print or broadcast media. Combined with the instantaneous proliferation of information, the Internet seems to pose a grave threat to traditional media. Media outlets have responded by establishing themselves online, and it is now practically unheard of for a media company to lack an Internet presence. Companies’ archives have opened up, and aside from a few holdouts such as the Wall Street Journal, nearly every newspaper allows free online access, although some papers, like The New York Times, plan to experiment with a paid subscription model to address dwindling revenues. Newspapers now offer video content online, and radio and television networks have published traditional text-and-photo stories. Through Internet portals, media companies have synergized their content; they are no longer merely television networks or local newspapers but instead are quickly moving to become a little bit of everything.
Although the Internet has had many effects on media economics, ranging from media piracy to the lowered costs of distribution, arguably the greatest effect has been the synergy (the combination of media outlets across platforms) of different forms of media. For example, the front page of The New York Times website contains multiple short video clips, and the front page of Fox News’ website contains clips from the cable television network along with relevant articles written by FoxNews.com staff. Media outlets offer many of these services for free to consumers, if for no other reason than because consumers have become accustomed to getting this content for free elsewhere on the Internet.
The Internet has also drastically changed the way that companies’ advertising models operate. During the early years of the Internet, many web ads were geared toward sites such as Amazon and eBay, where consumers purchased products or services. Today, however, many ads—particularly on sites for high-profile media outlets such as Fox News and The New York Times—are for products that are not typically bought online, such as cars or major credit cards. However, another category of advertising that is tailored toward individual web pages has also gained prominence on the Internet. In this form of advertising, marketers match advertisers with particular keywords on particular web pages. For example, if the page is a how-to guide for fixing a refrigerator, some of the targeted ads might be for local refrigerator repair shops.
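The keyword matching described above can be sketched in a few lines. Everything here is hypothetical (invented advertiser names and a toy matching rule, not any real ad platform's algorithm), but it captures the basic idea: advertisers attach themselves to keywords, and a page is paired with the ads whose keywords appear in its text.

```python
# Hypothetical advertisers, each attached to a single keyword.
ads = {
    "refrigerator": "Ace Refrigerator Repair",
    "mortgage": "First Local Bank",
    "laptop": "Discount Computer Outlet",
}

def match_ads(page_text, ads):
    """Return the ads whose keywords appear among the page's words."""
    words = set(page_text.lower().split())
    return [ad for keyword, ad in ads.items() if keyword in words]

page = "A how-to guide for fixing a broken refrigerator at home"
print(match_ads(page, ads))  # → ['Ace Refrigerator Repair']
```

Real systems weigh bids, click-through history, and surrounding context rather than doing a bare keyword lookup, but this matching step is the heart of the model.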
Figure 13.3

Google makes almost all of its money through advertising, allowing it to provide many services such as email and document sharing for free.
Search-engine company Google has been working to perfect this type of targeted advertising search. Low-cost text ads may appear next to its search results, on various web pages, and in the sidebar of its free web-based email service, Gmail. More than just using algorithms to sort through massive amounts of data and matching advertising to content, Google has lowered the cost barrier to advertising, as well as the volume barrier to hosting advertising. Because Google automatically matches sites with advertisers, an independent site can sign up for its advertising service and get paid for each person who follows the text links. Likewise, relatively small companies can buy advertising space in specialized niches without having to go through a large-volume ad buyer. This business has proven extremely productive; the bulk of Google’s revenue comes from advertising even as it gives away services such as email and document sharing.
Search engines like Google and video-sharing sites like YouTube (which is owned by Google) allow access to online information, but they do not actually produce that information themselves. Thus, the propensity of these sites to gather information and then make it available to consumers free of charge does not necessarily sit well with those who financially depend on the sale of this information.
One of Google’s more controversial projects is Google News, a news aggregator that automatically collects news stories from various sources on the Internet. This service allows users to view the latest news from many different sources conveniently in one location. However, the project has been met with opposition from a number of those news sources, who contend that Google has infringed on their copyrights and cost them revenue. The Wall Street Journal has been one of the more vocal critics of Google News. In April 2009, editor Robert Thomson said that news aggregators are “best described as parasites.”Jane Schulze, “Google Dubbed Internet Parasite by WSJ Editor,” Australian (Sydney), April 6, 2009, http://www.theaustralian.com.au/business/media/google-dubbed-internet-parasite/story-e6frg996-1225696931547. In December 2009, Google responded to these complaints by allowing publishers to set a limit on the number of articles per day a reader can view for free through Google.
The recent confrontation between Google and the traditional news media is only one of many problems resulting from digital technology. Digital technology can create exact copies of data so that one copy cannot be distinguished from the other. In other words, although a printed book might be nicer than a photocopy of that book, a digitization of the book is exactly the same as all other digitized copies and can be transmitted almost instantly. Similarly, although cassette tape copies of recorded music offered lower sound fidelity than the originals, the emergence of writable CD technology during the 1990s allowed for the creation of a copy of a digital audio CD that was identical to the original.
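The claim that digital copies are indistinguishable from the original is easy to demonstrate: a copy of a file's bytes is identical, bit for bit, and even a cryptographic checksum cannot tell the two apart. A small illustrative sketch:

```python
import hashlib

# Any digital "original" is just a sequence of bytes.
original = b"The complete text of a digitized book..."
copy = bytes(original)  # a byte-for-byte copy

# The copy is indistinguishable from the original.
print(copy == original)  # → True
print(hashlib.sha256(copy).hexdigest() == hashlib.sha256(original).hexdigest())  # → True
```

Contrast this with a photocopy or a cassette dub, where each generation degrades: the hundredth digital copy is exactly as good as the first.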
As data storage and transmission costs dropped, CDs no longer had to be physically copied to other CDs. With the advent of MP3 digital encoding, the music information on a CD could be compressed into a relatively small, portable format that could be transmitted easily over the Internet, and music file sharing took off. Although these recordings were not exactly the same as their CD-quality counterparts, most listeners could not tell the difference—or they just didn’t care, because they were now able to share music files conveniently and for free. The practice of transmitting music over the Internet through services such as Napster quickly ballooned.
As high-bandwidth Internet connections proliferated, video-sharing and streaming sites such as YouTube started up. Although these sites were ostensibly intended for users to upload and share their own amateur videos, one of their big draws was the high quantity of television show episodes, music videos, and other commercial content posted illegally. The replication potential inherent in digital technology, combined with online transmission, has caused a sea change in media industries that rely on income directly from consumers, such as books and recorded music. However, as the next section will show, the shift of media and information to the Internet risks creating a digital divide, in which those without Internet access are at an even greater disadvantage than they were before.
Producers of content are not without protection under the law. In 1998, Congress enacted the Digital Millennium Copyright Act (DMCA), legislation that made digital piracy illegal while exempting Internet service providers from liability, in an effort to stop the illegal copying and distribution of copyrighted material. The legislation defined many digital gray areas that previously may not have been explicitly covered, such as circumventing antipiracy measures in commercial software; requiring webcasters to pay licensing fees to record companies; and exempting libraries, archives, and some other nonprofit institutions from some of these rules under certain circumstances. Since 1998, this legislation has been the bedrock of a variety of claims against sites such as YouTube. Under the law, copyright holders may send letters to Internet hosts distributing their copyrighted material. Certifying that they have a good-faith belief that the host does not have prior permission to distribute the content, copyright holders may request a removal of that material from the site.U.S. Copyright Office, “Copyright Law: Chapter 5,” 1998, http://www.copyright.gov/title17/92chap5.html.
Although much of the law concerns the rights of copyright holders to request the removal of their works from unlicensed sites, the DMCA also enacts protections for Internet service providers (ISPs). Although there had previously been some question as to whether ISPs could be charged with copyright infringement merely for allowing reproductions to travel over their bandwidth, the DMCA made it clear that ISPs are not expected to police their own bandwidth for illegal use and are therefore not liable. The DMCA is not necessarily to everyone’s liking—requiring individual takedown notices is time-consuming for corporations, and the relative lack of safeguards can allow large companies to bully ISPs into shutting down smaller sites on the strength of a good-faith notice—but the protections and clarifications it has created have cleared up some of the confusion surrounding digital media.Ashlee Vance, “Court Confirms DMCA ‘Good Faith’ Web Site Shut Down Rights,” Register, May 30, 2003, http://www.theregister.co.uk/2003/05/30/court_confirms_dmca_good_faith/. However, as the price of bandwidth drastically drops and as more media goes digital, copyright laws will inevitably need to be amended again.
Navigate to a traditional media outlet’s online portal, such as NYTimes.com or FoxNews.com. Print out a hard copy of the home page and write on it, or save it to your computer and open it in a document editor such as Microsoft Word to annotate it. Note items such as these:
More than just a tool for information transfer, the Internet has become a conduit for a globalized workforce. A corporation in New York can outsource electronically based work to a highly connected developing country like India without incurring the sort of shipping charges or communication delays that previously impeded such efforts. Internet access, particularly for business, has made development possible in remote areas, allowing corporations access to less expensive labor and allowing money to flow into developing countries. However, as the Internet has become integrated into daily business life, a digital divide has emerged between those who derive the benefits of Internet access and those who do not.
Many U.S. and international leaders and nongovernmental organizations have identified the digital divide as an area of concern. A globalized workforce does not separate the world into easily divisible political territories but rather into those that have useful access to technology to reach a wider market and those that do not. As the 21st century develops, worldwide communication has become increasingly imperative for a healthy economy, creating a new challenge to make sure that rapid technological changes do not preclude economic success for less developed economies.
However, the problem extends beyond simple access or even competency. The 80/20 effect, under which 80 percent of economic profit is made from the most affluent 20 percent of the population, exacerbates the digital divide. In other words, the Internet—created in large part by and for the rich—is practically useless for the poor, particularly in developing countries. Thus, bridging the digital divide is about helping those with little or no access to the digital world gain the ability to use technology in economically advantageous ways.
As information and media move online, those without ready access to the Internet are increasingly being left behind. Even in developed countries such as the United States, the digital divide is readily apparent. Often, older and less educated workers do not have computer skills or home Internet access. In June 2009, the Pew Research Center studied the demographic differences in broadband Internet adoption and found that 45 percent of those without Internet access were age 65 and over.John Horrigan, “Home Broadband Adoption 2009,” Pew Internet & American Life Project, June 17, 2009, http://www.pewinternet.org/Reports/2009/10-Home-Broadband-Adoption-2009.aspx. Even more significantly, a quarter of the unconnected were between the ages of 50 and 64. These workers are at a distinct disadvantage when it comes to finding and being hired for jobs.
As classified advertisements and job postings have left newspapers for the web, Internet access has become vital to even finding a job to apply for. In the previously mentioned Pew survey, 20 percent of respondents had incomes of less than $20,000 per year. However, these 20 percent made up a disproportionately high 48 percent of the survey’s non-Internet users; a full 64 percent of low-income survey participants did not have access to the Internet. These numbers drastically dropped as wages increased—while those making under $40,000 per year made up 80 percent of non-Internet users, those making over $50,000 made up 50 percent of high-speed Internet users. As the Internet is becoming an integral part of our daily lives, a lack of access among certain groups could severely hamper upward economic mobility.
Access to the Internet is an essential aspect of many successful job hunts, but it is also important to consider computer skills themselves. Many older adults who grew up without the Internet lack the computer and technology skills that contemporary jobs require. MSNBC reported in October 2009 that unemployment rates for older workers were at a 60-year high, having doubled in the period between late 2007 and the fall of 2009.Alex Johnson, “Lack of Computer Skills Foils Many Job-Seekers,” MSNBC, October 2, 2009, http://www.msnbc.msn.com/id/33106445. While the overall unemployment rate at that time had reached a 26-year high, older workers who lacked the skills of younger, computer-savvy adults were suffering disproportionately. Lack of computer skills can be a crippling impediment to job success, even if a person can find a job in difficult economic times.
In response to these challenges, libraries and other nonprofit groups have taken on the task of training older unemployed workers to effectively use the Internet for job-related needs. These training courses, beginning with turning on a computer and using the mouse and ranging into advanced office program use, seek to provide skills necessary to allow older workers to reenter the work force. These organizations also aim to show users that they can increase their quality of life by setting up email for communication with friends and family members.
While the digital divide in the United States is largely a matter of education, cost barriers, and slow adoption of new technology, the digital divide in economically underdeveloped countries adds the complication of infrastructure. Internet service depends on widespread, stable networks: large computer centers must be supported, and electronic access to the outside world requires a constant data connection. In many developing countries, therefore, practically no residents have access to computers and the Internet; this cuts them off not only from information but from the entire global economy.
Table 13.1 Internet Connectivity by Country
| Country | Population (millions) | Internet users (millions) | Percent connected |
|---|---|---|---|
| China | 1,330 | 298 | 22 |
| India | 1,173 | 81 | 7 |
| United States | 310 | 231 | 75 |
| Indonesia | 243 | 30 | 12 |
| Brazil | 201 | 65 | 32 |
| Japan | 127 | 91 | 72 |
| Afghanistan | 29 | 0.5 | 2 |
| Mongolia | 3.1 | 0.3 | 10 |
Source: CIA World Factbook, https://www.cia.gov/library/publications/the-world-factbook/ (accessed July 27, 2011).
The Digital Divide Institute has launched a campaign to integrate Indonesia into the global digital network as a representative solution to this problem. Indonesia is the world’s fourth-largest country in terms of population and already has wide cell-phone coverage—a significant advantage when it comes to rural information access.Digital Divide Institute, “Indonesia,” 2010, http://www.digitaldivide.org/indonesia.html. The organization claims that by expanding these wireless networks to encompass 3G and high-speed Internet access, Indonesia could become a fully emerging market for global services. To put this in perspective, connecting 20 percent of Indonesians to the Internet would bring the country’s total connected population to about 48 million users, equivalent to all of South Korea, one of the most connected countries in the world.Central Intelligence Agency, “Country Comparison: Population,” World Factbook, https://www.cia.gov/library/publications/the-world-factbook/rankorder/2119rank.html. The economic and political benefits of widespread Internet connectivity to nations like Indonesia are huge. The Digital Divide Institute points to Ireland as an example of how the growth of high-tech jobs can accompany the decline of terrorism—evidence that bridging the digital divide can be an issue of international security as well as global prosperity.
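The 48 million figure is straightforward arithmetic from the population in Table 13.1, as a quick back-of-the-envelope check shows:

```python
indonesia_population = 243_000_000  # Table 13.1, in round numbers
connected_share = 0.20              # the hypothetical 20 percent target

connected_millions = round(indonesia_population * connected_share / 1_000_000, 1)
print(connected_millions)  # → 48.6, roughly the 48 million cited
```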
One contentious issue in bridging the digital divide is which billion to focus on—of the 6.8 billion people in the world, only an estimated 1.6 billion are connected to the Internet.Central Intelligence Agency, “World,” World Factbook, https://www.cia.gov/library/publications/the-world-factbook/geos/xx.html. The discussion is therefore quickly complicated by this question: To whom should we build the bridge? Some organizations, such as the Digital Divide Institute, suggest focusing on the next billion in the global “pyramid”—countries such as Indonesia, with wide cell-phone coverage but little access to useful, global digital technology. Other organizations see it differently.
Figure 13.4

The One Laptop per Child project aims to put one of these XO computers in the hands of many children in developing countries.
Many believe that everyone in the world can benefit from technology if it is deployed properly. The organization One Laptop per Child (OLPC) seeks to achieve exactly what its name implies with a low-cost laptop design that runs on free software and requires very little energy.One Laptop per Child Association, “One Laptop per Child (OLPC): Laptop.” http://laptop.org/en/laptop/index.shtml. Central to OLPC’s goal is the idea that learning to use technology should be recalibrated toward learning through technology. Another crucial idea is the organization’s conception of networks as essentially localized, with the potential to be expanded. OLPC’s XO laptop connects to its neighbors, creating many small networks through a fairly wide wireless range. In addition, its ability to access the Internet through this wireless range allows remote educational opportunities for children in developing countries. Although leaping directly from no communications access to wireless Internet video streaming may seem implausible, the program has shown that this approach may even be more cost effective than traditional connective technologies like phone lines.
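The XO's neighbor-to-neighbor networking can be pictured as a simple mesh: even if only one machine has a direct Internet uplink, any laptop that can reach it through a chain of neighbors is effectively online. The sketch below is a toy illustration (invented node names and a plain breadth-first search), not the actual mesh protocol the XO uses.

```python
from collections import deque

# Toy mesh: each laptop lists the neighbors within its wireless range.
mesh = {
    "laptop_a": ["laptop_b"],
    "laptop_b": ["laptop_a", "laptop_c"],
    "laptop_c": ["laptop_b", "gateway"],
    "gateway":  ["laptop_c"],  # the one node with an Internet uplink
}

def can_reach(mesh, start, target):
    """Breadth-first search: can `start` reach `target` hop by hop?"""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for neighbor in mesh.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return False

print(can_reach(mesh, "laptop_a", "gateway"))  # → True: online via two hops
```

Each additional laptop extends the network's reach, which is why OLPC describes its networks as localized but expandable.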
The modern theory of the information economyAn economic model based on selling intangible information rather than products. was expressed in the 1998 publication of Information Rules: A Strategic Guide to the Network Economy, written by Carl Shapiro, an economics professor at the University of California, Berkeley, and Hal Varian, now chief economist at Google. Their fundamental argument was simple: “Technology changes. Economic laws do not.”Carl Shapiro and Hal R. Varian, Information Rules: A Strategic Guide to the Network Economy (Cambridge, MA: Harvard Business School Press, 1998).
While economic laws may not change, the fundamentals of the business of information are far different from those of most traditional businesses. For example, the per-sandwich cost of producing one sandwich is roughly the same as the per-sandwich cost of producing many. As discussed elsewhere in this text, information works differently. With a newspaper, the first-copy costs are far higher than the marginal costs of additional copies. The high first costs and low marginal costs of the information economy contribute heavily to the potential for large corporations to gain dominance. The confluence of these two costs creates a potential economy of scaleAn economic model with high first costs and low marginal costs that heavily rewards expansion., favoring the larger of the competitors.
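The first-copy economics described above can be illustrated with a rough sketch. The dollar figures below are hypothetical, not drawn from the text; the point is that the average cost per copy falls sharply as circulation grows, which is the economy of scale that favors the larger competitor.

```python
# Hypothetical first-copy economics for a newspaper.
# The first copy carries all the fixed costs (reporting, editing, layout);
# each additional copy costs only ink, paper, and distribution.
FIRST_COPY_COST = 100_000.0   # fixed cost of producing the first copy
MARGINAL_COST = 0.10          # cost of each additional copy

def average_cost(copies: int) -> float:
    """Average cost per copy at a given circulation."""
    total = FIRST_COPY_COST + MARGINAL_COST * (copies - 1)
    return total / copies

for circulation in (1_000, 10_000, 1_000_000):
    print(f"{circulation:>9,} copies -> ${average_cost(circulation):,.2f} per copy")
```

At a circulation of one thousand, the fixed costs dominate; at a million, each copy costs little more than its marginal cost, so the larger publisher can always undercut the smaller one.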
In addition, information is what economists refer to as an experience goodA good that requires the customer to experience it in order to judge its value., meaning that consumers must actually experience the good to judge its value. The problem with information is that the experience is the good; how do you know, for example, that a movie has high-quality acting and an interesting plot before you’ve watched it? The solution to this is branding, discussed elsewhere in this text. Although it may be difficult to judge a movie before watching it, knowing that a given film was made by a certain director or stars an actor you like increases its value. Marketers use movie trailers, press coverage, and other marketing tools to communicate this branding message in the hopes of convincing you to watch the films they are promoting.
Another important facet of information technology is the associated switching costsThe cost that a user must pay to switch from one technological format to another.. When economists consider switching costs, they weigh the added value of one technology over another against the cost of switching—for information, the cost of moving all of the relevant data to the new technology. If the added value outweighs the switching cost, then switching makes sense. A classic example is moving a music collection from vinyl LPs to CDs. For a consumer to switch systems—that is, to buy a CD player and stereo—that person would also have to rebuild his or her entire music collection in the new format. Luckily for the CD player, the increase in convenience and quality was great enough that most consumers were inclined to switch technologies; however, as is apparent to anyone going to a thrift store or garage sale, old technologies are still in use because the information on the records was important enough for some people to keep them around.
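The switching-cost reasoning above reduces to a simple comparison, sketched here with hypothetical dollar values (the function name and all figures are illustrative, not from the text): a consumer switches formats only when the perceived gain from the new technology exceeds the total cost of new equipment plus rebuilding the collection.

```python
# Hypothetical switching-cost comparison; all numbers are illustrative.
def should_switch(benefit_of_new: float,
                  cost_of_new_gear: float,
                  cost_to_rebuild_collection: float) -> bool:
    """Switch only if the perceived gain exceeds the total switching cost."""
    switching_cost = cost_of_new_gear + cost_to_rebuild_collection
    return benefit_of_new > switching_cost

# Vinyl-to-CD example: a large quality/convenience gain justifies
# buying a CD player and repurchasing the collection.
print(should_switch(benefit_of_new=800.0,
                    cost_of_new_gear=200.0,
                    cost_to_rebuild_collection=400.0))

# A smaller gain does not, which is why some listeners kept their records.
print(should_switch(benefit_of_new=300.0,
                    cost_of_new_gear=200.0,
                    cost_to_rebuild_collection=400.0))
```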
Although government regulation is discussed in greater depth elsewhere in this text, a basic understanding of the interaction between government and media over time is essential to understanding the modern information economy. Public policy and governmental intervention complicate an already complex system of information economics, but for good reason—unlike typical goods and services, the information economy has many significant side effects. The consequences of one hamburger chain outcompeting or buying up all other hamburger chains would surely be fairly drastic for the hamburger-loving world, but not altogether disastrous; there would be only one type of hamburger, but many other types of fast food would remain. By contrast, the consequences of monopolization by one media company could be alarming. Because distributed information can influence public policy and public opinion, those in charge of the government have an interest in ensuring fair distribution of that information. The bias toward free markets has been mitigated—even in the United States—when it comes to the information economy.
The Federal Communications Commission (FCC) is largely responsible for this regulation. Established by the Communications Act of 1934, the FCC is charged with “regulating interstate and international communications” for nearly every medium except for print.Federal Communications Commission, “About the Federal Communications Commission,” http://www.fcc.gov/aboutus.html. The FCC also attempts to maintain a nonpartisan, or at least bipartisan, outlook, with a maximum of three of its five commissioners belonging to the same political party. Although the FCC controls many important things—making sure that electronic devices don’t emit radio waves that interfere with other important tools, for example—some of its most important and most contentious responsibilities relate to the media.
As the guardian of the public interest, the FCC has called for more competition among media companies; for example, the ongoing litigation of the merger between Comcast and NBC is not concerned with whether consumers will like streaming Hulu over the Internet, but rather whether one company should own both the content and the mode of distribution. The public good is not served if consumers’ ability to choose is taken away when a service provider like Comcast restricts access to only the content that the provider owns, especially if that service provider is the consumers’ only choice. In other words, the idea of public good is concerned not with the end result of competition, but with its process. The FCC protects consumers’ ability to choose from a wide variety of media products, and the competition among media producers hopefully results in better products for consumers. If the end result is that all customers choose Hulu anyway, either because it has the shows they like or because it offers the best video-streaming capability, then the process has worked to create the best possible model; there was a winner, and it was a fair fight.
The main tool that the government employs to keep healthy competition in the information marketplace is antitrust legislation. The seminal Sherman Antitrust Act of 1890 helped establish modern U.S. antitrust legislation. Although originally intended to dissolve the monopolistic enterprises of late-19th-century industrialists such as Andrew Carnegie and John D. Rockefeller, the law’s basic principles have applied to media companies as well. The antitrust office has also grown since the original Sherman Act; although the office of the attorney general originally brought antitrust lawsuits after the act’s passage, this responsibility shifted to its own Antitrust Division in 1933 under President Franklin D. Roosevelt.
The Sherman Antitrust Act of 1890 outlined many propositions and goals that legislators deemed necessary to foster a competitive marketplace. For example, Chapter 1, Section 2, of the act states that “Every person who shall monopolize, or attempt to monopolize, or combine or conspire with any other person or persons, to monopolize any part of the trade or commerce among the several States … shall be deemed guilty of a felony.”Legal Information Institute, “Monopolizing trade a felony; penalty,” Cornell University Law School, January 5, 2009, http://www.law.cornell.edu/uscode/html/uscode15/usc_sec_15_00000002----000-.html. This establishment of monopolization as a felony was remarkable; before, free-market capitalism was the rule regardless of the public good, making the Sherman Antitrust Act an early proponent of the welfare of people at large.
Two additional pieces of legislation, the Clayton Antitrust Act of 1914 and the Celler-Kefauver Act of 1950, refined the Sherman Antitrust Act in order to make the system of antitrust suits work more effectively. For instance, the Clayton Act makes it unlawful for one company to “acquire … the whole or any part of the stock” of another company when the result would encourage the development of a monopoly.Legal Information Institute, “Acquisition by one corporation of stock of another,” Cornell University Law School, January 5, 2009, http://www.law.cornell.edu/uscode/html/uscode15/usc_sec_15_00000018----000-.html. More than just busting trusts, the Clayton Act thus seeks to stop anticompetitive practices before they take hold. The Celler-Kefauver Act made it more difficult for corporations to get around antitrust legislation; while the Clayton Act allowed the government to regulate the purchase of a competitor’s stock, the Celler-Kefauver Act extended this to include the competitor’s assets.
Although the early part of the 20th century seemed to be devoted to breaking up trusts and keeping monopolies in check, the media—particularly in the latter part of the century—was still able to move steadily toward conglomeration (companies joining together to form larger, more diversified corporations). Widespread deregulation (the removal of legal regulations on an industry) took place during the 1980s, in large part through the efforts of free-market economists who argued that deregulation would foster more competition in the information marketplace. However, possibly due in large part to the media economy’s focus on economies of scale, this was not the case in practice. Companies became increasingly conglomerated, and corporations such as Comcast and Time Warner came to dominate the marketplace. The Telecommunications Act of 1996 helped solidify this trend. Although touted as a way to let “any communications business compete in any market against any other” and to foster competition, the act in practice sped up the conglomeration of media.Federal Communications Commission, “FCC — Telecommunications Act of 1996,” November 15, 2008, http://www.fcc.gov/telecom.html.
The Telecommunications Act of 1996’s extension of corporations’ ability to vertically integrate was a primary driving factor behind this increased conglomeration. Vertical integration has proven particularly useful for media companies because of their high first costs and low marginal costs. For example, a television company that both produces and distributes content can run the same program on two different channels for nearly the same cost as broadcasting it on only one. Because of the localized nature of broadcast media, two broadcast television channels will likely reach different geographical areas. This results in cost savings for the company, but it also somewhat decreases local diversity in media broadcasting.
In fact, the Telecommunications Act made some changes in authority for these local markets. Section 253 provides that no state may prohibit “the ability of any entity to provide any interstate or intrastate telecommunications service.”Federal Communications Commission, “Telecommunications Act of 1996, Section 253 (a),” January 3, 1996, http://www.fcc.gov/Reports/tcom1996.pdf. Thus, since state and local governments cannot prohibit any company from entering a marketplace, there are checks on the amount of a local market that any one company can reach. In addition, the Telecommunications Act capped any one company’s share of the U.S. television audience at 35 percent. However, the passage of additional legislation in 1999 allowing any one company to own two television stations in a single market greatly diluted the effect of this initial ruling. Although CBS, NBC, and ABC may be declining in popularity, they “still offer the only means of reaching a genuinely mass television audience” in the country.Gillian Doyle, Understanding Media Economics (Thousand Oaks, CA: Sage, 2002).
Almost all of the major media players in today’s market practice extensive vertical integration through either administrative managementThe potential for divisions of a single company to share the same higher-level management structure. or content integrationThe ability of a company to use the same content across platforms.. Administrative management refers to the potential for divisions of a single company to share the same higher-level management structure, which presents opportunities for increased operational efficiency. For example, Disney manages theme parks and movie studios. Although these two industries are not very closely connected through content, both are large, multinational ventures. Placing both of these divisions under a single corporation allows them to share certain structural similarities, accounting practices, and any other administrative resources that may be helpful across multiple industries.
Content integration—an important practice for media industries—is the ability of these companies to use the same content across multiple platforms. Disney’s theme parks would lose much of their charm and meaning without Mickey Mouse and Cinderella’s castle; the integration of these two industries—Disney’s theme parks and Disney’s animated characters—proves profitable for both. Behind the scenes, Disney is also able to reap some excellent benefits from their consolidation. For example, Disney could release a movie through its studio, and then immediately book the stars on news programs that air on Disney-owned broadcast television network ABC. Beyond just the ABC broadcast network, Disney also has many cable channels that it can use to directly market its movies and products to the targeted demographics. Unlike a competitor that might be wary of promoting a Disney movie, Disney’s ownership of many different media outlets allows it to single-handedly reach a large audience.
However, this high level of vertical integration raises several ethical concerns. In the above situation, for example, Disney could entice reviewers on its television outlets to give positive reviews to a Disney studio movie. This potential for misused trust and erroneous information could be harmful.
In many ways, the conglomeration of media companies takes place behind the scenes, with only a minority of consumers aware of vertically integrated holdings. Media companies often try to foster a sense of independence from a larger corporation. Of course, there are exceptions to this rule; the NBC sitcom 30 Rock often delves into the troubles of running a satirical sketch-comedy show (a parody of NBC’s Saturday Night Live) under the ownership of GE, NBC’s real-life owner.
Although media companies are steadily turning into larger businesses than ever before, many of them have nevertheless fallen on hard times. The instant, free content of the Internet is largely blamed for this decline. From the shift of classified advertising from newspapers to free online services to the decline in physical music sales in favor of digital downloads, the Internet has transformed traditional media economics.
One of the main issues with an unregulated Internet is that it allows digital files to be replicated and sent anywhere in the world. Large music companies, which traditionally made almost all of their money from selling physical music formats such as vinyl records or compact discs, find themselves at a disadvantage. Consumers can share and distribute music files to anyone, and Internet service providers are exempted from liability under the Digital Millennium Copyright Act (DMCA). With providers freed of liability and media consumption a driving factor in the rise of high-speed Internet service, ISPs have little incentive to deter illegal sharing alongside legal downloading.
Although music companies have had some success selling music through digital outlets, they have not been pioneers in online music sales. Rather, technology companies such as Apple and Amazon.com, sensing a large market for digital downloads coupled with a sleek delivery system, have led the way. Already accustomed to downloading MP3s, consumers readily adopted the model. However, record companies believed that the lack of digital rights management (DRM) protection offered by MP3s represented a major downside.
Apple provided a way to strike a compromise between accessibility and rights control. Having already captured much of the personal digital audio player market with the iPod, Apple has also long prided itself on creating highly integrated systems of both software and hardware. Because so many people were already using the iPod, Apple had a huge potential market for a music store even if it offered DRM-locked tracks that would only play on Apple devices. This inflexibility even offered a small benefit for consumers; Apple succeeded in convincing companies to price their digital downloads lower than CDs.
This compromise may have sold a lot of iPods and MP3s, but it did not satisfy the record companies. When consumers started to download one hit single for 99 cents—rather than buying the whole album for $15 on CD—the music industry felt the pain. Still, digital music sales have grown enormously: between 2004 and 2008, they increased from $187 million to $1.8 billion.
The music industry has spared no firepower in blaming piracy for the decline in album sales: “There’s no minimizing the impact of illegal file-sharing. It robs songwriters and recording artists of their livelihoods, and it ultimately undermines the future of music itself,” said Cary Sherman, president of the Recording Industry Association of America.Cary Sherman, “File-Sharing Is Illegal. Period,” USA Today, September 18, 2003. However, economists see the truth of the matter as significantly more ambiguous. Analyzing over 10,000 weeks of data distributed over many albums, a pair of economists at the Harvard Business School and the University of North Carolina found that “Downloads have an effect on sales which is statistically indistinguishable from zero.”Felix Oberholzer-Gee and Koleman Strumpf, “The Effect of File Sharing on Record Sales: An Empirical Analysis,” Journal of Political Economy 115, no. 1 (February 2007): 1–42. Either way, two things are clear: consumers are willing to pay for digital music, and digital downloads are here to stay for the foreseeable future.
Visit the Columbia Journalism Review’s “Who Owns What?” web page at http://www.cjr.org/resources/index.php. Choose a company from the drop-down menu. Make a chart of all the company’s different media outlets and complete the following activities:
The media industry is, in many ways, perfect for globalization, or the spread of global trade without regard for traditional political borders. As discussed earlier, the low marginal costs of media mean that reaching a wider market creates much larger profit margins for media companies. Because information is not a physical good, shipping costs are generally inconsequential. Finally, the global reach of media allows it to be relevant in many different countries.
However, some have argued that media is actually a partial cause of globalization, rather than just another globalized industry. Media is largely a cultural productA product that has some influence on or connection to cultural attributes., and the transfer of such a product is likely to have an influence on the recipient’s culture. Increasingly, technology has also been propelling globalization. Technology allows for quick communication, fast and coordinated transport, and efficient mass marketing, all of which have allowed globalization—especially globalized media—to take hold.
Much globalized media content comes from the West, particularly from the United States. Driven by advertising, U.S. culture and media have a strong consumerist bent (meaning that the ever-increasing consumption of goods is encouraged as an economic virtue), thereby possibly causing foreign cultures to increasingly develop consumerist ideals. Therefore, the globalization of media may not only provide content to a foreign country but also create demand for U.S. products. Some believe that this will “contribute to a one-way transmission of ideas and values that result in the displacement of indigenous cultures.”Josefina M. C. Santos, “Globalisation and Tradition: Paradoxes in Philippine Television and Culture,” Media Development, no. 3 (2001): 43–48.
Globalization as a world economic trend generally refers to the lowering of economic trade borders, but it has much to do with culture as well. Just as transfer of industry and technology often encourages outside influence through the influx of foreign money into the economy, the transfer of culture opens up these same markets. As globalization takes hold and a particular community becomes more like the United States economically, this community may also come to adopt and personalize U.S. cultural values. The outcome of this spread can be homogenizationThe process of making things the same. (the local culture becomes more like the culture of the United States) or heterogenization (aspects of U.S. culture come to exist alongside local culture, causing the culture to become more diverse), or even both, depending on the specific situation.Terhi Rantanen, The Media and Globalization (Thousand Oaks, CA: Sage, 2005).
Making sense of this range of possibilities can be difficult, but it helps to realize that a mix of many different factors is involved. Because of cultural differences, globalization of media follows a model unlike that of the globalization of other products. On the most basic level, much of media is language and culture based and, as such, does not necessarily translate well to foreign countries. Thus, media globalization often occurs on a more structural level, following broader “ways of organizing and creating media.”Mirza Jan, “Globalization of Media: Key Issues and Dimensions,” European Journal of Scientific Research 29, no. 1 (2009): 66–75. In this sense, a media company can have many different culturally specific brands and still maintain an economically globalized corporate structure.
Because globalization has as much to do with the corporate structure of a media company as with the products that a media company produces, vertical integration in multinational media companies becomes a necessary aspect of studying globalized media. Many large media companies practice vertical integration: Newspaper chains take care of their own reporting, printing, and distribution; television companies control their own production and broadcasting; and even small film studios often have parent companies that handle international distribution.
A media company often benefits greatly from vertical integration and globalization. Because of the proliferation of U.S. culture abroad, media outlets are able to use many of the same distribution structures with few changes. Because media rely on the speedy ability to react to current events and trends, a vertically integrated company can do all of this in a globalized rather than a localized marketplace; different branches of the company are readily able to handle different markets. Further, production values for single-country distribution are basically the same as those for multiple countries, so vertical integration allows, for example, a single film studio to make higher-budget movies than it might otherwise be able to produce without a distribution company that has a global reach.
Worth considering is the reciprocal influence of foreign culture on American culture. Certainly, American culture is increasingly exported around the world thanks to globalization, and many U.S. media outlets count strongly on their ability to sell their product in foreign markets. But what Americans consider their own culture has in fact been tailored not only to the tastes of U.S. citizens but also to those of worldwide audiences. The profit potential of foreign markets is enormous: if a movie does well abroad, for example, it might make up for a weak stateside showing, and may even drive interest in the movie in the United States.
One prime example of this phenomenon of global culture and marketing is James Cameron’s 1997 film Titanic. One of the most expensive movies ever produced up to that point, with an official budget of around $200 million, Titanic was not anticipated to perform particularly well at the U.S. box office. Rather, predictions of foreign box-office receipts allowed the movie to be made. Of the total box-office receipts of Titanic, only about one-third came from the domestic market. Although Titanic became the highest-grossing film up to that point, it grossed just $140 million more domestically than Star Wars did 20 years earlier.Box Office Mojo, “All Time Domestic Box Office Results,” http://boxofficemojo.com/alltime/domestic.htm. The difference was in the foreign market. While Star Wars made about the same amount—$300 million—in both the domestic and foreign markets, Titanic grossed $1.2 billion in foreign box-office receipts. In all, the movie came close to hitting the $2 billion mark, and now sits in the No. 2 position behind Cameron’s 2009 blockbuster, Avatar.
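The box-office split described above can be checked with back-of-the-envelope arithmetic. The figures below are rounded approximations consistent with the text (roughly $600 million domestic, $1.2 billion foreign for the original run), not exact grosses.

```python
# Approximate Titanic (1997) original-run grosses, in millions of dollars,
# rounded from the figures cited in the text.
domestic = 600.0
foreign = 1200.0
total = domestic + foreign

print(f"Total gross: ${total:,.0f}M")
print(f"Domestic share: {domestic / total:.0%}")
print(f"Foreign-to-domestic ratio: {foreign / domestic:.1f}x")
```

The arithmetic matches the text's two claims: the domestic market supplied only about one-third of receipts, and the film made roughly twice as much internationally as it did at home.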
Figure 13.5

The movie Titanic, which became the highest-grossing movie of all time, made twice as much internationally as it did domestically.
One reason that U.S. studios can make these kinds of arrangements is their well-developed ties with the worldwide movie industry. Hollywood studios have agreements with theaters all over the world to show their films. By contrast, the foreign market for French films is not nearly as established, as the industry tends to be partially subsidized by the French government. Theaters showing Hollywood studio films in France funnel portions of their box-office receipts to fund French films. However, Hollywood has lobbied the World Trade Organization—a largely pro-globalization group that pushes for fewer market restrictions—to rule that this French subsidy is an unfair restriction on trade.Roman Terrill, “Globalization in the 1990s,” University of Iowa Center for International Finance and Development, 1999, http://www.uiowa.edu/ifdebook/ebook2/contents/part3-I.shtml#B.
In many ways, globalization presents legitimate concerns about the endangerment of indigenous culture. Yet simple concerns over the transfer of culture are not the only or even the biggest worries caused by the spread of American culture and values.
Think of a U.S. product that is available throughout the world, such as an athletic brand like Nike or a food product like Pepsi or Coca-Cola. Now go online to the different country-specific branches of the company’s website.
Cultural imperialismThe takeover of a local culture by a more powerful foreign one. was around long before the United States became a world power. In its broadest strokes, imperialism describes the ways that one nation asserts its power over another. Just as imperial Britain economically ruled the American colonists, so did Britain strongly influence the culture of the colonies. The culture was still a mix of nationalities—many Dutch and Germans settled as well—but the ruling majority of ex-Britons led British culture to generally take over.
Today, cultural imperialism tends to describe the United States’ role as a cultural superpower throughout the world. American movie studios are generally much more successful than their foreign counterparts not only because of their business models but also because the concept of Hollywood has become one of the modern worldwide movie business’s defining traits. Multinational, nongovernmental corporations can now drive global culture. This is neither entirely good nor entirely bad. On one hand, foreign cultural institutions can adopt successful American business models, and corporations are largely willing to do whatever makes them the most money in a particular market—whether that means giving local people a shot at making movies, or making multicultural films such as 2008’s Slumdog Millionaire. However, cultural imperialism has potential negative effects as well. From a spread of Western ideals of beauty to the possible decline of local cultures around the world, cultural imperialism can have a quick and devastating effect.
To begin discussing the topic of cultural imperialism, it is important to look at the ideas of one of its founding theorists, Antonio Gramsci. Strongly influenced by the theories and writings of Karl Marx, Italian philosopher and critic Gramsci originated the idea of cultural hegemony to describe the power of one group over another. Unlike Marx, who believed that the workers of the world would eventually unite and overthrow capitalism, Gramsci instead argued that culture and the media exert such a powerful influence on society that they can actually influence workers to buy into a system that is not economically advantageous to them. This argument that media can influence culture and politics is typified in the notion of the American Dream. In this rags-to-riches tale, hard work and talent can lead to a successful life no matter where one starts. Of course, there is some truth to this, but it is by far the exception rather than the rule.
Marx’s ideas remained at the heart of Gramsci’s beliefs. According to Gramsci’s notion, the hegemons of capitalismAn important part of Gramsci’s theory, it refers to the powerful states or state actors who seek and find control often without resorting to military dominance.—those who control the capital—can assert economic power, while the hegemons of culture can assert cultural power. This concept of culture is rooted in Marxist class struggle, in which one group is dominated by another and conflict arises. Gramsci’s concept of cultural hegemony is pertinent in the modern day not because of the likelihood of a local property-owning class oppressing the poor, but because of concern that rising globalization will permit one culture to so completely assert its power that it drives out all competitors.
A key danger of cultural imperialism is the possibility that American tastes will crowd out local cultures around the globe. The McDonaldizationAn economic force that promotes efficiency, calculability, predictability, and control. of the globe applies not just to its namesake, McDonald’s, with its franchises in seemingly every country, but to any industry that applies the techniques of McDonald’s on a large scale. Coined by George Ritzer in his book The McDonaldization of Society (1993), the concept is rooted in the process of rationalization. With McDonaldization, four aspects of the business are taken to the extreme: efficiency, calculability, predictability, and control—four of the main aspects of free markets. Applying the concepts of an optimized financial market to cultural and human items such as food, McDonaldization enforces general standards and consistency throughout a global industry.
Figure 13.6

McDonald’s has opened up many culturally specific versions of its chain, all employing its famous Golden Arches.
Unsurprisingly, McDonald’s is the prime example of this concept. Although the fast-food restaurant is somewhat different in every country—for example, Indian restaurants offer a pork-free, beef-free menu to accommodate regional religious practices—the same fundamental principles apply in a culturally specific way. The branding of the company is the same wherever it is; the “I’m lovin’ it” slogan is inescapable, and the Golden Arches are, according to Eric Schlosser in Fast Food Nation, “more widely recognized than the Christian cross.”Eric Schlosser, Fast Food Nation: The Dark Side of the All-American Meal (Boston: Houghton Mifflin, 2001), 4. Yet, more importantly, the business model of McDonald’s stays relatively the same from country to country. Although culturally specific variations exist, any McDonald’s in a particular area has basically the same menu as any other. In other words, wherever a consumer is likely to travel within a reasonable range, the menu options and the resulting product remain consistent.
The media industry works in an uncannily similar way to the fast-food industry. Just as the automation of fast food—from freeze-dried french fries to prewrapped salads—attempts to lower a product’s marginal costs, thus increasing profits, media outlets seek to achieve a degree of consistency that allows them to broadcast and sell the same product throughout the world with minimal changes. The idea that media actually spread a culture, however, is controversial. In his book Cultural Imperialism, John Tomlinson argues that exported American culture is not necessarily imperialist because it does not push a cultural agenda; it seeks to make money from whatever cultural elements it can throughout the world. According to Tomlinson, “no one really disputes the dominant presence of Western multinational, and particularly American, media in the world: what is doubted is the cultural implications of this presence.”John Tomlinson, Cultural Imperialism: A Critical Introduction (London: Continuum, 2001).
There are, of course, by-products of American cultural exports throughout the world. American cultural mores, such as the Western standard of beauty, have increasingly made it into global media. As early as 1987, Nicholas Kristof wrote in The New York Times about a young Chinese woman who was planning to have an operation to make her eyes look rounder, more like the eyes of Caucasian women. Western styles—“newfangled delights like nylon stockings, pierced ears and eye shadow”—also began to replace the austere blue tunics of Mao-era China. The pervasiveness of cultural influence is difficult to track, however, as the young Chinese woman says that she wanted to have the surgery not because of Western looks but because “she thinks they are pretty.”Nicholas D. Kristof, “In China, Beauty Is a Big Western Nose,” New York Times, April 29, 1987, http://www.nytimes.com/1987/04/29/garden/in-china-beauty-is-a-big-western-nose.html.
Figure 13.7

After September 11, 2001, President George W. Bush framed the issue of terrorism as a cultural conflict as much as a military one.
Not everyone views the spread of American tastes as a negative occurrence. During the early 21st century, much of the United States’ foreign policy stemmed from the idea that spreading freedom, democracy, and free-market capitalism through cultural influence around the world could cause hostile countries such as Iraq to adopt American ways of living and join the United States in the fight against global terrorism and tyranny. Although this plan did not succeed as hoped, it raises the question of whether Americans should truly be concerned about spreading their cultural system if they believe that it is an ideal one.
Speaking after the attacks of September 11, 2001, then-President George W. Bush presented two simple ideas to the U.S. populace: “They [terrorists] hate our freedoms,” and “Go shopping.”President George W. Bush, address on terrorism before a joint meeting of Congress, New York Times, September 21, 2001, http://www.nytimes.com/2001/09/21/us/nation-challenged-president-bush-s-address-terrorism-before-joint-meeting.html. These twin ideals of personal freedom and economic activity are often held up as the prime exports of American culture. However, the idea that other local beliefs need to change may threaten people of other cultures.
The spread of culture works in mysterious ways. Hollywood probably does not actually have a master plan to export the American way of life around the globe and displace local culture, just as American music is not necessarily a progenitor of democratic government and economic cooperation. Rather, local cultures respond to the outside culture of U.S. media and democracy in many different ways. First of all, media are often much more flexible than believed; the worldwide success of the film Titanic was not an accident in which everyone in the world suddenly wanted to experience movies like an American. Rather, the film’s producers had judged that it would succeed on a world stage just as it had on a domestic one. In some ways, then, U.S. media have become not only more widespread but also more worldwide in focus. It could even be argued that American cultural exports promote intercultural understanding; after all, to sell to a culture, a business must first understand that culture.
By contrast, some local cultures around the world have taken to Western-style business models so greatly that they have created their own hybrid cultures. One well-known example of this is India’s Bollywood film industry. Combining traditional Indian music and dance with American-style filmmaking, Bollywood studios release around 700 major films each year, three times the rate of the major Hollywood studios. India’s largest film industry mixes melodrama with musical interludes, lip-synced by actors but sung by pop stars. These pop songs are disseminated well before a movie’s release, both to build up hype and to enter multiple media markets. Although similar marketing tactics have been employed in the United States, Bollywood seems to have mastered the art of cross-media integration. The music and dance numbers are essentially cinematic forms of music videos, both promoting the soundtrack and adding variety to the film. The numbers also feature many different Indian national languages and a hybrid of Western dance music and Indian classical singing, a certain departure from conventional Western media.Richard Corliss, “Hooray for Bollywood!” Time, September 16, 1996, http://www.time.com/time/magazine/article/0,9171,985129,00.html.
While cultural imperialism might cause resentment in many parts of the world, the idea that local cultures are helpless under the crushing power of American cultural imposition is clearly too simplistic. Instead, local cultures seem to adopt American-style media models, adapting their methods to fit the corporate structures, rather than just the aesthetics, of U.S. media. These economic and cultural aspects are clearly intertwined, but the notion of a foreign power unilaterally crushing a native culture does not seem to be entirely accurate.
Review Questions
Please respond to the following short-answer writing prompts. Each response should be a minimum of one paragraph.
Media now rely heavily on synergy, or cross-platform media distribution. Because of this, one of the industry’s fastest-growing career fields employs people who manage the online presence of more traditional media outlets such as radio and television stations. Although such jobs used to require extensive technological knowledge, modern online project managers, online media editors, and web producers spend much of their time determining how best to display content online.
In this activity, you will research a media outlet and then answer questions about the choices that the web producer, editor, or manager made regarding its content. Some possible websites to research include the following:
Now answer the following questions regarding the site that you picked:
Figure 14.1
The U.S. Constitution’s First Amendment guarantees Americans freedom of the press, which many would agree is an important ingredient in upholding democratic principles. Freedom from government censorship allows the news media to keep citizens informed about the state of their society. But when does the press take this freedom from censorship and restriction too far? The death of Princess Diana in 1997 brought fierce criticism against the paparazzi, and tabloid reporting in general, when it was found that the princess’s car had been pursued by paparazzi vehicles before the crash that caused her death. In June 2011, Tori Spelling crashed after being chased by paparazzi. She was pregnant at the time and feared for her unborn child. In July 2011, paparazzi were detained for chasing Paris Jackson (Michael Jackson’s daughter). Despite these incidents, the public’s interest in celebrity gossip has not diminished; rather, the growth of online news sources has led to a proliferation of celebrity gossip websites.
A potential concern regarding this trend is that tabloid-style gossip is not confined to public figures in the entertainment industry; it can have far-reaching consequences. As noted earlier, the firing of General Stanley McChrystal from his post as commander of all U.S. and NATO forces in Afghanistan in June 2010 was nearly the direct result of an article in Rolling Stone, in which he made less-than-flattering comments about Vice President Joe Biden.Michael Hastings, “The Runaway General,” Rolling Stone, June 25, 2010, http://www.rollingstone.com/politics/news/17390/119236. McChrystal himself did not directly criticize the president or the administration’s policies; instead, his views were inferred from comments made by his aides.MSNBC, “Obama, McCain, Kerry Comment on McChrystal,” June 22, 2010, http://www.msnbc.msn.com/id/37850711/ns/us_news-military/. However, this was sufficient to cost him his job. In recent years, tabloid reporting has become increasingly invasive and sometimes dangerous.
Should the government begin placing stronger regulations on tabloid reporting as privacy advocates have argued? The Constitution, after all, while guaranteeing freedom of the press, also has been interpreted as guaranteeing individuals certain rights to privacy, and most journalists would agree that standards of ethical journalism include efforts to protect these rights. However, some paparazzi photographers and celebrity journalists disregard journalistic codes of ethics in their efforts to get a story.Patrick J. Alach, “Paparazzi and Privacy,” Loyola of Los Angeles Entertainment Law Review 28, no. 3 (2008): 205. Many argue that because celebrities are “public figures,” the same privacy rights that protect the general public don’t apply. Us Weekly’s editor in chief, Janice Min, has argued, “A celebrity is like an elected official. If you’re getting paid $20 million a movie, you have to rely on public goodwill to stay in office. You have to accept the fact that you’re a public commodity.”Donna Freydkin, “Celebrities Fight for Privacy,” USA Today, July 6, 2004, http://www.usatoday.com/life/people/2004-07-06-celeb-privacy_x.htm. Harvey Levin, editor in chief for the popular celebrity gossip blog TMZ, would agree. When discussing invasions into the private lives of stars like Britney Spears, Levin proclaimed that “Britney is gold; she is crack to our readers. Her life is a complete train-wreck and I thank God for her every day.”New York Times, “TMZ Productions,” Times Topics, July 7, 2009, http://topics.nytimes.com/top/news/business/companies/tmz_productions/index.html?scp=1-spot&sq=tmz&st=cse.
On the other side of the debate, many argue that the public-figure limitation should be balanced with the consideration of a story’s newsworthiness. As law professor Patrick J. Alach has argued, “If ‘social value’ is what constitutes newsworthiness, it is hard to imagine a more perverse concept of social value that incorporates … Paris Hilton’s late-night dining preferences or Lindsay Lohan’s driving habits.”Patrick J. Alach, “Paparazzi and Privacy,” Loyola of Los Angeles Entertainment Law Review 28, no. 3 (2008): 237.
TMZ, a website that publishes celebrity news in real time, was launched in 2005, and since its creation the site has received numerous criticisms from more prestigious news sources like The Washington Post and ABC News. Yet Thane Burnett, reporter for The Toronto Sun, admits that “despite the sideways glances, mainstream news services prowl TMZ’s site for coverage.”Thane Burnett, “Caught on Camera,” Toronto Sun, May 12, 2009, http://www.torontosun.com/entertainment/celebrities/2009/05/12/9429036-sun.html. With the immediacy of Internet news coverage, mainstream media outlets face increasing pressure to release major news while it is still fresh, and that pressure is compounded by celebrity gossip sites like TMZ that may resort to unorthodox methods to gather information. The shelf life of breaking news grows ever shorter.
In the competitive and rapidly changing world of mass-media communications, media professionals—overcome by deadlines, bottom-line imperatives, and corporate interests—can easily lose sight of the ethical implications of their work. However, as entertainment law specialist Sherri Burr points out, “Because network television is an audiovisual medium that is piped free into ninety-nine percent of American homes, it is one of the most important vehicles for depicting cultural images to our population.”Sherri Burr, “Television and Societal Effects: An Analysis of Media Images of African-Americans in Historical Context,” Journal of Gender, Race and Justice 4 (2001): 159. Considering the profound influence mass media like television have on cultural perceptions and attitudes, it is important for the creators of media content to grapple with ethical issues.
The U.S. population is becoming increasingly diverse. According to U.S. Census statistics from 2010, 27.6 percent of the population identifies its race as non-white.U.S. Census Bureau, “2010 Census Data,” http://2010.census.gov/2010census/data/. Yet in network television broadcasts, major publications, and other forms of mass media and entertainment, minorities are often either absent or presented as heavily stereotyped, two-dimensional characters. Rarely are minorities depicted as complex characters with the full range of human emotions, motivations, and behaviors. Meanwhile, the stereotyping of women, gays and lesbians, and individuals with disabilities in mass media has also been a source of concern.
The word stereotype originated in the printing industry as a method of making identical copies, and the practice of stereotyping people is much the same: a system of identically replicating an image of an “other.” As related in Chapter 8 "Movies" about D. W. Griffith’s The Birth of a Nation, a film that relied on racial stereotypes to portray Southern whites as victims in the American Civil War, stereotypes—especially those disseminated through mass media—become a form of social control, shaping collective perceptions and individual identities. In American mass media, the white man is still shown as the standard: the central figure of television narratives and the dominant perspective on everything from trends, to current events, to politics. White maleness becomes an invisible category because it gives the impression of being the norm.Joanna Hearne, “Hollywood Whiteness and Stereotypes,” Film Reference, http://www.filmreference.com/encyclopedia/Independent-Film-Road-Movies/Race-and-Ethnicity-HOLLYWOOD-WHITENESS-AND-STEREOTYPES.html.
In the fall of 1999, when the major television networks released their schedules for the upcoming programming season, a startling trend became clear. Of the 26 newly released television programs, none depicted an African American in a leading role, and even the secondary roles on these shows included almost no racial minorities. In response to this omission, the National Association for the Advancement of Colored People (NAACP) and the National Council of La Raza (NCLR), an advocacy group for Hispanic Americans, organized protests and boycotts. Pressured—and embarrassed—into action, executives from the major networks scrambled to add racial minorities to their prime-time shows, not only among actors, but also among producers, writers, and directors. Four of the networks—ABC, CBS, NBC, and Fox—added a vice president of diversity position to help oversee the networks’ progress toward creating more diverse programming.Leonard M. Baynes, “White Out: The Absence and Stereotyping of People of Color by the Broadcast Networks in Prime Time Entertainment Programming,” Arizona Law Review 45 (2003): 293.
Despite these changes and greater public attention regarding diversity issues, minority underrepresentation is still an issue in all areas of mass media. In fact, the trend in recent years has been regressive. In a recent study, the NAACP reported that the number of minority actors on network television has actually decreased, from 333 during the 2002–2003 season to 307 four years later.WWAY, “NAACP Not Pleased With the Diversity on Television,” January 12, 2009, http://www.wwaytv3.com/naacp_not_pleased_diversity_television/01/2009. Racial minorities are often absent, peripheral, or take on stereotyped roles in film, television, print media, advertising, and even in video games. Additionally, according to a 2002 study by the University of California, Los Angeles, the problem is not only a visible one, but also one that extends behind the scenes. The study found that minorities are even more underrepresented in creative and decision-making positions than they are on screen.Media Awareness Network, “Ethnic and Visible Minorities in Entertainment Media,” 2010, http://www.media-awareness.ca/english/issues/stereotyping/ethnics_and_minorities/minorities_entertainment.cfm. This lack of representation among producers, writers, and directors often directly affects the way minorities are portrayed in film and television, leading to racial stereotypes.
Though advocacy groups like the NCLR and the NAACP have often been at the forefront of protests against minority stereotypes in the media, experts are quick to point out that the issue is one everyone should be concerned about. As media ethicist Leonard M. Baynes argues, “Since we live in a relatively segregated country…broadcast television and its images and representations are very important because television can be the common meeting ground for all Americans.”Leonard M. Baynes, “White Out: The Absence and Stereotyping of People of Color by the Broadcast Networks in Prime Time Entertainment Programming,” Arizona Law Review 45 (2003): 293. There are clear correlations between mass media portrayals of minority groups and public perceptions. In 1999, after hundreds of complaints by African Americans that they were unable to get taxis to pick them up, the city of New York launched a crackdown, threatening to revoke the licenses of cab drivers who refused to stop for African American customers. When interviewed by reporters, many cab drivers blamed their actions on fears they would be robbed or asked to drive to dangerous neighborhoods.Sherri Burr, “Television and Societal Effects: An Analysis of Media Images of African-Americans in Historical Context,” Journal of Gender, Race and Justice 4 (2001): 159.
Racial stereotypes are not only an issue in entertainment media; they also find their way into news reporting, which is a form of storytelling. Journalists, editors, and reporters are still predominantly white. According to a 2000 survey, only 11.6 percent of newsroom staff in the United States were racial and ethnic minorities.Media Awareness Network, “Ethnic and Visible Minorities in the News,” 2010, http://www.media-awareness.ca/english/issues/stereotyping/ethnics_and_minorities/minorities_news.cfm. The situation has not improved dramatically during the past decade. According to a 2008 newsroom census released by the American Society of Newspaper Editors, the percentage of minority journalists working at daily newspapers was a scant 13.52 percent.National Association of Hispanic Journalists, “NAHJ Disturbed by Figures That Mask Decline in Newsroom Diversity,” news release, 2010, http://www.nahj.org/nahjnews/articles/2008/April/ASNE.shtml. Because of this underrepresentation behind the scenes, the news media are led by those whose perspective is already privileged, who create the narratives about those without privilege. In the news media, racial minorities are often cast in the role of villains or troublemakers, which in turn shapes public perceptions about these groups. Media critics Robert Entman and Andrew Rojecki point out that images of African Americans on welfare, African American violence, and urban crime in African American communities “facilitate the construction of menacing imagery.”Clifford G. Christians, “Communication Ethics,” in Encyclopedia of Science, Technology, and Ethics, ed. Carl Mitchum (Detroit: Macmillan Reference USA, 2005), 1:366. Similarly, a study by the National Association of Hispanic Journalists found that only 1 percent of the evening news stories aired by the three major U.S. television networks cover Latinos or Latino issues, and that when Latinos are featured, they are portrayed negatively 80 percent of the time.Media Awareness Network, “Ethnic and Visible Minorities in the News.” Still others have criticized journalists and reporters for a tendency toward reductive presentations of complex issues involving minorities, such as the religious and racial tensions fueled by the September 11 attacks. By reducing these conflicts to “opposing frames”—that is, by oversimplifying them as two-sided struggles so that they can be quickly and easily understood—the news media helped create a greater sense of separation between Islamic Americans and the dominant culture after September 11, 2001.Ginny Whitehouse, “Why Diversity Is an Ethical Issue,” The Handbook of Mass Media Ethics, ed. Lee Wilkins and Clifford G. Christians (New York: Routledge, 2009), 101.
Since the late 1970s, the major professional journalism organizations in the United States—Associated Press Managing Editors (APME), Newspaper Association of America (NAA), American Society of Newspaper Editors (ASNE), Society for Professional Journalists (SPJ), Radio and Television News Directors Association (RTNDA), and others—have included greater ethnic diversity as a primary goal or ethic. However, progress has been slow. ASNE has set 2025 as a target date to have minority representation in newsrooms match U.S. demographics.Ginny Whitehouse, “Why Diversity Is an Ethical Issue,” The Handbook of Mass Media Ethics, ed. Lee Wilkins and Clifford G. Christians (New York: Routledge, 2009), 102.
Because the programming about, by, and for ethnic minorities in the mainstream media is disproportionately low, many turn to niche publications and channels such as BET, Univision, Telemundo, Essence, Jet, and others for sources of information and entertainment. In fact, 45 percent of ethnic-minority adults prefer these niche media sources to mainstream television, radio programs, and newspapers.Ginny Whitehouse, “Why Diversity Is an Ethical Issue,” The Handbook of Mass Media Ethics, ed. Lee Wilkins and Clifford G. Christians (New York: Routledge, 2009), 103. These sources cover stories about racial minorities that are generally ignored by the mainstream press and offer ethnic-minority perspectives on more widely covered issues in the news.Pew Project for Excellence in Journalism, “Ethnic,” in The State of the News Media 2010, http://www.stateofthemedia.org/2010/ethnic_summary_essay.php. Entertainment channels like BET (a 24-hour cable television station that offers music videos, dramas featuring predominately black casts, and other original programming created by African Americans) provide the diverse programming that mainstream television networks often drop.Rachel Zellars, “Black Entertainment Television (BET),” in Encyclopedia of African-American Culture and History, 2nd ed., ed. Colin A. Palmer (Detroit: Macmillan Reference USA, 2006.) 1:259. Print sources like Vista, a bilingual magazine targeting U.S. Hispanics, and Vivid, the most widely circulated African American periodical, appeal to ethnic minority groups because they are controlled and created by individuals within these groups. 
Though some criticize ethnic niche media, claiming that they erode common ground or, in some instances, perpetuate stereotypes, the popularity of these media has only grown in recent years and will likely continue in the absence of more diverse perspectives in mainstream media sources.Can Tran, “TV Network Reviews: Black Entertainment Television (BET),” Helium, http://www.helium.com/items/884989-tv-network-reviews-black-entertainment-television-bet; Joe Flint, “No Black-and-White Answer for the Lack of Diversity on Television,” Company Town (blog), Los Angeles Times, June 11, 2010, http://latimesblogs.latimes.com/entertainmentnewsbuzz/2010/06/diversity-television.html.
In the ABC sitcom The Donna Reed Show (1958–1966), actress Donna Reed plays a stay-at-home mother who fills her days with housework, cooking for her husband and children, decorating, and participating in community organizations, all while wearing pearls, heels, and stylish dresses. Such a traditional portrayal of femininity no doubt sounds dated to modern audiences, but stereotyped gender roles continue to thrive in the mass media. Women are still often represented as subordinate to their male counterparts—emotional, noncompetitive, domestic, and sweet natured. In contrast to these types, other women are represented as unattractively masculine, crazy, or cruel. In television dramas and sitcoms, women continue to fill traditional roles such as mothers, nurses, secretaries, and housewives. By contrast, men in film and television are less likely to be shown in the home, and male characters are generally characterized by dominance, aggression, action, physical strength, and ambition.Daniel Chandler, “Television and Gender Roles,” http://www.aber.ac.uk/media/Modules/TF33120/gendertv.html#E. In the mainstream news media, men are predominantly featured as authorities on specialized issues like business, politics, and economics, while women are more likely to report on stories about natural disasters or domestic violence—coverage that does not require expertise.Media Awareness Network, “Media Coverage of Women and Women’s Issues,” http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_working.cfm. In sports programming, men are the authoritative figures in the broadcast booth while women are “sideline reporters.”
Not only is the white male perspective still presented as the standard, authoritative one, but also the media itself often comes to embody the male gaze. Media commentator Nancy Hass notes that “shows that don’t focus on men have to feature the sort of women that guys might watch.”Media Awareness Network, “The Economics of Gender Stereotyping,” http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_economics.cfm. Feminist critics have long been concerned by the way women in film, television, and print media are defined by their sexuality. Few female role models exist in the media who are valued primarily for qualities like intelligence or leadership. Inundated by images that conform to unrealistic beauty standards, women come to believe at an early age that their value depends on their physical attractiveness. According to one Newsweek article, eating disorders in girls are now routinely being diagnosed at younger ages, sometimes as early as 8 or 9. The models who appear in magazines and print advertising are unrealistically skinny (23 percent thinner than the average woman), and their photographs are further enhanced to hide flaws and blemishes. Meanwhile, the majority of women appearing on television are under the age of 30, and many older actresses, facing the pressure to embody the youthful ideal, undergo surgical enhancements to appear younger.Jennifer L. Derenne and Eugene V. Beresin, “Body Image, Media, and Eating Disorders,” Academic Psychiatry 30 (2006), http://ap.psychiatryonline.org/cgi/content/full/30/3/257. One recent example is television news host Greta Van Susteren, a respected legal analyst who moved from CNN to Fox in 2002. 
At the debut of her show, On the Record, Van Susteren, sitting behind a table that allowed viewers to see her short skirt, had undergone not only a hair and wardrobe makeover, but also surgical enhancement to make her appear younger and more attractive.Media Awareness Network, “Media Coverage of Women and Women’s Issues,” http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_working.cfm. However, these enhancements are not restricted to the “over 30” crowd. Indeed, younger stars, such as Ashley Tisdale (High School Musical), singer Ashlee Simpson, and reality star Heidi Montag, have all availed themselves of plastic surgery.
In addition to the prevalence of gender stereotypes, the ratio of men to women in the mass media, both on screen and behind the scenes, is also disproportionate. Though women slightly outnumber men in the general population, over two-thirds of television sitcoms feature men in the starring role.Media Awareness Network, “The Economics of Gender Stereotyping,” http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_economics.cfm. Among writers, producers, directors, and editors, the number of women lags far behind. In Hollywood, for instance, women account for only 17 percent of behind-the-scenes creative talent. Communications researcher Martha Lauzen argues that “when women have more powerful roles in the making of a movie or television show, we know that we also get more powerful female characters on-screen, women who are more real and more multi-dimensional.”Media Awareness Network, “Women Working in the Media,” http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_working.cfm.
Creators of all forms of media know that sex—whether named, hinted at through innuendo, or overtly displayed—is a surefire way to grab an audience’s attention. “Sex sells” is an advertising cliché; the list of products that advertisers have linked to erotic imagery or innuendo, from cosmetics and cars to vacation packages and beer, is nearly inexhaustible. Most often, sexualized advertising content is served up in the form of the female body, in part or in whole, featured in provocative or suggestive poses beside a product that may have nothing to do with sexuality. By linking the two, however, advertisers are marketing desire itself.
Figure 14.2

Sex Sells: Commodifying Desire, Past and Present
Sex is used to sell not just consumer goods; it sells media, too. Music videos on MTV and VH1, which promote artists and their music, capture audience attention with highly suggestive dance moves, often performed by scantily clad women. Recent music videos by Jennifer Lopez, Rihanna, Beyoncé, and Lady Gaga are just a few examples. Movie trailers may flash brief images of nudity or passionate kissing to suggest more to come in the movie. Video games feature female characters like Lara Croft of Tomb Raider, whose tightly fitted clothes reveal all the curves of her Barbie-doll figure. And partially nude models grace the covers of men’s and women’s magazines like Maxim, Cosmopolitan, and Vogue, where cover lines promise titillating tips, gossip, and advice on bedroom behavior.Tom Reichert and Jacqueline Lambiase, “Peddling Desire: Sex and the Marketing of Media and Consumer Goods,” Sex in Consumer Culture: The Erotic Content of Media and Marketing, ed. Tom Reichert and Jacqueline Lambiase (New York: Routledge, 2005), 3.
In the 1920s and 1930s, filmmakers attracted audiences to the silver screen with the promise of what was then considered scandalous content. Prior to the 1934 Hays Code, which placed restrictions on “indecent” content in movies, films featured erotic dances, male and female nudity, references to homosexuality, and sexual violence (for more information on the Hays Code, see Chapter 8 "Movies" and Chapter 15 "Media and Government"). D. W. Griffith’s Intolerance (1916) includes scenes with topless actresses, as does Ben Hur (1925). In Warner Bros.’ Female (1933), the leading lady, the head of a major car company, spends her evenings in sexual exploits with her male employees, a story line that would never have passed the Hays Code a year later.Gary Morris, “Public Enemy: Warner Brothers in the Pre-Code Era,” Bright Lights Film Journal, September 1996, http://www.brightlightsfilm.com/17/04b_warner.php. Trouble in Paradise, a 1932 romantic comedy, was withdrawn from circulation after the institution of the Hays Code because of its frank discussion of sexuality. Similarly, Dr. Jekyll and Mr. Hyde (1931), which featured a prostitute as one of the main characters, was also banned under the code.Daniel P. Hauesser, “Indecent and Deviant: Pre-Hays Code Films You Should See,” indieWIRE, 2007, http://www.spout.com/groups/Top_5/Re_5_Pre_Hays_Code_Films/190/19210/1/ShowPost.aspx.
In the 1960s, when the sexual revolution led to increasingly permissive attitudes toward sexuality in American culture, the Hays Code was replaced with the MPAA rating system, with ratings such as G, PG, and R. The rating system, designed to warn parents about potentially objectionable material in films, allowed filmmakers to include sexually explicit content without fear of public protest. Since the replacement of the Hays Code, sexual content has been featured in movies with much greater frequency.
The problem, according to many media critics, is not that sex now appears more often, but that it is almost always portrayed unrealistically in American mass media.Mary Lou Galician, Sex, Love & Romance in the Mass Media (New York: Routledge, 2004), 5; Media Awareness Network, “Sex and Relationships in the Media,” Media Awareness Network, http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_sex.cfm. This can be harmful, they say, because the mass media are important socialization agents; that is, ways that people learn about the norms, expectations, and values of their society.Mary Lou Galician, Sex, Love & Romance in the Mass Media (New York: Routledge, 2004), 82. Sex, as many films, television shows, music videos, and song lyrics present it, is frequent and casual. Rarely do these media point out the potential emotional and physical consequences of sexual behavior. According to one study, portrayals of sex that include possible risks like sexually transmitted diseases or pregnancy occur in only 15 percent of the sexually explicit material on television.Parents Television Council, “Facts and TV Statistics,” http://www.parentstv.org/ptc/facts/mediafacts.asp. Additionally, actors and models depicted in sexual relationships in the media are thinner, younger, and more attractive than the average adult. This creates unrealistic expectations about the necessary ingredients for a satisfying sexual relationship.
Social psychologists are particularly concerned with the negative effects these unrealistic portrayals have on women, as women’s bodies are the primary means of introducing sexual content into media targeted at both men and women. Media activist Jean Kilbourne points out that “women’s bodies are often dismembered into legs, breasts or thighs, reinforcing the message that women are objects rather than whole human beings.”Media Awareness Network, “Sex and Relationships in the Media,” Media Awareness Network, http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_sex.cfm. Adbusters, a magazine that critiques mass media, particularly advertising, points out the sexual objectification of women’s bodies in a number of its spoof advertisements, such as the one in Figure 14.3, bringing home the message that advertising often sends unrealistic and harmful messages about women’s bodies and sexuality. Additionally, many researchers note that in women’s magazines, advertising, and music videos, women are often implicitly—and sometimes explicitly—given the message that a primary concern should be attracting and sexually satisfying men.Media Awareness Network, “Sex and Relationships in the Media,” Media Awareness Network, http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_sex.cfm. Furthermore, the recent increase in entertainment featuring sexual violence may, according to some studies, negatively affect the way young men behave toward women.Barrie Gunter, Media Sex: What Are the Issues? (Mahwah, NJ: Lawrence Erlbaum Associates, 2002), 8.
Figure 14.3

Sexual objectification: Women’s bodies are often headless or dismembered into legs, breasts, or thighs in media portrayals.Adbusters, “Spoof Ads,” https://www.adbusters.org/gallery/spoofads.
Young women and men are especially vulnerable to the effects of media portrayals of sexuality. Psychologists have long noted that teens and children get much of their information and many of their opinions about sex through television, film, and online media. In fact, two-thirds of adolescents turn to the media first when they want to learn about sexuality.Media Awareness Network, “Sex and Relationships in the Media,” Media Awareness Network, http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_sex.cfm. The media may help shape teenage and adolescent attitudes toward sex, but they can also lead young people to engage in sexual activity before they are prepared to handle the consequences. According to one study, kids with high exposure to sex on television were almost twice as likely to initiate sexual activity as kids with limited exposure.Rebecca L. Collins and others, “Watching Sex on Television Predicts Adolescent Initiation of Sexual Behavior,” Pediatrics 114, no. 3 (2004), http://pediatrics.aappublications.org/cgi/content/full/114/3/e280.
Cultural critics have noted that sexually explicit themes in mass media are generally more widely accepted in European nations than they are in the United States. However, the increased concern and debates over censorship of sexual content in the United States may in fact be linked to the way sex is portrayed in American media rather than to the presence of the sexual content in and of itself. Unrealistic portrayals that fail to take into account the actual complexity of sexual relationships seem to be a primary concern. As Jean Kilbourne has argued, sex in the American media “has far more to do with trivializing sex than with promoting it. We are offered a pseudo-sexuality that makes it far more difficult to discover our own unique and authentic sexuality.”Media Awareness Network, “Sex and Relationships in the Media,” Media Awareness Network, http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_sex.cfm. However, despite these criticisms, it is likely that unrealistic portrayals of sexual content will continue to be the norm in mass media unless the general public stops consuming these images.
Choose a television show or movie you are familiar with and consider the characters in terms of racial and gender diversity. Then answer the following short-answer questions. Each response should be one to two paragraphs.
Now more than ever, with the presence of online news sources, news delivery is expected to be instantaneous, and journalists and news agencies face pressure to release stories rapidly to keep up with competing media sources. With this added pressure, standards of accuracy and fairness become more difficult to uphold. What wins when ethical responsibility and bottom-line concerns are at odds? Columnist Ellen Goodman notes that there has always been a tension in journalism between being first and being right. She argues, “In today’s amphetamine world of news junkies, speed trumps thoughtfulness too often.”Ellen Goodman, “Temper ‘Instant’ News Coverage,” Gainesville (FL) Sun, February 7, 1993, http://news.google.com/newspapers?nid=1320&dat=19930207&id=vt4RAAAAIBAJ&sjid=XuoDAAAAIBAJ&pg=5028,1856837. As you read the following sections, decide if you agree with Goodman’s assessment of the state of the news media today.
In 1916, audiences across America tuned in to their radios to hear the first-ever breaking-news coverage of an event as the results of the presidential election between Woodrow Wilson and Charles Evans Hughes were announced from the offices of The New York American. Until that broadcast, news was delivered to American homes once per day in the form of a newspaper, and often this coverage lagged a day or more behind the actual incidents it reported. Whereas much of radio news coverage even into the 1930s involved the reading of newspaper stories and news wires on the air, radio offered something that the newspapers could not: live coverage of special events.Gordon Govier, “The Living Room Fixture,” The Evolution of Radio News, 2007, http://www.radioscribe.com/formats.html.
For decades, the public turned to the family radio when they wanted to hear the most recent coverage of important news. All of that changed, however, in 1963 with the assassination of President John F. Kennedy. CBS correspondent Dan Rather took television audiences live to “the corner window just below the top floor, where the assassin stuck out his 30 caliber rifle,” and for the first time, people were able to see an event nearly as it occurred. This was the beginning of round-the-clock television news coverage, and the American public, while still relying on print news for detailed coverage, came to expect greater immediacy of major event reporting through television and radio broadcasts.Jaime Holguin, “Rather Recalls JFK Assassination,” CBS News, February 28, 2005, http://www.cbsnews.com/stories/2005/02/28/eveningnews/main677096.shtml.
Today, with the widespread availability of Internet news, instant coverage is the norm rather than the exception, and the Internet and cell phones have generally replaced television and radio as the source of immediate information. Visitors to ABCNews.com can watch an evening newscast three and a half hours before it airs on television.Patricia Sullivan, “As the Internet Grows Up, the News Industry Is Forever Changed,” Washington Post, June 19, 2006, http://www.washingtonpost.com/wp-dyn/content/article/2006/06/13/AR2006061300929.html. RSS (which stands for really simple syndication, a standard for the easy syndication of online content) feeds, home pages for major news-delivery sites like Yahoo! News and CNN.com, news tickers, live video streams, blogs, Facebook, Twitter, and a host of other media outlets ensure that news—and rumors of news—circulates within minutes of its occurrence. Additionally, with smartphone applications like those for The New York Times and USA Today, people can access the latest news coverage from almost anywhere.
The development of the Internet as a source of free and immediate access to information has forever changed the structure of the news media. Newspaper, television, and radio news programs have all had to adapt and diversify to compete for a share of the market. As Jeffrey Cole, director of the Center for the Digital Future, put it, “For the first time in 60 years, newspapers are back in the breaking news business.” Online, newspapers can compete with broadcast media for immediate coverage, posting articles on their home pages as soon as the stories are written, and supplementing the articles on their websites with audiovisual content. Gone is the era of single-medium newsrooms with predictable deadlines.USC Annenberg School for Communication and Journalism, “Annual Internet Survey by Center for the Digital Future Finds Large Increases in Use of Online Newspapers,” news release, Center for Digital Future, April 2009, http://annenberg.usc.edu/News%20and%20Events/News/090429CDF.aspx.
Not only are traditional news media restructuring, but news consumers are also changing the way they access information. Increasingly, audiences want news on demand; they want to get news when they want it, and they want to be able to gather it from a variety of sources. This is having a significant effect on media revenues. News aggregators, websites like Yahoo! News and Google News that compile news headlines from an array of legacy news organizations to display on their pages, have become popular information outlets. Although these websites don’t hire reporters to produce news stories themselves, they get about the same amount of online traffic as websites for legacy news organizations like CNN and The Wall Street Journal. Moreover, many subscribers to print newspapers and magazines are canceling their subscriptions because they can get more current information online at no cost.Pew Project for Excellence in Journalism, The State of the News Media 2010, http://www.stateofthemedia.org/2010/overview_intro.php. Print advertising is down as well. In 2004, The San Francisco Chronicle reported losing $50 million in classified advertising to free online options like Craigslist.Patricia Sullivan, “As the Internet Grows Up, the News Industry Is Forever Changed,” Washington Post, June 19, 2006, http://www.washingtonpost.com/wp-dyn/content/article/2006/06/13/AR2006061300929.html.
This loss of revenue has become a problem in recent years because while newspapers and magazines generate some income from advertisements on their websites, the money is not enough to compensate for lost readership and print ads. Subscriptions and advertising in traditional print media still account for 90 percent of industry funds, which means that with less revenue in these areas, the support base for news organizations is dwindling. Newspapers and magazines across the country have had to restructure and scale down. Newspapers now spend $1.6 billion less annually on reporting and editing than they did 10 years ago.Pew Project for Excellence in Journalism, The State of the News Media 2010.
Additionally, reduced budgets combined with greater pressure for immediacy have changed the way information gets reported and disseminated. Newsrooms are asking their staffs to focus on producing first accounts more quickly to feed multiple platforms. This often means that more resources go into distributing information than gathering it. Once news is released online by one source, it spreads rapidly, and other organizations scramble to release accounts, too, in order to keep up, often leaving staff less time for fact-checking and editing. The initial story is then followed quickly by commentary from both professional news organizations and nonprofessional sources on blogs, Twitter, and other social networks.
As a result of this restructuring, certain stories may get distributed, replayed, and commented on almost excessively, while other stories go unnoticed and in-depth coverage that would unearth more facts and context gets neglected. This has led a number of industry professionals to become anxious over the future of the news industry. The Pew Project for Excellence in Journalism has called the news industry today “more reactive than proactive.”Pew Project for Excellence in Journalism, The State of the News Media 2010. Journalist Patricia Sullivan complains, “Right now, almost no online news sites invest in original, in-depth and scrupulously edited news reporting.”Patricia Sullivan, “As the Internet Grows Up, the News Industry Is Forever Changed,” Washington Post, June 19, 2006, http://www.washingtonpost.com/wp-dyn/content/article/2006/06/13/AR2006061300929.html. While some may disagree with Sullivan, in-depth journalism remains an expensive and time-consuming venture that many online news sites, faced with uncertain revenue streams and a growing consumer demand for real-time news updates, are reluctant to bankroll extensively.
Already strapped for funds, news organizations know they have to cater to public demands, and foremost among these demands is speed. When pop-music icon Michael Jackson died on June 26, 2009, at 2:26 p.m., news of his death hit cyberspace by 2:44 p.m. and soon spread nationwide via Twitter. Perhaps surprisingly, the initial report of Jackson’s death was released by celebrity gossip website TMZ. Legacy news sources were slower to publish accounts. The Los Angeles Times, wary of the sourcing of the story, waited to confirm the news and didn’t publish the story on its website until 3:15 p.m., by which time, thanks to the speed of social media, the star’s death was already “old news.”Scott Collins and Greg Braxton, “TV Misses Out as Gossip Website TMZ Reports Michael Jackson’s Death First,” Los Angeles Times, June 26, 2009, http://articles.latimes.com/2009/jun/26/local/me-jackson-media26.
Figure 14.4

American news organizations are losing their audiences to online media and have lost billions in advertising income.
In the preamble to its statement of purpose, the Committee of Concerned Journalists lists as the central purpose of journalism “to provide citizens with accurate and reliable information they need to function in a free society.”Committee of Concerned Journalists, “Statement of Shared Purpose,” Pew Project for Excellence in Journalism, http://www.journalism.org/resources/principles. This theory of the social responsibility of the press is often referred to as the vital information premise: the idea that the news media have a responsibility to society as a whole to provide the information it needs to function as a democracy. Though sometimes worded differently by different organizations, it is widely accepted in the journalism community as the foundation for any principles of media ethics.Jeremy Iggers, Good News, Bad News: Journalism Ethics and the Public Interest (Boulder, CO: Westview Press, 1999), 46. What are those specific principles? Here are some that are particularly important for journalists in the current media climate.
If the basis for the principles of ethical news reporting is giving citizens the information they need to function in a democratic society, then that information must be presented accurately. Journalists should be careful to verify the facts before they report them. As the Committee of Concerned Journalists asserts, “Accuracy is the foundation upon which everything else is built—context, interpretation, comment, criticism, analysis and debate,” so reliable news sources are essential if citizens are to have a clear understanding of the society in which they live.Committee of Concerned Journalists, “Statement of Shared Purpose,” Pew Project for Excellence in Journalism, http://www.journalism.org/resources/principles. Furthermore, although news organizations have a professional responsibility toward advertisers and shareholders, their commitment is always to citizens first. This means that journalists must report the facts truthfully and without omission, even if they are not in the best interest of advertisers, shareholders, or friends.
Reporting issues fairly requires not only factual accuracy, but also lack of favoritism toward any organization, political group, ideology, or other agenda. The Society of Professional Journalists stipulates that journalists should refuse gifts and favors and avoid political involvement or public office if these things compromise journalistic integrity.Society of Professional Journalists, “SPJ Code of Ethics,” http://www.spj.org/ethicscode.asp. Additionally, journalists should avoid inflating stories for sensation and be as transparent as possible about their sources of information so that the public can investigate the issues further on their own.Committee of Concerned Journalists, “Statement of Shared Purpose,” Pew Project for Excellence in Journalism, http://www.journalism.org/resources/principles.
All sides of an issue should be presented in a news story. Of course, all journalists have a perspective from which they write, but a clear distinction should be made between news reports and editorial content.American Society of News Editors, “ASNE’s Statement of Principles,” August 2009, http://asne.org/article_view/articleid/325/asnes-statement-of-principles.aspx.
Many issues in the news are layered and highly complex. Developing a thorough understanding of issues requires dedication and a sometimes lengthy investigation, and, especially in a world where rapid reporting is the norm, there can be a temptation to gloss over the finer points of an issue for the sake of efficiency. Additionally, most consumers of news, increasingly busy and overwhelmed by the amount of information available, want stories that can be quickly digested and easily comprehended. However, as the Committee of Concerned Journalists points out, the media must balance what readers want with what they need but cannot anticipate.Committee of Concerned Journalists, “Statement of Shared Purpose,” Pew Project for Excellence in Journalism, http://www.journalism.org/resources/principles. Oversimplifying issues, whether for the sake of a quick story or to satisfy public tastes, becomes a violation of the vital information premise.Society of Professional Journalists, “SPJ Code of Ethics,” http://www.spj.org/ethicscode.asp.
When discussing what he considers to be one of the key issues in professional journalism, media ethicist Jeremy Iggers points out that because democracy means the widest possible participation of citizens in public life, diversity in journalism is of fundamental importance.Jeremy Iggers, Good News, Bad News: Journalism Ethics and the Public Interest (Boulder, CO: Westview Press, 1999), 138. Not only should newsroom staff represent a diversity of gender and races, but journalists should also speak for all groups in society—“not just those with attractive demographics,” as the Committee of Concerned Journalists puts it. Journalists should represent the underrepresented because ignoring citizens is a form of disenfranchisement.Committee of Concerned Journalists, “Statement of Shared Purpose,” Pew Project for Excellence in Journalism, http://www.journalism.org/resources/principles.
When the framers of the U.S. Constitution guaranteed freedom of the press, one of the things they had in mind was the ability of the news media to serve as a watchdog over those in positions of power.Committee of Concerned Journalists, “Statement of Shared Purpose,” Pew Project for Excellence in Journalism, http://www.journalism.org/resources/principles. It is the duty of the press to ensure that business is conducted in the open and that government actions are public. One famous example of the media fulfilling its watchdog role was The Washington Post’s investigation of the 1972 Watergate scandal. During Richard Nixon’s presidency, journalists at the Post uncovered information linking government agencies and officials to the break-in at the Democratic National Committee headquarters at the Watergate complex as part of an attempt to sabotage the Democratic campaign and guarantee Nixon’s reelection.Richard M. Flanagan and Louis W. Koenig, “Watergate,” in Dictionary of American History, ed. Stanley I. Kutler, 3rd ed. (New York: Charles Scribner’s Sons, 2003), 8:425. Media coverage of the scandal increased publicity and ultimately put pressure on the government that led to an investigation and the prosecution of many who were involved.“The Government and Watergate,” in American Decades, ed. Judith S. Baughman and others (Detroit: Gale, 2001), vol. 8. In the past decade, the media have uncovered incriminating information about various senators and representatives, resulting in charges of corruption, tax evasion, conspiracy, and fraud.
While CNN and other news networks took some criticism for their delay in reporting Michael Jackson’s death in 2009, others commended the news organizations for waiting for official confirmation. For many journalists and members of the public, ensuring accuracy, even when it means delays, is a hallmark of responsible journalism.
More than 400 journalistic codes of ethics have been produced by various unions and associations worldwide.Aidan White, To Tell You the Truth: The Ethical Journalist Initiative (Brussels: International Federation of Journalists, 2008), iii. While they may differ on specifics, these codes of ethics agree that the news media’s top obligation is to report the truth. When journalists say this, of course, they don’t mean truth in an absolute, philosophical sense; they mean practical truth, the truth that involves reporting the facts as faithfully and accurately as possible. This notion of truth includes an accurate representation of information from reliable sources, but it also includes a complete representation, one that presents multiple perspectives on an issue and does not suppress vital information.
Many codes of ethics stress that the press has a duty to continue its investigation of the facts, even after initially reporting them, and to rectify any inaccuracies that may have occurred in the initial coverage of an issue.Aidan White, To Tell You the Truth: The Ethical Journalist Initiative (Brussels: International Federation of Journalists, 2008), ii; Committee of Concerned Journalists, “Statement of Shared Purpose,” Pew Project for Excellence in Journalism, http://www.journalism.org/resources/principles. One example is The Huffington Post, a news website that, with over 2,000 bloggers, has the world’s most linked-to blog. Blogging is sometimes criticized by more traditional journalists for the tendency, among some blogs, to include biases, unreliable information, and unfounded opinions—in other words, for instances of violating journalistic codes of ethics. However, The Huffington Post requires all of its pass-holding writers to fact-check and to correct any factual errors within 24 hours or lose their privileges.Aidan White, To Tell You the Truth: The Ethical Journalist Initiative (Brussels: International Federation of Journalists, 2008), 76.
Along with an emphasis on the truth, codes of ethics stress loyalty and duty to citizens as a standard of primary importance. Of course, truth telling is an essential component of this loyalty, but additionally, the concern here is in reminding journalists whom their work serves. Especially in the current environment, in which media outlets face increased financial pressure, there is a tension between responsible journalism and the demands for profit. Aidan White notes that corporate and political influences are of increasing concern in this environment, but he reminds journalists that while they have duties to other constituencies, “media products are not just economic.” Journalists must hold the larger public interest above other interests.Aidan White, To Tell You the Truth: The Ethical Journalist Initiative (Brussels: International Federation of Journalists, 2008), 8.
Another challenge often posed by bottom-line concerns and the pressure for a good story is sensitivity toward, and protection of, those involved in the news. Responsible journalists should strive to balance disclosure of the news with a respect for individual privacy. Finding this balance can sometimes be a challenge. On one hand, journalists should never expose private information that could be harmful to individuals for the sake of sensationalizing a story. Issues like family life, sexual behavior, sexual orientation, or medical conditions, for instance, are generally considered tabloid material that would violate the privacy of those involved.
On the other hand, there are times when the private lives of individuals must be made public in the interests of serving the common good. One example was the 2009 media scandal surrounding South Carolina Governor Mark Sanford, who, after media investigations over his weeklong disappearance in June of that year, admitted to flying to Argentina to visit his mistress. After it was revealed that Sanford had used public funds for his private travel, he resigned from his office as the chairman of the Republican Governors’ Association.Associated Press, “Sanford Took Personal Trips on Plane,” CBS News, August 9, 2009, http://www.cbsnews.com/stories/2009/08/09/politics/main5228211.shtml. Although the publicity surrounding this private matter was clearly painful for the governor and his family, releasing information about the incident, particularly regarding the misuse of public funds, was in the best interest of the citizens. The International Federation of Journalists offers three factors as a rough guideline in cases where privacy is in danger of being violated: the nature of the individual’s place in society, the individual’s reputation, and his or her place in public life. Politicians, judges, and others in elected office often must forgo their expectations of privacy for reasons of democracy and accountability—the public’s right to know if their elected officials are engaged in unethical or criminal conduct generally trumps an individual’s right to privacy.Aidan White, To Tell You the Truth: The Ethical Journalist Initiative (Brussels: International Federation of Journalists, 2008), 136. These factors, in recent years, have resulted in public scandals surrounding other political leaders—Representative Anthony Weiner, former Governor Arnold Schwarzenegger, and presidential hopeful John Edwards, to name a few.
Figure 14.5

As shown in the scandal surrounding former South Carolina Governor Mark Sanford, drawing the line between exploiting individuals’ private lives to sell stories and disclosing information in the public interest is not always clear.
Because the press has a duty to serve the best interests of the citizens in a democracy, it is important that journalists act independently and that they remain neutral in their presentation of information. Objectivity was once the common term used to support this notion. More recently, however, there has been wider acceptance of the fact that reporting always occurs through a lens of personal experience, culture, beliefs, and background that ultimately all influence the way any individual subjectively perceives a situation.Howard A. Myrick, “The Search for Objectivity in Journalism,” USA Today (Society for the Advancement of Education), November 2002, http://findarticles.com/p/articles/mi_m1272/is_2690_131/ai_94384327/?tag=content;col1. If this were not the case—if there were only one standard way everyone perceived, investigated, and reported on a story—what would be the value of including racial and gender diversity in the newsroom? Nevertheless, responsible journalism requires journalists to avoid favoritism and to present news that is fair and offers a complete picture of the issue.
The principle of journalistic independence is an important component of the news media’s watchdog role. Journalists should avoid conflicts of interest—financial, political, or otherwise—and, when conflicts of interest are unavoidable, it is a journalist’s ethical responsibility to disclose those.Society of Professional Journalists, “SPJ Code of Ethics,” http://www.spj.org/ethicscode.asp. One example involving conflict of interest centers on recent talk of government bailouts for the news media, similar to the bailouts for the auto and banking industries. However, many journalists are concerned that government support of this kind would present a conflict of interest and interfere with the media’s watchdog role.David Nicklaus, “Bailing Out Journalism Would Threaten Its Independence,” St. Louis Post-Dispatch, June 8, 2010, http://more.stltoday.com/stltoday/business/columnists.nsf/davidnicklaus/story/7db2f5de844ed63f8625773c000da74b?OpenDocument.
In addition to maintaining independence, the news media should allow for commentary and opposition. Leaving space for citizens to voice concerns about journalistic conduct is an important part of serving the public interest and keeping the public’s trust.
While principles of ethical journalism require journalists to remain neutral in their reporting, there is, as previously mentioned, always some degree of bias in any news report, owing to the personal perspective each journalist naturally brings to his or her work. A 2005 in-depth study by political scientists at UCLA found that, of 20 media outlets, 18 had a perspective in their news reporting that was left of the national average. Of those 20, only Fox News and The Washington Times scored to the right of the average U.S. voter.Meg Sullivan, “Media Bias is Real, Finds UCLA Political Scientist,” news release, UCLA, December 14, 2005, http://newsroom.ucla.edu/portal/ucla/Media-Bias-Is-Real-Finds-UCLA-6664.aspx.
What, exactly, does political bias in the media look like? In the UCLA study, news sources were scored based on their sources of information and expert opinion. The news outlets with the most liberal slant—CBS News and The New York Times—cited liberal think tanks and policy groups with a much greater frequency than they cited conservative ones.Tim Groseclose and Jeffrey Milyo, “A Measure of Media Bias,” Quarterly Journal of Economics 120, no. 4 (2005), http://www.sscnet.ucla.edu/polisci/faculty/groseclose/pdfs/MediaBias.pdf. Political bias can also be observed by examining which stories a network or newspaper chooses to report. According to media analyst Seth Ackerman, the right-leaning Fox News network reports news stories that favor the Republican Party or show the Democratic Party in a negative light. Additionally, Fox’s panels of pundits who offer commentary after the news tend to be politically conservative or moderate far more often than liberal.Seth Ackerman, “The Most Biased Name in the News,” FAIR: Fairness and Accuracy in Reporting, July/August 2001, http://www.fair.org/index.php?page=1067.
Figure 14.6

Some argue that there is a politically left bias in the news media.
Such biases in news media may have an effect on public opinion. However, while the picture a journalist or particular news outlet creates may not be entirely objective, journalists with integrity will strive to be fair and comprehensive, offering opposing views and citing their sources of information. Members of the public should remember that they also have a responsibility to be active, rather than passive, consumers of information. Good media consumers use critical analysis skills while reading news reports. If a story is presented conscientiously in the news, a reader or viewer will have the resources he or she needs to research an issue further and draw his or her own conclusions. As you continue reading the chapter, keep in mind the ethical obligations of those who work in mass media and the potential consequences of their failure to uphold them.
The Internet has brought about profound and rapid changes in the structuring, delivery, and economics of news media.
Conduct your own survey of political bias in the news. Choose either a television network or newspaper known for more liberal tendencies, such as CNN or The New York Times, and a network or newspaper known for more conservative reporting, such as Fox News or The Washington Times. Examine both sources’ coverage of the same news story (not a column or editorial). Then answer the following short-answer questions. Each response should be one to two paragraphs.
Online media has developed rapidly, with technology advancing at a rate that often surpasses the ability of legislation and policy to keep up with it. As a result, issues like individuals’ rights to privacy, copyright protections, and fair use restrictions have become the subject of numerous court cases and public debates as lawmakers, judges, and civil liberties organizations struggle to define the limits of technology and the access it provides to previously restricted information. In the following section you will look at some of the most prominent issues in today’s online media environment. We have already considered some of the legal questions surrounding these issues. Here, you should reflect on the ethical issues in mass media raised in the two preceding sections and how they are manifested in the areas of personal privacy, copyright law, and plagiarism.
Concerns about online privacy issues in recent years have led some people to wonder whether the collection of personal information on websites has begun to infringe on individuals’ constitutional rights. While the U.S. Constitution does not explicitly guarantee a general right to privacy, the Bill of Rights establishes privacy of beliefs, privacy of the home, and privacy of person and possessions from unreasonable searches. Additionally, in a number of court cases, the “right to liberty” clause has also been read as a guarantee of personal privacy.Doug Linder, “The Right of Privacy,” Exploring Constitutional Law, 2010, http://www.law.umkc.edu/faculty/projects/ftrials/conlaw/rightofprivacy.html. What do these constitutional rights mean when it comes to storing a person’s credit card data online, or tracking his or her Internet searches, or using cookies to collect information about his or her purchasing habits? Because online media is developing so rapidly, many of these issues have not been settled by federal legislation and remain the source of numerous courtroom battles. Consider the 2010 case in which the online services company Yahoo! entered into a legal struggle with government officials who wanted to search the email account of a Yahoo! user for incriminating evidence. While Yahoo! claimed the government would need a search warrant to access a user’s email, the government investigators claimed the Fourth Amendment does not apply in the case of an email account.Electronic Frontier Foundation, “EFF Backs Yahoo! to Protect User from Warrantless Email Search,” news release, April 14, 2010, http://www.eff.org/press/archives/2010/04/13. Many college students reveal much about themselves on Facebook and are sometimes chagrined to learn that prospective employers may view this information.
In defense of information collection and surveillance, many websites argue that, by using their services, individuals are agreeing to make their personal information available. However, many people don’t realize the extent of surveillance capabilities or know how to protect certain personal information while using online tools. The more people rely on the Internet for shopping, communication, social networking, and media consumption, the more their personal data is stored online. Every time a person subscribes to a magazine, joins an organization, donates money to charity, gives to a political cause, or searches the pages of a government agency, that information is stored in a computer.Privacy Rights Clearinghouse, “Privacy Today: A Review of Current Issues,” March 2010, http://www.privacyrights.org/ar/Privacy-IssuesList.htm#publicrecords. For example, cookies, text files that web page servers embed in users’ hard drives, help search engines like Google and Yahoo! track their customers’ search histories, buying habits, and browsing patterns. Cookies stored by Google last for 30 years.Maria Godoy, “Google Records Subpoena Raises Privacy Fears,” NPR, January 20, 2006, http://www.npr.org/templates/story/story.php?storyId=5165854. These search engine cookies are used to customize users’ searches and to deliver customized third-party ads based on a particular user’s demographics and behavior. However, privacy advocates claim this practice fosters predatory advertising.Tom Spring, “Good-Bye to Privacy?” PC World, May 23, 2010, http://www.pcworld.com/article/196787/goodbye_to_privacy.html.
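As a concrete illustration of the mechanism just described, the sketch below uses Python’s standard http.cookies module to build the kind of long-lived tracking cookie discussed above. The cookie name "uid", its value, and the 30-year lifetime are hypothetical examples, not the identifiers any real search engine actually uses.

```python
# Illustrative sketch of a long-lived tracking cookie.
# The name "uid" and the 30-year lifetime are hypothetical.
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["uid"] = "a1b2c3d4"                        # opaque ID for this browser
cookie["uid"]["max-age"] = 30 * 365 * 24 * 3600   # roughly 30 years, in seconds
cookie["uid"]["path"] = "/"

# The server sends this header once; the browser then attaches
# "Cookie: uid=a1b2c3d4" to every later request to the site, which is
# what lets searches and purchases be linked to the same visitor.
header = cookie.output(header="Set-Cookie:")
print(header)
```

Once the browser stores this cookie, every subsequent request to the same site carries the identifier back, allowing the server to accumulate a history of that visitor’s behavior.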
Additionally, considering that search engines receive multiple requests per day for specific information on their users (requests that are often tied to criminal investigations and lawsuits), there is a growing concern that unfair or even erroneous profiling may result.Maria Godoy, “Google Records Subpoena Raises Privacy Fears,” NPR, January 20, 2006, http://www.npr.org/templates/story/story.php?storyId=5165854. Much of this information is stored without users’ knowledge or informed consent—although agreements for most software inform users when their data is being collected, few people have the patience or time to read and understand the dense legalistic language of these agreements. And even when users invest the time and effort to understand the agreements, they are left with the difficult choice of either agreeing to have their data recorded or going without useful software.
Internet users concerned about their privacy may also be unaware of another growing trend: the combination of online data with offline information to build profiles of web surfers. Data providers like BlueKai, Datalogic, and Nielsen are now able to pool offline data and demographics to create “digital dossiers” (detailed digital records of a particular subject or market) for online advertisers who want to reach a target market.Tom Spring, “Good-Bye to Privacy?” PC World, May 23, 2010, http://www.pcworld.com/article/196787/goodbye_to_privacy.html. This combination of online and offline information provides a nearly complete picture of someone’s life. If advertisers are looking for a 56-year-old retired female educator who is divorced, owns a home and a dog, suffers from arthritis, and plays tennis at the local fitness club, they can now find her. While advertisers have been careful to point out that people are identified by demographic subgroup but never by name, many organizations that advocate for privacy, such as the Electronic Frontier Foundation, believe that protections and greater transparency should be enforced.Tom Spring, “Good-Bye to Privacy?” PC World, May 23, 2010, http://www.pcworld.com/article/196787/goodbye_to_privacy.html.
Figure 14.7

Online ads like these target users based on pools of very specific information.
Users also supply a wide range of information about themselves through online social networks that is connected with their names, contact information, and photographs. Creditors now look at individuals’ social networks to determine whether they would be good credit customers, and banks may access social network information to make loan decisions.Ginny Mies, “Skeptical Shopper: Can Your Online Life Ruin Your Credit?” PC World, March 23, 2010, http://www.pcworld.com/article/192207/skeptical_shopper_can_your_online_life_ruin_your_credit.html. If users aren’t careful about their privacy settings on MySpace, Twitter, or Facebook, photographs and other private information may be easily accessible to anyone performing a Google search. Of even greater concern is the growing trend to publicize information that was previously private as the networking sites evolve and change their interfaces.
Surveillance can range from the monitoring of online activity by employers and other institutions that want to make sure users are following guidelines, to high-level government investigations of terrorist activity. The USA PATRIOT Act, passed just six weeks after the September 11, 2001, terrorist attacks, expanded the federal government’s rights to access citizens’ personal information. Under the Patriot Act, authorities have access to personal records held by Internet service providers and other third parties, and government officials can tap into an individual’s email communications and web searches if he or she is suspected of terrorist activity or of having connections to terrorist activity.American Civil Liberties Union, “Surveillance Under the USA PATRIOT Act,” April 3, 2003, http://www.aclu.org/national-security/surveillance-under-usa-patriot-act; Stefanie Olsen, “Patriot Act Draws Privacy Concerns,” CNET, October 26, 2001, http://news.cnet.com/2100-1023-275026.html. One concern among civil liberties organizations is that the Patriot Act might become a back door for the government to conduct undisclosed surveillance that doesn’t necessarily involve the threat of terrorism. For instance, under the Patriot Act the government can wiretap Internet communications even if the primary purpose is a criminal investigation, as long as intelligence gathering is a significant purpose of the investigation.Berkman Center for Internet and Society, “The USA Patriot Act, Foreign Intelligence Surveillance, and Cyberspace Privacy,” Harvard Law School, http://cyber.law.harvard.edu/privacy/module5.html.
Now that a large amount of research can easily be conducted online, and content can be copied and pasted from one platform to another with no more than the click of a button, concerns about plagiarism and copyright infringement are more relevant than ever. The concepts of copyright infringement and plagiarism can easily be confused with each other. The following provides an overview of copyright, its issues and limitations, and its distinction from plagiarism.
Copyright is a form of protection provided by U.S. law, under which the creator of an original artistic or intellectual work is automatically granted certain rights, including the right to distribute, copy, and modify the work.U.S. Copyright Office, “Copyright Basics,” http://www.copyright.gov/. If someone rents a movie from Netflix, for example, and watches it with his friends, he hasn’t violated any copyright laws because Netflix has paid for a license to loan the movie to its customers. However, if the same person rents a movie and burns himself a copy to watch later, he has violated copyright law because he has neither paid for nor obtained the film creators’ permission to copy the movie. Copyright law applies to most books, songs, movies, art, essays, and other pieces of creative work. However, after a certain length of time (70 to 120 years, depending on the publication circumstances), creative and intellectual works enter the public domain; that is, they are free to be used and copied without permission.
In 2002, Google began scanning millions of books in academic libraries to make them available online in digital format. Of the more than 12 million books Google has digitized since then—and made searchable through Google Book Search—2 million are in the public domain. Those 2 million books are available in “full view” and free for users to download, while books still under copyright are available as limited previews, where users can access about 20 percent of the texts. According to Google, the project will pave the way for greater democratization of knowledge, making texts available to readers who formerly wouldn’t have had access to them. However, many authors, publishers, and legal authorities claim the project represents a massive copyright violation. In 2005, the Authors Guild and the Association of American Publishers filed class-action lawsuits against Google.Annalee Newitz, “5 Ways the Google Books Settlement Will Change the Future of Reading,” io9 (blog), April 2, 2010, http://io9.com/5501426/5-ways-the-google-book-settlement-will-change-the-future-of-reading.
William Cavanaugh, a lawyer with the U.S. Department of Justice, claims that the Google Books Settlement, an agreement partially reached in 2008, “turns copyright law on its head.” According to the settlement agreement, in exchange for $125 million, part of which would go to authors and publishers, Google was released from liability for copying the books and was granted the right to charge money for individual and institutional subscriptions to its Google Books service (which gives subscribers full access to the copied books—even those under copyright). Authors have the choice to opt out of the agreement, asking to have their books removed from Google’s servers. However, more than 30,000 publishers have already made deals with Google, which override the authors’ rights to opt out.Norman Oder, “Google Settlement Fairness Hearing, Part Two: DOJ Expresses Opposition; Parties Mount Vigorous Defense,” Library Journal, February 18, 2010, http://www.libraryjournal.com/article/CA6719808.html.
Some works are in the public domain because the creator has chosen to make them available to anyone without requiring permission. However, most works are in the public domain because their copyright has expired; in the United States, anything published before 1923 is automatically in the public domain. Additionally, there have been changes to U.S. copyright law over the years that caused some works to enter the public domain earlier. Before 1964, for instance, any published work had to have its copyright renewed during the 28th year after its publication. If no renewal was filed, the copyright was lost. Figure 14.8 shows significant changes to U.S. copyright law since 1790.Nolo Press, “Chapter 8: The Public Domain,” Stanford University Libraries and Academic Information Resources, 2007, http://fairuse.stanford.edu/Copyright_and_Fair_Use_Overview/chapter8/index.html.
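The two dating rules just described can be summarized in a short sketch. This is a deliberate simplification for illustration only: actual copyright terms depend on many additional factors (notice, renewal records, authorship, country of publication), and the function below is our own, not a legal test.

```python
# Simplified sketch of the two U.S. public-domain rules described above.
# NOT legal advice: real copyright terms depend on many more factors.
def in_public_domain(pub_year, renewed):
    """Rough U.S. public-domain status for a work published in pub_year."""
    if pub_year < 1923:
        return True      # published before 1923: automatically public domain
    if pub_year < 1964 and not renewed:
        return True      # 1923-1963: copyright lost if never renewed
    return False         # otherwise, assume the work is still protected

print(in_public_domain(1920, renewed=False))  # True: pre-1923
print(in_public_domain(1950, renewed=True))   # False: renewed, still protected
```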
Figure 14.8

Changes to U.S. Copyright Law
While it is illegal to violate the rights granted by copyright law, the copyright holder’s rights are not unlimited. One of the significant limitations is the policy of “fair use,” under which the public is entitled to freely use copyrighted information for purposes such as criticism, commentary, news reporting, teaching, scholarship, research, or parody.U.S. Copyright Office, “Fair Use,” May 2009, http://www.copyright.gov/fls/fl102.html. If a critic were writing a book review for a magazine, for instance, according to fair use, she would be allowed to summarize and quote from the book she wanted to review, whether or not the author of the book agreed to this use. According to the U.S. government, there are four issues to consider when determining fair use:

1. the purpose and character of the use, including whether it is of a commercial nature or is for nonprofit educational purposes;
2. the nature of the copyrighted work;
3. the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
4. the effect of the use upon the potential market for, or value of, the copyrighted work.
The distinction between what is considered fair use and what constitutes copyright infringement is not always clearly defined. For one thing, there are no guidelines that specify a number of words, lines, or notes that can be taken without permission. Table 14.1 provides some examples of distinctions between fair use and copyright infringement.
Table 14.1 Cases Involving Fair Use
| Fair Use | Not Fair Use |
|---|---|
| Wright v. Warner Books Inc. (1991): In a biography of Richard Wright, the biographer quoted from 6 of Wright’s unpublished letters and 10 unpublished journal entries. CONSIDERATIONS: The copied letters amounted to less than 1 percent of Wright’s total letter material. Additionally, the biographer’s purpose in copying the documents was informational. | Castle Rock Entertainment Inc. v. Carol Publishing Group (1998): Carol Publishing published a book of trivia questions about the television series Seinfeld. The book included direct quotes from the show and based its questions on characters and events in the series. CONSIDERATIONS: The book infringed on the ability of Castle Rock (the copyright holder) to make its own trivia books. |
| Perfect 10 Inc. v. Amazon.com Inc. (2007): A Google search engine displayed thumbnail-sized photos of nude models from a subscription-only website. CONSIDERATIONS: The search engine’s use of the photos transformed them into “pointers,” directing users to the photos’ original source. The transformative use was more important than any factors that would allow Google to make money from displaying the images. | Los Angeles News Service v. KCAL-TV Channel 9 (1997): A television news station used a 30-second segment of a four-minute video that depicted the beating of a Los Angeles man. The video was copyrighted by the Los Angeles News Service. CONSIDERATIONS: The segment used by the news station was a significant portion of the total video. Additionally, the use was for commercial reasons and infringed on the Los Angeles News Service’s ability to market the video. |
Source: Stanford University Libraries. “Copyright & Fair Use.” http://fairuse.stanford.edu/Copyright_and_Fair_Use_Overview/chapter9/9-c.html
Sometimes plagiarism becomes confused with copyright violation. However, the two terms are not synonymous; while there can be some overlap between them, not every instance of plagiarism involves copyright violation, and not every instance of copyright violation is an act of plagiarism. For one thing, while copyright violation can involve a wide range of acts, plagiarism is defined more narrowly as using someone else’s information, writing, or speech without properly documenting or citing the source. In other words, plagiarism involves representing another person’s work as one’s own. This can happen in any field—for example, in the music industry. In 1990, Vanilla Ice sampled the bass riff from Queen and David Bowie’s “Under Pressure” for his hit song “Ice Ice Baby” without licensing or crediting their work. Hip hop music, which has a long tradition of “sampling” music, riffs, lyrics, and more from other songs, often raises questions of copyright infringement.Copyright Website, “David Bowie, Queen and Vanilla Ice,” http://www.benedict.com/Audio/Vanilla/Vanilla.aspx.
As the U.S. Copyright Office points out, it is possible to properly cite a copyrighted source of information yet still reproduce that information without obtaining permission.U.S. Copyright Office, “Fair Use,” May 2009, http://www.copyright.gov/fls/fl102.html. In such a case, the user has violated copyright law even though she has not plagiarized the material. Similarly, a student writing a paper could copy sections of a document that is in the public domain without properly citing his sources, in which case he would not have broken any copyright laws. However, representing the information as his own work would be an instance of plagiarism.
Plagiarism, a perennially serious problem at academic institutions, has recently become even more prevalent. The ease of copying and pasting online content into a word-processing document can make it highly tempting for students to plagiarize material for research projects and critical papers. Additionally, a number of online “paper mills” contain archives where students can download papers for free or, in some cases, purchase them.Andy Denhart, “The Web’s Plagiarism Police,” Salon, June 14, 1999, http://www.salon.com/technology/feature/1999/06/14/plagiarism. Sloppy work habits can lead students to inadvertently plagiarize. In 2003, The New York Times surveyed students at 23 college campuses and reported that 38 percent of students admitted to having committed copy-and-paste plagiarism within the previous year.Michelle De Leon, “Internet Plagiarism on the Rise in Colleges,” Lehigh University Brown and White, November 12, 2007, http://media.www.thebrownandwhite.com/media/storage/paper1233/news/2007/11/12/News/Internet.Plagiarism.On.The.Rise.In.Colleges-3094622.shtml.
To combat the rise in plagiarism, many schools and universities now subscribe to services that allow instructors to check students’ work for plagiarized material. Plagiarism.org, for instance, offers an analytics tool that compares student writing against a database that includes work from online paper mills, academic databases, documents available through major search engines, and other student papers submitted to Plagiarism.org.Andy Denhart, “The Web’s Plagiarism Police,” Salon, June 14, 1999, http://www.salon.com/technology/feature/1999/06/14/plagiarism. According to many researchers, part of the issue may be that students don’t understand what constitutes plagiarism. Some students, for instance, claim they think information available online is in the public domain.Nicole J. Auer and Ellen M. Krupar, “Mouse Click Plagiarism: The Role of Technology in Plagiarism and the Librarian’s Role in Combating It,” Library Trends, Winter 2001, http://findarticles.com/p/articles/mi_m1387/is_3_49/ai_75278304/. Figure 14.12 offers suggestions for ways to avoid plagiarism in your own work.
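Services like the one described above match submissions against large databases of existing text. The toy sketch below illustrates the general idea by scoring the overlap of word 5-grams between a submission and one source document using Jaccard similarity. It is our own illustration of the technique, not how Plagiarism.org’s actual tool works.

```python
# Toy plagiarism detector: Jaccard similarity over word 5-grams.
# Illustrative only; real services compare against huge databases.
def ngrams(text, n=5):
    """Set of all n-word sequences in the text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(submission, source, n=5):
    """Fraction of shared n-grams; a score near 1.0 suggests heavy copying."""
    a, b = ngrams(submission, n), ngrams(source, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)   # Jaccard similarity in [0, 1]

source = "the quick brown fox jumps over the lazy dog near the river bank"
copied = "the quick brown fox jumps over the lazy dog near the old mill"
print(round(overlap(copied, source), 2))  # 0.64
```

In practice, an instructor-facing tool would run this comparison against every indexed document and flag submissions whose best score exceeds a threshold.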
While plagiarism is an issue of particular concern in academia, it occurs in professional media as well. Writers, whether through carelessness or laziness, may lift content from existing materials without properly citing or reinterpreting them. In an academic setting, plagiarism may lead to consequences as severe as failure or even expulsion from an institution. However, outside of academia the consequences may be even more damaging. Writers have lost publishing contracts, permanently damaged their reputations, and even ruined their careers over instances of plagiarism. For example, the late George Harrison, of the Beatles, was successfully sued by Bright Tunes Music Corp. for copyright infringement of Ronald Mack’s song “He’s So Fine.”Bright Tunes Music v. Harrisongs Music, 420 F. Supp. 177 (S.D.N.Y. 1976). The court determined that Harrison had subconsciously plagiarized the musical essence of Mack’s song in his composition “My Sweet Lord.”
You should now have an understanding of the key issues in media ethics, particularly as they relate to privacy rights, plagiarism, and copyright law. Make sure you understand the following key concepts.
Concerns about the public’s right to privacy have increased in recent years, as more personal information has become available online.
Rules that distinguish copyright violation from fair use are not always entirely clear and have been the subject of debate now that a greater amount of copyrighted work is easily accessible via the Internet.
You will now examine several cases in detail to further explore your understanding of the concepts and key ideas covered in this chapter. Respond to the questions asked, and provide evidence or examples to defend and support your answer. Each response should be one or two paragraphs.
Case 1. Research the USA PATRIOT Act. You can read what the American Civil Liberties Union (ACLU) has to say about the act here: http://www.aclu.org/pdfs/safefree/patriot_report_20090310.pdf. You can read what the federal government has to say about the act here: http://www.justice.gov/archive/ll/highlights.htm
Case 2. Consider the following case and decide whether you believe it represents an instance of fair use or whether the action was a copyright violation. Defend your response.
After the publication of author J. K. Rowling’s popular Harry Potter novels, one fan created an elaborate website for Harry Potter enthusiasts. The website includes an encyclopedia of information about the books; indexed lists of people, places, and things; fan art; discussion forums; essays; timelines; and other features. Much of the content of the website’s encyclopedia entries comes directly from the books. Use of the website is free and unrestricted, and while the site includes some ads, the income they generate only goes to offset the site’s operating costs.
Review Questions
Write a detailed response to the questions below, defending your response with examples where appropriate. Each response should be one to two paragraphs:
Answer the following critical thinking questions. Your responses should be one to two pages for each prompt.
Political Blogger
Research what it takes to be a professional political blogger for a news site like CNN.com or The Huffington Post. Then answer the following short-answer questions. Each response should be one to two paragraphs.
Figure 15.1
In May 2010, the social networking website Facebook was thrown into the news when its chief executive officer, Mark Zuckerberg, announced new changes to the site’s privacy policy. Although the announcement alone did not necessarily garner heavy attention from the news media, the involvement of the Federal Trade Commission (FTC) ramped up public interest.
The previous month, several watchdog groups had sent letters to Congress and the FTC asking for an investigation of Facebook’s privacy policy. The letters attacked the site’s privacy policies, which dated from December 2009 and had been designed to provide users more control over privacy settings. However, PC Magazine noted, “given Facebook’s move toward a more open format as it integrates status updates with search engines like Google and Bing, the site encouraged its users to make more of their data public, and made some of the default settings more open.”Chloe Albanesius, “Facebook Prepping Changes to Privacy Policy,” PC Magazine, May 21, 2010, http://www.pcmag.com/article2/0,2817,2364063,00.asp.
Essentially, Facebook provides three default options for sharing information: with “everyone,” “friends of friends,” or “friends only.” Zuckerberg explained the privacy policy by saying:
We recommended that there be large pieces of information in each of these buckets. For friends only, that’s all of the really sensitive stuff. For friends of friends, it could be who can see the photos and videos of you, which is actually the majority of the content people share on the site. And then for everyone, it’s basic information and status updates and posts like that.Dan Fletcher, “Time’s Q&A With Facebook CEO Mark Zuckerberg,” Time NewsFeed (blog), Time, May 27, 2010, http://newsfeed.time.com/2010/05/27/times-qa-with-facebook-ceo-mark-zuckerberg/
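Zuckerberg’s three “buckets” amount to a mapping from categories of information to default audiences. The sketch below models that idea in a few lines; the category names and the visibility check are hypothetical illustrations drawn from the quote above, not Facebook’s actual data model or default settings.

```python
# Toy model of the three default sharing "buckets" described in the quote.
# Category names and defaults are illustrative, not Facebook's real schema.
DEFAULTS = {
    "sensitive info": "friends only",               # "the really sensitive stuff"
    "photos and videos": "friends of friends",
    "basic info and status updates": "everyone",
}

# Audience levels, from closest relationship to most public.
LEVELS = ["friends only", "friends of friends", "everyone"]

def visible_to(category, relationship):
    """Can a viewer with this relationship see this category by default?"""
    allowed = LEVELS.index(DEFAULTS[category])
    viewer = LEVELS.index(relationship)
    return viewer <= allowed   # closer relationships see more content

print(visible_to("photos and videos", "everyone"))            # False
print(visible_to("basic info and status updates", "everyone"))  # True
```

The privacy dispute described in this chapter was, in these terms, about which categories landed in the "everyone" bucket by default.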
Concern grew that some of Facebook’s default privacy settings allowed everyone, regardless of their level of connection to a user, to access some personal information. In their open letter to Congress, privacy watchdog groups addressed these concerns by stating, “Facebook continues to manipulate the privacy settings of users and its own policy so that it can take personal information provided by users for a limited purpose and make it widely available for commercial purposes…. The company has done this repeatedly and users are becoming increasingly angry and frustrated.”Mark Hachman, “Facebook Targeted by New FTC Privacy Complaint,” PC Magazine, May 7, 2010, http://www.pcmag.com/article2/0,2817,2363518,00.asp. In light of users’ outrage, the letter asked the FTC to get involved.
The FTC is a congressional commission designed to oversee and enforce consumer protections. Despite—or perhaps because of—this stated goal, the FTC’s lack of involvement in Facebook’s privacy settings frustrated many individuals; one letter to Congress “openly worried that the FTC either lacked the power or the motivation to pursue questions of privacy at Facebook.”Mark Hachman, “Facebook Targeted by New FTC Privacy Complaint,” PC Magazine, May 7, 2010, http://www.pcmag.com/article2/0,2817,2363518,00.asp. The FTC responded that the issue was of “particular interest” to them, but as of this writing, no official action has been taken.
The issue has prompted a broader discussion of the government’s role in regulating information disseminated on the Internet. The New York Times articulated the discussion’s central questions: “What can government do to ensure that users have control of their own information, which might live on indefinitely on the web? Would regulation work? Or should government stay out of this arena?”New York Times, “Should Government Take On Facebook?” Room for Debate (blog), May 25, 2010, http://roomfordebate.blogs.nytimes.com/2010/05/25/should-government-take-on-facebook/. Facebook stands by the rights of its users, arguing that “adult users should be free to publish information about their lives if they choose to do so.”New York Times, “Should Government Take On Facebook?” Room for Debate (blog), May 25, 2010, http://roomfordebate.blogs.nytimes.com/2010/05/25/should-government-take-on-facebook/. However, Facebook did respond to the open letter and modified its privacy settings to make it easier for individuals to control their online identities. Yet the debate continues over online privacy and the government’s role in maintaining this privacy. The recent buzz over Facebook’s privacy policies is just one of many examples of the debate over government’s place in the world of media. How is copyright protected across different media outlets? What material is considered appropriate for broadcast? Does the U.S. government have the right to censor information? This chapter explores these and other questions regarding the long and complex relationship between media and the government.
The U.S. federal government has long had its hand in media regulation. Media in all their forms have been under governmental jurisdiction since the early 1900s. Since that time, regulatory efforts have transformed as new forms of media have emerged and expanded their markets to larger audiences.
Throughout the 20th century, three important U.S. regulatory agencies appeared. Under the auspices of the federal government, these agencies—the FTC, the Federal Radio Commission (FRC), and the FCC—have shaped American media and their interactions with both the government and audiences.
The first stirrings of the FTC date from 1903, when President Theodore Roosevelt created the Bureau of Corporations to investigate the practices of increasingly larger American businesses. In time, authorities determined that an agency with more sweeping powers was necessary. The FTC (the government agency charged with overseeing interstate business and trade practices in the United States) came into being on September 26, 1914, when President Woodrow Wilson signed the FTC Act into law, creating an agency designed to “prevent unfair methods of competition in commerce.”Federal Trade Commission, “About the Federal Trade Commission,” http://ftc.gov/ftc/about.shtm. From the beginning, the FTC absorbed the work and staff of the Bureau of Corporations, operating in a similar manner but with additional regulatory authorization. In the words of the FTC,
Like the Bureau of Corporations, the FTC could conduct investigations, gather information, and publish reports. The early Commission reported on export trade, resale price maintenance, and other general issues, as well as meat packing and other specific industries. Unlike the Bureau, though, the Commission could … challenge “unfair methods of competition” under Section 5 of the FTC Act, and it could enforce … more specific prohibitions against certain price discriminations, vertical arrangements, interlocking directorships, and stock acquisitions.“A Brief History of the Federal Trade Commission,” program notes, Federal Trade Commission 90th Anniversary Symposium, 6.
Although its primary focus was on the prevention of anticompetitive business practices, in its early years, the FTC also provided oversight on wartime economic practices. During World War I, for example, President Wilson frequently turned to the FTC for advice on exports and trading with foreign nations, resulting in the Trading with the Enemy Act, which restricted trade with countries in conflict with the United States.
First established with the passage of the Radio Act of 1927, the FRC was intended to “bring order to the chaotic situation that developed as a result of the breakdown of earlier wireless acts passed during the formative years of wireless radio communication.”Fritz Messere, “The Federal Radio Commission Archives,” http://www.oswego.edu/~messere/FRCpage.html. The FRC comprised five commissioners who were authorized to grant and deny broadcasting licenses and assign frequency ranges and power levels to each radio station.
In its early years, the FRC struggled to find its role and responsibility in regulating the radio airwaves. With no clear rules about what could or could not be aired, nearly anything was allowed on the air. As you learned in , the FRC lasted only until 1934, when it was absorbed into the FCC.
Figure 15.2

President Franklin D. Roosevelt established the Federal Communications Commission in 1934 as part of the New Deal.
Since its creation by the Communications Act in 1934, the FCC (the government agency charged with overseeing interstate communications in the United States) has been “charged with regulating interstate and international communications by radio, television, wire, satellite and cable.”Federal Communications Commission, “About the FCC,” http://www.fcc.gov/aboutus.html. Part of the New Deal—President Franklin D. Roosevelt’s Great Depression–era suite of federal programs and agencies—the commission worked to establish “a rapid, efficient, Nation-wide, and world-wide wire and radio communication service.”Museum of Broadcast Communications, “Federal Communications Commission,” http://www.museum.tv/eotvsection.php?entrycode=federalcommu.
The responsibilities of the FCC are broad, and throughout its long history the agency has enforced several laws that regulate media. A selection of these laws includes the 1941 National TV Ownership Rule, which states that a broadcaster cannot own television stations that reach more than 35 percent of the nation’s homes; the 1970 Radio/TV Cross-Ownership Restriction, which prohibits a broadcaster from owning a radio station and a TV station in the same market; and the 1975 Newspaper/Broadcast Cross-Ownership Prohibition, which discourages ownership of a newspaper and a television station in the same market.“Media Regulation Timeline,” NOW With Bill Moyers, PBS, January 30, 2004, http://www.pbs.org/now/politics/mediatimeline.html. All of these acts and more have undergone changes in the modern media marketplace.
Today, the FCC continues to hold the primary responsibility for regulating media outlets, with the FTC taking on a smaller role. Although each commission holds different roles and duties, the overall purpose of governmental control remains to establish order in the media industry while promoting the public good. This section examines the modern duties of both commissions.
The FCC contains three major divisions: broadcast, telegraph, and telephone. Within these branches, subdivisions allow the agency to more efficiently carry out its tasks. Presently, the FCC houses seven operating bureaus and ten staff offices. Although the bureaus and offices have varying specialties, the bureaus’ general responsibilities include “processing applications for licenses and other filings; analyzing complaints; conducting investigations; developing and implementing regulatory programs; and taking part in hearings.”Federal Communications Commission, “About the FCC,” http://www.fcc.gov/aboutus.html. Four key bureaus are the Media Bureau, the Wireline Competition Bureau, the Wireless Telecommunications Bureau, and the International Bureau.
The Media Bureau oversees licensing and regulation of broadcasting services. Specifically, the Media Bureau “develops, recommends and administers the policy and licensing programs relating to electronic media, including cable television, broadcast television, and radio in the United States and its territories.”Federal Communications Commission, “Media Bureau,” http://www.fcc.gov/mb/. Because it aids the FCC in its decisions to grant or withhold licenses from broadcast stations, the Media Bureau plays a particularly important role within the organization. Such decisions are based on the “commission’s own evaluation of whether the station has served in the public interest,” and come primarily from the Media Bureau’s recommendations.Museum of Broadcast Communications, “Federal Communications Commission.” The Media Bureau has been central to rulings on children’s programming and mandatory closed captioning.
The Wireline Competition Bureau (WCB) is primarily responsible for “rules and policies concerning telephone companies that provide interstate—and, under certain circumstances, intrastate—telecommunications services to the public through the use of wire-based transmission facilities (i.e. corded/cordless telephones).”Federal Communications Commission, “About the FCC,” http://www.fcc.gov/aboutus.html. Despite the increasing market for wireless-based communications in the United States, the WCB maintains its large presence in the FCC by “ensuring choice, opportunity, and fairness in the development of wireline telecommunications services and markets.”Federal Communications Commission, “Wireline Competition Bureau,” http://www.fcc.gov/wcb/. In addition to this primary goal, the bureau’s objectives include “developing deregulatory initiatives; promoting economically efficient investment in wireline telecommunications services; and fostering economic growth.”Federal Communications Commission, “Wireline Competition Bureau,” http://www.fcc.gov/wcb/. The WCB recently ruled against Comcast regarding blocked online content to the public, causing many to question the amount of authority that the government has over the public and big businesses.
Another prominent bureau within the FCC is the Wireless Telecommunications Bureau (WTB). The rough counterpart of the WCB, this bureau oversees mobile phones, pagers, and two-way radios, handling “all FCC domestic wireless telecommunications programs and policies, except those involving public safety, satellite communications or broadcasting, including licensing, enforcement, and regulatory functions.”Federal Communications Commission, “About the WTB,” http://wireless.fcc.gov/index.htm?job=about. The WTB balances the expansion and limitation of wireless networks, registers antenna and broadband use, and manages the radio frequencies for airplane, ship, and land communication. As U.S. wireless communication continues to grow, this bureau seems likely to continue to increase in both scope and importance.
Finally, the International Bureau is responsible for representing the FCC in all satellite and international matters. One of the agency’s larger divisions, the International Bureau works to “connect the globe for the good of consumers through prompt authorizations, innovative spectrum management and responsible global leadership.”Federal Communications Commission, “International Bureau,” http://www.fcc.gov/ib/. In an effort to avoid international interference, the International Bureau coordinates with partners around the globe regarding frequency allocation and orbital assignments. It also concerns itself with foreign investment in the United States, ruling that outside governments, individuals, or corporations cannot own more than 20 percent of stock in a U.S. broadcast, telephone, or radio company.
Although the FCC provides most of the nation’s media regulations, the FTC also has a hand in the media industry. As previously discussed, the FTC primarily dedicates itself to eliminating unfair business practices; however, in the course of those duties it has limited contact with media outlets.
One example of the FTC’s media regulatory responsibility is the National Do Not Call Registry. In 2004, the agency created this registry to prevent most telemarketing phone calls, exempting such groups as nonprofit charities and businesses with which a consumer has an existing relationship. Although originally intended for landline phones, the Do Not Call Registry allows individuals to register wireless telephones along with traditional wire-based numbers.
As discussed in , the federal government has long regulated companies’ business practices. Over the years, several antitrust acts (laws discouraging the formation of monopolies) have been passed into law.
During the 1880s, Standard Oil was the first company to form a trust (a unit of business made up of a board of trustees, formed to monopolize an industry), an “arrangement by which stockholders … transferred their shares to a single set of trustees.”“Sherman Antitrust Act (1890),” http://www.ourdocuments.gov/doc.php?flash=old&doc=51. With corporate trustees receiving profits from the component companies, Standard Oil functioned as a monopoly (a business that economically controls a product or a service). The Sherman Antitrust Act was put into place in 1890 to dissolve trusts such as these. The Act stated that any combination “in the form of trust or otherwise that was in restraint of trade or commerce among the several states, or with foreign nations” was illegal.“Sherman Antitrust Act (1890),” http://www.ourdocuments.gov/doc.php?flash=old&doc=51.
The Sherman Antitrust Act served as a precedent for future antitrust regulation. As discussed in , the 1914 Clayton Antitrust Act and the 1950 Celler-Kefauver Act expanded on the principles laid out in the Sherman Act. The Clayton Act helped establish the foundation for many of today’s business and media competition regulatory practices. Although the Sherman Act established regulations in the United States, the Clayton Act further developed the rules surrounding antitrust, giving businesses a “fair warning” about the dangers of anticompetitive practice.Brian Gongol, “The Clayton Antitrust Act,” February 18, 2005, http://www.gongol.com/research/economics/claytonact/. Specifically, the Clayton Act prohibits actions that may “substantially lessen competition or tend to create a monopoly in any line of commerce.”Brian Gongol, “The Clayton Antitrust Act,” February 18, 2005, http://www.gongol.com/research/economics/claytonact/.
The problem with the Clayton Act was that, while it prohibited mergers, it offered a loophole in that companies were allowed to buy individual assets of competitors (such as stocks or patents), which could still lead to monopolies. Established in 1950 and often referred to as the Antimerger Act, the Celler-Kefauver Act closed that loophole by giving the government the power to stop vertical mergers. (Vertical mergers happen when two companies in the same business but on different levels—such as a tire company and a car company—combine.) The act also banned asset acquisitions that reduced competition.“Celler-Kefauver Antimerger Act,” http://financial-dictionary.thefreedictionary.com/Celler-Kefauver+Antimerger+Act.
These laws reflected growing concerns in the early and mid-20th century that the trend toward monopolization could lead to the extinction of competition, thus leading to less choice and potentially higher prices. Government regulation of businesses increased until the 1980s, when the United States experienced a shift in mind-set and citizens called for less governmental power. The U.S. government responded as deregulation became the norm.
Media deregulation actually began during the 1970s as the FCC shifted its approach to radio and television regulation. Initially intended to clear away rules so that the FCC could run more efficiently and cost-effectively, deregulation truly took off with the arrival of the Reagan administration and its new FCC chairman, Mark Fowler, in 1981. The FCC began overturning existing rules and experienced “an overall reduction in FCC oversight of station and network operations.”Museum of Broadcast Communications, “Deregulation,” http://www.museum.tv/eotvsection.php?entrycode=deregulation. Between 1981 and 1985, lawmakers dramatically altered laws and regulation to give more power to media licensees and to reduce that of the FCC. Television licenses were expanded from three years to five, and corporations were now allowed to own up to 12 separate television stations.
The shift in regulatory control had a powerful effect on the media landscape. Whereas initially laws had prohibited companies from owning media entities in more than one medium, consolidation created large mass-media companies that increasingly dominated the U.S. and global media system. Before the increase in deregulation, eight major companies controlled phone services to different regions of the United States. Today, however, there are four.Gene Kimmelman, “Deregulation of Media: Dangerous to Democracy,” Consumers Union, http://www.consumersunion.org/telecom/kimmel-303.htm. Companies such as Viacom and Disney own television stations, record companies, and magazines. Bertelsmann alone owns more than 30 radio stations, 280 publishing outlets, and 15 record companies.Columbia Journalism Review, “Resources: Who Owns What,” http://www.cjr.org/resources/?c=bertelsmann. Due to this rapid consolidation, Congress grew concerned about the costs of deregulation, and by the late 1980s, it began to slow the FCC’s release of control.
Today, deregulation remains a hotly debated topic. Some favor deregulation, believing that the public benefits from less governmental control. Others, however, argue that excessive consolidation of media ownership threatens the system of checks and balances.Gene Kimmelman, “Deregulation of Media: Dangerous to Democracy,” Consumers Union, http://www.consumersunion.org/telecom/kimmel-303.htm. Proponents on both sides of the argument are equally vocal, and it is likely that regulation of media will ebb and flow over the years, as it has since regulation first came into practice.
Is what you see on the Internet being censored? In , you read about the debate between the search engine Google and China. However, Internet censorship is much more widespread, affecting people from Germany to Thailand to the United States. And now, thanks to a new online service, you can see for yourself.
In September 2010, Google launched its new web tool, Google Transparency. This program allows users to see a map of online censorship around the world. With this tool, people can view the number of times a country requests data to be removed, what kind of data they request be removed, and the percentage of requests that Google complies with. In some cases, the content is minor—YouTube videos that violate copyright, for example, are frequent offenders. In other cases, the requests are more formidable; Iran blocked all of YouTube after the disputed 2009 elections, and Pakistan blocked the site for more than a week in response to a 2010 online protest. Perhaps most surprising is the number of requests from countries not normally associated with strict censorship. Germany, for example, has banned content it deems to be affiliated with neo-Nazism, and Thailand refuses to allow videos of its king that it finds offensive. Between January and June 2010, the United States asked Google 4,287 times for information regarding its users, and sent 128 requests to the search engine to remove data. Eighty percent of the time, Google complied with the requests for data removal.John D. Sutter, “Google: Internet freedom is declining,” CNN, September 21, 2010, http://articles.cnn.com/2010-09-21/tech/google.transparency_1_internet-censorship-google-maps-internet-freedom?_s=PM:TECH.
What is the general trend in Internet censorship? According to Google, it’s becoming more and more commonplace every year. However, the search engine hopes that its new tool will combat this trend. A spokesperson for the company said, “The openness and freedom that have shaped the internet as a powerful tool has come under threats from governments who want to control that technology.” By giving users access to censorship numbers, Google allows them to witness the amount of Internet censorship that they are subject to in their everyday lives. As censorship increases, many predict that citizen outrage will increase as well. The future of Internet censorship may be uncertain, but for now, at least, the numbers are visible to all.John D. Sutter, “Google: Internet freedom is declining,” CNN, September 21, 2010, http://articles.cnn.com/2010-09-21/tech/google.transparency_1_internet-censorship-google-maps-internet-freedom?_s=PM:TECH.
Visit the FCC’s web page (http://www.fcc.gov/) and explore some of the regulations that currently exist. Think about television or radio programs that you watch or listen to. Then write a one-page paper addressing the following:
Media law has been a much-debated topic ever since the first U.S. media industry laws appeared in the early 1900s. The contention surrounding media law largely stems from the liberties guaranteed under the First Amendment of the U.S. Constitution, which includes the freedom of the press.
Generally speaking, media law comprises two areas: telecommunications law, which regulates radio and television broadcasts, and print law, which addresses publications such as books, newspapers, and magazines. Despite differences between the two areas, many media laws involve First Amendment protections. This section explores several areas of media law: privacy, libel and slander, copyright and intellectual property, freedom of information, and equal time and coverage.
Privacy, as you have likely noticed, has been a theme in many chapters. The media and privacy intersect in many arenas, including the legal arena. In 1974, Congress passed the Privacy Act (legislation designed to protect how individuals’ personal information is collected, used, and published), which “protects records that can be retrieved by personal identifiers such as a name, social security number, or other identifying number or symbol.”U.S. Department of Health and Human Services, “The Privacy Act,” http://www.hhs.gov/foia/privacy/index.html. This act also regulates how agencies can collect, store, and use information and requires agencies to tell individuals when they are collecting information about them. Designed to ensure that all First Amendment guarantees remain honored, the act requires all public and private agencies to function within its boundaries.
Under the Privacy Act, media personnel must be careful to avoid revealing certain information about an individual without his or her permission, even if that portrayal is factually accurate. Privacy laws, including the Privacy Act, “limit … your ability to publish private facts about someone and recognize … an individual’s right to stop you from using his or her name, likeness, and other personal attributes for certain exploitative purposes.”Citizen Media Law Project, “Publishing Personal and Private Information,” http://www.citmedialaw.org/legal-guide/publishing-personal-and-private-information. Members of the media can avoid the pitfalls of privacy laws by maintaining a professional relationship with the community upon which they report. To avoid liability, journalists and other media professionals are encouraged to report or comment only on “matters of legitimate public interest and only portray people who have a reasonable relationship to [their] topic.”Citizen Media Law Project, “Publishing Personal and Private Information,” http://www.citmedialaw.org/legal-guide/publishing-personal-and-private-information. In 2005, a legal dispute arose between congressional aides Robert Steinbuch and Jessica Cutler. Steinbuch sued Cutler for publishing information about their intimate relationship; however, the case was dismissed when the court decided that Cutler had only provided facts that were already publically known.Citizen Media Law Project, “Publication of Private Facts,” http://www.citmedialaw.org/legal-guide/publication-private-facts.
Media outlets also must be wary of committing acts of defamation. These occur when false statements that can harm a reputation are printed, broadcast, spoken, or otherwise communicated to others. Two different types of legal protections, libel (defamatory statements or visual depictions in written or permanent form) and slander (defamatory verbal statements) laws, exist to prevent such defamation from taking place and can extend to individuals, groups, and even companies. Although defamation encompasses both categories, they are separate concepts. Libel refers to written statements or printed visual depictions, while slander refers to verbal statements and gestures.Media Law Resource Center, “Frequently Asked Media Law Questions,” http://www.medialaw.org/Content/NavigationMenu/Public_Resources/Libel_FAQs/Libel_FAQs.htm. State jurisdiction largely covers libel and slander laws, but they are nearly identical throughout the United States.
As with privacy laws, print and broadcast journalists can protect themselves from defamation lawsuits by carrying out responsible reporting. Media personnel are legally protected when the public interest in communicating a report outweighs any potential damage to a person’s reputation. However, when journalists do not report responsibly, the legal and financial consequences can be devastating. In the 2007 case Murphy v. Boston Herald, the Boston Herald newspaper was sued for misquoting Massachusetts Superior Court Judge Ernest Murphy. The court ruled that the false quote was published with malicious intent and awarded Murphy $2.1 million in damages.Barbara W. Wall, “News Watch: Boston Newspapers Suffer Setbacks in Libel Cases,” Gannett, http://159.54.227.112/go/newswatch/2005/april/nw0401-4.htm. In the more famous case of Linda Tripp in 1998, Tripp was charged with secretly recording phone conversations between herself and Monica Lewinsky, who had a sexual relationship with President Bill Clinton. Tripp faced a prison sentence of up to 10 years on illegal wiretapping charges; however, the case was dropped in early 2000 due to witness bias.Don Van Natta Jr., “Maryland Is Dropping Wiretap Case against Tripp,” New York Times, http://www.nytimes.com/2000/05/25/us/maryland-is-dropping-wiretap-case-against-tripp.html. More recently, David and Victoria Beckham sued their former nanny for telling the tabloids that their marriage was in trouble. The nanny’s contract contained an agreement not to talk about the celebrity couple’s private lives.CBS News, “Beckham’s Sue Former Nanny,“ February 11, 2009, http://www.cbsnews.com/stories/2005/04/26/entertainment/main691003.shtml
Copyright laws fall under federal jurisdiction and are, therefore, identical across the country. As you learned in , Congress first established U.S. copyright and patent protections in 1790 and, despite revisions and updates, has maintained some form of copyright law to this day. With coverage of a wide range of materials, copyright law encompasses “almost all creative work that can be written down or otherwise captured in a tangible medium.”Citizen Media Law Project, “Copyright,” http://www.citmedialaw.org/legal-guide/copyright. This includes literary works; musical works; dramatic works; pictorial, graphic, and sculptural works; motion pictures and other audiovisual works; sound recordings; and even architectural works. Once a work has achieved copyright, the copyright owner must grant permission for that work to be legally reproduced. After a certain number of years, a copyright expires and the work enters the public domain.
Copyright does not, however, protect facts. This is of particular importance for news media. Despite the length of time it takes to uncover facts, no individual or company can own them. Anyone may repeat facts as long as that person does not copy the written story or broadcast in which those facts were communicated.
Intellectual property law protects “products of the mind,” including copyrights, patents, open licenses, trademarks, trade secrets, URLs, domain names, and even components of television programs (as David Letterman found out when he moved from NBC to CBS, and was forced to leave certain aspects of his television show behind). Intellectual property law generally follows the same guidelines as copyright law, and the associated legislation seeks “to encourage innovation and creativity, with an ultimate aim of promoting a general benefit to society.”Citizen Media Law Project, “Intellectual Property,” http://www.citmedialaw.org/legal-guide/intellectual-property. The role of copyright and intellectual property in the mass media will be covered in greater detail later in this chapter.
In 1966, President Lyndon B. Johnson signed into law the Freedom of Information Act (FOIA), a piece of legislation that permits the public at large, including the news media, to request access to many government documents. By requiring full or partial disclosure of U.S. government information and documents, the act “helps the public keep track of its government’s actions, from the campaign expenditures of city commission candidates to federal agencies’ management of billions of dollars in tax revenues.”Citizen Media Law Project, “Access to Government Records,” http://www.citmedialaw.org/legal-guide/access-government-records. Because it allows everyone access to federal documents and information that otherwise would go unreleased, FOIA is particularly important for those working in the news media.
Although the act covers a large range of agencies, some offices are exempt from FOIA. The act provides access to the public records of the executive branch of the U.S. government but does not include documents from the current president, Congress, or the judicial branch.Citizen Media Law Project, “Access to Records from the Federal Government,” http://www.citmedialaw.org/legal-guide/access-records-from-federal-government. Because FOIA pertains to individuals and information in high levels of government, the process of accessing information can be complicated. Those who are interested must become skilled at navigating the complex set of procedures to offer citizens accurate information. Although FOIA allows any person for any reason access to the records, journalists who work for mainstream media organizations often receive perks such as the waiving of fees and expedited processing.Citizen Media Law Project, “Who Can Request Records Under FOIA,” http://www.citmedialaw.org/legal-guide/who-can-request-records-under-foia.
Falling under broadcast regulations, Section 315 of the Communications Act of 1934—also known as the Equal Time Rule—requires radio and television stations to give equal opportunity for airtime to all candidates. Essentially, Section 315 ensures that television and radio stations cannot favor any one political candidate over another.
Passed by Congress in 1927, the equal opportunity requirement was the first major federal broadcasting law. Legislators feared that broadcasters and stations would be able to manipulate elections by giving one candidate ample airtime. Although candidates cannot receive free airtime unless their opponents do as well, the law doesn’t take into consideration campaign funding. Well-funded candidates who can afford to pay for airtime still have an advantage over their poorly funded peers. Controversies over campaign financing are directly tied to the high cost of political campaign advertising, especially on television.
News programs, interviews, and documentaries are exempt from the requirements of Section 315. This allows media outlets to report on the activities of a candidate without also having to cover the activities of his or her opponent. Presidential debates fall under this exemption as well and are not required to include third-party candidates.
Section 315 also prohibits media from censoring what a candidate says or presents on air. Recently there has been controversy over campaign ads picturing aborted fetuses. Citing Section 315, the FCC allowed these television ads to continue to run.Museum of Broadcast Communications, “Equal Time Rule: U.S. Broadcasting Regulatory Rule,” http://www.museum.tv/eotvsection.php?entrycode=equaltimeru.
As discussed in , the Fairness Doctrine was enacted in 1949, when applications for radio broadcast licenses outpaced the number of available frequencies. At the time, concerns that broadcasters might use their stations to promote a particular perspective encouraged the creation of the radio-specific version of Section 315. The FCC thus instituted the Fairness Doctrine to “ensure that all coverage of controversial issues by a broadcast station be balanced and fair.”Museum of Broadcast Communications, “Fairness Doctrine,” http://www.museum.tv/eotvsection.php?entrycode=fairnessdoct.
The FCC took the view … that station licensees were “public trustees,” and as such had an obligation to afford reasonable opportunity for discussion of contrasting points of view on controversial issues. The commission later held that stations were also obligated to actively seek out issues of importance to their community and air programming that addressed those issues.Museum of Broadcast Communications, “Fairness Doctrine,” http://www.museum.tv/eotvsection.php?entrycode=fairnessdoct.
The Fairness Doctrine was considered controversial among journalists, who felt that it infringed on the rights of free speech and freedom of the press granted in the First Amendment. The doctrine was dissolved during the 1980s amid the Reagan administration’s deregulatory efforts. Its absence is visible today in the popularity of political talk radio: stations no longer have to ensure that all sides of an issue are discussed.
In 1998, Congress passed the Digital Millennium Copyright Act (DMCA) to bring order to the then largely unregulated online arena. As discussed earlier in this book, the DMCA prohibits individuals from circumventing access-control measures or trafficking in devices that help others circumvent copyright protections. Under this act, it is illegal to use code-cracking devices to copy software illegally, and websites are required to take down material that infringes on copyrights. (You’ve experienced this regulation yourself if you’ve ever visited YouTube or Google Video and found that a video has been removed due to copyright claims.)
The DMCA does allow webcasting (the broadcasting of media over the Internet) as long as webcasters pay licensing fees to the companies that own the material. This allows sites such as Hulu to legally stream movies and television shows to viewers. The DMCA also protects institutes of higher education, including distance-learning programs, from certain copyright liabilities.Online Institute for Cyberspace Law and Policy, “The Digital Millennium Copyright Act,” UCLA, http://www.gseis.ucla.edu/iclp/dmca1.htm.
One of the most controversial aspects of the DMCA is that, while it requires websites to remove copyrighted material, it does not require them to monitor their content. A three-year court battle between media giant Viacom and the Google-owned website YouTube was recently waged over this issue. Viacom argued that YouTube infringed on its rights by hosting copyrighted videos. Google responded that while YouTube may include copyrighted material, it is not required to scan every user-uploaded video for copyright infringement. When a claim is brought against a YouTube video, the video is removed; beyond that, the website is not responsible for content. The judge ruled in favor of Google, stating that it was indeed protected under the DMCA. While many saw this as a victory for Internet freedom, others warned that it would have future consequences for the protection of copyright holders.Steve Rosenbaum, “Viacom vs. YouTube: What Was Won. What Was Lost,” Huffington Post, July 9, 2010, http://www.huffingtonpost.com/steve-rosenbaum/viacom-vs-YouTube-what-wa_b_641489.html.
Visit the website of a major media outlet and examine the coverage of a recent local, state, or national election. Compare the coverage of different candidates. Then write answers to the short-response questions below. Each response should be a minimum of one paragraph.
Figure 15.3

Attempts to censor material, such as banning books, typically attract a great deal of controversy and debate.
To fully understand the issues of censorship and freedom of speech (a right granted to U.S. citizens in the First Amendment of the U.S. Constitution, whereby individuals may speak their minds without fear of prosecution) and how they apply to modern media, we must first explore the terms themselves. Censorship is the suppression or removal of anything deemed objectionable. A common, everyday example can be found on the radio or television, where potentially offensive words are “bleeped” out. More controversial is censorship at a political or religious level. If you’ve ever been banned from reading a book in school, or watched a “clean” version of a movie on an airplane, you’ve experienced censorship.
Much as media legislation can be controversial due to First Amendment protections, censorship in and of the media is often hotly debated. The First Amendment states that “Congress shall make no law…abridging the freedom of speech, or of the press.”“First Amendment—Religion and Expression,” http://caselaw.lp.findlaw.com/data/constitution/amendment01/. Under this definition, the term “speech” extends to a broader sense of “expression,” meaning verbal, nonverbal, visual, or symbolic expression. Historically, many individuals have cited the First Amendment when protesting FCC decisions to censor certain media products or programs. However, what many people do not realize is that U.S. law establishes several exceptions to free speech, including defamation, hate speech, breach of the peace, incitement to crime, sedition, and obscenity.
To comply with U.S. law, the FCC prohibits broadcasters from airing obscene programming. The FCC decides whether or not material is obscene by using a three-prong test.
Obscene material has the following characteristics:

1. An average person, applying contemporary community standards, must find that the material, as a whole, appeals to the prurient interest.
2. The material must depict or describe, in a patently offensive way, sexual conduct specifically defined by applicable law.
3. The material, taken as a whole, must lack serious literary, artistic, political, or scientific value.
Material meeting all of these criteria is officially considered obscene and usually applies to hard-core pornography.Federal Communications Commission, “Obscenity, Indecency & Profanity: Frequently Asked Questions,” http://www.fcc.gov/eb/oip/FAQ.html. “Indecent” material, on the other hand, is protected by the First Amendment and cannot be banned entirely.
Indecent material has the following characteristics:

1. It depicts or describes sexual or excretory organs or activities.
2. It is patently offensive as measured by contemporary community standards for the broadcast medium, but it does not rise to the level of obscenity under the three-prong test.
Material deemed indecent cannot be broadcast between the hours of 6 a.m. and 10 p.m., to make it less likely that children will be exposed to it.Federal Communications Commission, “Obscenity, Indecency & Profanity: Frequently Asked Questions,” http://www.fcc.gov/eb/oip/FAQ.html.
These classifications symbolize the media’s long struggle with what is considered appropriate and inappropriate material. Despite the existence of the guidelines, however, the process of categorizing materials is a long and arduous one.
There is a formalized process for deciding what material falls into which category. First, the FCC relies on television audiences to alert the agency of potentially controversial material that may require classification. The commission asks the public to file a complaint via letter, email, fax, telephone, or the agency’s website, including the station, the community, and the date and time of the broadcast. The complaint should “contain enough detail about the material broadcast that the FCC can understand the exact words and language used.”Federal Communications Commission, “Obscenity, Indecency & Profanity: Frequently Asked Questions,” http://www.fcc.gov/eb/oip/FAQ.html. Citizens are also allowed to submit tapes or transcripts of the aired material. Upon receiving a complaint, the FCC logs it in a database, which a staff member then accesses to perform an initial review. If necessary, the agency may contact either the station licensee or the individual who filed the complaint for further information.
Once the FCC has conducted a thorough investigation, it determines a final classification for the material. In the case of profane or indecent material, the agency may take further actions, including possibly fining the network or station.Federal Communications Commission, “Obscenity, Indecency & Profanity: Frequently Asked Questions,” http://www.fcc.gov/eb/oip/FAQ.html. If the material is classified as obscene, the FCC will instead refer the matter to the U.S. Department of Justice, which has the authority to criminally prosecute the media outlet. If convicted in court, violators can be subject to criminal fines and/or imprisonment.Federal Communications Commission, “Obscenity, Indecency & Profanity: Frequently Asked Questions,” http://www.fcc.gov/eb/oip/FAQ.html.
Each year, the FCC receives thousands of complaints regarding obscene, indecent, or profane programming. While the agency ultimately deems most programs cited in the complaints appropriate, many complaints require in-depth investigation and may result in proposed fines, issued as notices of apparent liability (NALs), or in federal investigation.
Table 15.1 FCC Indecency Complaints and NALs: 2000–2005
| Year | Total Complaints Received | Radio Programs Complained About | Over-the-Air Television Programs Complained About | Cable Programs Complained About | Total Radio NALs | Total Television NALs | Total Cable NALs |
|---|---|---|---|---|---|---|---|
| 2000 | 111 | 85 | 25 | 1 | 7 | 0 | 0 |
| 2001 | 346 | 113 | 33 | 6 | 6 | 1 | 0 |
| 2002 | 13,922 | 185 | 166 | 38 | 7 | 0 | 0 |
| 2003 | 166,683 | 122 | 217 | 36 | 3 | 0 | 0 |
| 2004 | 1,405,419 | 145 | 140 | 29 | 9 | 3 | 0 |
| 2005 | 233,531 | 488 | 707 | 355 | 0 | 0 | 0 |
Although old black-and-white movies are often remembered as tame or sanitized, many early filmmakers filled their films with sexual or violent content. Edwin S. Porter’s 1903 silent film The Great Train Robbery, for example, is known for expressing “the appealing, deeply embedded nature of violence in the frontier experience and the American civilizing process,” and showcases “the rather spontaneous way that the attendant violence appears in the earliest developments of cinema.”“Violence,” Film Reference, http://www.filmreference.com/encyclopedia/Romantic-Comedy-Yugoslavia/Violence-BEGINNINGS.html. The film ends with an image of a gunman firing a revolver directly at the camera, demonstrating that cinema’s fascination with violence was present even 100 years ago.
Porter was not the only U.S. filmmaker working during the early years of cinema to employ graphic violence. Films such as Intolerance (1916) and The Birth of a Nation (1915) are notorious for their overt portrayals of violent activities. The director of both films, D. W. Griffith, intentionally portrayed content graphically because he “believed that the portrayal of violence must be uncompromised to show its consequences for humanity.”“Violence,” Film Reference, http://www.filmreference.com/encyclopedia/Romantic-Comedy-Yugoslavia/Violence-BEGINNINGS.html.
Although audiences responded eagerly to the new medium of film, some naysayers believed that Hollywood films and their associated hedonistic culture were a negative moral influence. As you read earlier in this book, this changed during the 1930s with the implementation of the Hays Code. Formally termed the Motion Picture Production Code of 1930, the code is popularly known by the name of its author, Will Hays, the chairman of the industry’s self-regulatory Motion Picture Producers and Distributors Association (MPPDA), which was founded in 1922 to “police all in-house productions.”“Violence,” Film Reference, http://www.filmreference.com/encyclopedia/Romantic-Comedy-Yugoslavia/Violence-BEGINNINGS.html. Created to forestall what was perceived to be looming governmental control over the industry, the Hays Code was, essentially, Hollywood self-censorship. The code displayed the motion picture industry’s commitment to the public, stating the following:
Motion picture producers recognize the high trust and confidence which have been placed in them by the people of the world and which have made motion pictures a universal form of entertainment…. Hence, though regarding motion pictures primarily as entertainment without any explicit purposes of teaching or propaganda, they know that the motion picture within its own field of entertainment may be directly responsible for spiritual or moral progress, for higher types of social life, and for much correct thinking.“The Motion Picture Production Code of 1930 (Hays Code),” ArtsReformation, http://www.artsreformation.com/a001/hays-code.html.
Among other requirements, the Hays Code enacted strict guidelines on the portrayal of violence. Crimes such as murder, theft, robbery, safecracking, and “dynamiting of trains, mines, buildings, etc.” could not be presented in detail.“The Motion Picture Production Code of 1930 (Hays Code),” ArtsReformation, http://www.artsreformation.com/a001/hays-code.html. The code also addressed the portrayals of sex, saying that “the sanctity of the institution of marriage and the home shall be upheld. Pictures shall not infer that low forms of sex relationship are the accepted or common thing.”“The Motion Picture Production Code of 1930 (Hays Code),” ArtsReformation, http://www.artsreformation.com/a001/hays-code.html.
Figure 15.4

As the chairman of the Motion Pictures Producers and Distributors Association, Will Hays oversaw the creation of the industry’s self-censoring Hays Code.
As television grew in popularity during the mid-1900s, the strict code placed on the film industry spread to other forms of visual media. Many early sitcoms, for example, showed married couples sleeping in separate twin beds to avoid suggesting sexual relations.
Figure 15.5

During the 1950s, popular programs depicted even married couples sleeping in separate beds to avoid suggesting sexual relations.
By the end of the 1940s, the MPPDA had begun to relax the rigid regulations of the Hays Code. Propelled by the changing moral standards of the 1950s and 1960s, this relaxation led to a gradual reintroduction of violence and sex into mass media.
As filmmakers began pushing the boundaries of acceptable visual content, the Hollywood studio industry scrambled to create a system to ensure appropriate audiences for films. In 1968, the successor of the MPPDA, the Motion Picture Association of America (MPAA), established the familiar film ratings system to help alert potential audiences to the type of content they could expect from a production.
Although the ratings system changed slightly in its early years, by 1972 it seemed that the MPAA had settled on its ratings. These ratings consisted of G (general audiences), PG (parental guidance suggested), R (restricted to age 17 or up unless accompanied by a parent), and X (completely restricted to age 17 and up). The system worked until 1984, when several major battles took place over controversial material. During that year, the highly popular films Indiana Jones and the Temple of Doom and Gremlins both premiered with a PG rating. Both films—and subsequently the MPAA—received criticism for the explicit violence presented on screen, which many viewers considered too intense for the relatively mild PG rating. In response to the complaints, the MPAA introduced the PG-13 rating to indicate that some material may be inappropriate for children under the age of 13. Examples of films with a PG-13 rating include Harry Potter and the Deathly Hallows Part 2 (2011), Avatar (2009), The Dark Knight (2008), and Titanic (1999).
Another change came to the ratings system in 1990, with the introduction of the NC-17 rating. Carrying the same restrictions as the existing X rating, the new designation came at the behest of the film industry to distinguish mature films from pornographic ones. Examples of films with an NC-17 rating include Showgirls (1995) and Crash (1996). Despite the arguably milder format of the rating’s name, many filmmakers find it too strict in practice; receiving an NC-17 rating often leads to a lack of promotion or distribution because numerous movie theaters and rental outlets refuse to carry films with this rating.
Regardless of these criticisms, most audience members find the rating system helpful, particularly when determining what is appropriate for children. The adoption of industry ratings for television programs and video games reflects the success of the film ratings system. During the 1990s, for example, the broadcasting industry introduced a voluntary rating system not unlike that used for films to accompany all television shows. These ratings are displayed on screen during the first 15 seconds of a program and include TV-Y (all children), TV-Y7 (children age 7 and above), TV-Y7-FV (older children—fantasy violence), TV-G (general audience), TV-PG (parental guidance suggested), TV-14 (parents strongly cautioned), and TV-MA (mature audiences only).
Table 15.2 Television Ratings System
| Rating | Meaning | Examples of Programs |
|---|---|---|
| TV-Y | Appropriate for all children | Sesame Street, Barney & Friends, Dora the Explorer |
| TV-Y7 | Designed for children 7 and above | SpongeBob SquarePants, Captain Planet |
| TV-Y7-FV | Directed toward older children; includes depictions of fantasy violence | The Powerpuff Girls, Pokémon, Avatar: The Last Airbender |
| TV-G | Suitable for general audiences; contains little or no violence, no strong language, and little or no sexual material | Hannah Montana, The Price Is Right, American Idol |
| TV-PG | Parental guidance suggested | The Simpsons, Seinfeld, Tyler Perry’s House of Payne |
| TV-14 | Parents strongly cautioned; contains suggestive dialogue, strong language, and sexual or violent situations | Saturday Night Live, Keeping Up With the Kardashians, Jersey Shore |
| TV-MA | Mature audiences only | South Park, The Boondocks, The Shield |
At about the same time that television ratings appeared, the Entertainment Software Rating Board was established to provide ratings on video games. Video game ratings include EC (early childhood), E (everyone), E 10+ (ages 10 and older), T (teen), M (mature), and AO (adults only).
Table 15.3 Video Game Ratings System
| Rating | Meaning | Examples of Games |
|---|---|---|
| EC | Designed for early childhood, children ages 3 and older | Nickelodeon BINGO, Winnie the Pooh ABC’s, Elmo’s World |
| E | Suitable for everyone over the age of 6; contains minimal fantasy violence and mild language | Tiger Woods PGA Tour, Little Big Planet, Frogger, Myst |
| E 10+ | Appropriate for ages 10 and older; may contain more violence and/or slightly suggestive themes | Dance Dance Revolution, Tales of Monkey Island, Rock Band, Scribblenauts |
| T | Content is appropriate for teens (ages 13 and older); may contain violence, crude humor, sexually suggestive themes, use of strong language, and/or simulated gambling | Final Fantasy XIV, The Sims 3, Super Smash Bros. Brawl |
| M | Mature content for ages 17 and older; includes intense violence and/or sexual content | Quake, Grand Theft Auto IV, God of War, Fallout 3 |
| AO | Adults (18+) only; contains graphic sexual content and/or prolonged violence | Playboy Mansion: Private Party, Manhunt 2 |
Even with these ratings, the video game industry has long endured criticism over violence and sex in video games. One of the top-selling video game series in the world, Grand Theft Auto, is highly controversial because players have the option to solicit prostitutes or murder civilians.Media Issues, “Violence in Media Entertainment,” http://www.media-awareness.ca/english/issues/violence/violence_entertainment.cfm. In 2010, a report claimed that “38 percent of the female characters in video games are scantily clad, 23 percent baring breasts or cleavage, 31 percent exposing thighs, another 31 percent exposing stomachs or midriffs, and 15 percent baring their behinds.”Media Awareness Network, “Sex and Relationships in the Media,” Media Awareness Network, http://www.media-awareness.ca/english/issues/stereotyping/women_and_girls/women_sex.cfm. Despite multiple lawsuits, some video game creators stand by their decisions to place graphic displays of violence and sex in their games on the grounds of freedom of speech.
Look over the MPAA’s explanation of each film rating online at http://www.mpaa.org/ratings/what-each-rating-means. View a film with these requirements in mind and think about how the rating was selected. Then answer the following short-answer questions. Each response should be a minimum of one paragraph.
Since its inception, the Internet has posed the problem of who owns online content. Over the years, the government has struggled to find ways to introduce copyright protections into the online environment because, unlike other forms of media, the Internet enables users to make an unlimited number of copies of material and to transmit that information around the world.Bill Rosenblatt, “The Digital Object Identifier: Solving the Dilemma of Copyright Protection Online,” Journal of Electronic Publishing 3, no. 2 (1997), http://quod.lib.umich.edu/cgi/t/text/text-idx?c=jep;view=text;rgn=main;idno=3336451.0003.204. In this section, we explore the unique challenges of online copyright and intellectual property and the U.S. government’s role in regulating those fields.
Congress passed the Digital Millennium Copyright Act in 1998 to establish a protocol for online copyright matters. Yet the nature of the Internet raises very different copyright and intellectual property issues than older forms of media do. Because of the ease of sharing information online, for example, the DMCA has not worked as Congress expected.Electronic Frontier Foundation, “Digital Millennium Copyright Act,” http://www.eff.org/issues/dmca. Copying and sharing materials online is relatively simple, and, as a result, piracy and rights infringement run rampant. In fact, many have argued that despite the DMCA’s attempt to stop piracy, in practice it has done nothing.Electronic Frontier Foundation, “Digital Millennium Copyright Act,” http://www.eff.org/issues/dmca. Additionally, because information is disseminated so rapidly online, piracy opponents struggle to determine the rightful owner of a particular copyright.
The DMCA and its role in Internet policing have frustrated many online users and watchdog groups. The Electronic Frontier Foundation (EFF) claims that “the DMCA has become a serious threat that jeopardizes fair use, impedes competition and innovation, chills free expression and scientific research, and interferes with computer intrusion laws.”Electronic Frontier Foundation, “Digital Millennium Copyright Act,” http://www.eff.org/issues/dmca. In 2004, comic book company Marvel Entertainment sued game publishers NCsoft and Cryptic for copyright infringement in their online game City of Heroes. Marvel argued that players could use the character customization system in City of Heroes to make characters look almost identical to Marvel characters.David Jenkins, “Marvel Sues City Of Heroes Creators,” Gamasutra, November 12, 2004, http://www.gamasutra.com/php-bin/news_index.php?story=4548. Situations like this led groups such as the EFF to publicly call for DMCA reform. Such disputes serve as reminders of the challenges inherent in issuing copyrights and intellectual property rights for the online industry.
Certainly, the DMCA brought about major transformations by establishing copyright protection guidelines for the digital arena. However, in 1996—prior to the passage of the DMCA—the World Intellectual Property Organization (WIPO) established two treaties designed to “update and supplement the major existing WIPO treaties on copyright and related rights, primarily in order to respond to developments in technology and in the marketplace.”World Intellectual Property Organization, “Frequently Asked Questions,” http://www.wipo.int/copyright/en/faq/faqs.htm#P7_220. The first of these, the WIPO Copyright Treaty (WCT), was created to protect authors of literary and artistic works, including computer programs, original databases, and fine art.World Intellectual Property Organization, “Frequently Asked Questions,” http://www.wipo.int/copyright/en/faq/faqs.htm#P7_220. The second, the WIPO Performances and Phonograms Treaty (WPPT), deals with “related rights,” or rights connected to copyright. This law was created to protect the rights of performers and producers of sound recordings.World Intellectual Property Organization, “Frequently Asked Questions,” http://www.wipo.int/copyright/en/faq/faqs.htm#P7_220. These treaties both ensure basic rights, such as compensation and acknowledgement for those who create works, and extend further protections.World Intellectual Property Organization, “Frequently Asked Questions,” http://www.wipo.int/copyright/en/faq/faqs.htm#P7_220.
Supported by the WIPO and the DMCA, new forms of communication now enjoy copyright protections. Copyright laws cover blogs and website content, provided that these sites contain original writing.U.S. Copyright Office, “What Does Copyright Protect?” http://www.copyright.gov/help/faq/faq-protect.html#what_protect. Despite these developments, however, the Internet still poses challenges for copyrighted material. Because the web changes so quickly, maintaining copyright protection with the Copyright Office can be difficult. Presently, a work must be fixed and in a tangible form to be protected under copyright. Different, altered versions of the same work might not be covered under an original filed copyright claim. As such, authors publishing online must be careful to ensure that their work is protected.
Widespread piracy problems arose during the late 1990s with the popularization of technology allowing peer-to-peer (P2P) music sharing. Suddenly, software such as Napster, Scour, Aimster, AudioGalaxy, Morpheus, Grokster, Kazaa, iMesh, and LimeWire popped up on computers everywhere, allowing access to free music around the world—and fueling online piracy. However, in 2003, the Recording Industry Association of America (RIAA) put the laws established by the DMCA into practice and began a campaign to stop music piracy. In response to the growing number of users, the organization announced that it had been gathering evidence against users sharing music on P2P networks. Rather than go after the software engineers, “the RIAA investigators targeted ‘uploaders’—individuals who were allowing others to copy music files from their ‘shared’ folders.”U.S. Copyright Office, “What Does Copyright Protect?” http://www.copyright.gov/help/faq/faq-protect.html#what_protect.
This data collection led to the RIAA filing more than 250 lawsuits against individuals in what has been called “an unprecedented legal campaign against its own customers.”Electronic Frontier Foundation, “RIAA v. The People,” http://www.eff.org/riaa-v-people. Among the first of these lawsuits was one against a 12-year-old girl who had to pay $2,000 and publicly apologize to settle her case. Since then, the recording industry has filed, settled, or threatened legal actions against over 28,000 individuals.Electronic Frontier Foundation, “RIAA v. The People,” http://www.eff.org/riaa-v-people. Many college students have been targeted. Recently, the popular torrent site The Pirate Bay found itself under attack for allowing users to search for pirated copies of material. This case mirrors the case of Viacom versus YouTube, because the prosecution argued that The Pirate Bay was responsible for the material its users posted and downloaded. These lawsuits raise the question of whether websites are responsible for the actions of their users, an issue that looks to be central to future Internet legislation.Mike Masnick, “Pirate Bay Loses a Lawsuit; Entertainment Industry Loses an Opportunity,” Techdirt, April 17, 2009, http://www.techdirt.com/articles/20090417/0129274535.shtml.
The Internet is a relatively new form of media, but it is not exempt from media laws. Terms of service agreements, as well as legislation such as the 1986 Computer Fraud and Abuse Act, regulate Internet use. As you will see in the following case studies, when it comes to criminal use, the Internet is not as anonymous as it seems.
All software and most Internet sites have a terms of service agreement with which users must comply. Terms of service (TOS) are legally binding rules that an individual must adhere to in order to use a particular piece of software or service. iTunes, for instance, makes users agree to use their downloadable material for noncommercial use only and states that Apple is not responsible for lost or corrupted files. Anyone who has installed a new piece of software or logged on to social networking sites has agreed to a TOS. Entrance into these sites or use of a program typically requires a user to read through legal guidelines and then click a box agreeing to abide by the stated rules. You likely have done so numerous times.
Deterred by the length and legal jargon of the standard TOS, however, many people skip to the end and simply accept the terms without reading them carefully. iTunes, for instance, has a clause that states the following:
You may not use or otherwise export or re-export the Licensed Application [iTunes] except as authorized by United States law … the Licensed Application may not be exported or re-exported … into any U.S.-embargoed countries … You also agree that you will not use these products for any purposes prohibited by United States law, including, without limitation, the development, design, manufacture, or production of nuclear, missile, or chemical or biological weapons.Apple, “Terms and Conditions,” http://www.apple.com/legal/itunes/us/terms.html.
While not all terms of service are as extensive, an individual’s breach of any TOS may result in suspension, restriction, or cancellation of account privileges, depending on the severity of the offense. As individuals become increasingly reliant on Internet services such as email, calendars, and social networks, the potential for disruption is enormous.
In 2008, a compelling court case arose regarding TOS violation. Lori Drew, a 49-year-old woman, was accused of using a fake MySpace account to convince 13-year-old Megan Meier to commit suicide. How did it come about? After Drew’s daughter had a confrontation with Meier, Drew created an account pretending to be a teenage boy. At first she used the persona to flirt with Meier and uncover information about the teenager’s social life and relationship to her daughter. Later, when Drew decided she had enough information, she broke off her friendship, telling Meier that the world would be better off without her. Later that day, a distraught Megan Meier hanged herself.Jennifer Steinhauer, “Verdict in MySpace Suicide Case,” New York Times, November 26, 2008, http://www.nytimes.com/2008/11/27/us/27myspace.html. After Lori Drew’s identity was revealed, Meier’s shocked parents filed charges against her. Despite the tragic events, whether Drew had actually committed a crime remained questionable. Eventually, prosecutors decided the following:
Since there were no laws that applied in Missouri, the state where this tragedy occurred, [Drew] will face trial in California (the home of MySpace) where she will be charged with—of all things—TOS violations. Creating a false identity goes against MySpace’s terms of service and … as a result she will be facing 1 count of conspiracy and 3 counts of accessing a computer without authorization.Steve Spalding, “Lori Drew Facing Trial for TOS Violation,” How to Split an Atom, November 21, 2008, http://howtosplitanatom.com/the-news/lori-drew-facing-trial-for-tos-violation/.
The case is complicated and the charge unprecedented. As one author writes, “This raises the questions as to how much weight do online ‘contracts’ hold.”Steve Spalding, “Lori Drew Facing Trial for TOS Violation,” How to Split an Atom, November 21, 2008, http://howtosplitanatom.com/the-news/lori-drew-facing-trial-for-tos-violation/.
Prosecutors charged Drew under the Computer Fraud and Abuse Act (CFAA), although that law is designed primarily to deter hacking into computer systems. A jury found Drew guilty of misdemeanor counts of unauthorized access, but in August 2009 the presiding judge overturned the conviction, holding that “the CFAA was not devised as a vehicle for criminalizing simple contractual violations on the Internet.”Ryan Paul, “Judge: TOS Violations Not a Crime in Teen Suicide Case,” Ars Technica (blog), August 31, 2009, http://arstechnica.com/tech-policy/news/2009/08/judge-says-tos-violations-arent-a-crime-acquits-lori-drew.ars. Although many believe that prosecutors pushed the charge too far, the Drew case brought TOS agreements to the attention of the public, shedding light on the complicated laws associated with Internet use.
Although cases such as Drew’s have brought about unexpected challenges, other online cases have had less ambiguous results. One newly clarified aspect of online law involves the use of the Internet to commit a crime. Regardless of the supposed anonymity of online use, law enforcement agencies and courts can requisition Internet Protocol (IP) addresses of suspected lawbreakers and trace their computers to discover their identities. This practice has brought many individuals to trial for criminal offenses committed over the Internet.
In 1998, a federal court found a 21-year-old Los Angeles man, Richard Machado, guilty of sending racist death threats to 59 Asian students. The case set a precedent because Machado was the first person convicted of an online hate crime for sending a threat via email. Machado had used a campus computer to email a group of mostly Asian students at the University of California, Irvine, saying, “I personally will make it my life career to find and kill every one of [you].” Machado, a former UC Irvine student, signed the email “Asian Hater.” Prosecutors charged Machado with sending the threat based on the recipients’ race or ethnicity and interfering with their right to attend a public university.Courtney Macavinta, “Conviction in Online Threat Case,” CNET, February 11, 1998, http://news.cnet.com/Conviction-in-online-threat-case/2100-1023_3-208044.html.
The case signaled a new legal development because it was the first trial regarding hate crimes online. Prosecutor Michael Gennaco said of Machado’s sentencing, “The jury has spoken that a line needs to be drawn in cyberspace. If you cross that line, you’ll be subjected to the same criminal penalties you would be as if you use a telephone or post mail to do these kinds of acts.”Courtney Macavinta, “Conviction in Online Threat Case,” CNET, February 11, 1998, http://news.cnet.com/Conviction-in-online-threat-case/2100-1023_3-208044.html. Internet law specialists agree with Gennaco that the Internet is not and should not be treated differently from other communication methods; something posted online carries the same weight as a phone conversation or face-to-face interaction. This means that online anonymity is, in practice, an illusion.
Despite the precedent of Machado’s case, many people still mistakenly believe that the Internet will protect them from prosecution. Such was the case of Walter Edward Bagdasarian, who discovered that the government can trace supposedly anonymous posts using IP addresses. U.S. Secret Service agents arrested Bagdasarian, a Southern California man, in 2009 for “posting a racist note to a Yahoo message board in October [2008] expressing displeasure over Barack Obama’s candidacy, and predicting ‘he will have a 50 cal in the head soon.’”Kevin Poulsen, “Online Threat to Kill Obama Leads to Arrest,” Wired, January 9, 2009, http://www.wired.com/threatlevel/2009/01/threat/. The case exemplifies both the ease with which authorities can and do trace criminal behavior online and their propensity to take such cases seriously.
What does the future hold for Internet legislation? Many say that it will closely mirror that of other media outlets. Already there have been cases regarding Internet monopolies, defamation of users, and copyright infringement on message boards and personal websites.Netlitigation, “Internet Law: News, Suits, and Discussion,” http://www.netlitigation.com/netlitigation/. Others argue that Internet regulation should take into account the differences between the use of the Internet and the use of other media; for example, an Arizona radio station that violates broadcasting laws is tried in Arizona, but where should an Internet podcaster be charged? If a user posts information on a community forum, is it protected under copyright? Does email spam fall under the same regulations as telemarketing? What privacy rights should Internet users have? As the Internet grows and more issues are taken to court, authorities must come to terms with media issues in a constantly changing digital landscape.
Thoroughly read a terms of service agreement from a major website you use frequently, such as a social networking site. How do the terms fit with your expectations? Is there anything that you find surprising? Is there anything that causes any concerns? With this in mind, answer the following short-answer questions. Each response should be a minimum of one paragraph.
In an era when work, discourse, and play are increasingly experienced via the Internet, it is fitting that politics have surged online as well in a recent phenomenon known as digital democracy. Digital democracy, also known as e-democracy, is the use of the Internet and other online tools to engage citizens in government and civic action. This new form of democracy began as an effort to include larger numbers of citizens in the democratic process. Recent evidence seems to confirm a rising popular belief that the Internet is the most effective modern way to engage individuals in politics. “Online political organizations…have attracted millions of members, raised tens of millions of dollars, and become a key force in electoral politics. Even more important, the 2004 and 2008 election cycles show that candidates themselves can use the Internet to great effect.”Matthew Hindman, The Myth of Digital Democracy (Princeton, NJ: Princeton University Press, 2008), 4.
Figure 15.6

President Barack Obama has been called “the digital candidate” for his use of digital technology during his 2008 presidential campaign.
Perhaps the best example of a political candidate putting digital democracy to use is the successful 2008 presidential campaign of Barack Obama. On June 8, 2008, following Obama’s victory in the Democratic presidential primaries, The New York Times published an article discussing the candidate’s use of the Internet in his nomination bid. Titled “The Wiki-Way to the Nomination,” the article credits Obama’s success to his employment of digital technology: “Barack Obama is the victor, and the Internet is taking the bows.”Noam Cohen, “The Wiki-Way to the Nomination,” New York Times, June 8, 2008, http://www.nytimes.com/2008/06/08/weekinreview/08cohen.html.
Obama’s campaign certainly is not the first to rely on the Internet. Another Democratic presidential hopeful, Howard Dean, famously built his campaign online during the 2004 election cycle. But the Obama campaign took full advantage of the possibilities of digital democracy and, ultimately, secured the Oval Office partially on the strength of that strategy. As one writer puts it, “What is interesting about the story of his digital campaign is the way in which digital was integrated fully into the Obama campaign, rather than [being] seen as an additional extra.”Eliza Williams, “The Story Behind Obama’s Digital Campaign,” Creative Review, July 1, 2009, http://www.creativereview.co.uk/cr-blog/2009/june/the-story-behind-obamas-digital-campaign. President Obama’s successful campaign serves as an excellent example of the possibilities of digital democracy.
Several existing political websites proved beneficial to the Obama campaign. Founded in 1998, the liberal website MoveOn.org has long used its popularity and supporter base to mobilize citizens to vote, lobby, or donate funds to Democratic campaigns. With more than 4 million members, MoveOn.org plays a noticeable role in U.S. politics and serves as inspiration for other like-minded digital efforts.
The Obama campaign gave a nod to the success of such sites by building a significant web presence. Websites such as MyBarackObama.com formed the foundation of these online efforts. However, the success of the Obama digital campaign came from its use of online media in all its forms. The campaign turned not only to traditional websites but also to social networking sites, email outreach, text messages, and viral videos.
More and more, digital democracy demands that its users rely on these alternative forms of Internet outreach. Social networking site Facebook was the hub of many digital outreach efforts during the 2008 campaign. As of 2010, Barack Obama’s official Facebook page boasts more than 9 million fans, and the Obama administration uses the page to send messages about the current political climate.
Individuals not part of the official campaign also established Facebook pages supporting the candidate. Mamas for Obama emerged just prior to the election, as did Women for Obama and the Michelle Obama Fan Club. The groups range in size, but all speak to a new wave of digital democracy. Other political candidates, including 2008 Republican presidential contender John McCain, have also turned to Facebook, albeit in less comprehensive ways.
The Obama campaign also relied on email. A 2009 Creative Review article, “The Story Behind Obama’s Digital Campaign,” details the success of these efforts: according to the article, 13.5 million people signed up for updates on Obama’s progress via the MyBarackObama.com website, and the campaign regularly sent out emails to reach this audience.
Emails were short—never longer than 300 words—and never anonymous, there was always a consistency of voice and tone. Obama and other key figures in the campaign also contributed emails to be sent—“Michelle wrote her own emails … and more people opened those than her husband’s”—giving the campaign a personal touch and authenticity, rather than the impression of being simply churned out by the PR machine.Eliza Williams, “The Story Behind Obama’s Digital Campaign,” Creative Review, July 1, 2009, http://www.creativereview.co.uk/cr-blog/2009/june/the-story-behind-obamas-digital-campaign.
A combination of message and financial appeal, the emails were successful not only in reaching target audiences but also in earning valuable campaign dollars.
Two billion emails were then sent out, although … this email content was carefully managed, with individuals targeted with different “tracks” depending on their circumstances and whether they had already donated to the campaign…. By the end of the campaign the website had mobilized over 3 million people to contribute over $500 million online.Eliza Williams, “The Story Behind Obama’s Digital Campaign,” Creative Review, July 1, 2009, http://www.creativereview.co.uk/cr-blog/2009/june/the-story-behind-obamas-digital-campaign.
Additionally, Obama used text messaging to reach out to his supporters. During the campaign, supporters could sign up to receive text messages, and attendees at rallies and other events were asked to send text messages to friends or potential supporters to encourage them to participate in Obama’s campaign. Members of MyBarackObama.com were the first to discover his running mate selection via text message.BarackObama.com, “Be the First to Know,” Organizing for America, http://my.barackobama.com/page/s/firsttoknow. This tool proved helpful and demonstrated the Obama campaign’s commitment to digital outreach.
Figure 15.7

The music video “Yes We Can,” created by the Black Eyed Peas’ will.i.am, was viewed more than 20 million times leading up to the 2008 presidential election. It is just one example of digital campaigning used by supporters of political candidates.
Perhaps even more impressive than the campaign’s commitment to digital democracy were the e-democracy efforts of Obama’s supporters. Websites such as Barackobamaisyournewbicycle.com, a gently mocking site “listing the many examples of Mr. Obama’s magical compassion. (‘Barack Obama carries a picture of you in his wallet’; ‘Barack Obama thought you could use some chocolate’),”Noam Cohen, “The Wiki-Way to the Nomination,” New York Times, June 8, 2008, http://www.nytimes.com/2008/06/08/weekinreview/08cohen.html. emerged, but viral videos offered even stronger examples of Obama’s grassroots campaign.
One example of a supporter-created video was “Barack Paper Scissors,” an interactive game inspired by rock-paper-scissors. Posted on YouTube, the video logged some 600,000 views. The success of videos such as “Barack Paper Scissors” did not go unnoticed by the Obama campaign. The viral video “Yes We Can,” in which Barack Obama’s words were set to music by will.i.am (of the Black Eyed Peas), has been viewed more than 20 million times online. Capitalizing on the popularity of the clip, the campaign brought it from YouTube to its main website, thus generating even more views and greater exposure for its message.
Although the Internet is a powerful tool for candidates, it also propagates rumors that can derail—or at least hinder—a politician’s career. Blog posts and mass emails can be created within minutes and then reposted or forwarded in seconds. Thus, ideas spread like wildfire regardless of their relative truth. Snopes.com, a website dedicated to verifying or debunking urban legends and Internet rumors, has an entire search section dedicated to political rumors, ranging from shooting down a list of books supposedly banned by Sarah Palin to investigating whether actress Nancy Cartwright, best known as the voice of Bart Simpson, was once elected mayor of Northridge, California. The pages dedicated to major political figures such as President Obama can be huge; Obama’s page, for example, lists more than 60 debunked rumors. Some of these rumors include the questioning of his U.S. citizenship, his decision to ban recreational fishing, and his refusal to sign Eagle Scout certificates.
Many of these online rumors are accompanied by “photographic evidence,” thanks to technology such as Photoshop, which allows photographs to be manipulated with the click of a mouse. With such a spread of online rumors, savvy media consumers must be wary of what they read and seek out legitimate sources of information to verify the news that they receive.
Just as access to digital technology can create the kinds of problems discussed earlier in this book, the digital divide can equally split the country’s involvement with politics along tech-savvy lines. Certainly, the Obama campaign’s reliance on modern technology allowed it to reach a large population of young voters; but in doing so, the campaign focused much of its attention on an area out of reach of other voters. In The Myth of Digital Democracy, author Matthew Hindman wonders, “Is the Internet making politics less exclusive?”Matthew Hindman, The Myth of Digital Democracy (Princeton, NJ: Princeton University Press, 2008), 4. The answer is likely both yes and no. While the Internet certainly has the power to inform and mobilize many individuals, it also denies poorer citizens without digital access an opportunity to be part of the new wave of e-democracy.
Nevertheless, digital democracy will continue to play a large role in politics, particularly after the overwhelming success of President Obama’s largely digital campaign. But politicians and their supporters must consider the digital divide and work to reach out to those who are not plugged in to the digital world.
Visit YouTube and search for a local or national candidate with whom you are familiar. If possible, compare the video message to those available on a candidate’s website. Then answer the following short-answer questions. Each response should be a minimum of one paragraph.
Figure 15.8

In 2007, The Washington Post published a critical exposé on the Walter Reed Army Medical Center. In response to the public outcry, the U.S. Army launched an investigation and set about improving the facility. As demonstrated in this case, media coverage can directly influence people’s lives.
Media have long had a voice and a role in politics. As you have read in earlier chapters, even some of the earliest newspapers and magazines used their pages as a forum for political discourse. When broadcast media emerged during the 20th century, radio briefs and television reports entered the conversation, bringing political stories to the public’s living rooms.
In addition to acting as a watchdog, media provide readers and viewers with news coverage of issues and events, and also offer public forums for debate. Thus, media support—or lack thereof—can have a significant influence on public opinion and governmental action. In 2007, for example, The Washington Post conducted a four-month investigation of the substandard medical treatment of wounded soldiers at Walter Reed Army Medical Center in Washington, DC. Because of the ensuing two-part feature, the Secretary of the Army and the two-star general in charge of the medical facility lost their jobs.
However, an ongoing debate exists over media’s role in politics. Many individuals wonder who is really behind certain stories. William James Willis, author of The Media Effect: How the News Influences Politics and Government discusses this debate:
Sometimes the media appear willing or unwitting participants in chasing stories the government wants them to chase; other times politicians find themselves chasing issues that the media has enlarged by its coverage. Over the decades, political scientists, journalists, politicians, and political pundits have put forth many arguments about the media’s power in influencing the government and politicians.William James Willis, The Media Effect: How the News Influences Politics and Government (Westport, CT: Praeger, 2007), 4.
Regardless of who is encouraging whom, media coverage of politics certainly raises questions among the public. Despite laws put in place to prevent unbalanced political coverage, such as Section 315 of the Communications Act, a large majority of the public remains wary of the media’s role in swaying political opinion. In a January 2010 survey, two-thirds of respondents said that the media have too much influence on the government. Additionally, 72 percent of respondents agreed that “most reporters try to help the candidate they want to win.”Rasmussen Reports, “67% Say News Media Have too Much Influence Over Government Decisions,” news release, January 14, 2010, http://www.rasmussenreports.com/public_content/politics/general_politics/january_2010/67_say_news_media_have_too_much_influence_over_government_decisions. These statistics demonstrate the media’s perceived political power, along with the fine line the media must walk when covering political issues.
Throughout their respective histories, radio, television, and the Internet have played important roles in politics. As technology developed, citizens began demanding greater levels of information and analysis of media outlets and, in turn, politicians. Here we explore the transformation of politics with the development of media.
As discussed earlier, radio, popularized during the 1920s, was the first medium through which up-to-the-minute breaking news could be broadcast. On November 2, 1920, KDKA in East Pittsburgh, Pennsylvania, became the first station to broadcast election results from the Harding-Cox presidential race, “becoming a pioneer in a brand new technology.”“History of the Radio,” http://americanhistory.suite101.com/article.cfm/history_of_the_radio. Suddenly, information that would previously have been available only later in the newspapers was transmitted directly into American living rooms. The public responded positively, wanting to be more involved in U.S. politics.
As radio technology developed, “Americans demanded participation in the political and cultural debates shaping their democratic republic.”Henry Jenkins, “Contacting the Past: Early Radio and the Digital Revolution,” MIT Communications Forum, http://web.mit.edu/comm-forum/papers/jenkins_cp.html. Radio provided a way to hold these debates in a public forum; it also provided a venue for politicians to speak directly to the public, a phenomenon that had not been possible on a large scale prior to the invention of the radio. This dynamic changed politics. Suddenly, candidates and elected officials had to be able to effectively communicate their messages to a large audience. “Radio brought politicians into people’s homes, and many politicians went to learn effective public-speaking for radio broadcasts.”“Radio’s Emergence,” http://library.thinkquest.org/27629/themes/media/md20s.html.
Today, television remains Americans’ chief source of political news, a relationship that dates back almost to the very beginning of the medium. Political candidates began using television commercials to speak directly to the public as early as 1952. These “living room candidates,” as they are often called, understood the power of the television screen and the importance of reaching viewers at home. In 1952, Dwight D. Eisenhower became the first candidate to harness television’s popularity. Eisenhower stepped onto the television screen “when Madison Avenue advertising executive Rosser Reeves convinced [him] that short ads played during such popular television programs as I Love Lucy would reach more voters than any other form of advertising. This innovation had a permanent effect on the way presidential campaigns are run.”Museum of the Moving Image, The Living Room Candidate, http://www.livingroomcandidate.org/.
The relationship between politics and television took a massive step forward in 1960 with a series of four televised “Great Debates” between presidential candidates John F. Kennedy and Richard Nixon. Seventy million U.S. viewers tuned in to the first of these on September 26, 1960. The debates gave voters their first chance to watch presidential candidates face off live, marking a new era of television’s involvement in politics.
Figure 15.9

In 1960, candidates John F. Kennedy and Richard Nixon brought presidential debating to television screens around the nation.
As discussed earlier in the book, the visual difference between the two candidates was staggering; Kennedy appeared much more presidential. A record number of viewers watched the debates, and many historians have attributed Kennedy’s success at the polls that November to the public perception of the candidates formed during these debates.“Kennedy-Nixon Debates,” Mary Ferrell Foundation, http://www.maryferrell.org/wiki/index.php/Kennedy-Nixon_Debates.
Later in the decade, rising U.S. involvement in Vietnam brought television and public affairs together again in a significant way. The horrors of battle were broadcast directly into U.S. homes on a large scale for the first time; although television had been invented prior to the Korean War, “the medium was in its infancy…[and] its audience and technology [were] still too limited to play a major role.”Museum of Broadcast Communications, “Vietnam on Television,” http://www.museum.tv/eotvsection.php?entrycode=vietnamonte. As such, in 1965 the Vietnam War became the first “living-room war.”
Early in the war, the coverage was mostly upbeat:
It typically began with a battlefield roundup, written from wire reports based on the daily press briefing in Saigon … read by the anchor and illustrated with a battle map…. The battlefield roundup would normally be followed by a policy story from Washington, and then a film report from the field…. As with most television news, the emphasis was on the visual and above all the personal: “American boys in action” was the story, and reports emphasized their bravery and their skill in handling the technology of war.Museum of Broadcast Communications, “Vietnam on Television,” http://www.museum.tv/eotvsection.php?entrycode=vietnamonte.
In 1969, however, television coverage began to change as journalists grew more and more skeptical of the government’s claims of progress, and there was greater emphasis on the human costs of war.Museum of Broadcast Communications, “Vietnam on Television,” http://www.museum.tv/eotvsection.php?entrycode=vietnamonte. Although gore typically remained off screen, a few major violent moments were caught on film and broadcast into homes. In 1965, CBS aired footage of U.S. Marines setting village huts on fire, and in 1972, NBC audiences witnessed Vietnamese civilians fall victim to a napalm strike. Such scenes altered America’s perspective of the war, generating antiwar sentiment. More than twenty years later, in 1991, the Persian Gulf War was brought into homes across the country, as live video feeds showed the impact of Scud missiles striking their targets. In that conflict, the media were accused of shaping public sentiment toward acceptance of U.S. military involvement.
The way that news is televised has dramatically changed over the medium’s history. For years, nightly news broadcasts dominated the political news cycle; then, in the 1980s, round-the-clock cable news channels appeared. Founded by Ted Turner in 1980, CNN (Cable News Network) was the first such network. Upon the launch of CNN, Turner stated, “We won’t be signing off until the world ends. We’ll be on, and we will cover the end of the world, live, and that will be our last event…and when the end of the world comes, we’ll play ‘Nearer, My God, to Thee’ before we sign off.”TV Tropes, “Twenty Four Hour News Networks,” http://tvtropes.org/pmwiki/pmwiki.php/Main/TwentyFourHourNewsNetworks.
Twenty-four-hour news stations such as CNN have become more popular, and nightly news programs have been forced to change their focus, now emphasizing local stories that the round-the-clock networks may not cover. Additionally, the 21st century has seen the rise of the popularity and influence of satirical news shows such as The Daily Show and The Colbert Report. These comedic news programs have, in recent years, become major cultural arbiters and political watchdogs thanks to the outspoken nature of their hosts and their frank coverage of the issues.
Finally, the Internet has become an increasingly important force in how Americans receive political information. Websites such as the Huffington Post, the Daily Beast, and the Drudge Report are known for breaking news stories and political commentary. Additionally, political groups regularly use the Internet to organize supporters and influence political issues. Online petitions circulate widely, and individuals can use online resources to donate to political causes or connect with like-minded people.
Media and government have had a long and complicated history. Each influences the other, through regulations and news cycles. As technology develops, the relationship between media and politics will likely become even more intermeshed. The hope is that the U.S. public will benefit from such developments, and both media and the government will seek out opportunities to involve the public in their decisions.
Choose a political topic that interests you, such as conflict in the Middle East, the legalization of marijuana, or gay marriage. Find a radio story, a television story, and an Internet story about this topic. Then write a one-page paper answering the following questions.
Review Questions
Some media professionals work closely with political candidates to help them craft their public images and messages. Suppose that you were going to advise the campaign of a candidate for local, state, or national office. Choose a candidate who interests you and visit his or her existing website. Explore any other digital outreach efforts. Then answer the following questions to help you make recommendations for the campaign.
Does the tablet computer represent the future of media? Tech-savvy consumers certainly seem to think so—on the day Apple’s much-hyped iPad hit the market in April 2010, the company sold more than 300,000 devices. Described as “Goldilocks” gadgets—not too big, not too small—tablet computers are creating what former Apple CEO Steve Jobs calls a “third segment” of computing between handheld phones and laptop computers.Bobbie Johnson and Charles Arthur, “Apple iPad: The Wait Is Over—But Is It the Future of Technology or Oversized Phone?” Guardian (London), January 27, 2010, http://www.guardian.co.uk/technology/2010/jan/27/apple-ipad-tablet-computer-kindle. The iPad, which sports a 9.7-inch color LED touch screen, enables consumers to surf the web, play games, email, and use many of the same applications available on the company’s vastly popular smartphone, the iPhone. Its primary function upon release, however, was to corner the e-book market, putting it in competition with Amazon.com’s black-and-white Kindle e-reader. Signing deals with five major publishers—HarperCollins, Penguin, Simon & Schuster, Macmillan, and Hachette—Apple created a program called iBooks that enables customers to download e-books directly onto the iPad via the digital media application iTunes. The print media industry—which was unable to capitalize on the benefits of new media during the Internet age of free print and video content on the web, and saw its profits disintegrate as a result—is hopeful that tablet computers such as the iPad will provide some form of digital salvation. John Makinson, chairman of the Penguin Group, said the iPad would help “attract millions of new readers to the world’s best books.”Bobbie Johnson and Charles Arthur, “Apple iPad: The Wait Is Over—But Is It the Future of Technology or Oversized Phone?” Guardian (London), January 27, 2010, http://www.guardian.co.uk/technology/2010/jan/27/apple-ipad-tablet-computer-kindle.
More importantly for the future of traditional media, the iPad may provide a way for publishers to generate a profit from these new readers. Electronic publishers who sell their products through iBooks receive a 70 percent share of any revenues, and are able to set their prices higher than Amazon’s, a relief for publishers worried that e-books might undercut their sales.
The success of Apple’s iPhone, which is expected to generate an estimated $1.4 billion in 2010 from its App Store alone, may provide some indication of how well the iPad is likely to perform in the near future. Trip Hawkins, a founder of interactive entertainment software company Electronic Arts, commented, “The iPhone was a harbinger. When you have a device that is this convenient and fun for consumers to use, you can get a lot more people interested in paying for and engaging with the content. Big media companies should be all over this like a cheap suit.”Brad Stone and Stephanie Clifford, “With Apple Tablet, Print Media Hope for a Payday,” New York Times, January 25, 2010, http://www.nytimes.com/2010/01/26/technology/26apple.html. And they are. Several major newspaper and magazine companies have signed on with Apple’s latest device, and their content is available via iPad applications. Some, such as The New York Times and USA Today, are initially offering their apps for free with a paid version coming down the line, while others, such as The Wall Street Journal and Time, are available for a download fee. Thomas J. Wallace, editorial director of Condé Nast, said, “2010 is going to be the year of the tablet, and we feel we are in a very good position for it.”Brad Stone and Stephanie Clifford, “With Apple Tablet, Print Media Hope for a Payday,” New York Times, January 25, 2010, http://www.nytimes.com/2010/01/26/technology/26apple.html. The publishing company launched its first app for GQ, a men’s magazine, at a cost of $2.99 in April 2010. To avoid losing paying customers, media companies are adjusting part of their digital strategy so that consumers can no longer access the same content for free on the web.
Despite initial concerns that the iPad might prove to be an unnecessary gadget, duplicating functions already performed by other devices, its sales have so far surpassed all expectations: the original iPad topped 15 million units in its first nine months on the market, outpacing sales of Mac laptops. Its relatively low price and variety of functions made it a hit with consumers. With magazine and newspaper publishers able to provide a more interactive experience on the iPad through video, graphics, and creative design layouts, analysts are predicting the iPad will revolutionize the publishing industry the way the iPod and the iPhone shook up the digital music and smartphone industries, respectively. Whether the iPad will remain at the forefront of the digital revolution in the years to come remains to be seen, but it has the potential to eventually become an all-in-one television, newspaper, and bookshelf.
Life has changed dramatically over the past century, and a major reason for this is the progression of media technology. Compare a day in the life of a modern student—let’s call her Katie—with a day in the life of someone from Katie’s great-grandparents’ generation. When Katie wakes up, she immediately checks her smartphone for text messages and finds out that her friend will not be able to give her a ride to class. Katie flips on the television while she eats breakfast to check the news and learns it is supposed to rain that day. Before she leaves her apartment, Katie goes online to make sure she remembered the train times correctly. She grabs an umbrella and heads to the train station, listening to a music application on her smartphone on the way. After a busy day of classes, Katie heads home, occupying herself on the train ride by watching YouTube clips on her phone. That evening, she finishes her homework, emails the file to her instructor, and settles down to watch the television show she digitally recorded the night before. While watching the show, Katie logs on to Facebook and chats with a few of her friends online to make plans for the weekend and then reads a book on her e-reader.
Katie’s life today is vastly different from the life she would have led just a few generations ago. At the beginning of the 20th century, neither television nor the Internet existed. There were no commercial radio stations, no roadside billboards, no feature films, and certainly no smartphones. People were dependent on newspapers and magazines for their knowledge of the outside world. An early-20th-century woman the same age as Katie—let’s call her Elizabeth—wakes up to read the daily paper. Yellow journalism is rife, and the papers are full of lurid stories and sensational headlines about government corruption and the unfair treatment of factory workers. Full-color printing became available in the 1890s, and Elizabeth enjoys reading the Sunday comics. She also subscribes to Good Housekeeping magazine. Occasionally, Elizabeth and her husband enjoy visiting the local nickelodeon theater, where they watch short silent films accompanied by accordion music. They cannot afford to purchase a phonograph, but Elizabeth and her family often gather around a piano in the evening to sing songs to popular sheet music. Before she goes to sleep, Elizabeth reads a few pages of The Strange Case of Dr. Jekyll and Mr. Hyde. Separated by nearly a century of technology, Elizabeth’s and Katie’s lives are vastly different.
Traditional media encompass all the means of communication that existed before the introduction of the Internet and new media technology, including printed materials (books, magazines, and newspapers), broadcast communications (television and radio), film, and music. New media, on the other hand, encompass all the forms of communication in the digital world, including electronic video games, the Internet, and social media. Although different forms of mass media rise and fall in popularity, it is worth noting that despite significant cultural and technological changes, none of the media discussed throughout this text has fallen out of use completely.
First popularized in the 1970s with Atari’s simple table-tennis simulator Pong, video games have come a long way over the past four decades. Early home game consoles could play only one game, a limitation solved by the development of interchangeable game cartridges. The rise of the personal computer in the 1980s enabled developers to create games with more complex story lines and to allow players to interact with each other via the computer. In the mid-1980s, online role-playing games developed, allowing multiple users to play at the same time. A dramatic increase in Internet use helped to popularize online games during the 1990s and 2000s, both on personal computers and via Internet-enabled home console systems such as the Microsoft Xbox and the Sony PlayStation. The Internet has added a social aspect to video gaming that has bridged the generation gap and opened up a whole new audience for video game companies. Senior citizens commonly gather in retirement communities to play Nintendo’s Wii bowling and tennis games using a motion-sensitive controller, while young professionals and college students get together to play in virtual bands on games such as Guitar Hero and Rock Band. No longer associated with an isolated subculture, contemporary video games are bringing friends and families together via increasingly advanced gaming technology.
It is almost impossible to overstate the influence the Internet has had on media over the past two decades. Initially conceived as an attack-proof military network in the 1960s, the Internet has since become an integral part of daily life. With the development of the World Wide Web in the 1980s and the introduction of commercial browsers in the 1990s, users gained the ability to transmit pictures, sound, and video over the Internet. Companies quickly began to capitalize on the new technology, launching web browsers, offering free web-based email accounts, and providing web directories and search engines. Internet usage grew rapidly, from 50 percent of American adults in 2000 to 75 percent of American adults in 2008.Pew Research Center, Internet User Profiles Reloaded, January 5, 2010, http://pewresearch.org/pubs/1454/demographic-profiles-internet-broadband-cell-phone-wireless-users. Now that most of the industrialized world is online, the way we receive our news, do business, conduct research, contact friends and relatives, apply for jobs, and even watch television has changed completely. To provide just one example, many jobs can now be performed entirely from home without the need to travel to a central office. Meetings can be conducted via videoconference, written communication can take place via email, and employees can access company data via a server or file transfer protocol (FTP) site. You very likely have had the opportunity to take an online college class.
In addition to increasing the speed with which we can access information and the volume of information at our fingertips, the Internet has added a whole new democratic dimension to communication. Becoming the author of a printed book may take many years of frustrated effort, but becoming a publisher of online material requires little more than the click of a button. Thanks to social media such as blogs, social networking sites, wikis, and video-sharing websites, anyone can contribute ideas on the web. Social media has many advantages, including the instantaneous distribution of news, a variety of different perspectives on a single event, and the ability to communicate with people all over the globe. Although some industry analysts have long predicted that the Internet will render print media obsolete, mass-media executives believe newspapers will evolve with the times. Just as the radio industry had to rethink its commercial strategy during the rise of television, newspaper professionals will need to rethink their methods of content delivery during the age of the Internet.
New technologies have developed so quickly that executives in traditional media companies often cannot retain control over their content. For example, as we saw, when music-sharing website Napster began enabling users to exchange free music files over the Internet, peer-to-peer file sharing cost the music industry a fortune in lost CD sales. Rather than capitalize on the new technology, music industry executives sued Napster, ultimately shutting it down, but never quite managing to stamp out online music piracy. Even with legal digital music sales through online vendors such as Apple’s iTunes Store, the music industry is still trying to determine how to make a large enough profit to stay in business.
The publishing industry has also suffered from the effects of new technology (although newspaper readership has been in decline since the introduction of television and radio). When newspapers began developing online versions in response to competition from cable television, they found themselves up against a new form of journalism: amateur blogging. Initially dismissed as unreliable and biased, blogs such as Daily Kos and The Huffington Post have gained credibility and large readerships over the past decade, forcing traditional journalists to blog and tweet in order to keep pace (which allows less time to check that sources are reliable or add in-depth analysis to a story). Traditional newspapers are also losing out to news aggregators such as Google News, which profit from providing links to journalists’ stories at major newspapers without offering financial compensation to either the journalists or the news organizations. Many newspapers have adapted to the Internet out of necessity, fighting falling circulation figures and slumping advertising sales by offering websites, blogs, and podcasts and producing news stories in video form. Those that had the foresight to adapt to the new technology are breathing a sigh of relief; a 2010 Pew Research Center report found that more Americans receive their news via the Internet than from newspapers or radio sources, and that the Internet is the third most popular news source behind national and local television news (see ).Pew Research Center, “The New News Landscape: Rise of the Internet,” March 1, 2010, http://pewresearch.org/pubs/1508/internet-cell-phone-users-news-social-experience?src=prc-latest&proj=peoplepress.
Critics of the pay-for-content model point to the failure of Newsday, a Long Island, New York, daily that was one of the first non-business publications to use the pay-for-content model. In October 2009, Newsday began charging readers $5 a week ($260 a year) for unlimited access to its online content. Three months later, an analysis of the move indicated that it had been a total failure. Just 35 people had signed up to pay for access to the site. Having spent $4 million redesigning and relaunching the Newsday website in preparation for the new model, the owners grossed just $9,000 from their initial readership.
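A quick back-of-the-envelope check makes the scale of the failure concrete. This sketch assumes, since Newsday did not publish its accounting, that the roughly $9,000 gross figure annualizes each of the 35 subscriptions at the full $260 yearly rate:

```python
# Back-of-the-envelope check of the Newsday figures reported above.
# Assumption: the ~$9,000 gross annualizes each of the 35 subscriptions
# at the full yearly rate; these inputs come from the text, not from
# Newsday's books.
weekly_fee = 5
annual_fee = weekly_fee * 52             # $260 per subscriber per year
subscribers = 35
annual_gross = subscribers * annual_fee  # $9,100, roughly the $9,000 reported

redesign_cost = 4_000_000
print(annual_gross)                      # 9100
print(redesign_cost // annual_gross)     # 439 -- years to recoup the redesign at this rate
```

At that pace, recouping the $4 million redesign alone would take centuries, which is why critics treated the experiment as a cautionary tale.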
However, the lack of paying consumers may be partly accounted for by the number of exceptions granted by the company. Subscribers to the print version of the paper can access the site for free, as can those with Optimum Cable. According to Newsday representatives, 75 percent of Long Island residents have either a newspaper subscription or Optimum Cable. “Given the number of households in our market that have access to Newsday’s website as a result of other subscriptions, it is no surprise that a relatively modest number have chosen the pay option,” said a Cablevision spokeswoman.John Koblin, “After 3 Months, Only 35 Subscriptions for Newsday’s Web Site,” New York Observer, January 26, 2010, http://www.observer.com/2010/media/after-three-months-only-35-subscriptions-newsdays-web-site. Even though most Long Island residents have access to the site, traffic has dropped considerably. A Nielsen Online survey revealed that traffic fell from 2.2 million visits in October 2009 to 1.5 million visits in December 2009. Publishing executives will be watching closely to see whether The New York Times meets a similar fate with its pay-for-content model.
New media have three major advantages over traditional media. First, they are immediate, enabling consumers to find out the latest news, weather report, or stock prices at the touch of a button. Digital music can be downloaded instantly, movies can be ordered via cable or satellite on-demand services, and books can be read on e-readers. In an increasingly fast-paced world, there is little need to wait for anything. The second advantage is cost. Most online content is free, from blogs and social networking sites to news and entertainment sources. Whether readers are willing to pay for content once they are used to receiving it for free is something The New York Times was set to find out in 2011, when it planned to introduce a metered fee model for its online paper. Finally, new media are able to reach the most remote parts of the globe. For example, if a student is looking for information about day-to-day life in Iran, there is a high probability that a personal web page about living in that country exists somewhere on the Internet. Around three-fourths of Americans, half of Europeans, and just over one-fourth of the world’s population overall have Internet access.Internet World Stats, “Internet Usage Statistics,” http://www.internetworldstats.com/stats.htm. This widespread reach makes the Internet an ideal target for advertisers, who can communicate with their desired niche audiences via tracking devices such as profile information on social networking sites.
Review the traditional and emerging forms of media. Then answer the following short-answer questions. Each response should be a minimum of one paragraph.
Figure 16.1

Michael Jackson’s death at age 50 caused a frenzy of media attention. Here, fans and the media have gathered at Jackson’s mansion hoping to catch a glimpse of Jackson’s family.
As we saw in , when superstar Michael Jackson died of a cardiac arrest in June 2009, the news sent media outlets all over the world into a frenzy, providing journalists, bloggers, authors, and television news anchors with months of material. The pop singer’s death is a good example of how information is disseminated through the various media channels. Unafraid to publish unconfirmed rumors that may have to be retracted later, blogs and gossip websites are often first to produce celebrity news stories. Digital sources also have the advantage of immediacy—rather than waiting for a physical newspaper to be printed and delivered, a time-consuming process that occurs just once a day, bloggers and online reporters can publish a story on the Internet in the time it takes to type it out. Within 40 minutes after the Los Angeles Fire Department arrived at Jackson’s home, a small entertainment website called X17online posted the news that Jackson had suffered cardiac arrest. Twenty minutes later, larger entertainment site TMZ picked up the information and distributed it to hundreds of thousands of people via RSS—a web publishing technology that enables users to automatically receive new digital content from a provider. Multiple Wikipedia members updated Jackson’s biographical entry to include the news of his cardiac arrest before any major news networks or broadcasters had announced the news. By the time the cardiac arrest was reported on CNN’s official Twitter account two hours after the 911 call, Twitter users and TMZ reporters were already posting reports of the star’s death. The story created such a surge in online traffic that microblogging site Twitter temporarily shut down and Google returned an error message for searches of the singer’s name because it assumed it was under attack.
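Under the hood, an RSS feed like the one TMZ used to syndicate the story is simply an XML document that subscribers' software polls for new items. The sketch below shows how a minimal feed reader extracts headlines; the feed content and URLs are invented for illustration, not taken from any real site:

```python
# Minimal sketch of how a feed reader extracts headlines from an RSS document.
# The feed XML below is invented for illustration; a real reader would fetch
# it from a site's feed URL and poll periodically for new <item> entries.
import xml.etree.ElementTree as ET

feed_xml = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Entertainment News</title>
    <item><title>Breaking: story one</title><link>http://example.com/1</link></item>
    <item><title>Breaking: story two</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(feed_xml)
# Each <item> is one syndicated story; collect their titles.
headlines = [item.findtext("title") for item in root.iter("item")]
print(headlines)  # ['Breaking: story one', 'Breaking: story two']
```

Because subscribers receive each new `<item>` automatically, a story posted to a feed can reach hundreds of thousands of readers within minutes, which is exactly the speed advantage described above.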
An hour after the news of Jackson’s death hit the Internet, mainstream news sources such as The Los Angeles Times, MSNBC, and CNN confirmed the information, and it was immediately disseminated among local and national television and radio stations.
The order in which the news broke among the major media outlets was a source of contention. Many outlets around the world were reluctant to rely on the TMZ report, because the website was primarily known for its frivolous content, aggressive paparazzi tactics, and embarrassing celebrity photographs. Many of the more reputable news sources, including CNN, waited until both the coroner’s office and The Los Angeles Times had confirmed Jackson’s death before announcing it as a fact to viewers, preferring to release an accurate story rather than to gain an edge over other news outlets (even though both TMZ and CNN are owned by Time Warner). “Given the nature of the story we exercised caution,” said CNN spokesman Nigel Pritchard.Scott Collins and Greg Braxton, “TV Misses Out as Gossip Website TMZ Reports Michael Jackson’s Death First,” Los Angeles Times, June 26, 2009, http://articles.latimes.com/2009/jun/26/local/me-jackson-media26. However, Harvey Levin, managing editor of TMZ, denied that his site was less credible than any other news source. “TMZ is a news operation and we are fact based,” he said. “Our goal is always to take stories and factually source them and present them. We’re not a gossip site…. We have things researched, we have things lawyered, we make lots of phone calls…. I mean it’s the same principle.”Neal Karlinsky and Eloise Harper, “Michael Jackson’s Death Puts Us Weekly and TMZ at the Head of the Pack,” ABC News, July 1, 2009, http://abcnews.go.com/Nightline/MichaelJackson/story?id=7971440&page=1. Despite Levin’s protests, it appears that, for now at least, old media stalwarts such as the Associated Press and The LA Times have the advantage of reliability over (sometimes) faster sources with less credibility. 
As Adam Fendelman, founder of entertainment news site HollywoodChicago.com, noted, “The Web and TV phenomenon that TMZ is is very good at fast-breaking and late-breaking news, but there’s an inherent problem with trust in the everyday consumer’s mind”Wailin Wong, “Michael Jackson Death News: Online Activity Heats Up Twitter and Google, Slows Down Some Sites,” Chicago Tribune, June 26, 2009, http://www.chicagotribune.com/topic/wghp-story-jackson-media-coverage-090625,0,4191041.story. (see for more advantages and disadvantages of new media).
Once news of Michael Jackson’s death had been reported through all the major international media outlets, a tabloid war broke out, with newspapers and magazines determined to get the “story behind the story.” Speculation about the cause of death and the role played by prescription drugs fed salacious media reports in the tabloids and news and gossip magazines long after the initial news story broke. Other newspapers and magazines, including Time and Entertainment Weekly, focused on tribute articles that reviewed Jackson’s long list of accomplishments and reflected on his musical legacy, and the four major broadcast networks (ABC, NBC, Fox, and CBS) aired documentaries covering the pop star’s life. In the days and weeks following Jackson’s death, radio stations abandoned their playlists in favor of back-to-back Michael Jackson hits, contributing to a huge upswing in record sales. Media coverage continued for many months, saturating newspapers, magazines, and television and radio stations—when the coroner’s report ruled Jackson’s death a homicide in August 2009, during the funeral service a month later, and again in February 2010 when Jackson’s doctor was charged with involuntary manslaughter for administering a powerful sedative to help the star sleep.
Although the book-publishing industry was at a disadvantage because of the time delay between receiving news of Jackson’s death and the ability to physically place books on shelves, many authors, agents, and publishers were able to capitalize on the star’s tragic story. Numerous biographies were published in the months following Jackson’s death, along with several explosive “tell-all” books by people close to the star that provided intimate details about his private life. To compensate for their lack of immediacy, books have several advantages over other print and web sources, primarily the ability to include greater depth of information on a subject than any other form of media. Fans eager for more information about their idol and his life eagerly purchased Jackson biographies, including his 1988 autobiography Moonwalk, which was re-released in October 2009.
Other, less immediate forms of media were also commercially successful, including a posthumous film titled This Is It, named after the much-anticipated comeback tour that was supposed to start just 3 weeks after Jackson’s death. Composed of rehearsal footage from the concerts, the documentary was shown on more than 3,400 domestic screens during a sold-out 2-week run in October and November 2009. An accompanying two-disc soundtrack album, featuring classic Jackson hits along with new track “This Is It,” topped the Billboard 200 chart upon its release in November 2009, selling 373,000 copies in its first week of release. A spin-off DVD also topped the U.S. sales chart in February 2010, selling more than 1.2 million copies the week of its release. Posthumous sales of Jackson’s earlier material also generated huge amounts of revenue. In the first 4 months after Jackson’s death, Forbes magazine estimated that his estate made $90 million in gross earnings. Music industry consultant Barry Massarsky commented, “Nothing increases the value of an artist’s catalog [more] than death … an untimely death.”Lauren Streib, “Michael Jackson’s Money Machine,” Forbes, October 27, 2009, http://www.forbes.com/2009/10/27/michael-jackson-earnings-since-death-dead-celebs-09-business-entertainment-jackson.html. This cross-media approach is typical of every major news story, although the controversy surrounding Jackson throughout his life, the circumstances of his death, and the sheer magnitude of his contribution to pop history meant that the performer’s demise had a particularly widespread effect.
As the Michael Jackson example shows, the number of people receiving news from the Internet is rapidly growing, although television remains the dominant source of information. Currently, most Americans use multiple resources for news. In a 2010 survey, 92 percent of people said they obtained their daily news from a variety of sources, including online news sites, blogs, social networking sites such as Twitter and Facebook, television, newspapers, and radio.Suzanne Choney, “Internet, TV Main News Sources for Americans,” MSNBC, March 1, 2010, http://www.msnbc.msn.com/id/35607411/ns/technology_and_science-tech_and_gadgets/. On a typical day, 6 in 10 American adults get their news online, placing the Internet third behind local television news and national or cable television news.Suzanne Choney, “Internet, TV Main News Sources for Americans,” MSNBC, March 1, 2010, http://www.msnbc.msn.com/id/35607411/ns/technology_and_science-tech_and_gadgets/. The use of smartphone technology is contributing to the ease with which people can access online news; more than a third of cell phone owners use their phones to check for weather, news, sports, and traffic information.Suzanne Choney, “Internet, TV Main News Sources for Americans,” MSNBC, March 1, 2010, http://www.msnbc.msn.com/id/35607411/ns/technology_and_science-tech_and_gadgets/.
For young people in particular, the rise in social networking use is transforming the news from a one-way passage of information into a social experience. People log on to their Facebook or Twitter accounts, post news stories to their friends’ web pages, comment on stories that interest them, and react to stories they have recently read. During a survey of students at the University of Texas at Austin, senior Meg Scholz told researchers that she scanned news websites and blogs every time she went online to check her email, eliminating the need to pick up a newspaper or watch television news. “It’s not that I have anything against a printed newspaper,” she said. “But for my lifestyle the Internet is more accessible.”Peter Johnson, “Young People Turn to the Web for News,” Media Mix, USA Today, March 22, 2006, http://www.usatoday.com/life/columnist/mediamix/2006-03-22-media-mix_x.htm. Other Internet users appreciate the ability to filter news and information that is relevant to them; 28 percent of those surveyed said they customize their social networking home pages to include news from sources or on topics that interest them.Suzanne Choney, “Internet, TV Main News Sources for Americans,” MSNBC, March 1, 2010, http://www.msnbc.msn.com/id/35607411/ns/technology_and_science-tech_and_gadgets/. Researchers at the Pew Research Center’s Internet & American Life Project, the organization that conducted the survey, speculate that this personalization of news is a result of the constant stream of information in modern life. Pew Research Center Director Lee Rainie commented, “People feel more and more pressed about the volume of information flowing into their lives. So, they customize the information flow in order to manage their lives well and in order to get the material that they feel is most relevant to them.”Suzanne Choney, “Internet, TV Main News Sources for Americans,” MSNBC, March 1, 2010, http://www.msnbc.msn.com/id/35607411/ns/technology_and_science-tech_and_gadgets/. 
Although television remains the primary source of news for most Americans, Internet and mobile technology is changing how information is delivered to audiences, making the news more portable, more personalized, and more participatory.
Conduct a survey among your friends, family, and classmates to find out where they get their news on a regular basis. Then respond to the following short-answer questions. Each response should be a minimum of one paragraph.
In October 2009, 17-year-old child-care student Ashleigh Hall made friends with a handsome 19-year-old man on Facebook. Ashleigh, from Darlington, England, and her new friend began chatting online and exchanged mobile phone numbers so they could text each other. The excited teenager soon told her friends that she was going on a date with her new boyfriend, Pete, and that his father would be picking her up in his car. Unfortunately, Pete and his “father” were one and the same person—convicted rapist Peter Chapman. The 33-year-old homeless sex offender used his Facebook alter ego (which included photographs of an unknown teenage boy) to lure Ashleigh to a secluded location, where he raped and murdered her. Chapman was arrested by chance shortly after the event, and in court he pleaded guilty to kidnap, rape, and murder.
Ashleigh’s tragic story illustrates some disadvantages of modern media delivery: anonymity and unreliability. Although social networking sites such as Facebook are a convenient way to create new relationships and reconnect with old friends, there is no way of knowing whether users are who they claim to be, leaving people (particularly impressionable youths) vulnerable to online predators. Since much of the content on the Internet is unregulated, this lack of reliability spans the entire online spectrum, from news stories and Wikipedia articles to false advertising claims and unscrupulous con artists on websites such as Craigslist.
However, modern media can also work to mobilize efforts to stop crime. The popular NBC television series Dateline: To Catch a Predator followed police investigators who used Internet chat rooms to identify potential child molesters. Posing as young teens, police officers entered chat rooms and participated in conversations with various users. If an adult user began a sexual dialogue and expressed interest in meeting the teen for sexual purposes, the police set up a sting operation, catching the would-be pedophile in the act. In cases such as these, the rapid transmission of information and the global nature of the Internet made it possible for criminals to be apprehended.
If Ashleigh’s story highlights some of the most negative aspects of modern media, the quick dissemination of news and information is one of the most beneficial aspects of the World Wide Web. As we noted earlier in the chapter, speed can be a huge advantage of online media delivery. When a news story breaks, it can be delivered almost instantaneously through RSS feeds and via many major outlets, enabling people all over the world to learn about a breaking news story mere minutes after it happens.
Once an Internet user has paid for monthly service, most of the content on the web is free, allowing people access to an unlimited wealth of information via news websites, search engines, directories, and home pages for numerous topics ranging from cooking tips to sports trivia. When all this information became readily available at the touch of a button, many journalists and technology experts wrote articles claiming the information overload was bad for people’s health. Fears that the new technology would cause attention deficit disorder, stunt people’s reasoning, and damage their ability to empathize were raised by some highly respected publications, including The Times of London and The New York Times. However, there is no consistent evidence that the Internet causes psychological problems. In fact, statistics show that people who use social networking sites have better offline social lives, and that people who play computer games absorb and react to information more quickly than those who do not, with no loss of accuracy or increased impulsiveness.Vaughan Bell, “Don’t Touch That Dial!” Slate, February 15, 2010, http://www.slate.com/id/2244198/pagenum/all/. As Vaughan Bell points out in his article about the history of media scares, “Worries about information overload are as old as information itself, with each generation reimagining the dangerous impacts of technology on mind and brain.”Vaughan Bell, “Don’t Touch That Dial!” Slate, February 15, 2010, http://www.slate.com/id/2244198/pagenum/all/.
In addition to speed, reach, and cost, online media delivery enables a wider range of voices and perspectives on any subject. Through nontraditional media such as blogs and Twitter, people can put their own personal slant on current events, popular culture, and issues that are important to them without feeling obliged to remain neutral. A study by the Pew Research Center found that nontraditional media sources report on a wider variety of stories than traditional media, enabling individual sites to develop their own personality and voice. The study also discovered that these online sources focus on highly emotional subject matter that can be personalized by the writers and shared in the social forum.Pew Research Center, “New Media, Old Media,” May 23, 2010, http://pewresearch.org/pubs/1602/new-media-review-differences-from-traditional-press. By opening up blogs and social media sites to online discussion or debate, bloggers enable readers to generate their own content, turning audiences from passive consumers into active creators. In this way, knowledge becomes a social process rather than a one-way street—the blogger posts an opinion, a reader comments on the blogger’s opinion, the blogger then evaluates the reader’s comment and revises his or her perspective accordingly, and the process repeats itself until an issue has been thoroughly explored. Many bloggers also provide links to other blogs they support or enjoy reading, enabling ideas with merit to filter through various channels on the Internet.
Along with a growing number of online predators misrepresenting themselves on social networking sites, the Internet enables many other types of misinformation to circulate on the web. Unless users are able to distinguish reliable, unbiased sources from questionable ones, they may find themselves consuming inaccurate news reports or false encyclopedia entries. Even so-called reliable news sources are subject to occasional errors with their source material. When French composer Maurice Jarre died in 2009 at the age of 84, Irish sociology and economics student Shane Fitzgerald decided to try an experiment with Wikipedia, the collaborative, web-based encyclopedia that is freely edited by registered users. He added fictional quotes to Jarre’s Wikipedia entry and then watched as newspapers worldwide (including reputable sources such as The Guardian) copied his quotes word for word and attributed them to the composer. Red-faced journalists were later forced to correct their errors by retracting the quotes. Writing a follow-up report for The Irish Times, Fitzgerald commented, “If I could so easily falsify the news across the globe, even to this small extent, then it is unnerving to think about what other false information may be reported in the press.”J. Mark Lytle, “Wikipedia Hoax Shames Major Publishers,” TechRadar, May 10, 2009, http://www.techradar.com/news/internet/web/wikipedia-hoax-shames-major-publishers-597729.
Although most traditional media strive for nonpartisanship, many newer online sources are fervently right wing or left wing. With websites such as The Huffington Post on the left of the political spectrum and the Drudge Report on the right, consumers need to be aware when they are reading news with an ideological slant. Critics fear the trend toward social media sources may restrict the free movement of ideas: if consumers choose only media sources consistent with their own political biases, they will be limited to a narrow range of viewpoints.
Along with these practical disadvantages, the Internet also has economic drawbacks. A growing gap between people who can afford personal computers and web access and people who cannot, known as the digital divide, separates the haves from the have-nots. Although about 75 percent of U.S. households are connected to the Internet, gaps in access remain across age, income, and education. For example, a recent study found that 93 percent of people age 18–29 have Internet access, compared with 70 percent of people age 50–64 and just 38 percent of people over 65.Pew Research Center, “Demographics of Internet Users,” Pew Internet & American Life Project, January 6, 2010, http://www.pewinternet.org/Static-Pages/Trend-Data/Whos-Online.aspx. Similar disparities occur with income and education (see Figure 16.2).
These disparities mean that people with lower incomes and less education are at a disadvantage when it comes to accessing online job listings, information, news, and the computer-related skills that might help them in the workplace. The digital divide is even more prominent between developed and developing countries. In nations such as Jordan, Saudi Arabia, and Syria, the government permits little or no access to the Internet. In other countries, such as Mexico, Brazil, and Colombia, poor telecommunications infrastructure makes getting online slow and inconvenient. And in many developing countries with poor public utilities and intermittent electrical service, the Internet is almost unheard of. Despite its large population, the entire continent of Africa accounts for less than 5 percent of Internet usage worldwide.Internet World Stats, “Internet Usage Statistics,” http://www.internetworldstats.com/stats.htm.
Figure 16.2

The digital divide places people with lower incomes and lower educational levels at a disadvantage when it comes to Internet access.
Traditional media also face economic disadvantages when it comes to profiting from the Internet. Having freely given away much of their online content, newspapers are struggling to sustain an entirely ad-based business model. Although publishers initially envisioned a digital future supported entirely by advertising, two years of plummeting ad revenue (the Newspaper Association of America reported that online advertising revenues fell 11.8 percent in 2009) have caused some papers to consider introducing online fees. Although modern media delivery is quick and efficient, companies are still trying to establish an economic model that will keep them afloat in the long term.
Choose two online newspaper articles or blogs on the same subject, one from a liberal website such as the Huffington Post and one from a conservative website such as the Drudge Report. Read through both articles and underline examples of political bias or prejudice. Then answer the following short-answer questions. Each response should be a minimum of one paragraph.
What do your former high school classmates do for a living? What does your favorite celebrity think about the current administration? What do other professionals in your field think about industry trends? Which restaurant do your coworkers frequent? Five years ago, these questions would most likely have been met with blank stares, but thanks to the exponential growth of electronic media—social networking in particular—it is now possible to keep track of past and present contacts via the Internet, sometimes in exhaustive detail. As social media use continues to grow in popularity, marketers, advertisers, and businesses are looking for ways to use the new technology to increase revenue and improve customer service. Meanwhile, social networking sites are expanding into commerce, connecting businesses and consumers via third-party sites so that people can bring a network of friends to partner websites. Facebook Connect, for example, a technology that enables Facebook users to link their accounts with any partner website through an authentication method, lets a consumer visit a partner site such as Forever 21, find a pair of jeans on sale, and broadcast the information to everyone on her Facebook network. If a few Facebook friends do the same thing, the information can create an effective viral marketing campaign for the partner site. A more secure version of the ill-fated Beacon, Facebook Connect extends the Facebook platform beyond the social network’s walls, creating one giant network across the web.
The current trend toward immediacy (instant Twitter updates, instant Google searches, instant driving directions from Google Maps) is compounded by the development of smartphone applications, which allow users to access or post information wherever they happen to be. For example, a person shopping for a particular product can instantly compare its price across an entire range of stores using the Android ShopSavvy app, while someone new to an area can immediately locate a gas station, park, or supermarket using the iPhone’s AroundMe app. Industry insiders have coined the term nowism to describe the instant gratification that can be achieved by real-time content on the web. Sparked by social networking sites such as Facebook and Twitter, the real-time trend looks set to continue, with companies from all types of industries jumping on the immediacy bandwagon.
The growth of social media over the past few years has been exponential; according to Nielsen, Twitter alone grew 1,382 percent in February 2009, registering 7,000,000 unique visitors in the United States for the month. By February 2010, Twitter had 75,000,000 registered users and between 10,000,000 and 15,000,000 active tweeters.Sharon Gaudin, “Twitter Users Send 50 Million Tweets a Day,” Computerworld, February 23, 2010, http://www.computerworld.com/s/article/9161118/Twitter_users_send_50_million_tweets_a_day. Meanwhile, Facebook has more than 400 million active users worldwide, according to its website, with each user averaging 130 Facebook friends. In February 2010, Facebook was declared the web’s most popular site, with users spending an average of more than 7 hours a month on it, more than the time spent on Google, Yahoo!, YouTube, Amazon.com, Wikipedia, and MSN combined.Ben Parr, “Facebook Is the Web’s Ultimate Timesink,” Mashable (blog), February 16, 2010, http://mashable.com/2010/02/16/facebook-nielsen-stats/.
Figure 16.3

The average U.S. user spends more than 7 hours a month on social networking site Facebook.
Initially conceived in 2004 as a website for students to keep in touch over the Internet and get to know each other better, Facebook has since developed into the world’s largest social networking site. In addition to connecting friends and acquaintances and enabling users to share photos, links, and multimedia, the site (along with other social networking sites such as MySpace) has branched out into social gaming, a rapidly growing industry that allows users to download free games through the site and play online with friends and family members. Appealing to a wide demographic—including people who rarely play video games—social games such as FarmVille and Mafia Wars are free to play, but generate revenue for developers by offering additional bonuses or virtual goods for paying players. A recent survey found that most of the revenue generated by the social gaming audience comes from a small percentage of players (around 10 percent) who are willing to actually spend money on social networking games. Out of that 10 percent, just 2 percent of people, described as the “whales” of the social gaming industry, spend more than $25 a month on social games. Inside Network founder Justin Smith, who coauthored the survey, said, “It is clear that people either spend a lot of money or spend nothing.”Dean Takahashi, “Social Game ‘Whales’ are Big Spenders on Facebook, Survey Says” VentureBeat, June 22, 2010, http://venturebeat.com/2010/06/22/social-game-whales-are-big-spenders-on-facebook-survey-says/. The games, which primarily appeal to the female over-40 demographic, are designed so that Facebook users can spend a few minutes playing several times a day. In the United States, 55 percent of social network game players are women, and the average age is 48.Caleb Johnson, “Average Social Networking Gamer in the U.S.? Your Mom,” Switched, February 17, 2010, http://www.switched.com/2010/02/17/average-social-networking-gamer-in-the-u-s-your-mom/.
Other continuing trends in social networking include microblogging on sites such as Twitter, which is rapidly becoming the fastest source of news on the Internet. The site acts as a personal newswire, passing on information about shared world events as they affect people in real time. For example, when an earthquake shook Los Angeles in 2008, people began tweeting personal accounts from their homes 9 minutes before the Associated Press picked up the story. In 2009, citizens of Iran bypassed government censorship by tweeting news of the election results across the world. Organizations such as the Associated Press communicated with Twitter users to receive information about the resulting protests and demonstrations.Rebecca Santana, “Twittering the election crisis in Iran,” USA Today, June 16, 2009, http://www.usatoday.com/tech/world/2009-06-15-iran-twitter_N.htm.
Business owners are also beginning to realize the power of Twitter; online shoe merchant Zappos.com provides more than 500 of its employees with Twitter accounts to humanize the people behind the sales and help them connect with their customers. Feedback from Twitter users provides companies with valuable information about how they can improve their products and services. Celebrities have also attached themselves to Twitter as a means of publicizing forthcoming projects and keeping in touch with fans. Actor Ashton Kutcher is particularly media savvy; beating news outlet CNN to become the first Twitter user with more than 1,000,000 followers in 2009, the star used his popularity to raise awareness for medical charity Malaria No More, donating 10,000 mosquito nets to the organization following his success as Twitter’s first “millionaire.” Kutcher’s social media consultancy, Katalyst Films, maximizes the use of social networking technology by working with entertainment content, advertising, and online conversation in an effort to generate money from the web. “Entertainment, really, is a dying industry,” Kutcher said in a 2009 interview. “We’re a balanced social-media studio, with revenue streams from multiple sources—film, TV, and now digital. For the brand stuff, we’re not replacing ad agencies but working with everyone to provide content and the monetization strategies to succeed on the Web.”Ellen McGirt, “Mr. Social: Ashton Kutcher Plans to be the Next New-Media Mogul,” Fast Company, December 1, 2009, http://www.fastcompany.com/magazine/141/want-a-piece-of-this.html.
In addition to brand marketing and cross-promotions infiltrating social networking sites, digital experts predict social media will become more exclusive, with people filtering out clutter from unwanted sources. David Armano, senior vice president of Edelman Digital, said, “Not everyone can fit on someone’s newly created Twitter list and as networks begin to fill with noise, it’s likely that user behavior such as ‘hiding’ the hyperactive updaters that appear in your Facebook news feed may become more common.”David Armano, “Six Social Media Trends for 2010,” The Conversation (blog), Harvard Business Review, November 2, 2009, http://blogs.hbr.org/cs/2009/11/six_social_media_trends.html.
Armano’s prediction for social networking sites may filter across other areas of the web. Membership-only sites that cater to a specific audience are becoming increasingly popular. Based on e-commerce models such as Gilt and Rue La La, which sell luxury brand clothing at below-retail prices by invitation only, websites such as Thrillist offer exclusive clothing deals in addition to providing information on food, drink, entertainment, nightlife, and gadgets by subscription newsletter. Aimed at young, affluent male professionals, Thrillist reaches more than 2,200,000 subscribers across the United States and the United Kingdom, and surpassed $10,000,000 in revenue in 2010. Cofounder and CEO Ben Lerer believes that Thrillist represents the future of media. “It’s what modern media looks like,” he said. “Content plus commerce.”Ty McMahan, “Is Thrillist the Future of Media?” Speakeasy (blog), Wall Street Journal, May 13, 2010, http://blogs.wsj.com/speakeasy/2010/05/13/is-thrillist-the-future-of-media/. In 2010, Thrillist acquired members-only online retailer JackThreads.com, enabling the company to offer its user base exclusive access to JackThreads’ private shopping community as a benefit of subscribing.
Another highly targeted web trend is the emergence of micro magazines: digital subscription publications aimed at a specific audience, delivered via email or RSS feed, that attract advertisers wanting to reach a particular group of people. For example, Fearless is an online magazine entirely dedicated to stories of overcoming fear. Marketing expert Seth Godin believes that whereas publications such as Newsweek and Time are “slow and general, the world is fast and specific,” which creates a need for online subscription magazines that can provide targeted material to interested individuals.Seth Godin, “Micro Magazines and a Future of Media,” Seth Godin’s Blog, May 6, 2010, http://sethgodin.typepad.com/seths_blog/2010/05/micro-magazines-and-a-future-of-media.html. “The big difference is that instead of paying for an office building and paper and overhead, the money for an ad in a micro-magazine can go directly to the people who write and promote it and the ad itself will be seen by exactly the right audience,” Godin writes. The possibilities for micro magazines are endless, with focus topics covering every travel destination, interest group, and profession. Operating much like traditional subscription magazines, micro magazines are distributed via email or RSS and are supported by a forum or blog. This interactive aspect gives readers a sense of community: rather than passively consuming general-interest news, they are part of a network of readers who can communicate with others who share their interest.
In April 2009, Apple celebrated the 1 billionth download from its App Store. Launched in July 2008, the online venue for third-party iPhone and iPod Touch applications initially offered consumers 500 apps, ranging from shortcuts to websites such as Facebook and eBay to games and useful online services. Although competing smartphones such as the Treo and BlackBerry offered similar application facilities, Apple’s App Store quickly became the most successful platform for mobile software, averaging around $1,000,000 a day in iPhone application sales during the first month of its existence.Dianne See Morrison, “Apple’s App Store Sales Top $30 Million in First Month; Can Free Apps Make Developers Money?” Washington Post, August 11, 2008, http://www.washingtonpost.com/wp-dyn/content/article/2008/08/11/AR2008081100440.html. Under a revenue-sharing agreement, the company keeps 30 percent of any income generated and gives the other 70 percent to third-party app developers. By April 2011, the App Store offered around 350,000 applications, aiding iPhone and iPad users with numerous daily activities, ranging from identifying an unknown song, to finding a nearby gas station, to matching the color of a photograph taken by the iPhone with a database of paint colors. Unlike many commercials that exaggerate a product’s abilities, Apple’s tagline “There’s an app for that” is usually on the mark.
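The 70/30 revenue-sharing arithmetic described above is simple to work through. The short Python sketch below uses hypothetical sales figures (the $0.99 price and 10,000 downloads are illustrative, not drawn from the text) to show how the split plays out:

```python
def developer_payout(gross_revenue, developer_share=0.70):
    """Developer's cut under a 70/30 revenue-sharing agreement:
    the store keeps 30 percent, the developer receives 70 percent."""
    return gross_revenue * developer_share

# Hypothetical example: 10,000 downloads of a $0.99 app
gross = 10_000 * 0.99                      # $9,900 in gross sales
print(round(developer_payout(gross), 2))   # developer's share: 6930.0
```

Under this model, developers carry none of the store's distribution overhead but give up nearly a third of every sale in exchange.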
One recent trend in smartphone applications is the use of location-sharing services such as Foursquare, Gowalla, Brightkite, and Google Latitude. Utilizing the GPS function in modern smartphones, these apps enable users to “check in” to a venue so that friends can locate each other easily. The apps also encourage users to explore new places in their area by following other users’ suggestions on places to go. Users have the option of automatically updating their Facebook and Twitter accounts when they check in, and are able to earn points or badges according to how many times they check into a location, adding a competitive element to the service. Users with the most check-ins at a location become the “mayor” of that place, and some businesses offer rewards to users who achieve this status.
Although many apps stand alone, some are tied to other forms of media. For example, popular musical-comedy television show Glee has its own application that enables users to sing their favorite musical numbers from the show, upload their efforts to Facebook or MySpace, and invite friends to sing with them. The application also provides a voice-enhancing feature to correct users’ pitch and harmonize their voices while they sing. Other cross-media applications include game versions of television quiz shows Are You Smarter Than a 5th Grader? and Who Wants to Be a Millionaire?, apps for individual celebrities such as country singer Reba McEntire, and apps for television news channels, including CNN and MSNBC. Making life easier for users while providing them with endless entertainment options, apps have become a huge part of everyday life for many people; by June 2010, Apple’s App Store had generated total revenue of $1.4 billion.Philip Elmer-DeWitt, “App Store: 1% of Apple’s Gross Profit,” Fortune, CNN Money, June 23, 2010, http://tech.fortune.cnn.com/2010/06/23/app-store-1-of-apples-gross-profit/.
Poll a group of friends or colleagues about the amount of time they spend on social networking sites, and write a one- to two-page report on the answers to the following questions.
When a young waitress named Ashley was having a tough time at work, she decided to vent about her job on Facebook. The 22-year-old was working an overtime shift at a North Carolina pizza parlor and a demanding customer who had stayed late left a meager tip. Feeling frustrated, Ashley posted a short status update on her Facebook profile, calling the anonymous customer an unflattering name. Unfortunately for Ashley, her coworkers saw her post on the social networking site. Two days after her angry post, Ashley’s manager called her in to show her a copy of her comments and promptly fired her.Jodi Lai, “Waitress Gets Fired After Facebook Rant About Bad Tipper,” National Post (Don Mills, Toronto), May 17, 2010, http://news.nationalpost.com/2010/05/17/waitress-gets-fired-after-facebook-rant-about-bad-tipper/. Ashley’s story is one of many examples of employers terminating their employees because of inappropriate comments or photographs on social networking sites; a study by Internet security firm Proofpoint found that 8 percent of companies have dismissed an employee for his or her behavior on social networking sites.Adam Ostrow, “Facebook Fired: 8% of US Companies Have Sacked Social Media Miscreants,” Mashable (blog), August 10, 2009, http://mashable.com/2009/08/10/social-media-misuse/. These cases highlight a blurring of personal and professional life in the Internet age, leaving many people uncomfortable with the notion that their employer can monitor what they say or do in their free time and use it as a reason for dismissal.
Since the passage of the USA PATRIOT Act, a statute enacted in the wake of the September 11, 2001, terrorist attacks that, as we have seen, extended the government’s surveillance powers over communication devices, privacy has become a fiercely controversial issue in the United States, with supporters arguing that the legal measures are necessary to prevent terrorist attacks and opponents claiming that the act infringes on civil liberties. The privacy issues raised by the USA PATRIOT Act, combined with the growing problem of identity theft and increased monitoring in the workplace, make privacy a greater concern now than ever before.
Figure 16.5

President George W. Bush signs the USA PATRIOT Act.
As we saw in Chapter 14 "Ethics of Mass Media", the USA PATRIOT Act has generated a huge amount of debate and controversy since its approval by President George W. Bush in October 2001. Signed into law with little debate or congressional review just 43 days after the September 11 attacks, the act’s provisions enable the government, with permission from a special court, to obtain roving wiretaps over multiple communication devices, seize suspects’ records without their knowledge, monitor an individual’s web surfing and library records, and conduct surveillance on a person deemed to be suspicious but without known ties to a terrorist group. Approving the House of Representatives’ decision to renew 16 of the act’s provisions in 2005, President Bush said, “The [USA] PATRIOT Act is essential to fighting the war on terror and preventing our enemies from striking America again. In the war on terror, we cannot afford to be without this law for a single moment.”CNN, “Patriot Act’s Fate Remains Uncertain,” December 15, 2005, http://www.cnn.com/2005/POLITICS/12/14/patriot.act/.
However, not everyone agrees with the former president’s opinion. While proponents of the act cite the need to disrupt or prevent terrorist attacks, New York City Council member Bill Perkins, who sponsored a 2004 resolution condemning the law, says, “The [USA] PATRIOT Act is really unpatriotic, it undermines our civil rights and civil liberties. We never give up our rights, that’s what makes us Americans.”Michelle Garcia, “N.Y. City Council Passes Anti-Patriot Act Measure,” Washington Post, February 5, 2004, http://www.washingtonpost.com/wp-dyn/articles/A13970-2004Feb4.html. Opposition to the USA PATRIOT Act sparked a wave of protest across the United States. More than 330 communities in 41 states passed resolutions condemning the act.Timothy Egan, “State of the Union: Opposing the Patriot Act,” BBC News, September 13, 2004, http://news.bbc.co.uk/2/hi/programmes/3651542.stm. Librarians in Detroit reported that Muslim children had stopped checking out books on Islam out of fear they were being monitored, while librarians in New Jersey and California shredded records and computer sign-up sheets in an attempt to thwart the legislation. While citizens can guard against invasions of privacy on the Internet by limiting the personal information they post and being careful about what they share, invasions of privacy through other lines of communication are more difficult to prevent. Despite fierce objections to the act, President Barack Obama signed an unamended 1-year extension of several key provisions of the PATRIOT Act (including the use of roving wiretaps) in 2010. In the near future, politicians will have to decide whether citizen protection is worth the loss of liberties in the United States.
The privacy issue has strayed well beyond government legislation; it affects anyone who is currently employed or even just looking for a job. When employers consider whether or not to hire an individual, they no longer need to rely on a résumé alone for pertinent information. A simple Google search often reveals that a potential employee has a social networking profile, and unless privacy settings have been put in place, the employer can access everything the candidate has posted online. A 2010 survey by CareerBuilder.com revealed that 53 percent of companies check out candidates’ profiles on social networking sites such as MySpace, Twitter, LinkedIn, and Facebook before deciding to employ them, and a further 12 percent of companies intend to review the social networking sites of potential employees in the future.Carrie-Ann Skinner, “Job Seekers, Watch Your Walls – Employers Check Facebook,” PC World, January 17, 2010, http://www.pcworld.com/article/186989/job_seekers_watch_your_walls_employers_check_facebook.html. Factors that can turn an employer against a candidate include evidence of drug use or drinking, discriminatory comments, and photographs deemed inappropriate or provocative. The survey also revealed that some candidates posted information on their social networking pages that proved they had lied on their résumés.
As we have seen, once employees are hired, they still need to be careful about what they post on social networking sites, particularly in relation to their jobs. Cheryl James, a hospital worker from Michigan, was fired in 2010 after she posted a message on Facebook describing a patient as a “cop killer” and hoping that he would “rot in hell.”Ronnie Dahl, “Oakwood Hospital Employee Fired for Facebook Posting,” MyFOXDetroit.com, July 30, 2010, http://www.myfoxdetroit.com/dpp/news/local/oakwood-hospital-employee-fired-for-facebook-posting-20100730-wpms. A few years earlier, Virgin Atlantic Airlines terminated 13 crew members for describing passengers as “chavs” (a derogatory British term similar to “white trash”). A Virgin spokesman commented, “There is a time and a place for Facebook. But there is no justification for it to be used as a sounding board for staff of any company to criticize the very passengers who pay their salaries.”Lawrence Conway, “Virgin Atlantic Sacks 13 Staff for Calling its Flyers ‘Chavs’,” Independent (London), November 1, 2008, http://www.independent.co.uk/news/uk/home-news/virgin-atlantic-sacks-13-staff-for-calling-its-flyers-chavs-982192.html.
Although employees might reasonably expect to be disciplined for using social networking sites on company time—a 2009 study discovered that 54 percent of U.S. companies have banned workers from using social networks during work hours—the issue of whether companies can influence how their employees behave in their private lives is a little trickier.Sharon Gaudin, “Study: 54% of Companies Ban Facebook, Twitter at Work,” Computerworld, October 6, 2009, http://www.computerworld.com/s/article/9139020/Study_54_of_companies_ban_Facebook_Twitter_at_work. The outcome of a 2009 federal court case in New Jersey may have some bearing on whether companies have the right to spy on their employees while the employees are on password-protected sites using non–work computers. The case, between restaurant employees Brian Pietrylo and Doreen Marino and managers at Houston’s in Hackensack, New Jersey, centered on a forum set up by Pietrylo on MySpace. The forum, which was password-protected and required an email invitation to join, made fun of the restaurant décor and patrons and included sexual jokes and negative comments about restaurant supervisors. Restaurant hostess Karen St. Jean, who had received an invitation to the forum, showed the supervisors the site and believed they found it amusing; however, the information was passed further up the management chain, and Pietrylo and Marino were fired. The restaurant claimed that the pair’s online posts violated policies set out in the employee handbook, including professionalism and a positive attitude. Marino and Pietrylo filed for unfair dismissal, claiming that the restaurant managers had violated their privacy under New Jersey law. Following a trial in June 2009, a federal jury agreed that the restaurant had violated state and federal laws that protect the privacy of web communications. 
The jury awarded Pietrylo and Marino a total of $3,400 in back pay and $13,600 in punitive damages.Charles Toutant, “Restaurateurs Invade Waiters’ MySpace,” New Jersey Law Journal, June 19, 2009, http://www.law.com/jsp/lawtechnologynews/PubArticleLTN.jsp?id=1202431575049.
Although the outcome of the New Jersey case may have some bearing on the use of social networking sites outside of work, employees should still exercise caution in the office. Companies are increasingly using technological advances to monitor Internet usage, track employees’ whereabouts through GPS-enabled cell phones, and even film employees’ movements via webcam or miniature video cameras. Lewis Maltby, author of workplace rights book Can They Do That?, says, “There are two trends driving the increase in monitoring. One is financial pressure. Everyone is trying to get leaner and meaner, and monitoring is one way to do it. The other reason is that it’s easier than ever. It used to be difficult and expensive to monitor employees, and now, it’s easy and cheap.”Laura Petrecca, “More Employers Use Tech to Track Workers,” USA Today, March 17, 2010, http://www.usatoday.com/money/workplace/2010-03-17-workplaceprivacy15_CV_N.htm. Whereas employees using their own equipment outside of work hours might have a reasonable expectation of privacy, the situation changes when using company property. Nancy Flynn, founder of training and consulting firm ePolicy Institute, said, “Federal law gives employers the legal right to monitor all computer activity. The computer system is the property of the employer, and the employee has absolutely no reasonable expectations of privacy when using that system.”Laura Petrecca, “More Employers Use Tech to Track Workers,” USA Today, March 17, 2010, http://www.usatoday.com/money/workplace/2010-03-17-workplaceprivacy15_CV_N.htm. Because this lack of privacy covers everything from instant messages sent to coworkers to emails sent from personal accounts when employees are logged onto the company network, the prudent action for employees to take is to separate their work life from their personal life as much as possible.
Social networking sites have come under fire in recent years for violating users’ privacy. In 2009, Facebook simplified its settings to keep up with the popularity of microblogging sites such as Twitter. One consequence of this action was that the default setting enabled status updates and photos to be seen across the entire Internet (see Chapter 11 "The Internet and Social Media" for more information about Facebook privacy settings). The social networking site has also come under criticism for a temporary glitch that gave users unintended access to their friends’ private instant messages, and for a new feature in 2010 that enabled the company to share private information with third-party websites. Although Facebook simplified its controls for sharing information by consolidating them on a single page and making it easier for users to opt out of sharing information with third-party applications, public concern prompted 14 privacy groups to file an unfair-trade complaint with the Federal Trade Commission (FTC) in May 2010.Warwick Ashford, “Facebook Stands Up to Privacy Coalition,” ComputerWeekly, June 21, 2010, http://www.computerweekly.com/Articles/2010/06/21/241663/Facebook-stands-up-to-privacy-coalition.htm. Congress is currently investigating whether more government regulation of social networking sites is necessary to protect people’s privacy.
Figure 16.6

Google Street View cars breached privacy by inadvertently collecting private communications data from unsecured Wi-Fi networks.
Other companies, including Google, are actively attempting to restore users’ privacy. In response to revelations that the company had accidentally captured and archived wireless data with its Google Street View cars (which are equipped with cameras to provide panoramic views of streets around the world), Google announced in 2010 that it was launching an encrypted search facility. The technology uses SSL (Secure Sockets Layer), a protocol for managing the security of message transmission on the Internet, to protect searches from being intercepted as they travel across the web. Users can activate the secure search facility by typing “https” at the beginning of the URL instead of “http.” Although the technology provides a measure of security (the search will not be archived in the computer’s history or appear in AutoFill during a subsequent search), it is not entirely private: Google still keeps a record of what people search for, and users must rely on the company’s promise not to abuse the data. However, if the encrypted search facility proves successful, it may become a model for social networking sites, which could offer encryption for more than just log-ins.
Visit the website located at http://www.eff.org/wp/effs-top-12-ways-protect-your-online-privacy. Read through the 12 tips and use them to evaluate your security on the Internet. How many of the tips do you already follow? What can you do to protect your privacy further? Keep these answers in mind as you respond to the following short-answer questions. Each response should be a minimum of one paragraph.
When the iPad went on sale in the United States in April 2010, 36-year-old graphic designer Josh Klenert described the device as “ridiculously expensive [and] way overpriced.”Connie Guglielmo, “Apple IPad’s Debut Weekend Sales May Be Surpassing Estimates,” Businessweek, April 4, 2010, http://www.businessweek.com/news/2010-04-04/apple-ipad-s-debut-weekend-sales-may-be-surpassing-estimates.html. The cost of the new technology, however, did not deter Klenert from purchasing an iPad; he preordered the tablet computer as soon as it was available and ventured down to Apple’s SoHo store in New York on opening weekend to be one of the first to buy it. Klenert, and everyone else who stood in line at the Apple store during the initial launch of the iPad, is described by sociologists as an early adopter: a tech-loving pioneer who is among the first to embrace new technology as soon as it arrives on the market. What causes a person to be an early adopter or a late adopter? What are the benefits of each? In this section you will read about the cycle of technology and how it is diffused in a society. The process and factors influencing the diffusion of new technology are often discussed in the context of a diffusion model known as the technology adoption life cycle.
Figure 16.7

Like other cultural shifts, technological advances follow a fairly standard diffusion model.
The technology adoption life cycle was originally observed in the technology diffusion studies conducted by rural sociologists during the 1950s. University researchers George Beal, Joe Bohlen, and Everett Rogers were looking at the adoption rate of hybrid seed among Iowa farmers in an attempt to draw conclusions about how farmers accept new ideas. They discovered that the process of adoption over time fit a normal growth curve pattern—there was a slow, gradual rate of adoption, then quite a rapid rate of adoption, followed by a leveling off of the adoption rate. Personal and social characteristics influenced when farmers adopted the use of hybrid seed corn; younger, better-educated farmers tended to adopt the new technology almost as soon as it became available, whereas older, less-educated farmers waited until most other farms were using hybrid seed before they adopted the process, or they resisted change altogether.
In 1962, Rogers generalized the technology diffusion model in his book Diffusion of Innovations, using the farming research to draw conclusions about the spread of new ideas and technology. Like his fellow farming-model researchers, Rogers identified five categories of participants: innovators, who tend to be experimentalists and are interested in the technology itself; early adopters, such as Josh Klenert, who are technically sophisticated and interested in using the technology to solve professional and academic problems; the early majority, who constitute the first part of the mainstream, bringing the new technology into common use; the late majority, who are less comfortable with the technology and may be skeptical about its benefits; and laggards, who are resistant to the new technology and may be critical of its use by others.Everett M. Rogers, Diffusion of Innovations, 4th ed. (New York: The Free Press, 1995).
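Because adoption over time fits a normal growth curve, Rogers’ five categories fall out of partitioning that bell curve at one and two standard deviations from the mean adoption time. The resulting percentages are not given in this chapter, but they are the shares conventionally attributed to Rogers’ model, and a short Python sketch (illustrative only) reproduces them from the normal distribution itself:

```python
from math import erf, sqrt

def norm_cdf(z):
    """Cumulative probability of the standard normal distribution."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Category boundaries, in standard deviations from the mean
# adoption time, following Rogers' partition of the bell curve.
bands = [
    ("innovators",     float("-inf"), -2),
    ("early adopters", -2, -1),
    ("early majority", -1, 0),
    ("late majority",   0, 1),
    ("laggards",        1, float("inf")),
]

for name, lo, hi in bands:
    lo_p = 0.0 if lo == float("-inf") else norm_cdf(lo)
    hi_p = 1.0 if hi == float("inf") else norm_cdf(hi)
    print(f"{name:15s} {100 * (hi_p - lo_p):5.1f}%")
```

The printed shares closely match the split usually quoted for Rogers’ model: roughly 2.5 percent innovators, 13.5 percent early adopters, 34 percent each for the early and late majorities, and 16 percent laggards. The cumulative form of the same curve, `norm_cdf`, is the slow-rapid-leveling S-shape the Iowa researchers observed.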
When new technology is successfully released in the market, it follows the technology adoption life cycle shown in Figure 16.7. Innovators and early adopters, attracted by something new, want to be the first to possess the innovation, sometimes even before discovering potential uses for it, and are unconcerned with the price. When the iPad hit stores in April 2010, 120,000 units were sold on the first day, primarily as a result of presales.Sam Oliver, “Preorders for Apple iPad Slow After 120K First-Day Rush,” Apple Insider, March 15, 2010, http://www.appleinsider.com/articles/10/03/15/preorders_for_apple_ipad_slow_after_120k_first_day_rush.html. Sales dropped on days 2 and 3, suggesting that demand for the device dipped slightly after the initial first-day excitement. Within the first month, Apple had sold 1,000,000 iPads, exceeding industry expectations.Jim Goldman, “Apple Sells 1 Million iPads,” CNBC, May 3, 2010, http://www.cnbc.com/id/36911690/Apple_Sells_1_Million_iPads. However, many mainstream consumers (the early majority) are waiting to find out just how popular the device will become before making a purchase. Research carried out in the United Kingdom suggests that many consumers are uncertain how the iPad will fit into their lives—the survey drew comments such as “Everything it does I can do on my PC or my phone right now” and “It’s just a big iPod Touch…a big iPhone without the phone.”Steve O’Hear, “Report: The iPad Won’t Go Mass Market Anytime Soon,” TechCrunch, May 12, 2010, http://eu.techcrunch.com/2010/05/12/report-the-ipad-wont-go-mass-market-anytime-soon/. The report, by research group Simpson Carpenter, concludes that most consumers are “unable to find enough rational argument to justify taking the plunge.”Steve O’Hear, “Report: The iPad Won’t Go Mass Market Anytime Soon,” TechCrunch, May 12, 2010, http://eu.techcrunch.com/2010/05/12/report-the-ipad-wont-go-mass-market-anytime-soon/.
However, as with previous technological advances, the early adopters who have jumped on the iPad bandwagon may ultimately validate its potential, helping mainstream users make sense of the device and its uses. Forrester Research notes that much of the equipment acquired by early adopters—laptops, MP3 players, digital cameras, broadband Internet access at home, and mobile phones—is shifting into the mainstream. Analyst Jacqueline Anderson, who works for Forrester, said, “There’s really no group out of the tech loop. America is becoming a digital nation. Technology adoption continues to roll along, picking up more and more mainstream consumers every year.”Jenna Wortham, “The Race to Be an Early Adopter of Technologies Goes Mainstream, a Survey Finds,” New York Times, September 1, 2009, http://www.nytimes.com/2009/09/02/technology/02survey.html. To cite just one example, in 2008 nearly 10 million American households added HDTV, an increase of 27 percent over the previous year.Jenna Wortham, “The Race to Be an Early Adopter of Technologies Goes Mainstream, a Survey Finds,” New York Times, September 1, 2009, http://www.nytimes.com/2009/09/02/technology/02survey.html. By the time most technology reaches mainstream consumers, it is more established, more user-friendly, and cheaper than earlier versions or prototypes. In June 2010, Amazon.com slashed the price of its Kindle e-reader from $259 to $189 in response to competition from Barnes & Noble’s Nook, and in 2012 it cut the price again to $79.Jeffry Bartash, “Amazon Drops Kindle Price to $189,” MarketWatch, June 21, 2010, http://www.marketwatch.com/story/amazon-drops-kindle-price-to-189-2010-06-21. Companies frequently reduce the price of technological devices once the initial novelty wears off, as a result of competition from other manufacturers or as a strategy to retain market share.
Although many people ultimately adapt to new technology, some are extremely resistant or unwilling to change at all. When Netscape web browser user John Uribe was repeatedly urged by a message from parent company AOL to switch to one of Netscape’s successors, Firefox or Flock, he ignored the suggestions. Despite being informed that AOL would stop providing support for the web browser service in March 2008, Uribe continued to use it. “It’s kind of irrational,” Mr. Uribe said. “It worked for me, so I stuck with it. Until there is really some reason to totally abandon it, I won’t.”Miguel Helft, “Tech’s Late Adopters Prefer the Tried and True,” New York Times, March 12, 2008, http://www.nytimes.com/2008/03/12/technology/12inertia.html. Uribe is a self-confessed late adopter—he still uses dial-up Internet service and is happy to carry on using his aging Dell computer with its small amount of memory. Members of the late majority make up a large percentage of the U.S. population—a 2010 survey conducted by the U.S. Census Bureau found that despite the technology’s widespread availability, 40 percent of households across the United States have no high-speed or broadband Internet connection, while 30 percent have no Internet at all.Lance Whitney, “Survey: 40 Percent in U.S. Have No Broadband,” CNET, February 16, 2010, http://news.cnet.com/8301-1035_3-10454133-94.html. Of the 32.1 million households in urban areas, the most common reason for not having high-speed Internet was a lack of interest in or need for the technology.Lance Whitney, “Survey: 40 Percent in U.S. Have No Broadband,” CNET, February 16, 2010, http://news.cnet.com/8301-1035_3-10454133-94.html.
Figure 16.8

The most common reason that people in both rural and urban areas do not have high-speed Internet is a lack of interest in the technology.
Experts claim that, rather than slowing down the progression of new technological developments, laggards in the technology adoption life cycle may help to control the development of new technology. Paul Saffo, a technology forecaster, said, “Laggards have a bad rap, but they are crucial in pacing the nature of change. Innovation requires the push of early adopters and the pull of laypeople asking whether something really works. If this was a world in which only early adopters got to choose, we’d all be using CB radios and quadraphonic stereo.”Miguel Helft, “Tech’s Late Adopters Prefer the Tried and True,” New York Times, March 12, 2008, http://www.nytimes.com/2008/03/12/technology/12inertia.html. He added that aspects of the laggard and early adopter coexist in most people. For example, many consumers buy the latest digital camera and end up using just a fraction of its functions. Technological laggards may be the reason that not every new technology becomes a mainstream trend (see sidebar).
Have you ever heard of the Apple Newton? How about Microsoft Bob? Or DIVX? For most people, the names probably mean very little because these were all flash-in-the-pan technologies that never caught on with mainstream consumers.
The Apple Newton was an early PDA, officially known as the MessagePad. Introduced by Apple in 1993, the Newton contained many of the features now popularized by modern smartphones, including personal information management and add-on storage slots. Despite clever advertising and relentless word-of-mouth campaigns, the Newton failed to achieve anything like the popularity enjoyed by most Apple products. Hampered by its large size compared to more recent equivalents (such as the PalmPilot) and its cost—basic models cost around $700, with more advanced models costing up to $1,000—the Newton was also ridiculed by talk show comedians and cartoonists because of the supposed inaccuracy of its handwriting-recognition function. By 1998, the Newton was no more. A prime example of an idea that was ahead of its time, the Newton was the forerunner to the smaller, cheaper, and more successful PalmPilot, which in turn paved the way for every successive mobile Internet device.
Even less successful in the late 1990s was DIVX, an attempt by electronics retailer Circuit City to create an alternative to video rental. Customers could rent movies on disposable DIVX discs that they could keep and watch for 2 days. They then had the choice of throwing away or recycling the disc or paying a continuation fee to keep watching it. Viewers who wanted to watch a disc an unlimited number of times could pay to convert it into a “DIVX silver” disc for an additional fee. Launched in 1998, the DIVX system was promoted as an alternative to traditional rental systems with the promise of no returns and no late fees. However, its introduction coincided with the release of DVD technology, which was gaining traction over the DIVX format. Consumers feared that the choice between DIVX and DVD might turn into another Betamax versus VHS debacle, and by 1999 the technology was all but obsolete. The failure of DIVX cost Circuit City a reported $114 million and left early enthusiasts of the scheme with worthless DIVX equipment (although vendors offered a $100 refund for people who bought a DIVX player).Nick Mokey, “Tech We Regret,” Digital Trends, March 18, 2009, http://www.digitaltrends.com/how-to/tech-we-regret/.
Another catastrophic failure in the world of technology was Microsoft Bob, a mid-1990s attempt to provide a new, nontechnical interface to desktop computing operations. Bob, represented by a logo with a yellow smiley face that filled the o in its name, was supposed to make Windows more palatable to nontechnical users. With a cartoon-like interface that was meant to resemble the inside of a house, Bob helped users navigate their way around the desktop by having them click on objects in each room. Microsoft expected sales of Bob to skyrocket and held a big advertising campaign to celebrate its 1995 launch. Instead, the product failed dismally because of its high initial sale price, demanding hardware requirements, and tendency to patronize users. When Windows 95 was launched the same year, its new Windows Explorer interface required far less dumbing down than previous versions, and Microsoft Bob became irrelevant.
Technological failures such as the Apple Newton, DIVX, and Microsoft Bob prove that sometimes it is better to be a mainstream adopter than to jump on the new-product bandwagon before the technology has been fully tried and tested.
As new technology reaches the shelves and the number of early majority consumers rushing to purchase it increases, mass media outlets are forced to adapt to the new medium. When the iPad’s popularity continued to grow throughout 2010 (selling 3,000,000 units within 3 months of its launch date), traditional newspapers, magazines, and television networks rushed to form partnerships with Apple, launching applications for the tablet so that consumers could directly access their content. Unconstrained by the limited amount of space available in a physical newspaper or magazine, publications such as The New York Times and USA Today are able to include more detailed reporting than they can fit in their traditional paper, as well as interactive features such as crossword puzzles and the use of video and sound. “Our iPad App is designed to take full advantage of the evolving capabilities offered by the Internet,” said Arthur Sulzberger Jr., publisher of The New York Times. “We see our role on the iPad as being similar to our traditional print role—to act as a thoughtful, unbiased filter and to provide our customers with information they need and can trust.”Andy Brett, “The New York Times Introduces an iPad App,” TechCrunch, April 1, 2010, http://techcrunch.com/2010/04/01/new-york-times-ipad/.
Because of Apple’s decision to ban Flash (the dominant software for online video viewing) from the iPad, some traditional television networks have been converting their video files to HTML5 in order to enable full television episodes to be screened on the device. CBS and Disney were among the first networks to offer free television content on the iPad in 2010 through the iPad’s built-in web browser, while ABC streamed its shows via an iPad application. The iPad has even managed to revive forms of traditional media that had been discontinued; in June 2010, Condé Nast announced the restoration of Gourmet magazine as an iPad application called Gourmet Live. As more media content becomes available on new technology such as the iPad, the iPod, and the various e-readers available on the market, it appeals to a broader range of consumers, becoming a self-perpetuating model.
Choose a technological innovation from the past 50 years and research its diffusion into the mass market. Then respond to the following short-answer questions. Each response should be a minimum of one paragraph.
Review Questions
As a result of rapid change in the digital age, careers in media are constantly shifting, and many people who work in the industry face an uncertain future. However, the Internet (and all the various technologies associated with it) has created numerous opportunities in the media field. Take a look at the following website and scroll down to the “Digital” section: http://www.getdegrees.com/articles/career-resources/top-60-jobs-that-will-rock-the-future/
The website lists several media careers that are on the rise, including the following:
Read through the description of each career, including the links within each description. Choose one career that you are interested in pursuing, research the skills and qualifications it requires, and then write a one-page paper on what you found. Here are some other helpful websites you might like to use in your research: