Precision Journalism

Precision journalism is the use of social and behavioral science research methods to gather and analyze data, bringing a level of rigor to journalistic work beyond anecdotal evidence. Although it can be practiced without computers, precision journalism is usually a subset of “computer-assisted reporting,” the catch-all term for anything from using the Internet for gathering information to developing newsroom intranets for sharing information among reporters. Another common term is “database journalism,” which focuses on gathering and analyzing large collections of government data.

Precision journalism may expand most in places with high concentrations of computers, where public records exist in electronic form, but internationally journalists practice it using any available techniques if they can get access to information and have sufficient training to carry out an analysis.

Origins

The term “precision journalism,” and the central idea behind it, were popularized by the 1973 book of the same name written by Knight-Ridder reporter Philip Meyer. He had discovered the journalistic potential of public opinion research and other social science methods during a sabbatical year at Harvard University in 1966–1967. He applied what he learned shortly thereafter by conducting groundbreaking surveys of participants in race riots in Detroit and Miami. He credits journalism educator Everette Dennis with coining the term “precision journalism” (Meyer 1991).

Perhaps the earliest precursor to precision journalism occurred on the US presidential election night of 1952. Walter Cronkite, then the Washington correspondent for CBS News, used a Univac I computer analysis of early returns to predict correctly that Dwight Eisenhower would easily win the presidency.

A pioneering use of the techniques was a 1969 Miami Herald study of the Dade County, Florida, criminal justice system. In 1972, the New York Times analyzed how arrest statistics differed across New York police precincts. A year later, the Philadelphia Inquirer investigative team worked with Meyer to quantify racial bias in criminal sentencing. In 1978, Meyer assisted Miami Herald reporters (one of whom, Rich Morin, later became survey research editor for the Washington Post) in an analysis of property tax records that revealed the undervaluation of expensive homes compared to more modest homes in the Miami area (Cox 2000). These US examples illustrate the initial pattern of major newspapers using advanced computing in a media system that fostered journalistic independence.

Despite a widely read book and successful models, precision journalism was slow to spread in its country of origin or elsewhere during the 1970s for two reasons. First, it required access to large mainframe computers like the Univac or IBM 360, expensive “big iron” hardware typically available only at large corporations or major universities. Second, most datasets interesting to journalists, such as court records, existed only on paper, requiring tedious hand-entry into a computer before analysis.

In the early 1980s, microcomputers spread first among hobbyists and then into small businesses. They also began to appear in government offices, leading to increased availability of machine-readable data that did not require keypunch data entry. Precision journalism was now positioned for international diffusion, but its adoption lagged behind the spread of the technologies that enabled it.

Diffusion

A tiny scattering of US reporters began using early microcomputers, including machines like the Apple, Atari, Kaypro, and IBM PC, for newsroom work. Some had purchased computers for home hobby use, but then began to realize how the machines could help with their reporting. For instance, a reporter in the Miami Herald state capital bureau in 1983 wrote a vote analysis program for his IBM PC in the computer language BASIC. It would take in a legislative roll-call vote and produce cross-tabulations of the data based on the political demographics of lawmakers (party, race, gender, leadership role, region, campaign funds from special interests, and the like).
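The core of such a program, a cross-tabulation of votes against one demographic trait, is simple enough to sketch in a few lines of modern Python. The roll-call data below are invented for illustration:

```python
from collections import Counter

# Hypothetical roll-call data: one (party, vote) pair per legislator.
votes = [
    ("D", "yes"), ("D", "yes"), ("D", "no"),
    ("R", "no"), ("R", "no"), ("R", "yes"), ("R", "no"),
]

# Cross-tabulate votes against party by counting each combination.
table = Counter(votes)

for party in ("D", "R"):
    yes, no = table[(party, "yes")], table[(party, "no")]
    print(f"{party}: {yes} yes / {no} no ({100 * yes / (yes + no):.0f}% yes)")
```

The same counting step, repeated for race, gender, region, and so on, yields the full set of cross-tabulations a reporter would scan for patterns.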

The profile of database reporting grew in the mid-1980s when Elliot Jaspin, a reporter for the Providence Journal, matched Rhode Island databases of school bus drivers, traffic violators, and drug arrests, finding bus drivers with histories of bad driving records or drug dealing. On another occasion, he used a computer to analyze 35,000 mortgages meant to help lower-income home buyers and discovered many of the best loans going to the children of senior state officials. His stories prompted changes in state licensing procedures for bus drivers and indictments for those who had abused the mortgage program, attracting the attention of investigative reporters around the country (Cox 2000).
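Jaspin's technique amounts to joining records from separate databases on a shared key and flagging the matches. A minimal sketch in Python, with invented license numbers and counts standing in for the Rhode Island records:

```python
# Hypothetical state records, each keyed on a shared license number.
bus_drivers = {"L100": "A. Smith", "L200": "B. Jones", "L300": "C. Lee"}
traffic_violations = {"L200": 7}   # license -> violation count
drug_arrests = {"L100": 2}         # license -> arrest count

# Match the databases on the shared key; keep drivers with a record.
flagged = {
    lic: (name, traffic_violations.get(lic, 0), drug_arrests.get(lic, 0))
    for lic, name in bus_drivers.items()
    if lic in traffic_violations or lic in drug_arrests
}
print(flagged)
```

In practice the hard part is rarely the join itself but obtaining the tapes or files and reconciling inconsistent identifiers across agencies.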

Interest in precision journalism skills exploded in 1989 when a young Atlanta Journal-Constitution reporter named Bill Dedman won a Pulitzer Prize for an investigative project called “The Color of Money.” Computer analysis of mortgage applications showed how banks in the Atlanta area were shunning predominantly African-American neighborhoods, refusing to approve home loans there even for families with good incomes. The stories forced Atlanta banks to stop the practice and prompted US newspapers to do similar analyses in their own areas.

Later that year, Jaspin joined the University of Missouri journalism school and created what became the National Institute of Computer-Assisted Reporting (NICAR), under the wing of Investigative Reporters and Editors (IRE), an organization based there. NICAR began training reporters in the techniques of using computers and database software to extract government data from magnetic tapes and analyze it for patterns. At about the same time, the first national conference of reporters focused on precision journalism took place at Indiana University. Centers at other universities followed.

Interest in and application of precision journalism grew rapidly in the United States during the 1990s. In 1992, IRE began computer-assisted reporting training sessions and panels at its annual conferences. The next year, NICAR held the first of what have become annual conferences that hundreds of reporters now attend. More than a thousand reporters now subscribe to the NICAR-L email listserv.

Methods and Applications

Notable precision journalism stories cover topics such as natural disasters, school performance, and crime, and apply a range of methods, including surveys, geographic information systems (GIS) software, financial data analysis, and cross-tabulation. Most of these applications of information technologies to news originated in the United States.

Precision journalism techniques have enhanced the coverage of natural and human-produced disasters. The Miami Herald won a Pulitzer in 1993 for a computer-aided analysis of the destruction patterns from Hurricane Andrew. Even more extensive computer work went into tracking the diaspora of victims of Hurricane Katrina in New Orleans in 2005, mapping the debris field from the 2003 loss of the space shuttle Columbia, and cataloguing the devastation of the September 11, 2001, attacks on New York City and Washington, DC.

To cover school performance, reporters have used statistical software like SPSS to create linear regression models that account for the effect of students’ poverty on standardized test scores. The same statistical techniques also have uncovered significant cases of teachers and administrators fraudulently inflating classroom test scores in order to qualify for salary bonuses.
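The approach can be sketched with ordinary least squares: regress test scores on poverty rates, then look for schools whose residuals, their distance above or below the fitted line, are unusually large. The school names and figures below are invented for illustration:

```python
# Hypothetical school data: (poverty rate %, mean test score).
schools = {
    "Adams":  (10, 82),
    "Baker":  (40, 70),
    "Custer": (70, 58),
    "Dewey":  (90, 75),   # scores far above what poverty predicts
}

xs = [p for p, _ in schools.values()]
ys = [s for _, s in schools.values()]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Ordinary least squares fit of score ~ poverty.
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

# Residuals: how far each school sits from the fitted line.
residuals = {name: s - (intercept + slope * p)
             for name, (p, s) in schools.items()}
```

A large positive residual (here, the invented “Dewey”) marks a school scoring well above expectation, which may signal genuine success or, as the fraud cases showed, manipulated scores worth a closer look.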

In criminal justice coverage, one striking example was the Dallas Morning News’s use of logistic regression models to examine racial bias in jury selection by prosecutors and defense attorneys. Curiously, the paper found that the two sides were biased in opposite directions, largely canceling out the effects of the bias. In Brazil, O Globo did a computer-aided study of the incarcerations of more than 700 violent criminals, revealing that most had been quietly released long before their sentences were up.

Newspapers and their network news partners in major cities around the world regularly conduct national public opinion polls using scientifically drawn random samples of respondents. Outlets such as the New York Times, the BBC World Service, Le Monde in Paris, El País in Madrid, the Yomiuri Shimbun in Japan, and Folha de São Paulo in Brazil practice this type of precision journalism.

Geographic information systems software allows newspapers and television stations to reveal crime patterns, show the impact of toxic waste sites on the poor, and examine overbuilding in areas prone to floods or fires, among other topics. Dutch papers, for instance, used GIS to map the results of the referendum on the European Union constitution.
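The scientifically drawn random samples behind such polls come with a quotable margin of error that depends only on the sample size and the observed proportion. A minimal sketch of the standard 95 percent formula:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of 1,000 respondents with an even split:
moe = margin_of_error(0.5, 1000)
print(f"+/- {100 * moe:.1f} points")
```

This is why polls of roughly 1,000 respondents are routinely reported with a margin of about plus or minus three percentage points.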

Financial data can uncover scandals involving the dates of stock transactions and trades favorable to insiders. Social network analysis software can show, for example, the relationships of those who make large donations to the political candidates they support. Other examples include the creation of a database of the personal finances of Brazilian legislators, and analyses of the uncounted votes in Florida’s controversial presidential election returns in 2000.

Precision journalism is not only for major investigative projects. Lighter examples include cross-tabulations of pet licenses to find the most popular breeds and names; the different traits men and women seek when placing personal ads; the pattern of parking tickets on college campuses; statistical ratings of the performance of players and teams; and a study of the profits of Mexican soccer teams.

International Spread

Precision journalism may have started in the United States, but the practice is spreading around the world. In 1997, IRE trainers held their first computer-assisted reporting workshops in Europe. The Danish International Center for Analytical Reporting (DICAR) then became an early proselytizer for precision journalism among international reporters. Since then, Global Investigative Journalism conferences in Copenhagen, Amsterdam, and Toronto have provided training in the use of spreadsheets, database programs, mapping, and statistics. Precision journalism trainers have conducted workshops in South Korea, China, and Nigeria, as well as in other countries in Latin America and Europe, including Bosnia. After attending the training sessions, reporters have gone on to do important stories despite the difficulty of getting public records in many countries. In a 2006 example, DICAR founder Nils Mulvad organized an international team of reporters to gather and publish a database revealing, for the first time, country-by-country details of the European Union’s 55-billion-euro farm subsidy program.

By the beginning of the twenty-first century, precision journalism had evolved from an exotic newsroom specialty into a mainstream practice among some reporters at all but the smallest US newspapers, as well as at television stations. Journalism schools have begun teaching the basics to students. Although institutions and practitioners have emerged slowly in other countries, precision journalism has become widespread. The use of precision journalism techniques can grow as more countries put their census, court, economic, election, and other data online. Ambitious international reporters can easily discover how the power of precision journalism is being used elsewhere and learn to use those tools in their own work.

References:

1. Cox, M. (2000). The development of computer-assisted reporting. Paper presented to the Newspaper Division, AEJMC Southeast Colloquium, March 17, University of North Carolina, Chapel Hill. At www.com.miami.edu/car/cox00.htm.
2. Houston, B. (2004). Computer-assisted reporting: A practical guide, 3rd edn. Boston: Bedford and St. Martin’s.
3. Meyer, P. (1973). Precision journalism. Bloomington, IN: Indiana University Press.
4. Meyer, P. (1991). The new precision journalism. Bloomington, IN: Indiana University Press.
