Does Google have a responsibility to help stop the spread of 9/11 denialism, anti-vaccine activism, and other fringe beliefs?
Andrew Wakefield, the British doctor who popularized the current anti-vaccination movement. Photograph by Peter Macdiarmid/Getty Images.
In its early days, the Web was often imagined as a global clearinghouse: a new type of library, with the sum total of human knowledge always at our fingertips. That much has happened, but with a twist. In addition to borrowing existing items from its vast collections, we, the patrons, can also deposit our own books, pamphlets, and other scribbles, with little or no quality control.
Such democratization of information-gathering—when accompanied by smart institutional and technological arrangements—has been tremendously useful, giving us Wikipedia and Twitter. But it has also spawned thousands of sites that undermine scientific consensus, overturn well-established facts, and promote conspiracy theories. Meanwhile, the move toward social search may further insulate regular visitors to such sites; discovering even more links found by their equally paranoid friends will hardly enlighten them. Is it time for some kind of a quality control system?
People who deny global warming, oppose the Darwinian account of evolution, refuse to see the causal link between HIV and AIDS, and think that 9/11 was an inside job have put the Internet to great use. Initially, the Internet helped them find and recruit like-minded individuals and promote events and petitions favorable to their causes. However, as so much of our public life has shifted online, they have branched out into manipulating search engines, editing Wikipedia entries, harassing scientists who oppose whatever pet theory they happen to believe in, and amassing digitized scraps of "evidence" that they proudly present to potential recruits.
While the anti-vaccination movement itself is not new (religious objections to vaccination date back to the early 18th century), the ease of self-publishing and search afforded by the Internet, along with a growing skepticism toward scientific expertise, has given the movement a significant boost. Thus, Jenny McCarthy, an actress who has become the public face of the anti-vaccination movement, boasts that much of her knowledge about the harms of vaccination comes from "the university of Google." She regularly shares her "knowledge" about vaccination with her nearly half-million Twitter followers. This is the kind of online influence that Nobel Prize-winning scientists can only dream of; Richard Dawkins, perhaps the most famous working scientist, has only 300,000 Twitter followers.
A new article in the medical journal Vaccine sheds light on the online practices of one such group: the global anti-vaccination movement, a loose coalition of rogue scientists, journalists, parents, and celebrities who think that vaccines cause disorders like autism, a claim that has been thoroughly discredited by modern science.
The Vaccine article contains a number of important insights. First, the anti-vaccination cohort likes to move the goal posts: As scientists debunked the link between autism and mercury (once present in some childhood inoculations but now found mainly in certain flu vaccines), most activists dropped the mercury theory and pointed instead to aluminum, or argued that kids received "too many too soon." "Web 2.0 facilitated the debate of these new theories in public forums before their merits could be examined scientifically; when they were studied, the theories were not supported," notes the Vaccine article.
Second, it isn't clear whether scientists can "discredit" the movement's false claims at all: Its members are skeptical of what scientists have to say, not least because they suspect hidden connections between academia and the pharmaceutical companies that manufacture the vaccines. (This, in itself, is ironic: In 2006 the British investigative reporter Brian Deer revealed that Andrew Wakefield, the British scientist who famously "showed" the connection between vaccination and autism in a now-retracted 1998 article in the Lancet, was himself handsomely compensated by trial lawyers preparing to sue the vaccine manufacturers.)
In other words, mere exposure to the current state of the scientific consensus will not sway hard-core opponents of vaccination. They are too invested in upholding their contrarian theories; some have consulting and speaking gigs to lose, while others simply enjoy the sense of belonging to a community, no matter how kooky.
Thus, attempts to influence communities that embrace pseudoscience or conspiracy theories by having independent experts or, worse, government workers join them (the much-debated antidote of "cognitive infiltration" proposed by Cass Sunstein, who now heads the Office of Information and Regulatory Affairs in the White House) won't work. Besides, as the Vaccine study shows, blogs and forums associated with the anti-vaccination movement are aggressive censors, swiftly deleting any comments that tout the benefits of vaccination.
What to do, then? Well, perhaps it's time to accept that many of these communities aren't going to lose core members no matter how much science or evidence is poured on them. Instead, resources should go into thwarting their growth by targeting their potential, rather than existing, members.
Today, anyone who searches for "is global warming real," "risks of vaccination," or "who caused 9/11?" on Google or Bing is just a few clicks away from joining one such community. Given that censorship of search engines is not an appealing or even particularly viable option, what can be done to ensure that users are made aware that the pseudoscientific advice they are likely to encounter is not backed by science?
The options aren't many. One is to train our browsers to flag information that may be suspicious or disputed. Every time a claim like "vaccination leads to autism" appears in our browser, that sentence would be marked in red, perhaps accompanied by a pop-up window advising us to check a more authoritative source. The trick here is to come up with a database of disputed claims that itself corresponds to the latest consensus in modern science, a challenging goal that projects like Dispute Finder are tackling head on. A rough sketch of the idea appears below.
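To make this concrete, here is a minimal sketch, written as a TypeScript browser content script, of how such flagging might work. Everything in it is hypothetical: the claims list, the pattern, and the authoritative source are placeholders standing in for the curated, consensus-tracking database that a project like Dispute Finder would have to supply.

```typescript
// Hypothetical content script for a "dispute flagging" browser extension.
// The claims list below is illustrative only; a real system would sync it
// from a curated database kept in step with the scientific consensus.

interface DisputedClaim {
  pattern: RegExp;             // text pattern that triggers the flag
  authoritativeSource: string; // where to point the reader instead
}

const DISPUTED_CLAIMS: DisputedClaim[] = [
  {
    pattern: /vaccin\w+ (causes?|leads? to) autism/i,
    authoritativeSource: "https://www.cdc.gov/vaccinesafety/",
  },
];

// Walk all text nodes on the page and wrap the first match in each node
// in a red <mark> whose tooltip points to a more authoritative source.
function flagDisputedClaims(root: Node): void {
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  const textNodes: Text[] = [];
  let node: Node | null;
  // Collect first: splitText() mutates the tree, which would confuse the walker.
  while ((node = walker.nextNode())) textNodes.push(node as Text);

  for (const textNode of textNodes) {
    for (const claim of DISPUTED_CLAIMS) {
      const match = textNode.data.match(claim.pattern);
      if (!match || match.index === undefined) continue;

      const mark = document.createElement("mark");
      mark.style.backgroundColor = "#fdd";
      mark.style.color = "#a00";
      mark.title = `Disputed claim; see ${claim.authoritativeSource}`;

      // splitText carves the matched span out into its own text node,
      // which we then replace with the red <mark> wrapper.
      const matched = textNode.splitText(match.index);
      matched.splitText(match[0].length);
      mark.appendChild(matched.cloneNode(true));
      matched.replaceWith(mark);
      break; // one flag per text node is enough for a sketch
    }
  }
}

flagDisputedClaims(document.body);
```

The highlighting itself is the easy part; as noted above, the real challenge is keeping the claims database current with the latest science.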
The second (and not necessarily mutually exclusive) option is to nudge search engines to take more responsibility for their index and exercise heavier curatorial control in presenting search results for issues like "global warming" or "vaccination." Google already has a list of search queries that send the most traffic to sites that trade in pseudoscience and conspiracy theories; why not treat them differently from normal queries? Whenever users are presented with search results that are likely to send them to sites run by pseudoscientists or conspiracy theorists, Google could simply display a prominent red banner asking users to exercise caution and to check a previously compiled list of authoritative resources before making up their minds. A sketch of this approach follows.
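Here, too, a sketch may help. The one below, again in TypeScript, imagines a search pipeline that leaves the ranked results untouched and merely attaches a caution banner to queries known to funnel users toward fringe sites. The query list, the resources, and the types are all invented for illustration; nothing here describes Google's actual systems.

```typescript
// Illustrative sketch of a search engine attaching a caution banner to
// queries known to send most of their traffic to pseudoscience sites.
// Queries, resources, and types are hypothetical placeholders.

interface SearchResponse {
  results: string[]; // the ordinary ranked result URLs, untouched
  banner?: {
    message: string;
    authoritativeSources: string[];
  };
}

// Sensitive queries mapped to vetted, authoritative resources.
const SENSITIVE_QUERIES = new Map<string, string[]>([
  ["risks of vaccination",
    ["https://www.who.int/health-topics/vaccines-and-immunization"]],
  ["is global warming real",
    ["https://climate.nasa.gov/evidence/"]],
]);

function search(query: string, rank: (q: string) => string[]): SearchResponse {
  const normalized = query.trim().toLowerCase();
  const sources = SENSITIVE_QUERIES.get(normalized);

  const response: SearchResponse = { results: rank(query) };
  if (sources) {
    // The ranking is left alone; only a prominent advisory is added,
    // mirroring the suicide-prevention notice described below.
    response.banner = {
      message:
        "Exercise caution: results for this query often lead to sites " +
        "that dispute the scientific consensus.",
      authoritativeSources: sources,
    };
  }
  return response;
}
```

The design point is that nothing is censored: the index and the ranking stay exactly as they were, and the only intervention is a clearly labeled advisory sitting above them.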
In more than a dozen countries, Google already does something similar for users who search for terms like "ways to die" or "suicidal thoughts," placing a prominent red note urging them to call the National Suicide Prevention Lifeline. It may seem paternalistic, but this is the kind of nonintrusive paternalism that might save lives without interfering with the search results. Of course, such a move might trigger conspiracy theories of its own (is Google shilling for Big Pharma, or for Al Gore?), but this is a risk worth taking if it can help thwart the growth of fringe movements.
Unfortunately, Google's recent embrace of social search, whereby links shared by our friends on Google's own social network suddenly gain prominence in our search results, moves the company in the opposite direction. It's not unreasonable to think that people who deny global warming or the benefits of vaccination are online friends with other denialists, in which case finding information that contradicts their views would become even harder. This is one more reason for Google to atone for its sins and ensure that subjects dominated by pseudoscience and conspiracy theories are given a socially responsible, curated treatment.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.