Rebecca Blood: "Wide adoption of the Internet has fueled a resurgence of citizen science. Cornell's Project Feeder Watch employs 16,000 volunteers across North America who record their sightings on a website that will automatically ask them to double-check if they report sighting a bird that normally does not range in their area. In Canada, Frogwatch has set up systems for reporting and mapping observations so that volunteers can see the results of their input immediately. And Earthdive is working on a global scale, allowing recreational divers and snorkellers to record their experiences. Members can search and explore dives, snorkel trips, science logs, and personal experiences recorded all over the world. By including sightings of key indicator species during their dives and trips, Earthdive members are creating a daily global snapshot of the state of our planet's oceans." (http://www.worldchanging.com/archives/002974.html)
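The automatic double-check described above can be sketched as a simple range-validation rule: flag any sighting of a species outside its usual region and ask the observer to confirm. The species names, regions, and function names below are purely illustrative assumptions, not Project Feeder Watch's actual implementation.

```python
# Minimal sketch of an out-of-range sighting check, assuming a hypothetical
# lookup table mapping each species to the regions where it normally occurs.
NORMAL_RANGE = {
    "black-capped chickadee": {"northeast", "midwest", "northwest"},
    "painted bunting": {"southeast", "south-central"},
}

def review_sighting(species, region):
    """Return 'accepted' for an in-range report, or 'needs confirmation'
    to prompt the volunteer to double-check an unusual one."""
    if region in NORMAL_RANGE.get(species, set()):
        return "accepted"
    return "needs confirmation"

print(review_sighting("painted bunting", "southeast"))   # accepted
print(review_sighting("painted bunting", "northwest"))   # needs confirmation
```

The point of such a check is not to reject unusual reports outright (rarities do occur) but to add one extra verification step exactly where errors are most likely.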
"While there are dozens of Web sites and projects dedicated to the "citizen science" movement, they tend to fall under one of three categories.
- The first are computer programs that run on idle computers. These programs use the extra computing power for folding protein structures, looking for extraterrestrial life and everything in between.
- The second type involves a bit more work on the volunteers' part, asking them to map, photograph or identify geographical areas they know well.
- And, the third and most diverse category is the 'field reporter' category, where volunteers are encouraged to observe birds, insects or plants in their neighborhoods or backyards and contribute data to larger tracking projects."
Citizen Science as an alternative to centralized Big Science
In this NYT editorial, Aaron Hirsh first describes the inevitability of Big Science approaches as scientific communities grow, but argues in the second part that distributed citizen science is an alternative way to achieve bigness without centralization:
"if Big Science is what it takes to gather the truly precious data, what are we to do?
There is another way to extend our scientific reach, and I believe it can also restore some of what is lost in the process of centralization. It has been called Citizen Science, and it involves the enlistment of large numbers of relatively untrained individuals in the collection of scientific data. To return to our architectural metaphor, if Big Science builds the high-rise yet higher, Citizen Science extends outward the community of villages.
For me, an especially inspiring example of Citizen Science is the Audubon Society’s Christmas Bird Count. Every winter, from mid December to early January, tens of thousands of intrepid hobbyists fan out across North America, and together, they do their best to answer two basic questions: How many birds are there? And what kinds?
It’s a simple sort of data, to be sure, but it is nonetheless scientifically invaluable. The CBC dataset now covers 109 years, and this remarkable temporal extent, along with geographic range that spans the continent, enables scientists to address questions that would otherwise be as inaccessible as a Higgs boson. Just in the past few years, scientists have used the CBC dataset to track the emergence and impact of West Nile virus, to understand the ecological effects of competition between introduced species and to measure the shift that birds make toward the poles in response to global warming.
The CBC is surely a glowing exemplar, but there are many other cases of Citizen Science in action. Even underwater, a snorkeling citizen can serve science by taking down a few notes, which he’ll enter into a website when he returns to shore. The Internet is a natural medium for Citizen Science — widely distributed yet highly organized, it’s the ultimate network of villages — and an initiative known as The Encyclopedia of Life has called upon the world’s hobbyists to contribute their sundry discoveries to a Web-based library of species: one page per organism, featuring photos, taxonomy, natural history, even ecology.
Of course, Citizen Science won’t be very helpful in genome sequencing or particle physics. But it will be helpful — indeed, perhaps essential — for gathering a kind of data that will be increasingly important over the next few decades. Widespread networks of observers are especially well-suited to detecting global change — shifts in weather patterns; movements in the ranges of species; large-scale transformations of eco-systems — and that, unfortunately, is something we will need to know far more about if we are to mitigate and adapt to the fateful effects we are having on the planet.
In the end, though, what may be most important about Citizen Science is what it could mean for the relationship between citizens and science. When everyone is gathering data, that rather austere and forbidding tower becomes a shared human pursuit. In 1963, Alvin Weinberg, who was then the director of Oak Ridge, likened Big Science to the greatest monuments civilizations have ever built: the cathedrals of medieval Europe; the pyramids of Egypt.
But just as we build higher our temples of scientific investigation, so too should we strengthen their foundations, and broaden their congregations. Perhaps the new administration, which has already proven itself so skilled in using the Internet to coordinate broad networks of volunteers, ought to consider a national initiative in Citizen Science. It would provide us with timely data, and it would make us better citizens." (http://judson.blogs.nytimes.com/2009/01/13/guest-column-a-new-kind-of-big-science/)
Collaboration between experts and amateurs
Citizen science projects raise questions of data accuracy and of how experts and amateurs can best cooperate.
By Adam Glenn at http://www.poynter.org/column.asp?id=31&aid=116168
See our entry on Citizen Journalism for extra context:
"I think such citizen science projects offer valuable models that can be applied to citizen media projects:
1. Rigorous data collection. The Bird Count uses carefully developed methodologies to avoid spoiling data with inaccurate or duplicate information. Likewise, citizen journalists can establish and disseminate guides for reporting and photography standards -- especially regarding verifiable info such as names, quotes, attribution, numbers and the like.
2. Pooling and verifying cumulative results. The sheer volume of overall data collected in the Bird Count ensures that, if any contaminated info does sneak in, it won't unacceptably distort the final result. That's an important lesson for citizen journalism sites, harking back to the journalistic principle of verifying information with multiple sources. Ideally, citJ projects should seek multiple iterations of information -- for example, requiring that assertions by one contributor be verified by others.
3. Vetting amateurs. Even small hurdles like registration forms and minimal fees can weed out the unworthy, while extensive mandatory training can seriously raise the level of contributions (as well as the cost, unfortunately). It's worth considering whether citJ sites might benefit from mandatory online tutorials, accuracy checklists or story forms to make sure vital info isn't left out of submissions.
4. Expert-amateur interaction. Most citizen science projects aim to pair the novice with either experienced amateurs or experts themselves, fostering mentoring relationships that ultimately improve the data. Why shouldn't experienced citizen journalists (or professional journalists associated with new media or even mainstream media) provide the same mentoring? This could be done via workshops, in-the-field training, online editing, or other means. If the gains in media democratization aren't enough for you, how about the ways in which the resulting bond with the community and its most active news consuming members could pay off in loyalty to the news product?" (http://www.poynter.org/column.asp?id=31&aid=116168)
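Point 2 above, requiring that assertions by one contributor be verified by others, can be sketched as a simple confirmation threshold: a reported value is accepted only once enough independent contributors agree on it. The threshold and data format here are assumptions for illustration, not part of Glenn's article.

```python
from collections import Counter

def verified_value(reports, min_confirmations=2):
    """Return the value reported by at least `min_confirmations` independent
    contributors, or None if no value reaches that bar."""
    counts = Counter(reports)
    value, n = counts.most_common(1)[0]
    return value if n >= min_confirmations else None

# Three contributors report a bird count at the same site; one is an outlier.
print(verified_value([14, 14, 15]))  # 14
print(verified_value([14]))          # None: a single unconfirmed report
```

Majority confirmation of this kind mirrors the journalistic principle of multiple sourcing: no single contribution, however plausible, becomes part of the pooled record on its own.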
Citizen Science Differs from Crowdsourcing
"the distinction made between crowdsourcing and citizen science, by Yale-based astrophysicist and Galaxy Zoo founder Kevin Schawinski:
“We prefer to call this [Galaxy Zoo] citizen science because it’s a better description of what you’re doing; you’re a regular citizen but you’re doing science. Crowd sourcing sounds a bit like, well, you’re just a member of the crowd and you’re not; you’re our collaborator. You’re pro-actively involved in the process of science by participating.”
On comparisons between Galaxy Zoo and seti@home, stardust@home, etc., etc., etc.:
“Galaxy Zoo volunteers do real work. They’re not just passively running something on their computer and hoping that they’ll be the first person to find aliens. They have a stake in science that comes out of it, which means that they are now interested in what we do with it, and what we find.” (http://www.wikinomics.com/blog/index.php/2009/02/09/crowdsourcing-versus-citizen-science/)
The role of Citizen Science in participatory 'precision medicine'
"citizen science efforts have also begun to achieve something that is crucial to the future of personalized, or “precision,” medicine. A cornerstone of such medicine, according to a 2011 National Research Council (NRC) report, is a dense “knowledge network” (i.e., biobank), built by “mining” genomic, phenotypic, health, behavioral, and environmental data from many people. Indeed, former National Cancer Institute director John Neiderhuber has predicted a near future in which “every citizen” will contribute biosamples to biobanks and funnel health data into a centralized databank via biosensors linked to smartphones.
Yet, few will submit to being “mined” in this way. The NRC report notes that to realize such a vision, there must be a “gradual elimination of institutional, cultural, and regulatory barriers to widespread sharing of the molecular profiles and health histories of individuals.” But this emphasis on overcoming barriers neglects that such a knowledge network is necessarily also a social network, a network that connects people, whether they are the people who experience disease or the people who study it. The robustness of the knowledge network will depend on social connections—on the relationships, rules, and forms of trust that hold it together. Far from “barriers” to be removed, these are connections that must be built.
This has been well demonstrated by some citizen science projects, which stand or fall on the strength of the social networks that underlie them. Though far from perfect, these projects begin to sketch the outlines of an altered social contract between science and society—one that is open, participatory, and dependent on the collective energy of the community.
A key barrier to forging similar relationships in mainstream research is the dominant approach to informed consent. The consent process tends to treat participants as outsiders to the projects that depend on them. Signing the form is often the first and last interaction between participants and the research enterprise. These practices engender exceptionally weak social connections between donors and research: the focus is on mining information, not forging relationships.
Some have proposed that the ethical rules governing biobanking should encourage broad consent—or eliminate the need for it altogether. A better solution lies in the opposite direction. Mainstream research should invite forms of participation that treat donors as citizens committed to achieving a public good, not as repositories of informatic gold, ripe for mining. Thanks to continually evolving network technology, participants could easily remain connected to research, electing personalized privacy protections or providing additional study-specific data upon the researchers’ request via flexible web-portals. Such a system would recognize the right and the competency of participants who have given material from their bodies to govern that material’s future uses. It would give participants a means to do more than the minimum, and the authority to say no when research goals do not comport with their values or preferences.
By seeing how their contributions are used, participants would gain a window into knowledge-in-progress, opening the black box of research to reveal just how far there is to go, and that getting there depends on a collective effort. It would also offer a more dynamic—and more honest—picture of science, balancing hope with realism. It would clarify the essential roles (and responsibilities) of scientists, patients, and research participants in advancing science. It would move beyond a model that treats the asymmetry in expertise between scientists and citizens as a justification for asymmetry in research governance. And it would build public trust in science by entrusting the public with a central role in shaping the future of biomedical research.
Science and citizens should not occupy separate worlds: by strengthening the role of citizens in science, science too will be strengthened." (http://the-scientist.com/2012/01/24/opinion-occupy-science/)
How to increase the trustworthiness of Citizen Science data
"Citizen science occurs when data for scientific research is collected by members of the public in a voluntary capacity. Public participation in environmental projects, in particular, has been described as a global phenomenon.
But there is a stigma associated with these types of projects. The data collected are often labelled untrustworthy and biased. Research in this area continues to show, however, that data collected by what is essentially a non-professional workforce are comparable to those collected by professional scientists.
Provided steps are in place to deal with data integrity, we have much to gain by putting more trust in citizen scientists.
Across the globe thousands of people collect data on everything from counts of stars in distant galaxies to the timing of flowering events. Volunteers have long been collecting data on the health of coral reefs, and ornithologists encourage volunteers to collect data on bird migration.
Citizen science has benefits for scientists – including an inexpensive and potentially large labour force – and citizens, who get knowledge and fulfilment. These schemes expose people to the environment and develop the stewardship ethic.
But what motivates my interest in this area is the potential to create a more scientifically literate society; building the capacity for people to take information they receive in their everyday lives and then being able to make informed choices based on what they have learned. Those choices could be anything from the products they buy as consumers to the political parties they support.
While citizen science projects vary in their study subjects, the ecosystem of interest and objectives of the research, they all face one important challenge that jeopardises their success, sustainability, integrity and in general, acceptance in the wider scientific community. Many scientists question the quality, reliability and in general, the utility of data.
Evidence for the stigma associated with citizen science projects comes from different sources. One was a long standing community-based program to survey diversity of bird species. Researchers found the estimated numbers of birds changed through time simultaneously with changes in the observers. It was concluded that the trends detected were not likely to represent real changes in bird abundance, but were more likely due to prejudices of the individual observers.
The negative perception of citizen projects is not new. Twenty years ago, the use of volunteer data came into the international spotlight when an amendment was made to prohibit the US National Biological Survey from accepting the work of volunteers. This was supported by two arguments in the House of Representatives declaring that volunteers are incompetent and biased.
Are volunteers’ data that bad?
Questions over data integrity continue to this day. It’s surprising, because a growing body of literature shows that data collected by citizens are comparable to those of professional scientists.
For example, researchers have detected no differences between field samples of aquatic invertebrates that were collected and identified by volunteers and professionals. A similar study showed that data collected by volunteers and scientists agreed 96% of the time.
Both studies concluded that volunteers could collect reliable data and make assessments that were comparable to those made by highly trained professionals.
My own research on vegetation metrics collected as part of ecosystem restoration projects also showed that the degree of agreement of data collected by volunteers can be as good as those recorded by professional scientists.
Results showed that scientists as a collective group collected data that was in closer agreement with “the truth” than those of volunteers. But when data collected by individuals were analysed, some volunteers collected data that were in similar or closer agreement to the truth, than scientists. Both groups’ estimates were in closer agreement for particular attributes than others, also suggesting that some attributes are more difficult to estimate or are more subjective than others.
An important message from these studies is that data-integrity issues can occur. But it’s just a matter of honing in on those particular issues and addressing them if necessary. This can be through training to improve skill sets or calibrating data where possible.
It should not be a case of blaming the citizens. The scientist behind such programs should have checks in place – citizen science project or otherwise!" (http://blog.okfn.org/2013/01/23/citizen-science-can-produce-reliable-data/)
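The agreement figures cited in the studies above (e.g. the 96% agreement between volunteers and scientists) reduce to a simple statistic: the share of paired observations on which both groups recorded the same value. A minimal sketch, with invented sample data:

```python
def percent_agreement(volunteer_obs, scientist_obs):
    """Percentage of paired observations where the volunteer and the
    professional scientist recorded the same value."""
    if len(volunteer_obs) != len(scientist_obs):
        raise ValueError("observation lists must be paired")
    matches = sum(v == s for v, s in zip(volunteer_obs, scientist_obs))
    return 100.0 * matches / len(volunteer_obs)

# Hypothetical presence/absence records for five survey sites.
volunteers = ["present", "present", "absent", "present", "absent"]
scientists = ["present", "present", "absent", "absent", "absent"]
print(percent_agreement(volunteers, scientists))  # 80.0
```

Raw percent agreement is the simplest such measure; published comparisons often also use chance-corrected statistics, since two observers can agree by luck alone when one category dominates.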
- Good examples are listed here at http://www.thetakeaway.org/stories/2009/apr/09/slideshow-citizen-science-projects/
- Science Cheerleader has a directory of amateur science activity
Some other examples:
- Distributed Proofreaders for Project Gutenberg, at http://www.pgdp.net/
- Great Internet Mersenne Prime Search, http://www.mersenne.org/prime.htm
- NASA Clickworkers, http://clickworkers.arc.nasa.gov/top
- SETI http://setiathome.ssl.berkeley.edu
- Christmas Bird Count
- Foldit: Citizen Science project in the field of biology
- Project Implicit: Citizen Science project in the field of psychology
- Galaxy Zoo: Citizen Science project in the field of astronomy
See our entry on Communal Validation