Friday marked the final day of a long-running annual experiment in “citizen science” that offers lessons for how citizen journalists — and even mainstream media — can gather large volumes of ground-level data that can be mined for larger trends.
This annual citizen science effort is the Christmas Bird Count, which runs for several weeks in early winter. It uses tens of thousands of volunteers to collect widely dispersed data from hundreds of sites around the country. The National Audubon Society has been running it for over a century now, and it’s no publicity stunt. (I wrote more about this project in a Jan. 4 posting at NewAssignment.net.)
The scientific community takes this project very seriously because its data helps ornithologists understand bird population shifts, environmental pressures, serious threats to given species, and more. It has spawned similar data-pooling projects among birders, and is often cited by citizen science advocates as a prime example of how amateurs can work side by side with experts to share in a deeper scientific endeavor.
I think such citizen science projects offer valuable models that can be applied to citizen media projects:
- Rigorous data collection. The Bird Count uses carefully developed methodologies to avoid spoiling data with inaccurate or duplicate information. Likewise, citizen journalists can establish and disseminate guides for reporting and photography standards — especially regarding verifiable info such as names, quotes, attribution, numbers and the like.
- Pooling and verifying cumulative results. The sheer volume of overall data collected in the Bird Count ensures that, if any contaminated info does sneak in, it won’t unacceptably distort the final result. That’s an important lesson for citizen journalism sites, harking back to the journalistic principle of verifying information with multiple sources. Ideally, citJ projects should seek multiple confirmations of information — for example, requiring that assertions by one contributor be verified by others.
- Vetting amateurs. Even small hurdles like registration forms and minimal fees can weed out the unworthy, while extensive mandatory training can seriously raise the level of contributions (as well as the cost, unfortunately). It’s worth considering whether citJ sites might benefit from mandatory online tutorials, accuracy checklists or story forms to make sure vital info isn’t left out of submissions.
- Expert-amateur interaction. Most citizen science projects aim to pair the novice with either experienced amateurs or experts themselves, fostering mentoring relationships that ultimately improve the data. Why shouldn’t experienced citizen journalists (or professional journalists associated with new media or even mainstream media) provide the same mentoring? This could be done via workshops, in-the-field training, online editing, or other means. If the gains in media democratization aren’t enough for you, consider how the resulting bond with the community and its most active news-consuming members could pay off in loyalty to the news product.
Some smart journalists and citizen media types, like those at the Sunlight Foundation, are of course already running distributed reporting projects along these lines. And perhaps — thanks to the efforts of a bunch of bird lovers — they’ll have even better models for trying more of them.