1. Methods Question (Required)
Two years ago, a natural disaster in a city in the southeastern United States set off a series of events that led to an outbreak of a rare tropical disease. During that period, hurricane flooding led to evacuations, a breakdown in communication systems, and the onset of the disease, which killed several hundred people in one week's time. During the first few days of the events, Twitter feeds were public and lively, and were mapped using Google. Rumors of the disease spread through social networking, but as official power supplies ran out and then came back sporadically, supplemented by rogue sources and private generators, questions about how access to communication systems was being controlled added to the tensions. The most vulnerable population included those with a specific sexually transmitted illness that had already damaged their immune systems. Contacting this portion of the population required accessing medical records and databanks for personal information, but contacting the at-risk population through social media risked a breach of confidentiality rules.

Design an experiment to study the way data, social networking, and information systems were used in the emergency, and to ensure that ethical guidelines can be established to guide future policies for the Centers for Disease Control and Prevention (CDC) and FEMA. Outline your project, including specific methodologies (quantitative and qualitative), and justify your choice of a method or methods. You may emphasize one approach over others, but you should make clear how that decision was made.
2. Theory Questions (Choose One)
In the decade since its launch in 2001, Wikipedia has become one of the most frequently used information resources on the Web. With versions in over 160 languages, millions of contributors, and a constantly expanding corpus of articles (currently over 11.5 million in English alone), it can be considered one of the great successes of "Web 2.0".
Yet it is also a focus of controversy. Numerous empirical studies and critiques have been published supporting different views (including a special issue of Episteme in 2009). Advocates of "mass collaboration" (e.g., Fallis, 2008, 2009) claim that Wikipedia is a good first-resort reference tool whose reliability and epistemic value are comparable to those of traditional reference sources like Encyclopedia Britannica. They note that legions of contributors monitor and correct errors and mischief within minutes, in an ongoing process of commons-based peer production (Benkler, 2002, 2007). Detractors, in contrast, argue that Wikipedia's volunteer "microcontributors" are no substitute for qualified subject-matter experts, that the "self-healing" quality of Wikipedia content is overrated (e.g., Duguid, 2006), and that Wikipedia may even pose a threat to knowledge itself (Keen, 2007).
For this question, write an essay explaining the consequences and implications of Wikipedia, whether positive or negative, for research and scholarship in information studies. Your essay might address, for example, questions of information quality, credibility, or authenticity; the role of expertise versus amateurism in knowledge production; the dynamic, emergent nature of Wikipedia content; the sheer scope of Wikipedia's ambitions as a comprehensive reference resource; or any other issues you think are most salient to the problem areas of information studies research and scholarship that you choose to address.
OR
Google Books: "A train wreck: a mishmash wrapped in a muddle wrapped in a mess" (Nunberg, 2009). Discuss.