
How do we know what we know?

March 24, 2014

While drinking a cappuccino in a very nice coffee shop recently, I overheard two students discussing research methods for their essays. Both of them believed that they had correctly identified the solution to a particular scientific – I think biomedical – problem, but neither was sure what evidence they could base it on. So one of them pulled out his mobile phone and tweeted the question. Within two minutes they had apparently received 38 responses: 21 of these suggested one particular source, 8 another, and the remaining 9 (according to one of the students) were ‘just spouting rubbish’. So the 21 were deemed to have the winning formula, and I believe that this is what both submitted in their essays.

It was, I suppose, a form of crowdsourcing. And of course this doesn’t just get used as a research tool for students. Last week we read that online crowdsourcing was used to identify the likely flight direction of the missing Malaysian flight MH370. Or how about Californian Assemblyman Mike Gatto, who is using Twitter to help him draft legislation which he would like to see enacted? Others again have taken to crowdsourcing to predict stock market movements. A cancer research charity is using crowdsourcing to analyse medical data.

For those still struggling with the validity or otherwise of using Wikipedia as a research tool, the ever more informal and broad-ranging methods of research made possible by the internet must seem a major challenge. In part this is because, increasingly, we are processing information supplied by large numbers of people about whose credentials we know, and seek to know, nothing at all; and yet we may trust what they advise us. This raises completely new questions about the validation of information and data.

In the past, when I was first doing research, our task was to acquire knowledge and, based on that knowledge, to carry out analysis, each step of which we could document and justify. If those were our intellectual tools, how shall we respond to a new age in which we throw questions into cyberspace and wait for an answer whose validity we cannot document beyond the volume of the response? Do we need to review the whole idea of what constitutes knowledge?

Crowdsourcing as academic methodology

October 18, 2011

Modern information and communications technology has allowed large groups of people in locations around the globe to participate in collective analysis and debate. The best known product of this approach is Wikipedia, which publishes information online that is written and edited by anyone who chooses to do so, expert or otherwise. While academics often advise caution in the use of such collective work as source material, it is clear that Wikipedia has become the reference of choice for people in all walks of life, including members of the academy itself.

Now researchers from George Mason University are assessing the use of non-expert crowds in making judgements about the likelihood or nature of future events. The initial context for this research is intelligence analysis, but there may be scope for the development of crowdsourcing as a research method in other contexts as well. The underlying assumption is that large numbers of people, even those without advanced expertise in the subject, can, when their views are taken together, make more accurate judgements on certain topics than smaller groups of specialists. Such methods are unlikely to find a cure for cancer, but they may be useful in gaining political or other social science insights.
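To make that assumption a little more concrete, here is a minimal sketch (mine, not the researchers’) of the familiar wisdom-of-crowds arithmetic: if each respondent is independently right somewhat more often than chance, a simple majority of a large crowd can outperform a small panel of individually more accurate specialists. The accuracy figures and group sizes below are purely illustrative assumptions.

import random

# Illustrative simulation only: each respondent is assumed to answer
# independently and to be correct with a fixed probability.
def majority_accuracy(n_respondents, p_correct, trials=10000):
    # Estimate how often a simple majority of the group gets the answer right.
    wins = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < p_correct for _ in range(n_respondents))
        if correct_votes > n_respondents / 2:
            wins += 1
    return wins / trials

# A small panel of fairly accurate specialists...
print(majority_accuracy(5, 0.75))    # roughly 0.90
# ...versus a large crowd of non-experts only slightly better than chance.
print(majority_accuracy(301, 0.55))  # roughly 0.96

The point of this toy model is simply that the aggregation, not individual brilliance, is doing the work; it says nothing about whether real crowds satisfy the independence assumption.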

It is not that academic study is set to become a giant ‘ask-the-audience’ exercise, but larger groups contain both wisdom and knowledge that can be tapped more effectively than we have so far managed. It is at any rate worth some analysis.