Extensive crowdsourcing study upends long-held platform design assumptions
By John Kaiser, Research News
A 21-month study on the dynamics of crowdsourcing shows that workers depend on collaborating with each other, upending platform designers’ long-held assumption that crowdsourcing tasks are performed in isolation.
“People spend far more time on a task, trying to get it right, than we might expect, given the pay rates for many tasks,” said MSR lead researcher Mary Gray, who presented the study during this week’s Conference on Human Computation & Crowdsourcing in San Diego. “Finding a network of support makes all the difference for people learning how to do crowdwork efficiently.”
In seeking to replicate the social support networks associated with traditional workplaces, crowdworkers motivate each other and create systems of collaboration to make even low-paying crowdwork pay off, the MSR research shows.
In the conference keynote Tuesday, “Crowdwork’s Invisible Engine: Valuing the Organic Collaboration in Crowdsourcing Labor Markets,” Gray explained how crowdworkers collaborate with members of their networks to manage administrative overhead and find lucrative tasks and reputable employers. But it’s the intangible benefits that crowdworkers derive from recreating real-world social connections that compelled researchers to look deeper into the underlying crowdsourcing ecosystem.
The study, co-authored with MSR researcher Siddharth Suri, focused on four platforms: Amazon Mechanical Turk, one of the largest micro-task platforms; Amara, a video captioning and translation service; LeadGenius, a sales-leads service; and UHRS, Microsoft’s micro-task platform.
The MSR researchers collected experiences from crowdworkers in the U.S. and India, amassing:
- 2,762 worker surveys.
- 138 India-based interviews and 52 U.S.-based interviews.
- Millions of workflows and bits of metadata.
[Figure: Self-reported location of study participants]
Crucially, the extent of crowdworker collaboration proved such a surprise that measuring it was not even a goal when the research began two years ago.
“We never would have measured task sharing if we didn’t learn about it through interviews,” the study states.
Gray and Suri’s findings have transformative implications for crowdsourcing APIs originally developed a decade ago on the assumption that crowdworkers mostly work alone, with no need or desire to communicate with each other. The MSR researchers argue that platforms are now “dependent on worker collaboration (they just don’t know it).” To grow and gain market share, platforms will need to “build in ways for workers to collaborate.”
“The challenge is how to get engineers to embrace technology that facilitates social interaction,” Gray says. “The effort so far has been to engineer the collaboration” instead of providing tools to enable social interaction.
Typical MTurk tasks include describing an image or highlighting purchased items on Walmart sales receipts, paying 3–5 cents per completed task. Higher-paying tasks include translating TED talks or transcribing recordings, which yield about $18 for work estimated to take an hour or more.
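The gap between these task types is easier to see as an effective hourly wage. A minimal sketch of that comparison, using the pay figures from the article; the per-task time for micro-tasks is an assumption for illustration, not a figure from the study:

```python
def hourly_rate(pay_per_task: float, minutes_per_task: float) -> float:
    """Effective hourly wage: pay per task times tasks completed per hour."""
    return pay_per_task * (60 / minutes_per_task)

# Micro-tasks: 3-5 cents each (from the article), assuming ~1 minute per task.
micro_low = hourly_rate(0.03, 1.0)   # effective $/hour at 3 cents per task
micro_high = hourly_rate(0.05, 1.0)  # effective $/hour at 5 cents per task

# Higher-paying work: about $18 for a task estimated at an hour or more,
# so $18/hour is the best case and the rate falls as the task runs longer.
transcription = hourly_rate(18.00, 60.0)

print(f"micro-tasks: ${micro_low:.2f}-${micro_high:.2f}/hour")
print(f"transcription: up to ${transcription:.2f}/hour")
```

Under the assumed one-minute task time, micro-task work comes out well under minimum wage, which is consistent with the study’s point that workers need support networks to make low-paying crowdwork pay off.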
The economics of crowdsourcing could soon extend beyond the existing base of service, technology, management, HR, scientific research, and arts/design industries, according to the MSR study, into real estate, banking, health care, education, leisure, and tourism.
Are these trends merely a precursor to total machine automation? On the contrary: the MSR study echoes the traditional economic view of innovation as a net job creator, in what it calls the “paradox of tech innovation’s last mile,” which creates markets rather than eliminating them.
For more computer science research news, visit ResearchNews.com.