Crowd computing

Crowd computing is a form of distributed work in which tasks that are difficult for computers to perform are handled by large numbers of humans distributed across the internet.

It is an overarching term encompassing tools that enable idea sharing, non-hierarchical decision-making, and the utilization of "cognitive surplus" - the capacity of the world's population to collaborate on large, sometimes global, projects. Crowd computing combines elements of crowdsourcing, automation, distributed computing, and machine learning.

Prof. Rob Miller of MIT further defines crowd computing as “harnessing the power of people out in the web to do tasks that are hard for individual users or computers to do alone. Like cloud computing, crowd computing offers elastic, on-demand human resources that can drive new applications and new ways of thinking about technology.”

History
The practice predates the internet. At the end of the 18th century, the British Astronomer Royal distributed computation sheets by mail, asking a dispersed crowd of workers to help create maps of the stars and the seas. In the United States, the government employed hundreds of "human computers", first on WPA projects during the 1930s and later on the Manhattan Project during the 1940s.

The arrival of the microchip in the second half of the twentieth century made using large crowds for mechanical computation less attractive. However, as the volume of data online grew, it became clear to companies such as Amazon and Google that there were some tasks humans were simply better at performing than machines.