User:Jshankroff/sandbox

Background
Over the past four decades, Computational Sociology has been introduced and has grown in popularity. It is used primarily for modeling or building explanations of social processes, and it depends on the emergence of complex behavior from simple activities. The idea behind emergence is that the properties of a larger system need not be properties of the components that the system is made of. The idea of emergence was introduced in the early twentieth century by the classical emergentists Alexander, Morgan, and Broad. They developed the concept in an attempt to find an acceptable accommodation between two extreme ontologies: reductionist materialism and dualism.

While emergence has played a valuable and important role in the foundation of Computational Sociology, not everyone agrees with its use. One major leader in the field, Epstein, doubted its usefulness because some aspects of it are unexplainable. Epstein argued against emergentism, claiming instead that it "is precisely the generative sufficiency of the parts that constitutes the whole's explanation."

Agent-based models have had a historical influence on Computational Sociology. These models first appeared in the 1960s and were used to simulate control and feedback processes in organizations, cities, and similar systems. During the 1970s, applications introduced the use of individuals as the main units of analysis and used bottom-up strategies for modeling behaviors. The last wave occurred in the 1980s. At this time, the models were still bottom-up; the only difference was that the agents interacted interdependently.
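The bottom-up approach described above can be illustrated with a minimal sketch (a hypothetical toy model written for illustration, not one taken from the literature): each agent follows only a simple local rule, yet a global pattern of local consensus emerges from their repeated interactions.

```python
import random

def run_majority_model(n=50, steps=200, seed=42):
    """Toy bottom-up agent model (illustrative only): each agent holds a
    binary opinion and, when picked, adopts the opinion of its two ring
    neighbors if they agree. Blocks of local consensus emerge from this
    purely local rule -- a simple instance of emergent behavior."""
    rng = random.Random(seed)
    opinions = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)  # pick one agent at random
        left = opinions[(i - 1) % n]
        right = opinions[(i + 1) % n]
        if left == right:  # adopt the neighbors' shared opinion
            opinions[i] = left
    return opinions

result = run_majority_model()
print(result)
```

No single agent is programmed to produce consensus blocks; the pattern arises only from the interaction of the agents, which is the sense of "emergence" discussed above.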