Computational sociology is a branch of sociology that uses computationally intensive methods to analyze and model social phenomena.[6]
A practical and well-known example is the construction of a computational model in the form of an "artificial society", by which researchers can analyze the structure of a social system.
This approach has been used primarily to model or build explanations of social processes, and it depends on the emergence of complex behavior from simple activities.
The aim of this method was to find a workable accommodation between two extreme and opposing ontologies: reductionist materialism and dualism.[8]
In the post-war era, Vannevar Bush's differential analyser, John von Neumann's cellular automata, Norbert Wiener's cybernetics, and Claude Shannon's information theory became influential paradigms for modeling and understanding complexity in technical systems.
Following Émile Durkheim's call to analyze complex modern society sui generis,[10] post-war structural functionalist sociologists such as Talcott Parsons seized upon these theories of systematic and hierarchical interaction among constituent components to attempt to generate grand unified sociological theories, such as the AGIL paradigm.
By the late 1960s and early 1970s, social scientists used increasingly available computing technology to perform macro-simulations of control and feedback processes in organizations, industries, cities, and global populations.
These models used differential equations to predict population distributions as holistic functions of other systematic factors such as inventory control, urban traffic, migration, and disease transmission.[14][15]
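To give a flavor of this style of macro-simulation, the sketch below integrates a single logistic growth equation, dP/dt = rP(1 − P/K), with fixed Euler steps. It is an illustrative toy, not a reconstruction of any historical model, and all parameter values are arbitrary assumptions.

```python
# Minimal sketch of a differential-equation macro-simulation:
# logistic population growth dP/dt = r * P * (1 - P / K),
# integrated with fixed-step Euler. Parameters are illustrative only.

def simulate_logistic(p0: float, r: float, k: float,
                      dt: float = 1.0, steps: int = 200) -> list[float]:
    """Integrate dP/dt = r*P*(1 - P/K) and return the trajectory."""
    trajectory = [p0]
    p = p0
    for _ in range(steps):
        dp_dt = r * p * (1.0 - p / k)  # growth slows near capacity K
        p += dp_dt * dt                # Euler step
        trajectory.append(p)
    return trajectory

if __name__ == "__main__":
    # Hypothetical numbers: 1 million people, 3% growth, 10 million capacity.
    path = simulate_logistic(p0=1e6, r=0.03, k=1e7)
    print(f"population after 200 steps: {path[-1]:,.0f}")
```

Richer models of this era coupled many such equations together, so that population, resources, and pollution fed back on one another.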
Although simulations of social systems received substantial attention in the mid-1970s after the Club of Rome published reports predicting that policies promoting exponential economic growth would eventually bring global environmental catastrophe,[16] the inconvenient conclusions led many authors to try to discredit the models and to portray the researchers themselves as unscientific.[2]
Research organizations explicitly dedicated to the interdisciplinary study of complexity were also founded in this era: the Santa Fe Institute was established in 1984 by scientists based at Los Alamos National Laboratory, and the BACH group at the University of Michigan likewise started in the mid-1980s.
This cellular automata paradigm gave rise to a third wave of social simulation emphasizing agent-based modeling.
Like micro-simulations, these models emphasized bottom-up designs but adopted four key assumptions that diverged from microsimulation: autonomy, interdependency, simple rules, and adaptive behavior.
In 1981, Robert Axelrod and W. D. Hamilton published a major paper in Science titled "The Evolution of Cooperation", which used an agent-based modeling approach to demonstrate how social cooperation based upon reciprocity can be established and stabilized in a prisoner's dilemma game when agents follow simple rules of self-interest.[4]
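The flavor of that result can be illustrated with a toy iterated prisoner's dilemma, sketched below with the standard payoff values (3 for mutual cooperation, 1 for mutual defection, 5 and 0 for unilateral defection). This is a minimal illustration of reciprocity, not Axelrod and Hamilton's actual tournament code.

```python
# Toy iterated prisoner's dilemma illustrating reciprocity (tit-for-tat).
# Standard payoffs: mutual cooperation 3, mutual defection 1,
# defector against a cooperator 5, the exploited cooperator 0.

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return opponent_history[-1] if opponent_history else "C"

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=200):
    """Return total scores for two strategies over repeated rounds."""
    hist_a, hist_b = [], []          # each strategy sees the other's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)  # a reacts to b's past moves
        move_b = strategy_b(hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    print("TFT vs TFT:", play(tit_for_tat, tit_for_tat))     # (600, 600)
    print("TFT vs ALLD:", play(tit_for_tat, always_defect))  # (199, 204)
```

Reciprocal strategies prosper when paired with each other, which is the mechanism by which cooperation can stabilize in a population of self-interested agents.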
Throughout the 1990s, scholars like William Sims Bainbridge, Kathleen Carley, Michael Macy, and John Skvoretz developed multi-agent-based models of generalized reciprocity, prejudice, social influence, and organizational information processing.
Electronic records such as email and instant message records, hyperlinks on the World Wide Web, mobile phone usage, and discussion on Usenet allow social scientists to directly observe and analyze social behavior at multiple points in time and multiple levels of analysis without the constraints of traditional empirical methods such as interviews, participant observation, or survey instruments.[26]
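As a minimal sketch of how such records can be turned into network data, the snippet below builds a contact graph from hypothetical (sender, recipient) email pairs and ranks people by their number of distinct contacts; the log format and names are invented for illustration, and degree stands in for more sophisticated centrality measures.

```python
# Sketch: build a contact network from (sender, recipient) email log
# pairs and rank people by degree. The records are hypothetical.
from collections import defaultdict

email_log = [                      # invented example data
    ("alice", "bob"), ("alice", "carol"),
    ("bob", "alice"), ("dave", "alice"), ("carol", "bob"),
]

neighbors = defaultdict(set)       # undirected contact graph
for sender, recipient in email_log:
    neighbors[sender].add(recipient)
    neighbors[recipient].add(sender)

# Degree = number of distinct contacts; a crude centrality measure.
by_degree = sorted(neighbors, key=lambda p: len(neighbors[p]), reverse=True)
for person in by_degree:
    print(person, len(neighbors[person]))
```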
Content analysis has long been a traditional part of the social sciences and media studies.
Gender bias, readability, content similarity, reader preferences, and even mood have been analyzed with text mining methods applied to millions of documents.
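A minimal sketch of one such measure, assuming a deliberately naive bag-of-words representation, appears below: it computes Jaccard word-overlap similarity between two documents. Production text-mining pipelines use much richer features (TF-IDF, topic models, sentiment lexicons) over far larger corpora.

```python
# Sketch: crude content similarity via Jaccard overlap of word sets.
# Real text-mining pipelines use tokenizers, TF-IDF, topic models, etc.
import re

def words(text: str) -> set[str]:
    """Lowercase word tokens; a deliberately naive tokenizer."""
    return set(re.findall(r"[a-z']+", text.lower()))

def jaccard(doc_a: str, doc_b: str) -> float:
    """Word-set overlap |A & B| / |A | B| between two documents."""
    a, b = words(doc_a), words(doc_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

if __name__ == "__main__":
    d1 = "Agent-based models simulate societies from simple local rules."
    d2 = "Simple local rules can generate complex social behavior."
    print(f"similarity: {jaccard(d1, d2):.2f}")
```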
These models would help us predict how societies might evolve over time and provide possible explanations of how social mechanisms work.
One model proposed by Epstein is agent-based simulation, which involves identifying an initial set of heterogeneous entities (agents) and observing their evolution and growth according to simple local rules.
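A compact instance of this recipe is sketched below as a Schelling-style segregation toy (a standard illustration, not Epstein's own model): heterogeneous agents of two types sit on a grid and follow one local rule, relocating whenever too few neighbors share their type. Grid size, vacancy rate, and the tolerance threshold are arbitrary assumptions.

```python
# Sketch of the agent-based recipe: heterogeneous agents + one local rule.
# A Schelling-style segregation toy with arbitrary, illustrative parameters.
import random

SIZE, EMPTY_FRAC, THRESHOLD = 20, 0.2, 0.4
random.seed(1)

def make_grid():
    """Random grid of 'A'/'B' agents with some empty (None) cells."""
    cells = [None if random.random() < EMPTY_FRAC else random.choice("AB")
             for _ in range(SIZE * SIZE)]
    return [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

def unhappy(grid, r, c):
    """Local rule: unhappy if under THRESHOLD of neighbors match my type."""
    me = grid[r][c]
    same = total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            nr, nc = (r + dr) % SIZE, (c + dc) % SIZE  # wrap-around grid
            if grid[nr][nc] is not None:
                total += 1
                same += grid[nr][nc] == me
    return total > 0 and same / total < THRESHOLD

def step(grid):
    """Move every unhappy agent to a random empty cell; count the moves."""
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE)
               if grid[r][c] is None]
    moved = 0
    for r in range(SIZE):
        for c in range(SIZE):
            if grid[r][c] is not None and unhappy(grid, r, c) and empties:
                er, ec = empties.pop(random.randrange(len(empties)))
                grid[er][ec], grid[r][c] = grid[r][c], None
                empties.append((r, c))
                moved += 1
    return moved

grid = make_grid()
for t in range(30):
    if step(grid) == 0:           # stop once every agent is content
        print(f"settled after {t} steps")
        break
```

Even with a mild preference threshold, the grid typically sorts itself into clusters of like agents, showing how a macro-level pattern emerges from a simple micro-level rule.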
Developing tools and applications to help analyze and visualize the data produced by these hybrid models is a further challenge.
Lawmakers and policymakers would be able to see efficient and effective paths to issuing new guidelines, and the public at large would be able to evaluate and gain a fair understanding of the options before them, enabling an open and well-balanced decision process.