[2] Here, data farming uses collaborative processes that combine rapid scenario prototyping, simulation modeling, design of experiments, high-performance computing, and analysis and visualization in an iterative "loop of loops."
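As a rough illustration of that loop (not an actual Project Albert or SEED Center toolchain; the model, helper functions, and parameter values below are hypothetical), the following sketch wires a toy stochastic simulation into a gridded design, replicated runs, and an outer refinement step:

```python
import itertools
import random
import statistics

# Toy stand-ins for the stages named above; a real study would substitute a
# domain simulation model, a formal experimental design, and HPC job submission.

def simulate(point, seed):
    """Toy stochastic model: noisy response to two input factors."""
    rng = random.Random(seed)
    return 2 * point["a"] + point["b"] + rng.gauss(0, 1)

def build_design(levels):
    """Brute-force gridded (full factorial) design over the given factor levels."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in itertools.product(*levels.values())]

def run_experiment(design, replications=10):
    """Inner loop: run every design point many times (embarrassingly parallel)."""
    return {tuple(p.items()): [simulate(p, s) for s in range(replications)]
            for p in design}

# Outer loop: analyze the results, then refine the scenario (here, simply
# narrowing the factor levels toward the best-performing design point).
levels = {"a": [1, 2, 3], "b": [10, 20, 30]}
for iteration in range(2):
    results = run_experiment(build_design(levels))
    best = max(results, key=lambda point: statistics.mean(results[point]))
    print(f"iteration {iteration}: best design point {dict(best)}")
    levels = {name: [value] for name, value in best}
```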
The term “data farming” is more recent: it was coined in 1998[4] in conjunction with the Marine Corps' Project Albert,[5] in which small agent-based distillation models (a type of stochastic simulation) were created to capture specific military challenges.
Initially, the use of brute-force full factorial (gridded) designs meant that the simulations needed to run very quickly and that the studies required high-performance computing.
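For a sense of scale (the factor counts and replication numbers below are hypothetical, not figures from Project Albert), the run count of a gridded design grows exponentially with the number of factors:

```python
from itertools import product

# Hypothetical study size: 8 input factors, 5 gridded levels each,
# 30 stochastic replications per design point.
factors, levels, replications = 8, 5, 30

design_points = levels ** factors            # 5**8 = 390,625 grid points
total_runs = design_points * replications    # 11,718,750 simulation runs
print(f"{design_points:,} design points, {total_runs:,} total runs")

# Enumerating the grid is trivial; executing millions of model runs is what
# demanded very fast simulations and high-performance computing.
first_point = next(product(range(levels), repeat=factors))
print("first design point (level indices):", first_point)
```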
The SEED Center for Data Farming[7] at the Naval Postgraduate School[8] also worked closely with Project Albert in model generation, output analysis, and the creation of new experimental designs to better leverage the computing capabilities at Maui and other facilities.
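The specific designs developed are not named here; as one hedged illustration of the general idea, a space-filling design such as a Latin hypercube can probe a many-factor input space with far fewer runs than a full factorial grid. The sketch below uses SciPy's qmc module and made-up factor ranges:

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical study: 8 continuous input factors, each with its own range.
# A 5-level full factorial grid would need 5**8 = 390,625 design points;
# a space-filling Latin hypercube probes the same space with a few hundred.
lower = np.zeros(8)
upper = np.array([10.0, 10.0, 5.0, 5.0, 100.0, 100.0, 1.0, 1.0])

sampler = qmc.LatinHypercube(d=8, seed=42)
unit_points = sampler.random(n=256)              # 256 points in the unit hypercube
design = qmc.scale(unit_points, lower, upper)    # rescale to the factor ranges

print(design.shape)                              # (256, 8): 256 runs, not 390,625
```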
The workshops have drawn participants from a diverse set of countries, including Canada, Singapore, Mexico, Turkey, and the United States.
Teams of data farmers are each assigned a specific area of study, such as robotics, homeland security, or disaster relief.