Natural computing

Computational paradigms studied by natural computing are abstracted from natural phenomena as diverse as self-replication, the functioning of the brain, Darwinian evolution, group behavior, the immune system, the defining properties of life forms, cell membranes, and morphogenesis.

The Zuse-Fredkin thesis, dating back to the 1960s, states that the entire universe is a huge cellular automaton which continuously updates its rules.

Conway's Game of Life is one of the best-known examples of cellular automata, shown to be computationally universal.
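
For illustration, one synchronous update of the Game of Life can be sketched in a few lines of Python; the toroidal (wrap-around) grid, its size, and the glider pattern below are illustrative choices, not details from this article:

```python
# Minimal sketch of one update step of Conway's Game of Life.
# The toroidal boundary and grid size are illustrative choices.

def life_step(grid):
    """Return the next generation of a 2D grid of 0s (dead) and 1s (alive)."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbours, wrapping around the edges.
            neighbours = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            # A live cell survives with 2 or 3 neighbours; a dead cell
            # becomes alive with exactly 3 neighbours.
            if grid[r][c] == 1:
                nxt[r][c] = 1 if neighbours in (2, 3) else 0
            else:
                nxt[r][c] = 1 if neighbours == 3 else 0
    return nxt

# A "glider" pattern on a 6x6 grid.
glider = [[0] * 6 for _ in range(6)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    glider[r][c] = 1
print(life_step(glider))
```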

Cellular automata have been applied to modelling a variety of phenomena such as communication, growth, reproduction, competition, evolution and other physical and biological processes.

Learning algorithms based on the backward propagation of errors (backpropagation) can be used to find optimal weights for a given network topology and a given set of input-output pairs.
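
A minimal sketch of the idea, assuming a small two-layer sigmoid network trained on the XOR input-output pairs (the topology, learning rate, and training data are illustrative assumptions, not taken from this article):

```python
# Illustrative backpropagation for a tiny 2-4-1 network trained on XOR.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input pairs
y = np.array([[0], [1], [1], [0]], dtype=float)              # target outputs (XOR)

# Weights and biases of a 2-4-1 network with sigmoid units.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the output error back through the layers.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # typically converges towards [[0], [1], [1], [0]]
```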

An evolutionary computation comprises a constant- or variable-size population of individuals, a fitness criterion, and genetically inspired operators that produce the next generation from the current one.

This process of simulated evolution eventually converges towards a nearly optimal population of individuals, from the point of view of the fitness function.
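
A minimal sketch of such an evolutionary loop, assuming a bit-string encoding and a simple "count the ones" fitness function (the population size, rates, and fitness below are illustrative choices):

```python
# Minimal genetic-algorithm sketch: evolve bit strings towards all ones.
import random

random.seed(1)
LENGTH, POP_SIZE, GENERATIONS = 20, 30, 60

def fitness(individual):
    return sum(individual)            # "OneMax": count the 1 bits

def tournament(pop):
    return max(random.sample(pop, 3), key=fitness)   # keep the fittest of three

def crossover(a, b):
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]          # one-point crossover

def mutate(ind, rate=0.02):
    return [bit ^ 1 if random.random() < rate else bit for bit in ind]

population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Genetically inspired operators produce the next generation from the current one.
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP_SIZE)]

print(max(fitness(ind) for ind in population))   # typically reaches 20 (all ones)
```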

Evolutionary programming originally aimed at creating optimal "intelligent agents" modelled, e.g., as finite state machines.[17][18]

Swarm intelligence,[19] sometimes referred to as collective intelligence, is defined as the problem-solving behavior that emerges from the interaction of individual agents (e.g., bacteria, ants, termites, bees, spiders, fish, birds) which communicate with other agents by acting on their local environments.

Ant algorithms have been successfully applied to a variety of combinatorial optimization problems over discrete search spaces.
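
A hedged sketch of the basic ant-colony loop on a small travelling-salesman instance; the distance matrix and all parameter values are invented for illustration:

```python
# Illustrative ant-colony optimization for a tiny travelling-salesman instance.
import random

random.seed(0)
# Symmetric distance matrix for four cities (invented).
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
n = len(dist)
pheromone = [[1.0] * n for _ in range(n)]
alpha, beta, rho, n_ants = 1.0, 2.0, 0.5, 10   # pheromone weight, heuristic weight, evaporation

def tour_length(tour):
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def build_tour():
    tour, unvisited = [0], set(range(1, n))
    while unvisited:
        cur, choices = tour[-1], list(unvisited)
        # Next city chosen with probability proportional to
        # pheromone^alpha * (1/distance)^beta on the connecting edge.
        weights = [(pheromone[cur][j] ** alpha) * ((1.0 / dist[cur][j]) ** beta)
                   for j in choices]
        nxt = random.choices(choices, weights=weights)[0]
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

best = None
for _ in range(50):
    tours = [build_tour() for _ in range(n_ants)]
    # Evaporate pheromone, then let each ant deposit an amount
    # inversely proportional to the length of its tour.
    pheromone = [[(1 - rho) * p for p in row] for row in pheromone]
    for tour in tours:
        deposit = 1.0 / tour_length(tour)
        for i in range(n):
            a, b = tour[i], tour[(i + 1) % n]
            pheromone[a][b] += deposit
            pheromone[b][a] += deposit
    best = min(tours + ([best] if best else []), key=tour_length)

print(best, tour_length(best))   # best tour found and its length
```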

Artificial life (ALife) is a research field whose ultimate goal is to understand the essential properties of living organisms[26] by building, within electronic computers or other artificial media, ab initio systems that exhibit properties normally associated only with living organisms.

Early examples include Lindenmayer systems (L-systems), which have been used to model plant growth and development.[27]
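
The rewriting mechanism itself is compact; a sketch using Lindenmayer's classic two-symbol "algae" system (a standard textbook example, not taken from this article):

```python
# Lindenmayer's "algae" L-system: every symbol is rewritten in parallel each step.
rules = {"A": "AB", "B": "A"}      # production rules
state = "A"                        # axiom
for _ in range(5):
    state = "".join(rules.get(symbol, symbol) for symbol in state)
    print(state)
# A -> AB -> ABA -> ABAAB -> ABAABABA -> ...
```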

Pioneering experiments in artificial life included the design of evolving "virtual block creatures" acting in simulated environments with realistic features such as kinetics, dynamics, gravity, collision, and friction.[28]

These artificial creatures were selected for their ability to swim, walk, or jump, and they competed for a common limited resource (controlling a cube).

This computational approach was further combined with rapid manufacturing technology to physically build the robots that had evolved virtually.

The first experimental realization of a special-purpose molecular computer was the 1994 breakthrough experiment by Leonard Adleman, who solved a 7-node instance of the Hamiltonian Path Problem solely by manipulating DNA strands in test tubes.[30]

DNA computations start from an initial input encoded as a DNA sequence (essentially a sequence over the four-letter alphabet {A, C, G, T}), and proceed by a succession of bio-operations such as cut-and-paste (by restriction enzymes and ligases), extraction of strands containing a certain subsequence (by using Watson-Crick complementarity), copy (by using the polymerase chain reaction, which employs the polymerase enzyme), and read-out.
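
To make the encoding concrete, here is a small software sketch of two of these notions: computing the Watson-Crick complement of a strand and extracting strands that contain a given subsequence (the example strands and probe are invented for illustration):

```python
# Sketch: strands as strings over {A, C, G, T}; Watson-Crick complementarity
# pairs A with T and C with G. Example strands are invented for illustration.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def wc_complement(strand):
    """Reverse (Watson-Crick) complement of a DNA strand."""
    return "".join(COMPLEMENT[base] for base in reversed(strand))

def extract(pool, probe):
    """Keep strands containing the subsequence that the probe would bind to."""
    target = wc_complement(probe)
    return [s for s in pool if target in s]

pool = ["ATGGCGTAC", "TTACGGATC", "GGCATGCAA"]
print(wc_complement("ATGC"))          # GCAT
print(extract(pool, probe="GTACGC"))  # strands containing GCGTAC
```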

A successful open air experiment in quantum cryptography was reported in 2007, where data was transmitted securely over a distance of 144 km.
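
As a hedged illustration of quantum key distribution in general (this toy simulation is not the specific protocol of the cited experiment; it follows the standard BB84 scheme, with classical random numbers standing in for quantum measurements):

```python
# Toy BB84 sketch: coin flips stand in for quantum states and measurements;
# real implementations transmit polarized photons over free space or fibre.
import random

random.seed(42)
N = 32   # number of transmitted qubits (toy value)

alice_bits = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # Alice's encoding bases
bob_bases = [random.choice("+x") for _ in range(N)]     # Bob's measurement bases

# If Bob measures in the same basis he recovers Alice's bit; otherwise his
# outcome is random (a coin flip stands in for the quantum measurement).
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: both parties keep only the positions where the bases agreed.
alice_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
bob_key = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]

assert alice_key == bob_key          # with no eavesdropper, the sifted keys match
print(len(alice_key), alice_key)     # roughly half the positions survive sifting
```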

Implementations of practical quantum computers are based on various substrates such as ion traps, superconductors, and nuclear magnetic resonance.

Already in the 1960s, Zuse and Fredkin suggested the idea that the entire universe is a computational (information processing) mechanism, modelled as a cellular automaton which continuously updates its rules.

Biochemical networks refer to the interactions between proteins, and they perform various mechanical and metabolic tasks inside a cell.

Other approaches to describing accurately and succinctly protein–protein interactions include the use of textual bio-calculus[46] or pi-calculus enriched with stochastic features.
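
As an illustration of what such stochastic features model (this is a generic Gillespie-style simulation of a reversible binding reaction, not the bio-calculus or pi-calculus formalisms themselves; the species counts and rate constants are invented):

```python
# Gillespie-style stochastic simulation of a reversible binding reaction:
#   A + B -> AB  (rate k_on),   AB -> A + B  (rate k_off).
import random

random.seed(0)
a, b, ab = 100, 100, 0          # molecule counts of A, B and the complex AB
k_on, k_off = 0.001, 0.1        # stochastic rate constants (invented)
t, t_end = 0.0, 50.0

while t < t_end:
    # Propensities of the two reactions in the current state.
    rates = [k_on * a * b, k_off * ab]
    total = sum(rates)
    if total == 0:
        break
    # The waiting time until the next reaction is exponentially distributed.
    t += random.expovariate(total)
    # Choose which reaction fires, with probability proportional to its propensity.
    if random.random() < rates[0] / total:
        a, b, ab = a - 1, b - 1, ab + 1     # A + B -> AB
    else:
        a, b, ab = a + 1, b + 1, ab - 1     # AB -> A + B

print(f"t={t:.1f}  A={a}  B={b}  AB={ab}")
```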

The history of synthetic biology can be traced back to the 1960s, when François Jacob and Jacques Monod discovered the mathematical logic in gene regulation.

Indeed, rapid assembly of chemically synthesized short DNA strands made it possible to generate a 5,386 bp synthetic genome of a virus.[49]

Alternatively, Smith et al. found about 100 genes that can be removed individually from the genome of Mycoplasma genitalium.

A third approach to engineering semi-synthetic cells is the construction of a single type of RNA-like molecule with the ability to self-replicate.

Another effort in this field is towards engineering multi-cellular systems by designing, e.g., cell-to-cell communication modules used to coordinate living bacterial cell populations.

One particular study in this area is that of the computational nature of gene assembly in unicellular organisms called ciliates.[52]

From the biological point of view, a plausible hypothesis about the "bioware" that implements the gene-assembly process was proposed, based on template-guided recombination.[57]

Many of the constituent research areas of natural computing have their own specialized journals and book series.

Figure caption: DNA tile self-assembly of a Sierpinski triangle, starting from a seed obtained by the DNA origami technique.[32]