Concurrency (computer science)

The design of concurrent systems often entails finding reliable techniques for coordinating their execution, data exchange, memory allocation, and execution scheduling to minimize response time and maximize throughput.[7]
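A common coordination technique of this kind is guarding shared state with a mutual-exclusion lock. The following minimal sketch (in Python; the names `counter` and `worker` are illustrative, not from the source) shows two threads coordinating their updates to a shared counter:

```python
import threading

counter = 0
lock = threading.Lock()  # coordinates access to the shared counter

def worker(iterations):
    global counter
    for _ in range(iterations):
        with lock:       # only one thread may update the counter at a time
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 200000: the lock makes the combined result deterministic
```

Without the lock, the read-modify-write in `counter += 1` can interleave between threads and lose updates, so the final value would vary from run to run.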

In the years since, a wide variety of formalisms have been developed for modeling and reasoning about concurrency.

Concurrent systems must often be designed to operate indefinitely, including automatic recovery from failure, and not terminate unexpectedly (see Concurrency control).

Because they rely on shared resources, concurrent systems generally require the inclusion of some kind of arbiter somewhere in their implementation (often in the underlying hardware) to control access to those resources.

The use of arbiters introduces the possibility of indeterminacy in concurrent computation, which has major implications for practice, including correctness and performance.
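This indeterminacy can be illustrated with a lock acting as a software arbiter (a sketch, not a hardware arbiter; the names `arbiter` and `use_resource` are hypothetical). The lock guarantees that accesses to the shared resource are serialized, but which thread wins each acquisition is left undetermined:

```python
import threading

arbiter = threading.Lock()  # software arbiter controlling access to a shared resource
order = []                  # records which thread won each acquisition

def use_resource(name):
    for _ in range(3):
        with arbiter:       # access is serialized, but who acquires next is indeterminate
            order.append(name)

threads = [threading.Thread(target=use_resource, args=(n,)) for n in ("A", "B")]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Each run yields some interleaving of A's and B's accesses; the arbiter
# guarantees mutual exclusion, not a particular order.
print(order)
```

Correctness therefore requires the program to behave acceptably under every interleaving the arbiter may produce, and contention at the arbiter is itself a performance cost.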

In cooperative models, threads of control explicitly yield their timeslices, either to the system or to another process.
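This cooperative style can be sketched with Python generators serving as threads of control that explicitly hand their timeslice back to a simple round-robin scheduler (the names `task`, `run`, and `log` are illustrative assumptions, not from the source):

```python
from collections import deque

log = []

def task(name, steps):
    for i in range(steps):
        log.append(f"{name}: step {i}")
        yield                # explicitly yield the timeslice back to the scheduler

def run(tasks):
    ready = deque(tasks)     # simple round-robin scheduler
    while ready:
        t = ready.popleft()
        try:
            next(t)          # resume the task until its next yield
            ready.append(t)  # it yielded voluntarily; schedule it again
        except StopIteration:
            pass             # task finished

run([task("A", 2), task("B", 2)])
print(log)  # A and B alternate: A step 0, B step 0, A step 1, B step 1
```

Because control transfers only at the explicit `yield` points, the interleaving here is fully deterministic, in contrast to preemptive scheduling, where a task may be suspended at any instruction.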

Parallelism vs concurrency