It also remains relevant in many software architectures today, including database engine designs and parallel computing frameworks.[citation needed]
Synchronous dataflow architectures are tuned to match the workload presented by real-time data path applications such as wire-speed packet forwarding.
Deterministic dataflow architectures enable programmers to manage complex tasks such as processor load balancing, synchronization, and access to common resources.
The research, however, never overcame a fundamental problem: instructions and their data dependencies proved to be too fine-grained to be distributed effectively across a large network.
If any practical machine based on data flow ideas and offering real power ever emerges, it will be very different from what the originators of the concept had in mind.[citation needed]
Designs that use conventional memory addresses as data dependency tags are called static dataflow machines.
Giving each dependency a unique tag allows non-dependent code segments in the binary to be executed out of order and in parallel.
In contrast to the conventional von Neumann architecture, data tokens are not permanently stored in memory; rather, they are transient messages that exist only while in transit to the instruction storage.
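The tagging and token mechanics described above can be illustrated with a short sketch. The following Python fragment is a minimal, hypothetical tagged-token interpreter, not the design of any actual dataflow machine; the names PROGRAM, OPS, and run are illustrative assumptions. Each instruction fires as soon as tokens for all of its input tags have arrived, independent instructions can fire in the same scheduling pass, and input tokens are consumed on firing rather than being stored permanently, mirroring the transient data tokens contrasted with von Neumann memory above.

```python
# Minimal tagged-token dataflow sketch (illustrative only, not any real machine).
# An instruction is a triple: (opcode, input tags, output tag).
PROGRAM = [
    ("add", ("a", "b"), "t1"),    # t1 = a + b
    ("mul", ("c", "d"), "t2"),    # t2 = c * d  (independent of the add)
    ("sub", ("t1", "t2"), "t3"),  # t3 = t1 - t2 (must wait for both results)
]

OPS = {
    "add": lambda x, y: x + y,
    "mul": lambda x, y: x * y,
    "sub": lambda x, y: x - y,
}

def run(initial_tokens):
    # tokens maps a tag to a value; entries are transient and are
    # consumed the moment the instruction that needs them fires.
    tokens = dict(initial_tokens)
    pending = list(PROGRAM)
    while pending:
        # An instruction is ready when every one of its input tags has a token.
        ready = [ins for ins in pending if all(t in tokens for t in ins[1])]
        if not ready:
            raise RuntimeError("no instruction can fire: missing input tokens")
        for op, inputs, out in ready:
            args = [tokens.pop(t) for t in inputs]   # consume input tokens
            tokens[out] = OPS[op](*args)             # emit the result token
            pending.remove((op, inputs, out))
    return tokens

print(run({"a": 1, "b": 2, "c": 3, "d": 4}))
```

Running the sketch prints only the final token, {'t3': -9}: every intermediate token is consumed when its consumer fires, and the add and mul, which share no tags, become ready in the same pass and could execute in parallel.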