Event-driven architectures are evolutionary in nature and provide a high degree of fault tolerance, performance, and scalability.
This architectural pattern is applied in the design and implementation of applications and systems that transmit events among loosely coupled software components and services. The physical implementation of event channels can be based on traditional components such as message-oriented middleware or point-to-point communication, which might require a suitable transactional framework.
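As an illustration, the sketch below models an event channel as a minimal in-process publish-subscribe broker: producers and consumers share only a topic name, never references to each other. The class, topic, and field names are illustrative, not the API of any particular middleware.

```python
# Minimal in-process event channel: producers publish to a topic,
# loosely coupled consumers subscribe without knowing the producers.
# All names here are illustrative, not a specific middleware API.
from collections import defaultdict
from typing import Callable

class EventChannel:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)  # a real broker would deliver asynchronously

channel = EventChannel()
channel.subscribe("order.placed", lambda e: print("shipping sees", e))
channel.publish("order.placed", {"order_id": 42, "total": 99.5})
```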
Building systems around an event-driven architecture simplifies horizontal scalability in distributed computing models and makes them more resilient to failure.[3] New events can be initiated anywhere and, more importantly, they propagate across the network of data stores, updating each as they arrive.
Adding extra nodes becomes trivial as well: you can simply take a copy of the application state, feed it a stream of events and run with it.
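A minimal sketch of that bootstrapping idea, assuming a simple key-value application state and invented event shapes: a new node starts from an empty (or snapshotted) copy of the state and catches up by replaying the stream in order.

```python
# Sketch of bootstrapping a new node by replaying an event stream.
# The event shapes and apply() logic are assumptions for illustration.
def apply(state: dict, event: dict) -> dict:
    if event["type"] == "deposit":
        state[event["account"]] = state.get(event["account"], 0) + event["amount"]
    elif event["type"] == "withdraw":
        state[event["account"]] = state.get(event["account"], 0) - event["amount"]
    return state

event_stream = [
    {"type": "deposit", "account": "a1", "amount": 100},
    {"type": "withdraw", "account": "a1", "amount": 30},
]

node_state = {}  # the new node's empty copy of the application state
for event in event_stream:
    node_state = apply(node_state, event)
print(node_state)  # {'a1': 70}
```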
Integration events tend to have more complex payloads with additional attributes, since the needs of potential listeners can differ significantly. This often leads to deliberate overcommunication: more information is published than any single listener needs, to ensure that everything relevant is shared.
These techniques can enable systems to evolve while remaining compatible and reliable in complex, distributed environments.
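A hypothetical integration event might look like the following; every field name here is an assumption, chosen only to show how a deliberately rich payload can serve several different listeners at once.

```python
# Illustrative integration event: the payload is deliberately rich
# ("overcommunication") because different listeners need different fields.
# All field names and values are hypothetical.
integration_event = {
    "type": "customer.updated",
    "event_id": "6f1c2b8a",          # stable id so consumers can deduplicate
    "occurred_at": "2024-05-01T12:00:00Z",
    "schema_version": 2,             # lets consumers evolve independently
    "payload": {
        "customer_id": "c-123",
        "email": "jane@example.com",  # needed by a mailing service
        "tier": "gold",               # needed by a billing service
        "address": {"city": "Berlin", "country": "DE"},  # needed by shipping
    },
}
```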
As an example, an event producer could be an email client, an e-commerce system, a monitoring agent, or some type of physical sensor.[10] However, because an event is a strongly declarative frame, any informational operations can be easily applied, eliminating the need for a high level of standardization.
This can be done in many different ways and forms; for example, an email is sent to someone, or an application displays a warning on the screen.[10] Depending on the level of automation provided by the sink (event processing engine), the downstream activity might not be required.
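A sketch of such a sink, with stand-in handlers for the email and on-screen warning cases; the event fields and the automation rule are assumptions.

```python
# Sketch of a sink (event processing engine) deciding on downstream
# activity; handlers are stand-ins for "send an email" and
# "display a warning on screen". Event fields are hypothetical.
def notify_by_email(event: dict) -> None:
    print(f"emailing ops about {event['type']}")

def show_warning(event: dict) -> None:
    print(f"WARNING on screen: {event['type']}")

def sink(event: dict) -> None:
    # With enough automation, the sink handles the event itself
    # and no downstream activity is required at all.
    if event.get("auto_remediable"):
        print("sink resolved event automatically")
    else:
        notify_by_email(event)
        show_warning(event)

sink({"type": "disk.full", "auto_remediable": False})
```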
Simple event processing is commonly used to drive the real-time flow of work, thereby reducing lag time and cost.[10] For example, simple events can be created by a sensor detecting changes in tire pressure or ambient temperature.
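A minimal sketch of simple event processing for the tire-pressure example, where each notable reading immediately triggers a downstream action; the threshold and event shape are assumptions.

```python
# Simple event processing sketch: each notable reading immediately
# triggers a unit of work, with no correlation across events.
# The threshold value is an assumption for illustration.
LOW_PRESSURE_KPA = 180

def on_tire_pressure(reading_kpa: float) -> None:
    if reading_kpa < LOW_PRESSURE_KPA:
        # the downstream action fires as soon as the event arrives
        print(f"low pressure ({reading_kpa} kPa): alert driver")

for reading in (220.0, 210.0, 175.0):
    on_tire_pressure(reading)
```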
Ordinary events (orders, RFID transmissions) are screened for notability and streamed to information subscribers.
Event stream processing is commonly used to drive the real-time flow of information in and around the enterprise, which enables in-time decision making.[11]
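The following sketch illustrates the screening step with an invented notability rule (unusually large orders); only notable events are streamed on to the subscribers.

```python
# Event stream processing sketch: ordinary events flow in continuously,
# are screened for notability, and only notable ones are streamed to
# information subscribers. The notability rule is an assumption.
def is_notable(order: dict) -> bool:
    return order["total"] > 1000  # e.g., an unusually large order

subscribers = [lambda e: print("dashboard:", e),
               lambda e: print("fraud check:", e)]

order_stream = [{"id": 1, "total": 25}, {"id": 2, "total": 4200}]
for order in order_stream:
    if is_notable(order):
        for deliver in subscribers:
            deliver(order)
```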
Online event processing (OLEP) allows reliably composing related events of a complex scenario across heterogeneous systems.
It thereby enables very flexible distribution patterns with high scalability and offers strong consistency.[10]
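A much-simplified sketch of the OLEP idea, with the shared log modeled as a plain in-memory list: instead of a distributed transaction, each step of a multi-system scenario is an event appended to an ordered log, and every participating system derives its own view from the same sequence, so all of them converge on a consistent outcome.

```python
# OLEP-style sketch: a durable, ordered log replaces a distributed
# transaction. Modeled here as a plain list; event names are invented.
log = []

def append(event: dict) -> None:
    log.append(event)  # a real log (e.g., a partitioned commit log) is durable

# Heterogeneous consumers: each keeps its own state from the same log.
def payments_view(events):
    return {e["order"]: "charged" for e in events if e["type"] == "payment.captured"}

def inventory_view(events):
    return {e["order"]: "reserved" for e in events if e["type"] == "stock.reserved"}

append({"type": "payment.captured", "order": 7})
append({"type": "stock.reserved", "order": 7})

print(payments_view(log))   # {7: 'charged'}
print(inventory_view(log))  # {7: 'reserved'}
```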
Event-driven architectures have loose coupling in space, time, and synchronization, providing a scalable infrastructure for information exchange and distributed workflows.
The high degree of semantic heterogeneity of events in large and open deployments such as smart cities and the sensor web makes it difficult to develop and maintain event-based systems.[12] Synchronous transactions in EDA can be achieved through the request-response paradigm, and this can be implemented in two ways.[1] Event-driven architecture is also susceptible to the fallacies of distributed computing, a series of misconceptions that can lead to significant issues in software development and deployment.
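One possible implementation sketch of request-response over events (not necessarily one of the two ways the text refers to) uses a correlation identifier and a reply channel, so the caller can wait synchronously on an otherwise asynchronous exchange; the queue and field names are hypothetical.

```python
# Request-response over events: the requester attaches a correlation id,
# then blocks until the matching reply event arrives. Names are invented.
import queue
import threading
import uuid

requests, replies = queue.Queue(), queue.Queue()

def responder():
    req = requests.get()
    # echo the correlation id so the caller can match the reply
    replies.put({"correlation_id": req["correlation_id"], "result": "ok"})

threading.Thread(target=responder, daemon=True).start()

correlation_id = str(uuid.uuid4())
requests.put({"correlation_id": correlation_id, "action": "validate"})
reply = replies.get(timeout=5)  # synchronous wait on an async channel
assert reply["correlation_id"] == correlation_id
print(reply["result"])
```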
Event payloads can also be kept minimal; in a compliance check scenario, for instance, it may be adequate to publish just two types of events: compliant and non-compliant.
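Sketched as code, with hypothetical record fields and event type names:

```python
# Compliance check as events: two minimal event types are enough;
# listeners needing detail can query the source system separately.
def publish(event: dict) -> None:
    print("published:", event)

def check_compliance(record: dict) -> None:
    if record.get("signed") and record.get("kyc_passed"):
        publish({"type": "compliance.compliant", "record_id": record["id"]})
    else:
        publish({"type": "compliance.non_compliant", "record_id": record["id"]})

check_compliance({"id": "r-1", "signed": True, "kyc_passed": False})
```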