Alyvix

The user then combines these elements, along with the original series of screen grabs, using a visual scripting language that describes a sequence of desired interaction steps (for instance, clicking one of the buttons, or inserting a predefined string into one of the text fields) and how those steps proceed from one to the next.

Alyvix Robot then cycles through the recognition and interaction phases, applying the user-defined actions of the current step to the interface it sees.

While Alyvix as described so far can be used for automation, it also lets you declare warning and critical thresholds based on visual recognition timeouts, which makes it useful for monitoring.
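The threshold logic amounts to comparing the measured visual-recognition time against the declared levels. The following sketch is a hypothetical illustration of that idea, not Alyvix's actual implementation; the function name and the threshold values are invented:

```python
# Hypothetical sketch of warning/critical threshold evaluation
# based on visual recognition times; not Alyvix's actual code.

def classify(duration_s, warning_s, critical_s, timeout_s):
    """Map a measured recognition time (seconds) to a monitoring state."""
    if duration_s >= timeout_s:
        return "FAILED"    # the expected element never appeared
    if duration_s >= critical_s:
        return "CRITICAL"
    if duration_s >= warning_s:
        return "WARNING"
    return "OK"

# Example: a button that took 4.2 s to appear, with a 3 s warning
# threshold, a 10 s critical threshold, and a 30 s timeout.
print(classify(4.2, warning_s=3.0, critical_s=10.0, timeout_s=30.0))
```

With these (invented) numbers the call above reports a WARNING: the button did appear, but slower than the declared warning level.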

Coordinating this integration is Alyvix Service, which schedules test case runs across multiple target servers, manages configuration settings such as how often to run each test case, records the measurements made by Alyvix Robot, and provides that data and reports via an open API.

Any monitoring system, such as NetEye, then only needs a module that calls the open API as needed.
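On the monitoring side, such a module would periodically fetch the latest measurement for each test case and translate it into the monitoring system's own check result. The sketch below illustrates only the translation step; the payload field names are hypothetical, not Alyvix Service's actual API schema, and in a real module the JSON would come from an HTTP GET against the open API rather than a hard-coded sample:

```python
# Hypothetical sketch of a monitoring module consuming one test-case
# measurement from an open HTTP API; all field names are invented.
import json

def to_check_result(payload_json):
    """Translate one measurement payload into a (state, message) pair."""
    m = json.loads(payload_json)
    state = m["state"]  # e.g. "OK", "WARNING", "CRITICAL", "FAILED"
    message = (f"{m['test_case']}: {state} - "
               f"{m['duration_s']:.1f}s "
               f"(warn={m['warning_s']}s, crit={m['critical_s']}s)")
    return state, message

# Sample payload standing in for an API response.
sample = json.dumps({
    "test_case": "login_flow",
    "state": "WARNING",
    "duration_s": 4.2,
    "warning_s": 3.0,
    "critical_s": 10.0,
})
print(to_check_result(sample)[1])
```

Keeping the translation in one small function like this makes it easy for each monitoring system to map the API's states onto its own severity levels.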

A screenshot of the editing phase when creating an Alyvix test case
A screenshot of the annotation phase when creating an Alyvix test case