Verification-based message-passing algorithms in compressed sensing

Verification-based message-passing algorithms (VB-MPAs) in compressed sensing (CS), a branch of digital signal processing that deals with measuring sparse signals, are a class of methods for efficiently solving the recovery problem in compressed sensing.

One of the main goals in compressed sensing is the recovery process.

Generally speaking, the recovery process in compressed sensing is a method by which the original signal is estimated using knowledge of the compressed signal and the measurement matrix.

Mathematically, the recovery process in compressed sensing is finding the sparsest possible solution of an under-determined system of linear equations.[1]

Based on the nature of the measurement matrix, one can employ different reconstruction methods.

If the measurement matrix is also sparse, one efficient approach is to use message-passing algorithms for signal recovery.

The main problem in the recovery process in CS is to find the sparsest possible solution to the under-determined system of linear equations Ax = y, where A is the m × n measurement matrix with m < n, y is the vector of measurements, and x is the sparse signal to be recovered.[1][2]
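The recovery problem above can be made concrete with a tiny hypothetical instance: an under-determined Gaussian system whose sparsest consistent solution is found by brute-force search over candidate supports. All sizes and the dense Gaussian matrix here are illustrative assumptions; real CS recovery uses far more efficient algorithms.

```python
import numpy as np
from itertools import combinations

# Hypothetical tiny instance of the CS recovery problem: the system
# A x = y is under-determined (m < n), but x is k-sparse, which makes
# it recoverable. Brute-force search over supports is only feasible
# for very small n and is shown here purely for illustration.
rng = np.random.default_rng(0)

m, n, k = 4, 8, 2                      # measurements, signal length, sparsity
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.normal(size=k)   # the k non-zero entries

A = rng.normal(size=(m, n))            # dense Gaussian matrix, for illustration
y = A @ x_true                         # the compressed measurements

recovered = None
for s in combinations(range(n), k):    # try every candidate support
    cols = list(s)
    xs, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
    if np.allclose(A[:, cols] @ xs, y):  # consistent with all measurements?
        recovered = (cols, xs)
        break
```

With a generic (random) matrix, only the true support yields an exactly consistent solution, which is why the search stops at the correct answer almost surely.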

In a binary sparse measurement matrix, the weights of the edges of the corresponding bipartite graph are either zero or one.[6]
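A small example of such a matrix, viewed as a bipartite graph, can be sketched as follows; the particular 4 × 6 matrix here is an assumed illustration (it happens to be (3, 2) bi-regular: every check node has degree 3 and every variable node degree 2).

```python
import numpy as np

# A small binary sparse measurement matrix (entries 0 or 1), viewed as
# the biadjacency matrix of a bipartite graph: rows are check nodes
# (measurements), columns are variable nodes (signal entries).
A = np.array([
    [1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 0, 1],
    [0, 0, 1, 1, 1, 0],
])

# Edges of the bipartite graph are the positions of the ones:
edges = [(int(i), int(j)) for i, j in zip(*np.nonzero(A))]

# Each check node i is connected to the variable nodes in its row:
neighbors_of_check = {i: list(np.nonzero(A[i])[0]) for i in range(A.shape[0])}
```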

The basic idea behind message-passing algorithms in CS is to transmit appropriate messages between variable nodes and check nodes in an iterative manner in order to efficiently find the original signal.

The common rule among all verification-based message-passing algorithms is that once a variable node becomes verified, it can be removed from the graph and the algorithm can proceed to solve the rest of the graph.[8]
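The shared verify-and-peel skeleton can be sketched for a binary measurement matrix using two of the verification rules named later in the text (ZCN: a check node with measurement zero verifies all its neighbours as zero; D1CN: a check node with a single remaining neighbour verifies it with the measurement value). This is an assumed minimal implementation for illustration, not the exact pseudocode of any one of the four algorithms.

```python
import numpy as np

def vb_mpa(A, y):
    """Minimal verify-and-peel sketch for a binary measurement matrix.

    Repeatedly applies the ZCN and D1CN rules; every verified variable
    node is removed from the graph (its column is zeroed and its
    contribution subtracted) before the next pass.
    """
    m, n = A.shape
    A = A.astype(float)
    y = y.astype(float)
    x = np.full(n, np.nan)                # NaN = not yet verified
    progress = True
    while progress and np.isnan(x).any():
        progress = False
        for i in range(m):
            nbrs = np.nonzero(A[i])[0]
            if len(nbrs) == 0:
                continue
            if y[i] == 0:                 # ZCN: all neighbours are zero (a.s.)
                verified = {j: 0.0 for j in nbrs}
            elif len(nbrs) == 1:          # D1CN: single remaining neighbour
                verified = {nbrs[0]: y[i]}
            else:
                continue
            for j, v in verified.items():
                x[j] = v
                y -= A[:, j] * v          # subtract the verified contribution
                A[:, j] = 0               # remove the node from the graph
            progress = True
    return x
```

The exact floating-point comparison `y[i] == 0` is adequate only for this integer-valued illustration; a practical implementation would use a tolerance.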

It is shown that these simple rules can efficiently recover the original signal provided that certain conditions are satisfied.

There are four algorithms known as VB-MPAs, namely Genie, LM, XH, and SBB.[8][6]

All of these algorithms use the same strategy for recovery of the original signal; however, they use different combinations of the message-passing rules to verify variable nodes.[6]

Using this knowledge, Genie does not need to consider the zero variable nodes in the graph, and its only task is to recover the values of the non-zero elements of the original signal.
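The idealized Genie setting can be illustrated as follows: with the support (the locations of the non-zero entries) known in advance, recovery reduces to solving a small linear system for the non-zero values. This least-squares sketch is an assumption made for illustration; it captures why Genie serves as a benchmark, not the message-passing formulation of the actual Genie algorithm.

```python
import numpy as np

def genie_recover(A, y, support):
    """Recover x from y = A x when the support of x is known (Genie setting).

    Only the columns of A on the known support matter, so the values of
    the non-zero entries follow from a small least-squares solve.
    """
    support = sorted(support)
    x = np.zeros(A.shape[1])
    vals, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x[support] = vals
    return x
```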

In such cases, employing the third rule would violate the local nature of the algorithms.

This algorithm is the same as LM, but it uses only ECN instead of D1CN for the verification of the non-zero variable nodes.[6]

In the formal description of these algorithms, the messages emanating from variable and check nodes carry both a value component and a verification-status component; an additional set is used to keep track of the variable nodes verified in the previous iteration.

Although there is no guarantee that these algorithms succeed in all cases, we can guarantee that if some variable nodes become verified during these algorithms, then the values of those variable nodes are almost surely correct.

If the non-zero elements of the original signal are chosen from a continuous distribution, then the probability of this occurring is zero.
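The "almost surely" argument can be illustrated numerically: a sum of values drawn from a continuous distribution hits any fixed target exactly with probability zero, so even a large number of random samples produces no exact hit. The sample size and distribution below are arbitrary choices for the demonstration.

```python
import numpy as np

# Draw 100,000 sums of three standard-normal values and count how many
# of them equal the fixed target 0.0 exactly. Since the distribution is
# continuous, an exact hit has probability zero.
rng = np.random.default_rng(1)
sums = rng.normal(size=(100_000, 3)).sum(axis=1)
exact_hits = int(np.count_nonzero(sums == 0.0))
```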

For the first part, the claim is that if we have two or more equations with the same right-hand side, and those equations share only one single unknown variable, then that common variable can be verified with the value of the right-hand side.

Otherwise, from these equations, the summation of some unknown variables would have to equal a specific non-zero value; since the non-zero elements of the original signal are chosen randomly from a continuous distribution, the probability that this summation equals exactly that value is zero.

The proof of this claim can be achieved by a change of variable in those equations; after this change of variable, the problem reduces to the first part, where we have only one common variable node in all of those equations.

Therefore, by the same reasoning as in the first part, all other variable nodes that are not common to all of those equations can be verified with value zero almost surely.

Moreover, the second part of the ECN rule need not be implemented, as the non-zero variable node verified by the ECN rule will be removed from the bipartite graph in the next iteration, and the ZCN rule will then suffice to verify all the zero-valued variable nodes remaining from those equations with the same right-hand side.

All in all, when the non-zero elements of the measurement matrix are chosen from a continuous distribution, the SBB and XH algorithms, which use the ECN rule, can be implemented efficiently.
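The first part of the ECN rule described above can be sketched for a binary measurement matrix: whenever two check nodes carry the same non-zero measurement and share exactly one variable node, that common variable is verified with the shared value. This is an assumed illustrative implementation of the rule as stated in the text, not the full SBB or XH algorithm.

```python
import numpy as np
from itertools import combinations

def ecn_first_part(A, y):
    """First part of the ECN rule for a binary measurement matrix.

    Returns {variable index: verified value} for every pair of check
    nodes with equal non-zero measurements that share exactly one
    variable node (correct almost surely for continuous-valued signals).
    """
    m = A.shape[0]
    verified = {}
    for i1, i2 in combinations(range(m), 2):
        if y[i1] != 0 and y[i1] == y[i2]:
            common = np.nonzero(A[i1] * A[i2])[0]  # shared variable nodes
            if len(common) == 1:
                verified[int(common[0])] = float(y[i1])
    return verified
```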

Therefore, every minor loop in the algorithm can be executed in constant time.

Moreover, since the algorithm terminates when there is no progress in the verification of variable nodes, if in the worst case each iteration of the main loop verifies only one variable node, then the maximum number of times the main loop will be executed equals the number of variable nodes, n.

[Figure: bi-regular bipartite graph corresponding to the measurement matrix A][7]
[Figure: SBB algorithm][7]