Communication complexity

[1] The problem is usually stated as follows: two parties (traditionally called Alice and Bob) each receive a (potentially different) n-bit string, say x and y, and they wish to compute the value of a function f(x, y) that depends on both strings, while communicating as few bits as possible.

In VLSI circuit design, for example, one seeks to minimize the energy used by decreasing the number of electrical signals passed between the different components during a distributed computation.

Then, as each party communicates a bit to the other, the number of possible answers shrinks: each bit eliminates a set of rows or columns, restricting attention to a submatrix of the communication matrix.
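To make the matrix view concrete, here is a toy Python sketch (illustrative, not from the original text) for EQ on 2-bit strings: the communication matrix is the identity, and a single communicated bit halves the candidate rows.

```python
from itertools import product

n = 2
strings = list(product([0, 1], repeat=n))

# Communication matrix of EQ: rows are Alice's inputs x, columns are
# Bob's inputs y, and the entry is EQ(x, y).  For EQ this is the
# identity matrix.
M = {x: {y: int(x == y) for y in strings} for x in strings}

# Suppose Alice's first message is her first bit, and it happens to
# be 1.  Bob can now discard every row inconsistent with it: the
# answer lives in a submatrix with half the rows.
rows_left = [x for x in strings if x[0] == 1]
```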

Yao, in his seminal paper,[1] answers this question by defining randomized communication complexity.

Returning to the previous example of EQ, if certainty is not required, Alice and Bob can check for equality using only O(log n) bits of communication.

This shows that if Alice and Bob share a random string of length n, they can send one bit to each other to compute EQ(x, y), erring with probability at most 1/2 when the strings differ.

It can be shown that Alice and Bob need only share O(log n) bits that are as good as sharing a random string of length n. Once that is shown, it follows that EQ can be computed using O(log n) bits of communication.
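The one-bit round itself can be sketched as follows (illustrative Python; the function names are my own). Each round uses the shared random string r: Alice sends the parity of x restricted to r, and Bob compares it with the parity of y restricted to r.

```python
import random

def eq_round(x, y, r):
    # Shared random string r: Alice sends the single bit <x, r> mod 2,
    # and Bob checks it against <y, r> mod 2.
    a = sum(xi & ri for xi, ri in zip(x, r)) % 2
    b = sum(yi & ri for yi, ri in zip(y, r)) % 2
    return a == b

def eq_protocol(x, y, k, rng):
    # When x != y, each round wrongly reports "equal" with probability
    # exactly 1/2 (the parity of a nonzero difference against a uniform
    # r is uniform), so k rounds err with probability 2^-k.
    n = len(x)
    return all(eq_round(x, y, [rng.randint(0, 1) for _ in range(n)])
               for _ in range(k))
```

Equal strings are always accepted; unequal strings slip through k = 30 rounds only with probability 2^-30.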

In particular, they would like to find a communication protocol requiring the transmission of as few bits as possible to compute the following partial Boolean function, the Gap-Hamming problem: GHD(x, y) = 1 if the Hamming distance between x and y is at least n/2 + √n, and GHD(x, y) = 0 if it is at most n/2 − √n (on other inputs the function is undefined). Clearly, a deterministic protocol must communicate essentially all the bits: if Alice and Bob exchange information about only a strict subset of the indices, then two input pairs that agree on that subset but differ enough on the remaining indices to fall on opposite sides of the gap would receive the same answer, and the protocol would err on one of them.

It turns out that the answer, somewhat surprisingly, is no, due to a result of Chakrabarti and Regev in 2012: they show that for random instances, any procedure which is correct with probability at least 2/3 must communicate Ω(n) bits, which is no better than exchanging the entire input.
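For concreteness, the Gap-Hamming partial function with the standard ±√n gap can be sketched as follows (illustrative Python; returning None marks the inputs on which the function is undefined).

```python
import math

def ghd(x, y):
    # Gap-Hamming partial function: defined only when the Hamming
    # distance is at least sqrt(n) away from n/2.
    n = len(x)
    d = sum(a != b for a, b in zip(x, y))
    if d >= n / 2 + math.sqrt(n):
        return 1
    if d <= n / 2 - math.sqrt(n):
        return 0
    return None  # inside the gap: the function is undefined
```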

Let's say we additionally allow Alice and Bob to share some resource, for example a pair of entangled particles.

can have arbitrarily large input size, yet the number of communicated bits remains constant, namely a single bit.

Some resources are shown to be non-collapsing, such as quantum correlations[3] or, more generally, almost-quantum correlations,[4] whereas other resources are shown to collapse randomized communication complexity, such as the PR-box[5] or certain noisy PR-boxes satisfying suitable conditions.

[13] The (internal) information complexity of a (possibly randomized) protocol R with respect to a distribution μ is defined as IC_μ(R) = I(X; Π | Y) + I(Y; Π | X), where Π is the transcript of R and the input pair (X, Y) is drawn from μ: the first term measures what the transcript reveals to Bob about Alice's input, and the second what it reveals to Alice about Bob's.

This means that the cost for solving n independent copies of f is roughly n times the information complexity of f. This is analogous to the well-known interpretation of Shannon entropy as the amortized bit-length required to transmit data from a given information source.
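One standard way to write this amortization, following Braverman and Rao's "information equals amortized communication" (the notation below, with D for distributional complexity under μ, is an assumption of this sketch):

```latex
\mathrm{IC}_\mu(f,\epsilon) \;=\; \inf_{R}\, \mathrm{IC}_\mu(R),
\qquad
\lim_{n\to\infty} \frac{D_{\epsilon}^{\mu^n}\!\left(f^{\,n}\right)}{n}
\;=\; \mathrm{IC}_\mu(f,\epsilon),
```

where the infimum ranges over protocols R that compute f with error at most ε under μ, and f^n denotes n independent copies of f.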

[14] Information complexity techniques have also been used to analyze extended formulations, proving an essentially optimal lower bound on the complexity of algorithms based on linear programming which approximately solve the maximum clique problem.

At least three quantum generalizations of communication complexity have been proposed; for a survey see the suggested text by G. Brassard.

In a second model the communication is still performed with classical bits, but the parties are allowed to manipulate an unlimited supply of quantum entangled states as part of their protocols.

By doing measurements on their entangled states, the parties can save on classical communication during a distributed computation.

Viewed differently, this amounts to covering all 1-entries of the 0/1-matrix by combinatorial 1-rectangles, i.e., submatrices of the form A × B for arbitrary (not necessarily contiguous) row and column subsets A and B, all of whose entries are one (see Kushilevitz and Nisan or Dietzfelbinger et al.).
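A brute-force sanity check of why such covers must be large for EQ (illustrative Python): the diagonal 1-entries form a fooling set, since any rectangle containing two of them also contains a 0-entry, so covering the diagonal takes 2^n distinct 1-rectangles.

```python
from itertools import product

n = 2
strings = list(product([0, 1], repeat=n))

def eq(x, y):
    return int(x == y)

# Fooling-set argument: a combinatorial rectangle A x B that contained
# both diagonal 1-entries (x, x) and (y, y) with x != y would also
# contain the entry (x, y), which is 0 -- so it is not a 1-rectangle.
for x in strings:
    for y in strings:
        if x != y:
            assert eq(x, y) == 0  # the spoiling off-diagonal entry
```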

In particular, if the number of public bits shared between Alice and Bob is not counted against the communication complexity, it is easy to argue that computing any function has a communication complexity of O(1) in this model.

[18] On the other hand, both models are equivalent if the number of public bits used by Alice and Bob is counted against the protocol's total communication.
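To illustrate why uncounted public bits trivialize the unbounded-error model, here is a folklore one-bit protocol, sketched in Python (the names are my own, and the shared randomness is enumerated exactly rather than sampled): the shared randomness names a guess x0 for Alice's input plus a fallback coin, and Alice's single bit reports whether the guess matched.

```python
from fractions import Fraction
from itertools import product

def success_prob(f, n, x, y):
    # Average exactly over the shared randomness: a uniform guess x0
    # for Alice's input and one uniform fallback coin c.  Alice's only
    # message is the single bit [x == x0]; if it matched, Bob outputs
    # f(x0, y) (certainly correct), otherwise he outputs the coin.
    correct, outcomes = 0, 0
    for x0 in product([0, 1], repeat=n):
        for c in (0, 1):
            out = f(x0, y) if x == x0 else c
            correct += int(out == f(x, y))
            outcomes += 1
    return Fraction(correct, outcomes)

def ip(x, y):
    # Inner product mod 2, used here as an arbitrary target function.
    return sum(a & b for a, b in zip(x, y)) % 2
```

On every input pair the protocol succeeds with probability exactly 1/2 + 2^-(n+1), strictly above 1/2, while only one bit is ever transmitted.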

[20] Forster[21] was the first to prove explicit lower bounds for this class, showing that computing the inner product ⟨x, y⟩ (mod 2) requires Ω(n) bits of communication in the unbounded-error model.

This technique was pioneered in the context of communication complexity by Raz and McKenzie,[23] who proved the first query-to-communication lifting theorem, and used the result to separate the monotone NC hierarchy.

Garg, Göös, Kamath and Sokolov extended this to the DAG-like setting,[28] and used their result to obtain monotone circuit lower bounds.

[29] A different type of lifting is exemplified by Sherstov's pattern matrix method,[30] which gives a lower bound on the quantum communication complexity of a composed two-party function in terms of the approximate degree of the underlying Boolean function.

The dual polynomial witnessing the approximate-degree bound is massaged into other objects constituting data for the generalized discrepancy method.

Another example of this approach is the work of Pitassi and Robere,[31] in which an algebraic gap is lifted to a lower bound on Razborov's rank measure.

For a randomized protocol, the number of bits exchanged in the worst case, R(f), was conjectured to be polynomially related to the logarithm of the approximate rank of the communication matrix of f. Such log rank conjectures are valuable because they reduce the question of a matrix's communication complexity to a question of linearly independent rows (columns) of the matrix.
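As a toy check of the deterministic log-rank picture (illustrative Python with exact rank computation over the rationals): EQ's communication matrix is the identity, whose rank is 2^n, so log2(rank) = n, in line with EQ's deterministic complexity of roughly n bits.

```python
from fractions import Fraction
from itertools import product

def rank(M):
    # Exact Gaussian elimination over the rationals.
    M = [[Fraction(v) for v in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                factor = M[i][c] / M[r][c]
                M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

n = 2
strings = list(product([0, 1], repeat=n))
# EQ's matrix is the 2^n x 2^n identity: rank 4, so log2(rank) = 2 = n.
eq_matrix = [[int(x == y) for y in strings] for x in strings]
```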

This particular version, called the Log-Approximate-Rank Conjecture, was recently refuted by Chattopadhyay, Mande and Sherif (2019)[33] using a surprisingly simple counter-example.

This reveals that the essence of the communication complexity problem, for example in the EQ case above, is figuring out where in the matrix the input pair lies, in order to find out whether the strings are equal.