Secure multi-party computation

The two-party case was followed by a generalization to the multi-party case by Oded Goldreich, Silvio Micali, and Avi Wigderson.

The computation is based on secret sharing of all the inputs and on zero-knowledge proofs for the potentially malicious case, where a majority of honest players assures that bad behavior is detected and the computation continues with the dishonest party eliminated or their input revealed.

This work introduced the basic general scheme that essentially all subsequent multi-party protocols for secure computation have followed.

This work was followed by the first robust secure protocol, which tolerates faulty behavior gracefully without revealing anyone's output and which introduced for this purpose the often-used 'share of shares' idea,[6] and by a protocol that allows one of the parties to hide its input unconditionally.[9]

The above results established that, under the above variations, it is possible to achieve secure computation when a majority of the users are honest.[10][11]

Adding a broadcast channel allows the system to tolerate a misbehaving minority of up to one half of the parties,[12] whereas connectivity constraints on the communication graph were investigated in the work on Perfectly Secure Message Transmission.

Participants p1, p2, ..., pN each hold private data d1, d2, ..., dN, respectively, and want to compute the value of a public function on that private data, F(d1, d2, ..., dN), while keeping their own inputs secret.

For example, suppose Alice, Bob, and Charlie want to learn the highest of their three salaries x, y, and z without disclosing them; a trusted outsider such as Tony could simply collect the salaries and announce the maximum. The goal of MPC is to design a protocol where, by exchanging messages only with each other, Alice, Bob, and Charlie can still learn F(x, y, z) without revealing who makes what and without having to rely on Tony.
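As a toy illustration of the kind of protocol involved (our own sketch, not one of the protocols cited in this article), the snippet below uses additive secret sharing to let three parties compute the sum of their private inputs without any party revealing its own value; computing a maximum, as in the salary example, requires a richer protocol, but the sharing idea is the same. All function and variable names here are invented for the example.

```python
# Toy sketch: three parties jointly compute the sum of their private inputs
# without revealing them, using additive secret sharing.
import random

P = 2**31 - 1  # arithmetic is done modulo a public prime

def share(secret, n=3):
    """Split a private input into n random summands that add up to it mod P."""
    parts = [random.randrange(P) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % P)
    return parts

inputs = {"Alice": 30, "Bob": 25, "Charlie": 45}          # private values
shares = {name: share(v) for name, v in inputs.items()}   # row i is sent to party i

# Party i adds up the i-th share of every input; each such sum looks random on its own.
partial_sums = [sum(shares[name][i] for name in inputs) % P for i in range(3)]

# Only when the partial sums are combined does the public result appear.
assert sum(partial_sums) % P == sum(inputs.values()) % P == 100
```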

Informally speaking, the most basic properties that a multi-party computation protocol aims to ensure are input privacy (no party learns more about the others' private data than what is implied by the output of the function) and correctness (misbehaving or colluding parties cannot force honest parties to output an incorrect result).

There is a wide range of practical applications, varying from simple tasks such as coin tossing to more complex ones like electronic auctions (e.g. computing the market clearing price), electronic voting, or privacy-preserving data mining.

Unlike traditional cryptographic tasks such as encryption or digital signatures, where the adversary is assumed to be outside the system, in an MPC protocol one must assume that the adversary is one of the players engaged in the system (or is controlling internal parties).

Protocols that are covertly secure provide mechanisms to ensure that, if some of the parties do not follow the instructions, this will be noticed with high probability, say 75% or 90%.

In a way, covert adversaries are active ones forced to act passively due to external non-cryptographic (e.g. business) concerns.

This mechanism provides a bridge between the semi-honest and malicious models, in the hope of finding protocols that are efficient and secure enough in practice.

Yao's basic protocol is secure against semi-honest adversaries and is extremely efficient in terms of the number of rounds, which is constant and independent of the target function being evaluated.

The encryption scheme used to garble the gates needs two properties. Firstly, the ranges of the encryption function under any two distinct keys are disjoint (with overwhelming probability); secondly, given a key, it can be checked efficiently whether a given ciphertext lies in the range of encryption under that key.
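The following sketch (a simplification under our own assumptions, not the scheme as originally published) garbles a single AND gate with a toy double encryption built from SHA-256, so that an evaluator holding one label per input wire learns only the label of the correct output. The helper names (garble_and_gate, wire_label, and so on) are invented for this illustration.

```python
# Illustrative sketch of garbling and evaluating one AND gate.
import os, hashlib, random

LABEL_LEN = 16  # bytes per wire label

def wire_label():
    return os.urandom(LABEL_LEN)

def enc(key_a, key_b, data):
    # Toy double encryption: a one-time pad derived from both input-wire labels.
    pad = hashlib.sha256(key_a + key_b).digest()[:len(data)]
    return bytes(x ^ y for x, y in zip(data, pad))

def garble_and_gate():
    # Two labels per input wire (one for bit 0, one for bit 1) and per output wire.
    a = {0: wire_label(), 1: wire_label()}
    b = {0: wire_label(), 1: wire_label()}
    c = {0: wire_label(), 1: wire_label()}
    # Each table entry encrypts the correct output label under the two input labels.
    table = [enc(a[x], b[y], c[x & y]) for x in (0, 1) for y in (0, 1)]
    random.shuffle(table)  # hide which row corresponds to which input bits
    return a, b, c, table

def evaluate(label_a, label_b, table, out_labels):
    # The evaluator holds exactly one label per input wire and tries every row;
    # a real scheme uses a checkable range or point-and-permute bit instead of
    # comparing against the output labels, which are shown here only for the demo.
    for row in table:
        candidate = enc(label_a, label_b, row)  # XOR-pad decryption = encryption
        if candidate in out_labels.values():
            return candidate
    raise ValueError("no row decrypted to a valid output label")

a, b, c, table = garble_and_gate()
assert evaluate(a[1], b[1], table, c) == c[1]  # AND(1, 1) = 1
```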

If one is considering malicious adversaries, further mechanisms to ensure correct behavior of both parties need to be provided.

In secret-sharing-based methods, the parties do not play special roles (as in Yao's protocol, where one party is the creator and the other the evaluator).

Secret-sharing schemes can achieve information-theoretic security, meaning that even if the adversary has unbounded computational power, they cannot learn any information about the secret underlying a share.
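A minimal sketch of Shamir secret sharing over a prime field illustrates the threshold property behind this information-theoretic guarantee; the helper names (share, reconstruct) and the choice of modulus are ours, and this is not a full MPC protocol.

```python
# Shamir secret sharing: any t+1 shares reconstruct the secret, any t reveal nothing.
import random

P = 2**61 - 1  # a prime modulus; any sufficiently large prime works

def share(secret, n, t):
    """Split `secret` into n points on a random degree-t polynomial with f(0) = secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for k, (xk, _) in enumerate(shares):
            if k != j:
                num = num * (-xk) % P
                den = den * (xj - xk) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

shares = share(42, n=5, t=2)
assert reconstruct(shares[:3]) == 42   # any 3 of the 5 shares suffice
```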

The most popular of these secret-sharing-based protocols is SPDZ,[22] which implements MPC with additive secret shares and is secure against active adversaries.
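A highly simplified sketch of SPDZ-style authenticated additive sharing is given below, assuming a single dealer for readability; in the real protocol the shares and MACs come from a preprocessing phase and the MAC check is performed via commitments rather than in the clear.

```python
# Additive shares with an information-theoretic MAC: a party that tampers with a
# share is caught (with overwhelming probability) when the value is opened.
import random

P = 2**61 - 1  # prime field
N = 3          # number of parties

def additive_share(value, n):
    """Split value into n random summands modulo P."""
    parts = [random.randrange(P) for _ in range(n - 1)]
    parts.append((value - sum(parts)) % P)
    return parts

# Global MAC key alpha, itself additively shared so that no single party knows it.
alpha = random.randrange(P)
alpha_shares = additive_share(alpha, N)

def authenticate(x):
    """Each party holds a share of x and a share of its MAC alpha * x."""
    return additive_share(x % P, N), additive_share(alpha * x % P, N)

# Linear operations are local: shares of x + y are just share-wise sums.
x_sh, x_mac = authenticate(7)
y_sh, y_mac = authenticate(35)
z_sh = [(a + b) % P for a, b in zip(x_sh, y_sh)]
z_mac = [(a + b) % P for a, b in zip(x_mac, y_mac)]

# Opening: reveal z, then check that the MAC shares are consistent with it.
z = sum(z_sh) % P
assert sum((m - a * z) % P for m, a in zip(z_mac, alpha_shares)) % P == 0
assert z == 42
```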

One of the main issues when working with Yao-based protocols is that the function to be securely evaluated (which could be an arbitrary program) must be represented as a circuit, usually consisting of XOR and AND gates.[26]
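As an illustration (with names of our own choosing), a 1-bit full adder can be written using only XOR and AND gates, which is the form a function must take before it can be garbled:

```python
# A 1-bit full adder built from XOR and AND gates only.
def XOR(a, b): return a ^ b
def AND(a, b): return a & b

def full_adder(a, b, carry_in):
    """Returns (sum, carry_out) using 2 AND gates and 3 XOR gates.
    The usual OR in the carry is replaced by XOR, which is valid because
    the two partial carries can never both be 1 at the same time."""
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    c1 = AND(a, b)
    c2 = AND(s1, carry_in)
    carry_out = XOR(c1, c2)
    return sum_bit, carry_out

assert full_adder(1, 1, 1) == (1, 1)
assert full_adder(1, 0, 1) == (0, 1)
```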

The approach that so far seems to be the most fruitful in obtaining active security comes from a combination of the garbling technique and the "cut-and-choose" paradigm.

To avoid the aforementioned problems with respect to dishonest behaviour, many garblings of the same circuit are sent from the constructor to the evaluator.

Then around half of them (depending on the specific protocol) are opened to check consistency; if the check passes, the vast majority of the unopened ones are correct with high probability.[27]
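A back-of-the-envelope calculation (our own illustration, not taken from the cited papers) shows why this works: if the constructor corrupts b out of m circuits and the evaluator opens a uniformly random half, the probability that no corrupted circuit is among the opened ones falls off quickly.

```python
# Probability that all b corrupted circuits land among the m//2 unopened ones.
from math import comb

def escape_probability(m, b):
    return comb(m - b, m // 2) / comb(m, m // 2)

for b in (1, 5, 10):
    print(b, escape_probability(40, b))
# With 40 circuits, corrupting even a handful of them goes unnoticed only with
# very small probability; the exact security bound depends on the protocol's analysis.
```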

This technique was implemented by Pinkas et al. in 2009.[26] It provided the first actively secure two-party evaluation of the Advanced Encryption Standard (AES) circuit, regarded as a highly complex (consisting of around 30,000 AND and XOR gates) and non-trivial function (also with some potential applications), taking around 20 minutes to compute and requiring 160 circuits to achieve the desired cheating probability.

As many circuits are evaluated, the parties (including the receiver) need to commit to their inputs to ensure that the same values are used in all the iterations.
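One simple way to realise such a commitment is a hash-based scheme, sketched below as an illustration only; the protocols cited here use their own commitment constructions.

```python
# Hash-based commitment: commit to an input once, open and verify it later.
import os, hashlib

def commit(value: bytes):
    """Return (commitment, opening). Hiding comes from the random nonce,
    binding from collision resistance of the hash."""
    nonce = os.urandom(32)
    return hashlib.sha256(nonce + value).digest(), nonce

def verify(commitment: bytes, nonce: bytes, value: bytes) -> bool:
    return hashlib.sha256(nonce + value).digest() == commitment

# The committer fixes its input before the circuits are opened and must use the
# same value in every evaluated circuit.
c, opening = commit(b"evaluator-input-bits")
assert verify(c, opening, b"evaluator-input-bits")
assert not verify(c, opening, b"different-input-bits")
```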

The experiments reported by Pinkas et al.[26] show that the bottleneck of the protocol lies in the consistency checks.

In recent results[28] the efficiency of actively secure Yao-based implementations was improved even further, requiring only 40 circuits, and a much smaller number of commitments, to obtain a comparable level of security.

More recently, there has been a focus on highly parallel implementations based on garbled circuits, designed to be run on CPUs with many cores.

This approach seems to achieve efficiency comparable to that of cluster computing implementations, using a similar number of cores.

On the other hand, the hardware required here is far more accessible, as similar devices may already be found in many people's desktop computers or games consoles.