He is currently the T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics at Harvard University.
Among his many contributions to Complexity Theory, in 1979 he introduced the notion of #P-completeness ("sharp-P" completeness) to explain why enumeration and reliability problems are intractable, and established its utility in classifying counting problems according to their computational tractability.
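To illustrate the distinction #P captures, the sketch below contrasts a decision problem (is a Boolean formula satisfiable?) with its counting version (#SAT: how many satisfying assignments exist?), the canonical #P-complete problem. The formula is a small, made-up example, and the brute-force enumeration is purely illustrative; no efficient algorithm for #SAT is known.

```python
from itertools import product

# Illustrative (hypothetical) CNF formula over three variables.
# Each clause is a list of literals; (i, True) means x_i and
# (i, False) means NOT x_i.
CLAUSES = [
    [(0, True), (1, False)],   # x0 OR NOT x1
    [(1, True), (2, True)],    # x1 OR x2
    [(0, False), (2, False)],  # NOT x0 OR NOT x2
]
NUM_VARS = 3

def satisfies(assignment, clauses):
    """Return True if the 0/1 assignment satisfies every clause."""
    return all(
        any(assignment[i] == want for i, want in clause)
        for clause in clauses
    )

# Decision problem (SAT): does any satisfying assignment exist?
# Counting problem (#SAT): how many satisfying assignments exist?
# Counting can remain intractable even when deciding is easy.
count = sum(
    satisfies(dict(enumerate(bits)), CLAUSES)
    for bits in product([True, False], repeat=NUM_VARS)
)
print("satisfiable:", count > 0)   # True
print("#SAT count: ", count)       # 2
```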
In computer systems, he is best known for introducing the Bulk Synchronous Parallel (BSP) processing model.
Recent examples of its influence include Google's adoption of it for large-scale computation via MapReduce, MillWheel,[16] Pregel,[17] and Dataflow, and Facebook's graph analytics system, which can process over one trillion edges.
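The sketch below simulates the structure that defines BSP: each superstep consists of local computation, communication, and a barrier synchronization, with messages delivered only after the barrier. The processor count, data, and message-routing details are illustrative assumptions, not the API of any particular BSP library.

```python
# Minimal simulation of one BSP pattern: four simulated processors
# compute a global sum across two supersteps.

NUM_PROCS = 4
local_values = [3, 1, 4, 1]           # assumed per-processor data

# Messages produced during a superstep are delivered only after the
# barrier, never mid-superstep.
inboxes = [[] for _ in range(NUM_PROCS)]
outgoing = [[] for _ in range(NUM_PROCS)]

# Superstep 1: local computation, then communication.
for p in range(NUM_PROCS):
    partial = local_values[p]         # local computation phase
    outgoing[p].append((0, partial))  # send partial result to proc 0

# --- barrier: all sends complete before any processor proceeds ---
for p in range(NUM_PROCS):
    for dest, msg in outgoing[p]:
        inboxes[dest].append(msg)

# Superstep 2: processor 0 combines the messages it received.
total = sum(inboxes[0])
print("global sum:", total)           # 9
```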
His earlier work in Automata Theory includes an algorithm for context-free parsing that runs in the time of Boolean matrix multiplication, which is still the asymptotically fastest known.
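As a simpler illustration of the dynamic program Valiant's 1975 method accelerates, here is a minimal CYK recognizer, the classical cubic-time baseline, for a small, hypothetical grammar in Chomsky normal form; this is the textbook algorithm, not Valiant's matrix-multiplication reduction itself.

```python
# Hypothetical CNF grammar for { a^n b^n : n >= 1 }:
#   S -> A X | A B,   X -> S B,   A -> 'a',   B -> 'b'
TERMINAL_RULES = {'a': {'A'}, 'b': {'B'}}
BINARY_RULES = {('A', 'X'): {'S'}, ('A', 'B'): {'S'}, ('S', 'B'): {'X'}}
START = 'S'

def cyk_recognize(s):
    """Return True if the CNF grammar above derives s (classical CYK)."""
    n = len(s)
    if n == 0:
        return False
    # table[length][start] = nonterminals deriving s[start:start+length]
    table = [[set() for _ in range(n)] for _ in range(n + 1)]
    for i, ch in enumerate(s):
        table[1][i] = set(TERMINAL_RULES.get(ch, ()))
    for length in range(2, n + 1):
        for start in range(n - length + 1):
            for split in range(1, length):
                for left in table[split][start]:
                    for right in table[length - split][start + split]:
                        table[length][start] |= BINARY_RULES.get(
                            (left, right), set())
    return START in table[n][0]

print(cyk_recognize("aabb"))  # True
print(cyk_recognize("abab"))  # False
```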
Valiant's 2013 book is Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World.