Some Recent Results in Complexity Theory
The Regularity Lemma
Regularity, Boosting, and Efficiently Simulating Every High-Entropy Distribution
L. Trevisan, M. Tulsiani and S. Vadhan, "Regularity, Boosting, and Efficiently Simulating Every High-Entropy Distribution," 2009 24th Annual IEEE Conference on Computational Complexity, Paris, France, 2009, pp. 126-136, doi: 10.1109/CCC.2009.41.
Abstract: We show that every bounded function \(g: \{0,1\}^n \to [0,1]\) admits an efficiently computable "simulator" function \(h: \{0,1\}^n \to [0,1]\) such that every fixed polynomial size circuit has approximately the same correlation with \(g\) as with \(h\). If \(g\) describes (up to scaling) a high min-entropy distribution \(D\), then \(h\) can be used to efficiently sample a distribution \(D'\) of the same min-entropy that is indistinguishable from \(D\) by circuits of fixed polynomial size. We state and prove our result in a more abstract setting, in which we allow arbitrary finite domains instead of \(\{0,1\}^n\), and arbitrary families of distinguishers, instead of fixed polynomial size circuits. Our result implies:
- the weak Szemerédi regularity lemma of Frieze and Kannan,
- a constructive version of the dense model theorem of Green, Tao and Ziegler with better quantitative parameters (polynomial rather than exponential in the distinguishing probability), and
- the Impagliazzo hardcore set lemma.

It appears to be the general result underlying the known connections between "regularity" results in graph theory, "decomposition" results in additive combinatorics, and the hardcore lemma in complexity theory.
We present two proofs of our result, one in the spirit of Nisan's proof of the hardcore Lemma via duality of linear programming, and one similar to Impagliazzo's "boosting" proof. A third proof by iterative partitioning, which gives the complexity of the sampler to be exponential in the distinguishing probability, is also implicit in the Green-Tao-Ziegler proofs of the dense model theorem.
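The boosting-style proof mentioned above can be illustrated on a toy scale. The sketch below is not the paper's exact algorithm, just the general shape: while some distinguisher separates \(g\) from the current simulator \(h\) by more than \(\varepsilon\), fold a small multiple of that distinguisher into \(h\) (clipped to stay in \([0,1]\)). All names and the specific step size are illustrative.

```python
import itertools

def build_simulator(g, distinguishers, domain, eps=0.05, step=0.05, max_iters=2000):
    """Boosting-style construction (toy sketch): repeatedly find a
    distinguisher correlated with the residual g - h and add a small
    multiple of it to h, clipping so h stays bounded in [0, 1]."""
    h = {x: 0.5 for x in domain}  # start from the constant simulator
    for _ in range(max_iters):
        # correlation of each distinguisher with the residual g - h
        corrs = [sum(f(x) * (g(x) - h[x]) for x in domain) / len(domain)
                 for f in distinguishers]
        i = max(range(len(corrs)), key=lambda j: abs(corrs[j]))
        if abs(corrs[i]) <= eps:
            return h  # no distinguisher succeeds: h simulates g
        sign = 1.0 if corrs[i] > 0 else -1.0
        f = distinguishers[i]
        for x in domain:  # gradient-style update, then clip to [0, 1]
            h[x] = min(1.0, max(0.0, h[x] + step * sign * f(x)))
    return h

# Tiny demo: g is the first bit of a 4-bit string; the distinguishers
# are the four coordinate projections.
domain = list(itertools.product([0, 1], repeat=4))
g = lambda x: float(x[0])
distinguishers = [lambda x, i=i: float(x[i]) for i in range(4)]
h = build_simulator(g, distinguishers, domain)
```

A potential-function argument (the average of \((g-h)^2\) drops by a fixed amount per round) bounds the number of iterations, which is why the resulting \(h\) is a combination of only polynomially many distinguishers.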
“It effectively lets them break a given computational problem or function into simpler pieces.”
Impagliazzo’s Hardcore Lemma (Hard-core-set Lemma)
A priori, if a function \(f\) is average-case hard, it could be that each circuit fails to compute \(f\) on a different set of inputs. The hardcore lemma rules out this possibility: if \(f\) is average-case hard, then there is a single, noticeably large set of inputs on which every small circuit does essentially no better than random guessing.
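In one common quantitative form (the constants and the loss in circuit size vary across proofs; this statement follows the shape of the one in Arora and Barak):

\[
\textbf{Hardcore Lemma (informal).}\ \text{Let } f:\{0,1\}^n \to \{0,1\} \text{ and suppose every circuit of size } s \text{ agrees with } f \text{ on at most a } 1-\delta \text{ fraction of inputs. Then there is a set } H \subseteq \{0,1\}^n,\ |H| \ge \delta 2^n, \text{ such that every circuit of size } s' = s \cdot \mathrm{poly}(\varepsilon, \delta) \text{ agrees with } f \text{ on at most a } \tfrac{1}{2} + \varepsilon \text{ fraction of } H.
\]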
More in Arora & Barak's Computational Complexity, or in "The Impagliazzo Hard-Core-Set Theorem".