Small is defined in terms of asymptotic density.
The apparent efficacy of generic case complexity comes from the fact that, for a wide variety of concrete computational problems, the most difficult instances seem to be rare.
This approach to complexity originated in combinatorial group theory, which has a computational tradition going back to the beginning of the last century.
The notion of generic complexity was introduced in a 2003 paper,[1] where the authors showed that for a large class of finitely generated groups the generic time complexity of some classical decision problems from combinatorial group theory, namely the word problem, conjugacy problem and membership problem, is linear.
A detailed introduction to generic case complexity can be found in the surveys.[2][3]

Let I be an infinite set of inputs for a computational problem.
Whether X appears large in practice depends on how fast the fraction of inputs of size n that lie in X converges to 1.
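For example (an illustrative calculation, taking binary words as the inputs): if X is the set of words containing at least one 1, then among the $2^n$ words of length n all but one lie in X, so the fraction of length-n inputs in X is

$$\frac{2^n - 1}{2^n} = 1 - 2^{-n},$$

which converges to 1 exponentially fast; sets whose densities converge this quickly are the exponentially generic sets mentioned below.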
Definition 4 An algorithm is in GenP (generically polynomial time) if it never gives incorrect answers and if it gives correct answers in polynomial time on a generic set of inputs.
More generally, for any time bound f, we can define the class Gen(f) corresponding to time complexity O(f) on a generic set of inputs.
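The following toy sketch (an illustration chosen here, not an example from the cited papers; the function names are ours) shows the typical shape of such an algorithm for the problem of deciding whether an integer is composite: a polynomial-time partial test that is correct whenever it answers, combined with a slow but complete fallback.

```python
def fast_partial_test(n):
    """Hypothetical polynomial-time partial test: certifies an answer or returns None."""
    if n < 4:
        return False                  # 2 and 3 are prime (inputs assumed to be >= 2)
    if n % 2 == 0:
        return True                   # even numbers >= 4 are composite
    if pow(2, n - 1, n) != 1:
        return True                   # fails Fermat's test to base 2, so certainly composite
    return None                       # rare cases: odd primes and base-2 pseudoprimes

def slow_complete_test(n):
    """Trial division: always correct, but exponential in the bit length of n."""
    return any(n % d == 0 for d in range(2, int(n ** 0.5) + 1))

def is_composite(n):
    """Never answers incorrectly; fast on all inputs handled by the partial test."""
    answer = fast_partial_test(n)
    return answer if answer is not None else slow_complete_test(n)
```

The inputs on which the fast test gives no answer are the odd primes and the base-2 Fermat pseudoprimes, a vanishingly small fraction of all n-bit integers, so the combined procedure is correct everywhere and runs in polynomial time on a set of inputs of asymptotic density 1.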
However, there is a kind of lower bound for machines of both types.
The halting problem is not in ExpGenP for any model of Turing machine.[9][10]

The decision problem for Presburger arithmetic admits a double exponential worst case lower bound[11] and a triple exponential worst case upper bound.
A series of articles[15][16][17] is devoted to cryptanalysis of the Anshel–Anshel–Goldfeld key exchange protocol, whose security is based on assumptions about the braid group.
This series culminates in Miasnikov and Ushakov (2008),[18] which applies techniques from generic case complexity to obtain a complete analysis of the length-based attack and the conditions under which it works.
Theorem 1 If F is a subset of the set of all partial computable functions from the natural numbers to themselves such that F and its complement are both non-empty, then the problem of deciding whether or not a given Turing machine computes a function from F is not decidable on any exponentially generic subset of I.

Theorem 2 The set of formal languages which are generically computable has measure zero.
Theorem 3 There is an infinite hierarchy of generic complexity classes.
Theorem 4[2] There is a notion of generic-polynomial-time reduction with respect to which the distributional bounded halting problem is complete within the class of distributional NP problems.
Meyer and Paterson[20] define an algorithm to be almost polynomial time, or APT, if it halts within p(n) steps on all but p(n) inputs of size n. Clearly, APT algorithms are included in our class GenP.
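To see why the inclusion holds (an illustrative calculation, assuming for concreteness that the inputs of size n are the $2^n$ binary words of length n): an APT algorithm has at most $p(n)$ exceptional inputs of size n, so the fraction of exceptional inputs is at most

$$\frac{p(n)}{2^n} \longrightarrow 0,$$

and the convergence is exponentially fast, which is much stronger than the vanishing-density requirement in the definition of GenP.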
We have seen several NP complete problems in GenP, but Meyer and Paterson show that this is not the case for APT.
Generic case complexity is a direct measure of the performance of an algorithm on most inputs while average case complexity gives a measure of the balance between easy and difficult instances.
In addition, generic-case complexity naturally applies to undecidable problems.
With a suitably chosen distribution on inputs, the running time T of an algorithm can be polynomial on average even though it is exponential on typical inputs.
Average case complexity measures something else: the balance between the frequency of difficult instances and the degree of difficulty.[21][22] Roughly speaking, an algorithm which is polynomial time on average can have only a subpolynomial fraction of inputs that require superpolynomial time to compute.
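As a rough worked illustration (a hypothetical running-time profile, using the naive average over the $2^n$ inputs of size n): if a $2^{-n/2}$ fraction of the inputs take $2^{n/2}$ steps and the rest take $n^2$ steps, the average running time is

$$(1 - 2^{-n/2})\,n^2 + 2^{-n/2}\cdot 2^{n/2} \le n^2 + 1,$$

which is polynomial even though the worst case is exponential; if instead a constant fraction of the inputs required $2^n$ steps, the average itself would be exponential.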
Nevertheless, in some cases generic-case and average-case complexity are quite close to each other. A generically polynomial algorithm whose exceptional set has exponentially small density and whose worst-case time is subexponential is polynomial on average, and for many distributions the converse holds.[23]

Theorem 5[2] If a function f is polynomial on μ-average on spheres, then f is generically polynomial relative to the spherical asymptotic density.

In the same spirit, if a problem has subexponential time bound T and a partial algorithm solving it is in ExpGenP, then combining the two yields a complete algorithm which is polynomial on average.
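A minimal sketch of the combination step, under illustrative assumptions (this is the standard way of interleaving two procedures, not the precise construction from the cited results): each procedure is modelled as a generator that yields None while it is still working and finally yields its answer, and the two are advanced alternately, so the combined procedure halts whenever either one does.

```python
def combine(partial_steps, complete_steps):
    """Interleave two step-by-step procedures; return the first definite answer.

    `partial_steps` models a partial algorithm that is fast on a generic set but
    may never produce an answer; `complete_steps` models a complete algorithm
    that always answers eventually.  Answers must not be None, since None is
    the 'still working' signal."""
    procedures = [partial_steps, complete_steps]
    while True:
        for proc in procedures:
            result = next(proc, None)   # advance this procedure by one step
            if result is not None:
                return result           # first procedure to finish wins
```

On the generic set where the partial procedure answers within a polynomial number of steps the combined procedure does too, up to a factor of two; on the remaining rare inputs it is bounded by the running time of the complete procedure, which is where the bound T enters.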
In a 2006 paper, Bogdanov and Trevisan came close to defining generic case complexity. Instead of partial algorithms, they consider so-called errorless heuristic algorithms. These are complete algorithms which may fail by halting with output "?".
The class AvgnegP is defined to consist of all errorless heuristic algorithms A which run in polynomial time and for which the probability of failure on the inputs of size n is negligible, i.e., converges to zero faster than any inverse polynomial. AvgnegP is a subset of GenP.
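For illustration only (a toy sketch not drawn from Bogdanov and Trevisan's paper, and one whose failure probability is merely small rather than negligible in their sense), an errorless heuristic algorithm for compositeness might look as follows: it always halts in polynomial time, never returns a wrong answer, and outputs "?" when it declines to decide.

```python
def errorless_heuristic_is_composite(n):
    """Toy errorless heuristic: polynomial time, never wrong, may answer "?"."""
    if n < 4:
        return False                   # 2 and 3 are prime (inputs assumed >= 2)
    if n % 2 == 0:
        return True                    # even n >= 4 is composite
    if pow(2, n - 1, n) != 1:
        return True                    # fails Fermat's test to base 2: composite
    return "?"                         # declines to answer rather than guess
```

Membership in AvgnegP would additionally require the probability of the "?" output on inputs of size n to vanish faster than any inverse polynomial; for this toy the failure probability only decays like the density of primes, so it illustrates the interface of an errorless heuristic algorithm rather than the class itself.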