Hellman TMTO for Permutations: Success Probability

A quick reminder

Recall the definitions: $\pi : X \to X$ is a random permutation we'd like to invert.
We prepare $m$ chains, each of length $t$: $C_i = \big(s_i, \pi(s_i), \pi(\pi(s_i)), \ldots, e_i = \pi(\pi(\cdots\pi(s_i)\cdots))\big)$.

The preprocessing step requires $mt$ applications of $\pi$, but only needs to be done once. Afterwards, we store only $(s_i, e_i)$ for every chain, using $O(m)$ memory.
Now we can run the $O(t)$-time algorithm shown in class to try to invert a single value $y$. This succeeds exactly when $y$ is covered by some chain.
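As a concrete sketch, here is one way the two phases might look in Python (the function names are mine, the permutation is stored as a list, and this mirrors, but is not necessarily identical to, the algorithm shown in class):

```python
import random

def preprocess(pi, m, t):
    """Build m chains of length t over the permutation pi; keep only (end, start)."""
    n = len(pi)
    table = {}
    for s in random.sample(range(n), m):
        x = s
        for _ in range(t):
            x = pi[x]
        table[x] = s  # map endpoint e_i back to its start s_i
    return table

def invert(pi, table, y, t):
    """Try to find x with pi[x] == y using O(t) applications of pi."""
    x = y
    for _ in range(t):
        if x in table:  # hit an endpoint: replay that chain from its start
            z = table[x]
            for _ in range(t):
                if pi[z] == y:
                    return z
                z = pi[z]
        x = pi[x]
    return None  # y is not covered by any chain
```

Note that for a permutation (unlike a general function) there are no false alarms: if the walk from $y$ reaches an endpoint $e_i$, then $y$ lies in the same cycle within distance $t$ of $e_i$, so replaying chain $i$ always finds $\pi^{-1}(y)$.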

Calculating the success probability

The ideal estimate

Ideally, we have $mt$ elements covered by the chains, so the probability of successfully computing $\pi^{-1}(y)$ is $mt / |X|$.

This is, of course, an overestimate, since not all $mt$ elements are distinct.
For instance, a single chain can lie in a short cycle of $\pi$ (of length $<t$), in which case $|C_i| < t$; another problem is elements covered by multiple chains.

An almost rigorous calculation

Assume for now $mt = |X|$; we'd like to show that the success probability $p$ is a constant.

We will use the following claim (try to prove it, it's really easy):
Claim 1. For a random permutation $\pi : X \to X$, the length of the cycle of $\pi$ to which a given $x \in X$ belongs is distributed uniformly in $\{1, 2, \ldots, |X|\}$.
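Claim 1 is also easy to check empirically. The following Python sketch (the domain size and trial count are arbitrary choices of mine) tabulates the cycle length of a fixed element over many random permutations:

```python
import random
from collections import Counter

def cycle_length_of(perm, x):
    """Length of the cycle of perm containing x."""
    length, y = 1, perm[x]
    while y != x:
        y = perm[y]
        length += 1
    return length

# Monte Carlo check of Claim 1 on a small domain.
random.seed(0)
n, trials = 8, 20000
counts = Counter()
for _ in range(trials):
    perm = list(range(n))
    random.shuffle(perm)
    counts[cycle_length_of(perm, 0)] += 1

# Each length 1..n should appear with empirical frequency close to 1/n.
for k in range(1, n + 1):
    print(k, counts[k] / trials)
```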

We are interested in calculating the expected size of $C_1 \cup C_2 \cup \cdots \cup C_m$.
By the inclusion-exclusion principle,

\begin{align} E(|C_1 \cup C_2 \cup \cdots \cup C_m|) \ge \sum_{i=1}^m E(|C_i|) - \sum_{1\le i<j\le m} E(|C_i \cap C_j|), \end{align}

so we need to calculate two values: $A_1 = E(|C_i|)$ and $A_2 = E(|C_i \cap C_j|)$.

If the starting point $s_i$ lies in a cycle of length $k$, then $|C_i| = \min(k,t)$. Using Claim 1, we can calculate

\begin{align} A_1 = E(|C_i|) = \sum_{k=1}^{|X|} \frac{\min(k,t)}{|X|} = \frac1{|X|}\left(\sum_{k=1}^t k + (|X|-t)t \right) = t - \frac{\binom t2}{|X|} \ge t - \frac t{2m} \end{align}
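A quick numeric sanity check of this closed form (the values standing in for $|X|$ and $t$ below are arbitrary):

```python
from math import comb

n, t = 1024, 32  # stand-ins for |X| and the chain length
a1_sum = sum(min(k, t) / n for k in range(1, n + 1))  # the defining sum for A_1
a1_closed = t - comb(t, 2) / n                        # the closed form
print(a1_sum, a1_closed)  # the two agree exactly
```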

Given that the point $x \in X$ lies in a cycle of length $k$, and since the two starting points $s_i, s_j$ are chosen independently, $P(x\in C_i \cap C_j) = \min(k,t)^2/|X|^2$. Using Claim 1 again,

\begin{align} A_2 = E(|C_i \cap C_j|) = \sum_{x\in X} P(x\in C_i \cap C_j) = |X| \sum_{k=1}^{|X|} \frac1{|X|}\cdot\frac{\min(k,t)^2}{|X|^2} \le \frac{t^2}{|X|} = \frac tm \end{align}
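The bound on $A_2$ can be sanity-checked numerically as well (again with arbitrary stand-in parameters):

```python
n, t = 1024, 32  # stand-ins for |X| and t
# The exact sum for A_2, versus the upper bound t^2/|X| used in the text.
a2 = sum(min(k, t) ** 2 / n ** 2 for k in range(1, n + 1))
print(a2, t ** 2 / n)  # a2 falls just below the bound
```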

Putting it all together,

\begin{align} p \ge \frac{m A_1 - \binom m2 A_2}{|X|} \ge \frac{A_1}{t} - \frac{m A_2}{2t} \ge 1-\frac1{2m} - \frac12 \approx \frac12 \end{align}


We stopped the inclusion-exclusion calculation after two steps; had we continued it to the end, we'd get $p = 1 - \frac1{2!} + \frac1{3!} - \cdots = 1-1/e$, which is the "right" constant.

Moreover, if we take $mt = \alpha|X|$, we'd get $p = \alpha - \frac{\alpha^2}{2!} + \frac{\alpha^3}{3!} - \cdots = 1-e^{-\alpha}$.
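The prediction $p \approx 1-e^{-\alpha}$ can be checked by direct simulation; the sketch below (with parameters I chose for illustration) measures the average fraction of the domain covered by the chains:

```python
import random
from math import exp

def coverage(n, m, t, trials=30):
    """Average fraction of the domain covered by m chains of length t."""
    random.seed(0)  # deterministic for this sketch
    total = 0.0
    for _ in range(trials):
        perm = list(range(n))
        random.shuffle(perm)
        covered = set()
        for s in random.sample(range(n), m):
            x = s
            for _ in range(t):
                x = perm[x]
                covered.add(x)
        total += len(covered) / n
    return total / trials

# Compare empirical coverage with 1 - e^{-alpha} for a few values of alpha = mt/n.
n = 1024
for m, t in [(16, 16), (32, 32), (64, 32)]:
    alpha = m * t / n
    print(alpha, coverage(n, m, t), 1 - exp(-alpha))
```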
