Randomized vs. Deterministic Separation in Time-Space Tradeoffs of Multi-Output Functions
We prove the first polynomial separation between the randomized and deterministic time-space tradeoffs of multi-output functions. In particular, we present a total function that, on an input of n elements in [n], outputs O(n) elements, such that: (1) there exists a randomized oblivious algorithm with space O(log n), time O(n log n), and one-way access to randomness that computes the function with probability 1 - O(1/n); (2) any deterministic oblivious branching program with space S and time T that computes the function must satisfy T^2 S ≥ Ω(n^{2.5} / log n). This implies that logspace randomized algorithms for multi-output functions cannot be black-box derandomized without an Ω(n^{1/4}) overhead in time. Since all previously known polynomial time-space tradeoffs for multi-output functions were proved via the Borodin-Cook method, a probabilistic method that inherently gives the same lower bound for randomized and deterministic branching programs, our lower bound proof is intrinsically different from previous works. We also examine other natural candidates for proving such separations, and show that any polynomial separation for these problems would resolve the long-standing open problem of proving an n^{1+Ω(1)} time lower bound for decision problems with polylog(n) space.
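To see where the Ω(n^{1/4}) derandomization overhead comes from, here is a back-of-the-envelope calculation (our gloss on the stated bounds, suppressing polylogarithmic factors). For a deterministic program with space S = polylog(n), the tradeoff in (2) forces

\[
T^2 \,\ge\, \Omega\!\left(\frac{n^{2.5}}{S \log n}\right) \;=\; \frac{n^{2.5}}{\mathrm{polylog}(n)}
\quad\Longrightarrow\quad
T \,\ge\, \frac{n^{1.25}}{\mathrm{polylog}(n)},
\]

while the randomized algorithm in (1) runs in time O(n log n), so the deterministic slowdown is at least n^{1.25} / Õ(n) = Ω̃(n^{1/4}).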