### Abstract

Mergers are procedures that, with the aid of a short random string, transform k (possibly dependent) random sources into a single random source, in a way that ensures that if one of the input sources has min-entropy rate δ then the output has min-entropy rate close to δ. Mergers were first introduced by Ta-Shma [28th STOC, pp. 276-285, 1996] and have proven to be a very useful tool in explicit constructions of extractors and condensers. In this work we present a new analysis of the merger construction of Lu et al. [35th STOC, pp. 602-611, 2003]. We prove that the merger's output is close to a distribution with min-entropy rate of at least 6/11 δ. We show that the distance from this distribution is polynomially related to the number of additional random bits that were used by the merger (i.e., its seed). We are also able to prove a bound of 4/7 δ on the min-entropy rate at the cost of increasing the statistical error. Both results are improvements over the previously known lower bound of 1/2 δ (however, in the 1/2 δ result the error decreases exponentially in the length of the seed). To obtain our results we deviate from the usual linear-algebra methods that were used by Lu et al. and introduce techniques from additive number theory.
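To make the notions in the abstract concrete, the following is a minimal, illustrative Python sketch (not the paper's construction itself). It computes the min-entropy rate of a finite source, δ = -log₂(maxₓ Pr[X = x]) / n for an n-bit source, and shows a toy linear merger over GF(2) that combines k sources according to seed bits. Note that the actual merger of Lu et al. works over a larger finite field; the GF(2) version and all function names here are purely for illustration.

```python
import math

def min_entropy_rate(probs, n_bits):
    """Min-entropy rate: H_inf(X) / n, where
    H_inf(X) = -log2(max_x Pr[X = x]) and n is the source's bit-length."""
    return -math.log2(max(probs)) / n_bits

def linear_merger(seed_bits, sources):
    """Toy linear merger over GF(2) (illustrative only):
    output the bitwise XOR of the sources selected by the seed bits.
    The real construction takes linear combinations over a larger field."""
    n = len(sources[0])
    out = [0] * n
    for a, x in zip(seed_bits, sources):
        if a:
            out = [o ^ b for o, b in zip(out, x)]
    return out

# A 2-bit source hitting '00' with probability 1/2 and the other
# three strings with probability 1/6 each: min-entropy rate 1/2.
delta = min_entropy_rate([0.5, 1/6, 1/6, 1/6], 2)

# Merge k = 3 two-bit sources with seed (1, 0, 1): XOR sources 0 and 2.
merged = linear_merger([1, 0, 1], [[1, 1], [0, 1], [1, 0]])
```

The guarantee a merger provides is exactly the one stated above: as long as *some* input source (the caller need not know which) has min-entropy rate δ, the output, averaged over the seed, is close to having min-entropy rate a constant fraction of δ.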

| Original language | English (US) |
|---|---|
| Pages (from-to) | 34-59 |
| Number of pages | 26 |
| Journal | Computational Complexity |
| Volume | 16 |
| Issue number | 1 |
| DOIs | https://doi.org/10.1007/s00037-007-0223-z |
| State | Published - May 1 2007 |
| Externally published | Yes |

### All Science Journal Classification (ASJC) codes

- Theoretical Computer Science
- Mathematics (all)
- Computational Theory and Mathematics
- Computational Mathematics

### Keywords

- Extractors
- Kakeya
- Mergers
- Randomness


## Cite this

Dvir, Z., & Raz, R. (2007). An improved analysis of linear mergers. *Computational Complexity*, *16*(1), 34-59. https://doi.org/10.1007/s00037-007-0223-z