Tel-Aviv University - Computer Science Colloquium

Sunday, Dec 4, 2005, 11:15-12:15

Room 309
Schreiber Building


Guy Kindler

Microsoft Research


On Gallager's problem: new lower-bounds for noisy communication


The effect of noise on computation and communication has been studied
extensively in recent decades. This field of research can be traced back
to Shannon, who proved that transferring information from one party to
another over a noisy channel requires only a constant blowup in resource
usage compared to transmission over a noise-free channel. Over the years
it was shown that a constant blowup suffices to overcome noise in many
other models as well.


In this talk I will discuss a model introduced by El Gamal in 1984 for
sharing information over a noisy broadcast network: each of N players is
given one input bit, and the goal is for all players to learn (with high
probability) all the input bits of the other players, using the smallest
possible number of broadcasts over a joint communication channel. In each
broadcast a player transmits one bit to all other players; however, noise
flips the bit heard by each recipient with some fixed probability.
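
The single-broadcast step described above can be sketched in a few lines of Python. This is only an illustrative simulation of the model, not any protocol from the talk; the function name, the assumption that each recipient's bit is flipped independently, and the parameter eps (the fixed flip probability) are choices made here for concreteness.

```python
import random

def noisy_broadcast(bit, n_players, sender, eps, rng):
    """One broadcast in the noisy broadcast model (illustrative sketch):
    the sender transmits `bit`, and every other player hears it flipped,
    independently, with probability eps."""
    heard = {}
    for p in range(n_players):
        if p == sender:
            continue  # the sender does not receive its own broadcast
        flip = rng.random() < eps  # noise hits this recipient with prob. eps
        heard[p] = bit ^ flip
    return heard

rng = random.Random(0)
# Player 0 broadcasts the bit 1 to 4 other players; each hears it
# flipped with probability 0.1.
received = noisy_broadcast(1, 5, 0, 0.1, rng)
```

A protocol in this model is then a sequence of such broadcasts, and its cost is simply how many broadcasts are made before every player can output all N input bits with high probability.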


Without noise, N broadcasts would trivially suffice for the players to
learn all bits. However, the best known protocol that deals with noise,
discovered by Gallager in 1988, uses $N\log\log N$ broadcasts. Attempts
made since then to bring the blowup down to a constant have failed. Our
main result is that Gallager's protocol is in fact optimal up to a
constant factor.



This is joint work with Navin Goyal and Michael Saks.