In Shannon's classical rate distortion problem, an encoder compresses data from a source into a message and sends it losslessly to the decoder. The decoder outputs the reconstruction corresponding to this message, which must meet a required fidelity criterion. Practical considerations give rise to several interesting extensions of this problem. Dobrushin and Tsybakov formulated the so-called 'remote source' problem, where the encoder observes only a noisy version of the source. Wyner and Ziv considered an extension where the decoder observes 'side information', i.e., additional data correlated with the source. In these problems, the observation channels are fixed and known. More generally, however, a designer may have to contend with limited channel knowledge. Worse still, the channel statistics may be partially controlled by an adversary.
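For reference, the two classical benchmarks mentioned above take the following standard forms (the notation here is mine, not from the abstract): Shannon's rate distortion function for a source $X$ with reconstruction $\hat X$ and distortion measure $d$, and the Wyner-Ziv function with decoder side information $Y$,
\[
R(D) \;=\; \min_{P_{\hat X \mid X}\,:\; \mathbb{E}[d(X,\hat X)] \le D} I(X;\hat X),
\qquad
R_{\mathrm{WZ}}(D) \;=\; \min_{\substack{U\,:\; U - X - Y \\ \hat x(\cdot,\cdot)\,:\; \mathbb{E}[d(X,\hat x(U,Y))] \le D}} I(X;U \mid Y),
\]
where $U - X - Y$ denotes a Markov chain and $\hat x(U,Y)$ is the decoder's reconstruction map.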
In this talk, we look at lossy compression of an 'arbitrarily varying remote source'. Here, a source broadcasts its data over a two-output channel controlled by an adversary: one output is observed by the encoder, while the other serves as side information at the decoder. The adversary knows the source data non-causally and can employ jamming strategies arbitrarily correlated with it. We study the adversarial rate distortion function of the source under randomized coding and provide upper and lower bounds on it. Interesting special cases where the two bounds coincide are discussed. (Joint work with Bikash Kumar Dey at IIT Bombay and Vinod M. Prabhakaran at TIFR Mumbai.)
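One hedged way to formalize the operational quantity just described (a sketch in my own notation, not the authors' definition): with source sequence $X^n$, encoder observation $Y^n$, and decoder side information $Z^n$ produced by the adversarially controlled channel, the adversarial rate distortion function under randomized coding is the smallest rate $R$ for which there exist randomized encoder-decoder pairs $f_n : \mathcal{Y}^n \to \{1,\dots,2^{nR}\}$ and $g_n : \{1,\dots,2^{nR}\} \times \mathcal{Z}^n \to \hat{\mathcal{X}}^n$ satisfying
\[
\limsup_{n\to\infty} \; \sup_{\text{adversary strategies}} \; \mathbb{E}\!\left[ d\!\left( X^n, \, g_n\!\left(f_n(Y^n), Z^n\right) \right) \right] \le D ,
\]
where the supremum runs over jamming strategies that may depend non-causally on $X^n$, and the expectation includes the shared randomness of the code.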