In information theory, rate-distortion theory addresses the problem of determining the minimal number of bits per symbol, measured by the rate $R$, that must be communicated over a channel so that the source (input signal) can be approximately reconstructed at the receiver (output signal) without exceeding a given distortion $D$. In this talk we will discuss rate distortion from the viewpoint of one-shot (non-asymptotic) information theory.
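As a point of reference, the classical asymptotic answer to this problem is Shannon's rate-distortion function; the one-shot setting studies non-asymptotic refinements of this quantity. In the standard notation (not fixed in the abstract itself), $X$ denotes the source, $\hat{X}$ the reconstruction, $d$ a distortion measure, and $I$ the mutual information:
\[
R(D) \;=\; \min_{\substack{p(\hat{x}\mid x)\,:\\ \mathbb{E}[d(X,\hat{X})] \,\le\, D}} I(X;\hat{X}).
\]
The minimization is over all conditional distributions (test channels) $p(\hat{x}\mid x)$ whose expected distortion stays within the budget $D$.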