The amount of information lost in sub-Nyquist uniform sampling of a continuous-time Gaussian stationary process is quantified. We first derive an expression for the mean square error in reconstructing the process under a given sampling structure, as a function of the sampling frequency and the average number of bits describing each sample. We call this function the distortion-rate-frequency function. It is obtained by reverse water-filling over the spectral density associated with the minimum-variance reconstruction of the undersampled Gaussian process, plus the error in that reconstruction. We then optimize over the sampling structure and find the optimal pre-sampling filter, which depends on the statistics of the input signal and on the sampling frequency. This yields an expression for the minimal distortion achievable under any uniform sampling scheme. The expression is evaluated for several examples to illustrate the fundamental tradeoff among rate, distortion, and sampling frequency derived in this work, which lies at the intersection of information theory and signal processing.
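The reverse water-filling step underlying the distortion-rate-frequency function can be sketched numerically. The snippet below is an illustrative implementation, not the paper's code: it discretizes a spectral density, bisects on the water level θ until the rate R(θ) = ∫ max(0, ½ log₂(S(f)/θ)) df matches a target, and returns the corresponding distortion D(θ) = ∫ min(θ, S(f)) df. The flat band-limited spectrum used as an example is a hypothetical test case.

```python
import numpy as np

def reverse_waterfill(S, df, R_target):
    """Reverse water-filling over a discretized spectral density.

    S        : array of spectral density values on a uniform frequency grid
    df       : spacing of that grid
    R_target : target rate in bits per unit time

    Returns the distortion achieved at R_target, found by bisecting
    on the water level theta (rate is decreasing in theta).
    """
    def rate(theta):
        return np.sum(np.maximum(0.0, 0.5 * np.log2(S / theta))) * df

    lo, hi = 1e-12, float(S.max())
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if rate(mid) > R_target:
            lo = mid  # rate too high: raise the water level
        else:
            hi = mid  # rate too low: lower the water level
    theta = 0.5 * (lo + hi)
    # Distortion: spectrum clipped at the water level, integrated over f
    return np.sum(np.minimum(theta, S)) * df

# Hypothetical example: unit-height spectrum supported on |f| <= 1/2,
# so total power is 1 and D(R) = 2**(-2R) in closed form.
f = np.linspace(-0.5, 0.5, 1001)
df = f[1] - f[0]
S = np.ones_like(f)
D = reverse_waterfill(S, df, R_target=0.5)
```

For this flat spectrum the closed-form answer at R = 0.5 bit per unit time is D = 2^(-1) = 0.5, which the discretized computation reproduces up to grid error.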