In this work we introduce Power Spectral Density Sampling (PSDS), a new method for building a reduced-rank approximation to a stationary kernel matrix. We apply this method within the framework of Gaussian processes (GPs). The resulting model can be trained online: as each training sample arrives, it can be incorporated into the model using only O(r²) time and storage, where r is the number of spectral samples used in the approximation. For n training points, the overall cost is O(nr²) time and O(r²) storage. The resulting GP is properly defined, and its hyperparameters can be selected by evidence maximization. Although the method is especially well suited to low-dimensional problems, when allowed to run in batch mode and learn the locations of the spectral samples it can match and even outperform other state-of-the-art (batch) sparse GP methods on higher-dimensional datasets. We verify this on several large datasets.
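To make the cost claims concrete, here is a minimal sketch of the general idea of spectral sampling for a stationary kernel with online O(r²) updates. The paper's exact PSDS construction is not reproduced; this uses the classical random-Fourier-feature approximation (sampling frequencies from the kernel's power spectral density via Bochner's theorem), with all names and parameter values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# The RBF kernel k(x, x') = exp(-||x - x'||^2 / (2 * ell^2)) has a Gaussian
# power spectral density; by Bochner's theorem, sampling r frequencies from
# it yields a rank-r feature map approximating the kernel matrix.
d, r, ell, noise = 2, 100, 1.0, 0.1
omega = rng.normal(scale=1.0 / ell, size=(r, d))   # spectral samples
phase = rng.uniform(0.0, 2.0 * np.pi, size=r)

def features(x):
    """Map one input x (shape (d,)) to r random Fourier features."""
    return np.sqrt(2.0 / r) * np.cos(omega @ x + phase)

# Online training: each sample touches only the r x r information matrix A
# and the r-vector b, so per-sample cost is O(r^2) time and O(r^2) storage.
A = noise**2 * np.eye(r)   # regularized by the noise variance
b = np.zeros(r)

def update(x, y):
    global b
    phi = features(x)
    A[...] = A + np.outer(phi, phi)   # O(r^2) rank-one update
    b = b + phi * y

def predict(x):
    w = np.linalg.solve(A, b)         # posterior mean weights
    return features(x) @ w

# Fit a smooth function from streaming samples.
for _ in range(1000):
    x = rng.uniform(-2, 2, size=d)
    update(x, np.sin(x[0]) + 0.5 * x[1])
```

After the stream, `predict` gives the approximate GP posterior mean; the total cost over n samples is O(nr²) time with O(r²) storage, matching the complexity stated above.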