We consider the problem of learning an arbitrary function from the non-smooth class of functions of bounded variation. We establish bounds on the prediction error of sequential algorithms in several scenarios. First, we show that for any algorithm there exist a sequence of samples and a function that together force a unit error at every step. If the samples are i.i.d. uniform, then there exists an algorithm whose expected error is O(log n/n) for every function of bounded variation; conversely, for any algorithm there exists a function for which the expected error is Ω(1/n). We then introduce a mixed sampling setting in which an adversary chooses the points but the actual samples are drawn uniformly from a δ-neighborhood of each chosen point. We show that there exists an algorithm whose expected cumulative error is O(√n log n) for any function of bounded variation and any sequence of adversary-chosen points. Finally, we derive results for uniform sampling with noisy observations. For noise sequences in ℓ1, the order of growth of the expected error is unchanged. For i.i.d. zero-mean, finite-variance noise, however, there exists an algorithm with expected error O(1/n^{1/3}), and for i.i.d. Gaussian noise every algorithm has expected error Ω(1/√n).