High-dimensional statistical inference requires the recovery/estimation of a large number of parameters from a possibly much smaller number of samples. A growing body of recent work has established that this is possible provided (a) the signal possesses an appropriate low-dimensional structure, and (b) the sampling is "incoherent", i.e., does not suppress this structure. Popular structural assumptions include sparsity, block-sparsity, and low rank, and popular recovery approaches include regularization via convex penalties, alternating projections, and greedy methods. However, in the existing literature, analysis for each combination of structure and method has proceeded on a case-by-case basis. In this work we provide a unified framework that broadly characterizes when incoherence enables consistent estimation in the high-dimensional setting. Specifically, we provide general definitions of structure and incoherence, and then establish that incoherence guarantees successful recovery (exact in the noiseless case, and approximate in the noisy case) for two broad classes of methods: (a) appropriate convex regularization, and (b) a new algorithm, Generalized Projections, that we propose. We identify several existing results that are recovered as special cases of each of our results. Our work builds on the recent framework for convex regularizers by Negahban et al.; in particular, one of our results characterizes, in the presence of incoherence, a crucial constant that they define but do not evaluate in general. Finally, we extend our framework to the case of multiple superimposed structures, for which we define a new inter-structure notion of incoherence: the Restricted Orthogonality Property.