Given a channel and an input process, we study the minimum randomness of those input processes whose output statistics approximate the original output statistics with arbitrary accuracy. We introduce the notion of the resolvability of a channel, defined as the number of random bits required per channel use to generate an input that achieves arbitrarily accurate approximation of the output statistics for any given input process. We obtain a general formula for resolvability that holds regardless of the channel memory structure. We show that, for most channels, resolvability is equal to Shannon capacity. By-products of our analysis are a general formula for the minimum achievable (fixed-length) source coding rate of any finite-alphabet source, and a strong converse of the identification coding theorem, which holds for any channel that satisfies the strong converse of the channel coding theorem.
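For orientation, the formula can be sketched in information-spectrum notation (the symbols below are standard shorthand supplied as an illustrative sketch of the form of the result, not a quotation of the paper's statement). Writing $\overline{I}(\boldsymbol{X};\boldsymbol{Y})$ for the sup-information rate of an input process $\boldsymbol{X}$ through the channel, i.e., the limsup in probability of the normalized information density
\[
  \frac{1}{n}\,\log\frac{P_{Y^n\mid X^n}(Y^n\mid X^n)}{P_{Y^n}(Y^n)},
\]
the resolvability takes the form
\[
  S \;=\; \sup_{\boldsymbol{X}}\,\overline{I}(\boldsymbol{X};\boldsymbol{Y}),
\]
which coincides with the Shannon capacity whenever the channel satisfies the strong converse of the channel coding theorem. Analogously, the minimum achievable fixed-length source coding rate of a finite-alphabet source $\boldsymbol{X}$ is its sup-entropy rate $\overline{H}(\boldsymbol{X})$, the limsup in probability of $\frac{1}{n}\log\bigl(1/P_{X^n}(X^n)\bigr)$.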