We consider the problem of one-way communication when the recipient does not know the distribution from which messages are drawn exactly, but instead has a prior distribution that is known to be close to the source distribution, a problem first considered by Juba et al. This problem generalizes the classical source coding problem of information theory, first considered in Shannon's work, in which the receiver knows the source distribution exactly. The uncertain priors coding problem was intended to illuminate aspects of natural language communication, and it has applications to adaptive compression schemes. We ask how much longer the messages must be, compared to the source coding problem, to cope with the sender's uncertainty about the receiver's prior and the receiver's uncertainty about the source distribution. We obtain lower bounds for one-way communication using uncertain priors that are tight up to low-order terms. Specifically, we consider two variants of the uncertain priors problem. First, we consider the original setting of Juba et al., in which the receiver is required to recover the message correctly with probability 1. We find that in this setting, the overhead of 2 log α + O(1) bits achieved by their scheme when the prior is α-close to the source is optimal up to an additive O(log log α) bits. We also consider a setting introduced in the work of Haramaty and Sudan, in which the receiver is permitted to fail to recover the message with some positive probability ϵ. In this setting, we find that the optimal overhead is essentially log α + log 1/ϵ bits: we present both a variant of the coding scheme of Juba et al. with an overhead of log α + log 1/ϵ + 1 bits, and a lower bound that matches up to an additive O(log log α) bits.
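For concreteness, the notion of α-closeness standard in this line of work (we assume the same convention is intended here; the body of the paper should be consulted for the precise definition) requires the source distribution P and the prior Q to satisfy, for every message m,

\[
\frac{1}{\alpha} \;\le\; \frac{P(m)}{Q(m)} \;\le\; \alpha,
\]

so that α bounds the multiplicative distortion of the prior relative to the source, and the log α and 2 log α overhead terms above measure the price of this distortion in bits.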
Our lower bounds are obtained by observing that a worst-case, one-way communication complexity problem can be embedded in the sources and priors that any uncertain priors coding scheme must address.