Neural activity in the cortex is notoriously noisy. A widely accepted explanation for this observation is that excitatory and inhibitory inputs to downstream neurons are balanced such that the upstream population activity does not affect the mean but only the variance of the input current. This can be thought of as a multiplicative noise channel. However, the capacity limits imposed by this information channel are not known. Here we develop a general understanding of the encoding process in terms of scale mixture processes and derive information-theoretic bounds on their performance. Our results show that signal transmission via instantaneous changes in the variance can behave quite differently from the common additive noise channel. We perform systematic numerical analyses to maximize the information transmitted across the variance channel and thereby obtain tight lower bounds on its capacity. Furthermore, we find that additional noise, resembling unreliable synaptic transmission of spikes, can surprisingly enhance the coding performance of the channel.
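For concreteness, one way to formalize such a variance channel (the notation here is ours, assuming a Gaussian scale mixture as a minimal instance of the scale mixture processes above) is

\[
Y = \sigma(X)\, Z, \qquad Z \sim \mathcal{N}(0,1), \qquad
p(y \mid x) = \mathcal{N}\!\bigl(y;\, 0,\, \sigma^2(x)\bigr),
\]

so that the marginal output density \(p(y) = \int p(y \mid x)\, p(x)\, dx\) is a scale mixture of Gaussians: the signal \(X\) enters only through the conditional variance \(\sigma^2(x)\), never through the conditional mean. Under this assumption, a lower bound on the channel capacity follows from maximizing the mutual information \(I(X;Y) = h(Y) - h(Y \mid X)\) over input distributions \(p(x)\), which is what the numerical analyses described above aim to do.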