Much recent work on natural information has focused on probabilistic theories, which construe natural information as a matter of probabilistic relations between events or states. This paper assesses three variants of probabilistic theories (due to Millikan, Shea, and Scarantino and Piccinini). I distinguish between probabilistic theories as (1) attempts to reveal why probabilistic relations are important for human and non-human animals and as (2) explications of the information concept(s) employed in the sciences. I argue that the strength of probabilistic theories lies in the first project. Probability-raising can enable organisms to draw specific inferences they could not otherwise entertain, and I show exactly how such relations help to explain the behaviour of organisms. In addition, probability-raising warrants these inferences by providing incremental inductive support.