In a standard Bayesian model of beliefs, an agent starts out with a prior distribution over a set of possible states, and then updates to a new distribution, in principle using all the information that the agent has ever acquired.
> The odds form of Bayes's Rule states that the prior odds times the likelihood ratio equals the posterior odds. We can take the log of both sides of this equation, yielding an equivalent equation which uses addition instead of multiplication.
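The quoted rule is easy to check numerically. Here is a minimal sketch (the prior odds and likelihood ratio are made-up numbers for illustration) showing that multiplying odds by a likelihood ratio is the same as adding their logs:

```python
import math

# Hypothetical numbers for illustration only
prior_odds = 1 / 3          # prior odds P(H) : P(not-H) of 1:3
likelihood_ratio = 4.0      # P(E | H) / P(E | not-H)

# Odds form of Bayes's Rule: posterior odds = prior odds * likelihood ratio
posterior_odds = prior_odds * likelihood_ratio

# Taking log2 of both sides turns the multiplication into addition of bits
log2_posterior = math.log2(prior_odds) + math.log2(likelihood_ratio)
assert math.isclose(posterior_odds, 2 ** log2_posterior)

# Converting odds back to a probability: odds / (1 + odds)
posterior_prob = posterior_odds / (1 + posterior_odds)
print(posterior_prob)  # 4:3 odds -> probability 4/7, about 0.571
```

So a 2-bit update (log2 of a 4:1 likelihood ratio) just adds 2 to the prior log-odds of about -1.585 bits.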
I'm guessing that this "change in probabilities per bit," which is maximal for topics on which we know 1 bit, has a name and (in this setting, at least) a formula. Does it?
Absolutely agree with this article. Well written.
Thanks Konrad and Robin.
> The odds form of Bayes's Rule states that the prior odds times the likelihood ratio equals the posterior odds. We can take the log of both sides of this equation, yielding an equivalent equation which uses addition instead of multiplication.
https://arbital.com/p/bayes...
I know of no name. It is the change in prob per percent change in bits that is maximal for ~1 bit, not the change per bit.
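On the "not the change per bit" part, a quick numeric sketch (my own check, not from the thread) supports this: writing the probability as p(b) = 1 / (1 + 2^-b) with log-odds b measured in bits, the slope dp/db is largest at b = 0 (p = 0.5), not at 1 bit:

```python
def p(b):
    # Probability as a function of log-odds b, measured in bits
    return 1 / (1 + 2 ** -b)

def dp_db(b, h=1e-6):
    # Central finite-difference estimate of the slope dp/db
    return (p(b + h) - p(b - h)) / (2 * h)

slopes = {b: dp_db(b) for b in [0.0, 0.5, 1.0, 2.0, 3.0]}
peak = max(slopes, key=slopes.get)
print(peak)  # the change in probability per bit peaks at b = 0
```

So the raw change per bit is maximal when you know nothing (0 bits, even odds); any quantity peaking near 1 bit has to be a different ratio, such as a change per proportional change in bits.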