Asked by jw7642 on MathOverflow, November 24, 2021
My question is about f-divergences and Richard Jeffrey’s (1965) rule for updating probabilities in the light of partial information.
The set-up:
The concept of an f-divergence seems to be a fairly natural generalization of the Kullback-Leibler divergence. Minimizing the Kullback-Leibler divergence between prior and posterior probability functions is known to agree with Jeffrey's Rule (Williams, 1980).
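For concreteness, the definitions I have in mind (writing $q$ for the prior and $p$ for the candidate posterior):

$$D_f(p, q) = \sum_{x} q(x)\, f\!\left(\frac{p(x)}{q(x)}\right), \qquad f \text{ convex},\ f(1) = 0,$$

which reduces to the Kullback-Leibler divergence for $f(t) = t\log t$. Jeffrey's Rule says that if the partial information fixes the new probabilities of a partition $\{E_1, \dots, E_n\}$ at $p(E_i) = k_i$, then the updated probability of any event $A$ is

$$p(A) = \sum_{i=1}^{n} k_i \, q(A \mid E_i).$$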
Here is where I get stuck.
I have seen it written that "minimizing an arbitrary f-divergence subject to the constraint $p(E_i) = k_i$ is equivalent to updating by Jeffrey's Rule". However, I can only find proofs going in one direction, namely that minimizing any f-divergence subject to that constraint agrees with Jeffrey's Rule (e.g., Diaconis and Zabell, 1982, Theorem 6.1).
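As a quick numerical sanity check of that direction, here is a sketch I put together (the prior, the partition, and the values $k_i$ are made up, and I picked one strictly convex f-divergence, the $\chi^2$-divergence): minimizing it under the constraint returns the Jeffrey posterior to numerical precision.

```python
# Numerical sanity check (my own sketch): minimizing a strictly convex
# f-divergence subject to p(E_i) = k_i should return the Jeffrey posterior.
# The prior q, the partition, and the values k_i below are made up.
import numpy as np
from scipy.optimize import minimize

q = np.array([0.1, 0.2, 0.3, 0.4])                 # prior on four atoms
partition = [np.array([0, 1]), np.array([2, 3])]   # E_1 = {0,1}, E_2 = {2,3}
k = np.array([0.7, 0.3])                           # constraint: p(E_i) = k_i

def f_chi2(t):
    # chi-squared generator: strictly convex with f(1) = 0
    return (t - 1.0) ** 2

def f_divergence(p, q):
    # D_f(p, q) = sum_x q(x) f(p(x)/q(x))
    return np.sum(q * f_chi2(p / q))

# Equality constraints p(E_i) = k_i (here they already force sum(p) = 1).
constraints = [{"type": "eq", "fun": lambda p, idx=idx, ki=ki: p[idx].sum() - ki}
               for idx, ki in zip(partition, k)]
bounds = [(1e-9, 1.0)] * len(q)

res = minimize(f_divergence, x0=q.copy(), args=(q,),
               bounds=bounds, constraints=constraints)

# Jeffrey's Rule: p(x) = k_i * q(x) / q(E_i) for x in E_i.
p_jeffrey = np.empty_like(q)
for idx, ki in zip(partition, k):
    p_jeffrey[idx] = ki * q[idx] / q[idx].sum()

print("constrained minimizer:", np.round(res.x, 4))
print("Jeffrey posterior:    ", np.round(p_jeffrey, 4))
```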
Q: Is it also true that only f-divergences agree with Jeffrey's Rule? Or might there be some non-f-divergence $\mathcal{D}$ such that minimizing it subject to the same constraint also agrees with Jeffrey's Rule?
Any pointers would be awesome.
Refs:
Diaconis, P. and Zabell, S. (1982). "Updating Subjective Probability." Journal of the American Statistical Association, 77, 822-830.
Jeffrey, R. C. (1965). The Logic of Decision. McGraw-Hill.
Williams, P. M. (1980). "Bayesian Conditionalisation and the Principle of Minimum Information." The British Journal for the Philosophy of Science, 31, 131-144.