But it is closer to absolute than you make it sound here. There are information theoretic models which are “universal” with respect to a class; that is, they are essentially as good as any model in that class, on every individual case you apply them to, even if different cases are best described by distinct models from that class.

E.g. the Krichevsky–Trofimov (KT) estimator is, for each individual Bernoulli sequence of length n, within (1/2)·log n + 1 bits of the best Bernoulli model for that sequence, so the per-symbol overhead vanishes as n grows.
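
Not from the thread, but a minimal sketch of that guarantee in Python (function names and the example sequence are my own): code the sequence sequentially with the KT predictor, then compare against the best Bernoulli model fitted in hindsight.

    import math

    def kt_codelength(bits):
        # Code length (in bits) under the KT mixture:
        # P(next = 1 | a zeros, b ones seen) = (b + 1/2) / (a + b + 1)
        a = b = 0
        total = 0.0
        for x in bits:
            p1 = (b + 0.5) / (a + b + 1.0)
            total -= math.log2(p1 if x == 1 else 1.0 - p1)
            if x == 1:
                b += 1
            else:
                a += 1
        return total

    def best_bernoulli_codelength(bits):
        # Code length under the ML-fitted Bernoulli model for this
        # specific sequence (the best model in the class, in hindsight).
        n, ones = len(bits), sum(bits)
        zeros = n - ones
        out = 0.0
        if ones:
            out -= ones * math.log2(ones / n)
        if zeros:
            out -= zeros * math.log2(zeros / n)
        return out

    seq = [1, 0, 1, 1, 1, 0, 1, 1]
    gap = kt_codelength(seq) - best_bernoulli_codelength(seq)
    bound = 0.5 * math.log2(len(seq)) + 1
    print(gap, bound)  # the gap never exceeds (1/2) log2 n + 1

The point of the example: the KT predictor never sees the whole sequence in advance, yet its total code length trails the hindsight-optimal Bernoulli model by at most a logarithmic number of bits.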

And there is a “universally universal” model: Kolmogorov complexity. It is uncomputable, and only well defined up to an additive constant, but in that sense entropy IS an absolute.
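
(For the record, “up to an additive constant” is the invariance theorem: for any two universal machines U and V there is a constant c_{U,V}, independent of x, with

    |K_U(x) - K_V(x)| <= c_{U,V}

so the choice of reference machine only shifts the complexity by a bounded amount.)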


