Kolmogorov Complexity, Data Compression, and Inference

Author: Thomas M. Cover

DOI: 10.1007/978-94-009-5113-6_2

Keywords:

Abstract: If a sequence of random variables has Shannon entropy H, it is well known that there exists an efficient description of this sequence which requires only about H bits. But complexity also has to do with inference: sequences of low complexity allow good guesses of their next terms. This is best illustrated by allowing a gambler to gamble at fair odds on such a sequence. The logarithm of the amount of money one can make is essentially the complement of the complexity with respect to the length of the sequence.
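
The gambling claim in the abstract can be illustrated numerically. The sketch below is not taken from the paper: it substitutes the Shannon-entropy special case (an i.i.d. biased coin) for Kolmogorov complexity and uses a hypothetical Laplace-rule predictor. With proportional (Kelly) betting at fair 2-for-1 odds, the logarithm of the gambler's capital grows at roughly 1 - H(p) bits per symbol, i.e., the complement of the per-symbol description length.

```python
# Illustrative sketch (not from the paper): a gambler bets at fair 2-for-1 odds
# on each bit of a compressible (biased-coin) sequence, wagering the fraction of
# capital that a simple predictor assigns to each outcome. The per-symbol log
# wealth approaches 1 - H(p), the complement of the per-symbol entropy.
import math
import random

def entropy(p):
    """Binary Shannon entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def gamble(bits, predictor):
    """Proportional betting at fair odds: on each bit the capital is multiplied
    by 2 * (fraction wagered on the outcome that occurred). Returns log2 wealth."""
    log_wealth = 0.0
    for i, b in enumerate(bits):
        q = predictor(bits[:i])            # probability assigned to the bit being 1
        fraction_won = q if b == 1 else 1.0 - q
        log_wealth += math.log2(2.0 * fraction_won)
    return log_wealth

random.seed(0)
p = 0.9                                     # heavily biased coin, so low entropy
n = 5_000
seq = [1 if random.random() < p else 0 for _ in range(n)]

# Laplace-rule predictor: estimate Pr(next bit = 1) from the bits seen so far.
laplace = lambda prefix: (sum(prefix) + 1) / (len(prefix) + 2)

log_wealth = gamble(seq, laplace)
print(f"log2 wealth per symbol : {log_wealth / n:.3f}")
print(f"1 - H(p)               : {1 - entropy(p):.3f}")
```

The two printed numbers should nearly agree, which mirrors the abstract's statement that achievable log wealth and description length are complementary.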

References (6)
C. P. Schnorr, "A unified approach to the definition of random sequences," Mathematical Systems Theory, vol. 5, pp. 246-258, 1971. DOI: 10.1007/BF01694181
Gregory J. Chaitin, "A Theory of Program Size Formally Identical to Information Theory," Journal of the ACM, vol. 22, pp. 329-340, 1975. DOI: 10.1145/321892.321894
R. J. Solomonoff, "A Formal Theory of Inductive Inference. Part II," Information and Control, vol. 7, pp. 224-254, 1964. DOI: 10.1016/S0019-9958(64)90131-7
S. Leung-Yan-Cheong and T. Cover, "Some equivalences between Shannon entropy and Kolmogorov complexity," IEEE Transactions on Information Theory, vol. 24, pp. 331-338, 1978. DOI: 10.1109/TIT.1978.1055891
R. J. Solomonoff, "A Formal Theory of Inductive Inference. Part I," Information and Control, vol. 7, pp. 1-22, 1964. DOI: 10.1016/S0019-9958(64)90223-2