Improving dictionary based data compression by using previous knowledge and interaction

Authors: Bruno Carpentieri

Abstract: State-of-the-art lossless data compressors are very efficient. While it is not possible to prove that they always achieve their theoretical limit (i.e., the source entropy), their effective performance on specific types of data often comes close to this limit. If we have already compressed a large number of messages in the past, then we can use this previous knowledge to improve the compression of the current message, and we can design algorithms that compress and decompress efficiently given this knowledge. By doing so, the fundamental coding theorem lets us substitute the entropy with the conditional entropy, and this new bound allows better compression. Moreover, if we assume the possibility of interaction between compressor and decompressor, we can exploit the source even further. The price we might pay is a low probability of communication errors. In this paper we study interactive data compression and present experimental results on textual data.
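As an illustration of the general idea (not of the specific algorithm studied in this paper), the sketch below primes an off-the-shelf dictionary compressor with previously seen messages, so that the current message can be encoded with back-references into that shared history; coding against shared side information is what lets the conditional entropy, rather than the plain entropy, bound the achievable rate. It uses Python's zlib preset-dictionary feature as a stand-in, and the history and message strings are made-up examples.

    import zlib

    # Previously transmitted messages that both the compressor and the
    # decompressor already hold; they act as shared prior knowledge.
    history = b"the quick brown fox jumps over the lazy dog. " * 20

    # Current message to be sent.
    message = b"the quick brown fox jumps over the lazy dog, again and again."

    # zlib accepts a preset dictionary of up to 32 KiB; seed it with the
    # most recent part of the history so back-references can point into it.
    zdict = history[-32768:]

    # Baseline: compression without any prior knowledge.
    plain = zlib.compress(message, 9)

    # Compression primed with the shared dictionary.
    co = zlib.compressobj(level=9, zdict=zdict)
    primed = co.compress(message) + co.flush()

    # The decompressor must be primed with the same dictionary.
    do = zlib.decompressobj(zdict=zdict)
    restored = do.decompress(primed) + do.flush()
    assert restored == message

    # The primed stream is usually shorter when the history resembles
    # the current message.
    print(len(plain), len(primed))

This sketch covers only the one-way use of previous knowledge; the interactive setting of the paper additionally lets the decompressor send information back to the compressor, at the cost of a small probability of communication errors.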

References (6)
James A. Storer, Data Compression: Methods and Theory, Computer Science Press, Inc. (1987)
Richard M. Karp, Michael O. Rabin, Efficient randomized pattern-matching algorithms, IBM Journal of Research and Development, vol. 31, pp. 249-260 (1987), 10.1147/RD.312.0249
Claude E. Shannon, Warren Weaver, Norbert Wiener, The Mathematical Theory of Communication, Physics Today, vol. 3, pp. 31-32 (1950), 10.1063/1.3067010
Bruno Carpentieri, Sending compressed messages to a learned receiver on a bidirectional line, Information Processing Letters, vol. 83, pp. 63-70 (2002), 10.1016/S0020-0190(01)00316-7
A. El Gamal, A. Orlitsky, Interactive Data Compression, 25th Annual Symposium on Foundations of Computer Science, pp. 100-108 (1984), 10.1109/SFCS.1984.715906
Warren Weaver, Claude E. Shannon, The Mathematical Theory of Communication, IEEE Transactions on Instrumentation and Measurement (1949)