Data Compression/Differencing – Tradeoff

Repetition and mapping tables are the basis of lossless compression (as in RLE, LZ77/LZ78, and Huffman coding), while lossy compression relies on techniques such as discrete cosine transform (DCT) interpolation or chroma subsampling; in both cases the goal is to represent the same data in fewer bytes.
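
To make the repetition idea concrete, here is a minimal run-length encoding (RLE) sketch in Python (an illustration, not code from this post): long runs of the same byte collapse into (count, byte) pairs.

```python
# Minimal RLE sketch: repeated bytes become (count, byte) pairs,
# so long runs shrink dramatically.
def rle_encode(data: bytes) -> bytes:
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        # Count how far the current byte repeats, capped at 255
        # so the count fits in a single byte.
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

print(rle_encode(b"AAAAAABBBC"))  # b'\x06A\x03B\x01C' -- 10 bytes become 6
```

Note that the same scheme works against you on data with no runs: each single byte becomes a two-byte pair, doubling the size.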

This can produce overhead for files with no repetition or exploitable structure, and even for lossy formats when the acceptable loss is small.

If the data is unique in every way, this overhead makes the compressed file larger than the original. In practice, though, the vast majority of files contain enough redundancy that compression saves large amounts of secondary storage per file, at the cost of a small CPU and RAM overhead during decompression.
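
A quick way to observe this tradeoff (a sketch using Python's standard zlib module, a DEFLATE implementation that combines LZ77 with Huffman coding) is to compress a repetitive payload and a random one and compare the sizes:

```python
import os
import zlib

repetitive = b"ABCD" * 25_000          # 100 KB of repeating structure
random_like = os.urandom(100_000)      # 100 KB with no exploitable pattern

for name, payload in [("repetitive", repetitive), ("random", random_like)]:
    compressed = zlib.compress(payload)
    print(f"{name:>10}: {len(payload)} -> {len(compressed)} bytes")

# Typical result: the repetitive input shrinks to a few hundred bytes,
# while the random input stays roughly the same size or grows slightly,
# due to container overhead.
```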
