Created: August 26, 2015 / Updated: December 12, 2016 / Status: in progress / 2 min read (~269 words)

Entropy is an important concept to understand if one wishes to compress data as closely as possible to the theoretical limit of lossless compression.

If your goal is maximum compression and you can tolerate losing some information, then lossy compression will let you go past the limit that lossless compression imposes.

The Shannon entropy of a discrete random variable $X$, with outcomes $x_i$, is defined as

$$ H(X) = -\sum_{i} \mathrm{P}(x_i) \log_b \mathrm{P}(x_i) $$
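As a small illustration, the formula above can be computed directly from empirical symbol frequencies (a sketch; the `entropy` helper below is not part of the original notes):

```python
import math
from collections import Counter

def entropy(data, base=2):
    """Shannon entropy of a sequence, in units of log base `base` (bits for base 2)."""
    counts = Counter(data)
    n = len(data)
    # H(X) = -sum_i P(x_i) * log_b P(x_i)
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())

# A fair coin (two equally likely symbols) carries 1 bit of entropy per flip.
print(entropy("HTHTHTHT"))  # 1.0
# A biased source carries less entropy per symbol.
print(entropy("HHHHHHHT"))  # ~0.544
```

With base $b = 2$, the result is the average number of bits per symbol that an optimal lossless code needs, which is why entropy sets the compression limit.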

  • What happens as the volume of binary information grows?
  • Is it always possible to compress binary data below a certain size, even when the data is already at peak entropy?
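The second question can be probed empirically. The sketch below (my own experiment, using Python's standard `zlib` as a representative lossless compressor) compares how well near-peak-entropy data and low-entropy data compress:

```python
import os
import zlib

# Uniformly random bytes are already near peak entropy.
random_data = os.urandom(100_000)
# A repetitive string has very low entropy per byte.
repetitive_data = b"ab" * 50_000

compressed_random = zlib.compress(random_data, 9)
compressed_repetitive = zlib.compress(repetitive_data, 9)

# Random data does not shrink: the output is about as large as the input
# (often slightly larger, due to format overhead).
print(len(random_data), "->", len(compressed_random))
# Low-entropy data shrinks dramatically.
print(len(repetitive_data), "->", len(compressed_repetitive))
```

This matches the theory: no lossless compressor can, on average, shrink data that is already at peak entropy, since entropy is exactly the lower bound on bits per symbol.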

  1. https://en.wikipedia.org/wiki/Entropy_(information_theory)