
entropy

Entropy is a term from thermodynamics that has been adopted in information technology, where it denotes the average information content of a message. Since the information content of a message depends essentially on the message source, it is also referred to as source entropy; it is the central quantity of source coding.
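
As a rough illustration (not part of the original article), the following Python sketch estimates the entropy of a message from its symbol frequencies using the Shannon formula H = -Σ p·log2(p) in bits per symbol; the sample text and the function name are assumptions chosen only for this example.

```python
# Minimal sketch: estimating the entropy of a message from its symbol
# frequencies, H = -sum(p_i * log2(p_i)), measured in bits per symbol.
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# English text stays well below the ~4.7 bits/symbol a 26-letter alphabet
# could carry; the difference is the redundancy discussed in the article.
print(round(shannon_entropy("entropy measures information content"), 2))
```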

However, the entropy of a message must also be seen in relation to the transmission channel, that is, to its channel capacity. If the entropy of the source is lower than the capacity of the transmission channel, the messages can be transmitted without errors; in the opposite case, transmission errors are unavoidable.
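
This relation can be sketched as a simple comparison of the source information rate with the channel capacity; the function and the numbers below are purely illustrative assumptions, not part of the original text.

```python
# Hedged sketch: a source whose information rate (entropy per symbol times
# symbol rate) stays at or below the channel capacity can be transmitted
# without errors; otherwise the messages are error-prone.
def transmissible(entropy_bits_per_symbol: float,
                  symbols_per_second: float,
                  channel_capacity_bps: float) -> bool:
    source_rate = entropy_bits_per_symbol * symbols_per_second
    return source_rate <= channel_capacity_bps

# Illustrative numbers only: 4.1 bit/symbol at 1000 symbols/s needs 4100 bit/s.
print(transmissible(4.1, 1000, 4800))   # True  -> error-free transmission possible
print(transmissible(4.1, 1000, 3600))   # False -> messages are error-prone
```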

Numerically, the source entropy corresponds to the average information content per symbol of a message. An information source that exploits 80% of its maximum possible information content has a relative entropy of 0.8. The redundancy of the resulting message is 100% minus this value, i.e. 20%.
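
The 80% example can be reproduced with a short calculation; the 26-letter alphabet used here is an assumption for illustration only.

```python
# Sketch of the 80 % example: relative entropy is the actual entropy divided
# by the maximum entropy of the alphabet; redundancy is what remains.
from math import log2

def relative_entropy(actual_entropy: float, alphabet_size: int) -> float:
    return actual_entropy / log2(alphabet_size)

max_h = log2(26)               # about 4.7 bit/symbol for a 26-letter alphabet
h = 0.8 * max_h                # a source using 80 % of that capability
rel = relative_entropy(h, 26)  # 0.8
print(f"relative entropy: {rel:.1f}, redundancy: {1 - rel:.1%}")  # 0.8, 20.0%
```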

Entropy coding methods are used for lossless compression: after decompression, the data corresponds exactly to the original data. Lossless compression achieves particularly high compression ratios for text, and lower ratios for image, video and audio data.
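
A small sketch with Python's standard zlib module (which combines LZ77 with Huffman entropy coding) illustrates both properties: the original data is recovered exactly, and redundant text compresses far better than high-entropy random data. The sample inputs are assumptions for demonstration only.

```python
# Lossless compression demo: decompression restores the original bit for bit,
# and low-entropy (redundant) text shrinks much more than random bytes.
import os
import zlib

text = b"entropy entropy entropy redundancy redundancy redundancy " * 50
random_bytes = os.urandom(len(text))   # nearly maximum entropy, barely compressible

compressed = zlib.compress(text)
assert zlib.decompress(compressed) == text   # lossless: original fully recovered

print(len(text), len(compressed))                           # high compression ratio
print(len(random_bytes), len(zlib.compress(random_bytes)))  # almost no gain
```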
