# Fundamental research in error-correcting coding (Hamming Distance/Hamming Codes)

Hamming Distance and Hamming Codes

In daily life, distances are measured by counting the minimum number of meters necessary to move from one place to another.

Analogous to this, in the digital world of strings of zeros and ones, i.e. bits, the "Hamming distance" is the minimum number of bits that must be changed to move from one string to another. This use of the term distance was first introduced by R.W. Hamming in the early 1950s and has been widely applied ever since in areas extending from communications engineering to information science. For example, in the typical problem of face recognition using artificial intelligence, the similarity between two faces can be measured in terms of the Hamming distance.
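The definition above can be sketched directly in code: the Hamming distance of two equal-length bit strings is simply the count of positions at which they differ. This is a minimal illustration, not drawn from the original text:

```python
def hamming_distance(a: str, b: str) -> int:
    """Count the positions at which two equal-length bit strings differ."""
    if len(a) != len(b):
        raise ValueError("strings must have equal length")
    return sum(x != y for x, y in zip(a, b))

# "1011101" and "1001001" differ in two positions (indices 2 and 4).
print(hamming_distance("1011101", "1001001"))  # → 2
```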

Hamming, however, originally introduced this measure of distance in connection with the correction of errors in the transmission, storage, and processing of binary signals. The larger the number of transmission errors, the larger the Hamming distance between the string of bits that has been transmitted and the one that has been incorrectly received. Therefore Hamming used for transmission only such strings of bits as had a Hamming distance more than twice as large as the distance that could arise from transmission errors. This allows a mistake to be recognized and even corrected. A set of strings of bits with a large mutual Hamming distance forms a "Hamming code." Together with Shannon's information theory, Hamming's work created the basis for the general theory of error correction. Its importance lies above all in the fact that arbitrarily complex systems can be assembled with a given reliability from less reliable parts.

The theory of error correction, with its extremely complex mathematics, is indispensable for the proper functioning of every CD player and mobile telephone, but also of every mainframe computer and worldwide communication network. Only through the use of such codes can the inevitable errors in the transmission, storage, and processing of information be rectified.

Prof. Dr. Hans Dieter Lüke

RWTH Aachen