for his pioneering contributions to the field of information and coding theory, particularly for his information theory for individual sequences, which led to the universal lossless Lempel-Ziv data compression algorithm.
Contributions to Information and Coding Theory
Professor Jacob Ziv of the Technion in Haifa, Israel, is receiving the Eduard Rhein Prize for several groundbreaking publications on the theories of information and communication technologies. Each of these publications has spawned new avenues of research and novel applications. To people outside of the field, terms like concatenated channel coding, the Ziv-Zakai bound of estimation theory, and complexity measures for strings of symbols may appear esoteric. Therefore, the following example is
provided to illustrate one of Ziv’s achievements – a text coding procedure that every Internet user relies on today.
Assume that you enter a text into your PC in order to process it, save it, or send it over the Internet afterwards. This text is initially converted to a string of ones and zeros, which are known as bits. Saving and sending the text gets faster and cheaper as the coding procedure works with fewer and fewer bits. Samuel Morse, one of the early pioneers of communication technology, recognized this when he assigned especially short Morse signals to frequently occurring letters of the alphabet. Cutting down
on the amount of data that has to be transmitted is called data compression. Even greater efficiency can be achieved if one takes frequently occurring longer character strings into account. In normal English texts the character string "the" is much more frequent than, say, the string "das", which means that one can assign bits more sparingly. In his seminal work on information theory in 1948, Claude E. Shannon (1991 Eduard Rhein Prize) showed that a minimum number of bits per character exists for
such data compression if the compression is to be lossless. However, approximating this "entropy", i.e. the minimum number of bits per character, with the method sketched above requires inordinate effort. Furthermore, such coding depends very much on the type of text and especially on the language, as the examples "the" and "das" show.
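Shannon's lower bound can be estimated directly from the letter frequencies of a sample text. The following short sketch (an illustration added here, not part of the original laudation) computes the entropy in bits per character; the sample string is an arbitrary choice:

```python
from collections import Counter
from math import log2

def entropy_bits_per_char(text):
    """Shannon entropy of the character distribution: a lower bound on
    the average number of bits per character for lossless coding."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A text with repeated characters needs far fewer bits per character
# than the 8 bits of a plain byte encoding (here about 2 bits/char):
print(round(entropy_bits_per_char("abracadabra"), 3))
```

A text consisting of a single repeated character has entropy zero, while a text in which all characters occur equally often attains the maximum of log2 of the alphabet size.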
A coding procedure offered by Jacob Ziv (who developed the concept and provided the theoretical basis) and Abraham Lempel (who developed the programming algorithm) solves this problem in a very elegant way. Any additional text is encoded by searching dynamically linked memory for already encoded segments – the longer they are, the better – and by using these as prefixes for new code words, for example "the" as a prefix of "them". Such a coding procedure is universal, that is, independent of the type and
language of the text. As Ziv was able to show in the 1970s, for uniform texts it approaches the ideal goal – the "entropy" as the minimum number of bits in lossless data compression – astonishingly quickly. The original text can then be recovered using an equally simple procedure.
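The prefix idea can be sketched in a few lines. The toy implementation below (a simplified LZ78-style variant, not the production algorithm) builds a dictionary of previously seen phrases and emits each new phrase as a pair: the index of its longest already-known prefix plus one new character. Decoding rebuilds the same dictionary, so the original text is recovered exactly:

```python
def lz78_encode(text):
    """Encode text as (prefix index, next char) pairs, LZ78-style."""
    dictionary = {"": 0}            # phrase -> index; index 0 is the empty prefix
    phrase, out = "", []
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch            # keep extending the longest known prefix
        else:
            out.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                      # flush a trailing phrase with no new char
        out.append((dictionary[phrase], ""))
    return out

def lz78_decode(pairs):
    """Rebuild the phrase dictionary and concatenate the phrases."""
    phrases = [""]
    for idx, ch in pairs:
        phrases.append(phrases[idx] + ch)
    return "".join(phrases[idx] + ch for idx, ch in pairs)
```

Encoding "the them" first stores the single characters, then reuses "th" and "e" as prefixes – mirroring the "the"/"them" example above. Because the dictionary is learned from the data itself, the scheme needs no prior knowledge of the text's language.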
Experts consider this coding procedure the most important step to date in the development of text coding.
Prof. Dr. Hans Dieter Lüke, RWTH Aachen