World War II was a crucible of technological innovation, including advances in statistics. Jacob Wolfowitz, born a century ago (1920), studied the problem of noisy radio transmissions. Coded radio transmissions were critical elements of military command and control, and they were plagued by atmospheric and other interference – “noise”. The weaker the transmission and the longer the distance, the more likely the signal was to be lost in the noise. When the human ear and brain hear and interpret a noisy transmission, they can mitigate some of the signal loss by filling in information, using nearby context as a crutch. When the transmission is a letter-by-letter code, however, context is unhelpful until the message is decoded. The critical parameter, therefore, is the rate of loss of individual letters, and the goal is to minimize that loss.

Wolfowitz helped elaborate a statistical technique to be applied to the coding algorithm with this in mind. His theoretical description envisioned representing each word in what today’s text miners would call a dictionary (e.g., the vocabulary used in military operations) by a numerical sequence. If each numerical sequence is sufficiently distant from all the others, the chance of correctly recovering a corrupted word is maximized, because a garbled sequence is unlikely to be confused with the sequence of a similar word. For more, read his monograph Coding Theorems of Information Theory.
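The idea of keeping codewords far apart can be sketched in a few lines of Python. This is an illustration of minimum-distance decoding in general, not Wolfowitz's specific construction; the words and codewords in the codebook below are hypothetical.

```python
def hamming(a, b):
    """Count the positions at which two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical dictionary: each word is assigned a binary codeword,
# chosen so that every pair of codewords differs in at least 4 positions.
codebook = {
    "advance": "0000000",
    "retreat": "1111000",
    "hold":    "0011110",
    "regroup": "1100110",
}

def decode(received):
    """Guess the word whose codeword is nearest to the noisy input."""
    return min(codebook, key=lambda w: hamming(codebook[w], received))

# One flipped bit in "retreat"'s codeword still decodes correctly,
# because the corrupted sequence remains closer to "retreat" than
# to any other codeword.
print(decode("1011000"))  # retreat
```

Because the minimum pairwise distance here is 4, any single-bit error leaves the received sequence closer to the intended codeword than to any other, which is exactly the sense in which spacing the sequences apart reduces confusion.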
Wolfowitz worked closely with Abraham Wald during the war years, and the two were responsible for a number of theoretical contributions to probability and statistics, especially in the area of nonparametric statistics.
Jacob Wolfowitz’s son, Paul, carried on the martial tradition in a new way; he was one of the key architects of the U.S. invasion of Iraq.