Timeline Of Information Theory

A timeline of events related to information theory, quantum information theory, data compression, error correcting codes and related subjects.

- 1872 – Ludwig Boltzmann presents his H-theorem, and with it the formula Σ *p*<sub>i</sub> log *p*<sub>i</sub> for the entropy of a single gas particle.
- 1878 – J. Willard Gibbs defines the Gibbs entropy: the probabilities in the entropy formula are now taken as probabilities of the state of the *whole* system.
- 1924 – Harry Nyquist discusses quantifying "intelligence" and the speed at which it can be transmitted by a communication system.
- 1927 – John von Neumann defines the von Neumann entropy, extending the Gibbs entropy to quantum mechanics.
- 1928 – Ralph Hartley introduces Hartley information as the logarithm of the number of possible messages, with information being communicated when the receiver can distinguish one sequence of symbols from any other (regardless of any associated meaning).
- 1929 – Leo Szilard analyses Maxwell's Demon, showing how a Szilard engine can sometimes transform information into the extraction of useful work.
- 1940 – Alan Turing introduces the deciban as a measure of information inferred about the German Enigma machine cypher settings by the Banburismus process.
- 1944 – Claude Shannon's theory of information is substantially complete.
- 1947 – Richard W. Hamming invents Hamming codes for error detection and correction.
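The entropy formulas in the 1872–1878 entries share the form that Shannon later adopted for information. A minimal sketch in Python (function name and examples are illustrative, not from the timeline) computes this entropy in bits, and shows how Hartley information falls out as the special case of equally likely messages:

```python
import math

def shannon_entropy(probs):
    """Entropy H = -sum p_i * log2(p_i), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: 1 bit of uncertainty per toss.
print(shannon_entropy([0.5, 0.5]))       # 1.0

# Hartley information: with N equally likely messages the
# entropy reduces to log2(N) -- here N = 4 gives 2 bits.
print(shannon_entropy([0.25] * 4))       # 2.0
```

The base of the logarithm only fixes the unit (base 2 gives bits; natural log gives nats, the unit in Boltzmann's and Gibbs's statistical-mechanics setting).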
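To make the 1947 entry concrete, here is a sketch of the classic Hamming(7,4) code, which encodes 4 data bits into 7 bits and can correct any single-bit error (the function names are illustrative; this is one standard bit layout, not Hamming's original notation):

```python
def hamming74_encode(data):
    """Encode 4 data bits as a 7-bit Hamming(7,4) codeword.

    Positions 1, 2, 4 hold parity bits; 3, 5, 6, 7 hold data.
    """
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(codeword):
    """Recompute the parity checks; the syndrome is the 1-based
    position of a single flipped bit (0 means no error)."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1
    return c

word = hamming74_encode([1, 0, 1, 1])
corrupted = word[:]
corrupted[2] ^= 1                         # flip one bit "in transit"
assert hamming74_correct(corrupted) == word
```

Because each parity bit covers a distinct subset of positions, the three check results read off the error position in binary, which is what lets a single flipped bit be located and repaired.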
