Revising Zipf’s law [From PNAS]

From the Abstract: “We demonstrate a substantial improvement on one of the most celebrated empirical laws in the study of language, Zipf’s 75-y-old theory that word length is primarily determined by frequency of use. In accord with rational theories of communication, we show across 10 languages that average information content is a much better predictor of word length than frequency. This indicates that human lexicons are efficiently structured for communication by taking into account interword statistical dependencies. Lexical systems result from an optimization of communicative pressures, coding meanings efficiently given the complex statistics of natural language use.”

[ HT to Paul Kedrosky ]
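The paper's core comparison can be illustrated with a toy sketch. This is a minimal, hypothetical illustration only: it uses a tiny made-up corpus and a bigram model with add-one smoothing, whereas the paper works from large Google n-gram corpora across 10 languages. The idea is the same: a word's "average information content" is its surprisal, −log₂ P(word | context), averaged over all the contexts in which it actually occurs, and this quantity (rather than raw frequency) is what the authors find best predicts word length.

```python
import math
from collections import Counter

# Toy corpus standing in for the large n-gram corpora used in the paper.
corpus = ("the cat sat on the mat the cat ran to the mat "
          "a dog sat on a log the dog ran to the log").split()

# Unigram and bigram counts.
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def mean_information(word):
    """Average surprisal -log2 P(word | previous word), weighted by how
    often each context occurs (add-one smoothing for illustration)."""
    vocab = len(unigrams)
    total, n = 0.0, 0
    for (prev, w), count in bigrams.items():
        if w == word:
            p = (count + 1) / (unigrams[prev] + vocab)
            total += -math.log2(p) * count
            n += count
    return total / n if n else float("nan")

# Compare the two candidate predictors of word length side by side.
for word in sorted(unigrams):
    print(f"{word:>4}  len={len(word)}  freq={unigrams[word]}  "
          f"mean_info={mean_information(word):.3f}")
```

With a real corpus, the paper's finding is that `mean_information` correlates with word length better than `freq` does; in this toy example the numbers merely show how the two predictors can diverge for words of equal frequency.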

Claude Shannon – Father of Information Theory

This summer in the Complex Systems Advanced Academic Workshop we are devoting attention to information theory. In collecting some materials about Claude Shannon, I came across the above video and thought I would share it with others. Here is the description … “Considered the founding father of the electronic communication age, Claude Shannon’s work ushered in the Digital Revolution. This fascinating program explores his life and the major influence his work had on today’s digital world through interviews with his friends and colleagues.”

A Quantum Calculation: Is Information at the Root of Everything?

From the article “… Vlatko Vedral, an Oxford physicist, examines the claim that bits of information are the universe’s basic units, and the universe as a whole is a giant quantum computer. He argues that all of reality can be explained if readers accept that information is at the root of everything.”