[Claude] Shannon said “Look, here’s what information is. Let’s say I want to navigate from one part of the city to another, from A to B, in a car. I could just drive around randomly. It would take an awful long time to get there, but I might eventually get there.
Alternatively, I could give you a map or driving directions, and you’d get there very efficiently. And the difference between the time taken to get there randomly and the time taken to get there with directions is a measure of information.”
And Shannon mathematised that concept and said, “That is the reduction of uncertainty. You start off not knowing where to go, you get information in the form of a map or driving directions, and then you get there directly.” He formalised that, and he called that information.
And it’s the opposite of what Boltzmann and Gibbs were talking about. Instead of a system going from an ordered into a disordered state – the billiard balls on the table starting maybe in a lattice and ending up randomly distributed – it’s going from a state of randomness, because you don’t know where to go, to becoming ordered.
And so it turns out that Shannon realised that information is in fact the negative of thermodynamic entropy. And it was a beautiful connection that he made between what we now think of as the science of information and what was the science of statistical physics.
David Krakauer – Making Sense with Sam Harris, Episode #40
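Shannon’s “reduction of uncertainty” can be made concrete with a small calculation. The sketch below (a hypothetical illustration, not from the episode) measures uncertainty as Shannon entropy in bits: with no directions, any of 16 equally likely routes might be the right one; with directions, the route is known. The information gained is the difference between the two entropies.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before directions: any of 16 routes is equally likely.
n_routes = 16
before = shannon_entropy([1 / n_routes] * n_routes)

# After directions: we know exactly which route to take.
after = shannon_entropy([1.0])

information_gained = before - after
print(information_gained)  # → 4.0 bits
```

The result, 4.0 bits, is log2(16): exactly the number of yes/no questions needed to pick one route out of sixteen. The map collapses that uncertainty to zero, which is why Shannon could treat information as the negative of entropy.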
Information brings order: it makes it far more likely that things will be in the right places at the right times (and indeed, it defines what those things, places and times are), faster and more efficiently than they otherwise would be.
How does your organisation create a framework of meaning through expressing its vision, mission and values?
What information do you create and distribute to achieve your goals?
How could you organise and express key information better? (This is information about information).