In his 1948 paper "A Mathematical Theory of Communication," Claude Shannon quantified information. His central idea is often summarized as:
“The information in a message is inversely proportional to its probability. The more surprising a message, the more information it contains.”
What is Shannon telling us with this description? Does it help define just what information is? And how, exactly, does it let us quantify information?
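One way to make the inverse relationship concrete is Shannon's measure of self-information, I(x) = -log₂ p(x), expressed in bits: the lower a message's probability, the larger its information content. A minimal sketch (the function name is illustrative, not from any particular library):

```python
import math

def self_information(p: float) -> float:
    """Self-information in bits: I(p) = -log2(p).
    Rare (low-probability) messages carry more information."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A certain message (p = 1) carries no information at all.
print(self_information(1.0))    # 0.0 bits
# A fair coin flip carries exactly one bit.
print(self_information(0.5))    # 1.0 bit
# A 1-in-8 event is more surprising, so it carries more bits.
print(self_information(0.125))  # 3.0 bits
```

Note that the relationship is logarithmic rather than strictly "inversely proportional": halving a message's probability adds one bit rather than doubling the count.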