Information and (Information-Theoretic) Entropy

Doug Elias (dme7@cornell.edu)
Sun, 29 Mar 1998 00:27:11 +0100 (MET)

G'day ...

A recent posting from [spking1@mindspring.com (Stephen Paul King)], in
response to a query regarding "existence requires non-existence",
included the following statement:

" ... information and entropy are inversely related: if you have a
maximum-entropy solution, then you have a minimum-information
solution, and vice versa, according to Kosko ..."

I just want to point out that the type of "information" being
described here is different from the one commonly used in Information
Theory, where "information" is what results from an experimental
trial. In that sense, maximum entropy prior to the trial is
equivalent to maximum information resulting from the trial. This is
immediately obvious when you consider how much information is derived
(i.e., how "surprised" one is) from a totally certain event: such an
event has the absolute minimum entropy (0), and performing the trial
yields no surprise and no new information at all.
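To make this concrete, here is a small sketch (my own illustration, with the usual Shannon definitions, not anything from Kosko) showing that a fair coin, the maximum-entropy two-outcome distribution, yields the most information per trial, while a certain event carries zero entropy and zero surprise:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) over the distribution, in bits.

    Terms with p == 0 contribute nothing (the 0*log 0 convention); the
    max() clamps the -0.0 that floating point produces for a certain event.
    """
    return max(0.0, -sum(p * math.log2(p) for p in probs if p > 0))

def surprisal(p):
    """Information (in bits) gained by observing an outcome of probability p."""
    return -math.log2(p)

# Fair coin: maximum entropy for two outcomes, 1 bit per flip.
print(entropy([0.5, 0.5]))   # 1.0
print(surprisal(0.5))        # 1.0

# Totally certain event: zero entropy, and observing it tells us nothing.
print(entropy([1.0]))        # 0.0
print(surprisal(1.0))        # 0.0
```

Note that the expected surprisal over the distribution is exactly the entropy, which is why "maximum entropy before the trial" and "maximum information from the trial" are the same statement.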

Ta,
doug

-- 
  __|_        Internet:   dme7@cornell.edu
dr _|_)oug    USmail:     Director of Technology
  (_|                     Parker Center/Johnson Grad. School of Mgmt.
  (_|__lias               226 Malott Hall/C.U./Ithaca/N.Y./14853-3801
    |         Phone:      607-255-3521   Fax: 607-254-8886