Tuesday, February 2, 2010

Formal Requirements for the Average Uncertainty

(An Introduction to Information Theory, Fazlollah M. Reza)

Shannon's approach, as well as that of several other authors, in suggesting a suitable H function has been to some extent directed toward an axiomatic description of such functions. The desired H function should have the following basic properties (see the numerical sketch after the list):

1. Continuity.
That is, if the probabilities of the occurrence of events Ek are slightly changed, the measure of uncertainty associated with the system should vary accordingly in a continuous manner. (...) a slight change in the probability of the occurrence of an event should not provide us with a significantly large amount of information.

2. Symmetry.
The H function must be functionally symmetric in every pk. Indeed, the measure of uncertainty associated with a complete probability set [Ek, Ek'] must be exactly the same as the measure associated with the set [Ek',Ek]. Our measure must be invariant with respect to the order of the events.

3. Extremal Property.
When all the events are equally likely, the average uncertainty must have its largest value. In this case, it is most uncertain which event is going to occur. Conversely, once we know which specific event among a number of n equally likely events has occurred, we have acquired the largest average amount of information relevant to the occurrence of events of a universe consisting of n complete events.

4. Additivity.
Suppose that we have obtained a suitable measure of the average uncertainty H(p1,p2,...,pn) associated with a complete set of events. Now, let us assume that the event En is divided into disjoint subsets such that
(...)
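
As a numerical aside (my own sketch, not part of the book's text), the Python snippet below checks properties 1 to 4 for the entropy function of Eq. (3-11). The grouping identity used for property 4 is the standard textbook form of the additivity axiom; since the book's exact formulation is elided above, treat that detail as an assumption.

import math

def H(p, base=2.0):
    """Average uncertainty of a complete probability set p (zero terms ignored)."""
    return -sum(x * math.log(x, base) for x in p if x > 0)

# 1. Continuity: a slight change in the probabilities changes H only slightly.
assert abs(H([0.5, 0.3, 0.2]) - H([0.5001, 0.2999, 0.2])) < 1e-3

# 2. Symmetry: H is invariant under reordering of the events.
assert abs(H([0.5, 0.3, 0.2]) - H([0.2, 0.5, 0.3])) < 1e-12

# 3. Extremal property: the equally likely case maximizes H (here log2(4) = 2 bits).
assert H([0.25] * 4) > H([0.7, 0.1, 0.1, 0.1])
assert abs(H([0.25] * 4) - 2.0) < 1e-12

# 4. Additivity (standard grouping form): splitting the event of probability 0.5
#    into subsets of probability 0.3 and 0.2 adds 0.5 * H([0.6, 0.4]).
lhs = H([0.25, 0.25, 0.3, 0.2])
rhs = H([0.25, 0.25, 0.5]) + 0.5 * H([0.6, 0.4])
assert abs(lhs - rhs) < 1e-12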

Complying with properties 1 to 4 given above, or with similar requirements, one should be able to derive a functional form for the desired uncertainty function. Such treatments have appeared in the work of Feinstein, Khinchin, Shannon, Schutzenberger, and others. (...)
1. Fadiev assumes properties 1, 2, and 4 and, subsequent to several lemmas, proves that H must be of the form suggested in Eq. (3-11) except for a multiplicative constant.
H(X) = -\sum_{k=1}^{n} p_k \log p_k    (3-11)
2. Khinchin assumes properties 1, 3, and 4 and the fact that adding a null set to a complete set of events should not change its entropy, and he derives the form of Eq.(3-11) up to a positive constant multiplier.
3. Schutzenberger aims for a more general axiomatic search for a measure of information associated with a complete set of events. He shows that functions other than the Shannon-Wiener entropy of Eq. (3-11) may also be employed. An example of such a function is given in the work of R. A. Fisher. It should be pointed out, however, that the form suggested by Shannon and Wiener is certainly the simplest of all such forms. The present richness and depth of the literature of information theory are to a great extent due to the simplicity of the form of Eq. (3-11).
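
As a small illustration of the "multiplicative constant" freedom mentioned in items 1 and 2 (my example, not Reza's): changing the base of the logarithm in Eq. (3-11) merely rescales H by a positive constant, e.g. measuring it in nats rather than bits multiplies it by log 2.

import math

def H(p, base=2.0):
    return -sum(x * math.log(x, base) for x in p if x > 0)

p = [0.5, 0.25, 0.125, 0.125]
bits = H(p, base=2.0)       # 1.75 bits
nats = H(p, base=math.e)    # the same uncertainty measured in nats
assert abs(nats - bits * math.log(2)) < 1e-12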
