Tuesday, February 2, 2010

A Quantitative Measure of Information

(An Introduction to Information Theory, Fazlollah M. Reza)

In our study we deal with ideal mathematical models of communication. We confine ourselves to models that are statistically defined; that is, the most significant feature of our model is its unpredictability. The source, for instance, transmits at random any one of a set of prespecified messages. We have no specific knowledge as to which message will be transmitted next, but we know the probability of transmitting each message, or something to that effect. If the behavior of the model were predictable (deterministic), then recourse to measuring an amount of information would hardly be necessary.

When the model is statistically defined, we have no concrete assurance of its detailed performance, but we are able to describe, in a sense, its "over-all" or "average" performance in the light of its statistical description. In short, our search for an amount of information is virtually a search for a statistical parameter associated with a probability scheme. The parameter should indicate a relative measure of uncertainty relevant to the occurrence of each particular message in the message ensemble. We shall illustrate how one goes about defining the amount of information with a well-known rudimentary example. Suppose that you are faced with the selection of equipment from a catalog which lists n distinct models:

[x1, x2, ..., xn]

The desired amount of information I(xk) associated with the selection of a particular model xk must be a function of the probability of choosing xk:

I(xk) = f(P{xk})

If, for simplicity, we assume that each one of these models is selected with equal probability, then the desired amount of information is a function of n alone:

I1(xk) = f(1/n) (1-2a)

Next assume that each piece of equipment listed in the catalog can be ordered in one of m distinct colors. If for simplicity we assume that the selection of colors is also equiprobable, then the amount of information associated with the selection of a color cj among all equiprobable colors [c1, c2, ..., cm] is

I2(cj) = f(P{cj}) = f(1/m) (1-2b)

where the function f(x) must be the same unknown function used in Eq. (1-2a).

Finally, assume that the selection can be made in either of two ways:
1. Select the equipment and then select the color, the two selections being independent of each other.
2. Select the equipment and its color at the same time as one selection from mn possible equiprobable choices.

The search for the function f(x) is based on the intuitive requirement that the amount of information associated with the selection of model xk with color cj be the same in both schemes, as expressed in Eqs. (1-2c) and (1-2d).

I(xk and cj) = I1(xk) + I2(cj) = f(1/n) + f(1/m) (1-2c)

I(xk and cj) = f(1/mn) (1-2d)

Thus
f(1/n) + f(1/m) = f(1/mn)

This functional equation has several solutions, the most important of which, for our purpose, is
f(x) = -log x
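
As a quick check (my addition, not part of Reza's text), substituting f(x) = -log x back into the functional equation confirms that it is indeed a solution:

$$
f\!\left(\tfrac{1}{n}\right) + f\!\left(\tfrac{1}{m}\right)
= -\log\tfrac{1}{n} - \log\tfrac{1}{m}
= \log n + \log m
= \log mn
= -\log\tfrac{1}{mn}
= f\!\left(\tfrac{1}{mn}\right)
$$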

To give a numerical example, let n = 18 and m = 8:
I1(xk) = log 18
I2(cj) = log 8
I(xk and cj) = I1(xk) + I2(cj)
I(xk and cj) = log 18 + log 8 = log 144
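
As a sanity check on this arithmetic (a minimal Python sketch of my own, not part of the book), the same computation can be run with base-2 logarithms, so the amounts of information come out in bits; any other base works equally well, since it only changes the unit:

import math

n, m = 18, 8  # number of equipment models and of colors, as in the example

I1 = math.log2(n)           # information in selecting a model: log 18
I2 = math.log2(m)           # information in selecting a color: log 8
I_joint = math.log2(n * m)  # one combined selection among mn = 144 equiprobable choices

# Additivity: two independent selections carry the same total information
# as the single combined selection.
assert math.isclose(I1 + I2, I_joint)
print(I1, I2, I_joint)  # 4.169925001442312 3.0 7.169925001442312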

Thus, when a statistical experiment has n equiprobable outcomes, the average amount of information associated with an outcome is log n. The logarithmic information measure has the desirable property of additivity for independent statistical experiments. These ideas will be elaborated upon in Chap. 3.
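
To preview the "average" reading of that statement (a minimal sketch under my own naming; average_information is not Reza's notation), the average information of a general finite probability scheme is the probability-weighted mean of -log p over its outcomes, and for n equiprobable outcomes it reduces to log n:

import math

def average_information(probs):
    """Average information of a finite probability scheme, in bits:
    the probability-weighted mean of -log2(p) over its outcomes."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

n = 18
uniform = [1.0 / n] * n  # n equiprobable outcomes
# For equiprobable outcomes the average information equals log n.
assert math.isclose(average_information(uniform), math.log2(n))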
