IDENTIFIABILITY

One must be able to identify the coefficients/parameters of the postulated mathematical model (Chapters 2 and 9) from the given I/O data of a system under experiment (with some statistical assumptions on the noise processes that contaminate the measurements). Identifiability refers to this aspect. For this, the input to the system should be persistently exciting: the spectrum of the input signal should be broader than the bandwidth of the system that generates the data.
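As a rough illustration (not from the text), the Python sketch below generates a PRBS-like input and compares its power spectrum with the bandwidth of a hypothetical first-order system; the sampling rate, signal length, and 2 Hz bandwidth are assumed values chosen only for illustration.

```python
# Sketch: checking that an input is persistently exciting for a
# hypothetical first-order system (all parameter values are assumed).
import numpy as np
from scipy import signal

fs = 100.0                    # sampling frequency [Hz] (assumed)
N = 4096                      # number of samples (assumed)

# PRBS-like input: random +/-1 levels, each held for a few samples
rng = np.random.default_rng(0)
u = np.repeat(rng.choice([-1.0, 1.0], size=N // 4), 4)

# Hypothetical first-order system with ~2 Hz bandwidth
bw = 2.0                               # system bandwidth [Hz] (assumed)
b, a = signal.butter(1, bw, fs=fs)     # discrete-time low-pass model
y = signal.lfilter(b, a, u)            # output; (u, y) would be the I/O data

# Compare the input power spectrum with the system bandwidth
f, Puu = signal.welch(u, fs=fs, nperseg=512)
frac_above_bw = Puu[f > bw].sum() / Puu.sum()
print(f"fraction of input power above the {bw} Hz bandwidth: {frac_above_bw:.2f}")
# A broadband (persistently exciting) input keeps substantial power above
# the system bandwidth, so all of the dynamics are excited in the I/O data.
```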

B15 INFORMATION MEASURE

The concept of information measure is of a technical nature and does not directly equate to the usual emotive meaning of information. The entropy of a random variable $x$ (with probability density $p(x)$), which is closely related to its dispersion, covariance, or uncertainty, is defined as

$H_x = -E_x\{\log p(x)\}$, where $E$ is the expectation operator.

For a Gaussian $m$-vector $x$, it is given as $H_x = \tfrac{m}{2}(\log 2\pi + 1) + \tfrac{1}{2}\log|P|$, with $P$ as the covariance matrix of $x$. Entropy can be thought of as a measure of disorder or lack of information. Let $H_\beta = -E_\beta\{\log p(\beta)\}$ be the entropy prior to collecting the data $z$, where $p(\beta)$ is the prior density function of the parameter $\beta$. When data are collected we have $H_{\beta|z} = -E_{\beta|z}\{\log p(\beta|z)\}$. Then the measure of the average amount of information provided by the experiment with data $z$ and parameter $\beta$ is given by $I = H_\beta - E_z\{H_{\beta|z}\}$. This is the "mean information" in $z$ about $\beta$. The entropy reflects the dispersion or covariance of the density function and hence the uncertainty. Thus, the information is seen as the difference between the prior uncertainty and the "expected" posterior uncertainty: collecting the data $z$ reduces the uncertainty, and information is thereby gained. The information is a non-negative measure, and it is zero if $p(z, \beta) = p(z)\,p(\beta)$, i.e., if the data are independent of the parameters.
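As a rough numerical sketch (the matrices below are assumed example values, not from the text), the following Python code evaluates the Gaussian entropy formula and the information measure $I = H_\beta - E_z\{H_{\beta|z}\}$ for a linear-Gaussian experiment $z = H\beta + v$; for this model the posterior covariance does not depend on the realization of $z$, so the expected posterior entropy equals the posterior entropy.

```python
# Sketch: Gaussian entropy and mean information for a linear-Gaussian
# experiment z = H*beta + v (all numerical values are assumed).
import numpy as np

def gaussian_entropy(P):
    """H = m/2 (log 2*pi + 1) + 1/2 log|P| for an m-dimensional Gaussian."""
    m = P.shape[0]
    return 0.5 * m * (np.log(2 * np.pi) + 1) + 0.5 * np.log(np.linalg.det(P))

P0 = np.diag([4.0, 1.0])       # prior covariance of the parameter beta (assumed)
H = np.array([[1.0, 0.5]])     # measurement matrix (assumed)
R = np.array([[0.25]])         # measurement noise covariance (assumed)

# Posterior covariance from Bayes' rule; it does not depend on the measured
# value of z here, so E_z{posterior entropy} equals the posterior entropy.
P1 = np.linalg.inv(np.linalg.inv(P0) + H.T @ np.linalg.inv(R) @ H)

H_prior = gaussian_entropy(P0)   # prior uncertainty H_beta
H_post = gaussian_entropy(P1)    # expected posterior uncertainty E_z{H_beta|z}
I = H_prior - H_post             # mean information provided by the experiment

print(f"prior entropy    : {H_prior:.3f}")
print(f"posterior entropy: {H_post:.3f}")
print(f"information gain : {I:.3f}  (non-negative; zero only if z is independent of beta)")
```

In this sketch the information gain reduces to $\tfrac{1}{2}\log(|P_0|/|P_1|)$, which is non-negative because the posterior covariance is never "larger" than the prior covariance, in agreement with the statement that collecting data cannot increase the uncertainty on average.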