STATISTICAL MODELS PRODUCED FROM FISHER INFORMATION FUNCTION

Abstract

Statistics, the science of extracting information from data, is perhaps the most natural field of application for information-theoretic methods. The Fisher information I(θ) is the variance of the score; it is named in honor of its inventor, the statistician R. A. Fisher. The Fisher information measures the amount of information that an observable random variable X carries about an unobservable parameter θ upon which the likelihood function of X, L(θ) = f(X; θ), depends. The likelihood function is the joint probability of the data, the X's, conditional on the value of θ, regarded as a function of θ. Since the expectation of the score is zero, its variance is simply the second moment of the score. Hence the Fisher information can be written I(θ) = E{[∂/∂θ ln f(x; θ)]² | θ}, which implies 0 ≤ I(θ) < ∞. We discuss the regular estimation case, in which (i) the range of the random variable X does not depend on the unknown parameter θ, i.e. a ≤ x ≤ b where a and b are constants, and (ii) differentiation with respect to θ can be carried out under the integral sign in ∫ f(x; θ) dx; regularity fails when the limits of integration c(θ) and d(θ) depend on θ. Both univariate and multivariate parameters are treated, the latter via the Fisher information matrix.
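As a small illustrative sketch of the definitions above (not part of the paper's development), the following Python code computes I(θ) for a Bernoulli(θ) variable by taking the exact expectation of the squared score over x ∈ {0, 1}; the function names are chosen here for illustration. For this model the known closed form is I(θ) = 1/(θ(1 − θ)).

```python
def bernoulli_score(x, theta):
    # Score: d/dθ ln f(x; θ), where f(x; θ) = θ^x (1-θ)^(1-x)
    return x / theta - (1 - x) / (1 - theta)

def fisher_information(theta):
    # I(θ) = E{[d/dθ ln f(x; θ)]²}: exact expectation over x ∈ {0, 1}
    return sum(p * bernoulli_score(x, theta) ** 2
               for x, p in [(0, 1 - theta), (1, theta)])

def mean_score(theta):
    # The expectation of the score is zero, as stated in the abstract
    return sum(p * bernoulli_score(x, theta)
               for x, p in [(0, 1 - theta), (1, theta)])

theta = 0.3
print(fisher_information(theta))        # matches 1 / (θ(1 - θ))
print(mean_score(theta))                # zero expectation of the score
```

Because E[score] = 0 holds exactly here, the second moment computed by fisher_information is indeed the variance of the score, matching the definition of I(θ) given above.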