A minimax formulation of the parametric identification problem for input/output mapping systems, based on the entropy of the identification error, is considered, and the equivalence of such formulations is shown. The relationship is examined between the proposed minimax Shannon-entropy criterion for the identification error and the criterion of maximum mutual information (a Kullback-Leibler divergence) between the system output process and the model output process. Within this entropy framework, a measure of proximity to the Gaussian distribution is proposed, which can also serve as a test for Gaussianity of a continuous random variable that remains applicable under dependent observations.
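
As a minimal sketch of what such an entropy-based proximity measure can look like (an illustration only; the concrete construction in the paper may differ), one can use the nonnegative gap between the maximum differential entropy attainable at a given variance, (1/2) ln(2*pi*e*sigma^2), which is reached exactly by the Gaussian law, and an estimate of the sample's differential entropy. The sketch below estimates entropy with the classical Vasicek spacing estimator and, for simplicity, assumes i.i.d. samples rather than the dependent-observation setting of the paper; the names vasicek_entropy and negentropy are illustrative, not taken from the source.

    import numpy as np

    def vasicek_entropy(x, m=None):
        """Vasicek spacing estimator of differential entropy, in nats."""
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        if m is None:
            m = max(1, int(round(np.sqrt(n))))  # common rule-of-thumb window
        idx = np.arange(n)
        upper = x[np.minimum(idx + m, n - 1)]   # clamp order statistics at edges
        lower = x[np.maximum(idx - m, 0)]
        spacings = np.maximum(upper - lower, 1e-12)  # guard against ties
        return np.mean(np.log(n / (2.0 * m) * spacings))

    def negentropy(x):
        """Gap between the Gaussian maximum entropy at the sample variance
        and the estimated entropy: approximately 0 for Gaussian data,
        positive for non-Gaussian data."""
        h_gauss = 0.5 * np.log(2.0 * np.pi * np.e * np.var(x))
        return h_gauss - vasicek_entropy(x)

    rng = np.random.default_rng(0)
    print(negentropy(rng.normal(size=5000)))   # near 0: Gaussian sample
    print(negentropy(rng.laplace(size=5000)))  # clearly positive: non-Gaussian

The design choice here is that negentropy is invariant to shift and, up to estimator error, to scale, so it isolates the shape of the distribution, which is what a Gaussianity test requires.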