
Author(s): Чернышев К.Р.

Number of authors: 1

Publication details

Publication type: Journal/collection article

Title: Tsallis Relative Entropy within Statistical Linearization Problems

Electronic publication: Yes

ISBN/ISSN: 2405-8963

DOI: 10.1016/j.ifacol.2018.09.196

Source title: IFAC-PapersOnLine

Volume and issue: Vol. 51, No. 15

City: Amsterdam

Publisher: Elsevier

Year of publication: 2018

Pages: 509-514

Abstract
Statistical linearization is a class of stochastic system identification problems whose solutions are considerably influenced by the adopted measure of dependence between random values. In this context, selecting a consistent measure of dependence as the main instrumental tool plays a decisive role. In the present paper, the derivations are based on constructing the (symmetric) Tsallis relative entropy (relative entropy is also commonly referred to as mutual information) as the basis for formulating the corresponding information-theoretic criterion for the statistical linearization of a system described by an input/output mapping and driven by a Gaussian white-noise process. As a result, a constructive procedure for deriving the weight function coefficients of the linearized model is obtained. At the same time, the problem of identifiability of the initial non-linear system is solved, in the sense that the vanishing of all these coefficients is equivalent to the stochastic independence of the system input and output processes, which, in turn, means that the system is not identifiable.
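
For orientation, and as a notational assumption rather than a statement of the paper's exact construction, a standard form of the Tsallis relative entropy of order q between probability densities p and r can be written in LaTeX notation as

    D_q(p \,\|\, r) \;=\; \frac{1}{q-1}\left( \int p(x)^{q}\, r(x)^{1-q}\, dx \;-\; 1 \right),
    \qquad q > 0,\; q \neq 1 .

As q \to 1 this quantity tends to the Kullback-Leibler divergence. Taking p as the joint density of the system input and output and r as the product of their marginal densities yields the mutual-information-type measure referred to in the abstract: it is non-negative and equals zero exactly when input and output are stochastically independent, which matches the identifiability statement above. A common symmetrization is D_q(p \,\|\, r) + D_q(r \,\|\, p); whether this is the precise symmetric form used in the paper is an assumption here.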

Bibliographic reference: Чернышев К.Р. Tsallis Relative Entropy within Statistical Linearization Problems // IFAC-PapersOnLine. 2018. Vol. 51, No. 15. P. 509-514.