THESIS
2008
viii, 73 leaves : ill. ; 30 cm
Abstract
Originating in thermodynamics, the concept of entropy has been introduced into information theory, evolution, psychodynamics and dynamical systems. In most areas, entropy measures how "disordered" or "complex" a system is internally; in system theory, it reflects the internal complexity of a system. It is defined via measures in probability spaces, via open covers in compact topological spaces, and via metrics in metric spaces.
In linear system theory, however, how to define entropy remains an open problem. This thesis gives a new definition of topological entropy for linear systems. The entropy of linear endomorphisms and of discrete-time LTI systems is defined using the Lebesgue measure, and the properties of the entropy function are discussed. The new definition is also compared with the existing one; the analysis shows that the two agree whenever both are applicable. The definition for the discrete-time case is then extended to the continuous-time case.
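For orientation, the classical existing characterization for linear maps (due to Bowen) expresses the topological entropy of a linear endomorphism of R^n through its eigenvalues; this is the kind of result against which a new definition would be compared. The formula below is that standard result, not the thesis's new definition:

    h_{\mathrm{top}}(A) \;=\; \sum_{|\lambda_i| > 1} \log |\lambda_i|,

where \lambda_1, \dots, \lambda_n are the eigenvalues of A \in \mathbb{R}^{n \times n}, counted with multiplicity.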
In order to apply the concept to control problems, the entropy is quantified in both the discrete-time and continuous-time cases. It is applied to the stabilizing control problem and shown to provide a lower bound on the input energy. A further application incorporates the sampling stage of continuous-time LTI systems: the minimum stabilizing input energy in the sampled-data control problem is analyzed.
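As a rough illustration only (not taken from the thesis), the quantity that typically serves as the entropy of an LTI system is a sum over its unstable eigenvalues: log-magnitudes for discrete time, real parts for continuous time. The sketch below computes these sums with NumPy, under the assumption that the thesis's quantified entropy reduces to comparable eigenvalue sums; the function names and example matrices are hypothetical.

```python
import numpy as np

def lti_entropy_discrete(A):
    """Sum of log|lambda_i| over eigenvalues of A with |lambda_i| > 1.

    Hypothetical illustration of the eigenvalue sum that commonly plays
    the role of entropy for a discrete-time LTI system x_{k+1} = A x_k;
    not the thesis's exact definition.
    """
    eigvals = np.linalg.eigvals(np.asarray(A, dtype=float))
    return float(sum(np.log(abs(lam)) for lam in eigvals if abs(lam) > 1.0))

def lti_entropy_continuous(A):
    """Continuous-time analogue (again an assumption): sum of Re(lambda_i)
    over eigenvalues of A with Re(lambda_i) > 0."""
    eigvals = np.linalg.eigvals(np.asarray(A, dtype=float))
    return float(sum(lam.real for lam in eigvals if lam.real > 0.0))

if __name__ == "__main__":
    # Discrete-time example: eigenvalues 2 and 0.5, so entropy = log 2.
    A_d = np.array([[2.0, 0.0],
                    [0.0, 0.5]])
    print(lti_entropy_discrete(A_d))    # ~0.693

    # Continuous-time example: eigenvalues 1 and -3, so entropy = 1.
    A_c = np.array([[1.0, 0.0],
                    [0.0, -3.0]])
    print(lti_entropy_continuous(A_c))  # 1.0
```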