Abstract: In this paper, we obtain information-theoretic conditions for tracking in linear time-invariant control systems. We consider the particular case where the closed loop contains a communication channel in the feedback path. The mutual information rate between the feedback signal and the reference input signal is used to quantify how much information about the reference is available for feedback; this rate must be maximized to improve tracking performance. The mutual information rate is shown to be upper bounded by a quantity that depends on the unstable eigenvalues of the plant and on the channel capacity. If the channel capacity falls to a lower limit, the feedback signal becomes completely uncorrelated with the reference signal, rendering feedback useless. We also derive a lower bound on the expected squared tracking error in terms of the entropy of a random reference signal. We exhibit a misleading case in which the mutual information rate does not predict the expected effect of nonminimum-phase zeros. Nevertheless, the mutual information rate helps generalize the tradeoff that arises when tracking and disturbance rejection are simultaneous goals and a constrained communication channel is present in the feedback loop. Examples and simulations are provided to demonstrate some of the results.
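As a hedged illustration of the kind of eigenvalue-dependent capacity limit the abstract refers to, the sketch below computes the classical data-rate bound, under which stabilizing a discrete-time LTI plant over a channel requires the capacity (in bits per sample) to exceed the sum of log2 of the magnitudes of the plant's unstable eigenvalues. This is the standard data-rate theorem, not necessarily the paper's exact bound; the function name is an assumption for illustration.

```python
import numpy as np

def min_capacity_bits(A):
    """Classical data-rate lower bound on channel capacity (bits/sample):
    sum of log2|lambda_i| over eigenvalues of A with |lambda_i| > 1.
    (Illustrative; not necessarily the paper's exact bound.)"""
    eigs = np.linalg.eigvals(np.asarray(A, dtype=float))
    unstable = [abs(lam) for lam in eigs if abs(lam) > 1.0]
    return float(sum(np.log2(m) for m in unstable))

# Example: a plant with eigenvalues 2 and 0.5 has one unstable mode,
# so the bound is log2(2) = 1 bit per sample.
A = [[2.0, 0.0],
     [0.0, 0.5]]
print(min_capacity_bits(A))  # 1.0
```

Below this capacity, no coding or control scheme can keep the state bounded, which is consistent with the abstract's observation that at a low enough capacity the feedback signal carries no usable information about the reference.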
Funding: supported by Conacyt; the second author was partly supported by an NSF award under the FIND initiative (CNS 0626380).