Communication can often be traded for computation in control systems. A car's computer that needs to know the speed can either obtain the data from the speed sensor over the vehicle's communication network (bus), or it can calculate the speed from the initial speed and the history of throttle commands, using the laws of physics governing the car. In a fully deterministic world with powerful enough computers, communication may be redundant. In the real world, the degree of uncertainty in the physics says something about the level of communication necessary. Quantifying this communication need can support principled design and allocation of network bandwidth and other resources in vehicles and other control systems.

Uncertainty, or lack of information, is usually measured by *entropy* of some flavor. Claude Shannon developed a definition of entropy in the context of engineering telephone networks. That definition uses probability distributions, not coincidentally, to capture noise in telephone channels. In contrast, *topological entropy*, used to study the evolution of worst-case uncertainty in safety-critical systems, does not use probabilities at all. Instead, it measures the rate of growth of uncertainty in a system's state over time. The topological entropy of a stable system like a pendulum will be smaller than that of an unstable system like an inverted pendulum.

Why do we care about topological entropy? First, as entropy describes the rate of growth of state uncertainty (without new measurements), it should also somehow relate to the rate of measurements necessary to accurately estimate the state. In the speed sensor-estimator example, the entropy of the system would give the minimal channel capacity necessary for connecting the two, so the computer can construct accurate speed estimates with worst-case error bounds. These lower bounds hold across *all* algorithms and codes, and therefore, can take the guesswork out of communication network design. As more devices feed into shared networks, entropy bounds can guide allocation of bandwidth to different processes.

Second, state estimation is a building block for monitoring, failure detection, and runtime verification. The entropy-based bandwidth requirements transfer to those functions as well. Finally, bounding the uncertainty in the system's state is also a core issue in *verification*—the discipline of constructing correctness proofs about a system's behavior. Entropy can also be used to derive lower bounds on the number of tests or simulation samples needed for constructing such proofs from simulation data.

One can think of topological entropy as the exponential growth rate of the number of system trajectories distinguishable with finite precision. Although this definition is mathematically elegant, it is difficult to calculate this quantity for a given system, model, or program. Linear systems are an exception: the topological entropy for such systems *can* be calculated exactly from the real parts of the eigenvalues of the system's matrix. For general nonlinear and hybrid models, control theorists have developed results for computing upper and lower bounds on topological entropy using Jacobians and matrix measures, which can be used, with some loss, in place of the actual entropy.
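To make the linear case concrete: for a continuous-time linear system dx/dt = Ax, a classical result gives the topological entropy as the sum of the positive real parts of A's eigenvalues (measured in nats per unit time; multiply by log2(e) for bits). A minimal sketch, assuming NumPy; the function name `linear_entropy_bits` is my own:

```python
import numpy as np

def linear_entropy_bits(A):
    """Topological entropy (bits per unit time) of the continuous-time
    linear system dx/dt = A x: the sum of the positive real parts of
    A's eigenvalues, converted from nats to bits."""
    eigs = np.linalg.eigvals(A)
    return float(sum(l.real for l in eigs if l.real > 0.0)) * np.log2(np.e)

# A saddle: one expanding direction (rate +1) and one contracting (rate -1).
# Only the unstable eigenvalue contributes to the entropy.
A = np.array([[1.0, 0.0], [0.0, -1.0]])
print(linear_entropy_bits(A))
```

The stable direction contributes nothing: uncertainty shrinks along it, so no communication is needed to track it in the worst case.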

Why is the following paper foundational? The main result in this paper gives a method for computing the topological entropy of a broad class of systems called *switched linear systems.* These are systems that evolve according to *some* linear dynamics at each point in time, but over time the specific linear dynamics being followed jumps (or *switches*) among a known, finite set of possibilities. For example, consider a vehicle that moves according to linear (or linearized) laws of physics, with a program that picks the control inputs, such as throttle and steering, from a finite bank of possibilities every so often. The paper gives a sharp result showing the entropy of such systems, over all possible switchings, can be computed *exactly* in terms of the *joint spectral radius* of that finite set of system matrices. This result adds to our knowledge base an expressive class of systems for which entropy can be calculated. I recommend that theoretically inclined readers read the short proof, which uses a couple of equivalent definitions of entropy and properties of the joint spectral radius. The latter object is well-studied and can be calculated using numerical analysis toolboxes. The paper is also interesting because it gives a practical sensor-estimator algorithm for implementing state estimation for switched linear systems, and the data rate used matches the entropy bound.
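For intuition about the joint spectral radius itself: a textbook characterization is the limit, as the product length k grows, of the largest norm of any length-k product of the matrices, taken to the power 1/k; for a submultiplicative norm, every finite k yields an upper bound. A brute-force sketch of that bound, assuming NumPy (`jsr_estimate` is a name I made up, and the exhaustive enumeration is for illustration only, not one of the practical toolbox methods the text mentions):

```python
import itertools
from functools import reduce

import numpy as np

def jsr_estimate(matrices, max_len=8):
    """Upper estimate of the joint spectral radius of a finite set of
    matrices: for each product length k, take the k-th root of the
    largest spectral norm over all length-k products, and keep the
    smallest bound found. Cost is exponential in max_len."""
    best = float("inf")
    for k in range(1, max_len + 1):
        worst = max(
            np.linalg.norm(reduce(np.matmul, prod), 2)
            for prod in itertools.product(matrices, repeat=k)
        )
        best = min(best, worst ** (1.0 / k))
    return best

# Two modes: one doubles the first coordinate, the other doubles the second.
# The worst switching signal sticks with a single mode, giving radius 2.
A1 = np.diag([2.0, 0.5])
A2 = np.diag([0.5, 2.0])
print(jsr_estimate([A1, A2]))
```

With a single matrix in the set, the quantity reduces to the ordinary spectral radius, which is one way to see how the paper's result generalizes the eigenvalue formula for (unswitched) linear systems.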

What can we expect to see in the future? Following this line of research, one obvious next step would be to obtain data-rate lower bounds for controlling different classes of switched systems. The current methods require calculation of derivatives and Jacobians, which curtails their application to realistic simulation models that often use heterogeneous numerical analysis functions. A fruitful research direction would be to combine the topological entropy methods with automatic differentiation techniques to get practical data-rate bounds from simulation programs.
