A solution to a mathematical problem that lingered unsolved for more than 50 years could help deliver faster computer algorithms for many problems in physics and signal processing. However, it may take years for mathematicians to fully digest the result, which was first published online three years ago.
The roots of the problem defined by Richard Kadison and Isadore Singer in the late 1950s lie in attempts to give the physics of quantum mechanics a footing in abstract mathematics. The concept it deals with traces back to Werner Heisenberg’s initial work on quantum mechanics. Heisenberg used matrix mathematics to develop his model of the quantum world, which holds that certain pairs of properties of a physical system at the microscopic level, such as position and momentum, cannot both be measured accurately at the same time.
A decade later, John von Neumann applied the theory of operators on Hilbert spaces to the problem of reconciling what physicists had postulated about quantum mechanics with mathematics. One result of this work was the development of a specialized type of algebra known as C* (pronounced “see-star”) that could model quantum states. In this algebra, Kadison and Singer formulated a question of abstract mathematics that mirrored one of the key issues facing quantum mechanics: does a complete set of measurements of a system’s observable properties determine its state uniquely, including the properties that cannot be measured?
They proved the answer was ‘no’ for one class of measurements: those referring to continuous properties such as momentum. They suspected, but could not prove, that the answer would also be ‘no’ for discrete situations, such as energy states.
For close to 40 years, attempts to resolve the Kadison-Singer problem within the discipline of C*-algebras failed. Yet work in seemingly unrelated fields of mathematics turned up other angles of attack on the problem.
Petter Brändén, a mathematician working at the KTH Royal Institute of Technology in Sweden, says, “Kadison-Singer is a very old problem. If it was going to be solved, it was going to be better to attack from another angle. You see it quite a lot in mathematics: in order to solve these older problems, you need to use new tools or have a brilliant idea.”
The path to a solution to Kadison-Singer lay in mathematicians seeing connections between different branches of the discipline. Adam Marcus, an assistant professor at Princeton University, worked with Daniel Spielman and Nikhil Srivastava at Yale University to solve the Kadison-Singer problem. He says, “Looking back, essentially the problem was hard to solve because it demands two different approaches.
“One of the parts could be dealt with using techniques in functional analysis, but the other problem was a combinatorial one. Unless you have someone who knew the techniques from both areas or a way of coming together, it probably wasn’t going to get solved.”
The first clear step on the path to solving Kadison-Singer was taken close to 40 years ago, with a conjecture developed by Joel Anderson, then working at Pennsylvania State University, about partitioning a specific type of matrix into multiple sub-matrices that fit within certain limits. This “paving conjecture,” Anderson argued, was equivalent to the Kadison-Singer problem. Its advantage, according to Pete Casazza, Curators’ Distinguished Professor at the University of Missouri, is that it moved the original problem away from C*-algebras and into the much more heavily researched field of operator theory.
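In one standard formulation (paraphrased here, not quoted from Anderson’s paper), the conjecture says that for every tolerance ε > 0 there is a fixed number of pieces r, depending only on ε, that always suffices:

```latex
% Paving conjecture, one standard formulation (paraphrased): for every
% \varepsilon > 0 there exists r such that every n-by-n matrix T with zero
% diagonal admits diagonal projections Q_1, \dots, Q_r satisfying
\[
  \sum_{i=1}^{r} Q_i = I
  \qquad\text{and}\qquad
  \| Q_i T Q_i \| \le \varepsilon \, \| T \| \quad \text{for } i = 1, \dots, r.
\]
```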
Just over a decade ago, Casazza began to notice an increasing number of problems resembling that of Kadison and Singer. Having moved from pure mathematics where he dealt with Banach spaces (which are closely related to C*-algebras), Casazza started working on signal-processing problems, such as separating audio signals from each other. “One of the leaders is Hans Feichtinger. He wrote to me and asked me about the vectors we could use to do signal analysis. I recognized that this problem was at least similar to the Kadison-Singer problem. It took a few years before I realized they were the same problem, and then I began to find other problems that were similar to Feichtinger’s,” says Casazza.
With colleagues, Casazza published a paper in 2006 that described a number of related problems collected from different fields, spanning audio processing, building error tolerance into Internet protocols, and several branches of pure mathematics.
“I wasn’t looking for equivalence with Kadison-Singer: I was looking for an area that might provide a powerful solution to our problem,” says Casazza. “But at a meeting, someone asked me, ‘when did you start to make a living out of finding conjectures similar to Kadison-Singer?’”
Two years later, Gil Kalai, the Henry and Manya Noskwith Professor of Mathematics at the Hebrew University of Jerusalem, while visiting Yale University in his role as an adjunct professor, spoke to Spielman about his group’s work on manipulating the matrices that describe the connections between points on a graph.
The primary focus of the Yale group is developing “sparsifiers,” techniques for reducing the number of connections in the graphs often used to model networks, electrical circuits, and mechanical systems.
Sparsifying a graph lets problems based on it be solved faster without losing overall accuracy. Matrix mathematics provides a way to represent the information in a graph, which connects the work to many problems in physics that rely on solving systems of simultaneous equations, themselves naturally expressed in matrix form.
“Our main motivation for this was developing very fast systems for solving systems of linear equations and Laplacian matrices,” Spielman says. “We are trying to come up with faster algorithms, and sparse is a big hammer for that.”
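As a minimal illustration of the kind of computation involved (a toy sketch in Python, not the Yale group’s solver; the small weighted graph here is made up for the example), the snippet below builds a graph’s Laplacian matrix and solves a linear system on it, the operation sparsification is designed to speed up when graphs have millions of edges.

```python
# Toy sketch: build the Laplacian of a small weighted graph and solve L x = b.
# Sparsification aims to make this kind of solve fast on much larger graphs;
# this example only illustrates the matrix representation, not the fast solver.
import numpy as np

# Hypothetical edge list: (node_i, node_j, weight) for a 4-node graph.
edges = [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.0), (3, 0, 1.0), (0, 2, 0.5)]
n = 4

# The graph Laplacian: degrees on the diagonal, negated weights off-diagonal.
L = np.zeros((n, n))
for i, j, w in edges:
    L[i, i] += w
    L[j, j] += w
    L[i, j] -= w
    L[j, i] -= w

# Laplacians are singular (each row sums to zero), so solve in the
# least-squares sense for a right-hand side that sums to zero, as in
# electrical-flow problems on the graph.
b = np.array([1.0, 0.0, 0.0, -1.0])
x, *_ = np.linalg.lstsq(L, b, rcond=None)
print(x)
```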
Kalai told Spielman the work looked similar to Anderson’s paving conjecture and, with some further development, could be used to solve the Kadison-Singer problem.
“We looked it up and thought, ‘it won’t take very long.’ We were very wrong,” Spielman noted. “To prove Kadison-Singer, we had to go through a very different kind of route to the one we had been taking.”
Spielman points out the relatively short proof hides many blind alleys that he, Marcus, and Srivastava investigated on their way to the final proof. “There were similarities, but our first path didn’t work. We spent a while trying to modify our original proof, but eventually failed.
“For most of the five years we were working on it, the types of proof we had looked quite different to the original work we did.”
Says Marcus, “A lot of techniques that we picked were based on experimentation. We acted like scientists: based on the information, we decided how to proceed.”
Spielman adds, “It makes a large difference to be able to experiment. In general, I do a lot of computational experiments to figure out what should be true and find counterexamples to conjectures.”
The core of the final proof revolves around the properties of specialized types of polynomials. These interlacing polynomials require the roots of two otherwise independent polynomials to interleave, so that between any two consecutive roots of one polynomial lies a root of the other. The polynomials also need to fall within a class known as “real stable.” Matrix functions such as determinants provide suitable polynomials. Spielman says the computational work gave the group the confidence to pursue this route to the eventual Kadison-Singer proof.
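A toy example (chosen for illustration, not taken from the proof itself) shows what the interlacing condition means:

```latex
% Toy illustration of interlacing: the roots of p are 1 and 3, the roots of q
% are 2 and 4, and the two sets alternate along the real line, 1 < 2 < 3 < 4.
\[
  p(x) = (x-1)(x-3), \qquad q(x) = (x-2)(x-4).
\]
% In the proof, the relevant polynomials are characteristic polynomials such
% as \det(xI - A), whose roots are the eigenvalues of the matrix A.
```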
“The theory around stable polynomials and hyperbolic polynomials is not new, but it has been more heavily developed over the past 10 years. Stable polynomials had been used to prove other conjectures and had already been proven to be useful for a wide set of problems,” says Brändén, who, with the late Julius Borcea, worked on the properties of these polynomials. “The approach gives you ways of deforming polynomials, so you can start with a determinant and deform it so that it’s not a determinant any more, but you still have control over the roots.”
Although it marks a breakthrough in pure mathematics, the Marcus-Spielman-Srivastava proof is not a constructive proof, and so does not provide a guide to developing algorithms that can be used in signal processing, physics, or other areas. The proof’s novel approach also presents problems.
Says Casazza, “People in my area are not experts in the tools these guys use, so they are not trying to use those tools to go further. We would really like a constructive proof of Kadison-Singer or its equivalents.”
What the proof does deliver is increased confidence that such a constructive proof can be discovered, and that it might come from the same angle of attack.
“The algorithmic problem is simpler now,” says Marcus, pointing out the availability of even a non-constructive proof means there is a target at which to aim.
Many of the problems being worked on seek to minimize the difference between the original graph and an approximate version that has been split into two or more partitions. “You used to be asking what the minimum value could be,” Marcus says. “Now you know. You just need an algorithm that finds the minimum. You don’t have to worry whether it’s a minimum or not. Having the knowledge that these things exist opens up the possibility for different algorithms. Previously you could find a solution, but couldn’t say that the solution is good.”
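In one common formulation (Weaver’s KS_2 conjecture, which the proof settles; the constant is quoted here only approximately), the guarantee Marcus refers to takes roughly this form:

```latex
% If vectors v_1, \dots, v_m satisfy
\[
  \sum_{i=1}^{m} v_i v_i^{*} = I
  \qquad\text{and}\qquad
  \| v_i \|^{2} \le \delta \ \text{ for all } i,
\]
% then \{1, \dots, m\} can be split into two sets S_1, S_2 with
% (constant approximate, from one published version of the bound)
\[
  \Bigl\| \sum_{i \in S_j} v_i v_i^{*} \Bigr\|
  \le \frac{\bigl(1 + \sqrt{2\delta}\,\bigr)^{2}}{2},
  \qquad j = 1, 2.
\]
```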
Spielman and colleagues have explored the theme of interlacing polynomials in a series of four papers, with a fifth in preparation. The work has begun to point to algorithms for other problems, although the original sparsification path still provides the most hope for faster matrix solvers.
“At first, I didn’t think an algorithmic proof would be available from this work, but now there is work that gives hope to that possibility,” Marcus says.
Brändén adds, “I believe we will see this method applied to more problems. Quite recently it was proven applicable to the traveling-salesman problem, and I think that the theory can be developed further.”
One lingering issue may lie in the specific type of polynomials on which the Marcus-Spielman-Srivastava proof of Kadison-Singer relies.
“The types of problems for which these polynomials can be used are very specialized, but we hope they will become more generic,” says Marcus. “It may be that these polynomials really are very specific. I tend to think that’s not true, but if we can’t make it more generic, the proof probably won’t be used as much as it could.”
Further Reading
Marcus, A., Spielman, D., and Srivastava, N.
Interlacing Families II: Mixed Characteristic Polynomials and the Kadison–Singer Problem, Annals of Mathematics, 182 (2015), Issue 1. Preprint: http://arxiv.org/abs/1306.3969
Casazza, P., and Tremain, J.
Consequences of the Marcus/Spielman/Srivastava Solution of the Kadison-Singer Problem, New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing, and Multifractal Analysis, ed. Akram Aldroubi, Carlos Cabrelli, Stephane Jaffard, Ursula Molter (2016). Preprint: http://arxiv.org/pdf/1407.4768v2.pdf
Borcea, J. and Brändén, P.
The Lee-Yang and Pólya-Schur Programs, II: Theory of Stable Polynomials and Applications, Comm. Pure Appl. Math., 62:1595–1631 (2009).
Kadison, R.V. and Singer, I. M.
Extensions of Pure States, American Journal of Mathematics, Vol. 81, No. 2 (April 1959), 383–400.