‘Outsiders’ Crack a 50-Year-Old Math Problem
By Erica Klarreich


In 2008, Daniel Spielman told his Yale University colleague Gil Kalai about a computer science problem he was working on, concerning how to “sparsify” a network so that it has fewer connections between nodes but still preserves the essential features of the original network.


Network sparsification has applications in data compression and efficient computation, but Spielman’s particular problem suggested something different to Kalai. It seemed connected to the famous Kadison-Singer problem, a question about the foundations of quantum physics that had remained unsolved for almost 50 years.

Over the decades, the Kadison-Singer problem had wormed its way into a dozen distant areas of mathematics and engineering, but no one seemed to be able to crack it. The question “defied the best efforts of some of the most talented mathematicians of the last 50 years,” wrote Peter Casazza and Janet Tremain of the University of Missouri in Columbia, in a 2014 survey article.

As a computer scientist, Spielman knew little of quantum mechanics or the Kadison-Singer problem’s allied mathematical field, called C*-algebras. But when Kalai, whose main institution is the Hebrew University of Jerusalem, described one of the problem’s many equivalent formulations, Spielman realized that he himself might be in the perfect position to solve it. “It seemed so natural, so central to the kinds of things I think about,” he said. “I thought, ‘I’ve got to be able to prove that.’” He guessed that the problem might take him a few weeks.




Three computer scientists (from left), Nikhil Srivastava, Adam Marcus, and Daniel Spielman, solved the famous Kadison-Singer problem in 2013. Courtesy of Adam Marcus

Instead, it took him five years. In 2013, working with his postdoc Adam Marcus, now at Princeton University, and his graduate student Nikhil Srivastava, now at the University of California, Berkeley, Spielman finally succeeded. Word spread quickly through the mathematics community that one of the paramount problems in C*-algebras and a host of other fields had been solved by three outsiders ― computer scientists who had barely a nodding acquaintance with the disciplines at the heart of the problem.

Mathematicians in these disciplines greeted the news with a combination of delight and hand-wringing. The solution, which Casazza and Tremain called “a major achievement of our time,” defied expectations about how the problem would be solved and seemed bafflingly foreign. Over the past two years, the experts in the Kadison-Singer problem have had to work hard to assimilate the ideas of the proof. Spielman, Marcus and Srivastava “brought a bunch of tools into this problem that none of us had ever heard of,” Casazza said. “A lot of us loved this problem and were dying to see it solved, and we had a lot of trouble understanding how they solved it.”

“The people who have the deep intuition about why these methods work are not the people who have been working on these problems for a long time,” said Terence Tao, of the University of California, Los Angeles, who has been following these developments. Mathematicians have held several workshops to unite these disparate camps, but the proof may take several more years to digest, Tao said. “We don’t have the manual for this magic tool yet.”

Computer scientists, however, have been quick to exploit the new techniques. Last year, for instance, two researchers parlayed these tools into a major leap forward in understanding the famously difficult traveling salesman problem. There are certain to be more such advances, said Assaf Naor, a mathematician at Princeton who works in areas related to the Kadison-Singer problem. “This is too profound to not have many more applications.”
A Common Problem

The question Richard Kadison and Isadore Singer posed in 1959 asks how much it is possible to learn about a “state” of a quantum system if you have complete information about that state in a special subsystem. Inspired by an informally worded comment by the legendary physicist Paul Dirac, their question builds on Werner Heisenberg’s uncertainty principle, which says that certain pairs of attributes, like the position and the momentum of a particle, cannot simultaneously be measured to arbitrary precision.

Kadison and Singer wondered about subsystems that contain as many different attributes (or “observables”) as can compatibly be measured at the same time. If you have complete knowledge of the state of such a subsystem, they asked, can you deduce the state of the entire system?




Richard Kadison (left), pictured at the 1970 International Congress of Mathematicians in Nice, France, and Isadore Singer posed a mathematics problem in 1959 that stood unsolved for more than 50 years. Left: Konrad Jacobs, Archives of the Mathematisches Forschungsinstitut Oberwolfach; Right: Courtesy of Isadore Singer

In the case where the system you’re measuring is a particle that can move along a continuous line, Kadison and Singer showed that the answer is no: There can be many different quantum states that all look the same from the point of view of the observables you can simultaneously measure. “It is as if many different particles have exactly the same location simultaneously―in a sense, they are in parallel universes,” Kadison wrote by email, although he cautioned that it’s not yet clear whether such states can be realized physically.

Kadison and Singer’s result didn’t say what would happen if the space in which the particle lives is not a continuous line, but is instead some choppier version of the line―if space is “granular,” as Kadison put it. This is the question that came to be known as the Kadison-Singer problem.

Based on their work in the continuous setting, Kadison and Singer guessed that in this new setting the answer would again be that there are parallel universes. But they didn’t go so far as to state their guess as a conjecture―a wise move, in hindsight, since their gut instinct turned out to be wrong. “I’m happy I’ve been careful,” Kadison said.

Kadison and Singer―now at the University of Pennsylvania and the Massachusetts Institute of Technology (emeritus), respectively―posed their question at a moment when interest in the philosophical foundations of quantum mechanics was entering a renaissance. Although some physicists were promoting a “shut up and calculate” approach to the discipline, other, more mathematically inclined physicists pounced on the Kadison-Singer problem, which they understood as a question about C*-algebras, abstract structures that capture the algebraic properties not just of quantum systems but also of the random variables used in probability theory, the blocks of numbers called matrices, and regular numbers.

C*-algebras are an esoteric subject―“the most abstract nonsense that exists in mathematics,” in Casazza’s words. “Nobody outside the area knows much about it.” For the first two decades of the Kadison-Singer problem’s existence, it remained ensconced in this impenetrable realm.

Then in 1979, Joel Anderson, now an emeritus professor at Pennsylvania State University, popularized the problem by proving that it is equivalent to an easily stated question about when matrices can be broken down into simpler chunks. Matrices are the core objects in linear algebra, which is used to study mathematical phenomena whose behavior can be captured by lines, planes and higher-dimensional spaces. So suddenly, the Kadison-Singer problem was everywhere. Over the decades that followed, it emerged as the key problem in one field after another.
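
In modern treatments, Anderson's matrix formulation is usually stated as a "paving" conjecture, roughly along the following lines (a paraphrase, not Anderson's exact wording; here P_A denotes the projection onto the coordinates in A, and r depends only on ε):

\[
\forall\,\varepsilon > 0\ \exists\, r:\ \text{every zero-diagonal } T \in \mathbb{C}^{n \times n} \text{ has a partition } \{1,\dots,n\} = A_1 \cup \cdots \cup A_r
\]
\[
\text{with}\quad \|P_{A_j}\, T\, P_{A_j}\| \;\le\; \varepsilon\, \|T\| \quad \text{for } j = 1, \dots, r.
\]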

Because there tended to be scant interaction between these disparate fields, no one realized just how ubiquitous the Kadison-Singer problem had become until Casazza found that it was equivalent to the most important problem in his own area of signal processing. The problem concerned whether the processing of a signal can be broken down into smaller, simpler parts. Casazza dived into the Kadison-Singer problem, and in 2005, he, Tremain and two co-authors wrote a paper demonstrating that it was equivalent to the biggest unsolved problems in a dozen areas of math and engineering. A solution to any one of these problems, the authors showed, would solve them all.

One of the many equivalent formulations they wrote about had been devised just a few years earlier by Nik Weaver, of Washington University in St. Louis. Weaver’s version distilled the problem down to a natural-sounding question about when it is possible to divide a collection of vectors into two groups that each point in roughly the same set of directions as the original collection. “It’s a beautiful problem that brought out the core combinatorial problem” at the heart of the Kadison-Singer question, Weaver said.
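
In symbols, Weaver's formulation (his conjecture KS_2, paraphrased here; the normalization follows his paper) asks for universal constants η ≥ 2 and θ > 0 such that:

\[
\text{if } w_1,\dots,w_m \in \mathbb{C}^d \text{ satisfy } \|w_i\| \le 1 \text{ and } \sum_{i=1}^{m} |\langle w_i, u \rangle|^2 = \eta \text{ for every unit vector } u,
\]
\[
\text{then } \{1,\dots,m\} \text{ splits into } S_1, S_2 \text{ with } \sum_{i \in S_j} |\langle w_i, u \rangle|^2 \;\le\; \eta - \theta \text{ for every unit vector } u \text{ and } j = 1, 2.
\]

The conclusion says that neither half concentrates too much of the collection's weight in any single direction.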

So Weaver was surprised when―apart from the mention in Casazza’s survey and one other paper that expressed skepticism about his approach―his formulation seemed to meet with radio silence. He thought no one had noticed his paper, but in fact it had attracted the attention of just the right people to solve it.
Electrical Properties

When Spielman learned about Weaver’s conjecture in 2008, he knew it was his kind of problem. There’s a natural way to switch between networks and collections of vectors, and Spielman had spent the preceding several years building up a powerful new approach to networks by viewing them as physical objects. If a network is thought of as an electrical circuit, for example, then the amount of current that runs through a given edge (instead of finding alternate routes) provides a natural way to measure that edge’s importance in the network.
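
That electrical picture can be made concrete with a small calculation. The sketch below is a toy illustration in Python, not anything from the researchers' own work: it treats every edge of a six-node graph as a one-ohm resistor and uses the pseudoinverse of the graph Laplacian to compute each edge's effective resistance, a standard measure of how much of the current between an edge's endpoints is forced through that edge rather than through alternate routes.

    import numpy as np

    # Toy graph (hypothetical example): two triangles joined by a single bridge edge.
    edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
    n = 6

    # Graph Laplacian L = D - A, treating every edge as a one-ohm resistor.
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1
        L[v, v] += 1
        L[u, v] -= 1
        L[v, u] -= 1

    # The effective resistance of edge (u, v) is (e_u - e_v)^T L^+ (e_u - e_v).
    # The closer it is to 1, the more the current between u and v is forced
    # through that one edge instead of through alternate routes.
    L_pinv = np.linalg.pinv(L)
    for u, v in edges:
        b = np.zeros(n)
        b[u], b[v] = 1.0, -1.0
        r_eff = b @ L_pinv @ b
        print(f"edge ({u},{v}): effective resistance = {r_eff:.3f}")

On this toy graph, each triangle edge lands near 0.67, while the lone bridge between the triangles comes out at exactly 1.0: every bit of current between its endpoints is forced through it, the signature of an edge the network cannot afford to lose.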

Spielman discovered Weaver’s conjecture after Kalai introduced him to another form of the Kadison-Singer problem, and he realized that it was nearly identical to a simple question about networks: When is it possible to divide up the edges of a network into two categories―say, red edges and blue edges―so that the resulting red and blue networks have similar electrical properties to the whole network?

It’s not always possible to do this. For instance, if the original network consists of two highly connected clusters that are linked to each other by a single edge, then that edge has an outsize importance in the network. So if that critical edge is colored red, then the blue network can’t have similar electrical properties to the whole network. In fact, the blue network won’t even be connected.

Weaver’s problem asks whether this is the only type of obstacle to breaking down networks into similar but smaller ones. In other words, if there are enough ways to get around in a network―if no individual edge is too important―can the network be broken down into two subnetworks with similar electrical properties?

Spielman, Marcus and Srivastava suspected that the answer was yes, and their intuition did not just stem from their previous work on network sparsification. They also ran millions of simulations without finding any counterexamples. “A lot of our stuff was led by experimentation,” Marcus said. “Twenty years ago, the three of us sitting in the same room would not have solved this problem.”
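
The flavor of those experiments is easy to reproduce. The sketch below is a toy version written for this article, not the trio's code: it repeatedly colors the edges of a small, well-connected graph red or blue at random and records how far the red half's Laplacian strays from half of the original, using the spectral norm as a stand-in for "similar electrical properties."

    import numpy as np

    rng = np.random.default_rng(0)

    def laplacian(edges, n):
        # Graph Laplacian with unit edge weights.
        L = np.zeros((n, n))
        for u, v in edges:
            L[u, u] += 1
            L[v, v] += 1
            L[u, v] -= 1
            L[v, u] -= 1
        return L

    # A well-connected toy graph: the complete graph on 8 nodes.
    n = 8
    edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
    L_full = laplacian(edges, n)

    worst = 0.0
    for _ in range(10_000):
        # Color each edge red or blue independently at random.
        mask = rng.random(len(edges)) < 0.5
        red_edges = [e for e, is_red in zip(edges, mask) if is_red]
        L_red = laplacian(red_edges, n)
        # How far does the red half stray from half of the whole network?
        # (Spectral norm of the difference, relative to the whole network.)
        deviation = np.linalg.norm(L_red - 0.5 * L_full, 2) / np.linalg.norm(L_full, 2)
        worst = max(worst, deviation)

    print(f"worst relative deviation over 10,000 random splits: {worst:.3f}")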

The simulations convinced them that they were on the right track, even as the problem raised one stumbling block after another. And they kept making spurts of progress, enough to keep them hooked. When Marcus’ postdoctoral fellowship expired at the end of the team’s fourth year working on the problem, he elected to leave academia temporarily and join a local startup called Crisply rather than leave New Haven. “I worked for my company four days a week, and then once a week or so I would go to Yale,” he said.

A network’s electrical properties are governed by a special equation called the network’s “characteristic polynomial.” As the trio performed computer experiments on these polynomials, they found that the equations seemed to have hidden structure: Their solutions were always real numbers (as opposed to complex numbers), and, surprisingly, adding these polynomials together always seemed to result in a new polynomial with that same property. “These polynomials were doing more than we gave them credit for,” Marcus said. “We used them as a way of transferring knowledge, but really the polynomials seemed to be containing knowledge themselves.”
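
A small numerical experiment conveys the surprise. In the sketch below, again a toy example rather than the researchers' code, the polynomials are characteristic polynomials of sums of vector outer products, a simplified stand-in for the network polynomials the trio studied; averaging them over every possible red/blue assignment keeps producing a polynomial whose roots are all real, the kind of hidden structure the proof ultimately exploited.

    import itertools
    import numpy as np

    rng = np.random.default_rng(1)

    # A small collection of vectors in R^3 (hypothetical data).
    d, m = 3, 6
    vectors = rng.normal(size=(m, d))

    # For every possible red/blue assignment, form the sum of outer products
    # v v^T over the red vectors, take its characteristic polynomial, and
    # average all of those polynomials coefficient-wise.
    avg_coeffs = np.zeros(d + 1)
    n_assignments = 0
    for bits in itertools.product([0, 1], repeat=m):
        M = np.zeros((d, d))
        for v, is_red in zip(vectors, bits):
            if is_red:
                M += np.outer(v, v)
        avg_coeffs += np.poly(M)   # coefficients of det(xI - M)
        n_assignments += 1
    avg_coeffs /= n_assignments

    # Averaging arbitrary polynomials usually produces complex roots; here the
    # average keeps coming out (numerically) real-rooted.
    roots = np.roots(avg_coeffs)
    print("max |imaginary part| among the roots:", np.abs(roots.imag).max())
    print("roots:", np.round(roots.real, 4))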

Piece by piece, the researchers developed a new technique for working with so-called “interlacing polynomials” to capture this underlying structure, and finally, on June 17, 2013, Marcus sent an email to Weaver, who had been his undergraduate advisor at Washington University 10 years earlier. “I hope you remember me,” Marcus wrote. “The reason I am writing is because we … think we have solved your conjecture (the one that you showed was equivalent to Kadison-Singer).” Within days, news of the team’s achievement had spread across the blogosphere.

The proof, which has since been thoroughly vetted, is highly original, Naor said. “What I love about it is just this feeling of freshness,” he said. “That’s why we want to solve open problems―for the rare events when somebody comes up with a solution that’s so different from what was before that it just completely changes our perspective.”

Computer scientists have already applied this new point of view to the “asymmetric” traveling salesman problem. In the traveling salesman problem, a salesman must travel through a series of cities, with the goal of minimizing the total distance traveled; the asymmetric version includes situations in which the distance from A to B differs from the distance from B to A (for instance, if the route includes one-way streets).

The best-known algorithm for finding approximate solutions to the asymmetric problem dates back to 1970, but no one knew how good its approximations were. Now, using ideas from the proof of the Kadison-Singer problem, Nima Anari, of the University of California, Berkeley, and Shayan Oveis Gharan, of the University of Washington in Seattle, have shown that this algorithm performs exponentially better than people had realized. The new result is “major, major progress,” Naor said.

The proof of the Kadison-Singer problem implies that all the constructions in its dozen incarnations can, in principle, be carried out―quantum knowledge can be extended to full quantum systems, networks can be decomposed into electrically similar ones, matrices can be broken into simpler chunks. The proof won’t change what quantum physicists do, but it could have applications in signal processing, since it implies that collections of vectors used to digitize signals can be broken down into smaller frames that can be processed faster. The theorem “has potential to affect some important engineering problems,” Casazza said.

But there’s a big gulf between principle and practice. The proof establishes that these various constructions exist, but it doesn’t say how to carry them out. At present, Casazza says, “there isn’t a chance in hell” of pulling a useful algorithm out of the proof. However, now that mathematicians know that the question has a positive answer, he hopes that a constructive proof will be forthcoming―not to mention a proof that mathematicians in his field can actually understand. “All of us were completely convinced it had a negative answer, so none of us was actually trying to prove it,” he said.

Mathematicians in the fields in which the Kadison-Singer problem has been prominent may feel wistful that three outsiders came in and solved “their” central problem, but that’s not what really happened, Marcus said. “The only reason we could even try to solve such a problem is because people in that field had already removed all the hardness that was happening” in C*-algebras, he said. “There was just one piece left, and that piece wasn’t a problem they had the techniques to solve. I think the reason why this problem lasted 50 years is because it really had two parts that were hard.”

Throughout the five years he spent working on the Kadison-Singer problem, Marcus said, “I don’t think I could have told you what the problem was in the C*-algebra language, because I had no clue.” The fact that he, Srivastava and Spielman were able to solve it “says something about what I hope will be the future of mathematics,” he said. When mathematicians import ideas across fields, “that’s when I think these really interesting jumps in knowledge happen.”

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.


