The natural method of multiplying two n × n matrices takes O(n^{3}) steps, but in 1969, Volker Strassen stunned the computational world by publishing an algorithm that does it in only O(n^{2.808}) steps. The Greek letter omega is used to denote the smallest possible exponent for the complexity of matrix multiplication. Strassen had shown the world that omega, which had long been assumed to be 3, was in fact at most 2.808.
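Strassen's trick is to multiply 2 × 2 block matrices with seven recursive products instead of the naive eight, giving a running time of O(n^{log2 7}) = O(n^{2.808}). Here is a minimal sketch in Python for matrices whose size is a power of two, using plain lists; a practical implementation would fall back to the classical method below some cutoff size.

```python
def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def strassen(A, B):
    """Multiply two n x n matrices, n a power of two, with 7 recursive products."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    # Split each matrix into four h x h quadrants.
    A11 = [row[:h] for row in A[:h]]; A12 = [row[h:] for row in A[:h]]
    A21 = [row[:h] for row in A[h:]]; A22 = [row[h:] for row in A[h:]]
    B11 = [row[:h] for row in B[:h]]; B12 = [row[h:] for row in B[:h]]
    B21 = [row[:h] for row in B[h:]]; B22 = [row[h:] for row in B[h:]]
    # Strassen's seven products (the naive approach needs eight).
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    # Reassemble the quadrants of the result from the seven products.
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```

For example, strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]) returns [[19, 22], [43, 50]], the same result as ordinary matrix multiplication.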

What is the true value of omega?

Strassen's discovery touched off a flurry of research. Over the following two decades, bounds on omega were improved and tuned until 1987, when Don Coppersmith and Shmuel Winograd reduced the exponent to 2.3755. That exponent stood as the best known upper bound on omega for 24 years.

Although the Strassen and C-W algorithms are not practical for most real-world problems, the theoretical work shows that matrix multiplication is fundamentally simpler than we would naively assume. It hints at a gap in mathematical knowledge about the fundamental operation of composing linear maps. Some believe (or hope) that matrix multiplication will ultimately be shown to be O(n^{2}). But for decades, nobody was able to push the exponent below 2.3755. Until now.

**The Race for a Better Omega is On Again**

A few days ago, Virginia Vassilevska Williams published a paper proving that omega is less than 2.3727, improving the bound by 0.0028 and, importantly, showing that C-W is not the end of the story. She credits Andrew Stothers, whose thesis work last year pointed the way to breaking the C-W record. She then describes an algorithm that improves on Stothers's bound further, and she presents a general method for generating additional algorithms with low exponents.

Why did it take so long to make this next improvement? In her words, "with each new tensor power, the number of new values that need to be analyzed grows quadratically. For the eighth tensor power for instance, 30 separate analyses are required! Prior to our work, each of these analyses would require a separate application of the CW techniques. It would have required an enormous amount of patience to analyze larger tensor powers, and since the third tensor power does not give any improvement, the prospects looked bleak."

Her paper (here) is an example of a breed of modern mathematics that has grown comfortable with using computers as a tool. Not only does she drive computer algebra systems to crack formulas more complex than would be practical to handle by hand: she also describes algorithms for creating algorithms, driving computers to sift through large numbers of permutations in search of the best ideas.

Links. The Paper; RJ Lipton's blog entry; New Scientist.

Posted by David at 07:08 AM | Comments (1)

Copyright 2019 © David Bau. All Rights Reserved.