Fast Matrix Multiplication
Consider the following two-by-two matrix multiplication:

    ( a11  a12 ) ( b11  b12 )   ( a11 b11 + a12 b21    a11 b12 + a12 b22 )
    ( a21  a22 ) ( b21  b22 ) = ( a21 b11 + a22 b21    a21 b12 + a22 b22 )
Note that we need eight multiplications and four additions to perform the multiplication. This means that multiplying two n-by-n matrices in this way requires n^3 multiplications and n^2(n-1) additions.
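For concreteness, the classical method can be sketched as follows in Python (the function name is mine); the innermost line runs n^3 times, once per multiplication:

```python
def naive_matmul(A, B):
    """Classical matrix multiplication of two n-by-n matrices (lists of lists)."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]  # one multiplication per (i, j, k)
    return C
```

For example, `naive_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])` returns `[[19, 22], [43, 50]]`.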
In 1969, Strassen [Str69] surprised everyone by discovering a way to multiply the above matrices using only seven multiplications. He first computed:

    m1 = (a11 + a22)(b11 + b22)
    m2 = (a21 + a22) b11
    m3 = a11 (b12 - b22)
    m4 = a22 (b21 - b11)
    m5 = (a11 + a12) b22
    m6 = (a21 - a11)(b11 + b12)
    m7 = (a12 - a22)(b21 + b22)
Then he noted that the four entries of the product are:

    c11 = m1 + m4 - m5 + m7
    c12 = m3 + m5
    c21 = m2 + m4
    c22 = m1 - m2 + m3 + m6
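As a sanity check, Strassen's seven products and their recombination can be verified on a numeric example (a minimal Python sketch; the concrete values are mine):

```python
# One concrete 2x2 example: A = [[1, 2], [3, 4]], B = [[5, 6], [7, 8]].
a11, a12, a21, a22 = 1, 2, 3, 4
b11, b12, b21, b22 = 5, 6, 7, 8

# Strassen's seven products.
m1 = (a11 + a22) * (b11 + b22)
m2 = (a21 + a22) * b11
m3 = a11 * (b12 - b22)
m4 = a22 * (b21 - b11)
m5 = (a11 + a12) * b22
m6 = (a21 - a11) * (b11 + b12)
m7 = (a12 - a22) * (b21 + b22)

# Recombine into the four entries of the product.
c11 = m1 + m4 - m5 + m7
c12 = m3 + m5
c21 = m2 + m4
c22 = m1 - m2 + m3 + m6

# Agrees with the classical formulas.
assert (c11, c12, c21, c22) == (1*5 + 2*7, 1*6 + 2*8, 3*5 + 4*7, 3*6 + 4*8)
```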
Of course, the number of additions and subtractions went up to eighteen, but this is a good trade, since it is the multiplications that dominate the cost when the scheme is applied recursively.
To extend this from two-by-two to n-by-n, we merely divide each matrix into four equal quadrants and multiply the parts as above in a recursive manner. This requires n to be a power of two, but that is no real restriction (the matrices can be padded with zeros). Solving the resulting recurrence T(n) = 7T(n/2) + O(n^2) (setting it up is a nice exercise) gives a complexity of O(n^(log2 7)) ≈ O(n^2.81).
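The recursive scheme can be sketched in Python as follows (a minimal illustration, not an optimized implementation; it assumes n is a power of two and the names are mine):

```python
def strassen(A, B):
    """Strassen multiplication of two n-by-n matrices; n must be a power of two."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2

    def quad(M):  # split M into its four h-by-h quadrants
        return ([row[:h] for row in M[:h]], [row[h:] for row in M[:h]],
                [row[:h] for row in M[h:]], [row[h:] for row in M[h:]])

    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    a11, a12, a21, a22 = quad(A)
    b11, b12, b21, b22 = quad(B)

    # Strassen's seven recursive products.
    m1 = strassen(add(a11, a22), add(b11, b22))
    m2 = strassen(add(a21, a22), b11)
    m3 = strassen(a11, sub(b12, b22))
    m4 = strassen(a22, sub(b21, b11))
    m5 = strassen(add(a11, a12), b22)
    m6 = strassen(sub(a21, a11), add(b11, b12))
    m7 = strassen(sub(a12, a22), add(b21, b22))

    # Recombine the quadrants of the product.
    c11 = add(sub(add(m1, m4), m5), m7)
    c12 = add(m3, m5)
    c21 = add(m2, m4)
    c22 = add(sub(add(m1, m3), m2), m6)
    return ([c11[i] + c12[i] for i in range(h)] +
            [c21[i] + c22[i] for i in range(h)])
```

For example, `strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]])` returns `[[19, 22], [43, 50]]`, the same result as the classical method. In practice Strassen's algorithm only pays off for fairly large n, since the extra additions carry real constant-factor cost.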
For the curious, Pan [Pan80] discovered an even faster way to do this.
[Pan80] Pan, V. Y. "New fast algorithms for matrix operations." SIAM Journal on Computing 9, 1980, pp. 321-342.
[Str69] Strassen, V. "Gaussian elimination is not optimal." Numerische Mathematik 13, 1969, pp. 354-356.