… Frameworks are appearing.
Regularly, I search for the keywords Fast Matrix Multiplication or Fast Matrix Multiply. I use Google and Bing/Yahoo. The two search engines use different algorithms, and the results often differ. I search for two separate reasons: first, narcissism, since I want to find my site on the first page; second, to see what is new in the field (similar research tends to use the same terminology).
In Bing and Yahoo, I regularly appear on the first page, close to Wikipedia. This provides great visibility, and their algorithm validates my effort to provide technical, high-quality content. I cannot write a new post every week, though, so this search engine does not really treat my site as a classic blog.
When I publish regularly, my content bubbles up in Google. As I write today, I appear on either the first or the second page, which is very good and not common. In general, papers by Stanford or Berkeley alumni are ranked higher. If you repeat the search, the results may well be tailored to your history, and this site could be a long shot.
Nonetheless, I like this scenario: one search engine is consistent and values my effort, while the other often assigns "temporal" importance. I will explain the negative effect first, and the positive one in the next paragraph. The negative effect is simple to explain: not being at the top of Google searches would hurt if I had monetary investments in this work or my professional career depended on it. For example, citations from peers matter in so many ways in research, especially in academia: for advancement, funding, and personal satisfaction (a pat-on-the-back kind of thing). Not being at the top is equivalent to not being published in top-tier conferences or journals: it diminishes your contribution in the eyes of new researchers and your research peers. If they cannot find your work, the majority will never know what your contribution is.
The temporal ranking effect helps me stay up to date with current research easily: new work bubbles up rather quickly, and I can easily find myself in the midst of smarter people's thoughts. I have found that the field attracts interesting work and interest in general, which is exciting.
Smart people are reaching the conclusion that it is good, if not necessary, to have a development framework for generating Fast Matrix Multiplication algorithms.
The first is by Alexey Smirnov: he provides a systematic way to create recursive algorithms using matrix notation, à la Bini. His work is available at http://link.springer.com/
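To illustrate the "algorithms as matrices" idea with a small sketch: a bilinear fast-MM algorithm is fully specified by three coefficient matrices, and one generic routine can then execute any of them. Below, Strassen's classic seven-multiplication algorithm for 2x2 matrices is written this way (the names U, V, W and the flattening convention are my own illustration, not Smirnov's or Bini's actual notation):

```python
def bilinear_mm(U, V, W, a, b):
    """Multiply two 2x2 matrices, each given flat as [x11, x12, x21, x22],
    using the bilinear algorithm encoded by the coefficient matrices (U, V, W)."""
    # Form the t products m_k = (U[k] . a) * (V[k] . b); t = 7 for Strassen.
    m = [sum(u * x for u, x in zip(U[k], a)) *
         sum(v * x for v, x in zip(V[k], b)) for k in range(len(U))]
    # Each output entry is a linear combination of the products: c_l = W[l] . m.
    return [sum(w * p for w, p in zip(row, m)) for row in W]

# Strassen's algorithm written purely as its coefficient matrices.
U = [[1,0,0,1],[0,0,1,1],[1,0,0,0],[0,0,0,1],[1,1,0,0],[-1,0,1,0],[0,1,0,-1]]
V = [[1,0,0,1],[1,0,0,0],[0,1,0,-1],[-1,0,1,0],[0,0,0,1],[1,1,0,0],[0,0,1,1]]
W = [[1,0,0,1,-1,0,1],[0,0,1,0,1,0,0],[0,1,0,1,0,0,0],[1,-1,1,0,0,1,0]]

a = [1, 2, 3, 4]   # [[1, 2], [3, 4]]
b = [5, 6, 7, 8]   # [[5, 6], [7, 8]]
print(bilinear_mm(U, V, W, a, b))  # -> [19, 22, 43, 50]
```

Swapping in a different (U, V, W) triple yields a different algorithm with no change to the driver code, which is exactly what makes this representation attractive for automation.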
One of the obstacles to the use of Fast MM algorithms is their poor formulation for rectangular matrices. One of my humble contributions is the presentation of a single algorithm that can be used for any matrix size (my first TOMS paper). Now, I am happy to see that there is practical interest in tackling this problem.
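The shape problem can be sketched in a few lines: pad any odd dimension with zeros at each recursion level, apply the Strassen block formulas to the quadrants, and crop the result. This is just the textbook padding trick, not the algorithm from the TOMS paper; the function names and the cutoff value are illustrative.

```python
def classical(A, B):
    """Base case: ordinary triple-loop multiply for any m x k times k x n."""
    m, k, n = len(A), len(B), len(B[0])
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(n)]
            for i in range(m)]

def add(X, Y, sign=1):
    """Entrywise X + sign*Y for equally shaped matrices."""
    return [[x + sign * y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def strassen_any(A, B, cutoff=2):
    """Strassen-style recursion that accepts any matrix shapes via zero padding."""
    m, k, n = len(A), len(B), len(B[0])
    if min(m, k, n) <= cutoff:
        return classical(A, B)
    # Pad every dimension up to the next even number.
    M, K, N = m + m % 2, k + k % 2, n + n % 2
    A = [row + [0] * (K - k) for row in A] + [[0] * K] * (M - m)
    B = [row + [0] * (N - n) for row in B] + [[0] * N] * (K - k)
    h, w, q = M // 2, K // 2, N // 2
    A11 = [r[:w] for r in A[:h]]; A12 = [r[w:] for r in A[:h]]
    A21 = [r[:w] for r in A[h:]]; A22 = [r[w:] for r in A[h:]]
    B11 = [r[:q] for r in B[:w]]; B12 = [r[q:] for r in B[:w]]
    B21 = [r[:q] for r in B[w:]]; B22 = [r[q:] for r in B[w:]]
    # Strassen's seven block products, each computed recursively.
    M1 = strassen_any(add(A11, A22), add(B11, B22), cutoff)
    M2 = strassen_any(add(A21, A22), B11, cutoff)
    M3 = strassen_any(A11, add(B12, B22, -1), cutoff)
    M4 = strassen_any(A22, add(B21, B11, -1), cutoff)
    M5 = strassen_any(add(A11, A12), B22, cutoff)
    M6 = strassen_any(add(A21, A11, -1), add(B11, B12), cutoff)
    M7 = strassen_any(add(A12, A22, -1), add(B21, B22), cutoff)
    C11 = add(add(M1, M4), add(M7, M5, -1))   # M1 + M4 - M5 + M7
    C12 = add(M3, M5)                         # M3 + M5
    C21 = add(M2, M4)                         # M2 + M4
    C22 = add(add(M1, M2, -1), add(M3, M6))   # M1 - M2 + M3 + M6
    # Reassemble the quadrants and crop away the padding.
    C = ([C11[i] + C12[i] for i in range(h)] +
         [C21[i] + C22[i] for i in range(h)])
    return [row[:n] for row in C[:m]]
```

Padding with zeros keeps the result exact, but it wastes work near odd sizes; approaches like dynamic peeling, or a formulation designed from the start for arbitrary sizes, avoid that overhead.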
Recently, I was contacted to share my adaptive library, which tackles exactly the rectangular case, and thanks to Google, I found that Benson and Ballard (http://arxiv.org/abs/1409.2908) are proposing a new framework building on Smirnov's idea. My library is used as a reference and seems to have a slight edge where applicable. The good news is that the optimizations they present could, in principle, be applied to mine.
I am happy to see that current research is heading in the direction I would go: a practical framework where algorithms are presented as matrices and automation turns them into code, for every matrix shape and for every architecture.