Talk:Basic Linear Algebra Subprograms

[Untitled]
Should the reference to uBLAS as a BLAS be given a caveat, seeing as it isn't a BLAS implementation but a C++ container class template system?

How about a note for IBM's BLAS tuned for the Cell Processor? Is this important? http://www-03.ibm.com/technology/cell/swlib.html

--- I created a section with a new heading: Other libraries with Blas functionality. I hope I sorted all the libraries correctly into the two categories.... 16 Feb. 2010. KJ —Preceding unsigned comment added by Kristjan.Jonasson (talk • contribs) 01:05, 16 February 2010 (UTC)

scalar dot products?
I thought it is either called a scalar product or a dot product; scalar dot product sounds ridiculous. —Preceding unsigned comment added by 79.235.154.60 (talk) 07:41, 15 July 2010 (UTC)

MATTLAS BLAS
I've twice reverted the introduction of a link to MATTLAS BLAS. (I was wrong about the age of the project.) The text is:


 * MATTLAS BLAS: A modern task based LGPL BLAS implementation written in C++, currently supporting the AVX instruction set for x86_64.

Following the link to github.com, the project README states:


 * MATTLAS (Matt's Linear Algebra Subroutines) is a high performance BLAS implementation. I am using this BLAS (Basic Linear Algebra Subroutines) implementation primarily for research purposes, however, I intend to produce something of high quality that should be competitive with enterprise vendor solutions.  MATTLAS is licensed under the GNU Lesser General Public License, version 3.  This is a very early alpha, so please submit bugs when you find them.  Also, the library currently only supports AVX and x64 on linux.

By its own statements, this implementation is a personal project in "very early alpha". I don't see anything that suggests stature or reliable sources. Consequently, I believe the mention is WP:UNDUE.

The other BLAS implementations are well-known efforts (e.g., MKL and Goto) or at least university efforts. For example, there was a recent addition of BLIS; that project has technical reports (the second TR has many coauthors) and claims funding by Microsoft and NSF. Although the technical reports are not secondary sources, both TRs claim they have been submitted to journals or conferences.

The MATTLAS BLAS doesn't yet belong in an encyclopedia. I'd revert it again, but I'm at my basic revlimit. Glrx (talk) 23:54, 4 October 2013 (UTC)


 * Actually, there's now a very reliable source that MATTLAS BLAS is an enterprise-quality implementation. Sarcasm mode off: I've reverted it. This is clearly a one man project with no mention beyond the guy's various webpages. It has no forks on GitHub, no issues or discussion there or on Launchpad, and it isn't even mentioned in any of his publications. Q VVERTYVS (hm?) 22:24, 21 October 2013 (UTC)

Proposed merge with AXPY
Articles about single BLAS functions will likely never rise above the level of WP:HOWTO; this one certainly hasn't. Q VVERTYVS (hm?) 13:39, 20 October 2013 (UTC)


 * Support. Covering many subprograms such as AXPY also runs into the problem of creating a reference manual. It is reasonable to have some examples show how different types are handled. Glrx (talk) 22:07, 21 October 2013 (UTC)

Proposed merge with General Matrix Multiply
Same reasoning as with the former article AXPY, except that this one actually has some interesting content that does not violate WP:HOWTO. This content should probably be split between the BLAS article and matrix multiplication (unless there's a specific page on matrix product algorithms?). Q VVERTYVS (hm?) 21:37, 22 October 2013 (UTC)


 * Support merge. I don't think the material belongs in matrix multiplication (except for Golub reference) because the viewpoint is BLAS-specific rather than MM in general. The BLAS article should have a link to Matrix multiplication. Glrx (talk) 22:59, 25 October 2013 (UTC)

Not an API?
Glrx removed my remark that BLAS is a de facto standard API for linear algebra. However, I do have the idea that it is: there are many different implementations of BLAS, some based on the reference implementation, some not, all sharing (almost) the same calling sequence and output semantics. Doesn't that constitute an API? (Or maybe two APIs, Fortran BLAS and CBLAS?) Q VVERTYVS (hm?) 12:55, 11 December 2013 (UTC)


 * My edit corrected the use of "function", left the "de facto standard" claim, but removed the term API. BLAS is an interface, and it is a programming interface, but it is not (possibly save the later sparse matrix construction/access routines) intended as an applications programming interface. The 1997 BLAS Quick Reference Guide does not mention API.  The 2000 BLAST Forum Standard refers to BLAS as "a specification of a set of kernel routines for linear algebra"; it does not use API even though API was a common term in 2000.
 * The focus of BLAS is not the applications programmer but rather those programmers who implement numerical libraries for others. LINPACK and LAPACK are APIs; the intention is that applications programmers will directly call the LINPACK and LAPACK interfaces.  The text I replaced acknowledges that viewpoint: it claimed that BLAS was "a standard API for linear algebra routines." Routines are not applications, so the statement essentially says BLAS is an API for APIs, a statement that lacks precision.
 * Google does turn up some hits for BLAS API, but WP is the first hit, and the IBM Software Development Kit for Multicore Acceleration v3.0 Basic Linear Algebra Subprograms Programmer’s Guide and API Reference does not seem to carry much weight (I won't WP:COPYLINK it here). LBL's http://acs.lbl.gov/software/colt/api/cern/colt/matrix/linalg/Blas.html is a hit because BLAS was dumped into an API directory tree for linear algebra; the actual page calls BLAS a set of "High quality 'building block' routines for performing basic vector and matrix operations". Glrx (talk) 17:54, 11 December 2013 (UTC)


 * This raises the question of what an "application" is. That term can be read in two ways: either it is a user-facing program, or it's taken relative to BLAS, e.g. Julia/Matlab/NumPy applies BLAS and is therefore an application of it. But if few sources call BLAS an API, and the standard calls it a specification, then I'll use the latter term instead. Q VVERTYVS (hm?) 12:13, 16 January 2014 (UTC)


 * Application software, http://www.webopedia.com/TERM/A/application.html, ... Glrx (talk) 01:02, 19 January 2014 (UTC)


 * If you look at Wikipedia's definition of Application programming interface, then you'll find that the narrow notion of application software hardly features in it. That makes sense, because some APIs (notably OS APIs such as POSIX) are meant for systems software as well as end-user applications. Q VVERTYVS (hm?) 11:32, 19 January 2014 (UTC)


 * I consider that article poor. Glrx (talk) 22:05, 22 January 2014 (UTC)


 * Your definition of API seems different from the one I'm familiar with. The ATLAS FAQ also speaks of a "BLAS API". Q VVERTYVS (hm?) 12:34, 25 March 2014 (UTC)

Reason for beta parameter
From the section Level 3:


 * by decomposing one or both of $$A, B$$ into block matrices, GEMM can be implemented recursively. This is one of the motivations for including the $$\beta$$ parameter, so the results of previous blocks can be accumulated.

I find this (unsourced) remark dubious. Had there been no $β$, the recursive algorithm could still be implemented by putting a driver routine around it. Simplified C code:

Q VVERTYVS (hm?) 10:48, 29 January 2015 (UTC)


 * You think it would be smarter to implement gemm with the $β$ parameter, but then only expose an interface forcing $β$ = 1? I'm not sure why anyone would make that design decision in a low level library like BLAS. This seems like more evidence for the unsourced remark but a source would be nice. 50.191.22.227 (talk) 18:08, 28 April 2015 (UTC)


 * That's not what I'm saying at all. I said that if there had been no such parameter, the divide-and-conquer implementation would have been just as feasible so the reasoning is flawed. Q VVERTYVS (hm?) 19:00, 28 April 2015 (UTC)


 * I think the section is trying to say something different.
 * For example, to multiply one large matrix A by another B, divide and conquer. An intermediate problem might be multiplying the first 16 rows of A by the first 16 cols of B to get a 16 by 16 subresult in C. That multiplication might be done by dividing Asub and Bsub into 16 by 16 chunks. I set β to 0 for the first sub-sub-product and then set β to 1 for the subsequent sub-sub-product accumulations. If I don't have β, then I need one routine for the initial case and a different routine for the accumulation case.
 * Glrx (talk) 17:04, 20 May 2015 (UTC)


 * But then it's not GEMM that is implemented recursively, but GEMM being used as a base case for recursive algorithms; this is what Dongarra seems to be suggesting too. So the text is off. Q VVERTYVS (hm?) 08:52, 21 May 2015 (UTC)


 * No, I gave an example of how GEMM could call itself on simpler problems (and why β is needed). I don't know if GEMM is done that way (and given most Fortrans, it probably isn't; D comments about Fortran excluding recursion). The ref you gave shows that GEMM does have matrix update applications in its own right. Ref 19 could be an appropriate source for recursive calls. Instead of explicit recursion, I would expect GEMM to break the problem into reasonably-sized subblocks and then iterate them with a base-case GEMM. Glrx (talk) 19:21, 21 May 2015 (UTC)

External links modified
Hello fellow Wikipedians,

I have just modified 7 external links on Basic Linear Algebra Subprograms. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
 * Added archive https://web.archive.org/web/20161012014431/http://www.lahey.com/docs/blaseman_lin62.pdf to http://www.lahey.com/docs/blaseman_lin62.pdf
 * Added tag to http://www.netlib.org/).
 * Added archive https://web.archive.org/web/20131029204826/http://78.158.56.101/archive/msor/headocs/34mathematica5.pdf to http://78.158.56.101/archive/msor/headocs/34mathematica5.pdf
 * Added archive https://web.archive.org/web/20120517132718/http://www.tacc.utexas.edu:80/tacc-projects/gotoblas2 to http://www.tacc.utexas.edu/tacc-projects/gotoblas2/
 * Added archive https://web.archive.org/web/20070222154031/http://www.nec.co.jp:80/hpc/mediator/sxm_e/software/61.html to http://www.nec.co.jp/hpc/mediator/sxm_e/software/61.html
 * Added archive https://web.archive.org/web/20070513173030/http://www.sgi.com:80/products/software/scsl.html to http://www.sgi.com/products/software/scsl.html
 * Added archive https://web.archive.org/web/20100803003649/http://z.cs.utexas.edu:80/wiki/flame.wiki/FrontPage to http://z.cs.utexas.edu/wiki/flame.wiki/FrontPage
 * Added archive https://web.archive.org/web/20061009230911/http://history.siam.org/oralhistories/lawson.htm to http://history.siam.org/oralhistories/lawson.htm

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at ).

Cheers.— InternetArchiveBot  (Report bug) 01:08, 28 October 2016 (UTC)

External links modified
Hello fellow Wikipedians,

I have just modified 5 external links on Basic Linear Algebra Subprograms. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
 * Added tag to http://publib.boulder.ibm.com/infocenter/clresctr/index.jsp?topic=%2Fcom.ibm.cluster.essl.doc%2Fesslbooks.html
 * Corrected formatting/usage for http://www.tacc.utexas.edu/tacc-projects/gotoblas2/
 * Corrected formatting/usage for http://www.nec.co.jp/hpc/mediator/sxm_e/software/61.html
 * Corrected formatting/usage for http://www.sgi.com/products/software/scsl.html
 * Corrected formatting/usage for http://z.cs.utexas.edu/wiki/flame.wiki/FrontPage
 * Added archive https://web.archive.org/web/20061009230904/http://history.siam.org/oralhistories/dongarra.htm to http://history.siam.org/oralhistories/dongarra.htm

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

Cheers.— InternetArchiveBot  (Report bug) 13:49, 15 July 2017 (UTC)

External links modified
Hello fellow Wikipedians,

I have just modified 2 external links on Basic Linear Algebra Subprograms. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
 * Added archive https://web.archive.org/web/20051130022536/http://developer.amd.com/acml.aspx to http://developer.amd.com/acml.aspx
 * Added archive https://web.archive.org/web/20161116145528/http://developer.amd.com/tools-and-sdks/opencl-zone/acl-amd-compute-libraries/ to http://developer.amd.com/tools-and-sdks/opencl-zone/acl-amd-compute-libraries/

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

Cheers.— InternetArchiveBot  (Report bug) 09:13, 9 September 2017 (UTC)

External links modified
Hello fellow Wikipedians,

I have just modified one external link on Basic Linear Algebra Subprograms. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
 * Added archive https://web.archive.org/web/20150905190558/http://developer.amd.com/tools-and-sdks/archive/amd-core-math-library-acml/ to http://developer.amd.com/tools-and-sdks/archive/amd-core-math-library-acml/

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

Cheers.— InternetArchiveBot  (Report bug) 11:28, 14 September 2017 (UTC)

Which applications?
This article would be more useful if it described the most notable applications for BLAS. I know that matrix math is used in a wide range of applications from neural networks to the design of nuclear bombs, but are there particular applications or classes of applications that tend to use BLAS specifically rather than other library interface specifications? 67.188.1.213 (talk) 20:42, 3 September 2021 (UTC)