Talk:Criticism of APL

Bookstaber
Richard Bookstaber is known as the King of Options Pricing.

In his 2007 book A Demon of Our Own Design, pp. 43-49 contain a section entitled "The APL Cult". What follows is meant solely to correct technical inaccuracies or to show how his comments align with the criticism of APL. That said, Bookstaber's comments are very much in line with the observed love-hate relationship management may have with technology.

It would be nice to know exactly what timeframe Bookstaber is writing about, particularly with respect to the departments (which often changed names or acronyms) and individuals who worked there.

p.44, The language allows the programmer...

Summing the elements of a vector that are greater than 100 is +/(X>100)/X, while the code fragment +/X>100 given in the paragraph counts the number of items that satisfy the constraint; it does not produce their sum.
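The difference can be illustrated in Python rather than APL (the data here is invented for the example):

```python
x = [50, 150, 99, 200]

# Analogue of +/(X>100)/X: filter first, then sum the survivors.
total = sum(v for v in x if v > 100)    # 150 + 200 = 350

# Analogue of +/X>100: X>100 is a boolean vector, so summing it
# merely counts the elements that satisfy the constraint.
count = sum(v > 100 for v in x)         # 2
```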

p.44, There was a reason to put things into as condensed a form as possible...

Strictly speaking, Bookstaber's explanation is not correct: brevity does not guarantee performant code. APL code is interpreted, not translated to machine language, so the length of identifiers does not materially affect performance, though a well-chosen algorithm will. The time an expression takes to execute is determined by the algorithm chosen plus the size of the data.
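The point, that the chosen algorithm rather than terseness governs running time, can be sketched in Python (illustrative only, not APL):

```python
n = 10**6

# Two equally brief expressions with wildly different algorithms:
slow = sum(range(n + 1))     # O(n): visits every element
fast = n * (n + 1) // 2      # O(1): closed-form Gauss sum

assert slow == fast == 500000500000
```

Both fit on one line; only the second stays fast as the data grows.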

p.44, ...with a "one-liner" being the prized objective.

It is generally understood among APL professionals that a one-liner is little more than a form of recreation. Outside of a party, a one-liner represents a code fragment which may itself be unmaintainable or incomprehensible. Brevity is not a substitute for quality, efficiency, or maintainability. The same one-liner broken up into several documented pieces is still brief compared to solutions in other languages, and goes a long way toward being readable to the next programmer who comes along. Bookstaber's point here is well taken: the tendency of programmers to write such code (and of professors to assign it) without the requisite refactoring is all too common, and it has tarnished, if not destroyed, the reputation of the language.
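To illustrate the refactoring point in Python rather than APL (the example and its data are invented): a dense one-liner and its broken-up equivalent compute the same result, but the latter is the maintainable form.

```python
data = [3.0, -4.0, 5.0, -1.0, 2.0]

# A dense "one-liner": root-mean-square of the positive elements.
rms_dense = (sum(v * v for v in data if v > 0)
             / sum(v > 0 for v in data)) ** 0.5

# The same logic as named, documented steps: still brief, but readable.
positives = [v for v in data if v > 0]              # keep positive values
mean_square = sum(v * v for v in positives) / len(positives)
rms_clear = mean_square ** 0.5

assert abs(rms_dense - rms_clear) < 1e-12
```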

p.44, In fact, sitting near almost every terminal was a dog-eared copy of one code book or another containing page after page of one-liners to do almost every conceivable operation

Bookstaber is probably talking about an idiom list, the closest thing to an APL standard library.

p.45, ...but if a program requires a loop,...

APL is an interpretive array language which allows an individual to easily and efficiently solve certain classes of problems. To say that "APL is not good at looping" is like saying "Humans are not good at driving". Certain other classes of problems expose a weakness of APL implementations, which perform poorly on them compared to a program written in another language. Dynamic programming is one such problem area, among many others. APL programmers are quite aware of APL's performance limitations.
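The array-versus-loop distinction applies to any interpreted language; here is a Python sketch of the two styles (illustrative only, not APL):

```python
x = list(range(1000))

# Array-style: one whole-collection operation; the looping happens
# inside fast runtime code, the way an APL primitive loops internally.
total_array = sum(x)

# Scalar-style: an explicit loop the interpreter must step through
# element by element, the pattern that performs poorly.
total_loop = 0
for v in x:
    total_loop += v

assert total_array == total_loop == 499500
```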

p.48, Short of committing the heresy of rewriting the whole program...

Depending on the time frame Bookstaber is talking about, there were many tools available to combat just this problem. First and foremost, it was probably not the whole program that needed to be rewritten, just the computationally intensive part.

Prior to 1984, Morgan Stanley probably used IBM VSAPL, a well-implemented interpreter designed for the adverse conditions of virtual memory under the IBM VM/CMS operating system. There it was not possible, short of being a systems programmer, to implement one's own functions. Mixed-language programming was possible but not within the reach of most APL programmers.

Afterwards, Morgan Stanley used Sharp APL, which was designed to perform well with a huge number of concurrent users. Sharp APL had two relevant features for dealing with slow looping code: FCAP and Intrinsic Functions. FCAP stood for "Function Call AP" and was perhaps more suitable for running larger programs from Sharp APL. Intrinsic Functions were a forerunner of []NA (Name Association) and, probably with a bit of help from a systems programmer, would allow a compiled Fortran, Cobol, or PL/1 program to be called from APL. Further, some departments had access to other APL systems, such as IBM APL2 (which had an easy-to-use []NA) and STSC's APL*Plus extensions to VSAPL, a system which actually supported compilation of scalar, iterative APL code, precisely what was needed.

Lastly, A+, in use today, has []NA-like features for calling programs written in compiled scalar languages like C.

All APL implementations today have a facility for calling external programs. If an iterative solution can be programmed in another language, then it can be connected to APL and used like an APL function.
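As a purely illustrative analogue (not APL), the same pattern, an interpreted host delegating work to a compiled routine, looks like this in Python with ctypes; the lookup assumes a standard C math library is installed on the system:

```python
import ctypes
import ctypes.util

# Locate and load the platform's compiled C math library.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature of sqrt: double sqrt(double).
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

# The compiled routine can now be called like a native function.
print(libm.sqrt(2.0))
```

This is the role []NA-style facilities play in an APL system: the slow iterative kernel lives in compiled code, and the interpreter calls it as if it were a primitive.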

p.48, APL was not much of a success at UBS, either.

Morgan Stanley used APL heavily. Although Bookstaber cites two examples where incorrect APL programs failed, he does not enumerate all of the other APL programs which were correct, successful, and made money, lots of it, for the firm.

p.48, ...even with a hundred programmers and a total cost that edged into eight figures, CORE never was fully implemented.

It seems unreasonable to blame APL for this failure, if indeed it was one. Bookstaber does not go on to enumerate the projects, having nothing to do with APL, which failed around the same time.

Cowznofski 14:39, 29 June 2007 (UTC)

Late 1980s
Cowz, Bookstaber is talking generally of the late 1980s there. Earlier in the book, he's discussed the October 1987 crash. He says that his friend Bob Platt was fired from MS a month later. Then on pp. 44-45 he's talking about Joel Kaplan as a big APL booster, who "moved over ... to run fixed income research sometime after Platt's exit."

A little vague, but plainly we're still in the late 1980s.

--Christofurio 16:17, 10 August 2007 (UTC)

Lack of possibility to index by a character string
Many people switched from APL to Perl because they needed to index information by character strings rather than by integers ("hashes"), without having to manage their own symbol-table handling every time.

The eventual possibility of indexing with character strings would not imply any rewrite of existing programs, nor the use of special {} indexing as in Perl: if an index position contains anything other than an integer, that would mean it is a hash. That minor syntax extension is mentioned as a possibility (and its lack as one of the reasons some users deserted APL) in the francophone Wikipedia APL page fr:APL (langage). 212.198.139.139 (talk) 07:52, 10 May 2008 (UTC)

I think many people switched from APL to Perl because that tool was better suited for the job at hand. Being free and available everywhere probably helped as well.

As for indexing by a character string ("associative arrays"), this is not a minor syntax extension, and not a trivial change at all. It has enormous semantic implications. Things to think about:

 * How do you add a new key? a := iota 0; a[ 'cat' 'fat' ] := 10 42 ?
 * How would you delete a key? Not 1 0 / a; order isn't guaranteed any more.
 * Catenation a, b: what if there are duplicate keys?
 * Lamination a,[1.5] b: what are the keys of the two columns of the resulting matrix? What if the keys are different?
 * Matrices: mat[ 'US' 'Canada' ; 'cat' 'rat' ]? Can one dimension be indexed by numbers and the other by keys?

The list goes on. The q language goes a bit in this direction.
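For comparison, here is a sketch (in Python, purely illustrative, not a proposal for APL syntax) of how a language that committed to associative arrays answers a few of these questions:

```python
# Python dictionaries: one possible set of answers.
a = {}                       # empty associative array
a['cat'] = 10                # adding a new key is just assignment
a['fat'] = 42

del a['cat']                 # deleting a key is an explicit operation,
                             # not a boolean compression like 1 0 / a

b = {'fat': 7, 'rat': 1}
merged = {**a, **b}          # "catenation" needs a rule for duplicate
                             # keys: here the right-hand operand wins
assert merged == {'fat': 7, 'rat': 1}
```

Each of these answers is a design decision the language has to commit to, which is exactly why the change is not a minor syntax extension.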

Ibeam2000 (talk) 15:46, 12 February 2011 (UTC)

Entrepreneur code
Does anyone actually believe the "Entrepreneur code" section is relevant? I only discovered the existence of APL as an example of a godawful unreadable programming language. This section states: "APL was and still is a fertile means where an entrepreneur, [...] is able to build an application and sell the service or product to a wider audience."

Unless someone has managed to log onto the wikipedia servers, using some sort of mainframe telnet connection through time, I cannot imagine anyone even attempting to type out a line of APL, other than to prove that they are masochistic enough to try it.

I am curious as to what others think about this. 67.201.221.225 (talk) 03:14, 23 March 2010 (UTC)
 * I agree with some of your points, but not with others. You're certainly right to argue with the "entrepreneur code" section; exactly the same points could be made about other smaller, powerful languages such as Forth, MUMPS and, to a lesser extent, FOCAL and JOSS. Which language one would use depended of course both on the particular problems to be solved (APL shone at quick data reductions) and on one's potential market (the biomedical community were used to dealing with MUMPS and astronomers with Forth). I think that the section could be drastically curtailed or even omitted without much loss of utility.


 * However, I would disagree strongly that APL is intrinsically either "godawful" or "unreadable". You can write bad, cryptic code in any language; excepting party-trick one-liners, which nobody sane would use in a real programming context, APL code can be written in a clear, structured, commented way. I've taught people to write good productive APL code by using clearly-written examples in a couple of hours. Still, each to their own! --Kay Dekker (talk) 23:05, 23 April 2010 (UTC)

Beauty is in the eyes of the beholder. I guess you're just not one of them. —Preceding unsigned comment added by 193.8.106.40 (talk) 15:07, 8 October 2010 (UTC)

I wrote this section with two companies in mind. One still uses APL and is very, very big. The other went out of business (not on account of APL) a few years ago. Cowznofski (talk) 19:45, 20 January 2011 (UTC)

Java 20 times slower than C?
The current quote #5 claims that Java is 20 times slower than C/C++; considering improvements like JIT compilation, this is no longer accurate. —Preceding unsigned comment added by 79.127.12.72 (talk) 17:14, 10 July 2010 (UTC)

Sure, now it is probably only 10 times slower than C. —Preceding unsigned comment added by 193.8.106.40 (talk) 15:06, 8 October 2010 (UTC)

Considering the sluggish (to put it politely) performance of a few of the flagship Java applications, such as Eclipse and Open Office, I would imagine the claim that Java is 20 times slower than C is still more or less true. To state otherwise would be, plain and simple, denial. —Preceding unsigned comment added by 92.106.208.132 (talk) 19:24, 13 January 2011 (UTC)