APL's Enduring Argument: Why a 1977 Case for a Lyrical Language Still Resonates

A 1977 essay by Alan J. Perlis makes a compelling case for APL as the ideal first programming language, arguing its terse, array-oriented syntax fosters deeper algorithmic thinking and faster mastery than mainstream languages of the time—a perspective that challenges modern programming education's focus on verbosity and explicit control.

In 1977, computer scientist Alan J. Perlis published a passionate defense of APL in SIAM News, arguing that the array-oriented language was "the most rational first language for a first course in computer science." His essay, "In Praise of APL: A Language for Lyrical Programming," presents a case that remains strikingly relevant today, challenging conventional assumptions about what makes a language suitable for beginners.

Perlis's argument rests on five core objectives for introductory computer science education: understanding algorithms, grasping computer organization, developing fluency in a programming language, appreciating complexity control through system design, and recognizing the societal impact of computing. He contends that APL uniquely serves these goals through its distinctive characteristics.

The Case for Terseness and Fluency

The central thesis revolves around APL's remarkable density of expression. Perlis observes that "complicated acts can be described briefly," noting that APL programs typically require only 1/5 to 1/10 the lines of equivalent programs in FORTRAN, BASIC, ALGOL, PL/I, or Pascal (collectively termed "FBAPP" languages). This brevity isn't merely aesthetic—it has practical pedagogical consequences.

When students can write payroll systems in "a relatively few lines" or display polygon transformations in ~20 lines, they avoid the cognitive overhead of wrestling with verbose syntax. More importantly, they develop what Perlis calls "phrase growth"—the ability to build complex expressions from simple components, much like constructing sentences in natural language. The "one-liners" that APL enables aren't just clever tricks; they represent a different way of thinking about problem decomposition.
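Perlis's examples are written in APL itself, but the flavor of this density can be suggested in NumPy (a loose modern analogue, not anything from the essay): the same computation written FBAPP-style with an explicit loop, and again as a single array-oriented phrase.

```python
import numpy as np

# FBAPP-style: explicit loop, explicit accumulator, explicit control.
def rms_loop(xs):
    total = 0.0
    for x in xs:
        total += x * x
    return (total / len(xs)) ** 0.5

# Array-oriented "one-liner": control is absorbed into the
# semantics of the primitives (square, mean, sqrt compose).
def rms_array(xs):
    return float(np.sqrt(np.mean(np.square(xs))))

signal = np.array([3.0, 4.0])
```

The second version reads almost like its mathematical definition, which is the "phrase growth" Perlis describes: complex expressions built by composing simple components.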

This density also affects error patterns. Perlis notes that APL programs tend to have fewer errors than their FBAPP counterparts because control is distributed into the semantics of primitive functions rather than explicit control statements. When errors do occur, they're often semantic rather than syntactic—misunderstandings of purpose rather than misplaced semicolons. This shifts the learning focus from language mechanics to algorithmic thinking.

Arrays as a Foundational Abstraction

APL's array orientation provides a powerful model for data organization. While Perlis acknowledges that "prolonged contact with APL makes one wish for the added presence of more heterogeneous structures," he argues that arrays are "extraordinarily useful" for beginners. They provide a uniform, composable abstraction that scales from simple scalars to multi-dimensional data.

This uniformity enables students to explore diverse domains with the same tools. Perlis provides concrete examples: a permuted-index generator for title lists (12 lines), polygon rotation and scaling (20 lines), function graphing (~5 lines). The same primitive operations—indexing, reshaping, reduction—apply across these domains, reinforcing transferable concepts rather than domain-specific syntax.
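Perlis's polygon example is in APL; a rough NumPy sketch (my illustration, with invented names) shows how the same few primitives cover rotation and scaling when a polygon is just an array of vertices.

```python
import numpy as np

# A unit square represented as a 4x2 array of (x, y) vertices.
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)

def transform(polygon, angle, scale):
    """Rotate a polygon about the origin and scale it, using only
    array primitives: a 2x2 rotation matrix, a matrix product,
    and a scalar multiplication broadcast over every vertex."""
    c, s = np.cos(angle), np.sin(angle)
    rotation = np.array([[c, -s], [s, c]])
    return scale * polygon @ rotation.T

# Quarter turn, doubled in size.
rotated = transform(square, np.pi / 2, 2.0)
```

The point is not the specific library but the uniformity: the same indexing, reshaping, and reduction operations that process a title list or a payroll table also transform geometry.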

Modeling Computer Architecture

A particularly compelling argument involves computer organization. Perlis demonstrates that APL makes it "straightforward to model a computer" at any level of detail. Students can implement a machine language assembler in about 40 lines, exploring the fetch-execute cycle and the relationship between hardware and software. This is possible because APL's array operations map naturally to the parallel, repetitive nature of computer architecture.
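Perlis's 40-line assembler is in APL and far richer than anything shown here, but a toy Python sketch (opcodes and machine design invented for illustration) conveys the fetch-execute cycle students would explore.

```python
# A toy accumulator machine: each instruction is an (opcode, address)
# pair, memory is a flat list of integers. Hypothetical design, far
# simpler than Perlis's APL assembler.
LOAD, ADD, STORE, HALT = 0, 1, 2, 3

def run(program, memory):
    """The fetch-execute cycle: fetch the instruction at the program
    counter, advance the counter, then dispatch on the opcode."""
    acc, pc = 0, 0
    while True:
        opcode, addr = program[pc]   # fetch
        pc += 1                      # advance program counter
        if opcode == LOAD:           # execute
            acc = memory[addr]
        elif opcode == ADD:
            acc += memory[addr]
        elif opcode == STORE:
            memory[addr] = acc
        elif opcode == HALT:
            return memory

# A three-instruction program: memory[2] := memory[0] + memory[1].
mem = [5, 7, 0]
run([(LOAD, 0), (ADD, 1), (STORE, 2), (HALT, 0)], mem)
```

Even a simulator this small makes the hardware/software boundary concrete: the "machine" is just data plus a loop, which is exactly the insight Perlis wants beginners to reach.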

This capability addresses a fundamental challenge in computer science education: bridging the gap between high-level software concepts and low-level hardware realities. When students can write their own assemblers or simulate CPU components in a few dozen lines, they develop an intimate understanding of how abstraction layers work. As Perlis writes, this creates "man-machine symbiosis"—the student becomes capable of building tools tailored to their needs rather than depending on external, pre-packaged systems.

Structured Programming and Verification

Perlis engages directly with the structured programming debate of the 1970s, offering a nuanced perspective. He defines good structure not as the elimination of goto statements, but as "lexical continuity: small changes in program capability are acquired by making changes within lexically close text." By this measure, APL programs naturally support better structure because their density means that "the consequences to APL programs of weak structuring are less disastrous."

The essay also addresses program verification. Perlis argues that APL serves well as a specification language because assertions and verification conditions can be "much more easily expressed as APL predicates." The distributed control into primitive functions means verifications can be shorter and more analytic—students focus on proving properties of algorithms rather than tracking control flow through explicit loops and conditionals.
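The idea of assertions as array predicates can be suggested in NumPy (the sortedness check is my example, not one from the essay): a whole-array claim stated in one expression rather than a loop over indices.

```python
import numpy as np

def is_sorted(xs):
    """A verification condition in the APL-predicate spirit:
    'every adjacent difference is non-negative', asserted over
    the whole array at once."""
    return bool(np.all(np.diff(xs) >= 0))

a = np.array([1, 2, 2, 5])
assert is_sorted(a)            # e.g. a postcondition of a sort routine
assert not is_sorted(a[::-1])
```

Because the predicate is a single expression over the data, there is no control flow to trace while reasoning about it, which is the analytic shortness Perlis claims for APL verifications.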

The Economic and Practical Case

Perlis confronts cost objections head-on. In 1977, an APL system cost about $10,000 per terminal, roughly double a BASIC system. His counterargument is pragmatic: "one can do more than twice as much with APL in a given period of time than with BASIC!" The cost-per-learning-outcome favors APL, especially when considering that students can accomplish more complex projects within a 16-week semester.

He also addresses the faculty knowledge gap, noting that "most university computer scientists don't really know APL." Yet he observes that faculty outside software—particularly in theory—pick up APL quickly and prefer it to FBAPP languages. This suggests the difficulty isn't inherent to the language but to unlearning assumptions about programming.

A Bridge to LISP

Perhaps most presciently, Perlis identifies a deep relationship between APL and LISP, suggesting that "acquiring simultaneous expertise in both languages is possible and desirable for the beginning student." He envisions a unified approach where these two languages—one array-oriented, one list-oriented—cover complementary domains of computation. This foreshadows modern functional programming's emphasis on composability and declarative thinking.

Why This Matters Today

Perlis's essay raises fundamental questions about programming education that remain unresolved. Modern introductory courses typically start with Python, Java, or JavaScript—languages that, while practical, demand significant syntactic overhead and explicit control structures. Students spend weeks learning loops, conditionals, and function declarations before tackling interesting problems.

APL offers an alternative path: start with powerful primitives, compose them fluently, and develop algorithmic thinking without language friction. The "lyrical" quality Perlis describes—the ability to write programs that read like mathematical expressions—aligns with contemporary interest in declarative programming and functional composition.

The essay also challenges the assumption that beginner-friendly languages must be verbose and explicit. Perlis argues that APL's initial learning curve ("a month versus a week" for BASIC) is justified by superior long-term outcomes. This trade-off between immediate accessibility and expressive power continues to shape language design debates today.

The Legacy and Limitations

APL never achieved mainstream adoption in education, though it maintains a dedicated community and has influenced languages and libraries such as Julia, NumPy, and even aspects of SQL. Its array-oriented thinking permeates data science and scientific computing. Perlis's vision of a language where "the sweep of the eye across a single sentence can expose an intricate, ingenious and beautiful interplay of operation and control" finds echoes in modern functional programming and array processing libraries.

Yet the practical barriers remain: APL's special symbols require keyboard layouts or input methods, and its paradigm shift challenges both students and instructors accustomed to imperative programming. The ecosystem around APL is smaller than mainstream languages, limiting available libraries and community support.

A Call for Reconsideration

Perlis concludes with a forward-looking perspective: "Above all, remember what we must provide is a pou sto to last the student for 40 years, not a handbook for tomorrow's employment." The Greek phrase "pou sto" (a place to stand) challenges educators to prioritize durable conceptual understanding over immediate job-market skills.

The essay invites us to reconsider whether our current introductory languages truly serve the five objectives Perlis outlines. Do students develop fluency in expressing algorithms, or do they learn to navigate language syntax? Do they understand computer organization, or do they remain at arm's length from hardware? Can they build systems from scratch, or do they primarily learn to use existing libraries?

APL's case, as Perlis presents it, isn't about nostalgia for an old language. It's about recognizing that the choices we make in early programming education shape how students think about computation for decades. Sometimes, the most powerful tool isn't the easiest to pick up initially—it's the one that, once mastered, expands your capabilities the most.

For those interested in exploring APL's philosophy, the APL Wiki provides comprehensive resources, while the Dyalog APL interpreter offers a modern implementation. The language's influence can also be seen in array-oriented tools such as NumPy and Julia, which bring some of APL's composability to mainstream programming.

Perlis's 1977 essay remains a provocative reminder that programming language design is fundamentally about shaping thought—and that sometimes, the most elegant path to understanding runs through symbols and concepts that initially seem foreign.