Language Legacy

The Parenthetical Immortal

How the world's second-oldest programming language quietly won the war of ideas — and why its best days may still be ahead.

01

The 68-Year-Old Language That Just Shipped ARM64 Support

Here's a sentence that should make every "Lisp is dead" commentator feel slightly foolish: Steel Bank Common Lisp just shipped production-grade Windows ARM64 support in version 2.6.2. That's a programming language created in 1958 running natively on a Windows-on-ARM platform that barely existed five years ago.

SBCL maintains a strict monthly release cycle — more disciplined than most VC-funded startups can manage for their SaaS products. The February release also includes critical optimizations for RISC-V and improvements to bignum arithmetic. While the JavaScript ecosystem churns through build tools like a teenager through streaming services, Common Lisp's flagship compiler quietly adapts to every new chip architecture that emerges.

When Apple shipped M-series silicon, SBCL was ready. When Qualcomm pushed Snapdragon Elite for PCs, SBCL adapted. The language that powered the first AI boom at MIT in the 1960s now runs on the chips powering the second one. That's not nostalgia — that's architectural resilience that most modern languages can only dream of.

[Chart: execution speed and memory efficiency across Lisp implementations]
SBCL remains the performance benchmark for the Lisp family, with Chez Scheme close behind on memory efficiency. Emacs Lisp, designed for text editing rather than computation, trails significantly.
02

Functional Elegance Meets Raw GPU Power

The knock on Lisp has always been performance at scale — beautiful for symbolic manipulation, lousy for number crunching. Dragan Djuric's Uncomplicate project just obliterated that argument. The Clojure-based deep learning and linear algebra ecosystem now supports CUDA 13.0 and Apple's Metal API natively.

You can now write GPU-accelerated neural networks in a Lisp dialect, running on the same hardware that trains GPT-class models. Djuric's pitch is characteristically blunt — "You shouldn't have to choose between functional elegance and GPU performance." He's right. The Python hegemony in machine learning was never about Python being good at math. It was about library availability. Uncomplicate closes that gap for the Clojure ecosystem.

This matters beyond benchmarks. Clojure's immutable data structures and REPL-driven workflow make it arguably better than Python for exploratory ML research, where reproducing results and reasoning about state is half the battle. If you've ever lost track of which Jupyter notebook cell mutated which global variable, you understand the problem Lisp solved in 1958.
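The hidden-state problem is easy to demonstrate. A minimal Python sketch (the function names and config fields are invented for illustration) contrasts the mutating updates that plague notebook workflows with the persistent-style updates Clojure makes the default:

```python
# Mutating update: every caller shares, and can silently corrupt, one dict.
def tune_in_place(cfg):
    cfg["epochs"] *= 2        # mutates the caller's state
    return cfg

# Persistent-style update: build a new dict, leave the old one intact,
# so any earlier experiment remains reproducible.
def tune_pure(cfg):
    return {**cfg, "epochs": cfg["epochs"] * 2}

original = {"model": "mlp", "epochs": 10}
tuned = tune_pure(original)
assert original["epochs"] == 10   # the earlier run still exists
assert tuned["epochs"] == 20

shared = {"model": "mlp", "epochs": 10}
tune_in_place(shared)
assert shared["epochs"] == 20     # the "original" is gone
```

Clojure's persistent data structures do this by default and share structure under the hood, so the copy is cheap rather than a full duplication.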

03

Racket 9.0 Breaks Free of Green Thread Prison

Racket has always been the "language laboratory" of the Lisp family — the place where wild ideas about programming language design get prototyped and stress-tested. Version 9.0 delivers the most significant concurrency overhaul in over a decade: native parallel threads, replacing the historical green-thread model that limited true multicore utilization.

But the real story is Rhombus — a Racket-based language that uses conventional syntax instead of S-expressions. It's Lisp's power without Lisp's parentheses, designed explicitly to attract developers who can't get past the visual shock of (define (factorial n) (if (<= n 1) 1 (* n (factorial (- n 1))))). This is strategic pragmatism from a community that's historically been content to let the uninitiated self-select out.

Racket's approach to "Language-Oriented Programming" — building custom languages for specific domains — is arguably more relevant now than when it was conceived. Every organization wrestling with complex business rules is essentially building a domain-specific language whether they realize it or not. Racket just has 30 years of tooling for doing it well.
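Racket's #lang machinery gives a custom language real parsing, macros, and tooling, but the core move is treating rules as data. A toy sketch in Python (the rule format and eligibility example are invented for illustration) shows what those ad-hoc business-rule DSLs amount to:

```python
# A toy business-rule language written as S-expression-style nested lists.
# A real Racket #lang would add syntax, error reporting, and IDE support;
# this sketch only demonstrates the "rules as data" idea.
RULE = ["and",
        [">=", "age", 18],
        ["or", ["==", "country", "US"], [">=", "score", 700]]]

def evaluate(rule, record):
    op, *args = rule
    if op == "and":
        return all(evaluate(a, record) for a in args)
    if op == "or":
        return any(evaluate(a, record) for a in args)
    if op == ">=":
        field, value = args
        return record[field] >= value
    if op == "==":
        field, value = args
        return record[field] == value
    raise ValueError(f"unknown operator: {op}")

assert evaluate(RULE, {"age": 30, "country": "DE", "score": 720})
assert not evaluate(RULE, {"age": 17, "country": "US", "score": 800})
```

Because the rule is plain data, it can be stored, audited, and rewritten at runtime, which is exactly the property Lisp gets for free from S-expressions.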

04

Rich Hickey Thinks You're Outsourcing Your Brain

Rich Hickey's Clojure/Conj keynotes have always been essential viewing for anyone who thinks about software design. His 2025 talk, "The Human Element in an AI World," went somewhere unexpected: philosophy. Hickey argued that Lisp's deepest legacy isn't macros or homoiconicity — it's the tradition of mentorship and intellectual generosity that the community cultivated.

"Teaching others solidifies your own understanding — a process we risk losing if we only interact with bots instead of people." That's not a Luddite rejecting AI. That's a language designer who spent decades optimizing for developer experience warning that we're optimizing away the developer. When Hickey says Lisp's primary value is enabling creative individuals to pursue a unique vision through a "fun and flexible medium," he's making a case that craftsmanship is the point, not just the output.

This philosophical turn from the Clojure community is significant. The language has always attracted developers who care deeply about simplicity and deliberate design. If even this community is sounding alarms about AI replacing thought rather than augmenting it, the rest of the industry should pay attention.

[Chart: number of mainstream languages adopting each Lisp concept]
Lisp's conceptual fingerprints are everywhere. Garbage collection and closures are now table stakes; macros and homoiconicity remain Lisp's competitive moat.
05

The Symbolic Revenge: Why AI Agents Are Rediscovering S-Expressions

Here's the delicious irony of 2025's AI landscape: the field that abandoned symbolic AI and Lisp in the 1990s in favor of neural networks is now discovering that large language models work better when you give them... symbolic reasoning tools. Specifically, S-expressions.

A growing body of research proposes using a Lisp REPL as a persistent reasoning loop for LLMs. The argument is compelling: Lisp's uniform syntax reduces token cost when AI generates code, its homoiconicity means agents can inspect and modify their own reasoning, and the REPL provides a natural "scratchpad" for iterative problem-solving. One paper puts it bluntly: "Lisp is the natural intermediate representation for autonomous agents that must reason about and modify their own logic."
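The technical claim underneath these proposals, that S-expressions are trivially parsed and manipulated, is easy to see concretely. A minimal reader and evaluator in Python (a toy for illustration, not any paper's actual system):

```python
import operator

def tokenize(src):
    # Uniform syntax: two delimiters and whitespace is the whole grammar.
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # discard ")"
        return expr
    try:
        return int(tok)
    except ValueError:
        return tok     # a symbol

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    if isinstance(expr, int):
        return expr
    op, *args = expr
    return OPS[op](*[evaluate(a) for a in args])

# Code is data: the parsed program is an ordinary nested list that an
# agent could inspect or rewrite before (re)evaluating it.
program = parse(tokenize("(* (+ 1 2) (- 10 4))"))
assert program == ["*", ["+", 1, 2], ["-", 10, 4]]
assert evaluate(program) == 18
```

That the whole reader fits in a dozen lines is the point: an agent emitting S-expressions never fights a grammar, and the program it emits is already the data structure it needs to reason about.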

This isn't just academic curiosity. John McCarthy created Lisp in 1958 specifically for artificial intelligence research. The language was designed from the ground up to manipulate symbolic expressions — exactly what modern AI agents need to do when they plan, reason, and debug. The neural network detour was enormously productive, but the field is circling back to the insight that statistical pattern matching alone isn't enough. You need structure. You need symbols. You need (lambda (x) (x x)).

[Timeline: 68 years of Lisp development, 1958 to 2026]
From McCarthy's 1958 paper to SBCL's 2026 ARM64 release — Lisp's timeline spans the entire history of practical computing.
06

The Curse That Became a Blessing

Rudolf Winestock's "The Lisp Curse" essay has haunted the community since 2009, and its central thesis still stings: Lisp is too powerful. It allows a single developer to build a private universe, which makes collaboration — the boring, essential foundation of the software industry — unnecessarily difficult. "Lisp is the language of the individual genius, but the software industry is built on the collaboration of the mediocre."

The 2024-2025 addendum to this argument is even more pointed. LLMs struggle with Lisp because there's no single "standard" way to write it. Go and Java thrive with AI coding assistants precisely because their uniformity makes pattern-matching trivial. Lisp's image-based development model — where the running system is the source of truth — is antithetical to modern stateless CI/CD pipelines. Every advantage becomes a liability when the industry optimizes for something else.

But maybe that's the point. McCarthy's 1960 paper — "Recursive Functions of Symbolic Expressions and Their Computation by Machine" — didn't set out to create a popular language. It set out to discover the minimal essence of computation. Lisp's real legacy isn't market share; it's the ideas that every successful language eventually absorbs. Garbage collection, closures, interactive development, code as data, conditional expressions — all Lisp innovations, all standard features of languages that "beat" Lisp in popularity. Lisp didn't lose. It became the water.
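The closure claim in particular is concrete. A sketch in Python (standing in for any mainstream language) of the lexically scoped closure that Lisp's Scheme branch pioneered:

```python
def make_counter(start=0):
    count = start
    def counter():
        nonlocal count   # close over the enclosing binding
        count += 1
        return count
    return counter       # the captured environment outlives the call

tick = make_counter()
assert [tick(), tick(), tick()] == [1, 2, 3]
```

The captured environment surviving the enclosing call is exactly what requires garbage collection, another Lisp-first feature: the two innovations travel together into every language that adopted them.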

[Infographic: the Lisp family tree, 1958 to the present]
The Lisp Family Tree: From McCarthy's 1958 paper to today's living dialects, plus the concepts that escaped into every mainstream language.

The Longest Game

Sixty-eight years in, Lisp's position is paradoxically stronger than ever — not because the world adopted Lisp, but because Lisp's ideas adopted the world. Every time you use a closure in JavaScript, a macro in Rust, or garbage collection in anything, you're writing Lisp with extra steps. McCarthy's parentheses are the invisible scaffolding of modern programming. And as AI agents start needing to reason about their own code, those parentheses might become visible again. The language of the future keeps being the language from the past.
