Programming History

Fifty Years of Curly Braces

How a language born on a PDP-11 in a New Jersey office became the invisible substrate of modern computing.

01

Where It All Began: Room 2C-644


Every language has an origin myth. C's is better than most, because it's actually true: two guys in a New Jersey office, annoyed that their computer couldn't run the operating system they wanted, decided to build both the OS and the language to write it in. At the same time.

In 1969, Ken Thompson had written a stripped-down operating system for the PDP-7 using assembly language. He'd also created B, a typeless language descended from Martin Richards' BCPL. B worked fine on the word-oriented PDP-7. Then Bell Labs got a PDP-11, which was byte-addressable, and B's lack of types became a showstopper.

Dennis Ritchie took B and added what it needed: types. int, char, arrays, pointers, structures. The language passed through "New B" (NB) before becoming C around 1972. By 1973, Ritchie and Thompson had done something unprecedented: they rewrote the entire Unix kernel in C. Roughly 90% of the kernel was now in a high-level language, with only the hardware-critical 10% remaining in assembly.

"C is quirky, flawed, and an enormous success." — Dennis Ritchie

That decision—rewriting an operating system in a high-level language—was the moment that changed everything. It made Unix portable. And portable Unix meant C would spread to every machine that ran it.

02

The White Book That Taught a Generation


Before there were Stack Overflow answers, before there were YouTube tutorials, there was a 228-page book with a plain white cover. The C Programming Language by Brian Kernighan and Dennis Ritchie—universally known as "K&R"—didn't just document C. It defined what a programming book could be.

Page one introduced what may be the most famous program ever written:

#include <stdio.h>

main()
{
    printf("hello, world\n");
}

Kernighan later recalled that the "hello, world" idea came from a BCPL tutorial he'd seen. The lowercase, the comma, the lack of exclamation point—all deliberate. It was understated, like the language itself. That single example spawned a tradition that persists in every language tutorial written since.

For a decade, K&R's "Appendix A: Reference Manual" served as C's only formal specification. The book sold millions of copies and was translated into dozens of languages. By the mid-1980s, C compilers existed for everything from the Apple II to Cray supercomputers. But this success created a problem: every compiler vendor added their own extensions, and "K&R C" was fragmenting into incompatible dialects.

03

Taming the Dialects: The ANSI Standardization Wars


By 1983, C had a serious identity crisis. Every platform had its own flavor. Code that compiled on one vendor's system would choke on another's. The ANSI X3J11 committee was formed to fix this, and they spent six years doing it—an eternity in computing, but the result was worth it.

The committee made several decisions that shaped C's future. They borrowed function prototypes from C++ (which Bjarne Stroustrup had been developing as "C with Classes" since 1979), enabling the compiler to type-check function arguments for the first time. They introduced void for functions that return nothing and void * for generic pointers. They added const and volatile qualifiers.

Timeline of C programming language milestones from 1969 to 2024
Key milestones in C's evolution, from Thompson's B language to the C23 standard.

The standard was ratified as ANSI X3.159-1989 (C89) and adopted internationally as ISO/IEC 9899:1990 (C90). The distinction is mostly bureaucratic—the language is identical. What mattered was that for the first time, there was one C. Vendors could compete on optimization quality rather than language extensions. Portable C code became a realistic goal, not an aspiration.

04

C99: The Modernization Nobody Could Agree On


C99 was supposed to bring C into the modern era. It did—eventually. The problem was that the most important compiler vendor in the world refused to implement it for over a decade.

The new features were genuinely useful: // single-line comments (borrowed back from C++), inline functions, variable-length arrays (VLAs), the _Bool type, stdint.h for fixed-width integer types, and the restrict keyword for pointer aliasing hints. For numerical computing, C99 added <complex.h> and improved IEEE 754 floating-point support.

But Microsoft's Visual C++ stubbornly refused to support C99, arguing that developers should use C++ instead. Since MSVC was the dominant compiler on Windows, this meant cross-platform developers were stuck writing C89 well into the 2010s. GCC and Clang supported C99 early, but the Windows ecosystem's resistance created a bizarre time warp where the world's most popular desktop platform was a decade behind on C standards.

C's TIOBE Index share over time. Despite fluctuations, C has remained in the top 5 for nearly four decades.
C's TIOBE Index share over time. Despite fluctuations, C has remained in the top 5 for nearly four decades.

The lesson: a standard is only as good as its implementations. You can write the most elegant spec in the world, but if the compiler your users depend on ignores it, you've written poetry, not infrastructure.

05

Threading the Needle: C11 Through C17


By 2011, multicore processors were everywhere and C still had no standard threading model. Every platform rolled its own: POSIX threads on Unix, Win32 threads on Windows, proprietary APIs on embedded systems. C11 tried to fix this with <threads.h> and <stdatomic.h>.

The "tried" is doing heavy lifting in that sentence. C11's threading library was made optional—a compromise that satisfied the embedded compiler vendors (who didn't want to implement threads on 8-bit microcontrollers) but frustrated everyone else. In practice, most C developers continued using pthreads or platform-specific APIs.

C11's real wins were quieter: _Generic for type-generic macros (a poor man's function overloading), _Static_assert for compile-time checks, and anonymous structs/unions. The standard also acknowledged reality by making VLAs optional—they'd proven to be a security risk and implementation headache.

C17 (ISO/IEC 9899:2018) was explicitly a "bugfix release" that introduced zero new features. It corrected defects and ambiguities from C11, nothing more. The WG14 committee had learned from C99: better to ship a clean standard than an ambitious one nobody implements.

06

C23: The Renaissance Standard


If C99 was the ambitious update that stumbled on adoption, C23 is the ambitious update that learned from every mistake. Formally published as ISO/IEC 9899:2024, it's the most significant revision since C99—and this time, the major compilers are on board.

The headline features read like a wish list that C programmers have carried for decades. nullptr finally gives C a type-safe null pointer constant (no more hiding behind the (void *)0 macro). constexpr objects enable compile-time evaluation. Binary literals (0b1010) and digit separators (1'000'000) make bit manipulation and large numbers readable. typeof is standardized after decades as a compiler extension.

But the standout is #embed—a preprocessor directive that lets you include binary data directly in your source. Firmware blobs, lookup tables, embedded images: all the things C programmers have hacked around with xxd and code generators for decades now have a first-class solution. JeanHeyd Meneide, the proposal's champion, called it "the most impactful preprocessor feature since #include."

C23 feature highlights: nullptr, constexpr, typeof, #embed, binary literals, digit separators, _BitInt for arbitrary-width integers, and improved Unicode support.

07

Fifty Years of Curly Braces


Here's a number that should stop you cold: the Linux kernel contains roughly 40 million lines of code, and approximately 80% of it is C. The Windows NT kernel's "OneCore" executive is about 1.7 million lines of C/C++. The XNU kernel powering every Mac, iPhone, and iPad is a mix of C and C++. C doesn't just influence modern computing. It is modern computing's foundation layer.

Infographic showing C's dominance in operating system kernels: Linux 40M lines, Windows 1.7M lines, embedded systems 60-70% market share
C powers the world's operating systems — from Linux servers to embedded microcontrollers.

The language's syntactic DNA is everywhere. C++ (1979) was literally "C with Classes." Java (1995) borrowed C's syntax wholesale while adding garbage collection and a virtual machine. C# (2000) took that further. Go (2009)—co-created by Ken Thompson himself—is essentially "C for the cloud era." Even Rust (2010), which exists partly to fix C's memory safety problems, inherited C's curly braces, semicolons, and systems-level ambition.

Horizontal bar chart showing C's syntactic influence on 13 modern programming languages
C's syntactic DNA in modern languages, from C++ (95% influence) to Swift (50%).

The Rust debate is real and ongoing. In 2024, CISA and the White House issued guidance urging a transition to memory-safe languages. Rust modules have been accepted into the Linux kernel since version 6.1. But C maintains a 60–70% market share in embedded systems, where its minimal runtime and direct hardware mapping are irreplaceable. As of early 2026, C sits at #4 on the TIOBE Index, trailing Python, Java, and C++—but still ahead of every language created in the last 30 years.

Dennis Ritchie died quietly on October 12, 2011, just days after Steve Jobs. His death received a fraction of the coverage. But every time you use a smartphone, browse the web, or fly in an airplane, you're running on infrastructure that exists because Ritchie decided B needed types.

The Best Abstractions Are Invisible

C succeeded not because it was the best language, but because it was the right language at the right level of abstraction—close enough to the machine to be fast, far enough from it to be portable. More than fifty years later, that balance still hasn't been surpassed. The curly braces aren't going anywhere.
