
Play While Learning

In the early nineties, the basement of Mong Kok's Sino Centre was always permeated by a specific smell: cheap photocopier paper, plastic CD cases, and the faint burnt odor of overheating computer fans. That was my weekend pilgrimage site throughout my teenage years.

The owner of the software shop might not have recognized my face, but I recognized every item on his shelves. PC Era, PC Market, PC Home: these magazines vanished collectively in the internet age, but at the time they were practically my only window onto this world.

I also had a teacher: an EEE (Electrical and Electronic Engineering) prodigy at the University of Hong Kong. Many of my games were copied from him. I would go to the computer shop to buy Maxell floppy disks (Japanese-made, pricier than the off-brands but durable; off-brand disks often developed bad sectors back then) and take them to his house, where we copied them drive-to-drive under DOS. When a game refused to run, or ate an abnormal amount of memory, he would teach me how to tweak config.sys, how to swap out COMMAND.COM, how to squeeze out a few extra KB of memory. So I wasn't actually a kid figuring it out all alone; most of those modding skills were taught to me, hands-on, by a master.
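For readers who never wrangled DOS memory: a typical tuning pass looked something like the config.sys below. This is an illustrative reconstruction of the common DOS 6.x idiom, not the actual file from back then, and the CD-ROM driver name is hypothetical. Loading HIMEM.SYS and EMM386.EXE, then pushing drivers into upper memory, was how you clawed back those precious KB of conventional memory for a stubborn game.

```
REM Enable extended memory (XMS)
DEVICE=C:\DOS\HIMEM.SYS
REM Map upper memory blocks; NOEMS gives up EMS to free more UMB space
DEVICE=C:\DOS\EMM386.EXE NOEMS
REM Load DOS itself high and allow DEVICEHIGH/LOADHIGH to use UMBs
DOS=HIGH,UMB
REM Move bulky drivers out of the first 640 KB (driver name is illustrative)
DEVICEHIGH=C:\CDROM\CDROM.SYS /D:MSCD001
FILES=30
BUFFERS=20
```

After a change like this, running `mem /c` under DOS showed how much conventional memory you had freed; a game demanding 600 KB free would finally launch.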

In 1995, the year I moved from Form 3 to Form 4, my teacher's whole family emigrated. Before leaving, he said something to me that I remember to this day: "There are endless things to learn in the world. You already don't need to wait for someone to teach you; you can learn by reading books and magazines yourself. The internet is already here, it's just not widespread yet—but it will be soon, and learning online will be much faster than waiting for magazines. And work on your English."

After that year, I never sought out another master.


When I was in primary school, we had a computer at home—and it was consistently top-of-the-line for its time. "Building a rig" (砌機) was popular back then: motherboards, CPUs, RAM, graphics cards could all be detached and swapped. My machine upgraded all the way from a 386 to a 486, then to a Pentium. My family wasn't badly off, and my parents spoiled me; back then, if you wanted to run the full suite of daily software—the ET3 Chinese System under DOS, Shakespeare Chinese typesetting, plus Windows 3.1 and CorelDRAW—you simply had to keep upgrading.

At that time, Windows 3.1 wasn't an operating system in today's sense. You booted into DOS, and to run Windows 3.1, you had to type the command win yourself. The vast majority of daily software had no graphical interface; what you faced was forever that blinking C:\> cursor. In that era, the very act of "using a computer" meant dealing with the command line—you had no other choice.

And that constantly evolving top-tier computer, in the hands of a primary school student like me, was mostly used for playing games. It was precisely while playing games and wrestling with the machine at home that I stumbled into hexadecimal.

We had a Casio scientific calculator at home, originally my older sister's, the standard-issue model almost every secondary school student owned, and it had BIN, OCT, HEX, and DEC keys. Converting 255 into FF, then into 11111111, was my way of killing time in math class. So I knew how to pronounce the word "hexadecimal" early on, but nobody had ever told me why it mattered.

That "why" was forced out by games. Whether modding save files or using cheat codes, you kept seeing the same numbers: characters capped at level 255, HP maxed out at 255, certain stat tables freezing at FF when they peaked. After twenty or thirty rounds of "why won't my character get any stronger," you connect the dots yourself: FF on the calculator converts to 255, so FF must be the state where all eight bits are lit up; a byte cannot hold a number larger than 255, so it overflows. Behind that calculator key I had pressed a thousand times stood the entire architecture of the CPU.
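The connection the calculator hinted at can be written out in a few lines. Here is a minimal Python sketch (my illustration, obviously not something from the DOS era) of why 255, FF, and 11111111 are the same eight-bit wall:

```python
# Three spellings of the same one-byte value
assert 255 == 0xFF == 0b11111111

# What the calculator's HEX and BIN keys were doing for me
print(hex(255))   # -> 0xff
print(bin(255))   # -> 0b11111111

# An 8-bit stat cannot exceed 255: one more step wraps around to 0,
# which is why a maxed-out character "won't get any stronger"
level = 255
level = (level + 1) & 0xFF   # mask to one byte, like the game's counter
print(level)      # -> 0
```

The `& 0xFF` mask is exactly what the hardware does for free: a byte simply has no ninth bit to carry into.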

Kids of my era didn't sit down to learn computers; we learned while playing ("玩住學"). To get a game running back then, you often had to hand-tweak config.sys, calculate memory allocation, and figure out which IRQ the sound card was set to; you had to get your hands dirty with both software and hardware. To kids back then, these steps weren't chores, they were a ritual: clear the hurdles and the game ran; if it ran, you won.


Around the year 2000, I entered university. By then, departments like Computer Science, Computer Engineering, and Software Engineering had long existed and were packed with students. Traditional EEE wasn't unpopular either: it was stable, counted for points under immigration schemes, and was accepted by parents. But everyone studying EEE was aiming for the traditional "heavy current" paths: building contractors, power plants, electrical and mechanical engineering. Very few thought of using EEE to cultivate a cross-layer perspective.

I chose EEE, adding a double major in Computer Engineering.

I grew up playing with Gundam models, RC cars, and RC helicopters, and what I had always wanted, deep down, was to "make a product myself," not simply to write software. To me, crossing from hardware to product made EEE the most appropriate foundation, even if few people in my cohort were thinking from that angle.

In the first semester, we received an assignment: use a Motorola HC11 microcontroller, a variable resistor, and an RCA output cable to build a brick-breaker game console. No libraries, no frameworks. The HC11 bit-banged the video timing signals directly, which a simple DAC converted to analog, pushing pixels one by one out the RCA output. The analog signal from the variable resistor was sampled by the ADC and converted into the paddle's position. The entire program was finally burned into an EEPROM whose capacity was measured in KB.

That semester, by myself, I built a complete game console from scratch.


On the surface, this book is about how video games propelled the tech hegemony of the past forty years. But my true motivation for writing this book is to answer a more personal question—

Why, in my era, could someone still grow up to be the kind of engineer who straddled hardware and software, understanding a little bit of everything from transistors to the cloud? Why has this growth path practically disappeared today?

The answer to this question is tied to every protagonist later in this book.

John Carmack is Carmack because the era he grew up in left the hardware exposed. He had to write directly to VGA memory, had to figure out how every single clock cycle of the CPU was spent himself—so when 3D graphics engines later had to make trade-offs at the hardware limits, he was the one who could still stand firm.

Gabe Newell's story is particularly personal to me. When I was young, I didn't understand why some games suddenly refused to run, why I had to swap COMMAND.COM, why every new version of Windows made DOS games harder to play—it wasn't until years later, when I encountered Linux, that I saw it clearly: it wasn't a technical problem, it was a business problem. Chapters 5 and 6 of this book expand on that history; but here in the preface, I just want to say, some paths you are forced to walk to the end, and only upon finishing do you realize that others were walking the exact same path.

Gabe Newell chose Linux because he knew the cost of a closed stack. The abstraction layer itself is not the problem; it is the only way to manage complexity. What is truly fatal is the black box—when Windows and Intel lock down every layer, developers become tenants of the platform holders, liable to be charged rent or evicted at any time. For Valve to walk out of this predicament required a leader who didn't just understand APIs, but could see through the entire stack—because to build your own gaming platform on an open but complex system (Linux), you must know which layer can be opened, and what can be done once it's open.

Jensen Huang bet on CUDA between 2006 and 2007, enduring for nearly a decade, because he could read the seam between hardware and software. He was neither a pure software engineer nor a pure hardware engineer, but a rare kind of person who could grasp both ends simultaneously.

The predicament facing Intel and Microsoft today can essentially be understood as: a generation of engineers who "only understand frameworks, not hardware" is taking over an empire that requires a systems-level perspective. They are incredibly smart, but their habitual way of working is to optimize standing atop countless abstraction layers, rather than piercing through the abstraction layers to redesign the foundation.


What this book records is forty years of tech history. But the question it truly wants to ask is:

In the AI era, is it still possible for us to cultivate the next generation's Carmack, Lisa Su, or Morris Chang?

Honestly, I've never really been a normal person—I've just always been pretending to be normal.

If this kind of person no longer exists, this book is a eulogy. If this kind of person is still here, this book is an instruction manual.