The first personal computer my family owned was a Commodore 64. By today’s standards, it was an ugly, clumsy dinosaur: a boxy monitor as deep as it was wide, a bulky gray keyboard with chunky black keys in which all the circuitry was housed, a stand-alone floppy disk drive that whirred and sputtered maniacally, a power adapter the size of Gibraltar. There was no internal hard drive, only sixty-four kilobytes of random access memory — a slate that wiped itself clean when you turned off the machine — of which, its startup screen asserted, 38,911 bytes were free for BASIC programming. The dot-matrix printer was fed paper in a continuous stream from a cardboard box too heavy for me to lift. The headache-inducing noise produced by the sound card became the stuff of my nightmares.
But to my seven-year-old mind, hypnotized by the digital flash of Star Trek: The Next Generation and convinced we were now the pioneers of a future that would take us to the stars, our Commodore 64 was a giant leap for mankind. The computer’s logo with its five horizontal bars spanning the spectrum — red, pink, yellow, green, blue — was as noble an insignia as the NASA chevron itself.
I made up outlandish tales about it at school. When I told my guidance counselor that my computer analyzed fingerprints just like the ones at the FBI, she raised an incredulous eyebrow, but I persisted in the lie. Already my peers were boasting of superior machines — Apples, IBMs — on which they played games that made Lode Runner, The Last Ninja, and Skate or Die! look like antiques. I was still waiting to get my hands on a copy of Starfox. No power I concocted for our Commodore was too extravagant. That machine could do it all.
Reality slapped me in the face, of course, every time I visited a friend’s house to play Nintendo. By the time I was ten, Dad had bought an IBM and handed down the Commodore to us kids, but we were not allowed to use the new computer, and the old one had begun to glitch badly. Spiderbot was an even more incomprehensible mess than it had been when the 5¼-inch floppy read cleanly.
I was too ashamed to invite friends over and let them see the technological shambles to which I had been reduced. Even my forays into BASIC programming —
10 PRINT "HELLO! WHAT'S YOUR NAME?" : INPUT N$
20 PRINT "GIVE ME A NUMBER, ";N$
30 INPUT A
40 LET B=A*A
50 PRINT "THE SQUARE OF ";A;" IS ";B
60 LET C=B*A
70 PRINT "THE CUBE OF ";A;" IS ";C
— were exposed as mere parlor tricks. The amateur program that spilled “HELLO, WORLD!” endlessly across the monitor’s cyan face looked like a satiric jab at my lot in life. Shackled to a faulty joystick and mediocre memory, I was trapped in a prison of yesterday.
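The program behind that endless greeting could hardly have been more than a single self-referential line, something like:

10 PRINT "HELLO, WORLD!" : GOTO 10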
Dad eventually relented and let us use the IBM. It had Microsoft Windows 3.1, the latest operating system on the market. At the flip of a switch, I was booted out of a text-based world and given icons to rearrange like building blocks, fonts to play with and even create, a stunning palette of lifelike colors. Evolution of the pixel.
I was still playing games, mostly shareware. All the best stuff was on consoles now, but I didn’t care to compete over it anymore. I learned DOS, dallied with Visual C++ but couldn’t find my bearings, then discovered the GW-BASIC interpreter and dove into what I now thought of as my native tongue. My programs were byzantine, riddled with passwords, secrets, nested subroutines — labyrinths in which I was a minotaur, the sole keeper of the ways in and out. A few programs even had booby-traps: the wrong keystroke would kick the user out, wipe the computer’s hard drive, and replace the boot-up batch file with an inane loop of taunts and derision.
Fancying myself a hacker but not imagining my activities might be illegal, I was so thoughtless as to boast of them. One schoolmate seized a golden opportunity to return some slight: he got hold of an especially nasty 3½-inch floppy and took down every machine in the technology lab. Hell rained down on me at home. At school, the principal rebuked me sharply and then sent me to the school’s IT specialist, Mr. Genly.
His office was littered with the innards of computers. I might have thought I was looking at an inner sanctum, had I not been sensible that I was not really there to learn anything — the principal just wanted me out of trouble and out of harm’s way. Mr. Genly knew which way was up. He put me to a little use, carting hardware around the school, but mostly left me to program on my own in an unused back room. While other kids were at gym, I dug deeper into BASIC, writing my first graphical game.
Colored circles fell from the top of the screen. You aimed and shot them down with the mouse. So conceptually it was a vertical-scrolling shooter, like Tyrian, though it lacked a background against which to observe movement. I was only working in BASIC, after all — it’s not a graphics-friendly language. Nonetheless, rudimentary as it was, I was proud of what I had created.
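A single falling circle was about the extent of what GW-BASIC's drawing statements made easy; a sketch of the idea, minus the mouse aiming (for which the interpreter had no native support), might have looked something like this:

10 SCREEN 1 : RANDOMIZE TIMER
20 X = INT(RND * 300) + 10 : Y = 0
30 WHILE Y < 190
40 CIRCLE (X, Y), 5, 1
50 FOR D = 1 TO 200 : NEXT D
60 CIRCLE (X, Y), 5, 0
70 Y = Y + 2
80 WEND

Line 50 is a crude timing delay, and line 60 erases the circle by redrawing it in the background color before it drops another step.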
I embarked on high school planning to focus on mathematics in preparation for a computer science degree. Then came disasters in love, the first of many. Staring at a computer screen didn’t buoy me up, but I found staring at a book did. My flirtation with books had begun at the age of twelve, when my parents separated and, bored out of my wits while staying with my grandmother, I picked up a few thrift paperbacks — adventure stories, mostly Robert Louis Stevenson and Jules Verne. By high school, I was up to my eyeballs in the wondrous escapism of science fiction and fantasy: Clarke, Bradbury, Le Guin, McCaffrey, and of course Tolkien. For a year I carried in my jacket pocket a dog-eared copy of The Hobbit that I had rescued from my uncle’s effects after he died. But my true literary metamorphosis began sophomore year, when for the first time I read Shakespeare with a teacher who really seemed to care what I thought of it.
The texts we read in class were plays, but poetry attracted me. I spent hours contemplating the sonnets and tried my hand at a couple — overwritten tripe. Then I came across a college textbook on Dad’s bookshelves, Laurence Perrine’s Literature: Structure, Sound, and Sense. I read the whole tome in a month. It was in those pages, I recall, that my life as a writer really began. I even remember which poem effected this pivot: Wilfred Owen’s “Dulce et Decorum Est.” I immediately recognized it as a response to other wartime poems such as John McCrae’s “In Flanders Fields,” which years before I had learned by rote from Dad’s volume of familiar verse, One Hundred and One Famous Poems. Here was a true, meaningful drama: one writer arguing with another about the casualties of a war in which both fought, both died. Here was life’s slaughterhouse.
To Dad’s consternation, I declared an English major in college. An engineer who had lifted all of us out of poverty very much by his bootstraps, he couldn’t comprehend a choice so vague and so likely to land me back in the poorhouse. By this time, though, I was chipping away at a novel, and he couldn’t dissuade me. I didn’t know whether I would become a novelist, a poet, an English teacher, a college professor, or even a journalist, but I knew my future lay in words. I applied myself with zeal to an editorship on the student literary journal. I wrote poetry and began to feel that was my calling. Eventually, after teaching English for a year at a rural high school and deciding I had no future in it, I went after an MFA in creative writing, a pursuit I could never explain to Dad’s satisfaction.
I never gave up computers, really. I stopped programming and became a mere user, but my immersion continued. College being an intensely social experience, I felt obligated to join in the fray of social media when my friends did. Over time, as we all floated off on the currents of life’s vagaries and vicissitudes, I watched them become icons hovering over assemblages of text.