Illustration by Ana Vasquez

The Death and Life of the ’90s Computer Show

John Fischer
Magenta

--

The enthusiast tech culture of the time was no match for closed systems and corporate interests. But that DIY ethos is making a comeback today.

To describe the computer show circuit of the 1990s is to recall something apocryphal: an artifact of an earlier technological epoch that has avoided the enshrinement offered to mushroom-eating video game plumbers and free America Online disks. The computer show—to the extent that it’s remembered at all—inhabits the awkward period of personal computing’s adolescence, when it was just a clunky interloper in the mainstream consciousness of the millennium’s final decade.

I mean this as affectionately as possible, since I was, during my own awkward adolescent years, a regular visitor of such events. Where I grew up in the New York suburbs, computer shows were a cottage industry. The weekend events, organized by promoters with names like MarketPro and Tri-State Computer Shows, assembled the regional vendors of PC hardware and software under a single roof: the convention center of the local Hyatt, an indoor amusement park called Sports Plus, or the gymnasium of the nearby state university.

Computer shows sprouted up in metropolitan suburbs across the country during a time when the personal computing industry was only beginning to expand beyond a catalog business of obscure parts and abbreviations. Computer retail stores were then small, independent affairs relegated to strip malls, or otherwise short-lived chains like Egghead Software and CompUSA. So for those of us living in the hinterlands of the so-called computer revolution, computer shows made both financial and social sense: once a month, for the cost of $10, you could walk amongst your fellow man, buy memory cards and CD burners directly from tables of Taiwanese wholesalers, and maybe try out the latest shareware demos. These were not the famous trade shows like Macworld Expo or DEF CON — these were handwritten signs, 32-ounce Big Gulp cups, crates in the back of beat-up Toyota hatchbacks. This was local commerce at work, performing the sweaty and incremental labor of pushing desktop computing into the lives of middle-class Americans.

Though computer shows were unglamorous, they were also prescient in a way that bears re-examination. To attend a computer show in the 1990s was to experience an unlikely celebration of the tinkerers and inventors who saw mechanical systems not as products to be consumed but as systems that could be bent, modified, and tuned to their particular needs. In the years before our devices became sealed and seamless fetish-objects, computer shows were a brief but vibrant manifestation of the open-source spirit underlying all great technological innovations: invention as play. And among today’s tech-savvy young people, that same spirit is again in resurgence.

The history of the computer is the story of two ever-competing philosophies: the open system versus the closed. When I mentioned my interest in computer shows to Clive Thompson — a technologist, Wired magazine columnist, and author of the book Smarter Than You Think: How Technology is Changing our Minds for the Better — he suggested that they were actually part of an ongoing fluctuation between the technological poles of freedom and control.

“When you look at the history of modern computers, there are distinct moments of creative ferment that occur when people are left alone to experiment,” Thompson tells me. In his opinion, it’s during these moments of “openness,” of unfettered access to systems seen as inconsequential or unprofitable, that the most significant technological innovations are born. He points to the 1950s and ’60s, when a concentration on building massive hardware machinery led to an undercurrent of programming fervor that produced some of the earliest computer languages, such as FORTRAN and COBOL. A similar ferment surrounded the microcomputer kits of the mid-’70s, such as the primitive Altair 8800, which was based on one of Intel’s earliest microprocessors and sold primarily through Popular Electronics magazine. The Altair required users to program it by flipping individual switches on its front panel that represented binary numbers. “Because microcomputer kits were cheap and crappy, you had thousands of people ordering them,” says Thompson. “Pretty soon you have a guy named Bill Gates figuring out how to get the programming language BASIC to run on a microcomputer, and that’s when Microsoft is born.”
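The front-panel workflow described above can be sketched in a few lines: each instruction byte was entered as a row of eight toggle switches, one per binary digit. A minimal illustration in Python (the helper name and rendering are my own, not anything Altair-specific):

```python
def to_switches(value: int, width: int = 8) -> str:
    """Render an instruction byte as switch positions (1 = up, 0 = down)."""
    if not 0 <= value < 2 ** width:
        raise ValueError("value out of range for panel width")
    # Zero-padded binary string, most significant bit on the left,
    # mirroring the left-to-right row of panel switches.
    return format(value, f"0{width}b")

# The Intel 8080's HLT (halt) instruction is opcode 0x76,
# so halting the machine meant toggling in this pattern:
print(to_switches(0x76))  # -> 01110110
```

Entering even a short program this way meant toggling in every byte by hand and pressing a deposit switch for each one, which gives some sense of why a working BASIC interpreter was such a leap.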

But openness coexists uneasily with commerce. Once a new innovation is deemed profitable, commercial interests attempt to extract its value through the logic of predictability, replicability, and control. In other words: closure. As microcomputing metamorphosed from homebrew experiments into the products that would become Windows and Macintosh, the ethos of invention was subsumed into the imperative of profitability. This dynamic can in fact be found at work within almost any emergent electronics innovation of the past half-century: synthesizers, websites, social media, and so forth. Call it the gentrification of intellectual property.

Though it’s tempting to cry foul, the domination of open systems by corporate interests has a less-than-nefarious cause: cost management. An invention without an application serves a market of one; bringing it to a mass audience is expensive, complicated, and often risky. In the 1990s, the issue of cost manifested most clearly in computer hardware. Hardware, after all, was the true economic rationale for the existence of computer shows. You could, for a time, purchase a desktop that HP might retail at $3,000 for almost $1,000 less, if you were willing to assemble it yourself. But as desktop computing gained mainstream popularity in the early 2000s, the cost of hardware production fell precipitously, forcing computer retailers to make up their lost profits through scale. Within a few short years, the deals that could once be found at computer shows shifted to big-box retailers and websites like Amazon.

It is perhaps no surprise, then, that two of the most iconic products of the new millennium — Apple’s iMac and iPod — owe their success in no small part to the obfuscation of their inner workings, appealing to customers through the softer-sell of personal taste rather than the particulars of speed, size, and power. They were, in effect, a harbinger of the model to which the entire personal computing industry would move in a few short years. If hardware was fast becoming a commodity, then the product itself would have to evolve beyond its constituent elements.

Hlynur Atlason, an industrial designer and founder of the Manhattan-based Atlason Studio, which has worked extensively with Microsoft and other tech companies, sees this as a natural progression of the consumer electronics category reaching maturity. “As a product becomes available to everyone, it becomes a reflection of the buyers themselves, their needs, and their identities,” says Atlason. “The old beige boxes were ultimately business machines, whereas modern devices are products that help you express your taste and personality.”

In Atlason’s view, the sealing up of consumer electronics is an unintended but necessary side effect of selling technology at mass-market scale. In particular, the role of computers in our everyday lives has shifted over time: from devices used to run specific creative or analytical applications to glass panels intended for the creation, consumption, and manipulation of content. And enabling the kind of mass accessibility this more universal function requires means ensuring that devices are simple, intuitive, and free of the technical roughness that bedeviled the early PC era.

“There’s no question that you can get better performance and a better user experience when you don’t let people swap out their own parts,” Limor Fried tells me. Fried is the founder of Adafruit Industries, a New York City-based manufacturer of DIY electronics components, and has been featured on the covers of Wired and Make magazines. But, says Fried, this has not squashed the open-source movement as much as it has propelled it in new directions. “The same enthusiast culture of the ’90s is behind today’s DIY electronics community. I bump into ex-IRC folks at Maker Faires and DIY events,” Fried says. “VR-hackers are still doing VR, just with better goggles. I think DIY and maker culture is a little healthier now. It’s more welcoming, diverse, and inclusive.”

The trend of systems closure that killed the computer show is driving a kind of backlash, placing a new premium on discovery, learning, and play. “There’s been a concerted effort by people who like that early computer culture and want to bring it back,” Thompson says. Do-it-yourself tech has lately found its way into clothing, bicycles, and musical instruments, and is even taught in graduate schools such as New York University’s Interactive Telecommunications Program (whose name itself is an anachronistic echo of its origins). There are thousands of projects on Kickstarter, as well as the even more robust culture of internet tutorials, guides, instructional YouTube videos, and forums where makers detail their latest experiments.

And this time around, even the companies that profit from closed systems encourage openness when it comes to products geared toward the younger generation. Consider Microsoft MakeCode, a block-based programming system for kids. “Not only is it really well engineered, it’s also open source and doesn’t have vendor lock-in,” says Fried, who has been working with Microsoft to make it available on the Adafruit Circuit Playground. “This would be shocking to communicate back to someone in 1996, and I think [Microsoft] should be commended for it.”

“At this moment, tech is really synonymous with confidence,” says Veronica Chambers, an author working at the intersection of technology, education, and social issues. Chambers was most recently a John S. Knight Innovation Fellow at Stanford University, and took the fellowship in part to bring her daughter Flora, who’s a rising fifth grader, into closer contact with the tech world. Chambers says raising a daughter who is fluent in technology is as important as teaching her a second language. Flora’s classmates’ parents work for Google and Tesla, and one of their recent Girl Scout troop outings was to the offices of Nest, to watch “Hidden Figures” and perform their own miniature rocket test.

“I want her to get past just being tech-native, so that she never gets stuck building pieces of other people’s ideas. I want her to have agency,” Chambers emphasizes. She describes an 8-week video game-building course Flora took last fall. “To see her build her own video game was really exciting. Her character had two afro-puffs. She was literally creating characters that look like her, and that says a lot about how she can navigate the world.”

Culture tends to move in cycles, and the period of time that birthed the computer show was a uniquely fertile moment for the open-source mentality. The internet was beginning its spread across the country; Linux distributions were available to any teenager with a CD-ROM drive and a desire to partition a hard drive; 2600 Magazine published code exploits for everything from point-of-sale kiosks to cable boxes. In no small part, it was this sense of sprawling potential that compelled my teenage self to view computer shows in terms beyond the strictly commercial. Like so many makeshift suburban haunts, they were an entry point into something larger, a much-diluted version of a purer trend, but part of it nonetheless. They were a strange analogue of punk-rock and skate culture, a means by which I was free to make mistakes, to try on different identities, to align myself with ideologies I didn’t fully understand. Had anyone asked me the origins of the phrase “Log on, tune in, drop out,” I wouldn’t have been able to say, but I plastered it on my school notebooks nonetheless.

So it’s a funny everything-old-is-new-again experience to see a resurgent techno-culture taking root with a younger generation. Whereas I had the dial-up modem, the chat room, the closeout deal on a new sound card, children like Flora are building robots and launching drones — moving into a realm of technical expertise that is leaving me behind with alarming speed. It’s a world of colorful, silicone-wrapped objects, blinking lights, and playful gestures that just happens to have hundreds of times the processing power of my old Pentium that I lovingly overclocked. Stunning to witness but equally discouraging, considering that as an adult, I have never attempted to so much as jailbreak my iPhone.

From what I can tell, the last Tri-State Computer Show died out in 2014. The Trenton Computer Show was taken over by The College of New Jersey. And it’s been more than a decade since Ken Gordon closed his production company that hosted PC Shows. MarketPro, the last remaining holdout, shut its doors in June 2016 with only six vendors remaining. When I reached out to Bill Grammar, a former MarketPro vendor and co-owner of Millennium BG (the computer retailer that has taken over the last MarketPro venue in Maryland and relaunched a monthly event under the MarketPro name), he sympathized with my sense of obsolescence. “Back when MarketPro North was doing six or seven shows a weekend, their take was easily a million dollars before expenses. I can remember five thousand people at a single show.”

Grammar has been in the hardware business since 1987, repairing and selling refurbished computers. Retailing with MarketPro allowed him to grow from a local retail business to a regional presence. MarketPro shows took Grammar’s sales teams to South Carolina, Virginia, and Delaware. “After a few years, we were doing a show every week,” Grammar says. But then 200 vendors dropped to 150, then to 125, and by the time MarketPro went out of business, a show might draw only 50 customers. He points to the twin pressures of internet retailing and falling hardware prices as the industry death knell that forced individual vendors to chase ever-shrinking margins. “Years ago, people asked us about Raspberry Pi [computers], but there was really no margin and not a lot of demand for it. We tried to stay involved with stuff where we could turn a dollar.”

And I suppose this is the thing that makes technology as astonishing as it is confounding: it maintains no allegiance, no permanent merchant or managerial class. Its economic disloyalty is precisely what makes it so stubbornly egalitarian. Its future sparkles at the margins, in those accidental spaces where commerce can’t quite reach. It makes fortunes and breaks livelihoods with equal ease. As much as it has come to be dominated by a few large companies, it will remain forever accessible to a kid with some cheap hardware and the desire to create.

“There’s always something weird happening,” Thompson says. “Because amateurs always do things that companies trying to make money would never do. And that’s exciting.”

--


Writing, marketing, general seriousness. Essays in @TheAtlantic, @Tin_House, @GuernicaMag, @TheMorningNews, elsewhere.