
Atari took on Apple with the Atari 400 and Atari 800 PCs

Forty years ago, Atari released its first personal computers: the Atari 400 and 800. They arrived in the fall of 1979 after a prerelease marketing campaign that had begun the previous January when the company unveiled the machines at what was then called the Winter Consumer Electronics Show in Las Vegas.

Then as now, “Atari” was synonymous with “video game,” and the new machines packed more technological potential than any game console at the time, with custom graphics and sound chips, support for four joysticks or eight paddles, and the ability to play games on cartridge, cassette, or disk. At launch, one of the machines’ first games, Star Raiders, defined cutting-edge home entertainment.

And yet Atari initially marketed the 800 and its lower-cost counterpart, the Atari 400, as “second-generation” PCs—productivity machines with enhanced graphics and sound capabilities over the 1977 holy trinity of personal computing: the Apple II, Commodore PET, and TRS-80. The company intended them to crunch home budget numbers just as often as they simulated space battles.

Idiot-proof and rugged, Atari’s Home Computer System machines (I’ll call the platform “HCS” for short) represented a huge leap in consumer-friendly personal computing. Unlike many PCs of the time, the Atari machines exposed no bare electronics to the consumer. Unique keyed connectors meant that none of the machines’ ports, modules, or cartridges could be plugged into the wrong place or in the wrong orientation. The 400 even featured a flat spillproof keyboard aimed at fending off snack-eating children.

And due to restrictive FCC rules that precluded the open expansion slots on the Apple II, Atari designed a suite of intelligent plug-and-play peripherals linked together by a serial IO bus that presaged the ease of the much-later USB.

In some ways the Atari computers even exceeded the state of the art from Atari’s coin-op department: In 1979, most Atari arcade games shipped with black-and-white monitors, using translucent gel overlays to generate pseudo-color. The Atari computers played games in color from the start—if the consumer provided the color TV set, of course.

At launch, the Atari 800 retailed for $999 with 16K of RAM (about $3,387 when adjusted for inflation), and the Atari 400 with 8K retailed for $549 (about $1,861 today). Compared to a game console such as the Atari VCS at $190, that was expensive, but it undercut the 16K Apple II’s $1,195 retail price in 1979.

This fancy retail kiosk let consumers learn about Atari’s computers—and even partake in a game of Pac-Man. [Photo: courtesy of Atari]
My own association with Atari’s computers goes back to 1981, when my father bought an Atari 800 for my older brother Jeremy, five years my senior. I grew up watching him wear out its joysticks by the half-dozen while mastering his skills in Asteroids, Dig Dug, and Archon. And the Atari served as more than a game machine for him. With its BASIC programming cartridge, the Atari opened up software as a malleable thing that could be shaped at will. It was on the Atari 800 that my brother amazed me with his homemade BASIC simulations of aircraft dogfights, among other wonders that my 4-year-old mind could hardly fathom but loved nonetheless. He later became a software engineer.
The author’s brother and a neighbor enjoy some Atari 800 quality time circa 1983. [Photo: courtesy of Benj Edwards]
Decades later, I still play Atari 800 games with my kids. It’s my home entertainment version of comfort food—a rich pastime best enjoyed by a roaring fireplace in a wood-paneled den. The Atari home computers projected a distinctive voice as an entertainment platform that I can’t get anywhere else. Games such as M.U.L.E., The Seven Cities of Gold, and Star Raiders take me back to a golden era in PC gaming and remind me that technology can create timeless classics as well as any other medium.
The author’s brother programming the Atari 800 in BASIC. [Photo: courtesy of Benj Edwards]
I’ve often wondered what cultural and business elements came together to make this breakthrough platform—this favorite electronic friend from my childhood. With some digging, I recently found out.

Video game genesis

In 1977, Atari released its first video game console with interchangeable cartridges, the Video Computer System, or VCS. (The company would later rechristen the machine “Atari 2600” from its model number, CX-2600.) A group of Atari engineers led by Jay Miner anticipated a three-year market lifespan for the 2600, which contained only 128 bytes of RAM. (As a frame of reference, the Nintendo Switch has more than 30 million times as much.)
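
For reference, the arithmetic behind that comparison is simple, assuming the Switch’s commonly cited 4 GB of RAM (a figure not given in the original article):

4 GB / 128 bytes = (4 × 2^30) / 2^7 = 2^25 = 33,554,432, or roughly 33.5 million times as much.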

That same year, Atari’s home computer platform began to take shape as a high-powered follow-up to the 2600. Many questions swirled around the next-generation machine. Should it remain compatible with the VCS but offer more features? Or should Atari make a clean break with the past and launch a far more advanced design?

“I was in the Homebrew Computer Club when Steve Wozniak introduced the Apple I in the winter of 1976,” says Joe Decuir, one of the Atari 800’s chipset architects and a veteran of the 2600 team. Decuir had begun working at Atari in 1975, hired to help with the VCS design. “One of the reasons I took the job is I thought that the project after a game machine was going to be a computer,” he explains.

Atari engineers weren’t blind to events around them in Silicon Valley. Ideas cross-pollinated through social connections, local interest groups, and employee poaching between firms. One of the most important technological and societal movements of the 20th century had been taking shape: the birth of the personal computer. PCs became the new cool thing, and Atari’s engineers wanted a piece of the action.

Decuir says, “A lot of us were kicking around ideas about what a computer would do while we were doing [the 2600]. And as the core of the game machine grew, Jay Miner and I and company became the core of the computer design group, which was a much larger project.”

This group included talented Atari engineers such as Steve Mayer, Francois Michel, George McLeod, Doug Neubauer, Mark Shieu, and others. (Later, Doug Hardy and Kevin McKinsey handled industrial design.) After some brainstorming, the engineering group began with a simple goal: to take the 2600’s video chip, called TIA, and integrate computer-like capabilities such as text generation.

After many iterations, the new chip became the CTIA, the graphics integrated circuit at the heart of the new home computer. Then they designed a chip to take the load off the main CPU by feeding graphics data to the CTIA, and that became ANTIC, itself a custom microprocessor. The engineers also added a chip to handle keyboard, paddle input, and four-channel sound, called POKEY. These three custom chips, in league with a 6502 CPU, would form the core of Atari’s home computer architecture.
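
To get a feel for how those chips surfaced to everyday programmers, here is a minimal Atari BASIC sketch (my own illustration, not drawn from the article or from Atari’s documentation): the GRAPHICS call asks the operating system to build an ANTIC display list, SETCOLOR sets a CTIA color register, SOUND and PADDLE talk to POKEY, and the 6502 runs the BASIC interpreter tying it all together.

10 REM GRAPHICS 2 HAS THE OS BUILD AN ANTIC DISPLAY LIST FOR A LARGE-TEXT MODE
20 GRAPHICS 2
30 SETCOLOR 0,12,8:REM CTIA COLOR REGISTER 0: HUE 12, LUMINANCE 8
40 POSITION 2,4:PRINT #6;"HELLO ATARI"
50 SOUND 0,121,10,8:REM POKEY VOICE 0: PITCH 121, PURE TONE, VOLUME 8
60 FOR I=1 TO 400:NEXT I:REM CRUDE DELAY LOOP RUNNING ON THE 6502
70 SOUND 0,0,0,0:REM SILENCE THE VOICE
80 PRINT "PADDLE 0 READS ";PADDLE(0):REM POKEY ALSO SCANS THE PADDLE POTS

Even a toy program like this touches all three custom chips plus the CPU, which goes some way toward explaining why BASIC tinkering on the platform felt so immediate.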

As the plan took shape across many design meetings in 1977 and 1978, Atari’s engineering team narrowed the computer down to three product options. There would be a low-end machine, nicknamed Candy, that would serve as a game console with an optional keyboard attachment; a high-end machine code-named Colleen with advanced, integrated features and an expansion bus; and a machine with an integrated monitor. They ended up dropping the integrated monitor idea and focusing on Candy and Colleen. Those would become the 400 and 800 computers.

To compete with the Apple II, the higher-end Atari 800 would need peripherals. And that’s where the FCC got in the way. All electronic circuits radiate radio waves when the currents flowing through them change; it’s a fundamental property of electromagnetism. To keep electronic devices that connect to TV sets from degrading TV reception, the FCC tightly regulates the radio frequency (RF) emissions they may release.

At the time of the Atari HCS’s development, the FCC maintained very strict rules on RF interference. Atari wanted an RF output that would allow the 800 to use a regular TV set as a display, but that meant clamping down on the system’s potential expandability. Atari engineers designed thick metal shielding within the 400 and 800 that blocked electromagnetic emanations from their core electronics.

That kept Atari from creating an “open box” system, similar to the Apple II, where users could plug in any expansion card they wanted. The Apple II avoided FCC issues by not connecting to a TV set directly; instead, Apple left it to a third-party company to provide a TV adapter (an RF modulator) as an aftermarket option. As a hobbyist machine, the Apple II could get away with that. The TI-99/4, released in 1979, skirted the RF interference issue by shipping with its own special monitor—a stripped-down TV set.

Texas Instruments lobbied to have the RF interference rules relaxed, and the FCC granted a conditional waiver of the rules in late 1979 (it finally changed them in 1983), but by then it was too late for Atari to simplify the design of its machines before launch.

M.U.L.E., an early Electronic Arts game, combined action, strategy, and economic theory on a planet named Irata (get it?). [Screenshot: courtesy of MobyGames]

Killer app in space

After finishing up design work on the POKEY chip, engineer Doug Neubauer began writing a game for the new computer system in development. It would be a first-person interpretation of his favorite computer space strategy game, Star Trek, which was making the rounds on high-powered mainframes at the time. His game, Star Raiders, included a real-time 3D universe full of alien ships, starports, and meteoroids. Its unique design began to attract attention within the firm.

“The first day I came [to Atari], one of the programmers sat down with me and said, ‘Get a load of this,’ and showed me Star Raiders,” recalls Chris Crawford, who had recently joined Atari as a VCS software developer and was to become a high-profile game creator and software evangelist for the HCS platform. “And that was what blew me away. There was absolutely nothing like it in the world of personal computers. It was just way beyond what anybody would have expected.”

In its time, Star Raiders was as dazzling as video gaming got. [Screenshot: courtesy of MobyGames]
Neubauer sought to realistically model 3D space, and he integrated advanced graphics routines that had never been seen in a PC or home console game. When an enemy ship exploded, the game engine calculated its flotsam as 3D points that could be viewed from any angle, including an aft ship viewpoint as you flew through it. Rich and dramatic sound effects complemented the game’s visual flair to an extent that wasn’t possible on competing home PCs or game consoles at the time.

While playing Star Raiders, you use a joystick to pilot a starship from a first-person cockpit view. A starfield swirls around your viewpoint realistically as you move the stick, but the full breadth of the controls is too much for the one-button Atari joystick to handle on its own. Players can call up a detailed galactic map, change speed, turn on shields, or choose other options by pressing certain keys on the computer keyboard. Upon engaging hyperspace with the H key, your ship’s engines rev up, and stars streak across the screen as they do around the Millennium Falcon in Star Wars.

“Star Raiders just blew [Atari cofounder and then CEO] Nolan [Bushnell] and upper management away,” recalls Decuir. (Bushnell was involved with initial plans for the new computers, though he was forced out of Atari by its owner, Warner Communications, around a year before they hit the market.) “They said, ‘Well, we can’t sell the game machine without a keyboard.’ So they came up with this membrane keyboard for the 400. That was our original game machine, but it came out as a minimally functional but useful computer. You needed a keyboard to play Star Raiders.”

Shortly thereafter, the low-end Candy model of the Home Computer System became the Atari 400 that Atari would ship—a sleek, dark beige machine with a completely flat keyboard built in. Though good-looking, the keyboard wasn’t fun to type on—but it did let everyone experience Star Raiders. The high-end machine, the 800, would have a conventional, full-travel keyboard better suited to tasks such as word processing.

Even the lower-end Atari 400 offered dazzling multimedia capabilities compared to the Apple II and other first-generation home PCs. [Photo: courtesy of Atari]
Atari kept the 400 and 800 segregated in a new home computing division. Its VCS game console had just begun to soar on the market, and some within the firm feared that the 400/800 would cannibalize its sales if marketing emphasized the computers’ gaming attributes too keenly. The revolutionary nature of Star Raiders completely disrupted that plan.

In an era energized by 1977’s blockbuster film Star Wars, Atari’s new space game provided an engrossing mixture of action, strategy, and simulation unlike any that came before. Shortly after their launch, people began buying Atari 400 and 800 machines solely to play Star Raiders. It became the killer app for the Atari computers and remained the game to beat for at least two years into the HCS’s lifespan.

In 1981, Mike Moffitt, a Pennsylvania newspaper journalist, described Star Raiders as “the Cadillac of home video games” and “the most sophisticated of all home video games.” He also noted the high price of the systems required to play the game but concluded it was worth it.

Just as the Atari 400/800 soared thanks to Star Raiders, the competing Apple II—then the main target of Atari marketing—became a breathtaking success due to business applications such as VisiCalc. For a time, Atari charged ahead with the serious business angle for its home PCs, reluctant to fully and publicly accept the platform’s deep video game capabilities. It created an unusual dissonance noticed by the press and consumers alike.

“Atari all along struggled with its identity,” said former Atari employee Dale Yocum in a 2014 interview with the ANTIC Atari podcast. “Atari was a game company, and people identified it as a game company. But Atari really wanted to be a personal computer company. And it was hard to convince a Fortune 1000 company that ‘Hey, what you really want to do is buy a bunch of Atari computers and put them on everybody’s desk.'”

Despite the huge gaming draw, many dedicated owners did use their Atari 800s as serious computers for productivity tasks and telecommunications. But with a 40-column text display, slow serial-based peripherals, and limited expandability, the Atari 800 wasn’t the most efficient machine for the job. (By the mid-to-late 1980s, my dad kept an Apple IIc right next to our Atari 800. The Atari reigned for gaming, while the IIc pulled duty as 80-column word processor and spreadsheet machine.)

According to Crawford, Atari wasn’t too upset about the tepid reception of its new product line as a “serious” machine—it was rolling in the dough from video game sales. A cost-saving quirk of the 2600 video chip design allowed Atari’s creative programmers to extend the console’s lifespan far beyond what Atari’s engineers expected, resulting in more sophisticated games and greater sales by 1979.

“That Christmas, the VCS was so successful that they gave a huge bonus to all the programmers,” says Crawford. “And so, the feeling was, ‘Wow, we are on the right track.'”

The golden age of indie software

After the 400 and 800 launched, power users awed by Star Raiders proved eager to flex the machines’ advanced capabilities. But Atari, following its closed model with the 2600, had never intended to spill the secrets of the HCS architecture outside of special agreements with contracted developers. Crawford recalls, “There were about half a dozen people I knew who’d been bugging me for that information, and I had told them, ‘No, I can’t tell you anything.’”

The Atari 400’s flat keyboard frustrated typists but didn’t prevent it from being a superb game console. [Photo: Flickr user Michael Dunn]
Software for the HCS came slowly. At first, Atari retained only one programming department for both the VCS and HCS lines. Because the VCS was Atari’s main breadwinner, its software took precedence. “The rule was you cannot do anything on the HCS until you’ve had one game completed for the VCS,” says Crawford. But the easier-to-program HCS became an attractive target. “The general sentiment in the programming department was, ‘I want to move to the HCS as soon as possible.’”

The executive preference for 2600 software put an internal chokehold on Atari 400/800 program development, especially in terms of games, which Atari management frowned upon. Crawford recalls making a presentation to Atari marketing about a new educational simulation on energy policy (later called Energy Czar). “I went through the presentation, and at the end, the VP of marketing fixed me with a cold stare and asked, ‘Is this a game?’” he remembers. “I hastily replied, ‘No, no, it’s an educational simulation.’ He looked at me warily and said, ‘I don’t know; it sure looks like fun to me.’”

Atari’s Chris Crawford in a humorous personal shot taken by his wife recalling his WWII-themed Eastern Front game. [Photo: courtesy of Atari]
Up until that point, all software for Atari’s 2600 game console had been published by Atari. But times were changing in the video game industry. In 1979, a group of disgruntled star software developers left Atari and founded Activision, which would later release blockbuster titles for the VCS. Some of the programmers, such as David Crane and Al Miller, had been responsible not only for most of Atari’s hit 2600 titles but also for writing the operating system and several games and applications for the new 400/800 platform. Although many talented programmers remained at Atari, the loss of top game design talent proved a setback for Atari’s internal software development capacity.

In 1980, things began to shift. After considering the demand from independent developers, the Activision exodus, and the success of Apple’s large and vibrant third-party software market, Atari executives reversed the company’s closed-platform home computer policy. Crawford received the news with joy and contacted developers. “I got on the phone, called them all up, and said, ‘Well, guess what? Where do I mail the documentation to?’”

As a productivity platform, the Atari had lost valuable time in the market with a slim suite of primitive applications (mostly developed internally), although an Atari version of the original spreadsheet, VisiCalc, did land in late 1980. By early 1981, the size of the HCS software library paled in comparison to those of machines from Apple and Radio Shack. A 1981 review of the Atari 800 in InfoWorld, published about a year and a half after the HCS launch, noted the Atari’s distinct lack of software and called it “an impressive machine that has not yet reached its full computing potential.”

Atari needed software, and fast. To champion developers, Chris Crawford created the Software Development Support Group within Atari. As a first project, it created a user-friendly development bible called De Re Atari (meaning “All About Atari”), which became the de facto guide for Atari computer programming. Crawford also began flying around the country to give in-person two-day seminars about how the Atari 800 worked and how to program it.

On another innovative front, an Atari employee named Dale Yocum petitioned Atari management to start a new division within the firm that would solicit programs from the general public and publish them in low-cost bare-bones packaging under the name Atari Program Exchange (APX).

With APX, authors submitted programs to Atari for consideration. If the firm accepted their creations, authors received a royalty on sales of their product through a quarterly catalog published by Atari. In retrospect, the model feels like an early, mail-order-based version of the iOS App Store.

Thanks to this push for software by people such as Crawford and Yocum, the Atari 800’s software library expanded dramatically in size and quality after 1981, with some of PC gaming’s greatest hits of the golden era originating on the machine. In addition to the groundbreaking Star Raiders, Atari’s HCS played host to seminal masterpieces such as M.U.L.E., The Seven Cities of Gold (both by Dani Bunten Berry and Ozark Softscape), and Archon (Free Fall Associates), all published by a then-new company called Electronic Arts. Text adventure games such as Infocom’s Zork also did well on Atari’s computers.

Atari’s APX arm published games such as Chris Crawford’s ambitious Eastern Front 1941. [Photo: courtesy of Atari]
An indie software market similar to the one that had been flourishing on the Apple II sprang up around the Atari 800. A few early APX titles, such as Caverns of Mars and Eastern Front (1941)—a war game from Crawford himself with revolutionary scrolling map techniques—became breakout hits that sold tens of thousands of units, at a time when that was a big deal.

Game industry legend Sid Meier, the creator of Civilization, began his professional development career at home thanks to Atari’s computer. “When I got my 800, probably the first game I wrote was very similar to Space Invaders,” Meier told me in 2007. “I took it to my local computer store, and they had very little software for sale. I put it on a cassette tape and into a plastic bag. I remember they bought 5 or 10 copies of it.”

Thousands of other small developers would write games for Atari’s home computers by the end of the decade.

The end of an era

Even though a vibrant software scene flourished on Atari’s home computers in the early 1980s, the platform’s business foundations were far from certain. Atari’s home computer division remained largely unprofitable, carried along by the success of Atari’s coin-op and home console divisions. At the worst possible time for Atari, competition in the home computer space from the Texas Instruments TI-99/4A and the Commodore VIC-20 began to reach fever pitch—just as other competitive factors came together to threaten Atari’s future.

By mid-1982, the Atari 2600 game market resembled a frenzied gold rush. The astounding financial success of Activision inspired dozens of firms, including food manufacturers and media companies, to create their own VCS software. The market became glutted with poor-to-middling quality games. Around that time, American consumers also began to embrace low-cost home PCs for gaming.

Electronic Arts’ Archon was a chess-like game with arcade-style action. [Image: courtesy of MobyGames]
With sales of 2600 hardware and software slowing down, Atari released its long-anticipated follow-up to the VCS, the Atari 5200 SuperSystem, in November 1982. Despite being over five years old, Atari’s HCS architecture remained advanced enough to form the basis of the 5200, which held its own graphically against competing consoles such as Intellivision and ColecoVision. But Atari slipped up. Most 5200 games shipped as slightly enhanced ports of earlier 400/800 titles on larger, incompatible cartridges. Terrible controllers and software incompatibility with the 2600 and 800 held the 5200 back, and a dramatic turn in the video game market the following year sealed the console’s fate.

There was more trouble on the horizon. In August 1982, Commodore released the Commodore 64, a low-cost home computer with a beefy 64K of RAM and advanced graphics that borrowed numerous pages from Atari’s playbook. It also benefited from lower raw-materials costs (relaxed FCC rules meant less RF shielding was required) and lower-priced chips. Like the Atari 800, the C64 included a keyboard, played games on cartridge or disk, used Atari-style joysticks, and even offered a serial IO bus for disk drives and other accessories.

To undercut Texas Instruments and Atari, Commodore began a price war that dropped the cost of home PCs from $500-$1000 per machine to unsustainable $50-$200 levels by mid-1983. Earlier that year, Atari launched the lean and stylish Atari 1200XL computer, which remained largely compatible with Atari 400/800 software but shipped with 64K of RAM. With an $899 price and no significant new features over the cheaper 800, the 1200XL was a dud with both reviewers and consumers.

The Atari 1200XL had an updated look but wasn’t a technological advance over the 800. [Photo: courtesy of Atari]
In the summer of 1983, Atari released yet more iterations of the HCS in the form of the 600XL and 800XL, which replaced the aging 400 and 800 machines. They fared much better with the press (and sold well with consumers once manufacturing quantities rose the following year). But these new machines couldn’t undercut the Commodore 64 on price.

With the 64, Commodore found itself with a hit on its hands, but its scorched-earth success came at a terrible cost to the industry around it. TI pulled out of the market, and Atari sustained heavy losses in home computers that coincided with losses in its home video game division. Commodore underwent its own round of turmoil, leading to the resignation of founder Jack Tramiel. The market would eventually recover, but the short-term damage was immense.

The troubles at Atari precipitated an investor panic in its parent company, Warner Communications, and before long, Warner began soliciting offers to offload Atari’s consumer hardware divisions. In 1984, Commodore founder Tramiel rounded up a group of investors and bought Atari’s consumer divisions for around $200 million.

Seven Cities of Gold pioneered open-world gaming—with surprisingly evocative graphics for its time. [Image: courtesy of MobyGames]
Once the dust settled and Atari’s consumer divisions changed hands, the new Atari Corporation released several other variations of the same 1977-era 400/800 architecture in 1985: the Atari 65XE and Atari 130XE (the latter of which included 128K of RAM, a first for the platform). Announced alongside the more powerful Motorola 68000-based Atari 520ST, Atari’s 8-bit machines continued to target the low-cost home computer and gaming markets.

The 1979-era Atari HCS technology received one final chance on the video game market as the Atari XE Game System in 1987, but it was too little, too late—Nintendo’s brilliant post-arcade software for the NES made the XEGS’s stale game rehashes pale in comparison.

Throughout the 1980s, Atari’s home computers remained moderately popular entry-level home machines, but they never eclipsed Commodore in market share in the U.S. Despite gaining some success with its XE line in Eastern Europe, Atari formally pulled the plug on all of its 8-bit computer products on January 1, 1992. Tramiel’s version of Atari held on a bit longer, selling newer computers and video game consoles, but reached the end of its run in 1996.

A rich legacy

As the world collapsed around corporate Atari in the 1980s, my brother and I remained blithely unaware of the turmoil. I didn’t learn about “The Great American Video Game Crash” until the 1990s. Our Atari 800 still worked, and we kept enjoying the fun moments it brought us. Endless games of Archon and Salmon Run enriched our lives. It remained our family’s chief game console until we bought an NES in 1988, and even then, we never really put the Atari away; it usually came out every year around Thanksgiving or Christmas.

Friends of the author’s family play an Atari game. [Photo: courtesy of Benj Edwards]
Even today, some 35 years after I first played an Atari 800, I am still discovering amazing new games on the platform. The catalog is deep and full of unique gameplay ideas that weren’t often seen in later 2D console games, and nostalgic hobbyists still develop new games for it. While I own a PlayStation 4, a Nintendo Switch, and a Steam PC, an Atari 800XL takes pride of place on a desk in my family’s gaming room. My kids love it.

Just before filing this article, I unexpectedly found an old email from my dad, printed out in a binder of my Atari notes. He passed away in 2013, but a decade earlier, I had asked him about our family’s personal computer history. “We bought the Atari 800 about the time you were born,” he wrote. “It cost $1000 (plus another $450 for the disk drive later), which was more than we could afford, and mom was unhappy that I spent the money on it.”

“In retrospect, the Apple II would have been a better long term investment. But also in retrospect, stretching the budget to afford the computer was well worth it since it gave you and your brother valuable skills worth more than money. Mom knew that soon after, of course–she never held a grudge about those purchases.”

Some successes are bigger than business. The Atari home computers were a cultural phenomenon that brought joy to a generation. Thanks, Dad—and happy birthday, Atari 800.



from Hacker News https://ift.tt/2EGnMMi
