[5] The architecture is designed around the 8-bit MOS Technology 6502 CPU and three custom coprocessors which provide support for sprites, smooth multidirectional scrolling, four channels of audio, and other features.
[12] They developed what was essentially a greatly updated version of the VCS, fixing its major limitations while sharing a similar design philosophy.
[13] During the early development period, the home computer era began in earnest with the TRS-80, PET, and Apple II—what Byte magazine dubbed the "1977 Trinity".
[6] To adapt the machine to this role, it needed character graphics, some form of peripheral expansion, and the ability to run the then-universal BASIC programming language.
All on-screen graphics are created using sprites and a simple background generated from data loaded by the CPU into single-scan-line video registers.
Commodore was developing a video driver at the time, but Chuck Peddle, lead designer of the MOS Technology 6502 CPU used in both the VCS and the new machines, saw the Atari work during a visit to Grass Valley.
"[17] Management identified two sweet spots for the new computers: a low-end version known internally as "Candy", and a higher-end machine known as "Colleen" (named after two Atari secretaries).
[20] To minimize handling of bare circuit boards or chips, as was common with other systems of that period, the computers were designed with enclosed modules for memory and ROM cartridges, with keyed connectors to prevent them from being plugged into the wrong slot.
When no software is loaded, rather than leaving the user at a blank screen or a machine language monitor, the OS goes to the "Memo Pad", a built-in full-screen editor without file storage support.
The introduction of many game consoles during this era had led to situations where poorly designed RF modulators generated enough spurious signal to interfere with other nearby televisions, even in neighboring houses.
In response to complaints, the Federal Communications Commission (FCC) introduced new testing standards that were extremely exacting and difficult to meet.
[25] In a July 1977 visit with the engineering staff, a Texas Instruments salesman presented a new possibility in the form of an inexpensive fiber-optic cable with built-in transceivers.
The internal slots were reserved for ROM and RAM modules; they did not have the control lines necessary for a fully functional expansion card, nor room to route a cable outside the case to communicate with external devices.
[31] In an August 1979 interview Atari's Peter Rosenthal suggested that demand might be low until the 1980–81 time frame, when he predicted about one million home computers being sold.
He concluded by stating "The Atari is like the human body – a terrific machine, but (a) they won't give you access to the documentation, and (b) I'd sure like to meet the guy that designed it".
[34] Kilobaud Microcomputing wrote in September 1980 that the Atari 800 "looks deceptively like a video game machine, [but had] the strongest and tightest chassis I have seen since Raquel Welch".
[35] InfoWorld favorably reviewed the 800's performance, graphics, and ROM cartridges, but disliked the documentation and cautioned that the unusual right Shift key location might make the computer "unsuitable for serious word processing".
SALLY adds logic to disable its clock via a new signal, HALT, which ANTIC asserts to shut off the CPU while it accesses the data/address bus.
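The effect of the HALT line can be illustrated with a toy model. The following C sketch is illustrative only and all names in it are hypothetical; in the real hardware the handshake happens in silicon on every clock cycle.

```c
#include <stdbool.h>
#include <stdint.h>

/* Toy model of the ANTIC/SALLY handshake. All names are hypothetical;
   the real hardware gates the clock in silicon, cycle by cycle. */
typedef struct {
    bool halt;           /* state of the HALT line */
    const uint8_t *mem;  /* memory as seen over the shared bus */
} Bus;

/* ANTIC asserts HALT, takes over the bus for a DMA fetch, then releases it. */
uint8_t antic_dma_fetch(Bus *bus, uint16_t addr) {
    bus->halt = true;               /* SALLY's clock is gated off        */
    uint8_t data = bus->mem[addr];  /* ANTIC drives the bus and reads    */
    bus->halt = false;              /* the CPU resumes on the next cycle */
    return data;
}

/* SALLY executes a cycle only while HALT is deasserted. */
void sally_step(Bus *bus) {
    if (bus->halt)
        return;  /* halted: this cycle is "stolen" by ANTIC */
    /* ...fetch/decode/execute one 6502 cycle here... */
}
```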
[8] Brian Moriarty stated in ANALOG Computing that Atari "fail[ed] to keep up with Christmas orders for the 600 and 800XLs", reporting that as of late November 1983 the 800XL had not appeared in Massachusetts stores while 600XL "quantities are so limited that it's almost impossible to obtain".
While disapproving of the use of an operating system closer to the 1200XL's than the 400 and 800's, and the "inadequate and frankly disappointing" documentation, ANALOG concluded that "our first impression ... is mixed but mostly optimistic."
The magazine warned, however, that because of "Atari's sluggish marketing", unless existing customers persuaded others to buy the XL models, "we'll all end up marching to the beat of a drummer whose initials are IBM".
Commodore founder Jack Tramiel resigned in January 1984, and in July he purchased the Atari consumer division from Warner for an extremely low price.
[60] They were never an important part of Atari's business compared to video games, and it is possible that the 8-bit line was never profitable for the company, though almost 1.5 million computers had been sold by early 1986.
Companies stated that one reason for not publishing for Atari was the unusually high amount of software piracy on the computer, partly caused by the Happy Drive.
Each mode varies in whether it displays text or a bitmap, in its resolution and number of colors, and in its vertical height in scan lines.
[73] Since each row can be specified individually, the programmer can create displays containing different text or bitmapped graphics modes on one screen, where the data can be fetched from arbitrary, non-sequential memory addresses.
This stream then passes to GTIA which applies the playfield colors and incorporates Player/Missile graphics (sprites) for final output to a TV or composite monitor.
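As a concrete illustration, the playfield colors GTIA applies can be changed by writing hue/luminance values, typically through the OS shadow locations that are copied to the hardware registers during vertical blank. The C sketch below, written for a cross-compiler such as cc65, uses the documented shadow addresses from the standard OS memory map; the color values themselves are arbitrary examples.

```c
#include <stdint.h>

/* OS shadow registers for GTIA's playfield color registers; the OS
   copies these to the hardware during vertical blank. */
#define COLOR0 (*(volatile uint8_t *)0x02C4)  /* playfield color 0 */
#define COLOR1 (*(volatile uint8_t *)0x02C5)  /* playfield color 1 */
#define COLOR2 (*(volatile uint8_t *)0x02C6)  /* playfield color 2 */
#define COLOR4 (*(volatile uint8_t *)0x02C8)  /* background/border */

void set_palette(void) {
    /* High nibble selects hue, low nibble luminance: 0x86 is a mid-blue. */
    COLOR0 = 0x86;
    COLOR1 = 0x0E;  /* bright white */
    COLOR2 = 0x84;  /* darker blue  */
    COLOR4 = 0x00;  /* black background */
}
```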
Any graphics mode in the default CTIA/GTIA color interpretation can be freely mixed without CPU intervention by changing instructions in the display list.
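A display list is a short program in memory that ANTIC interprets row by row. The sketch below, again for a C cross-compiler such as cc65, builds a list that mixes two text modes on one screen; the opcodes are the documented ANTIC instructions, while the screen address is a hypothetical placeholder.

```c
#include <stdint.h>

/* A display list mixing ANTIC mode 2 (40-column text) and mode 7
   (large text) rows. Screen memory at $4000 is a placeholder choice. */
static uint8_t display_list[] = {
    0x70, 0x70, 0x70,  /* 3 x 8 blank scan lines to skip overscan       */
    0x42, 0x00, 0x40,  /* mode 2 row + LMS: fetch screen data at $4000  */
    0x02, 0x02, 0x02,  /* three more mode 2 rows; the memory scan
                          continues sequentially                         */
    0x07, 0x07,        /* two mode 7 rows: larger text, no CPU work     */
    0x02, 0x02,        /* back to mode 2 for the rest of the frame      */
    0x41, 0x00, 0x00   /* JVB: jump to the list's own start address
                          (filled in at run time) and wait for
                          vertical blank                                 */
};
```

Writing the list's address into the OS shadow pointer SDLSTL/SDLSTH ($0230-$0231) makes ANTIC begin executing it on the following frame.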
The POKEY chip, as well as Dual POKEY and Quad POKEY versions combining two or four of them in a single package, was used in many Atari coin-op arcade machines of the 1980s, including Centipede and Millipede,[77] Missile Command, Asteroids Deluxe, Major Havoc, and Return of the Jedi.
The magazine advised them to "clear out those cobwebs" for Atari's Star Raiders,[80] which became the platform's killer app, akin to VisiCalc for the Apple II in its ability to persuade customers to buy the computer.