Silicon’s Speedier Cousins

Nov. 1, 1989

Sand is one of the most abundant materials on the planet, which is one reason for the central role it has captured in modern microelectronics. Compounds of silicon and oxygen, typically in the form of sand, account for about three-fourths of the earth’s crust. Silicon refined from these compounds is also the material of choice for the electronics industry because of its excellent semiconducting properties.

A semiconductor is a substance about halfway between a conductor (like metal, which passes electrical current along readily) and an insulator (such as rubber, which stops the electron flow). When doped with the right impurities, silicon chips carry current along designated paths and through selected gates under controlled conditions.

This makes them good switches, which is essential. All data in modern information-processing systems are represented in digital form, either as a “one” when the devices are conducting or as a “zero” when insulating.

In addition to its universal availability, silicon has the advantage of high electron mobility. That’s why silicon is so good for switches; the faster the electrons can move through it, the faster the computers made of silicon devices can operate.

Silicon is a member of the carbon family, found in column four of the periodic table. Carbon, the basic building block of all life and an electronics material, forms more compounds than all the other elements on the table combined. The other members of this family are germanium—used to make the first transistor in 1947 and today a key ingredient of optical fibers—tin, and lead, which are basic to future applications of superconductivity and advanced sensor systems.

Physical chemists have long known that still higher rates of electron mobility are inherent in compounds formed out of elements from columns three and five of the periodic table. In theory, these 3-5 compounds, as they’re known, should replace silicon in future high-performance information-processing systems. They haven’t yet because they’re hard to work with, and the materials-processing technologies haven’t kept pace with silicon.

Until now. Starting in 1986 with the Microwave/Millimeter Wave Monolithic Integrated Circuits (MIMIC) program at the Defense Advanced Research Projects Agency (DARPA), 3-5 technology has advanced to the point where it’s beginning to close the gap with silicon.

MIMIC was aimed at analog applications in weapon sensor systems such as radar. Instead of processing the data as ones and zeros, the MIMIC devices generated representations, or analogs, of the targets of interest in the form of voltages that were then converted to a digital format and processed by onboard digital computers. The idea was to quickly accumulate and pass on to the computers large volumes of intelligence data for identification and appropriate response.

First of the 3-5s

MIMIC also pushed the technology of the first of the 3-5 compounds, gallium arsenide (GaAs). With an electron mobility about five times that of silicon, plus reduced power requirements and inherent resistance to radiation, GaAs was the perfect material to break the front-end bottleneck of airborne systems. GaAs devices derived from MIMIC, such as the transmit/receive modules for the Westinghouse phased-array radar on the Air Force’s Advanced Tactical Fighter (ATF), have begun finding their way into operational systems.

Now, building on that base, the solid-state physics community is accelerating its efforts along two paths: using these new materials in the more demanding digital applications and experimenting with other 3-5 compounds that have even greater potential performance than GaAs.

Once again DARPA is leading the military effort, this time with a program to insert digital GaAs technology into eleven current weapon systems as one-to-one replacements for existing silicon devices (see box). The major advances, however, are coming from the commercial world. Even before the DARPA program began, Seymour Cray, the legendary progenitor of supercomputers, began designing his next machine entirely out of digital GaAs logic. Known as the Cray 3 and expected to be the world’s most powerful supercomputer, the new machine is scheduled for initial deliveries in mid-1991.

Meanwhile, the basic research organizations of Bell Laboratories (which demonstrated that first germanium transistor), Hughes, IBM, and others are reporting initial successes with such other 3-5 combinations as indium phosphide (InP), aluminum indium arsenide (AlInAs), and gallium indium arsenide (GaInAs). In each case, the key to improved performance lies in increasing the indium content, which has the potential of raising electron mobility to three or four times that of GaAs.

Memory is the Bellwether

GaAs thus represents the opening wedge of a revolution that is transforming electronics. Today, the technology is about where silicon was in the early 1970s, when Silicon Valley in California was turning out sample quantities of the first crude microprocessors and semiconductor memories. Those devices have since become ubiquitous, making possible the present era of distributed computing and intelligent weapons. A bellwether is memory, which has gone from a thousand bits per chip to a million, at essentially the same price.

Price-performance achievements like that are a function of volume production. That’s what the DARPA program is all about, according to Dr. Arati Prabhakar, GaAs program manager in the agency’s Defense Sciences Office: building the necessary infrastructure. That’s also why DARPA chose weapon systems currently in production. “We’re taking one risk at a time,” she adds.

Two of the Army demonstration projects, in particular, should drive up digital GaAs volume, according to Sven A. Roosild, DARPA assistant director for electronic sciences. They are a new modem and frequency synthesizer being developed by E-Systems to provide anti-jam capability for the AN/PRC-126 radio and a digital signal processor from Martin Marietta to replace bulky analog components in the RF Hellfire antitank missile and thus increase the lethality of its warhead. Each of these projects will require hundreds of thousands of the new components, he says.

In addition to assuring military program managers of reliable sources of supply, this increased volume should give the United States an edge in the inevitable GaAs shoot-out with Japan. Already, four Japanese companies—NEC, Oki, Hitachi, and Sumitomo—are actively marketing GaAs integrated circuits in the United States, although these are generally less complex devices derived from the companies’ programs in fiber optics.

There are only two relatively low-volume Air Force projects on the list, but they could have the biggest impact on pushing GaAs technology. E-Systems in Greenville, Tex., is developing a distributed-array processor for “special mission aircraft” under a contract from the Air Force Logistics Command that will increase processing speed sixfold while reducing subsystem weight by 300 pounds. Martin Marietta in Denver is developing a one-chip on-board spacecraft computer for a classified reconnaissance satellite that will increase processing speed from 75,000,000 operations a second to 560,000,000—without changing the system architecture or software. Vitesse Semiconductor of Camarillo, Calif., is supplying 15,000-gate arrays for this project.

For the Navy, Texas Instruments in Dallas is developing a GaAs thirty-two-bit computer operating at 200 MHz (two hundred million cycles per second) to improve the resolution of the AN/APS-137 surface-search radar on the P-3C patrol aircraft. Under a separate DARPA-sponsored program, the Navy is also looking at GaAs devices to improve the performance of its Ariadne undersea antisubmarine warfare system. The devices would reduce power requirements by a factor of five to ten at each node of the fiber-optic cables.

The Speed-Power Product

These DARPA-sponsored projects illustrate another edge for GaAs. Compared to silicon, its speed-power product, a standard figure of merit for logic devices, is about six times better. That means you can send the same amount of data traffic for one-sixth the power (particularly important for spacecraft and, to a lesser extent, aircraft) or six times as much information for the same electrical power requirement (a better choice for terrestrial applications). Moreover, radiation resistance comes for free, which makes GaAs particularly attractive for military use.
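A back-of-the-envelope reading of that claim, as a sketch only: the notation below is mine, not the article’s, with t_d standing for gate delay, P for power, and f for switching rate, and it assumes the six-times figure applies uniformly.

```latex
% Speed-power (power-delay) product: a figure of merit for logic families;
% lower is better. Assume GaAs is six times better than silicon, per the article.
\[
  \frac{(t_d \cdot P)_{\mathrm{GaAs}}}{(t_d \cdot P)_{\mathrm{Si}}} \approx \frac{1}{6}
  \quad\Longrightarrow\quad
  \begin{cases}
    P_{\mathrm{GaAs}} \approx P_{\mathrm{Si}}/6 & \text{at equal switching speed (the spacecraft case)}\\
    f_{\mathrm{GaAs}} \approx 6\,f_{\mathrm{Si}} & \text{at equal power budget (the terrestrial case)}
  \end{cases}
\]
```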

Outside the DARPA program, GigaBit Logic of Newbury Park, Calif., one of the GaAs chip suppliers, is under contract to the Air Force’s Ballistic Systems Division at Norton AFB, Calif., to develop a GaAs version of the Air Force’s popular 1750 airborne computer. The one-chip, sixteen-bit microprocessor is intended to be capable of speeds of 1 GHz (a billion cycles per second) with error correction. GigaBit Logic is teamed with Jaycor, a company in San Diego that specializes in radiation resistance, and Galaxy Microsystems, a 1750 architecture design firm in San Jose, Calif., and Austin, Tex., on the project.

Another small start-up company, Gazelle of Santa Clara, Calif., has developed a GaAs programmable logic array that is compatible with conventional transistor-transistor logic (TTL) but is twice as fast. This array will replace an entire box in a military system with a single chip.

As is customary with any new technology, costs are initially high but are dropping rapidly. Dr. Prabhakar estimates that a three-inch-diameter wafer costs $160 to $175 and that the processing and testing of the devices adds another $2,000 per wafer. However, the industry is moving up to four-inch-diameter wafers, which reduce costs. Mike Pawlik, vice president for marketing at GigaBit, estimates that twenty suppliers have been qualified on three-inch wafers and eight on four-inch wafers. The silicon industry has long been working with five-inch wafers.
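A rough sketch of why the larger wafers cut unit costs: the arithmetic is mine, and it assumes the roughly $2,000 per-wafer processing and testing charge stays about the same on the larger wafer while ignoring yield and edge losses.

```latex
% Usable area scales with the square of wafer diameter.
\[
  \frac{\text{dice per four-inch wafer}}{\text{dice per three-inch wafer}}
  \approx \left(\tfrac{4}{3}\right)^{2} \approx 1.8,
  \qquad
  \text{cost per die} \approx
  \frac{\text{wafer cost} + \text{processing cost}}{\text{dice per wafer}}
\]
```

Spreading a mostly fixed per-wafer charge over nearly twice as many dice cuts the per-die cost by close to half.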

GaAs has an inherent cost advantage over silicon at the processing end. The photolithographic process used to imprint the devices onto a substrate requires only twelve masking steps for even the most complex GaAs devices. That compares to twenty or more for comparable silicon devices made with the complementary metal oxide semiconductor process.

Prices May Drop

In fact, as all the new companies scramble to get on board this technology, a worldwide drop in GaAs prices may be in the works, according to Dr. David Miller, a manager at the Litton Airtron Division, Morris Plains, N. J., one of the principal GaAs wafer suppliers. “Because of all the hype, the hockey stick [an allusion to steep sales curves in growth industries] is lying down. It’s not straight up,” he says. “There’s overcapacity at every level.”

Gallium is a material for which the United States is completely dependent on foreign sources of supply: Canada, France, Germany, and Japan. “Arsenic is cheap. It’s everywhere,” says Pawlik of GigaBit. “Gallium is priced like silver.” In addition to Airtron, the major wafer suppliers include the Canadian conglomerate Cominco (which extracts gallium as a by-product of aluminum refining), M/A-Com of Lowell, Mass., and the Japanese firms Mitsubishi and Sumitomo.

Dataquest, a market research firm in San Jose, Calif., that has a reputation for conservative forecasts, projects that the merchant market for GaAs devices will rise from $328 million this year to nearly $1.3 billion by 1993. That’s a thirty-five percent compound annual growth rate, but about thirty percent of the total today is accounted for by nonrecurring engineering costs for technology and product development.

The growth is projected to be fastest for digital GaAs—from $127 million this year to $656 million in 1993—which is currently dominated by the “big three” merchant suppliers, GigaBit, TriQuint, and Vitesse. The market for analog devices (not including those for DARPA’s MIMIC program) is projected to grow from $201 million to $621 million over the same period. (The figures also do not include the output of the “captive” suppliers such as AT&T, Hughes, McDonnell Douglas, Rockwell, Texas Instruments, TRW, and Westinghouse, which produce solely for their own needs.)

Competition for Silicon

The real competition is not among the GaAs producers, according to Louis Pengue, marketing manager at TriQuint, but against the entrenched silicon devices, particularly top-of-the-line emitter-coupled logic (ECL), which has replaced TTL and dominated military systems in recent years. He calls the problem the “FUD factor” (fear, uncertainty, and doubt), which he expects to be erased as more program managers become familiar with the new technology.

Reducing costs is essential to winning acceptance, according to Mr. Pengue. That means driving down the cost per gate of a large logic array (10,000 gates or more) to three to five cents so it can compete head-on with ECL. One of the best ways to do that is to increase the chip size, and GaAs has been moving up steadily from 100 mils on a side to 200 mils. (One mil is a thousandth of an inch, so 100 mils is a tenth of an inch.) It needs to go beyond 300 mils, Mr. Pengue maintains, but he says there’s a “fear threshold” among users at about 250 mils.
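For a sense of scale, here is the arithmetic behind those targets; the per-gate and chip-size figures are Mr. Pengue’s, and the rest is my own straightforward multiplication.

```latex
% Cost target for a large gate array, and how chip area grows with edge length
% (1 mil = 0.001 inch).
\[
  10{,}000 \text{ gates} \times \$0.03 \text{ to } \$0.05 \text{ per gate}
  \;=\; \$300 \text{ to } \$500 \text{ per array}
\]
\[
  (200\,\text{mils})^{2} = (0.2\,\text{in})^{2} = 0.04\,\text{in}^{2}
  \;=\; 4 \times (100\,\text{mils})^{2}
\]
```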

If anybody should know the pluses and minuses of GaAs, it’s Seymour Cray, who has been building the world’s most powerful supercomputers for at least twenty years. He laid it all out at last year’s Supercomputing ’88 conference in Orlando, Fla., cosponsored by the IEEE Computer Society and the Association for Computing Machinery.

“Gallium arsenide is pretty horrible to work with. It has a lot of grain,” which affects the flow of electrons, he said. “It’s . . . tough to get everything lined up right in order to make the system function properly,” he explained. “It’s like working with potatoes. There are soft spots, hard spots, eyes, and skin—and it’s very hard right now to get good quality basic material to work with. But it gets better every year.”

The Cray 3 uses GaAs for all the logic functions (but silicon devices for memory) to achieve breathtaking speeds of sixteen gigaflops (that’s 16,000,000,000 floating-point operations a second). It was originally scheduled for initial deliveries right about now, but technical problems and a move of the company from Minneapolis to Colorado Springs, Colo., have caused a slip of nearly two years. But already Cray is reported to be working on an all-GaAs Cray 4 capable of 128 gigaflops.

Out of the Sandbox

Looming over this GaAs free-for-all is the emergence of indium phosphide. Paul Greiling, manager of gallium arsenide research at the Hughes Research Laboratories in Malibu, Calif., reported in July on a high-electron-mobility transistor in which individual layers of indium phosphide, aluminum indium arsenide, and gallium indium arsenide, each as thin as five atoms, were deposited using the molecular-beam epitaxy process.

The result is a fifteen-fold improvement in sensitivity for a communications satellite receiver, which means the ground antenna for use with a direct-broadcast satellite could be reduced to one foot in diameter. Data rates as high as twenty-five gigabits (twenty-five billion bits) a second were achieved. Reasonable extrapolations of this technology could lead to ultrasensitive radars capable of spotting stealth vehicles—or to the “Dick Tracy” wristwatch radio.

What may be even more exciting is that the gallium indium arsenide layer can serve as a laser diode operating at wavelengths of 1,300 and 1,550 nanometers (billionths of a meter). Those are the wavelengths of state-of-the-art single-mode optical fibers, which would permit data output from the computer chip to be in the form of photons rather than electrons. That would be a big step toward the post-electronics world of photonics (see “Beyond Electronics,” p. 78, June 1989 issue). Nobody is predicting an end for silicon-based electronic devices, but the industry has begun venturing cautiously out of its sandbox.

Eleven weapon systems will be updated with GaAs technology.

Infusions of Arsenic

The Defense Advanced Research Projects Agency (DARPA) this year selected nine major defense contractors to participate in a program to insert digital gallium arsenide (GaAs) integrated circuits into eleven operational weapon systems.

The selection follows a broad agency announcement issued by DARPA on January 22, 1988, in which the agency sought proposals from industry on ways to upgrade current silicon-based devices to improve performance.

Supporting the nine prime contractors are three new companies founded specifically to produce GaAs digital integrated circuits for military and commercial markets: GigaBit Logic of Newbury Park, Calif., TriQuint of Beaverton, Ore., and Vitesse Semiconductor of Camarillo, Calif.

Program manager at DARPA is Dr. Arati Prabhakar, and the program is under the direction of Sven A. Roosild, assistant director for electronic sciences in DARPA’s Defense Sciences Office. The projects, by service, are as follows:

Air Force: E-Systems, Greenville (Tex.) Division: distributed-array processor for special-mission (reconnaissance) aircraft, Air Force Logistics Command.

Martin Marietta Space Systems, Denver, Colo.: spacecraft on-board processor for reconnaissance satellites (a “black” program for which the program office was not disclosed).

Army: E-Systems, ECI Division, St. Petersburg, Fla.: modem and synthesizer for AN/PRC-126 small-unit radio, Army Communications-Electronics Command (CECOM).

ITT Avionics, Nutley, N. J.: digital radio frequency memory for AN/ALQ-136 jammer, CECOM.

Martin Marietta Electronic Systems, Orlando, Fla.: signal processor for RF Hellfire seeker, Army Missile Command.

McDonnell Douglas Electronic Systems, Huntington Beach, Calif.: mast-mounted sight processor for OH-58D Scout helicopter, Army Aviation Systems Command.

Navy: Grumman, Bethpage, N. Y.: radar processor for E-2C airborne early warning aircraft, Naval Air Systems Command (NavAir).

Honeywell Defense Avionics Systems, Albuquerque, N. M.: digital map computer for the multiservice V-22 Osprey and other aircraft, NavAir.

KOR Electronics, Huntington Beach, Calif.: digital radio frequency memory for AN/ULQ-21 threat jamming simulator, Navy Pacific Missile Test Center.

Sanders Associates, Nashua, N. H.: digital radio frequency memory for the AN/ALQ-126B used on several tactical aircraft (perhaps later on the A-12 advanced tactical aircraft), NavAir.

Texas Instruments Defense Systems and Electronics Group, Dallas: high-resolution upgrade for the AN/APS-137 surface-search radar, NavAir.

John Rhea is a free-lance writer living in Woodstock, Va., who specializes in military technology issues and is a frequent contributor to AIR FORCE Magazine. His book Department of the Air Force is scheduled to be published this month by Chelsea House, New York, N. Y.