The Electronic Wind Tunnel

Feb. 1, 1989

Computational fluid dynamics—a technology that has emerged within the past decade because of the availability of ever more powerful supercomputers—is completely changing the way aerospace vehicles are developed.

This new technology represents as significant a milestone in flight as the invention of the wind tunnel, which it complements. Just as the wind tunnel was the essential first step toward heavier-than-air vehicles, the new computer-based analytical techniques will make possible the high-performance vehicles of the future.

The role of the wind tunnel is often overlooked. Everybody knows of the Wright brothers’ success at Kitty Hawk, N. C., on December 17, 1903. What is not widely known is that three years earlier, back in their shop in Dayton, Ohio, the two bicycle-makers achieved the breakthrough that made that flight possible and ushered in the age of aviation. They built the first crude wind tunnel to test their designs before they flew them.

Until then there was only one way to test aircraft: Fly them. That’s what all the other aviation pioneers did. Many of them, like Otto Lilienthal, died in the process. Others, like Samuel Langley, suffered a series of embarrassing failures.

The Wright brothers correctly guessed that the key to powered flight was the way the cross-section shape of the wings provided lift. Birds don’t fly simply by flapping their wings; birds fly because their wings are remarkably efficient airfoils.

Once that principle of lift had been established in ground-based testing, the first flight was, scientifically speaking, almost an anticlimax. Nearly ninety years later, aircraft designers still base their work on this principle as they expand the flight envelope to ever greater speeds and altitudes.

Testing the Next Generation

Air Force Systems Command operates the world’s largest aerospace ground-test facility, the $3 billion complex of wind tunnels and environmental chambers at the Arnold Engineering Development Center (AEDC) near Tullahoma, Tenn. Since it opened for business in 1951, this facility has tested most of the Air Force’s new aircraft and missiles along with such NASA vehicles as Gemini, Apollo, and the Space Shuttle.

This also is where the Air Force will test its next generation of vehicles, including the Advanced Tactical Fighter and the X-30 National Aerospace Plane. These new vehicles will operate in a much more demanding environment and therefore will require much more complex testing. This is where computers become a critical factor.

Computerized simulation of aerodynamics is not new. The idea of “flying” an airplane in a computer before undertaking dangerous flight tests emerged after World War II from pioneering work by the Air Force, the National Advisory Committee for Aeronautics (NACA, the predecessor to NASA), and the aerospace industry.

What is new is the power of today’s supercomputers, which can analyze the airflow around aerodynamic vehicles with sufficient precision to enable them to operate in the more demanding flight regimes of the future. Although the Wright brothers were the first to demonstrate ground testing, they made a fundamental error: They thought the flow of air under the wing provided the lift. Today aerodynamicists know that it is the partial vacuum created above the airfoil that is responsible for lift. An error like that was no problem for an aircraft with the performance of the Wright Flyer. It would be fatal for today’s aircraft.

All new flight programs will rely on computational fluid dynamics. Breaking the term into its component parts makes it easier to understand. The computational part is obvious. This is a technology based on the use of computers to do calculations that were heretofore impossible. A fluid is what airplanes fly in; it’s called air. The key to the concept is the third part—dynamics. By knowing the dynamic interaction of a vehicle with its environment, developers can optimize its performance.

Thus, CFD, as it’s known, is essentially a set of software techniques that takes advantage of the computer industry’s drive to build much more powerful machines for a variety of demanding applications.

At its Ames Research Center near San Francisco, for example, NASA has just put into operation a Cray Y-MP supercomputer capable of more than a billion computations a second. NASA is shooting for a trillion computations per second at its Numerical Aerodynamic Simulation Facility there by the end of the century.

New Tools at Tullahoma

At the Arnold test site, the Air Force operates two smaller Cray supercomputers, an X-MP and an earlier-model Cray 1, both linked to each other and to a larger Cray 2 at Kirtland AFB, N. M. These are the hardware tools of AEDC’s CFD efforts.

The critical software tools have evolved over the past ten years, recalls Dr. Donald C. Daniel, chief scientist at the Arnold center. They consist of two parts: gridding, which is a mathematically generated picture of the air vehicle that he calls “a sophisticated checkerboard,” and the algorithms that the computer uses to calculate the airflow over the simulated vehicle (or through it, in the case of a propulsion system). The more grid points that can be analyzed and the more sophisticated the algorithms (actually partial differential equations) used to analyze them, the more accurately the vehicle’s performance can be calculated.

Furthermore, these calculations only begin on the vehicle’s surface. They must be extended outward from the vehicle’s body with emphasis on flow gradients (changes of flow) that affect vehicle performance. This would be a simple process if all aerospace vehicles were perfect spheres or cylinders. They aren’t.
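To make the two parts concrete, consider a minimal sketch in Python of the idea only, not of any AEDC code: a toy “checkerboard” grid, the simplest of the partial differential equations (Laplace’s equation for frictionless potential flow) relaxed point by point, and the flow gradients read off the result. The grid size, boundary values, and iteration count are all illustrative assumptions.

```python
import numpy as np

# A toy "sophisticated checkerboard": a 2-D structured grid.
# Real grids wrap complete 3-D vehicles; this shows only the idea.
NX, NY = 64, 32
phi = np.tile(np.linspace(0.0, 1.0, NX), (NY, 1))  # potential, seeded as a uniform stream

# The "algorithm" is a partial differential equation discretized on
# the grid: here Laplace's equation for incompressible potential flow,
# relaxed by Jacobi iteration. Each interior point becomes the average
# of its four neighbors while the boundary values stay fixed.
for _ in range(2000):
    phi[1:-1, 1:-1] = 0.25 * (phi[1:-1, :-2] + phi[1:-1, 2:]
                              + phi[:-2, 1:-1] + phi[2:, 1:-1])

# Velocities are spatial derivatives of the potential: the "flow
# gradients" that must be carried outward from the vehicle's body.
v, u = np.gradient(phi)  # first axis runs in y, second in x
```

Moving from this toy to the Euler or Navier-Stokes equations, and from a few thousand grid points to millions, is what consumes the supercomputer time.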

Because of the complex shapes that have to be tested, according to Dr. Daniel, the software engineers’ task is to develop equally complex adaptive grids incorporating a feedback loop between the solution and the grid. This is a tedious process, and Dr. Daniel notes that it initially took a year to set up the grid and solve the flow field for the F-16 fighter.
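That feedback loop can be sketched the same way: solve the flow, measure where it changes fastest, and flag those cells so the next grid clusters points there. In this hedged illustration the threshold and the refinement policy are assumptions for the example, not AEDC practice.

```python
import numpy as np

def refine_where_steep(phi, threshold=0.05):
    """One pass of the solution-to-grid feedback loop: flag the cells
    where the computed flow changes fastest so the next grid can
    cluster points there. Threshold and policy are illustrative."""
    gy, gx = np.gradient(phi)
    return np.hypot(gx, gy) > threshold  # boolean refinement mask

# Demo on a synthetic solution with one sharp feature (say, a shock):
grid = np.tile(np.linspace(0.0, 1.0, 64), (32, 1))
phi = np.tanh(20.0 * (grid - 0.5))  # steep gradient near mid-chord
mask = refine_where_steep(phi)
print(f"{mask.mean():.0%} of cells flagged for refinement")
```

A real adaptive scheme alternates this step with the flow solver until the flagged region stops moving, which is why the process is so tedious.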

Questions at Mach 15

Further complicating the process is the need for a better understanding of the basic aerodynamic processes. “We still don’t understand turbulence,” Dr. Daniel says. “It’s more or less random, and we can’t model a random event well.” He expects there’s enough research to be done in this area to keep scientists busy for the rest of this century.

The problem isn’t so bad at subsonic and supersonic speeds. It’s the transonic regime that worries scientists like Dr. Daniel. He calls that “the most nonlinear part” of the flight envelope, or the one in which the relationship between flow fields and vehicle performance is least understood. When it comes to hypersonic vehicles like the X-30 operating at Mach 15 at 300,000 feet, Dr. Daniel can only shrug and ask, “What’s your guess?”

Nonetheless, the basic principles of CFD are in place to handle future flight programs. Dr. Daniel pays tribute to Boeing for the pioneering work on its 757 and 767 commercial jetliners, adding that the Air Force will get maximum benefits from the technology on the ATF, “where the tools were there from the inception of the aircraft.”

Dr. Edward M. Kraft, manager of the technology and analysis branch of the Calspan Corp. contractor team operating the wind tunnel test facilities at Arnold, describes the synergistic relationship among the three facets of vehicle testing: ground testing (in which the wind tunnel is the traditional tool), flight testing, and CFD. “Each tool has its limitations,” he says, “but the other tools overlap and accommodate them.”

Ground tests can’t duplicate all conditions, particularly in the case of a spacecraft, but they are less costly and less dangerous. Flight tests are still essential because they represent the “truth,” according to Dr. Kraft: “What you see is what you get.” CFD is now entering the picture as part of an effort to do the diagnostics first and thus minimize ground testing and certification changes later in the program. As Dr. John H. Fox, a principal engineer with the Calspan technology and analysis branch, puts it, “We fly the aircraft on the computer.”

“The name of the game is optimizing,” adds Ralph E. Graham, chief of the aeronautical systems division at Arnold’s directorate of aerospace flight dynamics test. “We’re looking for the last one percent of performance.”

Certifying Stores Release

Graham cites a very practical application of CFD that is paying off for the Air Force right now: certifying the release of stores. The Air Force has 110 kinds of stores (fuel tanks, bombs, missiles) in its inventory, he explains, and they’re used with a variety of different aircraft. This adds up to thousands of possible combinations, so certifying a particular store for a particular aircraft can be a lengthy, costly process.

Instead, by using CFD in conjunction with wind tunnel test and analysis to determine the basic aerodynamic behavior of the stores and their host aircraft, the Air Force will be able to greatly reduce flight testing—in some cases by fifty percent—and “mix and match” the two. To do this entirely in a wind tunnel could take up to three years.

With CFD plus wind tunnel testing and analysis, according to Graham, the process at AEDC can be cut to three months. How much money could this save? “The cost of an F-15,” Graham quips. There’s also a potential performance improvement in better circular error probable (CEP) for air-to-ground and air-to-air missiles.
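The scale of the problem is easy to rough out. In the sketch below, only the 110 store types and the fifty percent flight-test reduction come from the article; the aircraft count, station count, and sorties per certification are assumed purely for illustration.

```python
# Rough size of the stores-certification matrix. Only the 110 store
# types and the 50% flight-test reduction come from the article; the
# other numbers are assumed purely for illustration.
stores = 110                 # from the article
aircraft_types = 15          # assumed
stations_per_aircraft = 9    # assumed pylon/rack stations
combinations = stores * aircraft_types * stations_per_aircraft
print(f"{combinations:,} possible store/aircraft/station pairings")
# prints: 14,850 possible store/aircraft/station pairings

sorties_per_cert = 20        # assumed flight-test sorties per certification
print(f"A 50% cut saves about {0.5 * sorties_per_cert:.0f} sorties per certification")
```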

Tracy Donegan, a Calspan senior engineer, describes a typical CFD project completed last August for the F-15 fighter: The entire aircraft (except its tail) with its seven pylons, a store, and a pod was computationally simulated with 1.1 million grid points. It took four engineers six months of part-time work to develop all the algorithms for the grids and boundary conditions.

The initial purpose was to determine the aircraft/store flow field, but the program became much broader than that, Donegan explains. For the first time it gave the Air Force a picture of the flow field around a complete aircraft. That picture is available on demand at a video computer terminal, rendered in three dimensions and color-coded to show flow-field gradients. This technology is now available to airframe prime contractors, and Donegan estimates the X-30 would require about the same number of grid points.

“CFD hasn’t been extensively applied from cradle to grave,” says Col. Dale F. Vosika, Arnold’s deputy for operations, “but it does give us a level of expertise when integrated with ground and flight tests.” In the case of the X-30, he notes, the lack of ground-test facilities will require a lot of computer simulation. This program, as well as the ATF, will require coordination with the Air Force’s Aeronautical Systems Division (particularly the Flight Dynamics Laboratory) and the Air Force Flight Test Center. Colonel Vosika cites the complexity of the aircraft—higher performance, speeds, and maneuverability. The supercomputer complex at the NASA Ames Center will also be heavily involved in CFD studies to support the X-30.

Smarter Tests

Reducing test time by using the electronic analog known as CFD has a major impact on costs, according to Rampy, who estimates that electrical power requirements eat up seventy percent of all test costs. That’s cheaper than the operating costs of test aircraft, but it is still a cost to be avoided if possible.

In fact, this voracious appetite for electrical power is why the Arnold center is located in the heart of Tennessee Valley Authority territory. The availability of relatively low-cost power—plus water for cooling the test facilities—reduces overall costs.

They are still hefty. Col. (Brig. Gen. selectee) Stephen P. Condon, Arnold’s Commander, has an electricity bill that would make most homeowners blanch: $2 million a month. That amounts to nearly 500,000 megawatt-hours a year—enough, he says, to provide power for a city of more than 50,000.
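Those figures hang together arithmetically. Here is a quick check in Python; the bill and the megawatt-hours are the article’s numbers, and everything else is derived from them.

```python
# Sanity-check the power figures quoted for the Arnold center.
monthly_bill = 2_000_000  # dollars per month, from the article
annual_mwh = 500_000      # megawatt-hours per year, from the article

annual_bill = 12 * monthly_bill                        # $24 million a year
cents_per_kwh = annual_bill / (annual_mwh * 1_000) * 100
avg_megawatts = annual_mwh / (365 * 24)

print(f"about {cents_per_kwh:.1f} cents per kWh")      # ~4.8, low-cost TVA power
print(f"about {avg_megawatts:.0f} MW average demand")  # ~57 MW around the clock
# 500,000 MWh a year over 50,000 residents is 10 MWh per person,
# consistent with "power for a city of more than 50,000."
```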

CFD Saves Money

Dr. Keith L. Kushman, chief of the center’s facility technology division, has pinpointed some of the cost savings attributable to CFD. He figures the computational costs at Arnold at about $4 million a year, of which half is salaries and most of the rest is the amortized cost of the supercomputers. He has documented more than $2 million in cost savings to the center’s customers (principally other elements of the Air Force Systems Command), but he estimates there is another $8 million in intangible savings from reduced risks to conventional ground-test equipment by doing the tests in a computer instead of wind tunnels. Furthermore, he maintains, half of the tests his team has conducted couldn’t be done at all without CFD.

His colleagues at Wright-Patterson AFB, Ohio, agree. “Computational aerodynamic simulation now is a valid, inexpensive alternative to wind tunnel testing of new aircraft and aerospace designs,” according to a statement by Dr. Joseph J. S. Shang, a technical manager at the Flight Dynamics Lab, after a series of simulations four years ago using the X-24C lifting body. The computed results duplicated the results of earlier wind tunnel tests for flow fields and aerodynamic forces on the vintage-1974 experimental reentry vehicle.

As supercomputers become even more powerful, the technology of CFD can be extended even further, according to Arnold chief scientist Dr. Daniel. He is more concerned about memory capacity than about multibillion-operation speeds and says even the 256-million-word memory of the top-of-the-line Cray 2 “won’t be nearly enough” for some of the projects he has in mind.
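The arithmetic behind that worry is straightforward. In this sketch the words-per-grid-point figure is a typical storage count for a three-dimensional viscous-flow code, assumed here for illustration, and the finer-grid size is likewise a guess; only the Cray 2’s 256 million words and the F-15 case’s 1.1 million points come from the article.

```python
# Why 256 million words "won't be nearly enough": storage per grid point.
cray2_words = 256_000_000   # 64-bit words on a Cray 2, from the article
words_per_point = 35        # assumed: flow variables, grid metrics, workspace
f15_points = 1_100_000      # the F-15 case described above

print(f"F-15 case: {f15_points * words_per_point / 1e6:.1f} million words")
# about 38.5 million words, comfortably within the Cray 2's memory

finer_grid = 20_000_000     # assumed next-generation grid
print(f"Finer grid: {finer_grid * words_per_point / 1e6:.0f} million words")
# about 700 million words, well past the Cray 2's limit
```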

“The great thing about supercomputers is that they unlock the mind,” Dr. Daniel concludes.

John Rhea is a freelance writer in Woodstock, Va., who specializes in technology issues. He is the author of SDI—What Could Happen: 8 Possible Star Wars Scenarios, published in 1988 by Stackpole Books.