Thursday, February 15, 2007
Heavy-Duty Lightweighting
Remember beer and coke cans with seams? For Andy Trageser and Bob Dick, who practice the art and science of engineering at ALCOA, those seams tell an interesting story.
In the 60s, nearly all beverage cans were a seamed wraparound of tin-plated steel. Then the aluminum industry developed a way to make seamless cans -- drawing them from a flat sheet with a separate lid clamped on after filling. "That started us into light weighting," says Trageser, "to save our can manufacturing customers' costs and to compete with steel." Of 100 billion beverage cans made in the U.S. each year, about 95 percent -- one per person per day -- are aluminum.
The story goes on. Aluminum costs more than steel, and the price has been rising. Steel "minimills" now have continuous casting processes that make sheet steel thin enough to form seamless cans. And there is competition from other materials as well. "We have to find ways to make cans lighter and lighter to keep fending off polymers, steel and glass. Lighter cans mean lower prices to the consumer, who's then more likely to buy cans off the grocery shelf instead of two-liter bottles or glass."
ALCOA's answer is lightweighting, designing cans to use the thinnest aluminum possible within the constraints of strength and appearance.
Progressive Lightweighting
In the 1970s the aluminum in beverage cans was nearly as thick as aluminum gutters, 0.015 inches. Lightweighting progress leveled off in the early 80s, then resumed in 1984 due in part to computer modeling.
Using supercomputing at the Pittsburgh Supercomputing Center, ALCOA has developed a sophisticated approach to finite-element modeling of beverage cans that allows engineers to develop prototype can designs with a high level of confidence that the modeling accurately predicts how the can will perform under the stress of manufacturing, distribution and use. This modeling reduces the need for costly laboratory prototypes and significantly cuts time-to-market for a new design.
Can Bottom Snap-Through
These images represent a sequence from "dynamic snap-through" modeling of a can bottom, with color indicating pressure. From these computations, ALCOA engineers analyze whether a proposed can design will meet the internal pressure specifications of the manufacturer. The final shape shown here closely matches experiment, and the results for velocity agree well with high-speed filming of experimental tests.
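The mechanics at work can be illustrated with the textbook analog of snap-through, a shallow two-bar (von Mises) truss. This is a minimal static sketch in Python with illustrative dimensions, far simpler than ALCOA's dynamic finite-element model of an actual can bottom:

```python
# Minimal sketch: static load-deflection curve for a two-bar (von Mises) truss,
# the textbook analog of snap-through instability. Geometry and material values
# are illustrative, not ALCOA's can-bottom model.
import numpy as np

E = 70e9        # Young's modulus of aluminum, Pa
A = 1e-6        # bar cross-section, m^2
a = 0.05        # half-span, m
h = 0.005       # initial apex height, m (shallow arch)

L0 = np.hypot(a, h)                 # undeformed bar length
w = np.linspace(0.0, 2 * h, 2001)   # downward apex displacement
l = np.hypot(a, h - w)              # deformed bar length
strain = (l - L0) / L0              # engineering strain (negative = compression)
N = E * A * strain                  # axial force in each bar (tension positive)

# Vertical equilibrium at the apex: the applied load balances the bars' pull.
P = -2.0 * N * (h - w) / l

i_peak = np.argmax(P[: len(P) // 2])      # limit point on the rising branch
print(f"snap-through (limit) load ~ {P[i_peak]:.1f} N "
      f"at w = {w[i_peak]*1e3:.2f} mm")
# Past the limit point the equilibrium load drops -- under load control the
# structure "snaps through" dynamically to the inverted configuration.
```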
Researchers: Bob Dick & Andy Trageser, ALCOA Laboratories
Hardware: CRAY Y-MP C90
Software: User-developed code
Keywords: aluminum, finite-element, beverage can, lightweighting, sheet thickness, dynamic snap-through, dent analysis, 3-D modeling, manufacturing, can design, materials
Liquid Steel
What Happens in a Tundish?
Computer modeling combined with physical modeling has paid off in steel manufacturing. That's the message from Achilles Vassilicos, research consultant at U.S. Steel Technical Center in Pittsburgh. Since 1989, when the Technical Center, the research arm of U.S. Steel, became a corporate affiliate of the Pittsburgh Supercomputing Center, U.S. Steel researchers have used the high-powered computing of the CRAY Y-MP, and now the C90, to help understand the flow patterns of molten steel.

The U.S. Steel research team wants to know what happens when a "heat" of steel -- a huge ladle containing more than 200 tons of molten metal at close to 3,000 degrees F. -- empties its fiery brew into a continuous casting "tundish." In particular, they want to know as precisely as possible what happens inside the tundish as the molten steel churns and swirls around. The tundish holds the white-hot liquid and feeds it out the bottom into a continuous casting mold, where it forms a moving strand of steel that eventually cools from white to red hot and gets cut into slabs for further processing.
Continuous casting is the most up-to-date technology available for producing high-quality steel at low cost, and good understanding of what goes on in the tundish is critical because it affects the purity and chemistry of the output steel. Impurities, such as oxides of aluminum, calcium and iron, tend to float to the top of the tundish bath. The steel flow must be controlled to enhance this flotation and to prevent turbulence from drawing impurities back down into the bath. Furthermore, you need to know how the chemistry of the mix feeding out the bottom of the tundish varies as a new heat pours in the top.
"The objective is to have the caster running continuously," says Vassilicos, "and you usually aim for a string of several hundred heats. The chemistry often varies significantly from heat to heat. If you know exactly what is happening in the tundish in real time, you can precisely and intelligently disposition the output steel to meet the specifications of customer orders."
Using a combination of computer and laboratory modeling of tundish flows, the U.S. Steel team developed an automated process control method for predicting the chemistry of output steel at its Gary, Ind. plant. Another research effort led to a "turbulence suppressor pad," a patented device that controls the quality of the very high grade steel used in thin-wall beverage cans.
Computed and Measured Tracer Effects
In recent calculations at the Pittsburgh Supercomputing Center, A. K. Sinha and Achilles Vassilicos compared physical measurements of tracer response in scale model and real tundishes to results from computer modeling. Tracers such as a pulse of copper are added to a tundish mix to give a reading of residence time -- how long it takes for the tracer to exit the bath -- and tracer density over time at the exit. This information gives a valuable index of the flow characteristics of a tundish.
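The arithmetic behind residence time is standard residence-time-distribution (RTD) analysis. This is a minimal sketch, with a synthetic tracer curve and illustrative tundish volume and casting rate standing in for measured data:

```python
# Sketch: extracting mean residence time from an exit-tracer curve, the
# standard RTD bookkeeping described above. The tracer curve here is
# synthetic; in the study it came from measurement or simulation.
import numpy as np

t = np.linspace(0.0, 1800.0, 3601)          # time after tracer pulse, s
C = (t / 120.0) * np.exp(-t / 120.0)        # synthetic exit concentration

E_curve = C / np.trapz(C, t)                # normalized RTD, integrates to 1
t_mean = np.trapz(t * E_curve, t)           # mean residence time, s

# Compare with the nominal (plug-flow) residence time V/Q; the tundish
# volume and throughput here are illustrative assumptions.
V, Q = 8.0, 0.01                            # m^3 of steel, m^3/s throughput
print(f"measured mean residence time: {t_mean:.0f} s")
print(f"nominal V/Q residence time:  {V / Q:.0f} s")
# A mean well below V/Q signals short-circuiting or dead volume in the bath.
```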
The study shows that commonly used numerical techniques are not sufficiently accurate. The researchers developed FORTRAN code that adapts a more accurate algorithm, known as QUICK (quadratic upstream interpolation), for efficient use on the CRAY.
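To see why the choice of interpolation matters, here is a minimal sketch comparing first-order upwinding with QUICK on a one-dimensional advected pulse. The actual U.S. Steel code was FORTRAN applied to full tundish flow, so this only illustrates the numerical behavior:

```python
# Sketch: 1-D advection of a pulse, comparing first-order upwind differencing
# with the QUICK scheme (quadratic upstream interpolation). Illustrative only.
import numpy as np

nx, cfl, u, L = 200, 0.4, 1.0, 1.0
dx = L / nx
dt = cfl * dx / u
x = (np.arange(nx) + 0.5) * dx
phi0 = np.exp(-200.0 * (x - 0.3) ** 2)      # initial tracer pulse

def face_upwind(phi):                       # 1st order: face value = upstream cell
    return phi

def face_quick(phi):                        # QUICK: quadratic upstream-biased
    return (6.0 * phi + 3.0 * np.roll(phi, -1) - np.roll(phi, 1)) / 8.0

def rhs(phi, face):
    F = u * face(phi)                       # flux at face i+1/2
    return -(F - np.roll(F, 1)) / dx        # periodic domain

def advect(phi, face, steps):
    for _ in range(steps):                  # two-stage (midpoint) Runge-Kutta
        k = rhs(phi, face)
        phi = phi + dt * rhs(phi + 0.5 * dt * k, face)
    return phi

steps = int(round(0.4 / (u * dt)))          # advect the pulse a distance of 0.4
exact = np.exp(-200.0 * (x - 0.7) ** 2)
for name, face in [("upwind", face_upwind), ("QUICK", face_quick)]:
    phi = advect(phi0.copy(), face, steps)
    print(f"{name:7s} max error: {np.abs(phi - exact).max():.4f}")
```

Running it shows the first-order scheme smearing the pulse badly while QUICK keeps it sharp, which is the accuracy gap the researchers addressed.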
The fast response of the CRAY as compared to in-house workstations -- a day turnaround versus as much as a week -- is important to the U.S. Steel researchers. "These computations sometimes require a lot of tweaking and adjustments to parameters," notes Vassilicos, "sometimes with several restarts. With the CRAY, we can see right away what we're getting, and if something needs to be changed, we can do it."
Researchers: Achilles Vassilicos & A. K. Sinha, U.S. Steel Technical Center
Hardware: CRAY Y-MP C90
Software: QUICK (quadratic upstream interpolation)
Keywords: tundish, steel manufacturing, flow patterns, molten metal, continuous casting, metallurgy, turbulence, transition slabs, tracer effects, process design, drawn and ironed (D&I) steel, technology transfer, U.S. Steel, USX.
Car Lite
Aluminum Means Light and Quick
An engineering tour de force, a rare blend of "youthful exuberance, mature judgment and technical excellence" — that's what Chrysler Corp. is saying about the Plymouth Prowler. This update of a 1950s hot rod is a beauty to look at, and for the discerning its beauty is more than skin deep. The Prowler is the first U.S. car engineered from the ground up to exploit aluminum technology. At 2,800 pounds, 50% less than it would weigh with traditional steel design, the Prowler is zippety-quick, fuel efficient and impervious to rust — with no loss in crashworthiness and durability.
Aluminum space frame for the Prowler
As the world's largest aluminum company, Alcoa knows what aluminum can do, and it helped develop the Prowler as well as the Audi A8, a German-produced aluminum car making waves in the upscale market. Both cars have an aluminum skeleton, a "space frame," similar to those used in aircraft, with attached aluminum body panels for their sleek skin. Alcoa automotive engineers like Edmund Chu see these cars as harbingers of the future in automotive design and engineering.
"The auto industry tends to look at aluminum's cost per pound," says Chu, "which is substantially higher than steel, and they have years of experience with steel. We encourage them to look at the dollar per pound saved. Aluminum sheet weighs half as much as steel. Cost per pound isn't an appropriate measure of the economics. We emphasize overall cost, and we're working aggressively to make aluminum easier to use." The key, notes Chu, is computational modeling. Alcoa has sophisticated ability to do computer simulations that predict how aluminum structures and body sheets will perform, reducing costly prototyping and trial-and-error processes.
Alcoa has partnered with Pittsburgh Supercomputing Center since 1987, and Alcoa engineers used PSC resources in designing parts for the Prowler and Audi A8. "Automotive products are a key part of Alcoa's future," says Peter Bridenbaugh, Alcoa executive vice-president of automotive structures. "Scientific modeling on the supercomputer helps us design these parts and the manufacturing processes that make them. It allows us to solve time-critical problems in a competitive manner, which is particularly important in automotive design."
Tool & Die: The Inner Hood
A large part of the development cost for body sheets, such as hood and door panels, is designing the "dies" used for stamping sheet-metal parts in mass production. Chu leads Alcoa's effort in this area. Traditionally, sheet-metal forming relies on the ingenuity of tool-and-die craftsmen, who have over many years built up sophisticated artisan's know-how for working with steel sheets.
This approach has its limits, however, with new materials like aluminum and with the complex geometries of modern automotive design, where it frequently takes many trial-and-error iterations to arrive at a workable die. Advances in computing make it possible to use mathematical tools to predict the effectiveness of a design before casting the die and trying it out, potentially saving hundreds of thousands of dollars and weeks of time. And Alcoa's modeling ability with aluminum, says Chu, is more advanced than similar techniques with steel.
This aluminum inner-hood panel shows the multi-cone design developed by Alcoa engineers.
As an example, Chu points to recent work his engineering group did for a major car company on the underbody of a hood, a part known as an "inner-hood panel." Traditionally, these panels employ steel with a "beam" design, the beams giving strength and rigidity. Cut-outs in the flat part of the sheet reduce weight, but require an extra die and press step, adding production cost. To exploit the unique properties of aluminum, Alcoa developed a "multi-cone" design.
"To integrate product design with material design," says Chu, "you don't want to force aluminum to behave like steel. You want to build aluminum characteristics into your design." With the inner-hood panel, this meant using a lighter gauge than is possible with steel, leading to the multi-cone design, which gives structural integrity and rigidity equivalent to a steel beam panel at half the weight, and without cutouts, avoiding potentially millions of dollars in manufacturing cost.
Thickness Distribution of Inner-Hood Panel. By predicting high-stress areas prior to casting the sheet-metal die, simulations substantially reduce design costs.
Initial modeling of this panel predicted several locations of high strain. As a validation check on the modeling, a die was cast and a Detroit "stamping house" stamped the panel. Analysis confirmed the predicted high-strain regions where problems occurred in the stamped part. To adjust, Alcoa modified the geometry of the cones and shifted to a more formable alloy — much easier to do, notes Chu, with computer simulation than the traditional trial-and-error process.
"By integrating material design with process design through computational modeling, we can select the optimum alloy to maximize formability of the part. When we run the model, we can do multiple iterations, trying a number of different alloys. "This is possible for two reasons, both of which depend on advanced scientific computing.
Supercomputing and Aluminum Product Design
Alcoa has developed a highly accurate "constitutive model" for aluminum — a mathematical description that relates the microstructure of the metal to how it behaves when formed into a manufactured product. Alcoa's partnership with PSC has allowed it to refine this model to its current high degree of accuracy. "This is one of our strengths," says Chu. "We have the ability with simulations to describe all complex loading conditions."
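In its simplest textbook form, a constitutive model is a rule mapping deformation to flow stress. This is a minimal sketch using power-law (Hollomon) hardening with illustrative constants, far short of Alcoa's microstructure-based model:

```python
# Minimal sketch of the idea behind a constitutive model: a rule that maps
# deformation state to flow stress. Here a textbook Hollomon power law,
# sigma = K * eps^n, with illustrative constants for an aluminum sheet alloy;
# Alcoa's actual model ties the response to microstructure.
import numpy as np

K = 400e6    # strength coefficient, Pa (illustrative)
n = 0.20     # strain-hardening exponent (illustrative)

def flow_stress(eps_plastic):
    """True flow stress as a function of true plastic strain."""
    return K * np.asarray(eps_plastic) ** n

# Diffuse necking in uniaxial tension starts near eps = n (Considere's
# criterion), one reason the hardening exponent matters for formability.
print(f"flow stress at eps = n = {n}: {flow_stress(n) / 1e6:.0f} MPa")
```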
Supercomputing, furthermore, because it gives fast turnaround, makes it feasible to adjust parameters and look at multiple possibilities, providing design flexibility that wouldn't otherwise be available. "With the Cray," says Chu, "we can look at five or six scenarios all at once. The turnaround is six times faster than on our own workstations, and this is critical in the design stage, where you need to make changes quickly."
In the foreseeable future of design with aluminum, says Chu, engineers will create new alloys to meet product requirements. "For a particular product, I want to find out what governs the deformation. Is it strain hardening or elongation or something else? With the model, I can put in a design parameter and different material behavior, fictitiously — make up some new alloy, and then we could turn around and, based on the model predictions, say 'Hey, we can develop this alloy.'"
Researchers: Edmund Chu, Alcoa.
Hardware: CRAY C90
Software: User-developed code.
Keywords: Alcoa, aluminum, cars, Plymouth Prowler, space frame, Edmund Chu, Audi A8, automotive design, tool-and-die, multi-cone, alloy, constitutive model, automotive engineering, sheet metal forming, metallurgy, materials.
Related Material on the Web:
Welcome to Plymouth — Prowler.
Projects in Scientific Computing, PSC's annual research report.
Turbine Simulation
Electrical power generation is a multi-billion dollar global business, with developing countries creating a growing demand for the 21st century. To gain an edge in this fiercely competitive market, the key for companies like Westinghouse is more efficient turbines. Regardless of the energy source — hydroelectric, coal-fired or nuclear — turbines are the workhorses of modern power plants. These huge jet-engine-like machines do the heavy-duty work of converting raw energy into megawatts of electricity, and even slight improvements in turbine efficiency translate into significant reductions in the cost of generating power.
Can high-performance computing help design more efficient turbines? That question confronted senior scientist Paul Cizmas of the Westinghouse Science and Technology Center in 1996. "We needed more realistic simulations and faster turnaround," says Cizmas. "We wanted to solve the problems involved with real turbine configurations, not just simplified versions."
Existing software was sequential, based on a single-processor computing paradigm. It simulated fluid flow interacting with the rotating and stationary blades inside the turbine on a blade-by-blade basis — calculating the aerodynamics one blade at a time. For more accurate results and faster turnaround, Cizmas realized, it made sense to parallelize the software. This would allow Westinghouse to take advantage of systems like Pittsburgh Supercomputing Center's 512-processor CRAY T3E, which puts many processors to work simultaneously on the same job.
"This was a change of paradigm," says Cizmas, "a big step. We believe we're the only company in the United States to have tackled this problem, and we're ahead of the game because we have a close relationship with PSC." After attending a June 1996 Pittsburgh Supercomputing Center parallel-processing workshop, Cizmas discussed the problem with PSC senior consultant Ravi Subramanya. In January 1997, they began collaborating and four months later had working parallel code that was more accurate and much faster than its sequential predecessor. "Jobs that would have taken three months before," says Cizmas, "now run in under 12 hours."
Part of this impressive gain in performance is due to "superlinear" speedup — on test cases, 10 processors together run 15 times faster than one by itself. This seemingly impossible result occurs because parallelizing the turbine aerodynamics, assigning a processor to each turbine blade, improves how data is handled in memory, an advantage that becomes more and more significant with larger simulations. Cizmas has begun tackling these larger problems, involving hundreds of turbine blades, and he expects these computations to provide new understanding of how shape and arrangement of the blades affect aerodynamics, knowledge that will improve the efficiency of Westinghouse turbines.
From Windmills to Power Plants
There's nothing new about turbines. The idea — fluid flow propelling a series of airfoils or blades on a rotating shaft — is at least as old as windmills. As early as 70 B.C., Romans used the waterwheel, an ancestor of water turbines, to grind grain, and Hero of Alexandria, a Greek inventor/mathematician, designed a precursor of the steam turbine in the first century A.D.
Turbines came to the fore in a new context in the 1880s. To generate electricity required rotational speeds beyond what reciprocating engines could produce. The technological innovation that met this need, invented by Sir Charles Parsons in 1884, was the steam turbine. There have been many improvements, but the basic idea has changed little in 100 years — rotating blades convert the energy of high-pressure steam or combustion gases into rapid circular motion.
| | |
| This visualization, with color corresponding to Mach number, shows the flow speeding up through the first stator passageway, slowing down through the rotors, where the passageway diverges, then speeding up even more through the second stage of differently shaped stators. | |
| Download larger version (602KB) of this image. View animated version (200KB GIF) of this image. | |
Modern turbines are composed of alternating rows of stationary blades (stators) that direct flow into the rotating blades (rotors). The narrowing passageway between stators works like a nozzle to increase fluid velocity as it strikes the rotors. Each stator-rotor series is called a stage, and operational turbines have many stages, with hundreds of blades, to generate the constant 3,600 rpm rotation that is standard for most power plants.
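The nozzle effect of the stator passage can be sketched with the one-dimensional isentropic area-Mach relation. The numbers are illustrative, and the hot-gas specific-heat ratio is an assumption; the real stage flow is 3-D, viscous and unsteady, which is what the full simulation resolves:

```python
# Sketch: how much a converging stator passage speeds up the flow, using the
# 1-D isentropic area-Mach relation. Area ratios and gamma are illustrative.
def area_ratio(M, gamma=1.33):
    """A/A* for isentropic flow at Mach number M."""
    g = gamma
    return (1.0 / M) * ((2.0 / (g + 1.0)) * (1.0 + 0.5 * (g - 1.0) * M * M)) \
        ** ((g + 1.0) / (2.0 * (g - 1.0)))

def subsonic_mach(A_over_Astar, gamma=1.33):
    """Bisect for the subsonic root of the area-Mach relation."""
    lo, hi = 1e-6, 1.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if area_ratio(mid, gamma) > A_over_Astar:
            lo = mid          # area ratio still too large: M is too small
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A passage entering at 3x the choking area and necking down to 1.4x:
print(f"Mach in ~ {subsonic_mach(3.0):.2f}  ->  "
      f"Mach at stator exit ~ {subsonic_mach(1.4):.2f}")
```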
Democratic Computing: One Processor, One Blade
Before Cizmas and Subramanya went to work, Westinghouse's ability to simulate rotor-stator interaction was limited to simplified cases. "Sequential codes are too slow and expensive," says Cizmas, "to provide useful input to the design process." A test case involving one-and-a-half stages — an eight-blade configuration — would take three months of computing on a single CRAY C90 processor. This made it unrealistic to even think about simulating actual production turbines, which often involve three or more stages with 150 or more blades.
To parallelize this software, Subramanya chose to take the approach of assigning each turbine blade to a separate processor, thereby avoiding a major problem with the sequential version. "The sequential code ran slowly to begin with," says Subramanya, "because whenever the calculation shifted from one blade to another, all the data in the processor cache became invalid. It loaded an almost entirely new dataset into memory each time it moved to the next blade."
This difference, especially significant because of the time-dependent nature of the computation, accounts for the superlinear speedup. A set of mesh-like computational grids for each blade keeps track of fluid properties at that blade, and this data updates with each advance in time. Although one processor doesn't have enough associated memory to hold the grid data for 10 blades, it can handle the grids for one blade, greatly reducing the need to swap data back and forth from disk to memory as the computation advances in time.
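Here is a minimal sketch of the one-blade-per-processor decomposition, written with mpi4py for illustration; the original was custom code for the CRAY T3E, and the per-blade "solve" below is only a placeholder:

```python
# Sketch of the one-processor-per-blade decomposition, using mpi4py for
# illustration. Each rank owns the grid for one blade, so its working set
# stays cache- and memory-resident; only boundary (wake/inlet) data moves
# between neighbors each step. Run with e.g.: mpiexec -n 8 python blades.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, nblades = comm.Get_rank(), comm.Get_size()

grid = np.zeros((64, 32))          # this blade's flow grid (placeholder values)
grid[:] = rank                     # stand-in for an initial condition

left = (rank - 1) % nblades        # neighboring blades in the row (periodic)
right = (rank + 1) % nblades

for step in range(100):            # time-marching loop
    # Exchange boundary columns: send our outflow edge to the right neighbor,
    # receive the left neighbor's outflow edge as our inflow boundary.
    inflow = np.empty(64)
    comm.Sendrecv(np.ascontiguousarray(grid[:, -1]), dest=right,
                  recvbuf=inflow, source=left)
    grid[:, 0] = 0.5 * (grid[:, 0] + inflow)   # placeholder boundary update

    # Placeholder for the real per-blade flow solve -- the point is that
    # this heavy work touches only rank-local memory.
    grid[:, 1:] = 0.99 * grid[:, 1:] + 0.01 * grid[:, :-1]

if rank == 0:
    print(f"{nblades} blades advanced 100 steps, one blade per processor")
```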
For a test case on an SGI Challenge, the parallel code runs 15.6 times faster on 10 processors than on one. The CRAY T3E delivers similar speedups and, notes Subramanya, will perform markedly better than the SGI Challenge for larger simulations that require more processors than the SGI machine has.
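The arithmetic: speedup divided by processor count gives parallel efficiency, and anything above 1.0 is superlinear.

```python
# The arithmetic behind "superlinear": efficiency = speedup / processors.
# Above 1.0, each processor works better than it did alone -- here because
# the per-blade working set fits in local memory and cache.
t_serial, t_parallel, nproc = 15.6, 1.0, 10   # normalized timings from the text
speedup = t_serial / t_parallel               # 15.6x on 10 processors
efficiency = speedup / nproc                  # 1.56 -> superlinear
print(f"speedup {speedup:.1f}x, parallel efficiency {efficiency:.2f}")
```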
The parallel software is not only faster, it's also more accurate — due to modifications built into the new code. "In the serial code," says Subramanya, "you're using an outdated boundary condition part of the time. This is not physically accurate, but you were forced to do it, because you had to compute sequentially. In parallel, we solve everything at the same time, so we get more realistic results."
On test cases, the new code gives good agreement with data from experimental studies. "We hope to reduce the turbine experimental investigations to a minimum," says Cizmas, who believes that simulations with the new software will save money by scaling down the experimental work.
This visualization represents simulation of a turbine with three stages of stators alternating with rotors. Color corresponds to entropy, which shows the wake shedding from each stage and interacting with the next stage.
Simulations & Design
Larger simulations now underway on the T3E, involving actual turbine blade configurations rather than test cases, should improve turbine design in several ways. Experiments have shown that turbine efficiency varies according to the radial spacing between blades. "A wake sheds from the blade's trailing edge and interacts with successive rows of blades," says Subramanya. "This interaction can be constructive or destructive. If it's destructive, then changing the relative position of a blade can improve efficiency."
Accurate simulations also predict hot spots. "Turbines operate at very high temperatures," says Cizmas, "and the flow patterns create hot gas cells that increase blade temperature in certain areas." Pinpointing these hot spots helps with the design of cooling conduits and blade coatings. Simulations can also predict unsteady forces on the blades, adds Cizmas, that cause "flutter" leading to mechanical failure.
In current and future work, Cizmas expects to extend the new software from 2D to 3D simulations, and to use it to optimize the shapes of turbine blades, a factor that also — along with relative positioning — can affect efficiency. "We are no longer limited," says Cizmas, "in the number of blades and rows we can simulate. Time is no longer a problem. Our only restriction is the number of processors. From my point of view, what I'm waiting for now is parallel systems with more processors."
Researchers: Paul Cizmas, Westinghouse Science and Technology Center; Ravi Subramanya, PSC.
Hardware: CRAY T3E
Software: User-developed code.
Related Material on the Web:
PaRSI: Parallel Rotor Stator Interaction
Westinghouse Science and Technology Center
Projects in Scientific Computing, PSC's annual research report.
Clean Power
Engineering Knowledge for 21st Century Turbines
On the voyage home to Ithaca, Odysseus and his sailors had to navigate between Scylla and Charybdis — dangerous rocks and a whirlpool. Maneuver to avoid one peril and you risk the other. Researchers at the National Energy Technology Laboratory know the feeling. Their job is to steer the course of environmental stewardship in the face of accelerating demands for electrical power around the globe.

"America is running short of electricity," said a front-page story in the Wall Street Journal a few months ago (May 11, 2000). The information age — temperature controlled machine rooms and offices — and surging appliance purchases have juiced power consumption. Summertime U.S. peak demand is now about 700,000 megawatts, up from 525,000 in 1989, a rise that threatens to outstrip capacity, now about 780,000 megawatts. Complicating matters, deregulation of the electric utility industry has spawned uncertainty about the return on investment in new plants.
Adding fuel to the fire, literally, developing countries are a burgeoning market for energy. One recent projection holds that over the next few years 300 megawatts of new electric generating capacity will be installed somewhere in the world each day!
What about acid rain? What about greenhouse gases? These and other environmental imperatives drive research that will provide clean power options for the world's energy. At present, 85 percent of U.S. consumption and 90 percent of the world's comes from fossil fuel, and as the president's commission of science and technology advisors reported last year, the current best opportunity for environmental progress in power generation is high-efficiency, low-emission combustion.
"The challenge is to convert fuel to energy without creating pollutants," says George Richards, who leads NETL's combustion dynamics team. The workhorses of electrical-power generation are the jet-engine-like gas turbines that convert fossil fuel into megawatts of electricity, and the mission of Richards' team is to help develop the engineering knowledge to make 21st century turbines more efficient, cleaner and cheaper to operate. In a recent series of simulations at the Pittsburgh Supercomputing Center, they've made progress toward this goal.
Lean, Pre-Mixed Combustion
The NETL Dynamic Gas Turbine Combustor
The power industry began to shift its new installations toward low-emission technology about 10 years ago, says Richards, and many new power plants employ low-emission turbines. The key to these advanced systems is "lean, pre-mixed combustion" — mixing the fuel, typically natural gas, with a relatively high proportion of air prior to burning. This substantially reduces nitrogen oxide pollutants (known as NOx) while allowing high-efficiency operation. The high efficiency reduces carbon dioxide, a major greenhouse gas, and lowered NOx alleviates smog and decreases other byproducts that affect air quality.
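How "lean" is quantified: the equivalence ratio compares the actual fuel/air ratio to the stoichiometric one for methane in air. The premixer flows in this sketch are illustrative:

```python
# Sketch: the equivalence ratio phi for a methane-air premix, from the
# stoichiometry CH4 + 2 O2 -> CO2 + 2 H2O, with air taken as 3.76 mol N2
# per mol O2. The mass flows are illustrative, not NETL's rig.
M_CH4 = 16.04                      # g/mol
M_air = 28.97                      # g/mol (mean molar mass of air)

# Stoichiometry: 1 mol CH4 needs 2 mol O2, i.e. 2 * 4.76 mol of air.
afr_stoich = (2 * 4.76 * M_air) / M_CH4        # ~17.2 kg air per kg fuel

mdot_fuel, mdot_air = 1.0, 30.0    # kg/s, illustrative premixer flows
phi = (mdot_fuel / mdot_air) * afr_stoich
print(f"stoichiometric air/fuel ratio: {afr_stoich:.1f}")
print(f"equivalence ratio phi = {phi:.2f}  (phi < 1 is lean)")
# Low-NOx premixed combustors run well below phi = 1, close to the lean
# blowout limit -- which is exactly why stability becomes an issue.
```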
But a nasty problem bedevils these systems. With a lean-fuel mix, the combustor flame burns on the thin edge of not having enough fuel to keep burning, and a phenomenon analogous to a flickering candle sets up pressure oscillations — like a series of very rapid small explosions rather than a steadily burning flame. These oscillations can resonate with the vibration modes of the combustion unit and, literally, shake it to pieces.
"This instability is a major issue that every turbine developer using pre-mix combustion has to face," says Richards. "It comes up in every conceivable stage — in development, during engine commissioning, in engine-fielding applications. It comes up in permitting these engines and in keeping them operating. It's a very tricky problem. I'm happy to say that there's been a lot of progress, and we can now see fielded engines using these incredibly clean combustors. But we also know that avoiding instability places very tight restrictions on how the engine can operate. Adding desirable features, like fuel flexibility, or a wider operating range, can lead to the same old problem."
Swirl Vanes
To zero-in on the problem, NETL researchers conducted extensive experiments with their Dynamic Gas Turbine Combustor. This state-of-the-art test facility makes it possible to adjust parameters involved in turbine-combustor design — such as location of the fuel injector relative to the flame — and to observe and measure what happens.
The experiments revealed an unexpected result. Changing the location of a nozzle component called the "swirl vane" affected the pressure oscillations. The swirl vane — so-called because it swirls the air flow to create aerodynamics that mix the fuel and air — sits upstream of the fuel injector. In experiments comparing two swirl-vane locations, with other parameters unchanged, when the swirl vane was moved two inches farther upstream the pressure oscillations virtually disappeared. Why?
What to Measure?
The objective, stresses Richards, is to understand the physics behind the observed data, so it can be incorporated rationally into turbine design. Moving the swirl vane gave better performance in one set of conditions, but the data was inconclusive when it came to explaining the results. Prior research suggested that the time lag between when fuel is injected and when it burns is a key factor for the oscillations, but presumably, since the fuel injector didn't move, the swirl vane would have little or no effect on this.
Fuel Mixing in the Combustor Nozzle
"You can place the swirl vane either closer to the flame or farther away," says Richards, "and it makes a difference. But we didn't know why. We had some conjectures, and we tested those, but we still couldn't prove what was going on. There's subtle effects, like decay of turbulence and swirling flow, that impact the important time scales — multiple, simultaneous processes, and you can't interpret the experimental data without quantifying the contributions from these simultaneous events."
To sort out the details, Richards and his colleagues turned to simulations on PSC's CRAY T3E. In recent years, the NETL team worked with consultants for FLUENT, commercial fluid-dynamics software, to develop 3D modeling that realistically simulates experiments in the experimental combustor. In summer 1999, with help from PSC scientists, they adapted FLUENT to the CRAY T3E and ran a series of simulations replicating the experiments.
Each computation — one for each experiment — required about a week of computing on 20 T3E processors to simulate 30 milliseconds of combustion. Each produced 20 gigabytes of compressed data, an enormous amount of information, which itself created a huge post-processing task.
When the results were in, they told an interesting story: The aerodynamics in the nozzle are such that moving the swirl vane, with no change to the fuel injector, significantly affects the time lag between injection and burning. In the two cases of interest, moving the swirl vane two inches upstream slows this lag time by a millisecond, and that millisecond makes a big difference in combustion stability.
Time Lag with Change in Swirl-Vane Location
"We looked at the simulations," says Richards, "and said 'ah-ha.' It was obvious. The change in this time lag from the point of injection is what we need to measure. That's a whole different universe to work in from where we were, a definite conclusion. It helped us set up the next set of experiments in which we've been trying to make a verifiable measurement of those time scales. And we've made some progress on that."
Flame Volume & Reaction Rate
Along with focusing their analysis of the swirl-vane results, the CRAY T3E simulations also provide the NETL team with a way to look deeper yet at the physics of turbine combustion. A key factor in combustor stability is the flame's reaction rate, the speed of burning, which varies with time. The NETL group would like to know what drives this variable. Does the volume of the flame change, such as when a candle-flame flickers, or does the flame volume stay constant as the burning-rate varies?
"We don't know which occurs in practical systems," says Richards. "We want to use these simulations and identify the dominant mechanism. It's probably some of each, but is it 90/10, 50/50 or 20/80? We may find that it's different under different conditions. That's where the simulations really help. If we show that you go from one mechanism to the other in the same combustor, depending on operating conditions, you'd have to do different things to make the system quiet. With simulations, and going back and forth iteratively with the experiments, we're learning a lot about fundamental physics."
Researchers: George Richards, National Energy Technology Laboratory
Hardware: CRAY T3E
Software: FLUENT
Related Material on the Web: Super Computing Science Consortium; PSC-NETL Collaborative Projects (1999-2000) and Scientific Visualizations; National Energy Technology Laboratory; Studying Instability in Lean Premixed Combustion, FLUENT Inc.
Production Credits: Writing: Michael Schneider. HTML Layout/Coding: R. Sean Fulton
All the World’s a Stage (That Includes the Internet)
While Mr. Raphaeli, known professionally as Magic Roy, has been entertaining people with card tricks and sleight-of-hand since he was 5, he does not perform at birthday parties or casino showrooms.
Instead, Mr. Raphaeli’s stage of choice is the Internet, where he has posted 30 short video clips to Metacafe, a Web site that pays video creators based on how many viewers their work attracts. So far, Mr. Raphaeli has earned more than $13,000 from the site, where his most popular card trick has been seen 1.4 million times.
As video sites look for ways to attract higher-quality content, they are dangling cash, usually offering to cut creators in on the advertising revenue their work generates.
Revver, the Los Angeles company that pioneered the practice, shows a still-frame ad at the end of a video, and funnels money to the creator every time a viewer clicks on the ad to visit the advertiser’s Web site. Metacafe inserts a similar still-frame graphic at the end of a clip; it pays creators $100 when their video has been viewed 20,000 times, and $5 for every 1,000 additional views.
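By the terms described, the arithmetic for a single Metacafe clip looks like this; whether the first 20,000 views also earn the per-1,000 rate is an assumption of the sketch:

```python
# Payout arithmetic as described in the article: $100 once a clip passes
# 20,000 views, then $5 per 1,000 further views. How Metacafe counts the
# first 20,000 views is an assumption here.
def metacafe_earnings(views: int) -> float:
    if views < 20_000:
        return 0.0
    return 100.0 + 5.0 * ((views - 20_000) // 1_000)

# Mr. Raphaeli's most popular card trick, at 1.4 million views:
print(f"${metacafe_earnings(1_400_000):,.0f}")   # -> $7,000
```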
Other sites, like TurnHere and ExpertVillage.com, offer upfront payments for videos on assigned topics, like a tour of Golden Gate Park in San Francisco, or an instructional video about skydiving. And in January, Chad Hurley, a YouTube co-founder, announced at the World Economic Forum that his site, now owned by Google, was exploring similar ways to “reward creativity.”
While the sums involved are not yet impressive enough to lure established TV or movie producers into the world of Internet video, they can be significant for people on the fringes of the entertainment industry, or those who see video production as a sideline to their day job, like Mr. Raphaeli.
He had originally planned to sell DVD compilations of his best tricks, before discovering that he could earn more, and reach a larger audience, by posting his videos online.
Kent Nichols, co-creator of Ask a Ninja, a series of comic videos in which a cranky ninja responds to viewer questions, says he managed to earn more than $20,000 last year on Revver. Mr. Nichols now has an agent, who recently helped him negotiate what he says is a more lucrative advertising deal with another company.
Ahree Lee, a graphic designer in San Francisco, earned “a couple of thousand dollars” in 2006 when a short film she had made became a hit on AtomFilms.com. It featured a fast-paced succession of still photographs she had taken of her face over several years, set to music composed by her husband.
More than a dozen sites now offer payments for videos that range from short snippets to full-length feature films. Some, like Revver, Metacafe and Manhattan-based Blip.tv, generate money from advertising; others, like Brightcove, DivX Stage6 and Cruxy, allow a video’s creator to set a price viewers must pay to view it, and exact a small transaction fee.
Most of the sites require that videos be uploaded to them, rather than sent on a DVD or a tape. When a video is viewed enough times to start generating revenue for its creator, the money is typically transferred to a PayPal account set up by the creator.
But the biggest challenge is attracting an audience.
A co-founder of Metacafe, Arik Czerniak, says his site has around 100,000 people who like reviewing new videos. “They’re practically video addicts,” he said. “If a video is interesting or engaging, it will get very high ratings from them.”
Videos that win raves can wind up on the site’s home page, where, Mr. Czerniak said, “a video can get 500,000 views in a single afternoon, all without you really worrying about marketing your video.”
Others say that a little self-promotion can’t hurt. “We have a MySpace page and a Facebook group,” said Matt Wyatt, a member of the Los Angeles comedy troupe Invisible Engine, referring to two popular social networking sites where he posts the group’s latest videos. “We also e-mail a link to sites like StupidVideos.com and Transbuddha.com — sites that can help a video take off.”
When these sites choose to “embed” a video, using a bit of HTML code to weave it into one of their pages, the advertising still appears and the view is tallied, which generates revenue for the creator.
Some videos manage to catch fire with little effort. Fritz Grobe, a juggler who lives in Buckfield, Me., still cannot explain why a video he posted last June became so popular. It featured an array of two-liter bottles of Diet Coke that he and his partner, Stephen Voltz, detonated using Mentos candies.
“It sparked an instant reaction,” Mr. Grobe said. “Two days after we’d posted it on Revver, ‘The Late Show With David Letterman’ called.”
Burn Again Turbine
Light, heat, kitchen appliances, television and stereo, air conditioners, computing — life as we know it runs on electricity, and electricity comes from turbines. Think of them as jet engines bolted to the floor. Rather than thrust to lift an aircraft off the ground, these turbines produce powerful rotation that drives generators to produce megawatts of electricity, which flows through wires into your home.
The process starts with fossil fuel, still the raw energy source for nearly 90 percent of electrical power worldwide. Two kinds of turbines, steam and gas, share the load. For steam turbines, coal and fuel oil heat boilers from which pressurized steam turns the windmill-like turbine blades. In gas turbines, combustors ignite the fuel and blast hot, pressurized exhaust gas to do the turning work.
As a cleaner-burning fuel, natural gas is favored for low-emission, nearly pollutant-free turbines of the present and foreseeable future. High efficiency — as complete as possible conversion of raw energy into turbine rotation — is the key not only to low CO2 emission, but also to the cost of electricity. Small gains in efficiency that slightly reduce cost per megawatt translate to huge savings overall, and turbine engineers measure efficiency in tenths of a percent.
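Here is a worked example of why tenths of a percent matter, with every input a labeled assumption rather than a figure from the article:

```python
# Worked example: fuel savings from one tenth of a point of efficiency.
# Assume a 400 MW gas-fired plant running 8,000 hours/year whose efficiency
# improves from 58.0% to 58.1%, with fuel at a hypothetical $3 per gigajoule.
# All inputs are illustrative assumptions.
P = 400e6                        # electrical output, W
hours = 8000.0                   # operating hours per year
price = 3.0                      # $ per GJ of fuel (assumed)

energy_out_J = P * hours * 3600.0
fuel_saved_GJ = energy_out_J * (1 / 0.580 - 1 / 0.581) / 1e9
print(f"fuel saved: {fuel_saved_GJ:,.0f} GJ/yr -> about "
      f"${fuel_saved_GJ * price:,.0f}/yr from one tenth of a point")
```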
One new way of thinking about gas turbines is to let combustion carry over from the combustor into the turbine and even to inject additional fuel into the turbine for an extra kick of power, roughly analogous to a jet engine afterburner. This potentially would allow more complete burning and more overall work from the fuel as well as a healthy power boost. The idea is called a "turbine-combustor" and actually, says Paul Cizmas of Texas A&M, it's an old idea revived for new times.
"The idea started in the 1960s, but we didn't have the technology to test it," says Cizmas, a former senior scientist at Westinghouse in Pittsburgh, "and until two or three years ago, it was considered bad to have flame in the turbine, mainly because of the problem of cooling the blades. Then we realized we already design the first row of blades, the vanes, as if the flame is there. The problems come in the rows after that, and recent studies have shown we need to worry about how to cool these rows anyway, even if we don't have turbine combustion."
In September 2000, the U.S. Department of Energy (DOE) Strategic Center for Natural Gas selected Siemens-Westinghouse Power Corp. to study the turbine-combustor idea, with Texas A&M on board to focus on computational simulation. "There's an opportunity to significantly reduce the cost of an engine," says Tom Lippert, manager of science and technology center combustion programs at Siemens-Westinghouse, "and to reduce the cost of electricity."
A necessary step out of the starting blocks is to develop a reliable way to simulate the physical processes. "This could be a different combustion process altogether from what we're used to," says Lippert. "You have to have analytical tools that allow you to combine combustion kinetics with aerodynamics. This hasn't been done before, at least in this fashion."
In work a few years ago at the Pittsburgh Supercomputing Center, Cizmas developed software that simulates the complex flows involved in turbine blade design. By relying on the computing power of massively parallel systems, Cizmas's approach proved to be both faster and more accurate than prior approaches. In 2001, through the Super Computing Science Consortium, a southwest Pennsylvania-West Virginia regional partnership that links DOE's National Energy Technology Laboratory with PSC, Cizmas used the 512-processor CRAY T3E at Pittsburgh to develop and test a computational method for the much more complex flow problems involved in a turbine-combustor.
Unexplored Territory
As they began this project, Cizmas and his colleague, Dragos Isvoranu of Politehnica University Bucharest, soon realized they had an interesting opportunity. A literature search revealed no prior data on combustion within a turbine.
The first challenge was not a new one — simulating flow in the alternating rows of stationary and rotating blades, vanes and rotors, that characterize modern power turbines. The passageway between vanes is like a nozzle that speeds up the hot gas and shoots it toward the spinning rotors. Each vane-rotor series is called a stage, and operational turbines have many stages and hundreds of blades. Although these flows are complicated, Cizmas had already tackled this problem with success.
The added complication is combustion, which changes everything. By itself, combustion has been modeled, but not when the flow is this complicated. "Flow in the turbine," says Cizmas, "is like a nightmare because of the rotor-stator interaction. Now on top of this, we want to have a stable flame. Flow in the combustor is almost steady, if not steady, but in this case, it's very unsteady. So you have the problem not only of correctly simulating the combustion, but also unsteadiness in the combustion — a very challenging problem."
Added heat not only increases temperature, it also can radically change the flow patterns. Flow from the combustor alone induces "hot streaks" downstream in the turbine, which affect blade life and blade design. With the added stress of combustion within the turbine, these problems are exacerbated.
As a practical matter, computer simulation is the only hope for gaining enough understanding to arrive at a design concept. "The question is how should we inject fuel," says Cizmas, "at what location, what pressure, what quantity? Should it be continuous or pulsed? The possible variations are infinite. Experimental investigation with scale models would take years."
When the Fuel Hits the Rotor
The numerical challenge was to develop an algorithm that coupled the equations for fluid flow with the "species equations" for each chemical constituent of the combustion reaction — natural gas (methane), oxygen, carbon dioxide, carbon monoxide, water. Cizmas and Isvoranu used an efficient approach — a fully implicit, finite-difference method — that employs a moving set of mesh-like grids to subdivide the space around the turbine blades. In these initial computations, the blades are represented as a uniform cross-section extended from the turbine hub.
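A stripped-down sketch of what a species equation adds: fuel carried by the flow and consumed by one-step Arrhenius kinetics, here in 1-D with prescribed velocity and temperature and explicit time stepping, unlike the fully implicit, multi-species coupled solver described above. With these illustrative constants it even shows the delayed-ignition flavor, with fuel surviving the cool region and burning rapidly once it reaches the hot zone:

```python
# Sketch of a single "species equation": fuel mass fraction Y advected through
# a passage and consumed by one-step Arrhenius kinetics. Velocity and the
# temperature field are prescribed; all constants are illustrative.
import numpy as np

nx, L, u = 400, 1.0, 100.0           # cells, passage length (m), velocity (m/s)
dx = L / nx
dt = 0.4 * dx / u                    # CFL-limited explicit step

x = (np.arange(nx) + 0.5) * dx
A_pre, Ta = 2.0e6, 15000.0           # pre-exponential (1/s), activation temp (K)
T = 800.0 + 1200.0 / (1.0 + np.exp(-(x - 0.4) / 0.02))   # hot zone past x ~ 0.4

Y = np.zeros(nx)                     # fuel mass fraction
for step in range(4000):             # march to steady state
    adv = -u * (Y - np.roll(Y, 1)) / dx          # first-order upwind transport
    burn = -A_pre * Y * np.exp(-Ta / T)          # Arrhenius consumption
    Y = Y + dt * (adv + burn)
    Y[0] = 0.05                      # fuel injected at the inlet

print(f"fuel remaining at exit: {Y[-1]:.4f} of 0.05 injected")
print(f"90% consumed by x = {x[np.argmax(Y < 0.005)]:.2f} m")
```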
Using the CRAY T3E to put their software through its paces, the researchers modeled a one-stage turbine-combustor with 32 vanes and 49 blades — a relatively simple configuration that corresponds to an existing turbine. The work is shared among processors, one for each blade. For this initial test, they modeled fuel injection with basic parameters — through a single small hole at the trailing edge of the vanes, at low velocity, and relatively high temperature and pressure.
As Cizmas expected, the simulations show delayed ignition. In the space between blades, the fuel rapidly gains heat and mixes with oxygen and then ignites as it hits the rotating blades. Ignition occurs with unexpected intensity, however, almost like an explosion, says Cizmas. "At the moment when the wake of fuel from the stator hits the rotor, you see rapid acceleration of the reaction. We were expecting strong coupling between the flow and combustion, but we didn't know we'd see this quite important increase in reaction speed."
To assess accuracy, Cizmas ran the simulations with three different grid resolutions — coarse, medium and fine — with nearly 50,000 processor hours on the T3E. Analysis showed good agreement between the high and medium resolution grids, a strong indication that the modeling is accurately representing the physical process. An innovative correction algorithm for the species computation, says Cizmas, appears to work well. Coupling between the equations for flow and combustion raised a theoretical question about reaction thermodynamics, a specialty of Isvoranu, which the researchers are looking at and which may necessitate a revised approach.
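The coarse/medium/fine comparison is a standard grid-convergence study. Its arithmetic looks like this, with synthetic values in place of the study's data:

```python
# Sketch of the arithmetic behind a coarse/medium/fine grid study: estimate
# the observed order of accuracy and a grid-converged value by Richardson
# extrapolation. The three sample values are synthetic, not the study's data.
import math

r = 2.0                        # grid refinement ratio between levels
f_coarse, f_medium, f_fine = 1.120, 1.058, 1.041   # e.g. a peak reaction rate

p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
f_exact = f_fine + (f_fine - f_medium) / (r ** p - 1.0)

print(f"observed order of accuracy p ~ {p:.2f}")
print(f"Richardson-extrapolated value ~ {f_exact:.4f}")
# Medium and fine agreeing closely (small f_medium - f_fine) is the kind of
# evidence cited above that the resolution is adequate.
```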
Cizmas expects that the Siemens-Westinghouse research program will lead to operational turbine-combustors within five years with implementation in power plants in ten years. One possibility envisioned by this research is a turbine-combustor unit that eliminates the combustor as a separate component. "This is an opportunity," says Lippert, "to significantly reduce the cost of an engine. If you integrate the combustion system into the blade path, you eliminate a big piece of hardware."
A more immediate use, though, is likely to come from adding turbine-combustion to existing turbines, creating a flexible power reserve. "If there's a point in the day that the plant needs to produce more power," says Cizmas, "it can be done by turning on combustion in the turbine." The only current option for a sudden power crunch is to bring an additional turbine on-line, potentially producing more power than required, which gluts the bidding market for power and lowers profitability.
Cizmas is preparing his turbine-combustor software for a more comprehensive series of simulations on LeMieux, PSC's 3,000-processor terascale system. These studies will incorporate a fully 3D representation of the blades, with one processor assigned to each of 50 cross-sectional slices per blade to capture the finely detailed aerodynamics of blade curvature.
Though results are preliminary, the first round of simulations offers new understanding to help point the way ahead. The accelerated ignition at the rotors suggests the fuel may need to be pulsed in time with rotor frequency. "Other issues," says Cizmas, "include what angle to inject. We're looking at a variation of angles to increase turbulence, which helps mixing. And at what velocity should we inject? What temperature and pressure? How many injection points per blade? How should they be spaced? We're also looking at fuel composition. Should we use pure methane, or methane mixed with air? And what about using hydrogen? We're going to be busy for a while."
Researchers:
Paul Cizmas, Texas A&M University
Hardware:
CRAY T3E
Software:
User-developed code.
Related Material on the Web:
Turn, Turn, Turn, Projects in Scientific Computing, 1998
Author:
Michael Schneider, Pittsburgh Supercomputing Center
Photo Credits:
Kim D. Miller
Revised: October 21, 2002
URL: http://www.psc.edu/science/2002/cizmas/burn_again_turbine.html