IPPD THROUGH ROBUST DESIGN SIMULATION FOR AN AFFORDABLE SHORT HAUL CIVIL TILTROTOR

Mr. Andrew P. Baker*, Graduate Research Assistant
Dr. Dimitri N. Mavris†, Manager of ASDL
Dr. Daniel P. Schrage†, Director, NRTC CoE

Aerospace Systems Design Laboratory (ASDL)
School of Aerospace Engineering
Georgia Institute of Technology
Atlanta, GA 30332-0150
Abstract

Beyond the Bell/Boeing 609, the next step in civil tiltrotor evolution will most likely be a larger-capacity vehicle (~40 passenger class) similar to NASA's vision of a Short Haul Civil Tiltrotor (SHCT). This vehicle will be designed, built, and operated in an era shaped by today's increased emphasis on affordability. This paper discusses the authors' views on the subject and outlines the steps taken to develop a new methodology which will allow a true assessment of the affordability of such a SHCT. Affordability will not be defined by cost metrics alone. Instead, it will be based on the concept of value and tradeoffs between cost and mission effectiveness, as measured by maintainability, reliability, safety, etc. In addition, the motivation for this shift in design philosophy and the resulting need for knowledge to be brought forward in the proposed methodology is reviewed. Furthermore, this shift in knowledge calls for a paradigm shift in the design evolution process, based on the realization that decisions made during the early design phases are not deterministic in nature and should therefore be handled probabilistically. The approach taken acknowledges this need and defines a suitable probabilistic design environment. The fundamental building blocks of this method are also outlined and discussed, including key concepts, tools, techniques, and the approach taken to implement the process.
Motivation

“In a broad sense, the most important benefit needed today in the helicopter business -- the most exciting man-on-the-moon project -- is dramatically reduced cost, or improved affordability.”¹ This excerpt, taken from Dr. David Jenney's 1996 Alexander A. Nikolsky lecture, holds true for the entire helicopter industry and is particularly fitting for the civil tiltrotor concept. For a new concept vehicle (at least in the minds of the airlines and the public), it is imperative that affordability be addressed in all phases of the vehicle's design. The notion, advocated by Dr. Jenney, that affordability must be defined in a “broad sense” is crystallized in the new design methodology presented in this paper. With the Bell/Boeing 609 slated for first delivery in 2001, the next step in civil tiltrotor progress is exemplified by NASA's Short Haul Civil Tiltrotor (SHCT), which will be used as the baseline vehicle (Figure 1). Through this new methodology, the SHCT will benefit from upgrades to the synthesis/sizing code which will provide a better representation of the various contributing disciplines, including the economic module. In addition, this vehicle will benefit from the methodology's ability to account for the uncertainty associated with new technologies, which are expected, and probably required, for vehicle success.
Acronyms

IPPD  Integrated Product/Process Development
LCC   Life Cycle Cost
RDS   Robust Design Simulation
RSM   Response Surface Methodology
RSE   Response Surface Equation
DOE   Design of Experiments
CCD   Central Composite Design
OEC   Overall Evaluation Criterion
FPI   Fast Probability Integration
CDF   Cumulative Distribution Function
* Graduate Student Member, AHS
† Faculty Member, AHS
Introduction

At its most basic level, Design for Affordability entails comparing the benefits derived from a system versus the costs necessary to achieve these benefits.
Presented at the American Helicopter Society 53rd Annual Forum, Virginia Beach, VA, April 29 - May 1 1997. Copyright © 1997 by the American Helicopter Society, Inc. All rights reserved.
Under the cost category, defining acquisition cost as the metric or evaluation criterion for system cost is inadequate. Thus, the evaluation of a system's cost has shifted from the simple acquisition cost metric to include costs associated with its entire life cycle, such as operation and support costs as well as retirement and disposal costs³. With this emphasis on life cycle cost (LCC), it is necessary to appreciate the relationship between cost, knowledge, and freedom in the context of system design. Figure 2 shows these relationships for today's design process as well as the desired relationships of the future design process. As Figure 2 illustrates, a large portion of a system's LCC is committed, or “locked in,” by the decisions made in the early design stages of today's design approach. Yet knowledge of the system is limited during these phases. In addition, freedom to make design changes rapidly vanishes in this approach. Therefore, the early design stages present the only opportunity for the designer to efficiently and inexpensively leverage the cost and design freedom available.

Figure 1: CTR2000 Civil Tiltrotor

Figure 2: Cost, Knowledge & Freedom Relations (Adapted from Reference 14). [The figure plots committed cost, knowledge, and design freedom from 0% to 100% across the phases Concept, Preliminary Design, Analysis and Detail Design, Prototype Development, Redesign, and Product Release, comparing today's design process with the future design process.]

The tools at the disposal of the conceptual designer, however, are not geared toward this end. The primary tool utilized is the synthesis/sizing code, which is usually historically based or limited to first-order analyses and may include some kind of optimization routine. This process provides deterministic, evolutionary solutions for a small number of design alternatives which have few links to a truly affordable system. Some of the upgrades needed by the synthesis/sizing code to allow for a more realistic, representative assessment of system affordability include: 1) linking the code to a cost model that incorporates the needs of the manufacturer and the operator as well as manufacturing processes; 2) increasing the fidelity of the discipline-level analysis modules within the synthesis code; 3) eliminating weight-based cost relationships and moving toward activity- or process-based cost estimating relationships; 4) incorporating risk/readiness assessment for the infusion of new technologies; 5) addressing issues of code fidelity probabilistically; 6) updating the sizing scaling rules inherent to synthesis into more vehicle-specific ones; and 7) bringing life cycle considerations upstream to the conceptual phase, where they can be treated as constraints.

By bringing knowledge forward, there is a fundamental change in the design process. The deterministic approach is no longer applicable or even desired; the early design phases become probabilistic in nature. Noise variables in the economic model, such as fuel prices and load factors, are beyond the control of the designer and must be modeled with probability distributions if their statistics are known, or with other stochastic methods (e.g., fuzzy logic) if they are not. The infusion of technology to enhance affordability, enhance capability, or avoid “show stoppers” must also have an associated uncertainty. This uncertainty is again probabilistic or stochastic in nature since it relies on an assessment of the readiness level of the proposed technology⁴. Even the fidelity of the discipline analyses is fundamentally stochastic in nature, since there is always some uncertainty associated with an analysis module. The proposed design methodology presented in this paper acknowledges this need and calls for the development of a probabilistic design environment which will ultimately provide the ability to truly assess the affordability of a complex system.
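The probabilistic treatment of noise variables described above can be sketched with a toy Monte Carlo propagation. Everything in this sketch is an illustrative assumption — the cost model, its constants, and the fuel-price and load-factor distributions are hypothetical stand-ins, not the paper's economic module:

```python
# Hypothetical sketch: propagate noise-variable distributions (fuel price,
# load factor) through a toy operating-cost model to build an empirical CDF
# of cost, rather than a single deterministic cost number.
import random

random.seed(0)

def operating_cost(fuel_price, load_factor, block_fuel=1000.0, fixed=500.0):
    """Toy per-flight cost model; all constants are illustrative only."""
    return (fixed + block_fuel * fuel_price) / load_factor

samples = sorted(operating_cost(random.gauss(1.0, 0.2),      # $/unit fuel
                                random.uniform(0.55, 0.85))  # seats filled
                 for _ in range(10000))

# Empirical CDF evaluated at one point: probability cost stays under budget.
budget = 2500.0
p_below = sum(c <= budget for c in samples) / len(samples)
print(f"P(cost <= {budget}) = {p_below:.2f}")
```

Because the output is a distribution, a designer can ask probabilistic questions ("what is the chance the cost target is met?") instead of relying on a single deterministic estimate.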
Affordability

Buried within the design for affordability methodology lies a key observation about the relationship between improvement and affordability. Improvements in the design of complex systems, whether at the technical/discipline level or the methodology/process level, must be linked to some tangible assessment of a vehicle's affordability. Thus, even the current definition of affordability as the minimization of a system's life cycle cost is still lacking. Design for Affordability represents a paradigm shift where design and evaluation of a system is no longer dictated solely by mission capability or cost in isolation. Instead, it is a robust design that balances mission capability with other system effectiveness attributes while keeping cost under close scrutiny. This balance between benefit and cost is the main foundation behind design for affordability, and it may be simply measured by the ratio of the benefits provided by the product or service to the cost of achieving those benefits. To identify the disciplines/sciences needed to measure and predict affordability, one must examine which attributes contribute to overall system effectiveness. The approach taken here is based on the idea that the only way to measure or evaluate total system effectiveness is through the identification and inclusion of all key contributing attributes. An example breakdown appears in Figure 3.

Figure 3: System Effectiveness Chart

With this breakdown in hand, an inclusive metric for affordability can be postulated; it is defined as:
Affordability = Mission Effectiveness / Cost of Achieving This Effectiveness   (1)

where Mission Effectiveness = k1(Capability) + k2(Dispatch Reliability) + k3(Safety/Dependability)
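As a concrete sketch of Equation (1), the metric can be computed from weighted attribute scores. The weights k1–k3 and the attribute scores below are hypothetical illustrations; the paper does not prescribe values:

```python
# Hypothetical sketch of Equation (1): the weights k1..k3 and the normalized
# attribute scores are illustrative assumptions, not values from the paper.

def mission_effectiveness(capability, dispatch_reliability, safety,
                          k1=0.5, k2=0.3, k3=0.2):
    """Weighted sum of the three system-effectiveness attributes."""
    return k1 * capability + k2 * dispatch_reliability + k3 * safety

def affordability(effectiveness, cost):
    """Ratio of mission effectiveness to the cost of achieving it."""
    return effectiveness / cost

# Example: attribute scores normalized to [0, 1] and a relative cost index.
me = mission_effectiveness(capability=0.8, dispatch_reliability=0.9, safety=0.95)
print(round(affordability(me, cost=1.2), 3))
```

In practice the ki coefficients would be tailored to a specific customer's priorities, as the text describes.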
System Effectiveness can be formally defined by selecting three discipline metrics, each of which represents one of the three key attributes. The metric coefficients, ki, provide the ability to tailor this effectiveness to the specific needs, preferences, or point of view of a customer. These attributes are directly linked to the traditional product and process disciplines such as aerodynamics, structures, propulsion, dynamics, stability and control, manufacturing, and supportability.

Key Elements Needed to Address Affordability
Integrated Product/Process Development

IPPD incorporates a systematic approach to the early integration and concurrent application of all the disciplines that play a part throughout a system's life cycle⁵. The framework for bringing knowledge forward builds on a generic IPPD methodology. The flow of design tradeoffs at different levels, with the generic IPPD methodology at the center, is illustrated in Figure 4. The timeline runs from Conceptual Design (top box) to the Manufacturing Process (bottom box) and essentially accounts for the system development process. Illustrated are three levels of parallel design trades (represented as circular iterations): system, component, and part. The right half of each circle represents system decomposition (the traditional systems engineering approach) and principally includes product design trades, while the left half represents system recomposition (the more recent quality engineering approach) and principally includes process design trades. Numerous short design iterations at the system level are sought, with an appropriate reduction in the number of iterations at the component and part levels. The long iteration around the outer loop is to be avoided, for it would indicate that the system had to be redesigned due to design incompatibilities with the manufacturing and/or other downstream processes.
Figure 4: IPPD Flow Diagram

The generic IPPD methodology developed to execute this flow in simulation is illustrated in Figure 5. It consists of an “umbrella” with four key elements identified: systems engineering methods, quality engineering methods, a top-down design decision support process, and a computer-integrated environment. Each of these elements by itself is necessary, but not sufficient, for the conduct of IPPD. Below the “umbrella” are the major activities of each element. Systems engineering originated in the aerospace/military arena to deal primarily with the performance of large-scale complex systems and is predominantly decomposition oriented and product design driven, while quality engineering originated predominantly in the commercial arena for competitiveness and is predominantly recomposition oriented and process design driven. Since “design tradeoffs” imply a decision-based approach, a top-down design decision support process is placed at the center with a set of generic decision-making steps. The arrows indicate the required interaction and iteration between the various methods/tools and the necessity of a computer-integrated environment. The primary iteration is between “System Synthesis through Multidisciplinary Design Optimization (MDO)”, which generates feasible alternatives, and “Robust Design Assessment & Optimization”, which evaluates them. The evaluated alternatives are then fed back for updated “System Synthesis” to complete the iteration.
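The primary synthesis/assessment iteration described above can be sketched as a simple loop. All function names and numbers here are hypothetical placeholders for the actual MDO and robust-assessment tools:

```python
# Illustrative sketch (all names and numbers hypothetical) of the primary
# IPPD iteration: system synthesis generates feasible alternatives, robust
# design assessment evaluates them, and the result seeds the next pass.

def synthesize(feedback=None):
    """Stand-in for 'System Synthesis through MDO': returns a few candidate
    designs as (gross_weight, cost_index) tuples around the current best."""
    base = feedback if feedback is not None else (20000.0, 1.0)
    return [(base[0] * f, base[1] * f) for f in (0.95, 1.0, 1.05)]

def assess(designs):
    """Stand-in for 'Robust Design Assessment & Optimization': return the
    alternative with the best (lowest) weighted score."""
    return min(designs, key=lambda d: 0.6 * d[0] / 20000.0 + 0.4 * d[1])

best = None
for _ in range(5):                 # several short system-level iterations
    best = assess(synthesize(feedback=best))
print(best)
```

The point of the sketch is the structure, not the numbers: many short, cheap system-level iterations, each feeding its evaluated result back into the next synthesis pass.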
Figure 5: The Georgia Tech IPPD Methodology

Figure 6: Robust Design Simulation
Robust Design Simulation (RDS)

Robust Design is defined as the systematic approach to finding optimum values of design factors which result in economical designs with low variability⁶. In this case, variability may be due to analytical tool fidelity, operational uncertainty, manufacturing tolerances, or the uncertainty and risk associated with the infusion of new technologies. A Robust Design Simulation approach has been developed which incorporates all elements essential to the success of the design into an IPPD framework. The key elements and objectives of RDS are illustrated in Figure 6. Traditionally, design is comprised of a simulation code (sizing/synthesis with or without economic analysis capability) and an optimization routine which varies the design or economic parameters to yield an “optimum” solution subject to all imposed design constraints. In this approach, a system's “affordability” was directly linked to a readily attainable performance or weight metric. Typically, a historical parametric relation linking cost to some combination of gross weight, empty weight, and/or required fuel weight was used to define system affordability.

RDS differs from this approach by accounting for both product and process contributions to the chosen evaluation criterion in the presence of risk and uncertainty. Robust Design Simulation may also account for manufacturing issues (i.e., process characteristics) and the uncertainty associated with new technologies, which can be measured in terms of confidence and readiness levels. The uncertainty associated with the system is usually represented by a probability distribution when the statistics are known, or by a fuzzy set when only limited information about the range and shape of the distribution is available. Thus, RDS does not aim at the traditional optimized point design but provides a definition of the design space dictated by the customer requirements, product/process characteristics, and environmental/design constraints in the presence of risk and uncertainty. The design solution sought may not be the optimum solution of the traditional approach, but it will be an optimal solution that is affected least by the variables outside the control of the designer. The success of the RDS approach hinges on the ability to integrate it into the design process and enhance the decision-making capability of the designer and program management. To properly represent the product and process characteristics inherent to the RDS approach, more physics-based, higher-fidelity simulation tools are required to replace the historically based, “artificially regressed” analyses inherent to the sizing and synthesis code. Figure 7 provides an overview of this process.
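The contrast between a deterministic optimum and a robust solution can be illustrated with a toy Monte Carlo comparison. The cost model, the two candidate designs, and the noise distributions below are all hypothetical:

```python
# Sketch of the RDS idea: prefer the design least affected by noise variables
# (fuel price, load factor) over the deterministic optimum. The cost model
# and distributions are hypothetical placeholders.
import random

random.seed(1)

def life_cycle_cost(design, fuel_price, load_factor):
    """Toy LCC model: 'design' trades acquisition cost against fuel burn."""
    acquisition, fuel_burn = design
    return acquisition + fuel_burn * fuel_price / load_factor

designs = {
    "deterministic optimum": (100.0, 50.0),   # cheap to buy, fuel-hungry
    "robust alternative":    (160.0, 20.0),   # pricier, less noise-sensitive
}

for name, design in designs.items():
    samples = [life_cycle_cost(design,
                               fuel_price=random.gauss(1.0, 0.3),
                               load_factor=random.uniform(0.5, 0.9))
               for _ in range(10000)]
    mean = sum(samples) / len(samples)
    std = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
    print(name, round(mean, 1), round(std, 1))
```

The robust alternative shows a much smaller spread under the noise variables, even though its nominal cost is higher — it is the solution "affected least by the variables outside the control of the designer."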
Figure 7: RDS Integration

Direct integration of the various analyses into the synthesis code would undoubtedly lead to a cumbersome and potentially unmanageable situation for the designer. This state is avoided by capturing the essence of the higher-fidelity tools: they are modeled parametrically with response surface equations (RSEs), and these RSEs are incorporated into the synthesis/sizing code. This method provides the smoothest integration of the disciplines into the design process.

Simulation / Probabilistic Tools and Techniques

Response Surface Equations

The method used to create RSEs is a statistical technique which seeks to identify and relate the relative contributions of various design variables, or factors, to the system responses. Generally, the exact deterministic relationships that govern the behavior of the measured responses with respect to the set of design variables are either too complex or unknown. Therefore, an empirical model is constructed which captures the system response as a function of the design variables. The empirical model used in this methodology is assumed to be second order in the k design variables and can be expressed in the following form:

RSE = b0 + Σ(i=1..k) bi·xi + Σ(i=1..k) bii·xi² + Σ(i=1..k-1) Σ(j=i+1..k) bij·xi·xj   (2)

The RSE is a regression curve (surface) whose coefficients are determined by applying a least squares analysis to the responses generated by a set of experiments or simulations. Although past experience with RSE generation has validated the use of a second-order polynomial model, the need for a higher-order model is possible. In that case, dependent or independent variable transformations may be attempted, or neural networks may be employed to model the required responses.

Design of Experiments (DOE)

As mentioned in the previous section, the coefficients of the RSE are determined using a carefully planned design of experiments or simulations. This approach ensures that the resulting RSE will be applicable in a sufficiently large design space without requiring an unrealistic number of simulation runs (or cases) to provide the response data for the regression analysis. The DOE chosen will dictate the number of simulation runs required based on the number of levels considered, the number of interactions modeled, and the number of variables prescribed. Table 1 illustrates the number of cases required for different DOEs at three levels. Even for 7 variables at three levels, the full factorial design represents an unrealistic number of design cases. By employing a fractional factorial DOE the required cases become manageable, with higher-order effects neglected. Fractional factorial designs neglect third- or higher-order interactions and, in the case of RSE generation, account for only main and quadratic effects and second-order interactions (see Equation 2). Table 1 also illustrates the ability to limit the number of cases by limiting the number of variables.

Table 1: Number of Cases for Different DOEs

DOE                        7 Variables   12 Variables   Equation
3-level, Full Factorial          2,187        531,441   3^n
Central Composite                  143          4,121   2^n + 2n + 1
Box-Behnken                         62          2,187
D-Optimal Design                    36             91   (n+1)(n+2)/2
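A minimal sketch of the RSE/DOE machinery: a face-centered central-composite-style set of cases (matching the 2^n + 2n + 1 count in Table 1) drives a least-squares fit of the Equation (2) model form. The "analysis code" here is a hypothetical quadratic stand-in, so the fit recovers its coefficients exactly; real responses would carry lack-of-fit error:

```python
# Sketch: fit a second-order RSE of the Equation (2) form by least squares,
# using a face-centered central-composite set of cases. The 'true' response
# below is an illustrative assumption, not an actual analysis module.
import itertools
import numpy as np

k = 2  # number of design variables

def simulate(x):
    """Stand-in for an expensive analysis run (a known quadratic)."""
    x1, x2 = x
    return 3.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x1 * x1 + 0.25 * x2 * x2 + 0.75 * x1 * x2

# Face-centered CCD cases: 2^n corner points, 2n axial points, 1 center point.
corners = list(itertools.product((-1.0, 1.0), repeat=k))
axials = [tuple(a if i == j else 0.0 for j in range(k))
          for i in range(k) for a in (-1.0, 1.0)]
cases = corners + axials + [(0.0,) * k]
assert len(cases) == 2 ** k + 2 * k + 1    # the Central Composite row of Table 1

def regressors(x):
    """Columns of Equation (2): intercept, linear, quadratic, cross terms."""
    x1, x2 = x
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

X = np.array([regressors(x) for x in cases])
y = np.array([simulate(x) for x in cases])
b, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares RSE coefficients
print(np.round(b, 3))
```

With k = 2 the model has (k+1)(k+2)/2 = 6 coefficients, which is also the minimum case count shown in Table 1's D-Optimal row; the 9-case CCD provides a few extra degrees of freedom for the regression.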