Design verification and validation in product lifecycle

P.G. Maropoulos (1) a,*, D. Ceglarek (1) b

a Department of Mechanical Engineering, University of Bath, Claverton Down, Bath BA2 7AY, UK
b Warwick Digital Laboratory, University of Warwick, Coventry, UK

A R T I C L E  I N F O

Keywords:
Design
Validation
Verification
Lifecycle management

A B S T R A C T

The verification and validation of engineering designs are of primary importance as they directly influence production performance and ultimately define product functionality and customer perception. Research in aspects of verification and validation is widely spread, ranging from tools employed during the digital design phase to methods deployed for prototype verification and validation. This paper reviews the standard definitions of verification and validation in the context of engineering design and progresses to provide a coherent analysis and classification of these activities from preliminary design, to design in the digital domain and the physical verification and validation of products and processes. The scope of the paper includes aspects of system design and demonstrates how complex products are validated in the context of their lifecycle. Industrial requirements are highlighted and research trends and priorities identified.

© 2010 CIRP.

1. Introduction

Globalisation coupled with product customisation and short time to market have spearheaded new levels of competition among manufacturers. In CIRP, the needs for design adaptability [1], the ability to develop products and services for the e-commerce era [2] and the issues of dealing with design complexity [3] have been recognised. To be successful in the global market, manufacturing companies are increasingly expanding simulation models from product and process based (value chains) to service based (value networks) by focusing on lifecycle simulations and design for product variation [4] to obtain both quality of product and robustness of processes, and to enable the validation and verification of products and processes to 6-sigma. These methods are vital to reduce process faults and facilitate efficient and effective engineering changes.

Current validation and verification-based approaches mainly focus on product conformance to specifications, product functionality and process capability. However, even the most robust systems can be subject to failures during product verification and validation.

This paper presents the concepts of validation and verification in the product lifecycle by including analysis and review of literature and state-of-the-art in: (i) preliminary design; (ii) digital product and process development; (iii) physical product and process realisation; (iv) system and network design; and (v) complex product verification and validation.

The paper starts with a summary of the scientific motivation for the review of design verification and validation. The definitions of verification and validation are then covered, including concepts and definitions arising from ISO standards as well as software development.


The paper also defines the design application areas in terms of products, processes and systems and reviews mainstream methods and systems.

2. Motivation, scope and definitions of verification and validation methods and technologies

2.1. Motivation

The current product and production system requirements that influence the way products are developed and verified include:

• Mass customisation and personalisation.
• Reconfigurability and flexibility of production systems.
• Responsive factories.

Products and processes need to be designed, verified and validated in a manner that is compatible with the above industrial requirements. Fig. 1 shows a representation of validating products and processes after the digital modelling phase, clearly identifying the research questions and business drivers. Validation in the digital space is a key objective and industrial requirement that drives research and development. If this were feasible, the results would be reduced lead times and, critically, fewer failures and better perceived product quality for customers.

Fig. 2 shows the closed-loop nature of the process required for managing lifecycle data capture for design validation. This ability presupposes:

• Integrated and holistic views of design, in order to be able to validate in an integrated manner.
• Digital modelling and representation ability for both the product and the process (function and specification testing).
• A time horizon that includes the product lifecycle.


Fig. 1. Validation and verification requirements in the product lifecycle.

Fig. 2. Closed-loop validation and verification.

The following observations are valid in relation to present industrial practice for design verification and validation:

• Such activities are usually executed when the design process is almost complete, during prototyping and first-off testing and development. This results in frequent deviations from the required form, dimensions or function, extending development times and increasing the compliance cost.
• This problem is both procedural (the stage or time of execution of such activities and the requirement for different skills) and theoretical (the lack of robust verification and validation methods for deployment during the digital design stages).
• The aim is to execute verification and validation as early as possible during the design process, by developing new-generation digital or virtual testing methods.
• Complexity in design makes verification and validation even more difficult to apply as part of the design process.

2.2. Scope of the keynote paper

2.2.1. A framework for design verification and validation

Fig. 3 shows the scope of the new framework for engineering design verification and validation, which is lifecycle based, tracking the progression of engineering designs across four key stages: (i) the preliminary design stage that sets the requirements, (ii) the digital design domain, (iii) the physical product and process development and prototyping phase, and (iv) the consequent design of the production system and network for the realisation of complex products and processes.

Product and process designs are developed in the digital domain and the final validation usually requires the execution of physical trials to confirm the product properties, dimensions and overall functionality at component, subsystem and complete product level. Processes are also validated at each of their physical levels so as to provide the required physical attributes of components, sub-assemblies and the overall product. The system and network design and development also includes a digital phase, and major considerations are confirmed by validating real system performance.

Product lifecycle aspects are best exemplified by considering how complex products are validated in the context of lifecycle considerations. The framework shown in Fig. 3 provides a coherent structure for the multiplicity of digital analyses, manufacturing processes and metrology technologies needed for the verification and validation of complex products in their lifecycle. These techniques and methods, and their relevance to design verification and validation, are analysed herein.

2.2.2. Keynote scope

The scope for this keynote is outlined in Fig. 4. The main focus of the paper is on product and process verification and validation. System perspectives are also included for completeness, and lifecycle aspects are covered by reviewing standards and practices in relation to the verification and validation of complex products. The paper principally deals with mechanical engineering design from meso-scale to large-scale, and the corresponding processes, typical of high complexity and value industry sectors such as aerospace, marine and automotive.

2.3. Definitions of verification and validation

Verification and validation are the methods that are used for confirming that a product, service, or system meets its respective specifications and fulfils its intended purpose. In general terms, verification is a quality control process that is used to evaluate

Fig. 3. A conceptual framework for design verification and validation.


Fig. 4. Scope of the keynote paper.

whether or not a product, service, or system complies with regulations, specifications, or conditions imposed at the start of a development phase [5,6]. Validation, on the other hand, is a quality assurance process of establishing evidence that provides a high degree of assurance that a product, service, or system accomplishes its intended use requirements [5,6]. Verification and validation have been defined in various ways that do not necessarily comply with standard definitions. For instance, journal articles and textbooks use the terms ‘‘verification’’ and ‘‘validation’’ interchangeably [7,8], or in some cases there is reference to ‘‘verification, validation, and testing (VV&T)’’ as if it were a single concept, with no discernible distinction among the three terms [9]. Table 1 shows definitions of verification and validation as provided by international and national bodies. The definitions given by ISO 9000 [16] originate from the general field of quality and focus on the provision of ‘‘objective evidence’’ that specified requirements have been fulfilled. The

verification process according to ISO is broadly defined, and validation is focused on fulfilling an intended use or application. The Global Harmonisation Task Force defines verification in a manner compatible with ISO, and process validation is based on the consistent generation of results that satisfy predetermined requirements [19].

However, such generic definitions evolved due to the specific demands of application domains. For example, in the field of metrology, the Joint Committee for Guides in Metrology defines verification on the basis that a "target measurement uncertainty has been met" [17]. The definition of validation is much less specific, referring to the adequacy of requirements for an intended use. The verification definition by the International Organisation of Legal Metrology [18] is based on the interpretation of the word "accurate", and it clearly creates a direct link with metrology in the process of establishing how different the real artefact is from its modelling representation.

There are extensive definitions of verification and validation in the context of digital design, and these definitions also cover aspects of modelling and simulation. They include the IEEE Standard 610 [10] and the definitions of the US Department of Defence (DoD) [12], as shown in Table 1. The US Department of the Navy [13] and the CFD Committee of the AIAA [14] provide definitions for modelling and simulation software systems that are derivatives of those provided by the US DoD. The US Food and Drug Administration has given definitions of digital systems verification and validation [15], which explicitly include references to the "consistency" and "correctness" of the software. SAE Aerospace [20] and Sargent [21] reported a variety of design verification aspects, as shown in Fig. 5.

In summary, the generic definitions for design verification and validation are given by ISO 9000 [16]. As the digital stages of design become increasingly important, the verification of the modelling

Table 1. Definitions of verification and validation in the digital and physical domains.

V&V processes in the digital design phase

Verification:
• The process of evaluating software to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase [10].
• The process of determining that a computational model accurately represents the underlying mathematical model and its solution [11].
• The process of determining that a computer model, simulation, or federation of models and simulations implementations and their associated data accurately represent the developer's conceptual description and specifications [12].
• The process of determining that an M&S implementation and its associated data accurately represent the developer's conceptual description and specifications [13].
• The process of determining that a model accurately represents the developer's conceptual description of the model and the solution to the model [14].
• Providing objective evidence that the design outputs of a particular phase of the software development lifecycle meet all of the specified requirements for that phase [15].

Validation:
• The process of evaluating software during or at the end of the development process to determine whether it satisfies specified requirements [10].
• The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model [11].
• The process of determining the degree to which a model, simulation, or federation of models and simulations, and their associated data are accurate representations of the real world from the perspective of the intended use(s) [12].
• The process of determining the degree to which a modelling and simulation (M&S) system and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model [13].
• The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model [14].
• Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled [15].

V&V processes in the physical world

Verification:
• Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled [16].
• Provision of objective evidence that a given item fulfils specified requirements, such as confirmation that a target measurement uncertainty can be met [17].
• Pertains to the examination and marking and/or issuing of a verification certificate for a measuring system [18].
• Confirmation by examination and provision of evidence that the specified requirements have been fulfilled [19].
• The verification process ensures that the system implementation satisfies the validated requirements [20].

Validation:
• Confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled [16].
• Where the specified requirements are adequate for an intended use [17].
• Objective evidence that a process consistently produces a result or product meeting its predetermined requirements [19].
• Validation of requirements and specific assumptions is the process of ensuring that the specified requirements are sufficiently correct and complete so that the product will meet applicable airworthiness requirements [20].


Fig. 6. Transition of designer’s intent to physical realisation through GPS guidelines.

Fig. 5. Verification in digital and physical world (adapted from Refs. [20,21]).

and simulation aspects [10,12] will become increasingly applicable. The overall process for integrated digital and physical prototype verification and validation is exemplified by SAE Aerospace [20] (see Fig. 5), and the metrological practice governing physical prototypes is given by VIM [17].

3. International standards related to product and process design in the lifecycle perspective

International standards play an important role in preserving the designer's intent and seamlessly utilising the associated information and manufacturing practices in a heterogeneous manufacturing environment. The transition of the designer's intent from the digital design specification to the actual product and associated service realisation is illustrated in Fig. 5. Today, as each phase of the product's lifecycle is globally dispersed in supply and knowledge chains [2], international standards are essential for deploying standardised manufacturing execution protocols, in order to establish an unambiguous definition "language" throughout a global supply chain and ensure consistent product performance in the service phase. Hence, the provisions of the standards most relevant to product and process verification and validation are analysed herein.

3.1. Standards for representing product information

Computer interpretable representation of product information is utilised within a variety of CAx applications for design verification and validation. The majority of these standards represent geometric information and evolved to cover other aspects. Standards such as the Geometrical Product Specification (GPS) [22], ASME Y14.5: Geometric Dimensioning and Tolerancing (GD&T) [23] and the STandard for the Exchange of Product model data (STEP) [24] have thus evolved for modelling and preserving other aspects of product related information such as tolerances, kinematics, dynamics and manufacturing processes. For example, the STEP and GPS standards have evolved to provide product specific

information constructs known as "application protocols" in STEP and the "GPS matrix" in GPS.

Current GPS standards define global guidelines, along with fundamental principles, for capturing the designer's intent and expressing design requirements. Product and process design characteristics such as size, angle, orientation and surface texture are considered as individual chains, as shown in Fig. 6. The information regarding each characteristic is categorised according to its relevance in the product lifecycle. Each category is called a "link" within the GPS masterplan [22]. Thus, a comprehensive "chain-link" matrix (Fig. 6) has resulted in a number of GPS standards which address how product specific characteristics can be represented and utilised throughout the design, manufacture and verification phases of the product. For example, the designer's intent regarding the size of a product feature is preserved in the "size" chain of the GPS matrix. Mathieu and Dantan [25] proposed to ISO a new model for Geometric Specification and Verification called "GeoSpelling" as a basis for rebuilding the GPS standards. The merits of the GPS standards have been exploited in a variety of digital product design applications such as coherent tolerancing processes [26], the evaluation of measurement uncertainty [27] and the quantitative characterisation of surface texture [28,29]. Srinivasan [30] identified the merits of unifying and standardising ad hoc approaches practiced by industry. GPS allows such unification and standardisation through the global guidelines described in the GPS masterplan [22]. More recent GPS standards [31] introduced the concepts of specification uncertainty and correlation uncertainty, which directly influence validation and verification.

A symbolic language, GD&T [23], has been developed for describing the nominal geometry of parts and assemblies and the allowable variation in the product design and verification phase. GD&T brings significant benefits in design and inspection activities, as a correct GD&T representation captures design intent and shows the functional requirements of the part as well as the method for its inspection [23]. Arguably, the most important benefit of the GD&T approach lies in ensuring, at the design phase, that component parts will assemble into the final product and function as intended [32]. Shen et al. [33] proposed a semantic GD&T representation model, named the "constraint-tolerance-feature-graph", that is claimed to satisfy all tolerance analysis needs. Kong et al. [34] formulated an approach for the analysis of non-stationary tolerance variation during a multi-station assembly process with GD&T considerations.


The application of GD&T to mechanical design has gained widespread acceptance by industry [35]. However, several organisations have attempted to implement the method without a fundamental understanding of how the design process is impacted [36]. Poorly applied GD&T, ambiguous plus/minus location or orientation controls, and sometimes the absence of any variation specification are commonly encountered [37]. The need to capture functional requirements and improve the design of parts, as well as to consider the cost and quality issues defined by GD&T, makes this subject an even more important element of mechanical engineering design [38].

In summary, the GPS [22,31] and GD&T [23] standards are vital for the correct and efficient verification of mechanical engineering designs. There are exciting new research opportunities arising from the utilisation of these standards to automate the bidirectional relationships between design specifications, process capability and measurement uncertainty.

The STEP project was launched with the objective of conserving the manufacturing context and developing information bridges between segregated CAx domains [24]. EXPRESS [39] is used to specify requirements on information content, as "it consists of language elements that allow an unambiguous data definition and specification of constraints on the data defined". The development of the STEP standard was governed by industry's need to overcome interoperability problems. The standard established a neutral data file format that is used for developing domain specific applications using application protocols (APs). For example, AP 219 [40] provides information requirements for analysing the dimensional inspection data and results of solid parts and assemblies. Fig. 7 shows a selected set of application protocols that are vitally important for the communication and sharing of data required in the design verification and validation of mechanical components.

3.2. Standards for representing manufacturing processes

A "process" in a manufacturing context is defined as a combination of activities that occur over a period of time in which objects participate [41]. The National Institute of Standards and Technology (NIST) in the USA developed the Process Specification Language (PSL) [42] to create a generic, neutral and high-level language for specifying processes and for the integration of multiple process-related applications. PSL uses the ontology-based Knowledge Interchange Format to specify concepts, terminology and relationships for processes. Similarly, a data model for representing manufacturing processes was developed by NIST, which later became part of the international standard ISO 16100 for exchanging information between design and manufacturing process planning software systems for mechanical products [43]. The need for comprehensive information regarding specific manufacturing processes and the verification of components compelled practitioners to develop process specific international standards such as DMIS [44], DML [45] and I++DME [46] for the exchange of inspection process information and measurement results in the production environment.


Similarly, the BS EN ISO 8062 series [47] and the BS EN ISO 10135 series [48] of standards within the GPS framework cover the requirements for casting and moulding processes. Another set of process specific standards is the ISO 14649 series [49], with parts corresponding to different processes; for instance, part 16 [50] covers the performance of inspection operations in a STEP-NC manufacturing environment.

3.3. Standards for representing manufacturing resources

A typical manufacturing system consists of a range of resources such as machine tools, material handling systems, fixtures, robotic arms and measurement systems [51]. Each resource has a distinct purpose and thus provides specific capabilities that are utilised in manufacturing decision-making. A variety of international standards have evolved in order to utilise and exchange information regarding manufacturing resources and their capabilities in a digital environment [52]. For example, ISO 13584 [53], with the acronym PLIB, is a series of standards for the computer-based representation and exchange of part library data. PLIB is fully interoperable with STEP [24]. Resource specific standards have also evolved to satisfy business needs. For example, ISO 13399 [54] deals with the representation and exchange of cutting tool data, and ASME B5.59-2 [55] is an information model for machine tools. Measurement equipment related GPS standards [56,57] were developed to describe the acceptance tests for co-ordinate measuring machines and the general requirements for GPS measuring equipment, respectively.

3.4. Standards for preserving design verification knowledge

International standards are used to preserve and seamlessly transfer context specific knowledge obtained through design verification within a heterogeneous manufacturing environment. Business sectors such as aerospace manufacturing, defence, ship building and military equipment manufacturing invest intensively in research and development activities and have a strong requirement to conserve and reuse knowledge acquired through the design verification processes. Consequently, ISO 10303 AP 209 [58] has been developed by aerospace and commercial research organisations for associating engineering analysis data with geometric data. ISO 10303 AP 237 deals with the exchange of computational fluid dynamics (CFD) information, including product geometry, associated meshes defining the computational details and CFD boundary conditions [59].

4. Verification and validation in the early stages of design – capture intent and confirm requirements

The early design stages are vitally important for the correct capture of technical and lifecycle requirements arising from understanding and interpreting market needs. Verification is inherent in the methods deployed during these important early stages, although this is not always appreciated by designers and manufacturing practitioners. This section outlines methods for design idea validation and quality function deployment (QFD), as well as the more technical aspects of ensuring that consistency in terms of key design objectives is maintained, using key characteristics (KCs) and Design for X (DFX) techniques.

Fig. 7. Integration of designer’s intents within STEP framework.

4.1. Product idea validation and market analysis

There are three key considerations that are applied in the early stages of design: (1) to prioritise customer needs (CNs) in a quantitative manner based on market analysis; (2) to select the best design schema; and (3) to improve communication at all levels of the organisation. Methods such as matrix prioritisation and the analytical hierarchy process [60] are applied to help the


Fig. 8. Four-phase process planning by QFD [63].

enterprise determine where to invest the development resources to achieve maximum payoff. The traditional way is to analyse CNs systematically and to transform them into the appropriate product features. However, it is difficult to assess the performance of the transformation process with an accurate quantitative evaluation. Büyüközkan et al. [61] presented a fuzzy group decision-making approach to better align CNs with the objectives of product development in QFD. This prioritisation of customer needs creates a set of criteria that is used for validating the final product, i.e. assessing whether the enterprise is building the right product, service or system.

4.2. Quality function deployment

QFD is a customer-driven methodology for product design and development that underpins quality systems and has found extensive applications in industry via the development of a multiplicity of tools and systems that aid an enterprise in understanding the voice of the customer [60]. QFD efficiently translates CNs into design requirements and parts deployment [62]. As shown in Fig. 8, a generic QFD process consists of four phases in order to relate the voice of the customer to product design requirements (phase 1), and then translate these into parts characteristics (phase 2), manufacturing operations (phase 3) and production requirements (phase 4) [63]. During early design, the first and second of the four QFD phases are implemented [63] and part characteristics are defined. In summary, QFD is critical to design validation as it translates customer needs into part characteristics and production controls that can then be used for design verification, by forming the set of criteria against which product and process compliance can be assessed. A minimal numerical sketch of this phase 1 prioritisation is given at the end of Section 4.3.

4.3. Functional decomposition and flow analysis

The verification and validation process of a function can be viewed as functional decomposition and flow analysis, which aim to break overall functionalities down into functionally independent sub-functions as finely as possible [64]. A functional structure can be validated by considering both logical and physical dependencies and confirming matching inputs and outputs among sub-functions [65]. Several flow analysis methods, such as bond graphs and Petri nets [66], and modularity methods, such as the function structure heuristic method [67], the design structure matrix [68] and modular function deployment [69], are applicable to the verification and validation of functional structures. In an era of increasing product sophistication, engineered systems are likely to become more complicated, increasing the functional requirements [3]. Suh [3] defined complexity as the measure of uncertainty in achieving the functional requirements of a complex system and outlined how axiomatic design can be used to reduce design complexity while satisfying the functional requirements within given constraints. As such, axiomatic design can enhance the functional validation of designs.
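Returning to the QFD prioritisation of Section 4.2, the following minimal sketch shows how customer-need weights can be propagated through a house-of-quality relationship matrix to rank design requirements. The needs, requirements, weights and 0/1/3/9 relationship scores are hypothetical and purely illustrative of the phase 1 computation.

```python
# Minimal sketch of QFD phase 1: propagating customer-need weights through a
# relationship matrix to rank design requirements (house-of-quality style).
# All names, weights and 0/1/3/9 scores below are hypothetical.

customer_needs = {"easy to assemble": 0.5, "quiet operation": 0.3, "long service life": 0.2}

design_requirements = ["part count", "bearing tolerance", "surface finish"]

# relationship[need][requirement] uses the conventional 0/1/3/9 strength scale
relationship = {
    "easy to assemble":  {"part count": 9, "bearing tolerance": 1, "surface finish": 0},
    "quiet operation":   {"part count": 1, "bearing tolerance": 9, "surface finish": 3},
    "long service life": {"part count": 0, "bearing tolerance": 3, "surface finish": 9},
}

# Technical importance = sum over needs of (need weight x relationship strength)
importance = {
    req: sum(weight * relationship[need][req] for need, weight in customer_needs.items())
    for req in design_requirements
}

for req, score in sorted(importance.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{req}: {score:.2f}")
```

The ranked scores form the set of criteria against which downstream verification activities (phases 2–4) can later be checked for traceability back to customer needs.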

4.4. The use of key characteristics in early design

Variability in production and measurement procedures can result in lower than expected quality levels, compromised product performance and increased rectification costs. Key characteristics (KCs) are being used to help identify and reduce important root causes of variability [70]. Research focused on KCs has had a significant impact in improving product and process performance in the context of the lifecycle [71,72], and KC methodologies have been introduced into the product development practices of world-class companies [73]. Thornton [74] categorised product related KCs according to the level of the product model, as KCs belonging to product, subsystem, component, feature and feature face. Thornton [75] proposed a method for variation risk management in aircraft and automotive production by establishing a direct link between KCs and the type of inspection process used for verification.

The use of KCs for manufacturing planning during early design enhances process verification. Dai and Tang [76] defined verification parameters by prioritising KCs. Whitney [77] proposed a KC oriented method for assembly planning by selecting the necessary part features, tools and machine capabilities. Wang and Ceglarek [78] developed a KC based methodology for quality-driven sequence planning. Suri et al. [79] introduced a technique based on key inspection characteristics to enhance process capability. Maropoulos et al. [80] proposed the use of aggregate product models as a method for the early integration of dimensional verification and process planning for complex product design and assembly. Maropoulos et al. [81] outlined the verification and validation related benefits arising from the integration of measurement and assembly using a digital enterprise framework that links key elements of the product, process and resource models.

4.5. Design for X

Design for X (DFX) is an umbrella term used to denote design philosophies and methodologies which aim to improve designs by raising the designer's awareness of a certain product lifecycle value or characteristic represented by "X" [82]. The design considerations applied in DFX have a direct relationship to the verification methods for the "X" objective.

Design for Manufacture (DFM) [77,83] includes a wide range of design rules and guidelines defined from the perspective of improving the manufacturability of parts. For example, the design guidelines for end milling stipulate that milled features should be designed in such a way that the end mill required is limited to a 3:1 length-to-diameter ratio, the reason being that longer end mills are prone to chatter, which deteriorates surface quality. Applying this DFM guideline will impact directly on end milling process capability in terms of surface quality, and this will influence the process verification procedure, such as the sampling method deployed and the method of surface roughness measurement.

The impact of Design for Assembly (DFA) [77,83] on verification is also direct. For instance, the part reduction of an electromechanical sub-assembly as a consequence of applying DFA may result in more complex parts that have additional features. This will directly change the inspection plan in terms of the number, type and sequence of measurement operations, the measurement points per operation and the selection of the measuring device. Also, DFA for automated assembly stipulates design methods so that parts can be supplied in the right orientation and do not tangle with other parts [84]. This again increases process yield and influences the sampling method deployed for assembly verification data collection and analysis. Design for Ergonomics is important in labour intensive industries [85] and has a noticeable and positive effect on process verification, as controls and displays are re-designed so that readings cannot be misinterpreted. Design for changeover is vital in high variety environments [86] and improves process verification as a consequence of high-repeatability set-ups.
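Many of the KC and DFX considerations above are ultimately judged through process capability criteria during verification. The following minimal sketch, which assumes a normally distributed key characteristic and uses hypothetical specification limits and sample data, computes the standard Cp and Cpk indices; a 6-sigma capable process is conventionally quoted as Cp = 2.0 (Cpk = 1.5 when a 1.5-sigma mean shift is allowed for).

```python
# Minimal sketch: judging a key characteristic (KC) against its specification
# using the standard process capability indices Cp and Cpk. The specification
# limits and sample data are hypothetical; 6-sigma capability is conventionally
# quoted as Cp = 2.0 (Cpk = 1.5 allowing for a 1.5-sigma mean shift).

import statistics

lsl, usl = 9.95, 10.05          # hypothetical lower/upper specification limits (mm)
sample = [10.012, 9.998, 10.005, 10.021, 9.991, 10.008, 10.015, 9.996, 10.003, 10.009]

mu = statistics.mean(sample)
sigma = statistics.stdev(sample)                 # sample standard deviation

cp = (usl - lsl) / (6 * sigma)                   # potential capability (spread only)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)      # capability accounting for centring

print(f"mean={mu:.4f}  sigma={sigma:.4f}  Cp={cp:.2f}  Cpk={cpk:.2f}")
```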


Design for 6-sigma (DFSS) is a design activity that aims to generate high-capability, 6-sigma processes before production commences. DFSS is usually deployed within QFD and is also referred to as "Define–Measure–Analyse–Design–Verify" [87]. This is an explicit reflection of the inherent ability of DFSS to enhance the verification and validation of processes. There are considerable research challenges in developing new methodologies that link DFSS with KCs, so that key product features and dimensions are specified and evaluated by applying process capability criteria. Such methods would need to be directly integrated with the definition of GD&T, so that datum points, key dimensions, inspection methods and process capability are interlinked in an unambiguous manner.

5. Design verification and validation in the digital environment

Digital prototyping helps manufacturers to virtually simulate a product and its associated lifecycle phases, such as product manufacture, assembly and functionality, before the product is physically realised. This gives manufacturers an excellent opportunity to visualise and anticipate aspects of the physical performance of a design with less reliance on costly physical experimentation. Physical prototyping and testing are still a requirement, especially for complex products. However, the clear current industry trend is toward reducing physical testing by replacing suitable aspects with virtual testing and verification. The digital verification results are compared with the experimental results; this validates and certifies the computational code embedded in a digital prototype. Thus, a validated digital prototype can be utilised for verifying the physical performance of the product manufactured in the globally dispersed supply chain.

5.1. Digital mock-up

A digital mock-up (DMU), sometimes referred to as a virtual prototype, is essentially a digital simulation of a physical prototype and is increasingly used for the verification of product functionality. The DMU is emerging as the core design collaboration tool, around which different engineering teams verify the product through its entire lifecycle, from production planning to functional testing, maintenance and recycling [88,89]. Multiple engineering teams can now operate in parallel, working on the same DMU, and this facilitates the enterprise-wide application of concurrent engineering practice. Recently, the usage of DMU has increased, mainly among aerospace and automotive companies, owing in large part to the availability of more robust models and enhanced computing resources. For instance, the Chrysler Corporation used DMU to reduce its automobile development cycle by half, while resolving 1200 potential issues before the first physical mock-up was built [90]. Using proprietary DMU systems, Boeing was able to reduce errors and rework on its 777 airliner by 70–80%, saving 100,000 design hours and millions of dollars [90]. Similarly, Airbus is also increasingly exploiting the advantages of DMU [91]. For complex engineering products, the use of DMU is not without problems, the largest of which is ensuring data quality between all of its suppliers, customers and design offices. For instance, data loss when transferring from one CAD format to another remains a major issue [91].
In summary, the DMU is a powerful verification tool and research for its development should be based on: (i) enhanced capabilities to simulate functional performance using functional mock-up methods, and (ii) the solid foundation of international standards. The existing STEP (ISO 10303) standard adequately captures geometric data, while data pertaining to history-based modelling [92], assembly [93] and kinematic linkages are less well represented [94]. ISO 10303-105 [95] is a good basis for kinematic structure representation and supports case studies for machine tool modelling [96].
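One of the most elementary geometric verifications a DMU supports is interference (clash) detection between components. The following minimal sketch, using axis-aligned bounding boxes and hypothetical part extents, illustrates the principle only; production DMU tools perform such checks on the exact CAD geometry and also report clearance violations.

```python
# Minimal sketch of clash (interference) detection, an elementary verification
# check performed on a digital mock-up (DMU). Parts are reduced here to
# axis-aligned bounding boxes with hypothetical extents; real DMU systems test
# the exact CAD geometry.

from dataclasses import dataclass

@dataclass
class Box:
    name: str
    min_xyz: tuple  # (x, y, z) lower corner in mm
    max_xyz: tuple  # (x, y, z) upper corner in mm

def clash(a: Box, b: Box, clearance: float = 0.0) -> bool:
    """True if the boxes overlap or come closer than the required clearance."""
    return all(a.min_xyz[i] - clearance < b.max_xyz[i] and
               b.min_xyz[i] - clearance < a.max_xyz[i] for i in range(3))

bracket = Box("bracket", (0.0, 0.0, 0.0), (120.0, 40.0, 10.0))
pipe    = Box("pipe",    (100.0, 20.0, 5.0), (300.0, 35.0, 25.0))
harness = Box("harness", (150.0, 60.0, 0.0), (400.0, 80.0, 15.0))

for a, b in [(bracket, pipe), (bracket, harness), (pipe, harness)]:
    status = "CLASH" if clash(a, b, clearance=2.0) else "clear"
    print(f"{a.name} vs {b.name}: {status}")
```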


5.2. Tolerance analysis and optimisation

The primary function of tolerance setting is to balance product functionality with economic factors [97]. Excessively tight tolerances will add cost due to more complex processing stages, whereas inadequately wide tolerances will result in insufficient quality and costly rework. Tolerances are vitally important in the dimensional verification of mechanical parts and assemblies, as the uncertainty of the measurement instrument needs to be an order of magnitude smaller than the tolerance value. Historically, tolerances are decided on the basis of legacy practice within a company and, as Maropoulos et al. [81] suggest, many tolerances are set based on process capability and not on the study of tolerance build-up during assembly. A review of tolerancing methods by Singh et al. [98] identifies the main academic and industrial practices dealing with tolerancing as belonging to either "tolerance analysis" or "tolerance synthesis". In essence, tolerance analysis attempts to estimate the assembly tolerance stack-up, while synthesis considers the assembly and product requirements and distributes the assembly tolerances accordingly [99].

5.2.1. Modelling assembly tolerances

Dantan and Qureshi [100] describe statistical tolerance analysis as a 2D method that computes the probability that the product can be assembled and will function under a given set of tolerances. The assembly response function can be expressed as a function of the individual and independent component dimensions [101]. As shown in Fig. 9, there are two basic approaches to tolerance analysis: the worst-case method and the root sum square method [98] (a minimal numerical sketch contrasting them is given at the end of Section 5.2). The worst-case method assumes that the tolerances are at their respective extremities and that the stack-up is consistently accumulative (i.e., there is no tolerance cancellation). This is a pessimistic estimate, but due to its simplicity it is still relevant today; however, it can only be employed in one dimension at a time [102]. The root sum square (RSS) method conversely gives a rather optimistic assembly tolerance estimate, as it is a simple statistical model based on the normal distribution. As before, the RSS method is only suited to single-dimensional tolerance problems [103]. A more advanced method that is somewhat more indicative of tolerance stack-up in the physical world is the Spotts modified approach [104]; this is essentially an average of the worst-case and RSS models. The "correction factor" approach is also experimentally based, scaling the RSS estimate to make it a more realistic figure. However, this method has particular limitations if the tolerances/dimensions in the stack-up vary greatly and/or are of small quantities [98].

More complex assembly response functions and non-normal tolerance distributions can cause difficulties when using traditional analytical techniques, as a high number of samples is required to create an accurate estimation of the assembly response. In such cases, Monte Carlo Simulation (MCS) has become a viable solution. MCS can be applied when the assembly response function cannot be expressed analytically as a linear model and also when dealing with the effects of tolerance stack-up within kinematic systems [105]. In the "kinematic" approach [106], the tolerance chain is treated as a kinematic loop, with the understanding that the movements of the links are actually small displacements within prescribed tolerance zones.

Fig. 9. Tolerance analysis [98].


This approach involves modelling the small displacements using small displacement torsors [107] and modelling the effects that local small displacements have on the remote functional requirement using Jacobian transforms [108]. Desrochers et al. [109] proposed a unified Jacobian-torsor model for statistical or worst-case tolerance analysis or synthesis [110].

5.2.2. Digital tolerancing methods and tolerance optimisation

Optimising tolerances aims to maximise the functional performance and economic factors associated with tolerances. The economic factor is often expressed in a quality loss function [111] and in most applications the Taguchi loss function is used. Govindaluri et al. [97] consider the quality loss from the perspective of the customer and the manufacturing and rejection costs from the perspective of the manufacturer. When incorporating Taguchi's quality loss function, Cheng and Maghsoodloo [112] found that when a component's mean varies, only the quality loss associated with that component will be changed, whereas when a component's variance shifts, the optimal allowance, tolerance costs and quality losses associated with each component will be affected. Tolerance optimisation methods are classed as either deterministic or stochastic; the former considers the nominal values of design variables with respect to given input values, using a single point for evaluation, whereas the latter considers the statistical variation of the design variables [113,114]. Computer Aided Tolerancing systems can provide a simulation platform for modelling the effects of tolerance setting within a manufacturing process or assembly [115,116]. Tolerance analysis and synthesis are considered within a DMU to include aspects of tolerance build-up and assembly clashes [117]. Tolerance design methods, including traditional and advanced methods, have been summarised by Singh et al. [99], as shown in Fig. 10.
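The worst-case, RSS and Monte Carlo estimates discussed in Section 5.2.1 can be contrasted with a short numerical sketch. The example below uses a hypothetical one-dimensional linear stack (a gap defined by a housing and four parts) and assumes each tolerance corresponds to ±3σ of a normal distribution; it is illustrative only and does not reproduce any specific method from the cited literature.

```python
# Minimal sketch contrasting worst-case, root sum square (RSS) and Monte Carlo
# estimates for a one-dimensional linear tolerance stack-up. The dimensions and
# tolerances are hypothetical; each tolerance is assumed to span +/- 3 sigma.

import math
import random
import statistics

housing = (50.0, 0.10)                                   # (nominal, +/- tolerance) in mm
parts = [(12.0, 0.05), (12.0, 0.05), (12.0, 0.05), (12.0, 0.05)]

nominal_gap = housing[0] - sum(nominal for nominal, _ in parts)

# Worst case: tolerances accumulate with no cancellation (pessimistic)
worst_case = housing[1] + sum(tol for _, tol in parts)

# RSS: statistical accumulation assuming independent, normally distributed deviations
rss = math.sqrt(housing[1] ** 2 + sum(tol ** 2 for _, tol in parts))

# Monte Carlo: sample every dimension and evaluate the assembly response directly
def sample(nominal, tol):
    return random.gauss(nominal, tol / 3.0)

gaps = [sample(*housing) - sum(sample(n, t) for n, t in parts) for _ in range(100_000)]
mc_band = 3.0 * statistics.stdev(gaps)

print(f"nominal gap        : {nominal_gap:.3f} mm")
print(f"worst-case band    : +/- {worst_case:.3f} mm")
print(f"RSS band           : +/- {rss:.3f} mm")
print(f"Monte Carlo 3-sigma: +/- {mc_band:.3f} mm")
```

As expected, the worst-case band is the widest and the RSS and Monte Carlo bands nearly coincide for this linear, normally distributed case; the value of MCS lies in handling non-linear response functions and non-normal distributions where RSS no longer applies.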

Fig. 10. Tolerance design methods [99].

5.3. Features for machining CAD/CAM/CAPP verification

In the last two decades, extensive research efforts in various segments of CAx integration using feature technology have been reported, especially for the integration of CAD and CAM. Salomons et al. [118] and Subrahmanyam and Wozny [119] identified three major approaches of feature technology, namely interactive feature definition, automatic feature recognition and design by features. In interactive feature definition, features are defined with human assistance after creating the geometric model. Automatic feature recognition involves the comparison of the geometric model with pre-defined generic features. Many approaches for feature recognition have been reported; Lin et al. [120] extracted manufacturing features present in a feature-based design model, while ElMaraghy and ElMaraghy [121] introduced the concept of functional and manufacturing features. Presently, the design-by-features approach has become the core technology for product modelling. Feature definitions (templates) are placed in the feature library, from which features are instantiated by specifying dimension parameters, location parameters and application related attributes.

Feature-based design has made a direct and very positive impact on part verification, as it has helped to codify and standardise both the manufacturing processes and the inspection methods used for types of features, thus improving design verification. Research is still required to provide coherence in relating inspection systems and methods to processes, especially in cases where there is a wide range of measurement options available, such as the verification of machined features or complex assembly features. Case [122] used methods associated with external approach directions for features to enhance process capability, and Wong and Wong [123] used volumetric machining features for part modelling in their feature-based design system. Several feature-based design systems have been reported with a focus on prismatic machining processes. In the case of machining, feature-based design allows the corresponding definition of "standardised" machining processes that are proven in terms of process capability. This is of major significance, as it allows rapid verification of a design in terms of its modelling entities and the corresponding machining process.

Feature-based methods have had a profound effect on computer automated process planning (CAPP) for machining. Gu et al. [124] identified the sequencing of machining processes in four stages, namely feature extraction, feature prioritisation, clustering of operations and the identification of precedence relationships. Laperriere and ElMaraghy used precedence graphs for assembly sequence planning [125]. Qiao et al. [126] used a genetic algorithm method to sequence the machining operations for prismatic parts. Li et al. [127] and Ong et al. [128] tried to solve process planning problems by combining non-traditional optimisation techniques, namely genetic algorithms and simulated annealing. Azab and ElMaraghy used quadratic assignment for reconfiguring process plans [129]. The common problems and characteristics of these CAPP approaches for machining are one or more of the following:

• Feature recognition is used in most of the approaches. Hence, the feature-based databases of commercial software are not utilised.
• After recognition, the features (mostly design oriented) are converted into application (manufacturing) features using a knowledge base or heuristic rules. The common attributes are not directly transferred to the application features.
• The process plans produced by these systems consider only a single machine set-up. But, in the factory environment, several machines may be used in different set-ups.
• The precedence constraints in the component are represented with respect to features and not with respect to low-level entities, namely operations.
• The set-ups were optimised with respect to the tool approach directions. This in turn reduces the search space or loses feasible design points.

To conclude, process planning research has not yet reached sufficient maturity in its key methods to address verification and validation in an integrated manner. The feature recognition approach is theoretically the most generic approach to process planning, but it partly negates the design and process standardisation and verification benefits of feature-based design. A minimal sketch of the precedence-constrained sequencing step shared by these CAPP approaches is given below.
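The sketch below, with hypothetical machining operations and precedence pairs, shows the precedence-constrained sequencing step in its simplest form (a Kahn-style topological sort). It is illustrative only: the CAPP systems cited above additionally optimise set-ups, tool changes and fixturing over the feasible sequences, for example with genetic algorithms or simulated annealing.

```python
# Minimal sketch of precedence-constrained operation sequencing, the common
# step in the CAPP approaches reviewed above. Operations and precedence pairs
# are hypothetical; a Kahn-style topological sort yields one feasible sequence.

from collections import defaultdict, deque

operations = ["face_mill_top", "drill_pilot", "ream_bore", "rough_pocket", "finish_pocket"]

# (before, after): 'before' must be machined before 'after'
precedence = [
    ("face_mill_top", "drill_pilot"),
    ("drill_pilot", "ream_bore"),
    ("face_mill_top", "rough_pocket"),
    ("rough_pocket", "finish_pocket"),
]

successors = defaultdict(list)
in_degree = {op: 0 for op in operations}
for before, after in precedence:
    successors[before].append(after)
    in_degree[after] += 1

queue = deque(op for op in operations if in_degree[op] == 0)
sequence = []
while queue:
    op = queue.popleft()
    sequence.append(op)
    for nxt in successors[op]:
        in_degree[nxt] -= 1
        if in_degree[nxt] == 0:
            queue.append(nxt)

if len(sequence) != len(operations):
    raise ValueError("cyclic precedence constraints - no feasible sequence")
print(" -> ".join(sequence))
```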

5.4. Virtual assembly modelling and simulation

Virtual or digital assembly modelling is a powerful and effective technology for the verification of assemblies during the digital design phase. Assembly process planning (APP) is a core component of virtual assembly modelling as it deals with assembly constraint identification, equipment selection and sequence generation [130]. Wang and Ceglarek [131] proposed an assembly sequence planning method which comprises: (1) sequence generation for predetermined line configurations using k-piece mixed-graph representation of assembly; (2) dimensional quality model of variation propagation for assembly processes with compliant parts; and (3) evaluation of sequences based on the multivariate process capability index.


Fig. 11. The VADE usage scenario [134].

Using Virtual Reality (VR), the 3D digital mock-up of the product can be manipulated with the assistance of VR interactive devices. It has, therefore, attracted great interest from researchers dealing with assembly planning. The advantages of applying virtual engineering to assembly process planning were summarised by Jun et al. [132]. From the concurrent engineering perspective, it is preferable to implement the assembly and disassembly process in a virtual environment at an early stage of design, when only the geometric forms are determined and the functions can still be defined [132,133]. The Virtual Assembly Design Environment (VADE) was created to demonstrate the potential and the challenges involved in the design and manufacturing processes [134]. Fig. 11 illustrates the usage scenario of VADE. The VADE system allows the user to perform assembly processes by hand and with assembly tools on the virtual product, using data imported from a parametric CAD system. By maintaining a dynamic correlation with the CAD system, the design information created during the virtual assembly process is updated at the end of a VADE session. Banerjee et al. [135] studied the effectiveness of VR in assembly planning by comparing blueprints, a non-immersive desktop VR environment and an immersive projection-based VR environment. The results showed that the completion time of the assembly process was approximately halved by utilising VR. An Augmented Reality (AR) based human-computer interface was developed by Ong et al. [136] to provide an immersive and intuitive environment. Unlike VR, assembly design and planning using AR can be verified by manipulating the virtual prototypes in the real assembly environment, which will decrease the possibility of redesigning and re-planning.

5.4.1. Digital tooling and fixturing for assembly

Digital assembly modelling is now well established in the advanced engineering industries, such as aerospace and automotive, for the design of assemblies and their integration with the design of tooling and the associated jigs and fixtures. Commercial software systems allow the seamless integration of product, process and resource models [137]. The data generated during assembly tolerance analysis can be utilised by tool designers to define appropriate tooling tolerances. Such systems are also being deployed within ITER – the nuclear fusion project – to model the manipulation of cassette tooling, the loading of which is robot controlled [138]. Additionally, the digital mock-up of tooling can simulate accessibility issues and lines of sight for an optical measurement system [139]. Digital fixturing is a key enabling technology for low cost tooling that will enhance industry's capability for batch production and customisation of products [140]. As an extension of the established methods of rapid prototyping (RP) from a DMU to a physical mock-up, a range of rapid tooling applications are being developed [141]. An alternative to rapid tooling is to employ reconfigurable tooling; this generally requires modular components that allow a virtually unlimited number of tooling configurations. Ceglarek et al. [142] extended the "N-2-1" fixture layout design methodology by introducing a movability restraint condition, which is essential for material handling fixture design. Kong and Ceglarek [143] addressed a fixture workspace synthesis method for reconfigurable assembly systems. Phoomboplab and Ceglarek [144] proposed a GA and low-discrepancy sampling technique-based optimal design space reduction method to optimise the locator positions in a multi-station assembly system, while ensuring the robustness of the fixturing system in terms of the product's dimensional quality.

5.4.2. Stream-of-variation modelling and design synthesis

Stream-of-Variation Analysis (SOVA) is a mathematical model that describes the relation between final product quality and the process parameters of complex multistage assembly [145,146]. SOVA can predict potential downstream assembly problems, based on evaluations of the design using a large array of process variables. By integrating product and process design in a pre-production simulation, SOVA can head off individual assembly errors that contribute to an accumulating set of dimensional variations, which ultimately result in out-of-tolerance parts and products. Once in the ramp-up stage of production, SOVA can compare predicted misalignments with actual measurements to determine the degree of mismatch in the assemblies and diagnose the root causes of the errors [145,146].

Individual design tasks must be integrated in order to optimise the design of the entire system. Phoomboplab and Ceglarek [147] proposed a design synthesis method based on a hybrid design structure matrix which integrates design tasks with design configurations of key control characteristics, especially for dimensional management in multistage assembly systems. The method can generate design task sequences to minimise simulation time, as well as benchmark design task sequences in terms of dimensional quality improvement.

5.5. Digital measurement modelling and planning

5.5.1. Measurement and inspection planning techniques

The measurement process, often called the inspection process, is now a vital element of integrated design and manufacturing [148]. Computer Aided Inspection Planning (CAIP) systems have been developed to accomplish the measurement planning task through the following generic procedures: (1) CAD interface and feature recognition; (2) determination of the inspection sequence of the features of a part; (3) determination of the number of measuring points and their locations; (4) determination of the measuring paths; and (5) simulation and verification [149]. The stages of CAIP for Co-ordinate Measuring Machines (CMMs) are defined as: establishing the best sequence of inspection steps, the detailed inspection procedure for each feature, feature accessibility by probes, probe path planning and collision checking, generation of the CMM control commands, and the post-processing of measured data, such as statistical and cost analysis [150]. The first generation of inspection planning systems was developed by Hopp [151] and ElMaraghy and Gu [152]. Automatic inspection planning for dimensional and geometric inspections has two distinct levels: macro- and micro-level planning [153,154]. Subsequently, Lee et al. [155] divided the planning process into two steps: global inspection planning, which is focused on the generation of an optimum inspection sequence, and local inspection planning, which is focused on minimising errors and times throughout the measurement process.

Research in CAIP falls into two categories: (a) tolerance-driven inspection process planning and (b) geometry-based inspection process planning [148]. The former considers inspections on features with allocated tolerance requirements, while the latter aims to conduct an entire geometry inspection by comparing the obtained complete geometric description of a part or product with the design model. The geometry-based CAIP systems theoretically offer a more coherent inspection process, but at a high time and cost [148].
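Whichever planning route is taken, declaring conformance from the measured data ultimately has to account for the measurement uncertainty of the chosen instrument relative to the allocated tolerance. The following minimal sketch, with hypothetical tolerance and uncertainty values, follows the spirit of the default ISO 14253-1 decision rule, in which proof of conformance requires the measured value to lie inside the specification zone reduced by the expanded uncertainty U at each limit.

```python
# Minimal sketch of a conformance decision that accounts for measurement
# uncertainty, in the spirit of the default ISO 14253-1 decision rule: proof of
# conformance requires the measured value to fall inside the specification zone
# reduced by the expanded uncertainty U at each limit. The numbers are
# hypothetical.

def conformance(measured, lsl, usl, U):
    """Classify a measured value against a two-sided specification."""
    if lsl + U <= measured <= usl - U:
        return "conformance proven"
    if measured < lsl - U or measured > usl + U:
        return "non-conformance proven"
    return "neither proven (within uncertainty range)"

# Hypothetical 20 +/- 0.05 mm feature measured with expanded uncertainty U = 0.008 mm
for value in (20.000, 20.045, 20.056, 20.070):
    print(value, "->", conformance(value, lsl=19.95, usl=20.05, U=0.008))
```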
Recent research has aimed at automating the recognition of inspection features by extracting them directly from the CAD model. Similar research concerning feature clustering, probe accessibility and orientation analysis has dominated CMM-based inspection planning, as carried out


Fig. 12. Overview of the theoretical framework for integrating measurement with assembly planning.

by Limaiem and ElMaraghy [156], Zhang et al. [157] and Hwang et al. [158]. With the rapid development of artificial intelligence and knowledge-based techniques, Expert Systems, Neural Networks and Fuzzy Logic were used to automate the measurement planning process. The expert system developed by Moroni et al. [159] tackles the problem of selecting touch probes and generating the measurement configurations. Lu et al. [160] and Hwang et al. [158] employed artificial neural network techniques to obtain the optimum inspection sequence while Beg and Shunmugam [161,162] achieved the same objective utilizing Fuzzy Logic on a prismatic part inspection process. Mohib et al. [163] used knowledge rules to select the most appropriate probe type and optimised the planned inspection tasks using a hybrid laser/CMM for complex geometries. 5.5.2. Metrology process modelling for verification planning Process modelling is an essential technology for design evaluation and process planning based on the codification of engineering knowledge and analytical methods [164,165]. There is a scarcity of metrology process models for measurement planning and this may be due to the traditional industrial perception of metrology simply being a verification step, rather than being an essential element of the production process [166]. Moreover, new frameless metrology systems have been integrated with production and assembly, enhancing the need for developing a process model to codify their capabilities [80,81]. Maropoulos et al. [166] proposed a theoretical framework for the development of metrology process models for integrating product design with assembly planning, based on the Digital Enterprise Technology methodology [167,168]. Fig. 12 shows the metrology framework, with the metrology process model positioned central to the integration of the design verification process with the verification of assembly operations and the subsequent deployment of measurement systems that support measurementassisted automation. The framework explicitly recognises the need to co-ordinate the digital verification aspects (left part of Fig. 12), with those that involve the physical deployment of measurement equipment for product and process verification (right part of Fig. 12) [166,168]. Industry requires the definition of new research projects addressing the development and evaluation of integrated metrology and assembly methods and systems that offer superior positional and orientation accuracy, with in-built verification capability. Such systems must be fully compliant with relevant standards and best practice guides including; ISO GUM [169], ASME B89.4.19 [170] and STEP (ISO 10303) [24]. 5.5.3. Measurement and inspection equipment selection A vitally important stage in the digital verification planning is the identification and selection of inspection equipment. This largely refers to measuring systems deployed for dimensional and shape validation of parts and assemblies. There is a very wide

spectrum of physical scale and accuracy requirements for which such systems need to be selected covering industrial production from small parts (measured in millimeters) to large, complex products such as aircraft, ships, and wind turbines [166,171,172]. New techniques such as absolute length measuring interferometry and six-degrees-of-freedom probes are frequently combined with more traditional systems such as CMMs to cover the dimensional and shape verification needs of modern products [171,172]. The selection process needs to be based on metrology process models and employs multiple criteria with a key requirement being the definition and minimisation of measurement uncertainty [163,171]. Cai et al. [168,173] proposed an approach for large volume metrology instruments selection based on measurability characteristics (MCs) analysis. Inspired by the concept of quality characteristics, MCs can be used for instrument selection on the basis of measurement capability, cost and technology readiness. Muelaner et al. [174] proposed an approach employing a data filtering technique for instrument selection and Cuypers et al. [175] specify the task requirements and part restrictions before selecting instruments manually. There are exciting, new research challenges in generic measurement systems modelling and capability derivation that are essential for instrument selection and measurement planning within CAIP. Research is also needed for the integration of CAIP with CAPP, based on the coherent modelling of capabilities. 5.6. Computational and virtual methods for functional product verification and optimisation 5.6.1. Structural function verification and finite elements analysis The growing interest in reducing reliance on testing and cut the cost and time of certification of structural systems has pushed the academic and industrial world toward the development of Virtual Testing Labs (VTL) where the Finite Element Analysis (FEA) technique is employed to predict the possible behaviour of real world structures until failure (Fig. 13). However, to reduce and replace physical testing by virtual FEA testing, procedures must be put in place to demonstrate that the virtual tests are able to replicate actual tests and to generate the necessary confidence within the design and certification communities. The first stage of FEA is the ‘‘idealisation process’’ which takes the real-life structural design problem and turns it into an idealised mathematical model, the Finite Element Model (FEM). The second stage involves selecting appropriate finite elements, mesh layouts and solution algorithms to define the structural behaviour of the idealised mechanical system. The creation of an error-control

Fig. 13. Virtual testing procedure.


procedure to facilitate the user of the FEA in solving structural design problems has been extensively studied. Other methods for creating error-free FE models may involve the use of sensitivity analyses [176]. Besides these intrinsic FEA errors, other uncertainties are present such as the experimental boundary conditions, exact panel geometry and presence of initial imperfections that affect the accuracy of the virtual testing. Such issues are more pronounced for structures made of newly developed materials such as hybrid materials, fibre reinforced plastics (composites) due to their high dimensional variability of products. This is becoming an important issue for thick large-scale structures where measurement of residual stresses and distortion are challenging tasks. To solve these issues, upstream 3D digital measurement and quality control techniques need to be employed in a synergistic manner with the finite element method for accurate representation of structural and material behaviour under in-service loads (static, vibration, cyclic loads and impact). While classical computational stress analyses provide good predictions in the elastic regime, they have not previously achieved predictive accuracy in the presence of damage and fracture. This limitation is starting to be overcome by new simulation strategies, which combine advances in the generality and physical realism of damage formulations with new experimental techniques for probing the physics of failure at the micron and nanometer scales. These research advances are making possible high-fidelity virtual tests, where the mechanical behaviour of a structure up to ultimate failure is computed through simulations of the physical processes involved at the atomic [177], microscopic and structural scales [178]. 5.6.2. Design function verification using computational fluid dynamics With the increasing availability of affordable access to substantial computing resources, computational fluid dynamics (CFD) is now becoming established as a viable tool for computer aided engineering and design, in spite of uncertainties that continue to surround the topics of automated mesh generation, solution sensitivity to mesh size and distribution, and the verification and realism of turbulence models. CFD software offers increasingly sophisticated (and computationally demanding) analysis features such as free-surface modelling, fluid-structure interaction (FSI) and large eddy simulation (LES). The turbomachinery and aircraft industries have made use of CFD for many years to study flows around smooth-shaped aerodynamic surfaces. Calibrated physical models are used for these flows using highly structured ‘‘curvilinear’’ (body-fitted) meshes to make best use of available resources. CFD has resulted in significant improvements to the design of compressor and turbine blades [179], including the use of inverse design and multiobjective optimisation techniques [180], with the attention of the industry and researchers now turning ever more assiduously to improving the use of valuable compressor bleed air in gas-turbine internal-air cooling systems [179,181]. In aircraft design, the requirement to carry out large-scale computations of complete aircraft configurations motivated the development of empirical ‘‘one-equation’’ models of turbulence for computational economy [182]. 
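The mesh-sensitivity concern raised above is commonly addressed in verification practice by a grid-convergence study based on Richardson extrapolation. The sketch below is a standard textbook procedure rather than a method from the cited works, and the drag-coefficient values and refinement ratio are invented for illustration.

```python
import math

# Grid-convergence sketch (standard Richardson-extrapolation verification practice).
# The solution values on three systematically refined meshes and the refinement
# ratio r are invented for illustration.
f_coarse, f_medium, f_fine = 0.03105, 0.03010, 0.02980
r = 2.0

p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)   # observed order of accuracy
f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)                      # Richardson-extrapolated estimate
gci_fine = 1.25 * abs((f_medium - f_fine) / f_fine) / (r**p - 1.0)         # grid convergence index (safety factor 1.25)

print(f"observed order p = {p:.2f}")
print(f"extrapolated value = {f_exact:.5f}")
print(f"GCI on the fine mesh = {100 * gci_fine:.2f}%")
```

A small GCI and an observed order close to the formal order of the scheme give confidence that the reported quantity is mesh-independent before any comparison with experiment is attempted.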
Following a period in which turbulence models tended to move toward more complicated, multiple-equation closures (such as shear-stress, v²-f or the even more substantial Reynolds-stress models), the robustness and relative economy of one-equation models, such as that of Spalart and Allmaras [182], are enjoying a return to more widespread favour, and developments of such models to account for more complicated flow situations are now being proposed and introduced [183]. An important issue with the handling of complex geometries such as car body surfaces is the efficient translation from solid model geometry (CAD) representations into a form suitable for automated mesh generation for CFD. Dawes [184] proposes a tightly integrated approach in which a pre-defined mesh also acts as the surface geometry detection mechanism (using algorithms


Fig. 14. Isosurface of instantaneous vorticity over an F-18C aircraft at 30° angle of attack [185].

derived from medical imaging). This also lends itself to boundary surface adaptation in response to the flow, a process known as sculpting. Similar modelling of the interface between flexible membranes or solid surfaces and the forces exerted on them by a fluid medium is the basis of FSI, where finite element modelling can be integrated with CFD to calculate structural deformation in response to varying fluid dynamics loads. LES offers the prospect of less reliance of solutions on the often incomplete representation of flow physics using turbulence models. In LES, an unsteady turbulent flow is simulated in full three-dimensional and time-accurate detail, with only the exception of very small-scale (so-called ‘‘sub-grid’’) energy dissipation processes. The matching of LES techniques to more traditional modelling methods in low turbulence research, such as near walls, offers the prospect of high-fidelity ‘‘numerical experiments’’ being conducted replacing the need for large-scale physical testing. The unsteady information provided by the LES technique also lends itself naturally to the unsteady aerodynamics of separated flows, for example around wind turbine blades or around aircraft at very high angles of attack as shown in Fig. 14, as well as providing the fluctuating pressure information that is vital for controlling unsteady vibrations or acoustic signatures.
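The sub-grid closure idea underlying LES can be illustrated with the classical Smagorinsky model, in which the unresolved dissipation is represented by an eddy viscosity proportional to the resolved strain rate. The sketch below is a minimal, assumption-laden illustration: the 2D velocity field, grid and Smagorinsky constant are all invented, and no actual LES solver is implied.

```python
import numpy as np

# Minimal sketch of the Smagorinsky sub-grid model used in LES:
#   nu_t = (Cs * delta)^2 * |S|,  with |S| = sqrt(2 Sij Sij) from the resolved field.
# The synthetic 2D velocity field below is purely illustrative.

n, L = 64, 2.0 * np.pi
dx = L / n
x = np.linspace(0.0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(X) * np.cos(Y)           # resolved velocity components (Taylor-Green-like pattern)
v = -np.cos(X) * np.sin(Y)

dudx, dudy = np.gradient(u, dx, dx)
dvdx, dvdy = np.gradient(v, dx, dx)
S11, S22 = dudx, dvdy
S12 = 0.5 * (dudy + dvdx)
S_mag = np.sqrt(2.0 * (S11**2 + S22**2 + 2.0 * S12**2))   # strain-rate magnitude

Cs = 0.17                            # typical literature value of the Smagorinsky constant
nu_t = (Cs * dx) ** 2 * S_mag        # sub-grid eddy viscosity field

print("mean sub-grid eddy viscosity:", nu_t.mean())
```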

6. Physical product and process verification and validation 6.1. Product design – physical verification and validation Before digital prototyping and testing became the prerequisites of rapid product development, physical prototyping techniques were prevalent in industry and have influenced product performance, quality and competitiveness in the global markets. Physical testing is still an expected industry practice, frequently linked to product certification. For example, aerospace products undergo strict testing to pass certification criteria and automobile manufacturers are required to test their prototypes following combustion and safety standards. Moreover, physical testing generates valuable knowledge and data that can be utilised to enhance the design of future products or variants. 6.1.1. Dimensional and shape verification and validation Component verification is the process of assessing the conformance of key features and characteristics of a manufactured component to the specifications prescribed by the product designers, as these are captured by the GD&T notations. The scope of this paper is according to the GPS standard [186], that prescribes the surface, geometric and dimensional characteristics involved in verification, as shown in Fig. 15.

Fig. 15. Dimensional and shape characteristics of GPS standards [186].
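As a simple illustration of evaluating one of the geometric characteristics covered by the GPS framework, the sketch below estimates a flatness deviation from measured points using a least-squares reference plane. The point data are synthetic, and a least-squares plane is used only for simplicity; the default association criteria in the GPS standards may differ.

```python
import numpy as np

# Illustrative flatness evaluation: fit a least-squares reference plane z = a*x + b*y + c
# to measured surface points and report the peak-to-valley deviation normal to that plane.
# The measured points are synthetic.

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 100.0, 200)                              # mm
y = rng.uniform(0.0, 50.0, 200)
z = 0.001 * x - 0.0005 * y + rng.normal(0.0, 0.004, 200)      # slightly tilted, noisy surface

A = np.column_stack([x, y, np.ones_like(x)])
(a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)

normal_dist = (z - (a * x + b * y + c)) / np.sqrt(a**2 + b**2 + 1.0)   # signed distance to plane
flatness = normal_dist.max() - normal_dist.min()                        # width of the containing band

print(f"estimated flatness deviation: {flatness * 1000:.1f} micrometres")
```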


Designers define tolerances on core models that are intended to describe the maximum allowable variation from the nominal size. Tolerances do not include any allowance for, or knowledge of, the measurement uncertainty of the equipment used to verify the dimensions. The standard ISO 14253 [187] makes it clear that the onus is on the supplier of the measurement data to guarantee the conformance to specification (tolerance) of the measurements, and that the data takes account of measurement uncertainty. There are several ways of carrying out dimensional and shape verification [171] including direct or indirect measurements, and measuring either all the parts (100% inspection) or a selection of parts. Direct measurements are taken off the part itself by deploying metrology systems suitable for the physical size and scale of the artefacts and these systems are outlined in the enabling technologies Section 6.4. Indirect dimensional verification requires taking inferred dimensions from something other than the part, for example by measuring the jig that is used to assemble the part. Verification may also be inferred statistically through controlling and measuring the process, as outlined in Section 6.3, and this can bring significant cost benefits through improvements to process capability. The level of inspection required for any given feature is dictated by the risk of non-conformance. Depending on the industry sector, design risk is driven by performance, safety and fit. Process and inspection risks are dictated by the capability of the process and inspection systems. Due to the criticality of aerospace components, high-risk features will always be subject to 100% inspection. Features that can be effectively controlled by validating the manufacturing process can be subjected to a reduced inspection regime, typically yielding a 50–75% reduction in final inspection load, reducing measurement time per part. A freeform surface, also known as a complex or sculpted surface, is classified in ISO 17450-1:2007 [186] as a complex feature with no invariance degree. Existing technologies for measuring freeform surfaces are detailed by Savio et al. [188]. Photogrammetry and laser scanning are mature technologies for surface characterisation with measurement accuracies of 5 parts in 10⁵ [189] and 1 part in 10⁴ respectively. Structured light devices are less mature technologies with accuracy of 1 part in 10⁵, but they have potential for achieving higher accuracy than laser line scanners due to the fundamental limits imposed by speckle effects [190,191]. This is where a hybrid system [163] would be advantageous. While the ISO GPS standards define form tolerances such as straightness [192], roundness [193] and cylindricity [194], there is no standard for the verification of freeform surfaces. Multiple instruments are applicable for surface verification, as shown in Fig. 16. The production uncertainties of a freeform surface, compounded by the edge trimming and the assembly processes that freeform surfaces are typically involved in, eventually manifest themselves in gaps, steps and interferences between the surfaces. Gap and flush problems on a fluid dynamic device, such as an aircraft wing, are detrimental to its performance, and the fit of automotive panels is indicative of the build quality of the product.
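Referring back to the ISO 14253 discussion above, the decision rule it implies can be sketched as guard-banding the tolerance zone by the expanded uncertainty U. The limits and measured values below are invented for illustration.

```python
# Sketch of an ISO 14253-1 style decision rule: conformance is proven only when the
# measured value lies inside the tolerance zone reduced by the expanded uncertainty U,
# and non-conformance only when it lies outside the zone enlarged by U.
# The specification limits and measurements are invented for illustration.

def conformance_decision(measured: float, lsl: float, usl: float, U: float) -> str:
    if lsl + U <= measured <= usl - U:
        return "conformance proven"
    if measured < lsl - U or measured > usl + U:
        return "non-conformance proven"
    return "neither proven (result falls in the uncertainty guard band)"

print(conformance_decision(measured=10.012, lsl=10.000, usl=10.050, U=0.008))
print(conformance_decision(measured=10.004, lsl=10.000, usl=10.050, U=0.008))
```

The guard band makes explicit why reducing measurement uncertainty directly recovers usable tolerance and lowers part rejection rates.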

The assembly methods used to minimise freeform surface interface problems can be classified as follows;  Build to nominal: the assembled product tolerance is met by simply making the key features of the parts as accurately as possible. Typically used for small products with features that can be accurately produced.  Measure and adjust: the assembled product tolerance is met by measuring the interfaces and adjusting some of the parts’ position and/or orientation to minimise interface problems. For larger parts which can be difficult and expensive to produce to tight tolerances (such as door panels in the automotive industry), the position and orientation may be manipulated manually or automatically to minimise the overall interface problems [195,196].  Measure for production: the assembled product tolerance is met by measuring one side of the interface and producing the other side using the measured data. For very large freeform shapes such as wings and wind turbine blades, it is very difficult and expensive to produce parts to tight tolerances. It is often preferable to tailor parts to fit the specific physical assembly by producing parts directly using measurements from the assembly [90,188]. 6.1.2. Design structure mapping and hidden features Hidden features can be defined as those which do not easily provide line-of-sight access, as occurs commonly in cluttered assembly environments and complex and enclosed products. Measurement of these features generally requires an ability to ‘‘see around corners’’ or measure directly through opaque objects. Possible approaches include; networks of line-of-sight instruments; mirrors; articulated CMM arms; through-skin sensing (using Hall effect sensors to locate holes, fitted with magnets, on components hidden by other components); and six-degrees-offreedom probing. A key issue with networks of line-of-sight instruments is closing the metrological loop and including sufficient common points from one instrument to the next, so as to minimise error buildup. 6.1.3. Measurement equipment deployment Production metrology begins with the set-up of systems and continues through the in-process measurement and metrology enabled automation [80,81]. Metrology must be seen as a manufacturing process and Muelaner et al. [174] developed a method for measurement planning and instrument deployment. Specification of the environmental conditions in which the measurement is to be carried out should include the average temperature, temperature gradients, pressure, humidity and carbon dioxide content [197]. Accuracy, properly defined as measurement uncertainty [169], is a key performance indicator for metrology. Much work has already been carried out to model measurement uncertainty in industrial measurement processes especially for large volume applications [171] using models

Fig. 16. Examples of freeform surface verification applications.


created for laser-based spherical co-ordinate measurement systems, such as laser trackers and laser radar [170,197]. Co-ordinate measurements may be calculated from a number of angular measurements obtained using cameras, theodolites and iGPS [198]. Calculating the measurement uncertainty is a complex task, since measurement uncertainty impacts on part rejection rates [173,174] and the accuracy of manufacturing processes. Decision rules for proving conformance or non-conformance with specifications are clearly defined by international standards. A component dimension must be accompanied by a tolerance [199] giving a lower specification limit (LSL) and an upper specification limit (USL), while a measurement result must be accompanied by an estimate of measurement uncertainty (U) [169]. Product conformance can be proven by a measurement result that is greater than LSL + U and less than USL − U [187].

6.2. Product testing and validation

6.2.1. Mechanical design testing
The effective mechanical design of a stand-alone product or a structural component is predicated on key stages of development which are summarised in Fig. 17. As already described in Section 5.6.1, the output of FEA modelling depends on the construction of accurate meshed or meshless continua and the correct assignment of materials properties. In many cases such materials property information is available from materials textbooks [200] or in the form of software [201], but if new materials or bespoke composite materials are to be used, materials evaluation is needed to define mechanical properties. Using a range of test coupon geometries, materials evaluation performs the dual role of firstly confirming the correct selection of materials and secondly providing materials properties for FE modelling. Mechanical tests are published by standards bodies such as ASTM International and BSI British Standards. The mechanical testing of fibre composites is given by Hodgkinson [202]. Some materials parameters and materials tests are given in Table 2.

Fig. 17. Mechanical design, verification and validation of products.

Table 2
Selected materials parameters and associated test methods.

Property | Parameter | Test method
Strength (maximum, yield, etc.) | σ (MPa) | Tension, compression, flexure, etc.
Strain (maximum, yield, etc.) | ε | Tension, compression, flexure, etc.
Young's modulus, stiffness | E, cij (GPa) | Tension, compression, flexure, etc.
Dynamic stiffness | Edyn (GPa) | Vibration, time of flight
Shear strength | τ (MPa) | Torsion, shear, tension
Shear strain | γ | Torsion, shear, tension
Shear modulus, stiffness | G, cij (GPa) | Torsion, shear, tension
Elastic compliance | Sij (m² N⁻¹) | All of the above
Poisson's ratio | νij | Tension, compression
Work of fracture | γf (J m⁻²) | Pendulum and drop impact
Critical strain energy release rate | Gc (J m⁻²) | Fracture mechanics tests
Critical stress intensity factor | Kc (Pa m^1/2) | Fracture mechanics tests
Thermal expansion coefficient | α (K⁻¹) | Dilatometer
Glass transition temperature | Tg (K) | DSC, DMTA

Materials tests will determine elastic properties and the onset of yield and will determine whether a linear or a non-linear FE model is required to model the mechanical behaviour of parts. A key feature of the measurement of materials parameters is the effective use of instrumentation. Strain measurement devices such as strain gauges, extensometers and lasers are well known, but techniques such as Electronic Speckle Pattern Interferometry (ESPI), Holographic Interferometry and Digital Image Correlation (DIC) [203] provide more accurate 2D and 3D information on strain distributions around stress concentrations. An obvious method of evaluating products and components is to perform static structural tests in tension, compression and shear to destruction. Performance under cyclic load (fatigue), constant stress (creep) and constant strain (stress relaxation) will allow the determination of parameters such as fatigue life (constant amplitude and complex load or strain), fatigue limit, creep compliance and stress relaxation modulus. The observation and understanding of fracture is achieved by the application of optical, electron and atomic force microscopy. Non-destructive evaluation (NDE) includes a plethora of techniques, often used to locate defects. Some NDE methods are summarised in Table 3.

Table 3
Non-destructive evaluation techniques.

NDE method | Principle of operation
Acoustic emission | Detection of stress waves from defects in materials
C-scan | Ultrasonic detection of sub-surface defects
Eddy current | Monitoring of metallic structures under a magnetic field
Dye penetrant | Colour change of dyes in cracks based on capillary action
Infrared thermography | IR camera measures thermal profile of structures
Photothermal imaging | Pulsed light generates radiation from sub-surface defects
Laser vibrometry | Laser beam Doppler shift detects vibrations and defects
Shearography | Sheared laser-generated image acts as a reference image of a surface; application of load or heat reveals defects
Acoustography | Ultrasonic imaging process

6.2.2. Flow related physical verification and validation
The validation of CFD analysis deals with the assessment of the comparison between computational and experimental results [14,204], as shown in Fig. 18, and this generates valuable data for improving the convergence of Large Eddy Simulation and experimental tests. The key parameters in CFD validation tests deal with the aerodynamic forces that consist of three force components (lift, drag, side force) and three moments (pitching, yawing, rolling). The static aerodynamic forces and moments can be measured indirectly by integrating the surface pressure distribution [204] or directly by strain gauge balance, internal spring balance and load cell. The unsteady aerodynamic forces and moments acting on a maneuvering air vehicle [205] can be measured by using strain gauge balance and load cell. The external flow structure of an air vehicle can be illustrated qualitatively by flow pattern images and quantitatively by measuring flow velocities. Qualitative flow patterns can be

Fig. 18. Flow validation process [14].

Please cite this article in press as: Maropoulos PG, Ceglarek D. Design verification and validation in product lifecycle. CIRP Annals Manufacturing Technology (2010), doi:10.1016/j.cirp.2010.05.005

G Model

CIRP-598; No. of Pages 20 14

P.G. Maropoulos, D. Ceglarek / CIRP Annals - Manufacturing Technology xxx (2010) xxx–xxx

obtained by using flow visualisation techniques such as light scattering particles, dye visualisation, smoke wire, the tuft-grid method and the oil-film method. The Laser-Induced Fluorescence technique can visualise the flow pattern in a 2D plane of a 3D flow field [206]. Quantitative data of the flow structures can be obtained by measuring flow velocities using Pitot tubes (one velocity component) and five-hole probes (three velocity components) for steady velocity measurement at one point. Fluctuating velocities can be measured by using thermal anemometers (intrusive) and laser Doppler velocimetry (non-intrusive). Particle tracking velocimetry and particle image velocimetry are capable of obtaining velocity information on a 2D plane, and volumetric three-component velocimetry has been applied successfully in capturing the whole volumetric flow information [207,208].
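To illustrate the indirect route mentioned above, in which aerodynamic forces are obtained by integrating the surface pressure distribution, the sketch below sums panel-wise pressure forces around a 2D section. The geometry (a lifting circular cylinder in ideal flow) and the pressure distribution are synthetic stand-ins for tapped-pressure data; the sketch is not taken from the cited experimental procedures.

```python
import numpy as np

# Indirect force determination sketch: integrate a synthetic surface pressure
# distribution around a closed 2D contour to obtain force per unit span.
# The contour is a circle of chord 1 m and the pressure coefficient is the
# ideal-flow distribution on a cylinder with circulation (illustrative only).

theta = np.linspace(0.0, 2.0 * np.pi, 361)          # closed, counter-clockwise contour
x = 0.5 * (1.0 + np.cos(theta))
y = 0.5 * np.sin(theta)
cp = 1.0 - (2.0 * np.sin(theta) + 0.5) ** 2          # pressure coefficient at each station

q = 0.5 * 1.225 * 30.0 ** 2                          # dynamic pressure at 30 m/s, sea-level air
p = cp * q                                           # gauge surface pressure (Pa)

dx, dy = np.diff(x), np.diff(y)
p_mid = 0.5 * (p[:-1] + p[1:])

# Force per unit span from pressure acting on the outward normal of each panel.
Fx = -np.sum(p_mid * dy)                             # drag-direction component (~0 in ideal flow)
Fy = np.sum(p_mid * dx)                              # lift-direction component

print(f"integrated force per unit span: Fx = {Fx:.1f} N/m, Fy = {Fy:.1f} N/m")
```

For this synthetic distribution the integrated lift agrees with the Kutta–Joukowski value, which is the kind of internal consistency check typically applied before comparing against balance measurements.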

Fig. 19. Digitisation methods for dimensional verification.

6.3. Physical process verification and validation

The formal manufacturing process verification involves the stages of inspection, analysis, testing and demonstration. Process validation is a means of ensuring that manufacturing processes are capable of consistently producing a finished product of the required quality, and it typically involves the following formal methods: fault inspection, dependability analysis, hazard analysis, reproducibility analysis and risk analysis [11]. Process validation is conducted in the context of a system including design control, quality assurance, process control, and corrective and preventive action [19].

6.3.1. Statistical process control and Taguchi's robust design
Within the field of statistical process control (SPC), a large number of techniques [209] have become established with the goal of improving the quality of manufactured products through the reduction of variability. SPC uses empirical evidence and statistical analysis to identify quality problems. All processes contain some unavoidable random variability, with random causes referred to in SPC as chance causes. Avoidable sources of variability such as faults in machinery, operator errors or defects in materials are referred to as assignable causes. A primary objective in SPC is to detect where processes are out of statistical control so that assignable causes can be identified and eliminated. Taguchi's robust design objective is to reduce the output variation from the target by reducing the sensitivity to noise, such as manufacturing variations and deterioration over time [210]. The approach uses the "loss model" because it actually fits a loss measure in a signal-to-noise (S/N) ratio format. The idea is to maximise the S/N ratio through design of experiments. The focus is to increase the robustness of the system's performance.

6.3.2. Six sigma and root cause analysis
Developed at Motorola in the early 1980s, 6-sigma is a business process methodology that enhances customer satisfaction from products or services by improving manufacturing processes [211]. Design for Six Sigma (DFSS) is a methodology utilizing tools, training and measurements to enable the design of products and processes that meet customer needs and can be produced at six sigma quality levels [87,212]. To control dimensional variations during manufacturing, efficient six sigma fault root cause diagnosis is critical for improving the quality and productivity of processes [144,213]. Ceglarek and Shi [214] proposed a diagnostic approach involving single faults in a single assembly fixture, and this work was extended by Ding et al. [215] using the state space modelling technique. In order to overcome problems related to an ill-conditioned system, Rong et al. [216] proposed unrotated Singular Value Decomposition and matrix partitioning techniques. Liu and Hu [217] proposed designated component analysis for dimensional fault diagnosis by pre-defining a set of fault patterns called designated components. Apley and Lee [218] proposed independent component analysis to model the fault variation pattern with the assumption that no more than one error source follows a normal distribution. However, these approaches are insufficient in the case of an ill-conditioned system. An Enhanced Piecewise Least Squares approach was proposed by Ceglarek et al. [219] to diagnose the six sigma root causes associated with product variation.

6.4. Enabling verification technologies

The physical scale and shape of the component and the accuracy of the required measurement tasks are key determinant factors for the selection of verification methods and technologies. Fig. 19 shows a classification of digitisation methods for dimensional verification and validation. Broadly speaking, contact methods are suitable for small to medium size components, of
