6B-173.qxd  4/29/2013  1:20 PM  Page 675

R. Stouffs, P. Janssen, S. Roudavski, B. Tunçer (eds.), Open Systems: Proceedings of the 18th International Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2013), 675–684. © 2013, The Association for Computer-Aided Architectural Design Research in Asia (CAADRIA), Hong Kong, and Center for Advanced Studies in Architecture (CASA), Department of Architecture-NUS, Singapore.

PROTOTYPE IMPLEMENTATION OF A LOOSELY COUPLED DESIGN PERFORMANCE OPTIMISATION FRAMEWORK

Volker MUELLER, Drury B. CRAWLEY and Xun ZHOU
Bentley Systems, Incorporated, Exton, PA, United States
[email protected], [email protected], [email protected]

Abstract. Integration of analyses into early design phases poses several challenges. An experimental implementation of an analysis framework in conjunction with an optimization framework ties authoring and analysis tools together under one umbrella. As a prototype, it underwent intensive use testing in the context of the SmartGeometry 2012 workshop in Troy, NY. In this prototype the data flow uses a mix of proprietary and publicised file formats, exchanged through publicly accessible interfaces. The analysis framework brokers between the parametric authoring tool and the analysis tools. The optimization framework controls the processes between the authoring tool and parametric engine on one side and the optimization algorithm on the other. In addition to some user-implemented analyses inside the parametric design model, the prototype makes energy analysis and structural analysis available. The prototype allows testing of assumptions about workflow, implementation, usability and general feasibility of the pursued approach.

Keywords. Design-analysis integration; design refinement; optimization.

1. Introduction

In building design there has been increasing interest in integrating analyses into the design process at earlier stages. This poses a number of challenges, such as interoperability between design authoring and analysis tools, the cognitive distance between conceptual design and the level of detail required by analysis software, and the actionable presentation of analysis results to designers. This research is aimed at finding solutions to these challenges. Solutions are implemented iteratively according to an agile development approach, while assumptions are tested in formal and informal usability settings. This paper reports on the progress of this process, intermediate findings and conclusions for further development. The assumption is, of course, that these capabilities may enable design teams to arrive at better-performing designs

when compared to traditional design approaches. This seems to be an appropriate response to today’s demands for high-performance buildings.

2. Challenges of Design and Analysis Integration and Pursued Solutions

Interoperability: The situation in the software industry concerned with designing buildings appears to reflect the state of the building industry at large: it is fragmented. There are many incompatible design and analysis software tools using incompatible data formats. There are various approaches towards increased interoperability: (1) A tight, closed, proprietary system of software applications, unified through a proprietary data format. (2) A loose system of many software tools using as many data formats; individual translation modules resolve data transformations as needed, resulting in many one-to-one mappings. Systems like this are being proposed by some of the researchers collaborating through the Open Systems and Methods for Built Environment Modelling initiative1 (Janssen et al., 2012). (3) A loose system of tools using an open or otherwise publicised, standardised data format such as the Industry Foundation Classes (IFC) or Green Building XML (gbXML). The latter two approaches frequently employ workflow control systems or specific structured data layers to glue the pieces together (Flager et al., 2008; Toth et al., 2012). This research project uses a mix of the first and third approaches: it uses a shared data representation while initially focusing on the implementation of frameworks that allow loose association of authoring and analysis tools through an application programming interface (API).

Data equivalency: Authoring tools may not allow creation of all the data required by a specific analysis tool, leading to gaps in data that prevent successful analyses. There is no general remedy to this problem.
When an authoring tool does not have the capability to generate the required data, there is little to do beyond alerting the designers to the missing data. As a response to the challenge of data equivalency, we propose an intermediary analytic framework between authoring tools and analysis tools. The analytic framework API allows analysis engines to register the type of analysis they offer as well as the data required by their analysis services.

Data discrepancy: Conceptual design moves through large ideas and concepts and is less concerned with the detailed specification of constructions. Analysis software, in contrast, often requires very detailed specifications in order to compute the performance of a building (Bleil de Souza, 2012). Analogous to the challenge of data equivalency, missing information can be pointed out to designers based on the knowledge of which analysis services require what type of data. A complementary remedy is the search for types of analyses applicable to early design that require only an adequate amount of information; this is a separate research project.
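The registration-and-brokering idea described above can be sketched as follows. This is a minimal illustration, not the prototype's actual API: the names `AnalyticFramework`, `AnalysisService`, `missing_data` and `request` are assumptions introduced for this sketch.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class AnalysisService:
    """A registered analysis engine: the analysis type it offers,
    the data it requires, and a callable that performs the analysis."""
    name: str
    required_fields: List[str]
    run: Callable[[dict], dict]

class AnalyticFramework:
    """Brokers between authoring tools and analysis engines."""

    def __init__(self) -> None:
        self._services: Dict[str, AnalysisService] = {}

    def register(self, service: AnalysisService) -> None:
        """An analysis engine registers its service with the framework."""
        self._services[service.name] = service

    def missing_data(self, name: str, model_data: dict) -> List[str]:
        """Report required inputs absent from the design model, so the
        designer can be alerted before an analysis is attempted."""
        svc = self._services[name]
        return [f for f in svc.required_fields if f not in model_data]

    def request(self, name: str, model_data: dict) -> dict:
        """Route an analysis request to the registered engine,
        refusing to run it while required data is missing."""
        gaps = self.missing_data(name, model_data)
        if gaps:
            raise ValueError(f"cannot run '{name}': missing {gaps}")
        return self._services[name].run(model_data)
```

Because each engine declares its required inputs at registration time, the broker can point out data gaps generically instead of needing per-analysis checks in the authoring tool.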

Speed of feedback: Design is an iterative process. Iterations are frequent and fast. Any analyses feeding back into these design iterations have to be fast enough to remain relevant for the current iteration (Hetherington et al., 2011). In order to accelerate feedback about the design, various strategies are being pursued; in this research project, opportunities in cloud computing are being investigated.

Performance proxies: As a mitigating strategy for accurate but slow analyses, science often uses proxy indicators that are sufficiently reliable, well enough aligned with the behaviour under investigation, and quickly computed. However, as yet there is not sufficient research to permit the use of such performance proxies.

Results display: Visualization of analysis results is not always intuitive and often not visually related to the geometry model of the design (Dondeti and Reinhart, 2011). When working with parametric design tools, there are no well-developed conventions for comparing design variants by their predicted performance (see Figures 3 and 5). Within this research project, opportunities have been provided to expand the investigation of results display but have not yet been deeply explored. The initial focus in the context of design feedback has been on displaying analysis results in the context of the model.

In-context results: Analysis results are commonly visualized only in a representational way, without the analysis results actually being available in the digital design model for access by computational processes. Direct accessibility in the model enables automation of refinement iterations or optimisation processes. Because the goal of this research is the integration of analyses into the early design process, displaying analysis results in the context of the design is an important part of the implementation.
Relevant to the idea of integrating analysis feedback into the parametric design workflow is the actual parametric accessibility of the analysis results to the computational system. In this research, analysis results can be matched with the analyzed elements so that results can be used in parametric dependencies in order to change the design as a consequence of how it performs.

Human-machine balance: Especially in early design, not all design goals are measurable, commensurate, or even quantifiable at all, the latter also because qualities are a strong factor in successful design. This poses the challenge of how to balance computed performance metrics with qualitative aspects whose evaluation is based on the designers’ judgments. Several approaches are possible to resolve this dilemma: (1) Avoid any type of automation and fully focus on supporting design team decisions; the design team may utilize analyses and in-context result display to inform its decisions. (2) Implement some complex, pre-defined workflow that mixes design refinement automation with interaction between the system and the design team (Geyer and Beucke, 2010).

(3) Rely on the multi-level influence the design team can exert on the entire system, from the reasoning behind the parametric model with its behaviours to the selection of parameters that are accessible to the optimization engine or the metrics for the fitness of a specific design. Additional decision support for the team may be available through an interface that allows analysis of all computed solutions, or exploration of the Pareto-optimal set of optimisation runs.

All scenarios are suited to support the design team in its tasks; however, the last approach makes the full power of computational design available to the design team.

3. System Implementation for SmartGeometry 2012

3.1. PARAMETRIC DESIGN AND ANALYSIS WORK FLOW

High-level workflows for establishing a parametric model for design and analysis have been described in detail in various places, probably most thoroughly by CIFE (Chachere and Haymaker, 2008). The design workflow for the prototype uses a variation on this theme, acknowledging the difference between measurable (and computable) performance goals and hard-to-measure (or non-computable) design goals (Figure 1). Performance and design goals are deduced from the project goals. Performance goals determine which simulations or analyses are appropriate to help assess the

Figure 1. The parametric design and analysis workflow including human-centred critique and computation-driven evaluation and optimisation.

performance of the design. Simulations and analyses require specific analysis models; in order to generate these analysis models, specific inputs are required. The analysis models and, on the design side, the design models react to changing design inputs and generate new models as design variants: material for simulation, analysis, or the architectural design parti. While the results of computational simulations and analyses may be evaluated computationally, the resulting version of the design parti is commonly subject to human critique. The design team may set up an automated design feedback loop that uses some optimisation scheme; the optimisation algorithm manipulates parameters that were identified by the design team in order to generate design variants for renewed evaluation. The response to the critique, or the human response to any type of automated performance-driven design refinement, however, is not limited to the parametrically accessible change domain. Change could be effected at any level, starting at the top level of project goals: deduced performance or design goals can be changed; the performance metrics, the collection of simulations or analyses, or the system of parametric models can be modified; as can the nature, range, and resolution of input variables.

3.2. PROTOTYPE FOR SMARTGEOMETRY 2012

For the SmartGeometry event in Troy, NY, in March of 2012, a first implementation step for this system was taken. The workshop cluster “Material Conflicts” used a prototype implementation of the design system (Figure 2).

Figure 2. The architecture for the prototype at SmartGeometry 2012.

Designers create a parametric model of their design, in part with GenerativeComponents (GC) features that represent structural members (nodes, beams, columns) or energetically active building elements (surfaces as walls, floors or windows). In combination with complementary project-level information (project name, location, weather data, building type and foundation type), these data are routed to their respective analysis nodes in GC. These analysis nodes communicate with the analytic framework, which in turn routes the analysis requests with all required data to the analysis engines. In the prototypical implementation, analysis engines reside either on the client computer or in “the cloud”, in this case Microsoft Azure. Post-processing scripts extract a selection of analysis results. The analysis framework transmits these analysis results back to the analysis nodes in GenerativeComponents, where they are made available for further computational consumption, for example to determine a design’s fitness in the form of decision values. The decision values are passed to the Design Evolution node in GC, which connects to the Darwin Framework and its evolutionary optimisation algorithms.2 Designers also identify all design variables that they want to be considered changeable for design iterations, indicating the design’s genome that the evolutionary algorithm needs to consider. After evaluating the genome, the optimisation framework generates genotypes for all individuals in a generation’s population and then waits for all fitness values to arrive. It evaluates the fitness of the individuals to determine the next parent genome and develop the next generation’s phenotypes.

3.3. LIMITATIONS OF THE IMPLEMENTATION

A prototype implementation will always remain limited, and some of those limitations are prone to overshadow other observations. Analysis models were kept at the minimal implementation necessary to allow analyses to execute while possibly achieving sufficient completeness of the models for conceptual design. The prototype suffered from a general lack of robustness, and the variables that could be optimised were limited.
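The generate–evaluate–select cycle described in Section 3.2 can be sketched in a few lines. This is a deliberately minimal stand-in, not the Darwin Framework's actual algorithm: the function name `optimise`, the mutation scheme and all parameter values are assumptions, and the fitness callable stands in for the whole analysis round trip that produces decision values.

```python
import random

def optimise(bounds, fitness, pop_size=20, generations=30, seed=42):
    """Minimal evolutionary loop: mutate a parent genome into a
    generation of genotypes, wait for all fitness values, keep the best.

    bounds  - (lo, hi) per design variable: the genome the designers expose
    fitness - maps a genotype to a decision value (lower is better)
    """
    rng = random.Random(seed)
    # initial parent genome: one random value per exposed design variable
    parent = [rng.uniform(lo, hi) for lo, hi in bounds]
    for _ in range(generations):
        # generate this generation's genotypes by mutating the parent,
        # clamping each gene to its designer-specified range
        population = [
            [min(hi, max(lo, g + rng.gauss(0, 0.1 * (hi - lo))))
             for g, (lo, hi) in zip(parent, bounds)]
            for _ in range(pop_size)
        ]
        # evaluate all fitness values, then pick the next parent genome
        best = min(population, key=fitness)
        if fitness(best) < fitness(parent):
            parent = best
    return parent
```

In the prototype the fitness evaluation is the expensive step (a full analysis per individual), which is why the framework generates a whole generation's genotypes up front and then waits for all fitness values to arrive.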

Material Conflicts cluster participants explored various approaches with the prototype system. Optimisation schemes that were examined comprised an urban study for Penang by Greig Paterson of Aedas (Figure 3), a natural ventilation scheme utilizing a modulated stack effect by Lem3a and Kristoffer Negendahl (Figure 4), and Dr. Woodbury’s exploration of an alternative representation of a Pareto front (Figure 5).
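Dr. Woodbury's exploration rests on the notion of a Pareto front: the set of solutions not dominated by any other solution. A minimal computation of such a front, assuming every objective is minimised, might look like the following; the function name and tuple representation are illustrative assumptions.

```python
def pareto_front(points):
    """Return the non-dominated subset of `points`.

    Each point is a tuple of objective values (minimisation assumed).
    A point q dominates p when q is no worse in every objective and
    strictly better in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(o <= s for o, s in zip(q, p)) and
            any(o < s for o, s in zip(q, p))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```

The pairwise check is O(n²) in the number of solutions, which is acceptable for the population sizes of a workshop-scale optimisation run; faster sort-based algorithms exist for larger archives.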

Figure 3. Greig Paterson’s urban scheme for Penang - twelve top solutions.

Figure 4. Lem3a’s and Kristoffer Negendahl’s stack effect utilization for natural ventilation, optimization of envelope to modulate stack effect in relation to window openings.

Figure 5. Dr. Woodbury’s experiment in Pareto-front visualisation.

3.5. CONCLUSIONS FROM SMARTGEOMETRY EXPERIMENTS

There were various lessons learned, suspicions or prior awareness confirmed, and assumptions refuted during the experiments at SmartGeometry 2012:
• Special features representing elements for analyses cannot be added to the parametric platform in a sustainably scalable manner. With additional analyses, the solution must pursue flexible utilization of existing features rather than the addition of more specialized features.
• Some assumptions made to simplify user input of required analysis data were shown to be too limiting.
• Some features, such as shading surfaces in the energy model or explicit tagging of foundation nodes in the structural model, had been omitted due to time constraints; however, the user explorations at SmartGeometry 2012 confirmed that they are necessary features for conceptual design.
• The analytic framework needs to include, on the parametric platform side, an automatically generated footprint based on the specification of required inputs by the analysis engine that registers with the analytic framework. It is not a sustainable solution to add analysis nodes for each new analysis type.
• Analysis failures did not post clear messages about the reasons for failure to the users. Most failures occurred due to errors in the data submitted to the analyses. A validity check for analysis data appears to be a helpful addition; certainly, improved error reporting alone would already increase usability.
• Some workflows pursued by workshop participants clearly led to models that required additional clean-up before submitting them to analyses. However, these workflows appear common enough to pose a constant challenge to such analysis-intensive design approaches.
• Running only the analyses in the cloud adds a triple penalty: sending the request with the required data to the cloud, slower processing on usually low-performing cloud nodes, and extraction and download of analysis results from the cloud to the parametric platform for further processing.
• The average wait time of a job on the cloud was 6 seconds, which, with appropriate parallelization of longer-running analyses, appears feasible in light of the expected overall time savings.
• Individual analyses took anything from fractions of a second to several minutes to several hours, the latter certainly an outlier; however, if such an analysis were executed for an individual in a population generated by an evolutionary algorithm, it would delay progress on that generation. Other approaches may be necessary for the optimisation framework to deal with such outliers without delaying the overall progress of the optimisation. Without this outlier, the average analysis job duration was 1’46”.
• The optimisation framework’s interface lacked robustness; it was not possible to assess the optimisation framework’s performance because of that. However, some optimisation runs did execute for a fair number of generations, indicating the potential of this technology.
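One way an optimisation framework could contain such outliers, sketched here as an assumption rather than as the prototype's behaviour: give every analysis job a time budget and assign a penalty fitness when the budget is exceeded, so a single slow analysis cannot stall a whole generation. The names are illustrative, and note that in this thread-pool sketch a timed-out job still occupies a worker until it finishes; a production version would need real cancellation on the cloud side.

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutTimeout

PENALTY = float("inf")  # worst fitness for individuals whose analysis overruns

def evaluate_generation(analyse, genotypes, budget_s=1.0, workers=8):
    """Run one analysis per genotype in parallel; an analysis exceeding
    `budget_s` seconds is abandoned and its individual receives the
    penalty fitness, so the generation as a whole keeps moving."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(analyse, g) for g in genotypes]
        results = []
        for f in futures:
            try:
                results.append(f.result(timeout=budget_s))
            except FutTimeout:
                results.append(PENALTY)
    return results
```

With a penalty fitness, outlier individuals are simply selected against in the next generation instead of delaying it; an alternative would be to carry the late result over into a later generation once it arrives.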

4. Ongoing and Future Work

Based on the results of the experiments at SmartGeometry 2012 and subsequent experiments, work has focused on several areas:
• Increasing the robustness of the analytic framework.
• Increasing the robustness of communications between software components and services on the cloud.
• Refactoring the optimization framework for robustness.
• Redesigning the user interface to the optimization node in the parametric model.
• Increasing the “completeness” of the analysis and simulation models without increasing the required model complexity.
• Increasing the capabilities of the analytical models, for example by adding automated zone creation for energy analysis.
• Redesigning the software architecture so that all components can run as services in the cloud, in order to minimize cloud upload and download penalties while maximizing parallelization in the cloud (Figure 6).
• Finding proxy analyses or simulations that may guide conceptual design with sufficient reliability while executing at speeds superior to traditional analyses.

Figure 6. Conceptual sketch of cloud services architecture.

Acknowledgements This work was supported by contributions from various groups within Bentley Systems, Incorporated, especially the Applied Research Group, Special Technology Projects, Design and Simulation’s Structural Analysis team, and the BentleyCONNECT team.

Endnotes
1. Open Systems and Methods for Built Environment Modelling initiative. Available from: (accessed 7 December 2012).
2. Darwin Optimization (version 0.91) by Dr. Zheng Yi Wu. Available from: Be | Communities (accessed 7 December 2012).

References

Bleil de Souza, C.: 2012, Contrasting paradigms of design thinking: The building thermal simulation tool user vs. the building designer, Automation in Construction, 22, 112–122.
Chachere, J. and Haymaker, J.: 2008, Framework for Measuring Rationale Clarity of AEC Design Decisions, CIFE Technical Report TR177, Stanford University.
Dondeti, K. and Reinhart, C.F.: 2011, A “Picasa” for BPS – An interactive data organization and visualization system for building performance simulations, in V. Soebarto and T. Williamson (eds.), Proceedings of Building Simulation 2011: 12th Conference of the International Building Simulation Association, Sydney, Australia.
Flager, F., Soremekun, G., Welle, B., Haymaker, J. and Bansal, P.: 2008, Multidisciplinary process integration and design optimization of a classroom building, CIFE Technical Report TR175, Stanford University.
Geyer, P. and Beucke, K.: 2010, An integrative approach for using multidisciplinary design optimization in AEC, in W. Tizani (ed.), Proceedings of the International Conference on Computing in Civil and Building Engineering, Nottingham University Press.
Hetherington, R., Laney, R., Peake, S. and Oldham, D.: 2011, Integrated building design, information and simulation modelling: the need for a new hierarchy, Proceedings of Building Simulation 2011, Sydney.
Janssen, P., Stouffs, R., Chaszar, A., Boeykens, S. and Toth, B.: 2012, Data transformations in custom digital workflows: Property graphs as a data model for user-defined mappings, International Workshop: Intelligent Computing in Engineering 2012 – EG-ICE 2012, Munich, Germany.
Toth, B., Boeykens, S., Chaszar, A., Janssen, P. and Stouffs, R.: 2012, Custom digital workflows: A new framework for design analysis integration, in T. Fischer, K. De Biswas, J. J. Ham, R. Naka and W. X. Huang (eds.), Beyond Codes and Pixels: Proceedings of the 17th International Conference on Computer-Aided Architectural Design Research in Asia, Association for Computer-Aided Architectural Design Research in Asia (CAADRIA), Hong Kong, 163–172.