
Design of complex engineered systems

Published online by Cambridge University Press: 30 September 2014

Irem Y. Tumer*
Affiliation:
School of Mechanical, Industrial, and Manufacturing Engineering, Oregon State University, Corvallis, Oregon, USA
Kemper Lewis
Affiliation:
Department of Mechanical and Aerospace Engineering, University at Buffalo, SUNY, Buffalo, New York, USA
*Reprint requests to: Irem Y. Tumer, School of Mechanical, Industrial, and Manufacturing Engineering, Oregon State University, Corvallis, OR 97331, USA. E-mail: irem.tumer@oregonstate.edu

Type: Guest Editorial
Copyright © Cambridge University Press 2014

As engineering systems grow in size and complexity, they pose increasingly significant challenges. In particular, the cost and time required for design and development are growing at an unsustainable rate, as evidenced by delays and cost overruns at major organizations including Boeing, Airbus, NASA's Constellation program, General Motors, and Chrysler. Furthermore, failures introduced during the development process have serious consequences for the industries developing large complex systems such as aircraft, space launch systems, submarines, and military vehicles.

Over the last decade, these industries have acknowledged that such challenges warrant new theories and methodologies. In response, multiple research communities have risen to the challenge of exploring the fundamental issues underlying these growing problems. This Special Issue presents a subset of the research from these communities, focusing specifically on the early design stages.

The Special Issue begins with some very interesting work from Steven Eppinger, Nitin Joglekar, Alison Olechowski, and Terence Teo, who, in their paper, “Improving the Systems Engineering Process with Multilevel Analysis of Interactions,” present advances in multilevel design structure matrix (DSM) representations of complex engineering systems. The method accounts for multilevel data in the analysis of dependencies using DSM models, extending the representation schema to incorporate multilevel and multiple-time-scale test coverage data as vectors in the off-diagonal DSM cells. A particularly inventive aspect of the method integrates the product architecture, as modeled by the DSM, with the traditional systems engineering V-model of tasks in the development process. Readers of this paper will recognize that the multilevel analysis of DSMs contributes a data collection and mapping methodology, providing engineering managers with insights for improving the systems integration process. It also contributes a theoretical basis and a method for data aggregation and query that accounts for differing scales, in terms of both level and timing, to explore whether different types of integration risk become evident at different time scales.
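As a purely illustrative sketch (not the authors' implementation), the core representational idea, namely off-diagonal DSM cells that carry vectors of test coverage data across levels and time scales rather than a single dependency mark, might look roughly as follows in Python; the components, levels, and coverage values are hypothetical.

```python
import numpy as np

# Hypothetical components of a decomposed system (illustrative only).
components = ["avionics", "propulsion", "thermal", "structures"]
n = len(components)
levels = ["unit", "subsystem", "system"]  # assumed test levels

# A conventional binary DSM marks dependencies on the off-diagonal.
dsm = np.zeros((n, n), dtype=int)
dsm[0, 1] = dsm[1, 0] = 1   # avionics <-> propulsion
dsm[1, 2] = dsm[2, 1] = 1   # propulsion <-> thermal

# Multilevel extension: each dependent cell carries a vector of test
# coverage values, one entry per level (and, in general, per time scale).
coverage = {}
for i in range(n):
    for j in range(n):
        if i != j and dsm[i, j]:
            # Hypothetical coverage fractions at unit/subsystem/system level.
            coverage[(i, j)] = np.array([0.9, 0.6, 0.2])

# A simple aggregation query: which interfaces are weakly covered at the
# system level and might therefore carry late integration risk?
risky = [(components[i], components[j])
         for (i, j), vec in coverage.items()
         if vec[levels.index("system")] < 0.5]
print(risky)
```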

Reliability and uncertainty are concepts of paramount importance in designing complex engineering systems. Appropriately, a set of papers focuses on unique elements of reliability and uncertainty modeling, analysis, and simulation in complex systems. In their paper, “A Robust System Reliability Analysis Using Partitioning and Parallel Processing of Markov Chain,” Po Ting Lin, Yu-Cheng Chou, Yung Ting, Shian-Shing Shyu, and Chang-Kuo Chen present a robust method for analyzing the reliability of complex redundant systems. By partitioning and reordering a Markov chain's transition probability matrix, the method can leverage parallel computing techniques, because the submatrix calculations are independent of each other. The coupled submatrices can represent subsystems, modules, or controllers of a larger complex system, allowing the model to be used across a diverse set of applications. Simulation results demonstrate that, compared with the sequential method applied to an intact Markov chain, the proposed method improves performance while maintaining acceptable accuracy for reliability analysis of large-scale systems.
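The following minimal sketch conveys only the general idea of partitioning a reordered transition probability matrix into independent subsystem blocks and evaluating them in parallel; it is not the authors' algorithm, and the block structure, state definitions, and series combination at the end are illustrative assumptions.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def block_reliability(P_block, k=100):
    """Probability that a subsystem, starting in state 0 (fully working),
    has not entered its absorbing failed state (last state) after k steps."""
    Pk = np.linalg.matrix_power(P_block, k)
    return 1.0 - Pk[0, -1]

# Hypothetical 3-state subsystem chains (working -> degraded -> failed),
# obtained after reordering the full transition matrix by subsystem.
blocks = [
    np.array([[0.98, 0.015, 0.005],
              [0.0,  0.97,  0.03 ],
              [0.0,  0.0,   1.0  ]]),
    np.array([[0.99, 0.008, 0.002],
              [0.0,  0.95,  0.05 ],
              [0.0,  0.0,   1.0  ]]),
]

if __name__ == "__main__":
    # The block computations are independent, so they parallelize naturally.
    with ProcessPoolExecutor() as pool:
        rel = list(pool.map(block_reliability, blocks))
    # Series combination of subsystem reliabilities (a simplifying assumption).
    print(rel, np.prod(rel))
```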

In their paper, “Managing Uncertainty in Potential Supplier Identification,” Yun Ye, Marija Jankovic, Gül Kremer, and Jean-Claude Bocquet focus on the problem of identifying a network of suppliers to provide a large number of compatible modules for complex system integration and assembly. Because of the use of modular design in complex systems, suppliers are typically more involved in the innovative design of the system. However, using novel architectures and suppliers with potentially better performance often comes with higher levels of uncertainty, because new suppliers usually introduce uncertainty into system development; this makes the technical ability of suppliers all the more important for satisfying system requirements. To address this challenge, the paper presents an Architecture and Supplier Identification Tool, which generates all possible product architectures and the corresponding suppliers from new requirements through matrix mapping and propagation. The tool allows the overall uncertainty and requirements satisfaction of the generated architectures to be estimated and controlled. The proposed method effectively uses both requirement satisfaction and uncertainty thresholds to filter possible architectures and suppliers, providing decision support for the early design of complex systems.
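As a rough, hypothetical sketch of matrix mapping and propagation (not the tool itself), requirements can be propagated to candidate suppliers through binary incidence matrices and then screened against an uncertainty threshold; the matrices, scores, and threshold below are all invented for illustration.

```python
import numpy as np

# Hypothetical mapping matrices (binary incidence), illustrative only.
# Rows: requirements, columns: candidate modules.
req_to_module = np.array([[1, 0, 1],
                          [0, 1, 0]])
# Rows: modules, columns: candidate suppliers.
module_to_supplier = np.array([[1, 1, 0],
                               [0, 1, 1],
                               [1, 0, 1]])

# Propagation: which suppliers are reachable from each requirement?
req_to_supplier = (req_to_module @ module_to_supplier) > 0

# Hypothetical per-supplier uncertainty scores (e.g., new vs. known supplier).
supplier_uncertainty = np.array([0.2, 0.7, 0.4])
threshold = 0.5

# Filter: keep suppliers that cover the requirement and fall below the
# uncertainty threshold, mirroring the idea of uncertainty-based screening.
feasible = req_to_supplier & (supplier_uncertainty < threshold)
print(feasible)
```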

Many design processes, especially when complex systems are being developed, are subject to a wide range of uncertain disruptions, interruptions, and changes. These may range from local analysis errors and modeling mistakes to global natural disasters or the bankruptcy of a supplier. Regardless of their sources, these disruptions can have a significant impact on the transient response and convergence of a design process to an effective solution. Sourobh Ghosh, Erich Devendorf, and Kemper Lewis, in their paper, “Exploring the Effectiveness of Parallel Systems in Distributed Design Processes Subjected to Stochastic Disruptions,” study the aggregate impact of two classes of disruptions on the process of solving a network of coupled optimization problems, representing an abstraction of a decomposed complex system. One of the novel contributions of the study is the recognition that solving the coupled optimization problems in parallel tends to best mitigate the impact of the stochastic disruptions, allowing the process to converge to an equilibrium solution while minimizing the impact on convergence time.
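A minimal sketch of this setting, under assumed toy subproblems, follows: two coupled quadratic objectives are solved by repeated best responses, updated either in parallel (Jacobi-style) or sequentially (Gauss-Seidel-style), while random disruptions occasionally perturb a shared variable. It is an abstraction in the spirit of the paper's setup, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two coupled quadratic subproblems (an abstraction of a decomposed design):
#   designer 1 minimizes (x - 0.5*y - 1)^2   -> best response x = 0.5*y + 1
#   designer 2 minimizes (y - 0.3*x - 2)^2   -> best response y = 0.3*x + 2
def br1(y): return 0.5 * y + 1.0
def br2(x): return 0.3 * x + 2.0

def run(parallel, p_disrupt=0.1, iters=50):
    x, y = 0.0, 0.0
    for _ in range(iters):
        if parallel:
            # Jacobi-style update: both designers act on last iteration's values.
            x, y = br1(y), br2(x)
        else:
            # Gauss-Seidel-style update: designer 2 sees designer 1's new value.
            x = br1(y)
            y = br2(x)
        # Stochastic disruption: with some probability an analysis error
        # perturbs one of the shared design variables.
        if rng.random() < p_disrupt:
            x += rng.normal(scale=0.5)
    return x, y

print("parallel:  ", run(parallel=True))
print("sequential:", run(parallel=False))
```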

As the sources of potential uncertainty increase in the design and operation of complex systems, the potential for emergent system failures also increases unless the uncertainties are modeled in predictive system simulation. Predicting the existence and impact of such failures is even more challenging early in a design process, when detailed system models are not available. In their paper, “Reasoning about System-Level Failure Behavior From Large Sets of Function-Based Simulations,” David Jensen, Oladapo Bello, Christopher Hoyle, and Irem Y. Tumer develop an innovative approach to predicting system-level failure behavior early in a design process using qualitative descriptions of component behavior and a functional failure reasoning tool. In turn, the classification of these system-level behaviors can be used by designers to improve the system design. Two data analysis tools, a modified k-means clustering algorithm and the statistical technique of latent class analysis, are presented, and their ability to support designer reasoning about how the system responds to complex fault scenarios is compared.
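To illustrate the clustering step only, the sketch below applies standard k-means (a stand-in for the paper's modified algorithm) to a synthetic set of simulation outcome vectors; the features, data, and choice of k are all assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic stand-in for a large set of function-failure simulation outcomes:
# each row is one fault scenario, each column a qualitative system-level
# metric (e.g., fraction of functions lost, degraded, or still nominal).
nominal  = rng.normal([0.05, 0.10, 0.85], 0.03, size=(200, 3))
degraded = rng.normal([0.30, 0.40, 0.30], 0.05, size=(120, 3))
failed   = rng.normal([0.80, 0.15, 0.05], 0.05, size=(80, 3))
outcomes = np.vstack([nominal, degraded, failed])

# Cluster scenarios into candidate system-level behavior classes that a
# designer can then inspect (here k is chosen by hand for illustration).
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(outcomes)
print(np.bincount(km.labels_))
print(km.cluster_centers_.round(2))
```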

The final two papers offer novel contributions regarding factors that influence the design of complex systems, including designer cognition and measures of complexity. Designers face a number of cognitive challenges when developing and evaluating engineering concepts. Vimal Viswanathan and Julie Linsey, in their paper, “Spanning the Complexity Chasm: A Research Approach to Move From Simple to Complex Engineering Systems,” study the impact of these cognitive challenges in the conceptual design of complex engineering systems. They present a multistudy approach to investigating design thinking for complex systems, triangulating findings from controlled causal lab experiments with coded data from more complex products. Understanding the effects of cognitive challenges in a realistic, complex engineering system is difficult because of the wide variety of factors and uncertainties influencing the results. At the same time, studying the design of such systems in a controlled environment is challenging because of their scale and complexity and the time needed to design them. To address these challenges, the paper combines a controlled experiment with a simple system and a qualitative cognitive-artifacts analysis of more complex engineering systems, followed by triangulation of the results. This approach uniquely and effectively combines the advantages of quantitative and qualitative study methods, making them more powerful for studying complex engineering systems.

Finally, as systems become more complex, metrics that measure levels of complexity can be useful in the accompanying design processes. In their paper, “Measures of Product Design Adaptability for Changing Requirements,” Serdar Uckun, Ryan MacKey, Minh Do, Rong Zhou, Eric Huang, and Jami J. Shah present a set of metrics specifically intended to measure a system's ability to adapt to changing requirements, a critical characteristic of complex systems. Two alternative approaches to measuring adaptability to changing requirements are presented. One approach, based on the utility and cost of design changes, has a sound theoretical foundation but is potentially difficult to use in actual development processes. The other approach, based on product architecture characteristics that accommodate changes, is more heuristic in nature but potentially more practical to use. Such measures, if calibrated effectively, could serve as surrogates for real adaptability, allowing engineers to evaluate potential system architectures more rigorously and formally.
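As a purely hypothetical illustration of a utility-and-cost style measure (not the metrics defined in the paper), one could score each candidate architecture by the utility it retains after a requirement change relative to the cost of making that change; all names and numbers below are invented.

```python
# Purely hypothetical illustration of a utility-and-cost style adaptability
# measure: for each candidate architecture, compare the utility retained
# after a requirement change with the cost of making that change.
candidates = {
    "modular_bus":    {"utility_after_change": 0.85, "change_cost": 12.0},
    "integrated_bus": {"utility_after_change": 0.90, "change_cost": 45.0},
}

def adaptability(arch):
    # Higher retained utility per unit redesign cost -> more adaptable
    # (an assumed, illustrative form of such a measure).
    return arch["utility_after_change"] / arch["change_cost"]

for name, arch in candidates.items():
    print(name, round(adaptability(arch), 4))
```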

Irem Y. Tumer is a Professor and the Associate Dean for Research and Economic Development in the College of Engineering at Oregon State University. She was previously a Research Scientist and Group Lead in the Complex Systems Design and Engineering group in the Intelligent Systems Division at NASA Ames Research Center, where she was also involved in project and program management for various NASA programs, including Intelligent Systems, Engineering for Complex Systems, Aviation Safety, and Constellation. She received her PhD in mechanical engineering from the University of Texas at Austin in 1998. Dr. Tumer is an Associate Editor of ASME's Journal of Mechanical Design. Her expertise spans systems engineering, model-based design, risk-based design, system analysis and optimization, function-based design, integrated systems health management, and vibration monitoring, and has resulted in numerous journal and refereed conference publications. Her research focuses on the overall problem of designing highly complex and integrated systems with reduced risk of failure, developing formal methodologies and mathematical frameworks to help understand and enhance complex system design.

Kemper Lewis is a Professor in and Chair of the Department of Mechanical and Aerospace Engineering at the University at Buffalo and a Fellow of the American Society of Mechanical Engineers. He has also served as Executive Director of the NYS Center for Engineering Design and Industrial Innovation and as the University at Buffalo Site Director of the National Center for e-Design. Dr. Lewis' research interests are in the areas of complex system trade-offs, modeling and optimization of decision networks, and adaptive energy systems.