Browsing by Author "Conrad, Steven, committee member"
Now showing 1 - 5 of 5
Item Open Access
An enterprise system engineering analysis of KC-46A maintenance program decision-making (Colorado State University. Libraries, 2023)
Blond, Kyle E., author; Bradley, Thomas, advisor; Ender, Tommer, committee member; Conrad, Steven, committee member; Herber, Daniel, committee member; Ozbek, Mehmet, committee member
The KC-46A Pegasus is a United States Air Force (USAF) tanker, transport, and medical evacuation commercial derivative aircraft based on the Boeing 767. It is a top acquisition priority for modernizing the USAF's refueling capabilities and is governed by a lifecycle sustainment strategy directed by USAF commercial-variant policies aligned to Federal Aviation Administration (FAA) policy. While this strategy provides robust mechanisms for managing the KC-46A's performance during its operations and support phase, the KC-46A sustainment enterprise has an opportunity to better achieve reliability, availability, maintainability, and cost (RAM-C) objectives by enhancing KC-46A maintenance program decision-making in the context of USAF and FAA policies. This research characterizes the KC-46A maintenance program as an industrial enterprise system governing the maintenance, repair, overhaul, and modification of KC-46A aircraft. On this basis, enterprise systems engineering (ESE) is applied to characterize the program and identify decision-making improvement opportunities in its management. Canonical ESE viewpoints are tailored to abstract the organizations, processes, and information composing KC-46A maintenance program decision-making and to model how decision support methods can better achieve KC-46A sustainment enterprise objectives. A decision-making framework then evaluates the RAM-C performance of KC-46A maintenance tasks as part of the KC-46A Continued Analysis and Surveillance System (CASS) program.
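A task-level decision framework of this kind can be sketched as a small rule-based evaluation. Everything below — the metric names, thresholds, and prescribed responses — is hypothetical, since the abstract does not publish the actual heuristics:

```python
# Hypothetical sketch of a rule-based maintenance-task evaluation,
# loosely modeled on a compliance / effectiveness / optimality
# classification. Thresholds and response labels are invented
# for illustration only.

def evaluate_task(task):
    """Classify one maintenance task and prescribe a CASS-style response."""
    if not task["compliant"]:                 # task missed its required interval
        return "escalate: restore compliance"
    if task["failure_rate"] > 0.05:           # task is not preventing failures
        return "revise: task ineffective"
    if task["cost_per_flight_hour"] > 100.0:  # task works but is costly
        return "optimize: adjust task interval"
    return "retain: task compliant, effective, and near-optimal"

example = {"compliant": True, "failure_rate": 0.01, "cost_per_flight_hour": 40.0}
print(evaluate_task(example))
```

Encoding the heuristics as explicit rules is what lets an expert system of this style serve as a transparent, auditable knowledge engine.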
The framework's heuristics classify the compliance, effectiveness, and optimality of a maintenance task to prescribe KC-46A CASS responses. A rule-based expert system applies this framework and serves as the knowledge engine for the KC-46A CASS decision support system, the Pegasus Fleet Management Tool (PFMT). A focus group of KC-46A sustainment experts evaluated the framework and reached consensus that it advances the state of the art in KC-46A maintenance program decision-making. A business case analysis maps out the programmatic and technical activities required to implement the framework in PFMT and improve KC-46A sustainment.

Item Open Access
Characterizing and improving the adoption rate of model-based systems engineering through an application of the Diffusion of Innovations theory (Colorado State University. Libraries, 2024)
Call, Daniel R., author; Herber, Daniel R., advisor; Aloise-Young, Patricia, committee member; Conrad, Steven, committee member; Shahroudi, Kamran Eftekhari, committee member
As the environment and operational context of new systems continue to evolve and become increasingly complex, the practice of systems engineering (SE) must adapt accordingly. A great deal of research and development has gone, and continues to go, into formulating and maturing a model-based approach to SE that addresses many of the shortcomings of a conventional, document-based SE approach. In spite of the work done to advance the practice of model-based systems engineering (MBSE), it has not yet been adopted to the level that its demonstrated benefits would suggest. While research continues into even more effective MBSE approaches, there is a need to ascertain why extant MBSE innovations are not being adopted more widely and, if possible, to determine ways to accelerate their adoption.
This outcome is particularly important because MBSE is a key enabler of an agile systems engineering (ASE) approach that satisfies the desire of many stakeholders to apply agile principles to SE processes. The diffusion of innovations (DoI) theory provides a useful framework for understanding the factors that affect the adoption rate of innovations in many fields. The theory has not only been effective at explaining why innovations are adopted but has also explained why objectively superior innovations are not adopted, and it is likely to provide insight into the factors depressing the adoption rate of MBSE. Despite prior efforts in the SE community to promote MBSE, the DoI theory has not been directly and deliberately applied to understand what is preventing widespread MBSE adoption; some elements of the theory appear in the literature on MBSE adoption challenges without any apparent awareness of the theory and its implications. The expectation is that harnessing the insights offered by this theory will lead to MBSE presentation and implementation strategies that increase its use, allowing its benefits to be more widely realized in the SE community and improving the practice of SE generally to address modern, complex environments. The DoI theory has shown that the most significant driver of adoption-rate variability is the perceived attributes of the innovation in question, and a survey is a useful tool for discovering the perceptions of potential adopters. The primary contribution of this research is the development of a survey to capture and assess a participant's perceptions of specified attributes of MBSE, their current use of MBSE, and some limited demographic information. This survey was widely distributed to gather data on current perceptions of MBSE in the SE community.
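A common way to compare categorical responses between subpopulations of such a survey (e.g., current MBSE users vs. non-users) is a chi-squared test of independence. A minimal pure-Python sketch with hypothetical counts, not the study's data:

```python
# Chi-squared test of independence for a 2x2 contingency table.
# The counts are hypothetical survey responses, purely for illustration.

def chi_squared_2x2(table):
    """Chi-squared statistic for a 2x2 table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = MBSE users / non-users,
# columns = "tools accessible" yes / no.
stat = chi_squared_2x2([[40, 10], [15, 35]])
print(round(stat, 2))   # compare to the critical value 3.84 (df=1, alpha=0.05)
print(stat > 3.84)      # True -> reject independence at the 5% level
```

In practice one would use `scipy.stats.chi2_contingency`, which also returns the p-value directly; the manual version just shows where the statistic comes from.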
Survey results highlighted that respondents recognize the relative advantage of MBSE in improving data quality and traceability, but perceived complexity and compatibility with existing practices still present barriers to adoption. Subpopulation analysis reveals that those who are not already involved in MBSE efforts face the additional adoption obstacles of limited trial opportunities and tool access (a chi-squared test of independence between these populations yielded p = 0.00). The survey underscores the potential for closer alignment between MBSE and existing SE methodologies to improve the perceived compatibility of MBSE, and targeted actions are proposed to address these barriers to adoption: improving the availability and use of reusable model elements to expedite system model development; better tailoring MBSE approaches to organizational needs; an increased emphasis on ASE; refining MBSE approaches to reduce the perceived mental effort required; lowering the barrier to entry by improving access to the resources (tools, time, and training) needed to experiment with MBSE; and increased efforts to identify and execute relevant MBSE pilot projects. The lessons and principles of the DoI theory should be applied to take advantage of the opportunity afforded by the release of SysML v2 to reframe perceptions of MBSE. Future studies would benefit from examining additional variables identified by the DoI theory, incorporating control questions to differentiate between perceptions of SE generally and MBSE specifically, identifying better methods to assess participants' current MBSE use, and broadening the participant scope.

Item Open Access
Integrating geometric deep learning with a set-based design approach for the exploration of graph-based engineering systems (Colorado State University.
Libraries, 2024)
Sirico, Anthony, Jr., author; Herber, Daniel R., advisor; Chen, Haonan, committee member; Simske, Steven, committee member; Conrad, Steven, committee member
Many complex engineering systems can be represented in topological form, such as graphs. This dissertation introduces Graph-Set-Based Design (GSBD), a framework that integrates graph-based techniques with Geometric Deep Learning (GDL) within a Set-Based Design (SBD) approach to address graph-centric design problems. We also introduce Iterative Classification (IC), a method for narrowing a large dataset down to a subset of more promising, feasible solutions. Combining the two yields IC-GSBD, a methodological framework whose primary goal is to seek out the best-performing solutions effectively and efficiently, at lower computational cost. IC-GSBD iteratively narrows a graph-based dataset of diverse design solutions to identify the most useful options, which is particularly valuable when the dataset would be computationally expensive to process with conventional methods. A small subset of the dataset is analyzed to train a machine-learning model, which is then used to predict over the remaining dataset iteratively, progressively refining the top solutions with each iteration. We present two case studies demonstrating the method. In the first, the goal is the analysis of analog electrical circuits, aiming to match a specific frequency response within a particular range. Previous studies generated 43,249 unique undirected graphs representing valid potential circuits through enumeration techniques; however, determining the sizing and performance of these circuits proved computationally expensive.
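The iterative-classification loop — evaluate a small sample expensively, use a cheap surrogate to discard unpromising designs, repeat — can be sketched in miniature. The "designs", scores, and surrogate below are stand-ins; the actual work uses geometric deep learning on circuit graphs:

```python
# Toy sketch of iterative classification: each round, run the expensive
# evaluation on a small sample, derive a cutoff, and discard candidates
# the cheap surrogate predicts to fall below it.

import random

random.seed(0)

def expensive_eval(design):
    """Stand-in for the costly circuit sizing/performance analysis."""
    return design["feature"] * 2.0

def surrogate_score(design):
    """Stand-in for the trained GDL classifier's cheap prediction."""
    return design["feature"] * 2.0

designs = [{"feature": random.random()} for _ in range(1000)]

candidates = designs
for _ in range(3):
    sample = random.sample(candidates, max(10, len(candidates) // 10))
    scores = sorted(expensive_eval(d) for d in sample)
    cutoff = scores[len(scores) // 2]          # keep roughly the top half
    candidates = [d for d in candidates if surrogate_score(d) >= cutoff]

print(len(candidates), "of", len(designs), "designs survive the filtering")
```

The design choice is that the expensive evaluator is only ever called on small samples, while the surrogate screens the full set — which is the whole point when each true evaluation is costly.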
By using a fraction of the circuit graphs and their performance as input data for a classification-focused GDL model, we can predict the performance of the remaining graphs with favorable accuracy. The results show that incorporating additional graph-based features enhances model performance: using only 10% of the graphs, the model achieves a classification accuracy of 80%, and using 25% of the graphs it further subdivides the graphs into targeted groups whose medians are significantly closer to the best designs and that contain, on average, 88.2 of the top 100 best-performing graphs.

Item Open Access
Model-based systems engineering application to data management for integrated sustainable human settlement modeling (Colorado State University. Libraries, 2024)
Adjahossou, Anicet, author; Grigg, Neil, advisor; Bradley, Thomas, committee member; Conrad, Steven, committee member; Willson, Bryan, committee member; Fremstad, Anders, committee member
The challenges associated with the transition from current approaches to temporary humanitarian settlement toward integrated, sustainable human settlements are largely due to a significant increase in the number of forcibly displaced people over the last few decades, the difficulty of sustainably providing the needed services to the required standards, and the prolongation of emergencies. According to the United Nations High Commissioner for Refugees (UNHCR)'s Global Appeal 2023, more than 117.2 million people were forcibly displaced or stateless in 2023, a little over 1% of the world's population. The average lifespan of a humanitarian settlement is between 17 and 26 years (UNHCR), and factors such as urban growth and adverse environmental changes have exacerbated the scale of the difficulties. Despite this problematic context, short-term considerations continue to guide the planning and management of humanitarian settlements, to the detriment of more integrated, longer-term perspectives.
These factors call for a paradigm shift to ensure greater sustainability from the planning phases onward. Recent studies often attribute the unsustainability of humanitarian settlements to poor design and inadequate provision of basic resources and services, including water, energy, housing, and employment and economic opportunities. They also highlight bottlenecks that hinder stakeholders' access to meaningful and timely data and information for planning and remediation. More often than not, humanitarian operations rely on ad hoc methods, employing parallel, fragmented, and disconnected data-processing frameworks; a wide range of data is collected without subsequent analysis or prioritization to exploit the interconnections that could improve sustainability and performance. Furthermore, little effort has been made to investigate the trade-offs involved. As a result, major shortcomings have emerged, leading to disruption, budget overruns, and disorder, against a backdrop of steadily declining funding for humanitarian aid. Some attempts have been made to move towards more sustainable design approaches, but these have mainly focused on vague, sector-specific themes, ignoring systemic and integrative principles. This research contributes to filling these gaps by developing more practical and effective solutions based on an integrated, systemic vision of a human settlement, defined and conceptualized as a complex system. To that end, it proposes a model-driven methodology, supported by Model-Based Systems Engineering (MBSE) and the Systems Modeling Language (SysML), to develop an integrated human settlement system model, which has been functionally and operationally executed using a Systems Engineering (SE) approach. This novel system model enables all essential sub-systems to operate within a single system and focuses on efficient data processing.
The ultimate aim is to provide a global solution to the interconnection and integration challenges encountered in processing operational data and information, ensuring an effective transition to sustainable human settlements. To address the interconnectedness between the sub-systems' sectors, this research proposes a Triple Nexus Framework (TNF) that integrates water, energy, and housing sector data within the single system using systems engineering methods. Systems engineering, grounded in an understanding of the synergies between water, energy, and housing, characterizes the TNF and identifies opportunities to improve the decision-making steps and processes that integrate and enhance the quality of data processing. To test and validate the performance of the system model, two scenarios are executed to illustrate how an integrated data platform enables easy access to meaningful data as a starting point for modeling an integrated, sustainable human settlement in humanitarian contexts. To assess framework performance, the model is simulated using a megadata nexus, as specified by the system requirements; the optimization simulation yields 67% satisfactory results, a figure further confirmed by a set of surveyed practitioners. These results show that an integrated system can improve the sustainability of human settlements beyond a sufficiently acceptable threshold, and that capacity building in service delivery is beneficial and necessary. A focus on comprehensive data processing through systems integration can be a powerful tool for overcoming gaps and challenges in humanitarian operations. Structured interviews with question analysis are conducted to validate the proposed model and framework; the results demonstrate a consensus that the novel system model advances the state of the art in the design and management of human settlements.
An operational roadmap of the substantial programmatic and technical activities required to implement the triple nexus framework is recommended for adoption and scaling up. Finally, to assess the sustainability, adaptability, and applicability of the system, the proposed system model is further validated in a context-based case study through a capacity assessment of an existing humanitarian settlement. The sustainability analysis uses cross-impact matrix multiplication applied to classification (MICMAC) methodologies; results show that the settlement's development is unstable and therefore unsustainable, since there is no apparent difference between influential and dependent data. This research tackles an important global challenge, providing valuable insights towards sustainable solutions for displaced populations, in line with the United Nations 2030 Agenda for Sustainable Development.

Item Open Access
Novel assessments of country pandemic vulnerability based on non-pandemic predictors, pandemic predictors, and country primary and secondary vaccination inflection points (Colorado State University. Libraries, 2024)
Vlajnic, Marco M., author; Simske, Steven, advisor; Cale, James, committee member; Conrad, Steven, committee member; Reisfeld, Bradley, committee member
The devastating worldwide impact of the COVID-19 pandemic created a need to better understand the predictors of pandemic vulnerability and the effects of vaccination on case fatality rates in a pandemic setting at the country level. The non-pandemic predictors were assessed relative to COVID-19 case fatality rates in 26 countries and grouped into two novel public health indices. The predictors were analyzed and ranked using machine learning methodologies: Random Forest Regressor (RFR) and Extreme Gradient Boosting (XGBoost) models, both with distribution lags; a novel K-means-Coefficient of Variance (K-means-COV) sensitivity analysis approach; and Ordinary Least Squares Multifactor Regression.
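The core of a K-means-plus-coefficient-of-variation sensitivity analysis can be sketched simply: cluster a predictor's country-level values, then use the within-cluster coefficient of variation (std/mean) as a dispersion signal. The data and the single-feature, two-cluster setup below are hypothetical simplifications, not the study's method in full:

```python
# Tiny 1-D k-means plus per-cluster coefficient of variation.
# Predictor values are invented for illustration.

from statistics import mean, pstdev

def kmeans_1d(values, k=2, iters=20):
    """Minimal 1-D k-means; returns a list of clusters (lists of values)."""
    centroids = [min(values), max(values)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

def coefficient_of_variation(values):
    """Relative dispersion: population std divided by the mean."""
    return pstdev(values) / mean(values)

# Hypothetical per-country values of one predictor.
predictor = [1.0, 1.2, 0.9, 1.1, 5.0, 5.5, 4.8, 5.2]
for cluster in kmeans_1d(predictor):
    print(sorted(cluster), round(coefficient_of_variation(cluster), 3))
```

A low coefficient of variation within a cluster suggests the predictor behaves consistently for that group of countries, which is the kind of signal a sensitivity ranking can exploit.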
Foundational time series forecasting models (ARIMA, Prophet, LSTM) and novel hybrid models (SARIMA-Bidirectional LSTM and SARIMA-Prophet-Bidirectional LSTM) were compared to determine the best-performing, most accurate model for forecasting vaccination inflection points. The XGBoost methodology demonstrated higher sensitivity and accuracy than RFR across all performance metrics, showing that cardiovascular death rate was the dominant predictive feature for 46% of countries (Population Health Index) and hospital beds per thousand people for 46% of countries (Country Health Index). The novel K-means-COV sensitivity analysis approach performed with high accuracy and was successfully validated across all three methods, demonstrating that female smokers was the most common predictive feature across the different analysis sets; the new model was also validated with the Calinski-Harabasz methodology. Every machine learning technique evaluated showed strong predictive value and high accuracy. The primary vaccination inflection point was reached at 83.27 days, at a vaccination rate of 13.1%; the secondary inflection point was reached at 339.31 days, at a cumulative vaccination rate of 67.8%. All assessed machine and deep learning methodologies performed with high accuracy relative to COVID-19 historical data, demonstrated strong forecasting value, and were validated by anomaly and volatility detection analyses. The novel triple hybrid model performed best, with the highest accuracy across all performance metrics. To be better prepared for future pandemics, countries should utilize sophisticated machine and deep learning methodologies and prioritize the health of the elderly, the frail, and patients with comorbidities.
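The notion of a vaccination inflection point can be illustrated on a toy cumulative-uptake curve: the inflection is the day on which daily growth peaks. The logistic parameters below are hypothetical, not the study's fitted values:

```python
# Locate the inflection point of a hypothetical cumulative vaccination
# curve as the day with the largest day-over-day gain.

import math

def logistic(day, cap=0.68, rate=0.08, midpoint=85.0):
    """Cumulative vaccination rate on a given day (hypothetical curve)."""
    return cap / (1.0 + math.exp(-rate * (day - midpoint)))

days = range(365)
curve = [logistic(d) for d in days]
daily_gain = [curve[i + 1] - curve[i] for i in range(len(curve) - 1)]
inflection_day = max(range(len(daily_gain)), key=daily_gain.__getitem__)

print(inflection_day)                   # near the logistic midpoint
print(round(curve[inflection_day], 3))  # roughly half of the saturation cap
```

For a logistic curve the inflection falls at the midpoint, where the cumulative rate is half its cap — which mirrors the study's pattern of an inflection occurring well before saturation.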