Theses and Dissertations

Permanent URI for this collection: https://hdl.handle.net/10217/199889


Recent Submissions

Now showing 1 - 20 of 116
  • Item (Open Access)
    Engineering and scaling cement-based carbon storage systems
    (Colorado State University. Libraries, 2024) Winters, Dahl, author; Simske, Steven, advisor; Bradley, Thomas, committee member; Arabi, Mazdak, committee member; Troxell, Wade, committee member; Goemans, Christopher, committee member
    This work is a contribution to the body of knowledge surrounding cement-based carbon storage systems, their engineering, and their scaling to meet the requirements of global sustainability in a relevant timeframe. Concrete is the most produced material by weight per year, surpassing water and all biomass we use per year, thus requiring by virtue of its total mass the largest share of total energy produced. Today, it is a source of net greenhouse gas emissions and environmental damage because of our appropriation of natural resources for its use in construction. However, it could serve as our largest land-based engineered sink for such emissions. Such potential is the focus of this work, addressed not only by experiments to improve the engineering of cement-based carbon storage systems, but also by suggested practices to achieve scale for such systems to have a beneficial impact on our economy and environment. The ubiquity of concrete means that cement-based carbon storage can also be ubiquitous, offering continued opportunities for carbon removal and sequestration within built materials. To engineer and scale the world's largest product into its largest engineered carbon sink, this research focuses on the use of biochar and calcium carbonate within structural and non-structural concrete uses, such as tetrapods: structures offering the benefits of reduced sand mining, protections against sea level rise, and enabling cement industry decarbonization. The results demonstrated that 4 wt% biochar with 1.5 wt% CaCO3 can replace cement for carbon storage while maintaining sufficient compressive strength. Along with the use of 30 wt% biochar as aggregate, 100,000 10-tonne tetrapods could sequester 1 million tonnes of CO2. Over a year of global emissions, 40 Gt CO2, could be stored in such stacked tetrapods within a land area smaller than Kuwait, 17,400 km2. Thus, this work contributes to the engineering of systems with industrial significance capable of countering the effects of global warming at meaningful scales.
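The sequestration figures above can be sanity-checked with back-of-envelope arithmetic. The sketch below assumes a biochar carbon fraction of about 0.85 and the stoichiometric CO2-to-C mass ratio of 44/12; these assumptions and the resulting per-tetrapod figure are illustrative and are not taken from the dissertation's experiments.

```python
# Back-of-envelope check of the tetrapod sequestration figures
# (illustrative assumptions only; not the dissertation's engineering model).

TETRAPOD_MASS_T = 10.0        # tonnes per tetrapod (from the abstract)
BIOCHAR_WT_FRAC = 0.30        # 30 wt% biochar used as aggregate (from the abstract)
CARBON_FRAC = 0.85            # assumed carbon content of biochar (assumption)
C_TO_CO2 = 44.0 / 12.0        # mass ratio of CO2 to C

co2_per_tetrapod = TETRAPOD_MASS_T * BIOCHAR_WT_FRAC * CARBON_FRAC * C_TO_CO2
print(f"CO2 stored per 10-t tetrapod: ~{co2_per_tetrapod:.1f} t")          # ~9.4 t

# 100,000 tetrapods -> on the order of 1 Mt CO2, consistent with the abstract.
print(f"100,000 tetrapods: ~{100_000 * co2_per_tetrapod / 1e6:.2f} Mt CO2")

# How many tetrapods would hold one year of global emissions (40 Gt CO2)?
n_tetrapods = 40e9 / co2_per_tetrapod
area_km2 = 17_400                       # area smaller than Kuwait (from the abstract)
footprint_m2 = area_km2 * 1e6 / n_tetrapods
print(f"Tetrapods needed for 40 Gt CO2: ~{n_tetrapods:.2e}")
print(f"Average plan area per tetrapod if stacked in {area_km2} km2: ~{footprint_m2:.1f} m2")
```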
  • Item (Open Access)
    Enhancing flight testing leveraging software testing techniques implemented in model-based systems engineering
    (Colorado State University. Libraries, 2024) Alvarado, Jose L., Jr., author; Bradley, Thomas H., advisor; Herber, Daniel, committee member; Simske, Steven, committee member; Windom, Bret, committee member
    The Department of Defense (DoD) is significantly shifting toward digital engineering across all systems engineering lifecycle phases. A vital aspect of this transformation is the adoption of model-based testing methodologies within the Test and Evaluation (T&E) processes. This dissertation investigates a grey box Model-Driven Test Design (MDTD) approach that leverages model-based systems engineering (MBSE) artifacts to create flight test scenarios and plans and compares this novel approach to the traditional document-centric methods. The study utilizes the Systems Modeling Language (SysML) to represent artifacts, enabling a comparative analysis between traditional and MDTD processes. Through a case study involving a training system used by the Air Force Operational Test and Evaluation Center (AFOTEC), the dissertation evaluates the MDTD process's effectiveness in generating validated test scenarios and plans that align with established methods. Two additional case studies demonstrate the reuse of SysML elements across different systems under test (SUT), highlighting the benefits, costs, and practical applications of this approach in operational flight testing. The findings include metrics such as Model Reuse Percentage (MR%), Reuse Value Added (RVA), and System Usability Survey (SUS) scores, which measure the reusability of model artifacts and the usability and effectiveness of the "AFOTEC Methodology" model approach in generating flight test plans. This research underscores the importance of model-based testing in operational flight testing and supports the DoD T&E community's ongoing move toward a fully integrated digital engineering ecosystem.
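Of the reuse metrics named above, Model Reuse Percentage is the most straightforward to illustrate. The sketch below uses one plausible formulation (reused SysML elements over total elements); the dissertation's exact definitions of MR% and RVA may differ, and the element counts are hypothetical.

```python
def model_reuse_percentage(reused_elements: int, total_elements: int) -> float:
    """MR%: share of SysML elements in a new test model that were reused from a prior model.
    (Illustrative definition; the dissertation's precise formula may differ.)"""
    return 100.0 * reused_elements / total_elements

# Hypothetical example: 340 of 500 elements in a new SUT test model were reused.
print(f"MR% = {model_reuse_percentage(340, 500):.1f}%")  # 68.0%
```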
  • Item (Open Access)
    Geographically-resolved life cycle assessment and techno-economic analysis of engineered climate solutions with an innovative framework for decision support
    (Colorado State University. Libraries, 2024) Greene, Jonah Michael, author; Quinn, Jason C., advisor; Reardon, Kenneth, committee member; Coburn, Tim, committee member; Baker, Daniel, committee member
    The urgent challenge of addressing climate change requires a thorough evaluation of engineered solutions to ensure they are both economically viable and environmentally sustainable. This dissertation performs a comprehensive assessment of two key climate technologies: microalgae biorefineries for biofuel production and anaerobic digestion (AD) systems for reducing greenhouse gas (GHG) emissions on dairy farms. Using high-resolution life cycle assessment (LCA) and techno-economic analysis (TEA), it provides detailed insights into the sustainability performance of these technologies. In addition, this work goes further by introducing a decision-support framework that improves the interpretation of LCA and TEA results, enhancing decision-makers' ability to form sustainable policies and implement actionable outcomes that drive the transition to green energy solutions. The first segment of this dissertation integrates high-resolution thermal and biological modeling with LCA and TEA to evaluate and compare two different microalgae biorefinery configurations targeting renewable diesel (RD) and sustainable aviation fuel (SAF) production in the United States. A dynamic engineering process model captures mass and energy balances for biomass growth, storage, dewatering, and conversion with hourly resolution. These configurations support facilities in remote areas and cultivation on marginal lands, enabling large-scale biofuel production. The two pathways under examination share identical biomass production and harvesting assumptions but differ in their conversion processes. The first pathway evaluates hydrothermal liquefaction (HTL) to produce RD, while the second explores the Hydroprocessed Esters and Fatty Acids (HEFA) process to produce SAF. Results indicate that the Minimum Fuel Selling Price (MFSP) for RD could decrease from $3.70-$7.30 to $1.50-$4.10 per liter of gasoline equivalent, and for SAF from $9.90-$19.60 to $2.20-$7.30 per liter under future scenarios with increased lipid content and reduced CO2 delivery costs. Optimization analyses reveal pathways to achieve an MFSP of $0.75 per liter and 70% GHG emissions reductions compared to petroleum fuels for both pathways. Additional analysis covers the water footprint, land-use change emissions, and other environmental impacts, with a focus on strategic research and development investments to reduce production costs and environmental burdens from microalgae biofuels. Beyond renewable transportation fuels, achieving a sustainable energy future will require innovations in the circular economy, such as waste-to-energy systems that reduce GHG emissions while simultaneously producing renewable energy. Accordingly, the second segment of this dissertation examines the GHG reduction potential of adopting AD technology on large-scale dairy farms across the contiguous United States. Regional and national GHG reduction estimates were developed through a robust life cycle modeling framework paired with sensitivity and uncertainty analyses. Twenty dairy configurations were modeled to capture key differences in housing and manure management practices, applicable AD technologies, regional climates, storage cleanout schedules, and land application methods. 
Monte Carlo uncertainty bounds suggest that AD adoption could reduce GHG emissions from the large-scale dairy industry by 2.45-3.52 million metric tons (MMT) of CO2-equivalent (CO2-eq) per year when biogas is used solely in renewable natural gas programs, and as much as 4.53-6.46 MMT of CO2-eq per year when combined heat and power is implemented as an additional biogas use case. At the farm level, AD technology may reduce GHG emissions from manure management systems by 58.1-79.8%, depending on the region. The study highlights the regional variations in GHG emissions from manure management strategies, alongside the challenges and opportunities surrounding broader AD adoption. It is vital to confirm that engineered climate solutions offer real improvements and to identify key enhancements needed to replace existing technologies. This process hinges on effective policy and decision-making. To address these challenges, the final segment of this dissertation introduces the Environmental Comparison and Optimization Stakeholder Tool for Evaluating and Prioritizing Solutions (ECO-STEPS). ECO-STEPS offers a decision-support framework that utilizes outputs from LCA and TEA to help decision-makers evaluate and prioritize engineered climate solutions based on economic viability, environmental impacts, and resource use. The tool's framework combines stakeholder rankings for key sustainability criteria with diverse statistical weighting methods, offering decision support aligned with long-term sustainability goals across various technology sectors. Applied to a biofuels case study, ECO-STEPS compares algae-based RD, soybean biodiesel (BD), corn ethanol, and petroleum diesel, using an expert survey to determine criteria rankings. Results indicate that soybean BD is a strong near-term solution for the biofuels sector, given its economic viability and relatively low environmental impacts. In contrast, corn ethanol, while economically competitive, demonstrates poor environmental performance across multiple sustainability themes. Algae-based RD emerges as a promising long-term option as ongoing research and development reduce costs. The results of this case study illustrate that ECO-STEPS provides a flexible and comprehensive framework for stakeholders to navigate complex decision-making processes in the pursuit of sustainable climate solutions. In conclusion, the integration of high-resolution LCA, TEA, and a stakeholder-driven decision-support framework in this dissertation presents a comprehensive approach to evaluating engineered climate solutions. The results from these studies provide geographically resolved insights into the sustainability performance of key climate technologies, offering actionable pathways for optimizing biofuel production, reducing GHG emissions, and supporting sustainable decision-making to advance the transition to a green economy.
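A minimal sketch of how Monte Carlo uncertainty bounds of this kind are typically produced: sample uncertain per-farm parameters many times and report percentiles of the aggregated result. The farm count, baseline emissions distribution, and reduction range below are invented placeholders, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
N_RUNS = 2_000
N_FARMS = 1_500          # hypothetical number of large-scale dairies

# Hypothetical per-farm baseline manure-management emissions (t CO2-eq/yr).
baseline = rng.lognormal(mean=np.log(2_000), sigma=0.4, size=(N_RUNS, N_FARMS))
# Hypothetical fractional reduction from AD adoption (cf. the 58.1-79.8% farm-level range).
reduction_frac = rng.uniform(0.58, 0.80, size=(N_RUNS, N_FARMS))

national_reduction_mmt = (baseline * reduction_frac).sum(axis=1) / 1e6
lo, hi = np.percentile(national_reduction_mmt, [2.5, 97.5])
print(f"95% interval: {lo:.2f}-{hi:.2f} MMT CO2-eq per year")
```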
  • Item (Open Access)
    Qualitative comparative analysis of software development practices translated from scene to screen using the real-to-real method for inter-industry learning
    (Colorado State University. Libraries, 2024) Hawkey, Barry, author; Vans, Marie, advisor; Simske, Steve, committee member; Gallegos, Erika, committee member; Rodgers, Tim, committee member
    Many projectized industries, in fields as diverse as healthcare, live theater, and construction, have developed sets of specific project management practices that are consistently associated with success. These practices – assignable activities, tasks, processes, and methods – have been acquired through decades of lessons painfully learned by project teams. Well-known, existing processes allow project teams to capture and disseminate these best practices and lessons learned between projects and across organizations, allowing new teams to benefit from previous efforts. Although overall progress may at times seem fitful, these knowledge-sharing processes have allowed each industry to improve their project management methodologies over time. Unfortunately, the specificity required to make a practice actionable, assignable, and beneficial within the domain of one industry also renders it difficult to apply in another. There is no formal method, or method in widespread use, for the translation of specific project management practices across the boundaries of industry and knowledge domains. As a result, most of the benefits of these learnings - each industry's collective knowledge of best practices – are restricted to their original domain, providing little guidance to project teams in other industries. This research examines several previous attempts to apply project management practices across multiple domains and synthesizes a novel method for such inter-industry learning. The Real-to-Real method presented here begins by identifying potential barriers to project success within a target industry. Next, an industry that has developed different approaches to similar challenges is chosen as a source of inspiration. After holistically examining project management practices within that source industry, a set of evident principles is synthesized through an iterative process of inductive reasoning which explain that industry's approach to project management and these shared challenges. Using these principles as a transformative intermediary, a set of specific practices suitable for the domain of the target industry can then be identified or developed, mirroring or paralleling practices used in the source industry. These practices may lead to improved project outcomes when used in target industry projects that have characteristics similar to those found in the source industry. This method may allow for the translation and practical application of hard-won project management expertise across many projectized industries, potentially improving project outcomes in multiple fields. To provide an illustrative example of the Real-to-Real method in use, the software development industry is selected as an example target, and barriers to project success in that domain are examined. A review of the existing literature finds that the lack of simple, heuristic guidance on tailoring existing practices to better support hedonic requirements, which specify the intended emotional response of the user, may be a significant source of risk within the target industry, although the effect of hedonic requirements on project outcomes has not yet been empirically determined. With this potential source of risk in mind, the film industry is selected as a source of inspiration, as projects there share many similarities with software development projects and must routinely consider hedonic requirements. 
A holistic evaluation of film production project management practices suggests four evident, explanatory principles guiding that industry's approach to managing projects. This research then identifies and proposes a set of specific practices, suitable for software development projects, which also support or adhere to these same principles, thus mirroring practices used in film production projects. To support these findings, the identified software development practices are situated within existing theory, and potential mechanisms by which they may consistently lead to improved project outcomes when used in projects with high levels of hedonic requirements are discussed. A series of semi-structured interviews with experienced practitioners in the film industry is then conducted to verify an accurate understanding of film production project management practices, the synthesized explanatory principles, and the pairing of each principle to a set of related practices. Next, a second series of interviews with experienced practitioners in the software development industry is used to verify the selection of software development practices supporting these principles. To empirically validate these findings, and to determine the effect of hedonic requirements on project outcomes, a practitioner survey is then conducted, measuring project success, use of the identified practices, and the level of hedonic requirements in 307 software development project cases in five culturally similar countries. First, the perceived criticality of hedonic requirements is compared to five measures of project success, to determine the impact of such requirements on project outcomes. Then, using Qualitative Comparative Analysis, causal recipes of the identified practices that consistently resulted in project success, across these same measures, are identified for projects with varying levels of hedonic requirements. These results validate the benefits of the identified principles and practices to projects with high levels of hedonic requirements, and provide simple, heuristic guidance to software development project teams on how to quickly and effectively tailor their management practices to better support individual projects based on the criticality of such requirements. This guidance may serve to significantly improve outcomes in software development projects with high levels of hedonic requirements. These results also help to validate the Real-to-Real method of translating management practices across industry and knowledge domains, potentially enabling additional opportunities for valuable inter-industry learning.
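The causal recipes above are evaluated with standard fuzzy-set QCA measures. As a point of reference, here is a minimal sketch of the two usual measures, consistency and coverage, with made-up membership scores; the dissertation's actual calibration and QCA software output are not reproduced here.

```python
import numpy as np

def consistency(x, y):
    """Consistency of 'recipe X is sufficient for outcome Y': sum(min(x, y)) / sum(x)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.minimum(x, y).sum() / x.sum()

def coverage(x, y):
    """Coverage of the outcome by the recipe: sum(min(x, y)) / sum(y)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.minimum(x, y).sum() / y.sum()

# Hypothetical fuzzy memberships for six projects in a candidate recipe (x)
# and in the outcome 'project success' (y).
x = [0.9, 0.8, 0.7, 0.2, 0.6, 0.1]
y = [1.0, 0.7, 0.8, 0.4, 0.6, 0.3]
print(f"consistency = {consistency(x, y):.2f}, coverage = {coverage(x, y):.2f}")
```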
  • Item (Open Access)
    Merging systems engineering methodologies with the Agile Scrum framework for Department of Defense software projects
    (Colorado State University. Libraries, 2024) Rosson, Dallas, author; Bradley, Thomas, advisor; Batchelor, Ann, advisor; Coleman, David, committee member; Eftekhari Shahroudi, Kamran, committee member; Wise, Dan, committee member
    Only large-scale Department of Defense (DoD) software projects executed under the direction of the DoD Instruction 5000.2, Operation of the Adaptive Acquisition Framework, are required to follow rigorous systems engineering methods. Many software projects lack the benefits of established systems engineering methodologies and good engineering rigor and fail to meet customer needs and expectations. Software developers trained in the use of the various Agile frameworks are frequently strongly opposed to any development methodology that could be viewed as infringing on the principles of the Agile Manifesto. Agile projects, by their nature, embrace the concept of change, but uncontrolled change leads to project failures whereas controlled change can lead to sustained and innovative forward progress. To improve the results of these vital software projects, DoD software projects require a methodology to implement systems engineering rigor while still employing Agile software practices. The Agile Scrum framework alone is not rigorous enough to fully document customer needs, as User Stories are written to track only who, what, and why at a non-atomic level and are commonly never looked at again after development needs are met. Systems engineering methods alone are not flexible enough to accommodate the inherent need for changing capability in software projects, which require flexibility in schedule and requirements. A new methodology, the Systems Engineering Focused Agile Development method, takes a rigor-flexibility-rigor approach to development and combines the strengths of the Agile Scrum framework with the best practices of systems engineering methodologies, resulting in a common language that better allows cross-functional teams to communicate project needs while also allowing software developers to maintain flexibility in the execution of software projects. This research has determined that the thoughtful blending of Agile systems engineering and modern systems engineering methods has the potential to provide DoD software projects with benefits to cost, schedule, and performance.
  • Item (Open Access)
    A new automotive system architecture for minimizing rear-end collisions
    (Colorado State University. Libraries, 2024) Rictor, Andrew, author; Chandrasekaran, Venkatachalam, advisor; Cheney, Margaret, committee member; Herber, Daniel, committee member; Simske, Steven, committee member
    Advanced Driver Assistance Systems, more frequently referred to as ADAS, are intelligent systems integrated into newer automotive vehicles to improve safety and minimize accidents. These systems utilize radar, sonar, lidar and camera sensors mounted around the vehicle to maintain situational awareness of the vehicle and the surrounding environment. The majority of ADAS that focus on collision avoidance modify the host vehicle's operation. Some existing ADAS will stop the vehicle, sound an audible alert, initiate internal warning lights or dash warning messages, and prevent lane change operations. The ADAS proposed and detailed here focuses on enabling the host vehicle to communicate with the inbound vehicle's driver via the brake lights so that the driver has the opportunity to modify the inbound vehicle's operation before a collision occurs. This is called the Aft Collision Assist (ACA). This work presents the Model Based System Engineering (MBSE) diagrams, SIMULINK models and simulation of the ACA, data derivation utilized in the simulations, validation with empirical data, and future work for optimizing the ACA's algorithms.
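As a rough illustration of the kind of trigger logic an ACA-like system could use to decide when to communicate with the inbound driver via the brake lights, the sketch below computes a constant-speed time-to-collision and compares it to a threshold. The 3-second threshold and the scenario values are assumptions for illustration, not parameters from the dissertation's SIMULINK models.

```python
def time_to_collision(gap_m: float, host_speed_mps: float, inbound_speed_mps: float) -> float:
    """Constant-speed time-to-collision between the host and the inbound vehicle behind it.
    Returns infinity if the inbound vehicle is not closing the gap."""
    closing_speed = inbound_speed_mps - host_speed_mps
    return float("inf") if closing_speed <= 0 else gap_m / closing_speed

def should_alert(gap_m, host_speed_mps, inbound_speed_mps, ttc_threshold_s=3.0) -> bool:
    """Flash the brake lights when projected TTC drops below a threshold (assumed 3 s)."""
    return time_to_collision(gap_m, host_speed_mps, inbound_speed_mps) < ttc_threshold_s

# Hypothetical scenario: host at 10 m/s, inbound vehicle at 20 m/s, 25 m behind.
print(should_alert(gap_m=25.0, host_speed_mps=10.0, inbound_speed_mps=20.0))  # True (TTC = 2.5 s)
```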
  • Item (Open Access)
    Relational and technological process concept utilizing a human-in-the-loop-centered methodology for USSOCOM
    (Colorado State University. Libraries, 2024) Corl, Kenneth Casselbury, author; Gallegos, Erika, advisor; Bradley, Thomas, committee member; Simske, Steve, committee member; Mumford, Troy, committee member; Crocker, Jerry, committee member
    The Department of Defense (DoD) employs broad human factors requirements across various applications, resulting in a universal application of the same standards to a multitude of DoD acquisition systems. In unconventional warfare, specifically within missions conducted by US Special Operations Command (USSOCOM), operators face intensified workloads and domain-specific challenges that current human factors considerations do not adequately address. This dissertation aims to introduce and validate the Relational and Technological Capstone (RTC), which expands upon existing human factors requirements through both architectural and behavioral diagrams in a well-defined set of methodology-driven process steps. In referencing the system lifecycles as defined by the Defense Acquisition University (DAU) and the International Council on Systems Engineering (INCOSE), the objective is to diversify and enhance the consideration of Human Systems Integration (HSI) requirements in USSOCOM platforms by addressing the unique challenges posed by intensified workloads and domain-specific ontologies. The RTC employs a methodology-driven approach utilizing architectural, behavioral, and parametric diagrams. It integrates with Model Based Systems Engineering (MBSE) and the Systems Modeling Language (SysML) to improve the design of human-system interactions, incorporating a Special Operations Task List and Performance Shaping Factors (PSFs) into aggregated performance values. The results of this dissertation demonstrate the efficacy of RTC within MBSE, showcasing its value through improved design processes and as a foundation for new programs. The RTC can integrate existing models to further benefit customer needs through initiatives like Engineering Change Proposals (ECPs), as well as provide starter models for new programs and projects. The containment tree format aids in developing USSOCOM MBSE and opens possibilities for automation tools as well as an easily transferable modeling package for future use on all complex systems. Continual use of RTC contributes to the maturity of MBSE models and diagrams, fostering the evolution of a federation-of-models and Program of Record standards. This not only benefits subsequent SOCOM programs and projects but also facilitates the emerging field and methodology of mission engineering to realize and forecast capability gaps before a system reaches the implementation and integration phase. The ultimate goal is to center the RTC around the operator, ensuring man-machine compatibility and optimization throughout special operation acquisitions.
  • Item (Open Access)
    Evaluating micromobility adoption, perception, and implementation
    (Colorado State University. Libraries, 2024) Pourfalatoun, Shiva, author; Gallegos, Erika, advisor; Daily, Jeremy, committee member; Simske, Steve, committee member; Bradley, Thomas, committee member; Jin, Ziyu, committee member
    Micromobility, a term that encompasses compact and efficient transportation modes such as bicycles and scooters, has rapidly emerged as an important element of urban mobility. These small, often electrically powered vehicles offer a versatile solution to urban congestion and provide an eco-friendly alternative to traditional transportation modes. In particular, shared bicycles and e-scooters have become popular due to their convenience and accessibility, offering significant benefits but also presenting new challenges in urban planning and traffic management. This transition in urban transport paradigms raises several pertinent questions about user behaviors, preferences, and the interplay of various socio-psychological factors. This dissertation aims to explore three key aspects of micromobility. The first research question investigates the differences between shared e-scooter users and non-users, along with the factors influencing their decisions regarding e-scooter usage. The second question examines the shift in micromobility preferences and perceptions before, during, and after the COVID-19 pandemic, focusing on how these changes correlate with different quarantine behaviors. The third and final question delves into the interactions between drivers, bicyclists, and pedestrians, analyzing how drivers' risk-taking propensity and emotional intelligence influence these interactions. Each of these questions is approached through specific methodological frameworks, employing a mix of statistical analyses and behavioral observations to provide insights into the evolving dynamics of urban mobility. The findings from this research provide a systematic approach to integrating micromobility by understanding, at the individual level, the factors that affect decision-making on usage, as well as interaction effects with other road users that impact safety.
  • Item (Open Access)
    Security shortcomings of embedded network protocols in commercial vehicles
    (Colorado State University. Libraries, 2024) Chatterjee, Rik, author; Daily, Jeremy, advisor; Ray, Indrakshi, committee member; Ray, Indrajit, committee member
    Modern commercial vehicles depend on embedded systems that communicate via standardized protocols, forming the foundation of their internal networks. The Controller Area Network (CAN) protocol is commonly employed for communication, with protocols such as SAE J1939 and Unified Diagnostic Services (UDS) playing critical roles in medium and heavy-duty vehicles. This thesis investigates multiple attack vectors that exploit vulnerabilities in both the SAE J1939 and UDS protocols, potentially compromising electronic control units (ECUs) in commercial vehicle networks. The study presents five case scenarios related to the SAE J1939 standard, including two that validate previously proposed attack hypotheses using extensive testing setups. Additionally, three new attack vectors are explored through bench tests and in-vehicle trials. Simultaneously, the research highlights three vulnerabilities within the UDS protocol, specifically addressing weaknesses in the ISO 14229 and ISO 15765 specifications. Testing was conducted on real-world systems, including bench setups with ECUs connected to a CAN bus and in-vehicle evaluations using a 2014 Kenworth T270 and a 2018 Freightliner Cascadia Truck Front Cab configured as a test bench. The results demonstrate how these protocol-based attacks can target and compromise specific ECUs, revealing significant security gaps in current vehicular communication systems. Engineers and developers working with SAE J1939 and UDS stacks must consider these vulnerabilities to enhance the resilience of communication subsystems in future designs.
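For readers unfamiliar with the protocol structure discussed above, the sketch below unpacks a 29-bit extended CAN identifier into its SAE J1939 fields (priority, PGN, destination, and source address). The field layout follows the published J1939 specification; the example identifier is a routine broadcast message, and no attack logic is shown.

```python
def parse_j1939_id(can_id: int) -> dict:
    """Split a 29-bit extended CAN ID into SAE J1939 fields."""
    priority = (can_id >> 26) & 0x7
    edp      = (can_id >> 25) & 0x1          # extended data page
    dp       = (can_id >> 24) & 0x1          # data page
    pf       = (can_id >> 16) & 0xFF         # PDU format
    ps       = (can_id >> 8)  & 0xFF         # PDU specific (dest. address or group ext.)
    sa       = can_id & 0xFF                 # source address
    # PDU1 (PF < 240): PS is a destination address and is not part of the PGN.
    pgn = (edp << 17) | (dp << 16) | (pf << 8) | (ps if pf >= 240 else 0)
    return {"priority": priority, "pgn": pgn, "dest": ps if pf < 240 else None, "sa": sa}

# Example: 0x18FEF100 carries the broadcast Cruise Control/Vehicle Speed message (CCVS1).
print(parse_j1939_id(0x18FEF100))   # pgn 65265 (0xFEF1), priority 6, sa 0x00
```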
  • Item (Open Access)
    Cybersecurity vulnerabilities in electronic logging devices and development of a software defined truck testbed
    (Colorado State University. Libraries, 2024) Jepson, Jacob, author; Daily, Jeremy, advisor; Simske, Steve, committee member; Ray, Indrajit, committee member
    This thesis addresses critical cybersecurity vulnerabilities in Electronic Logging Devices (ELDs), mandated equipment for modern commercial trucks, and introduces an innovative solution for comprehensive system testing. Through extensive reverse engineering and practical testing, significant security flaws in commonly used ELDs are uncovered. These vulnerabilities enable unauthorized control over vehicle systems through arbitrary CAN message injection, allow upload of malicious firmware, and most alarmingly, present the potential for a self-propagating truck-to-truck worm. To demonstrate these vulnerabilities, bench-level testing and real-world experiments were conducted using a 2014 Kenworth T270 Class 6 research truck equipped with a vulnerable ELD. The findings reveal how these security weaknesses could lead to widespread disruptions in commercial fleets, with severe safety and operational implications. Addressing the fundamental challenge of disparate design and testing of after-market systems in trucks, this research introduces CANLay, a key networking component of the Software Defined Truck (SDT) concept. CANLay enables the virtualization of in-vehicle networks, facilitating the transportation of Controller Area Network (CAN) data and sensor signals over long-distance networks. This innovation allows for holistic security assessments and efficient testing of integrated vehicle systems, accounting for emergent behaviors that arise from system integration. The efficacy of CANLay in heavy vehicle network performance testing is demonstrated, showcasing its potential to streamline system integration and verification efforts in a versatile digital engineering environment. This work contributes to the field by illuminating current vulnerabilities in mandated trucking technology, demonstrating potential attack vectors, and providing a framework for more comprehensive and efficient testing of integrated vehicle systems. This research underscores the urgent need to improve the security posture of ELD systems and offers recommendations for enhancing their security. The findings and proposed solutions have significant implications for improving cybersecurity in the trucking industry and, by extension, safeguarding critical supply chains.
  • Item (Open Access)
    Integrating geometric deep learning with a set-based design approach for the exploration of graph-based engineering systems
    (Colorado State University. Libraries, 2024) Sirico, Anthony, Jr., author; Herber, Daniel R., advisor; Chen, Haonan, committee member; Simske, Steven, committee member; Conrad, Steven, committee member
    Many complex engineering systems can be represented in a topological form, such as graphs. This dissertation introduces a framework of Graph-Set-Based Design (GSBD) that integrates graph-based techniques with Geometric Deep Learning (GDL) within a Set-Based Design (SBD) approach to address graph-centric design problems. We also introduce Iterative Classification (IC), a method for narrowing down large datasets to a subset of more promising and feasible solutions. When we combine the two, we have IC-GSBD, a methodological framework where the primary goal is to effectively and efficiently seek the best-performing solutions with lower computational costs. IC-GSBD is a method that employs an iterative approach to efficiently narrow down a graph-based dataset containing diverse design solutions to identify the most useful options. This approach is particularly valuable as the dataset would be computationally expensive to process using other conventional methods. The implementation involves analyzing a small subset of the dataset to train a machine-learning model. This model is then utilized to predict the remaining dataset iteratively, progressively refining the top solutions with each iteration. In this work, we present two case studies demonstrating this method. In the first case study utilizing IC-GSBD, the goal is the analysis of analog electrical circuits, aiming to match a specific frequency response within a particular range. Previous studies generated 43,249 unique undirected graphs representing valid potential circuits through enumeration techniques. However, determining the sizing and performance of these circuits proved computationally expensive. By using a fraction of the circuit graphs and their performance as input data for a classification-focused GDL model, we can predict the performance of the remaining graphs with favorable accuracy. The results show that incorporating additional graph-based features enhances model performance, achieving a classification accuracy of 80% using only 10% of the graphs and further subdividing the graphs into targeted groups with medians significantly closer to the best and containing 88.2 of the top 100 best-performing graphs on average using 25% of the graphs.
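A schematic sketch of the iterative-classification loop described above, with a generic scikit-learn classifier standing in for the GDL model. The feature matrix, the labeling function, and the subset fractions are placeholders; the dissertation's actual pipeline operates on graph representations with a geometric deep learning model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def iterative_classification(X, label_fn, label_frac=0.10, keep_frac=0.50, n_iters=3, seed=0):
    """Iteratively narrow a large set of candidate designs to the most promising ones.

    X        : (n, d) array of placeholder features for the graph-based designs
    label_fn : callable mapping an index array to 0/1 'promising' labels (the
               expensive evaluation, e.g., circuit sizing and performance)
    """
    rng = np.random.default_rng(seed)
    candidates = np.arange(len(X))
    for _ in range(n_iters):
        # 1) Evaluate (label) only a small random subset of the surviving candidates.
        n_label = max(2, int(label_frac * len(candidates)))
        labeled = rng.choice(candidates, size=n_label, replace=False)
        clf = RandomForestClassifier(random_state=seed).fit(X[labeled], label_fn(labeled))
        # 2) Predict 'promise' for all surviving candidates and keep the top fraction.
        scores = clf.predict_proba(X[candidates])[:, -1]
        keep = np.argsort(scores)[::-1][: max(1, int(keep_frac * len(candidates)))]
        candidates = candidates[keep]
    return candidates  # indices of the most promising designs

# Usage (hypothetical): X holds graph-derived features and label_fn wraps the
# expensive circuit evaluation; only a fraction of designs is ever evaluated.
```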
  • Item (Open Access)
    Model-based systems engineering application to data management for integrated sustainable human settlement modeling
    (Colorado State University. Libraries, 2024) Adjahossou, Anicet, author; Grigg, Neil, advisor; Bradley, Thomas, committee member; Conrad, Steven, committee member; Willson, Bryan, committee member; Fremstad, Anders, committee member
    The challenges associated with the transition from current approaches to temporary humanitarian settlement to integrated, sustainable human settlements are largely due to a significant increase in the number of forcibly displaced people over the last few decades, the difficulties of sustainably providing the needed services to the required standards, and the prolongation of emergencies. According to the United Nations High Commissioner for Refugees' (UNHCR) Global Appeal 2023, more than 117.2 million people were forcibly displaced or stateless in 2023, representing a little over 1% of the world's population. The average lifespan of a humanitarian settlement is between 17 and 26 years (UNHCR), and factors such as urban growth and adverse environmental changes have exacerbated the scale of the difficulties. Despite these problematic contexts, short-term considerations continue to guide the planning and management of humanitarian settlements, to the detriment of more integrated, longer-term perspectives. These factors call for a paradigm shift in approach to ensure greater sustainability right from the planning phases. Recent studies often attribute the unsustainability of humanitarian settlements to poor design and inadequate provision of basic resources and services, including water, energy, housing, employment and economic opportunities, among others. They also highlight apparent bottlenecks that hinder access to meaningful and timely data and information that stakeholders need for planning and remediation. More often than not, humanitarian operations rely on ad hoc methods, employing parallel, fragmented and disconnected data processing frameworks, resulting in the collection of a wide range of data without subsequent analysis or prioritization to optimize potential interconnections that can improve sustainability and performance. Furthermore, little effort has been made to investigate the trade-offs involved. As a result, major shortcomings emerged along the way, leading to disruption, budget overruns, disorder and more, against a backdrop of steadily declining funding for humanitarian aid. Some attempts have been made to move towards more sustainable design approaches, but these have mainly focused on vague, sector-specific themes, ignoring systemic and integrative principles. This research is a contribution to filling these gaps by developing more practical and effective solutions, based on an integrated systemic vision of a human settlement, defined and conceptualized as a complex system. As part of this process, this research proposes a model-driven methodology, supported by Model-Based Systems Engineering (MBSE) and a Systems Modeling Language (SysML), to develop an integrated human settlement system model, which has been functionally and operationally executed using a Systems Engineering (SE) approach. This novel system model enables all essential sub-systems to operate within the single system, and focuses on efficient data processing. The ultimate aim is to provide a global solution to the interconnection and integration challenges encountered in the processing of operational data and information, to ensure an effective transition to sustainable human settlements. With regard to the interconnectedness between the different sectors of the sub-systems, this research proposes a Triple Nexus Framework (TNF) in an attempt to integrate water, energy and housing sector data derived from one sub-system within the single system by applying systems engineering methods. 
Systems engineering, based on an understanding of the synergies between water, energy and housing, characterizes the triple nexus framework and identifies opportunities to improve decision-making steps and processes that integrate and enhance the quality of data processing. To test and validate the performance of the system model, two scenarios are executed to illustrate how an integrated data platform enables easy access to meaningful data as a starting point for modeling an integrated system of sustainable human settlement in humanitarian contexts. With regard to framework performance, the model is simulated using a megadata nexus, as specified by the system requirement. The optimization simulation yields 67% satisfactory results, which are further confirmed by a set of surveyed practitioners. These results show that an integrated system can improve the sustainability of human settlements beyond a sufficiently acceptable threshold, and that capacity building in service delivery is beneficial and necessary. The focus on comprehensive data processing through systems integration can be a powerful tool for overcoming gaps and challenges in humanitarian operations. Structured interviews with question analysis are conducted to validate the proposed model and framework. The results reveal a consensus that the novel system model advances the state of the art in the current approach to the design and management of human settlements. An operational roadmap with substantial programmatic and technical activities required to implement the triple nexus framework is recommended for adoption and scaling-up. Finally, to assess the sustainability, adaptability and applicability of the system, the proposed system model is further validated using a context-based case study, through a capacity assessment of an existing humanitarian settlement. The sustainability analysis uses cross-impact matrix multiplication applied to classification (MICMAC) methodologies, and results show that the development of the settlement is unstable and therefore unsustainable, since there is no apparent difference between influential and dependent data. This research tackles an important global challenge, providing valuable insights towards sustainable solutions for displaced populations, aligning with the United Nations 2030 Agenda for Sustainable Development.
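The MICMAC step mentioned above can be summarized compactly: indirect influence and dependence scores are obtained by raising the direct cross-impact matrix to a power and summing rows and columns. The 4x4 matrix below is a made-up example for four settlement variables, not the study's data.

```python
import numpy as np

def micmac(direct: np.ndarray, power: int = 4):
    """Indirect influence/dependence via matrix powers of the direct cross-impact matrix."""
    indirect = np.linalg.matrix_power(direct, power)
    influence  = indirect.sum(axis=1)   # row sums: how strongly a variable drives others
    dependence = indirect.sum(axis=0)   # column sums: how strongly it is driven
    return influence, dependence

# Hypothetical direct cross-impact matrix for four settlement variables
# (e.g., water, energy, housing, services), scored 0-3.
direct = np.array([[0, 3, 2, 1],
                   [2, 0, 1, 3],
                   [1, 2, 0, 2],
                   [1, 1, 2, 0]])
influence, dependence = micmac(direct)
print("influence :", influence)
print("dependence:", dependence)
# Variables with both high influence and high dependence are 'unstable' in MICMAC terms.
```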
  • Item (Open Access)
    Characterizing and improving the adoption rate of model-based systems engineering through an application of the Diffusion of Innovations theory
    (Colorado State University. Libraries, 2024) Call, Daniel R., author; Herber, Daniel R., advisor; Aloise-Young, Patricia, committee member; Conrad, Steven, committee member; Shahroudi, Kamran Eftekhari, committee member
    As the environment and operational context of new systems continue to evolve and become increasingly complex, the practice of systems engineering (SE) must adapt accordingly. A great deal of research and development has gone and continues to go into formulating and maturing a model-based approach to SE that addresses many of the shortcomings of a conventional, document-based SE approach. In spite of the work that has been done to advance the practice of model-based systems engineering (MBSE), it has not yet been adopted to a level that would be expected based on its demonstrated benefits. While research continues into even more effective MBSE approaches, there is a need to ascertain why extant MBSE innovations are not being adopted more widely, and if possible, determine a way to accelerate their adoption. This outcome is particularly important as MBSE is a key enabler of an agile systems engineering (ASE) approach that satisfies the desire of many stakeholders to apply agile principles to SE processes. The diffusion of innovations (DoI) theory provides a useful framework for understanding the factors that affect the adoption rate of innovations in many fields. This theory has not only been effective at explaining why innovations are adopted but has also been able to explain why objectively superior innovations are not adopted. The DoI theory is likely to provide insight into the factors that are depressing the adoption rate of MBSE. Despite prior efforts in the SE community to promote MBSE, the DoI theory has not been directly and deliberately applied to understand what is preventing widespread MBSE adoption. Some elements of the theory appear in the literature addressing MBSE adoption challenges without any apparent awareness of the theory and its implications. The expectation is that harnessing the insights offered by this theory will lead to MBSE presentation and implementation strategies that will increase its use. This would allow its benefits to be more widely realized in the SE community and improve the practice of SE generally to address modern, complex environments. The DoI theory has shown that the most significant driver of adoption rate variability is the perceived attributes of the innovation in question. A survey is a useful tool to discover the perceptions of potential adopters of an innovation. The primary contribution of this research is the development of a survey to capture and assess a participant's perceptions of specified attributes of MBSE, their current use of MBSE, and some limited demographic information. This survey was widely distributed to gather data on current perceptions of MBSE in the SE community. Survey results highlighted that respondents recognize the relative advantage of MBSE in improving data quality and traceability, but perceived complexity and compatibility with existing practices still present barriers to adoption. Subpopulation analysis reveals that those who are not already involved in MBSE efforts face the additional adoption obstacles of limited trial opportunities and tool access (chi-squared test of independence between these populations resulted in p = 0.00). The survey underscores the potential for closer alignment between MBSE and existing SE methodologies to improve the perceived compatibility of MBSE. Targeted actions are proposed to address these barriers to adoption. 
These targeted actions include improving the availability and use of reusable model elements to expedite system model development, better tailoring MBSE approaches to suit organizational needs, increasing the emphasis on ASE, refining MBSE approaches to reduce the perceived mental effort required, lowering the barrier to entry for MBSE by improving access to the resources (tools, time, and training) required to experiment with MBSE, and increasing efforts to identify and execute relevant MBSE pilot projects. The lessons and principles from the DoI theory should be applied to take advantage of the opportunity afforded by the release of SysML v2 to reframe perceptions of MBSE. Future studies would benefit from examining additional variables identified by the DoI theory, incorporating control questions to differentiate between perceptions of SE generally and MBSE specifically, identifying better methods to assess current MBSE use by participants, and adopting measures to broaden the participant scope.
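For concreteness, the sketch below runs the kind of chi-squared test of independence referenced in the subpopulation analysis, on a hypothetical cross-tabulation of MBSE involvement against reported tool access. The counts are invented; only the test procedure mirrors the survey analysis.

```python
from scipy.stats import chi2_contingency

# Rows: currently involved in MBSE? (yes/no); columns: has access to MBSE tools? (yes/no).
table = [[58, 12],
         [21, 34]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```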
  • Item (Open Access)
    On the integration of materials characterization into the product development lifecycle
    (Colorado State University. Libraries, 2024) Dare, Matthew S., author; Simske, Steve, advisor; Yourdkhani, Mostafa, committee member; Herber, Daniel, committee member; Radford, Donald W., committee member
    The document is broken down into four sections whereby a more complete integration of materials characterization into the product development lifecycle, when compared to traditional approaches, is researched and considered. The driving purpose behind this research is to demonstrate that an application of systems engineering principles to the characterization sciences mechanism within materials engineering and development will produce a more efficient and comprehensive understanding of complex material systems. This will allow for the mitigation of risk, enhancement of relevant data, and planning of characterization procedures proactively. The first section proposes a methodology for Characterization Systems Engineering (CSE) as an aid in the development life cycle of complex, material systems by combining activities traditionally associated with materials characterization, quality engineering, and systems engineering into an effective hybrid approach. The proposed benefits of CSE include shortened product development phases, faster and more complete problem solving throughout the full system life cycle, and a more adequate mechanism for integrating and accommodating novel materials into already complex systems. CSE also provides a platform for the organization and prioritization of comprehensive testing and targeted test planning strategies. Opportunities to further develop and apply the methodology are discussed. The second section focuses on the need for and design of a characterizability system attribute to assist in the development of systems that involve material components. While materials characterization efforts are typically treated as an afterthought during project planning, the argument is made here that leveraging the data generated via complete characterization efforts can enhance manufacturability, seed research efforts and intellectual property for next-generation projects, and generate more realistic and representative models. A characterizability metric is evaluated against a test scenario, within the domain of electromagnetic interference shielding, to demonstrate the utility and distinction of this system attribute. Follow-on research steps to improve the depth of the attribute application are proposed. In the third section, a test and evaluation planning protocol is developed with the specific intention of increasing the effectiveness of materials characterization within the system development lifecycle. Materials characterization is frequently not accounted for in the test planning phases of system developments, and a more proactive approach to streamlined verification and validation activities can be applied. By applying test engineering methods to materials characterization, systems engineers can produce more complete datasets and more adequately execute testing cycles. A process workflow is introduced to manage the complexity inherent to material systems development and their associated characterization sciences objectives. An example using queuing theory is used to demonstrate the potential efficacy of the technique. Topics for further test and evaluation planning for materials engineering applications are discussed. In the fourth section, a workflow is proposed to more appropriately address the risk generated by materials characterization activities within the development of complex material systems when compared to conventional engineering approaches. 
Quality engineering, risk mitigation efforts, and emergency response protocols are discussed with the intention of reshaping post-development phase activities to address in-service material failures. While root cause investigations are a critical component to stewardship of the full system lifecycle during a product's development, deployment and operation, a more tailored and proactive response to system defects and failures is required to meet the increasingly stringent technical performance requirements associated with modern, material-intensive systems. The analysis includes a Bayesian approach to risk assessment of materials characterization efforts through which uncertainty regarding scheduling and cost can be quantified.
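The queuing-theory example mentioned in the third section can be illustrated with the simplest possible model, an M/M/1 queue treating a characterization lab as a single server. The arrival and service rates below are invented, and the dissertation's actual queuing formulation may differ.

```python
def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Steady-state M/M/1 queue metrics for a single characterization workstation."""
    if arrival_rate >= service_rate:
        raise ValueError("Queue is unstable: arrival rate must be below service rate.")
    rho = arrival_rate / service_rate            # utilization
    lq = rho**2 / (1 - rho)                      # mean number of samples waiting
    wq = lq / arrival_rate                       # mean wait before testing starts
    w = wq + 1 / service_rate                    # mean total time in the lab
    return {"utilization": rho, "avg_queue_len": lq, "avg_wait": wq, "avg_time_in_system": w}

# Hypothetical: 4 samples arrive per day, and the lab can characterize 5 samples per day.
print(mm1_metrics(arrival_rate=4.0, service_rate=5.0))
# utilization 0.8, ~3.2 samples waiting, ~0.8 days of queue wait, ~1.0 day total
```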
  • Item (Embargo)
    Investigating the association between public health system structure and system effectiveness
    (Colorado State University. Libraries, 2024) Orr, Jason, author; Golicic, Susan, advisor; Bradley, Thomas, committee member; Miller, Erika, committee member; Gutilla, Molly, committee member; Magzamen, Sheryl, committee member
    Public health systems in the United States face significant challenges due to their complexity and variability. This dissertation follows a three-paper format and examines these systems through a comprehensive analysis, using systems approaches, latent transition analysis (LTA), and ordinal regression to uncover patterns and inform improvements in public health governance and service delivery. The first essay (Chapter 2) explores the application of systems approaches to the design and improvement of public health systems. A scoping review was conducted, revealing a paucity of literature on the use of "hard" systems methodologies like systems analysis and engineering in public health. The findings highlight the potential for systems approaches to enhance the efficiency, effectiveness, and equity of public health services. However, the limited engagement by public health practitioners and the lack of depth in existing literature indicate significant gaps that need to be addressed to fully leverage systems science in public health governance and service delivery. Building on the literature review, the second essay (Chapter 3) introduces a novel typology of local health departments (LHDs) using LTA based on the National Association of County and City Health Officials (NACCHO) Profile study data. The LTA identified six distinct latent statuses of LHDs, characterized by variables such as governance centrality, colocation, and integration. This typology provides a robust framework for understanding the structural and operational diversity of LHDs, offering insights into how these factors influence public health outcomes. The final essay (Chapter 4) applies ordinal regression analyses to explore the relationship between the latent statuses of LHDs and various community health outcomes. Initial analyses using a cumulative logit model indicated a violation of the proportional odds assumption, necessitating a shift to a generalized logit model. This approach revealed significant predictors of latent statuses, such as poor physical health days, preventable hospital stays, and life expectancy. The findings underscore the complexity of public health systems and the need for careful selection of statistical models to accurately capture these dynamics. The study provides actionable insights for public health policy and strategic planning, highlighting areas for future research and potential interventions to optimize public health system design and operations. This dissertation underscores the importance of systems approaches in understanding and improving public health systems. By leveraging advanced statistical models and exploring the structural characteristics of LHDs, it contributes to a deeper understanding of the factors influencing public health governance and service delivery. The findings offer a foundation for future research and policy development aimed at enhancing the efficiency and effectiveness of public health systems to better serve communities.
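A minimal sketch of the modeling shift described above, fitting both a cumulative (proportional-odds) model and a generalized multinomial logit with statsmodels on synthetic data. The predictors, outcome levels, and data are placeholders, not the NACCHO Profile data or the dissertation's variables.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 300
X = rng.normal(size=(n, 2))                  # placeholder community health predictors
latent = X @ np.array([0.8, -0.5]) + rng.logistic(size=n)
status = np.digitize(latent, [-1.0, 0.5])    # three ordered latent statuses: 0, 1, 2

# Cumulative (proportional-odds) model: one slope per predictor across all thresholds.
ordinal_fit = OrderedModel(status, X, distr="logit").fit(method="bfgs", disp=False)
print(ordinal_fit.params.round(2))

# Generalized (multinomial) logit: separate slopes for each outcome level, the usual
# fallback when the proportional-odds assumption is violated.
mnlogit_fit = sm.MNLogit(status, sm.add_constant(X)).fit(disp=False)
print(mnlogit_fit.params.round(2))
```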
  • Item (Open Access)
    Framework for optimizing survivability in complex systems
    (Colorado State University. Libraries, 2024) Younes, Megan Elizabeth, author; Cale, James, advisor; Gallegos, Erika, committee member; Simske, Steve, committee member; Gaofeng, Jia, committee member
    Increasing high-probability, low-frequency events such as extreme weather incidents, in combination with aging infrastructure in the United States, put the survivability of the nation's critical infrastructure, such as hydroelectric dams, at risk. Maximizing resiliency in complex systems can be viewed as a multi-objective optimization that includes system performance, survivability, and economic and social factors. Systems requiring high survivability, such as a hydroelectric dam, typically require one or more redundant (standby) subsystems, which increases system cost. To optimize the tradeoffs between system survivability and cost, this research introduces an approach for obtaining the Pareto-optimal set of design candidates ("resilience frontier"). The method combines Monte Carlo (MC) sampling to estimate total survivability and a genetic algorithm (GA), referred to as the MCGA, to obtain the resilience frontier. The MCGA is applied to a hydroelectric dam to maximize overall system survivability. The MCGA is demonstrated through several numerical case studies. The results of the case studies indicate that the MCGA approach shows promise as a tool for evaluating survivability versus cost tradeoffs and also as a potential design tool for choosing system configuration and components to maximize overall system resiliency.
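A condensed sketch of the survivability-versus-cost tradeoff described above: Monte Carlo sampling estimates each candidate configuration's survivability, and a non-dominated filter extracts the resilience frontier. In the dissertation a genetic algorithm performs the search; here a small exhaustive sweep over redundancy levels stands in for it, and the failure probability and unit cost are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
P_FAIL = 0.10          # assumed per-unit failure probability during an extreme event
UNIT_COST = 1.0        # assumed cost per redundant unit (arbitrary units)
N_SAMPLES = 20_000

def mc_survivability(n_units: int) -> float:
    """Monte Carlo estimate: the subsystem survives if at least one unit survives."""
    failures = rng.random((N_SAMPLES, n_units)) < P_FAIL
    return 1.0 - failures.all(axis=1).mean()

# Candidate configurations: 1..5 parallel units (a GA would search a far larger design space).
candidates = [(n, n * UNIT_COST, mc_survivability(n)) for n in range(1, 6)]

def pareto_front(points):
    """Keep configurations not dominated in (lower cost, higher survivability)."""
    front = []
    for n, cost, surv in points:
        dominated = any(c <= cost and s >= surv and (c, s) != (cost, surv)
                        for _, c, s in points)
        if not dominated:
            front.append((n, cost, surv))
    return front

for n, cost, surv in pareto_front(candidates):
    print(f"{n} unit(s): cost={cost:.1f}, survivability~{surv:.4f}")
```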
  • Item (Open Access)
    The dual lens of sustainability: economic and environmental insights into novel carbon reduction technologies using systems modeling, data science, and multi-objective optimization
    (Colorado State University. Libraries, 2024) Limb, Braden Jeffery, author; Quinn, Jason C., advisor; Simske, Steven J., advisor; Gallegos, Erika E., committee member; Ross, Matthew R. V., committee member
    In an era marked by escalating climate change and increasing energy demands, the pursuit of sustainable solutions in energy production and environmental management is more critical than ever. This dissertation delves into this challenge, focusing on innovative technologies aimed at reducing carbon emissions in key sectors: power generation, wastewater treatment, and aviation. The first segment of the dissertation explores the integration of thermal energy storage with natural gas power plants using carbon capture, a crucial advancement given the dominant role of fossil fuel-based power plants in electricity generation. Addressing the economic and operational drawbacks of current carbon capture and storage (CCS) technologies, this study evaluates various thermal storage configurations. It seeks to enhance plant performance through energy arbitrage, a novel approach to offset the large heat loads required for carbon capture solvent regeneration. By optimizing these technologies for current and future grid pricing and comparing their feasibility with other production methods, this research aims to strike a balance between maintaining reliable power generation and adhering to stringent environmental targets. Results show that resistively charged thermal storage can both increase CCS flexibility and power plant profits through energy arbitrage when compared to power plants with CCS but without thermal storage. Beyond electrical systems, addressing climate change also necessitates improving the energy efficiency of water treatment technologies. Therefore, the dissertation investigates the potential of nature-based solutions as sustainable alternatives to traditional water treatment methods in the second section. This section probes into the efficacy of green technologies, such as constructed wetlands, in reducing costs and emissions compared to conventional gray infrastructure. By quantifying the impact of these technologies across the U.S. and evaluating the role of carbon financing, the research highlights a pathway towards more environmentally friendly and economically viable water treatment processes. Results show that nature-based water treatment technologies can treat up to 37% of future nutrient loading while both decreasing water treatment costs and emissions compared to traditional water treatment techniques. The transportation sector will play a key role in addressing climate change as it is the largest contributor to greenhouse gas emissions. While most of the transportation sector is expected to transition to electric vehicles to decrease its carbon footprint, aviation remains hard to decarbonize as electric passenger aviation is expected to be range limited. Therefore, the final segment of the dissertation addresses the challenge of meeting the U.S. Department of Energy's Sustainable Aviation Fuel (SAF) goals. It involves a comprehensive analysis of various bioenergy feedstocks for SAF production, using GIS modeling to assess their economic and environmental impacts across diverse land types. The study employs multi-objective optimization to strategize the deployment of these feedstocks, considering factors like minimum fuel selling price, greenhouse gas emissions, and breakeven carbon price. Furthermore, agent-based modeling is used to identify policy incentives that could encourage farmer adoption of bioenergy crops, a critical step towards meeting the SAF Grand Challenge goals. 
This dissertation offers a comprehensive analysis of novel carbon reduction technologies, emphasizing both economic viability and environmental sustainability. By developing integrated models across key sectors affected by climate change, it explores the benefits and trade-offs of various sustainability strategies. Incorporating geospatial and temporal dimensions, the research uses multi-objective optimization and systems thinking to provide targeted investment strategies for the greatest impact. The results provide important insights and actionable plans for policymakers and industry leaders, contributing to a sustainable and low-carbon future in essential areas of the global economy.
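Because the summary emphasizes multi-objective optimization across competing criteria, the following minimal sketch shows one common way such trade-offs are exposed: a weighted-sum sweep over two objectives (minimum fuel selling price versus life-cycle GHG emissions) for hypothetical SAF feedstock options. The feedstock names and numbers are invented for illustration and are not results from the dissertation's GIS-based analysis.

```python
# Illustrative weighted-sum sweep over two objectives; all values hypothetical.
feedstocks = {
    # name: (MFSP $/gal, GHG gCO2e/MJ)
    "switchgrass": (4.10, 22.0),
    "miscanthus":  (3.80, 25.0),
    "corn_stover": (3.50, 34.0),
    "carinata":    (4.60, 18.0),
}

def pareto_sweep(options, steps=11):
    """Sweep the weight on cost from 0 to 1 and record the winning option,
    after min-max normalizing both objectives (lower is better for each)."""
    costs = [c for c, _ in options.values()]
    ghgs = [g for _, g in options.values()]
    lo_c, hi_c, lo_g, hi_g = min(costs), max(costs), min(ghgs), max(ghgs)

    def norm(x, lo, hi):
        return (x - lo) / (hi - lo) if hi > lo else 0.0

    frontier = []
    for i in range(steps):
        w = i / (steps - 1)
        best = min(options, key=lambda k: w * norm(options[k][0], lo_c, hi_c)
                                          + (1 - w) * norm(options[k][1], lo_g, hi_g))
        frontier.append((round(w, 2), best))
    return frontier

for weight_on_cost, winner in pareto_sweep(feedstocks):
    print(f"cost weight {weight_on_cost:4.2f} -> {winner}")
```

Sweeping the weight makes the trade-off visible: low-cost, higher-emission options win when cost dominates, and low-emission options win as the GHG weight grows, which is the kind of insight a full Pareto analysis provides to policymakers.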
  • ItemOpen Access
    Quality control of front-end planning for electric power construction: a collaborative process-based approach using systems engineering
    (Colorado State University. Libraries, 2024) Nguyen, Frank Bao Thai, author; Grigg, Neil, advisor; Valdes-Vasquez, Rodolfo, advisor; Gallegos, Erika, committee member; Glick, Scott, committee member
    Controlling construction costs in the electric power industry will become more important as the nation responds to new energy demands driven by the transition from gasoline to electric vehicles and by emerging trends such as artificial intelligence and cryptocurrency. However, managing electric utility construction project costs requires that the risk of field change orders (FCOs) during construction be controlled. Utility companies face increasing risk from FCOs because conversion from overhead to underground systems, required by security and climate-change factors, involves subgrade work that is more challenging and less predictable than the more visible overhead work. Change orders cause cost overruns and schedule slippages and can occur for reasons such as changes in scope of work, unforeseen jobsite conditions, modifications of plans to meet existing field conditions, and correction of work required by field inspectors to meet safety standards. The best opportunity to control FCOs comes during front-end planning (FEP), when the conditions leading to them can be identified and mitigated. This study utilized systems engineering methodologies to address the risk of FCOs in three phases: (1) defining the root causes and severities of FCOs, (2) evaluating stakeholder responsibilities to find and mitigate those root causes, and (3) developing a process to identify and resolve the risk of FCOs.
The first phase used a descriptive statistical analysis of an electric utility company's project database to identify and analyze the magnitude, frequency, and causes of FCOs in overhead and underground electrical construction. The results showed that FCOs with added scope occurred more frequently in underground projects than in overhead projects. The analysis also indicated that most causes of FCOs could be managed during the FEP process, and it laid a foundation for the next phase: promoting collaboration among stakeholders to allocate responsibility for identifying and mitigating FCO risk.
In the second phase, the study used the Analytic Hierarchy Process (AHP) to distribute weights of stakeholder votes and create an integrated metric of front-end planning team confidence that a desired level of quality had been achieved. This phase showed how collaborative working relationships across front-end planning teams could be made more effective and combined into a quality control metric that captures the risk of FCOs.
In the third phase, the study used the results of the first two phases, together with swimlane diagrams and the logical relationships between tasks and stakeholders, to formulate a quality control roadmap model. This model is significant because it enhances the effectiveness of interdisciplinary teamwork along a critical path through the FEP process. The roadmap model lays out a streamlined decision-making process for each phase of front-end planning to minimize the risk of FCOs along a logical path prior to final design. While there have been efforts to improve the design process, this study is the first known to the researcher to address quality control of FEP using a roadmap process in electric power construction projects.
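For readers unfamiliar with the second-phase technique, the sketch below shows the standard AHP calculation: priority weights from the principal eigenvector of a reciprocal pairwise-comparison matrix, checked with Saaty's consistency ratio. It is a generic illustration, not the study's implementation; the four stakeholder roles and the comparison values are hypothetical.

```python
# Minimal AHP sketch (illustrative, not the study's implementation). Requires numpy.
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}  # Saaty's random index

def ahp_weights(pairwise):
    """Return priority weights and consistency ratio for a reciprocal
    pairwise-comparison matrix on Saaty's 1-9 scale."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                              # normalized priority weights
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                 # consistency index
    cr = ci / RI[n] if RI[n] > 0 else 0.0        # consistency ratio (< 0.10 is acceptable)
    return w, cr

# Hypothetical comparisons among four planning stakeholders
# (engineering, construction, permitting, operations).
M = [[1,   3,   5,   2],
     [1/3, 1,   3,   1/2],
     [1/5, 1/3, 1,   1/4],
     [1/2, 2,   4,   1]]
weights, cr = ahp_weights(M)
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```

Weights derived this way can then be combined with each stakeholder's confidence scores to form an integrated quality metric of the kind the study describes.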
The primary contribution is to enrich the body of knowledge about quality control of FEP by creating a roadmap model based on systems engineering and by enhancing the effectiveness of collaborative working relationships in a logical process that captures the risk of FCOs early in FEP. Beyond contributing a method to reduce the risk of FCOs, the study points to another important concern for the construction industry: jobsite safety. An FCO normally requires the contractor to request a time extension, but to reduce the impact on the project schedule, construction workers are often assigned overtime to complete the added work. Additional research on this issue is required, but it is apparent that the fatigue of long working hours can degrade task performance and the physical and psychological well-being of construction workers, reducing safety awareness and raising the risk of accidents on the construction site. Thus, reducing the risk of FCOs will lead to less overtime and is an effective way for the construction project team to reduce the risk of construction accidents.
  • ItemOpen Access
    Novel assessments of country pandemic vulnerability based on non-pandemic predictors, pandemic predictors, and country primary and secondary vaccination inflection points
    (Colorado State University. Libraries, 2024) Vlajnic, Marco M., author; Simske, Steven, advisor; Cale, James, committee member; Conrad, Steven, committee member; Reisfeld, Bradley, committee member
    The devastating worldwide impact of the COVID-19 pandemic created a need to better understand the predictors of pandemic vulnerability and the effects of vaccination on case fatality rates in a pandemic setting at a country level. The non-pandemic predictors were assessed relative to COVID-19 case fatality rates in 26 countries and grouped into two novel public health indices. The predictors were analyzed and ranked using machine learning methodologies: Random Forest Regressor (RFR) and Extreme Gradient Boosting (XGBoost) models, both with distributed lags; a novel K-means-Coefficient of Variance (K-means-COV) sensitivity analysis approach; and Ordinary Least Squares multifactor regression. Foundational time series forecasting models (ARIMA, Prophet, LSTM) and novel hybrid models (SARIMA-Bidirectional LSTM and SARIMA-Prophet-Bidirectional LSTM) were compared to determine the most accurate model for forecasting vaccination inflection points. XGBoost demonstrated higher sensitivity and accuracy than RFR across all performance metrics, identifying cardiovascular death rate as the dominant predictive feature for 46% of countries (Population Health Index) and hospital beds per thousand people for 46% of countries (Country Health Index). The novel K-means-COV sensitivity analysis approach performed with high accuracy and was successfully validated across all three methods, demonstrating that the female smokers variable was the most common predictive feature across the different analysis sets. The new model was also validated with the Calinski-Harabasz methodology. Every machine learning technique evaluated showed strong predictive value and high accuracy. At a vaccination rate of 13.1%, the primary vaccination inflection point was achieved at 83.27 days. The secondary vaccination inflection point was reached at 339.31 days, at a cumulative vaccination rate of 67.8%. All assessed machine and deep learning methodologies performed with high accuracy relative to COVID-19 historical data, demonstrated strong forecasting value, and were validated by anomaly and volatility detection analyses. The novel triple hybrid model performed best, with the highest accuracy across all performance metrics. To be better prepared for future pandemics, countries should utilize sophisticated machine and deep learning methodologies and prioritize the health of elderly and frail patients and those with comorbidities.
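The K-means-COV approach named above is novel to this dissertation and its exact formulation is not given here, so the sketch below is only one plausible reading: cluster countries on a few predictors, validate the clustering with the Calinski-Harabasz index, and use the within-cluster coefficient of variation to flag predictors that characterize each cluster. The data are synthetic and the feature names (cardiovasc_death_rate, hospital_beds_per_thousand, female_smokers) are assumed for illustration.

```python
# Hedged, illustrative sketch of a K-means / coefficient-of-variation analysis
# validated with the Calinski-Harabasz index. Requires numpy and scikit-learn.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import calinski_harabasz_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
features = ["cardiovasc_death_rate", "hospital_beds_per_thousand", "female_smokers"]
# 26 synthetic "countries" with positive predictor values
X = rng.uniform(low=[100, 1, 5], high=[400, 8, 40], size=(26, 3))

X_std = StandardScaler().fit_transform(X)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_std)
labels = kmeans.labels_

# Cluster validity: higher Calinski-Harabasz scores indicate better-separated clusters.
print("Calinski-Harabasz score:", round(calinski_harabasz_score(X_std, labels), 1))

# Coefficient of variation of each predictor within each cluster: low within-cluster
# variation suggests the predictor is characteristic of that cluster.
for c in np.unique(labels):
    members = X[labels == c]
    cov = members.std(axis=0) / members.mean(axis=0)
    print(f"cluster {c}:", dict(zip(features, np.round(cov, 2))))
```

The same pattern extends naturally to the full predictor set and to comparing the cluster-derived rankings against the RFR and XGBoost feature importances described in the abstract.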
  • ItemOpen Access
    Time-delta method for measuring software development contribution rates
    (Colorado State University. Libraries, 2024) Bishop, Vincil Chapman, III, author; Simske, Steven J., advisor; Vans, Marie, committee member; Malaiya, Yashwant, committee member; Ray, Indrajit, committee member
    The Time-Delta Method for estimating software development contribution rates provides insight into the efficiency and effectiveness of software developers. This dissertation proposes and evaluates a framework for assessing software development contribution and its rate (first derivative) based on Commit Time Delta (CTD) and software complexity metrics. The methodology analyzes historical data from software repositories, employing statistical techniques to infer developer productivity and work patterns. The approach combines existing metrics, such as cyclomatic complexity, with novel imputation techniques to estimate unobserved work durations, offering a practical tool for evaluating the engagement of software developers in a production setting. The findings suggest that this method can serve as a reliable estimator of development effort, with potential implications for optimizing software project management and resource allocation.
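As a rough illustration of the commit-time-delta idea, the sketch below computes gaps between consecutive commits for one developer and imputes long gaps (breaks, nights, weekends) with the median of the short, plausibly continuous gaps. The session-gap threshold and the median-imputation rule are assumptions made for this example, not necessarily the dissertation's procedure.

```python
# Hedged sketch of a commit-time-delta effort estimate; the 4-hour threshold
# and median imputation are illustrative assumptions.
from datetime import datetime
from statistics import median

def estimate_effort_hours(commit_times, max_session_gap_hours=4.0):
    """Estimate effort from consecutive commit timestamps for one developer.
    Gaps short enough to be continuous work count at face value; longer gaps
    are imputed with the median short gap."""
    times = sorted(commit_times)
    deltas = [(b - a).total_seconds() / 3600.0 for a, b in zip(times, times[1:])]
    short = [d for d in deltas if d <= max_session_gap_hours]
    typical = median(short) if short else 1.0   # fallback if every gap is long
    return sum(d if d <= max_session_gap_hours else typical for d in deltas)

# Hypothetical commit history for one developer
commits = [datetime(2024, 5, 1, 9, 0), datetime(2024, 5, 1, 10, 30),
           datetime(2024, 5, 1, 14, 0), datetime(2024, 5, 2, 9, 15),
           datetime(2024, 5, 2, 11, 0)]
print(f"estimated effort: {estimate_effort_hours(commits):.1f} hours")
```

Dividing a complexity-based contribution measure (for example, the change in cyclomatic complexity across the same commits) by an effort estimate like this would yield a contribution rate in the spirit of the framework described above.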