Browsing by Author "Miller, Erika, committee member"
Now showing 1 - 15 of 15
Item Open Access A combined classification and queuing system optimization approach for enhanced battery system maintainability(Colorado State University. Libraries, 2022) Pirani, Badruddin, author; Cale, James, advisor; Simske, Steven, committee member; Miller, Erika, committee member; Keller, Josh, committee member
Battery systems are used as critical power sources in a wide variety of advanced platforms (e.g., ships, submersibles, aircraft). These platforms undergo unique and extreme mission profiles that necessitate high reliability and maintainability. Battery system failures and non-optimal maintenance strategies have a significant impact on total fleet lifecycle costs and operational capability. Previous research has applied various approaches to improve battery system reliability and maintainability. Machine learning methodologies have applied data-driven and physics-based approaches to model battery decay and to predict battery state-of-health, estimate state-of-charge, and forecast future performance. Queuing theory has been used to optimize battery charging resources to ensure service and minimize cost. However, these approaches do not focus on pre-acceptance reliability improvements or platform operational requirements. This research introduces a two-faceted approach for enhancing the overall maintainability of platforms with battery systems as critical components. The first facet is the implementation of an advanced inspection and classification methodology for automating the acceptance/rejection decision for batteries prior to entering service. The purpose of this "pre-screening" step is to increase the reliability of batteries in service prior to deployment. The second facet of the proposed approach is the optimization of several critical maintenance plan design attributes for battery systems.
Together, the two facets of the approach seek to simultaneously enhance both aspects of maintainability (inherent reliability and cost-effectiveness) for battery systems, with the goal of decreasing total lifecycle cost and increasing operational availability.

Item Open Access Applying model-based systems engineering in search of quality by design(Colorado State University. Libraries, 2022) Miller, Andrew R., author; Herber, Daniel R., advisor; Bradley, Thomas, committee member; Miller, Erika, committee member; Simske, Steve, committee member; Yalin, Azer P., committee member
Model-Based Systems Engineering (MBSE) and Model-Based Engineering (MBE) techniques have been successfully introduced into the design process of many different types of systems. The application of these techniques can be reflected in the modeling of requirements, functions, behavior, and many other aspects. By modeling system aspects, the modeled design provides a digital representation of a system, the supporting development data architecture, and the functional requirements associated with that architecture. Various levels of system and data architecture fidelity can be represented within MBSE environment tools. Typically, the level of fidelity is driven by crucial systems engineering constraints such as cost, schedule, performance, and quality. Systems engineering uses many methods to develop system and data architectures that provide a representative system meeting cost and schedule targets with sufficient quality while maintaining customer performance needs. The most complex and elusive constraint on systems engineering is defining system requirements with a focus on quality, that is, the likelihood that a given set of system-level requirements will be correctly and accurately realized in the final system design.
This research investigates the Department of Defense Architecture Framework (DoDAF) in use today to establish and then assess the relationship between the system, data architecture, and requirements in terms of Quality by Design (QbD). The term QbD was coined in 1992 in Quality by Design: The New Steps for Planning Quality into Goods and Services [1]. This research investigates and proposes a means to: contextualize high-level quality terms within the MBSE functional area, provide an outline for a conceptual but functional quality framework as it pertains to the MBSE DoDAF, provide tailored quality metrics with improved definitions, and then test this improved quality framework by assessing two corresponding case studies within the MBSE functional area to interrogate model architectures and assess the quality of system design. Developed in the early 2000s, the DoDAF is still in use today, and its system description methodologies continue to impact subsequent system description approaches [2]. Two case studies were analyzed to demonstrate the proposed QbD evaluation of DoDAF CONOP architecture quality. The first case study addresses the analysis of the DoDAF CONOP of the National Aeronautics and Space Administration (NASA) Joint Polar Satellite System (JPSS) ground system for the National Oceanic and Atmospheric Administration (NOAA), with particular focus on the Stored Mission Data (SMD) mission thread. The second case study addresses the analysis of the DoDAF CONOP of the Search and Rescue (SAR) naval rescue operation network System of Systems (SoS), with particular focus on the Command and Control signaling mission thread.
The case studies help to demonstrate a new DoDAF Quality Conceptual Framework (DQCF) as a means to investigate the quality of DoDAF architectures in depth, including the application of the DoDAF standard, the UML/SysML standards, requirement architecture instantiation, and modularity, in order to understand architecture reusability and complexity. By providing a renewed focus on a quality-based systems engineering process when applying the DoDAF, improved trust in the system and data architecture of the completed models can be achieved. The results of the case study analyses reveal how a quality-focused systems engineering process can be used during development to provide a product design that better meets the customer's intent and ultimately offers the potential for the best-quality product.

Item Open Access Cost optimization in requirements management for space systems(Colorado State University. Libraries, 2021) Katz, Tami E., author; Simske, Steve, advisor; Sega, Ron, committee member; Miller, Erika, committee member; Macdonald, John, committee member
When producing complex space systems, the transformation of customer needs into a realized system includes the development of product requirements. The ability to generate and manage the requirements can either enable the overall system development or drive significant cost and schedule impacts. A review of industry practices and publications shows a substantial number of documented approaches for requirement development and product verification, but only a limited number for requirements management. A complex system can have tens of thousands of requirements across multiple levels of development which, if not well managed, can lead to hidden costs associated with missed requirements and product rework.
With current space system projects being developed at a rapid pace using more cost-constrained approaches such as fixed budgets, an investigation into more efficient processes, such as requirements management, can yield methods that enable successful, cost-effective system development. To address the optimal approach to managing requirements for complex space systems, this dissertation assesses current practices for requirements management, evaluates various factors contributing to the optimization of project costs associated with this activity, and proposes an optimized requirements management process to utilize during the development of space systems. Four key areas of process control are identified for requirements management optimization on a project: utilizing a data-focused requirements management approach, developing (and reviewing) requirements in a collaborative software application, ensuring the requirement set is consolidated to an appropriate number of requirements for the project, and evaluating when to officially levy requirements on the product developers based on the stability of requirement maturation. Multiple case studies are presented to evaluate whether the proposed requirements management process yields improvement over traditional approaches, including a simulation of the current-state and proposed requirements management approaches. Ultimately, usage of the proposed optimized set of processes is demonstrated to be a cost-effective approach when compared against traditional processes that may adversely impact the development of new space systems.

Item Open Access Development and quasi-experimental study of the Scrum model-based system architecture process (sMBSAP) for agile model-based software engineering(Colorado State University.
Libraries, 2023) Huss, Moe, author; Herber, Daniel R., advisor; Borky, John M., advisor; Miller, Erika, committee member; Mallette, Paul, committee member
Model-Based Systems Engineering (MBSE) is an architecture-based software development approach. Agile, on the other hand, is a lightweight system development approach that originated in software development. To bring together the benefits of both approaches, this research is divided into two stages. The first stage proposes an integrated Agile MBSE approach that adopts a specific instance of the Agile approach (i.e., Scrum) in combination with a specific instance of an MBSE approach (i.e., the Model-Based System Architecture Process — "MBSAP") to create an Agile MBSE approach called the integrated Scrum Model-Based System Architecture Process (sMBSAP). The proposed approach was validated through an experimental study that developed a health technology system over one year, successfully producing the desired software product. This work focuses on determining whether the proposed sMBSAP approach can deliver the desired Product Increments with the support of an MBSE process. The interaction of the Product Development Team with the MBSE tool, the generation of the system model, and the delivery of the Product Increments were observed. The results showed that the proposed approach contributed to achieving the desired system development outcomes and, at the same time, generated complete system architecture artifacts that would not have been developed if Agile had been used alone. Therefore, the first contribution of this stage lies in introducing a practical and operational method for merging Agile and MBSE. In parallel, the results suggest that sMBSAP is a middle ground more aligned with federal and state regulations, as it addresses technical debt concerns.
The second stage of this research compares Reliability of Estimation, Productivity, and Defect Rate metrics for sprints driven by Scrum versus sMBSAP through the experimental study in stage 1. The quasi-experimental study conducted ten sprints using each approach. The approaches were then evaluated based on their effectiveness in helping the Product Development Team estimate the backlog items they can build during a time-boxed sprint and deliver more Product Backlog Items (PBI) with fewer defects. The Commitment Reliability (CR) was calculated to compare the Reliability of Estimation, with a measured average Scrum-driven value of 0.81 versus a statistically different average sMBSAP-driven value of 0.94. Similarly, the average Sprint Velocity (SV) for the Scrum-driven sprints was 26.8 versus 31.8 for the sMBSAP-driven sprints. The average Defect Density (DD) for Scrum-driven sprints was 0.91, while that of sMBSAP-driven sprints was 0.63. The average Defect Leakage (DL) for Scrum-driven sprints was 0.20, while that of sMBSAP-driven sprints was 0.15. The t-test analysis concluded that, relative to the Scrum-driven sprints, the sMBSAP-driven sprints were associated with a statistically significantly larger mean CR and SV and a smaller mean DD and DL. The overall results demonstrate formal quantitative benefits of an Agile MBSE approach compared to Agile alone, strengthening the case for considering Agile MBSE methods within the software development community. Future work might include comparing Agile and Agile MBSE methods using alternative research designs and further software development objectives, techniques, and metrics. Future investigations may also test sMBSAP with non-software systems to validate the methodology across other disciplines.

Item Open Access Development of a human factors hazard model for use in system safety analysis(Colorado State University.
Libraries, 2021) Birch, Dustin Scott, author; Bradley, Thomas, advisor; Miller, Erika, committee member; Cale, James, committee member; Ozbek, Mehmet, committee member
Traditional methods for Human Reliability Analysis (HRA) have been developed with specific applications or industries in mind. Additionally, these methods are often complicated, time consuming, costly to apply, and unsuitable for direct comparison amongst themselves. The proposed Human Factors Hazard Model (HFHM) utilizes the established and time-tested probabilistic analysis tools of Fault Tree Analysis (FTA) and Event Tree Analysis (ETA), and integrates them with a newly developed Human Error Probability (HEP) predictive tool. This new approach is developed around Performance Shaping Factors (PSFs) relevant to human behavior, as well as specific characteristics unique to a system architecture and its corresponding operational behavior. This updated approach is intended to standardize, simplify, and automate the modeling of the likelihood of a mishap due to a human-system interaction during a hazard event. The HFHM is exemplified and automated within a commercial software tool such that trade and sensitivity studies can be conducted and validated easily. The analysis results generated by the HFHM can be used as a standardized guide for systems engineering analysts as well as design engineers with regard to risk assessment, safety requirements, design options, and needed safety controls within the system architecture. Verification and evaluation of the HFHM indicate that it is an effective tool for HRA and system safety, with results that accurately predict HEP values and can guide design efforts with respect to human factors. In addition to the development and automation of the HFHM, its application within commonly used system safety Hazard Analysis Techniques (HATs) is established.
Specific utilization of the HFHM within system- or subsystem-level FTA and Failure Mode and Effects Analysis (FMEA) is established such that human-related hazards can be more accurately accounted for in system design safety analysis and lifecycle management. Lastly, integration of the HFHM within Model-Based Systems Engineering (MBSE), emphasizing an implementation in the System Modeling Language (SysML), is established using a combination of existing hazard analysis libraries and custom-designed libraries within the Unified Modeling Language (UML). The FTA/ETA components of the hazard model are developed within SysML, partially utilizing the Risk Analysis and Assessment Modeling Language (RAAML) currently under development by the Object Management Group (OMG), as well as a unique recursive analysis library. The SysML model successfully replicates the probabilistic calculation results of the HFHM as generated by the native analytical model. The SysML profiles developed to implement the HFHM have application in the integration of conventional system safety analysis as well as requirements engineering within lifecycle management.

Item Open Access Hybrid MBSE-DevOps model for implementation in very small enterprises(Colorado State University. Libraries, 2024) Simpson, Cailin R., author; Simske, Steven, advisor; Miller, Erika, committee member; Reisfeld, Brad, committee member; Sega, Ronald, committee member
This work highlights the challenge of implementing digital engineering (DE) practices, specifically model-based systems engineering (MBSE) and DevOps, in very small entities (VSEs) that deliver software products. VSEs often face unique challenges due to their limited resources and project scale. Various organizations have authored strategies for DE advancement, such as the Department of Defense's Digital Engineering Strategy and INCOSE's System Engineering 2035, which highlight the need for improved DE practices across the engineering fields.
This work proposes a hybrid methodology named FlexOps, combining MBSE and DevOps, to address these challenges. It highlights the challenges faced by VSEs and emphasizes that MBSE and DevOps adoption in VSEs requires careful consideration of factors like cost, skill availability, and customer needs. The motivation for the research stems from the difficulties faced by VSEs in implementing processes designed for larger companies. The aim is to provide a stepping stone for VSEs to adopt DE practices through the hybrid FlexOps methodology, leveraging existing MBSE and DevOps practices while accommodating smaller project scales. This work emphasizes that VSEs supporting government contracts must also adopt DE practices to meet industry directives. The implementation of FlexOps in two case studies highlights its benefits, such as offering a stepping stone to DE practices, combining Agile, MBSE, and DevOps strategies, and addressing VSE-specific challenges. The challenges faced by VSEs in adopting DE practices may be incrementally overcome by adopting the hybrid FlexOps method, which was designed to bridge the gap between traditional practices and DE for VSEs delivering software products.

Item Embargo Investigating the association between public health system structure and system effectiveness(Colorado State University. Libraries, 2024) Orr, Jason, author; Golicic, Susan, advisor; Bradley, Thomas, committee member; Miller, Erika, committee member; Gutilla, Molly, committee member; Magzamen, Sheryl, committee member
Public health systems in the United States face significant challenges due to their complexity and variability. This dissertation follows a three-paper format and examines these systems through a comprehensive analysis, using systems approaches, latent transition analysis (LTA), and ordinal regression to uncover patterns and inform improvements in public health governance and service delivery.
The first essay (Chapter 2) explores the application of systems approaches to the design and improvement of public health systems. A scoping review was conducted, revealing a paucity of literature on the use of "hard" systems methodologies like systems analysis and engineering in public health. The findings highlight the potential for systems approaches to enhance the efficiency, effectiveness, and equity of public health services. However, the limited engagement by public health practitioners and the lack of depth in existing literature indicate significant gaps that need to be addressed to fully leverage systems science in public health governance and service delivery.

Building on the literature review, the second essay (Chapter 3) introduces a novel typology of local health departments (LHDs) using LTA based on the National Association of County and City Health Officials (NACCHO) Profile study data. The LTA identified six distinct latent statuses of LHDs, characterized by variables such as governance centrality, colocation, and integration. This typology provides a robust framework for understanding the structural and operational diversity of LHDs, offering insights into how these factors influence public health outcomes.

The final essay (Chapter 4) applies ordinal regression analyses to explore the relationship between the latent statuses of LHDs and various community health outcomes. Initial analyses using a cumulative logit model indicated a violation of the proportional odds assumption, necessitating a shift to a generalized logit model. This approach revealed significant predictors of latent statuses, such as poor physical health days, preventable hospital stays, and life expectancy. The findings underscore the complexity of public health systems and the need for careful selection of statistical models to accurately capture these dynamics.
The study provides actionable insights for public health policy and strategic planning, highlighting areas for future research and potential interventions to optimize public health system design and operations. This dissertation underscores the importance of systems approaches in understanding and improving public health systems. By leveraging advanced statistical models and exploring the structural characteristics of LHDs, it contributes to a deeper understanding of the factors influencing public health governance and service delivery. The findings offer a foundation for future research and policy development aimed at enhancing the efficiency and effectiveness of public health systems to better serve communities.

Item Open Access Leveraging operational use data to inform the systems engineering process of fielded aerospace defense systems(Colorado State University. Libraries, 2023) Eddy, Amy, author; Daily, Jeremy, advisor; Marzolf, Gregory, committee member; Miller, Erika, committee member; Wise, Daniel, committee member
Inefficiencies in Department of Defense (DoD) Acquisition processes have been pervasive for nearly as long as the DoD has existed. Stakeholder communication issues, funding concerns, and large, overly complex organizational structures all play a role in adding challenges for those tasked with fielding, operating, and sustaining a complex aerospace defense system. As legacy defense systems begin to age, logistics and other supportability element requirements may change over time. While the research literature shows that many stakeholders and senior leaders are aware of these issues and of the impact they have on mission performance, most research and improvement attempts have focused on high-level restructuring of organizations or of policy, processes, and procedures.
There has been little research dedicated to identifying ways for working-level logisticians and systems engineers to improve performance by leveraging operational use data. This study proposes a practical approach for working-level logisticians and engineers to identify relationships between operational use data and supply performance data. This research focuses on linking negative aircraft events (discrepancies) to the supply events (requisitions) that result in downtime. The approach utilizes standard statistical methods to analyze operations, maintenance, and supply data collected during the Operations and Sustainment (O&S) phase of the life cycle. Further, this research identifies methods consistent with industry systems engineering practices to create new feedback loops that better inform the systems engineering life cycle management process, update requirements, and iterate the design of the enterprise system as a holistic entity that includes the physical product and its supportability elements such as logistics, maintenance, and facilities. The method identifies specific recommendations and actions for working-level logisticians and systems engineers to prevent future downtime. The method is practical for the existing DoD organizational structure and uses current DoD processes, all without increasing manpower or other resource needs.

Item Open Access Lifecycle assessment modeling and encouraging reuse in the corrugated packaging industry using persuasion and operant conditioning(Colorado State University. Libraries, 2023) Ketkale, Harshwardhan, author; Simske, Steve, advisor; Miller, Erika, committee member; Conrad, Steve, committee member; Cleary, Anne, committee member
Greenhouse gas emissions are a major contributor to climate change and global warming. Many sustainability efforts are aimed at reducing greenhouse gas emissions. These include recycling and the use of renewable energy.
In the case of recycling, the general population is typically required to at least temporarily store, and possibly haul, the materials rather than simply throwing them away. This effort from the general population is a key aspect of recycling; for recycling to work, some investment of time and effort is required from the public. In the case of corrugated cardboard boxes, it has been observed that the general population has less motivation to recycle them. Also, the manufacturing of a product such as a corrugated cardboard box (CCB) includes the extraction of a variety of raw materials, in addition to the supply chain efforts needed to get those raw materials to industry. Raw material extraction, its supply chain, and an improper end-of-lifecycle phase can significantly impact the carbon emissions of a product over its lifecycle. This research explores different means of motivating people to reuse, and not just recycle, with different types of incentives. It addresses the use of persuasion techniques and operant conditioning techniques together to incentivize the general population to adopt sustainable efforts. Further, this study segments the general population based on age, gender, persuasion preferences, operant conditioning preferences, personality types, awareness of environment/climate change, and participants' current recycling efforts, in order to apply different forms of incentives and motivation, unlike the approaches found in the literature. Four types of persuasion techniques and four types of operant conditioning are combined to give 16 different types of incentives. Three online surveys are conducted, and their data are analyzed (using entropy, Hamming distance, t-test, chi-square, and ANOVA). The results indicate that "positive reinforcement + ethos" is a cost-effective way to incentivize the general population.
This study also conducts a Lifecycle Assessment (LCA) that gives the carbon emissions of each phase of the product lifecycle and a quantitative estimate of the overall product carbon footprint and its effect on the environment. This gives impetus to recommendations for improving the phases of the lifecycle to minimize carbon emissions. This research uses LCA to evaluate the carbon emissions in each phase of the lifecycle of a typical 1 kg corrugated cardboard box in the United States. Carbon emissions for the proposed "reuse" phase are also calculated, and the results are compared. To examine whether the incremental cost of reusing the CCBs is less than the environmental and economic savings from reducing the extraction and supply chain of raw materials, this study explores the economic feasibility of the proposed "reuse" method, which incentivizes the general population to reuse CCBs instead of recycling or landfilling them. Economic tools such as willingness-to-pay vs. marginal cost curves and benefit-cost analyses are used to evaluate economic feasibility. The results indicate that the "reuse" method for CCBs is economically and environmentally feasible. They also support the approach of using analytics, economics, and LCA to create a model that can be applied to other products and processes as an evaluative process to determine whether businesses can benefit from the reduction (or removal) of material extraction costs from the supply chain. The results of this study can be applied to a wide range of applications, such as solar panels, incentives for vaccination, and other areas wherein sustainability-centric behavior is encouraged.

Item Open Access Machine learning and artificial intelligence approaches to the analysis of physical activity from wearables and biosensors in clinical trials: applications of clustering and prediction of clinical outcomes(Colorado State University.
Libraries, 2022) Vlajnic, Vanja M., author; Simske, Steve, advisor; Miller, Erika, committee member; Cale, Jim, committee member; Reisfeld, Bradley, committee member
As human demographics continue to trend older, especially in advanced economies, the treatment of illness becomes more salient. Across many therapeutic areas, researchers examine potential treatments while incorporating novel technologies in an effort to prolong the years in which quality of life is achieved for patients around the world. In the area of cardiovascular disease, wearable and biosensor data are increasingly used to complement data traditionally collected in clinical trials. This work discusses a series of analytical approaches for the analysis of data from recent clinical trials in which accelerometry data from wearable devices were analyzed using clustering approaches (K-means and consensus clustering) and survival analyses (Cox proportional hazards and random survival forest) for the purposes of clustering patients and assessing their baseline clinical characteristics, as well as for the prediction of clinical outcomes. Unique clinical phenotypes were identified within the patient aggregations as part of the clustering analyses. Furthermore, models were created with improved predictive accuracy for clinical outcomes of interest in the heart failure space. Taken collectively, the results from these analyses and the analytical approaches therein can be used to assess whether heterogeneous clinical subgroups of patients exist, as well as to further guide clinical development programs.

Item Open Access Optimizing text analytics and document automation with meta-algorithmic systems engineering(Colorado State University.
Libraries, 2023) Villanueva, Arturo N., Jr., author; Simske, Steven J., advisor; Hefner, Rick D., committee member; Krishnaswamy, Nikhil, committee member; Miller, Erika, committee member; Roberts, Nicholas, committee member
Natural language processing (NLP) has seen significant advances in recent years, but challenges remain in making algorithms both efficient and accurate. In this study, we examine three key areas of NLP, explore the potential of meta-algorithmics and functional analysis for improving analytic and machine learning performance, and conclude with directions for future research. The first area focuses on text classification for requirements engineering, where stakeholder requirements must be classified into appropriate categories for further processing. We investigate multiple combinations of algorithms and meta-algorithms to optimize the classification process, confirming the optimality of Naïve Bayes and highlighting a certain sensitivity to the Global Vectors (GloVe) word embeddings algorithm. The second area of focus is extractive summarization, which offers advantages over abstractive summarization due to its lossless nature. We propose a second-order meta-algorithm that uses existing algorithms and selects appropriate combinations to generate more effective summaries than any individual algorithm. The third area covers document ordering, where we propose techniques for generating an optimal reading order for use in learning, training, and content sequencing. We propose two main methods: one using document similarities and the other using entropy against topics generated through Latent Dirichlet Allocation (LDA).

Item Open Access Quality attributes of digital twins(Colorado State University.
Libraries, 2021) Scheibmeir, Jim, author; Malaiya, Yashwant, advisor; Miller, Erika, committee member; Skiba, Hilla, committee member; Bradley, Thomas, committee member
Digital twins are virtual representations of their physical counterparts and offer modeling, monitoring, and prediction as common conveniences. Digital twins can enable autonomy in physical systems and transformation in operations, such as Industry 4.0. Digital twins may also be utilized to better secure cyber-physical systems. Currently, digital twins are found to be compelling, but definitions of the digital twin lack standardization, and concerns about the costs and quality of digital twins create reluctance to adopt them. While the emerging technologies utilized within digital twins increase capabilities, they also raise concerns about cost and quality. For example, IoT is utilized to inform digital twins, yet many IoT devices operate under low-power constraints and do not have robust security mechanisms. There are also concerns about the interoperability of IoT devices and the replacement and upgrade costs throughout the digital twin's life cycle. Augmented reality is an emerging technology and has been suggested as a user interface for digital twins. Augmented reality enables digital models and scenes to be annotated onto the physical landscape. Fusing physical and virtual worlds is common to both augmented reality and digital twin technologies. However, scaling and sharing immersive experiences is problematic. While research in areas such as IoT, augmented reality, and digital twins is frequent, there are still questions about evaluating the quality of these technologies, the composition and security of digital twins, and their maturity. The main goal of this research is to establish the quality characteristics that digital twin applications should exhibit.
We also provide a framework for the construction of application programming interfaces for digital twins and a novel approach for testing augmented reality applications utilizing computer vision. Further outcomes of our research include a benchmarking scorecard for IoT cybersecurity communications and a maturity model for digital twins. Social media analytics are used to reveal the voice of stakeholders, further defining trends in the realm of digital twins. The social media analysis has shown a dearth of conversations regarding cybersecurity concerns. Organizations must protect their investments in digital twins, and utilizing the frameworks, quality and maturity models, and scorecard within this research will reduce implementation risks.Item Open Access Situational strategic awareness monitoring surveillance system (SSAMSS)(Colorado State University. Libraries, 2023) Maldonado, Kenly R., author; Simske, Steven J., advisor; Miller, Erika, committee member; Herber, Daniel, committee member; Dandy, David, committee memberThis dissertation takes a Systems Engineering approach to the development of a cost-effective, deployable remote sensing materials safeguarding system. This so-called system-of-systems undergoes the major portions of the Systems Engineering development process to assure with confidence that a Situational Strategic Awareness Monitoring Surveillance System (SSAMSS) is a competitive product worth considering for actual development. The overall assessment takes a strategic approach, using selected tools to create, confirm, and consider whether SSAMSS as a product idea ultimately should be developed into an actual prototypical model (engineering model). Although the dissertation explores, with confidence and risk in consideration, whether a prototype should be pursued, it does not dive into the physical development of the model due to limited time and funding.
Through the Systems Engineering V-Model, SSAMSS as a product and enterprise is vetted from the customer needs analyses through unit testing (the bottom of the V-model). This is the point where an actual customer would decide whether to continue, incorporating SSAMSS into integration and testing and carrying the prototype through operations, maintenance, and retirement. Through simulations, assessment, and analysis, it has been determined that SSAMSS as a product and enterprise is a viable option to supersede current material safeguarding systems that are competitive in the marketplace today.Item Open Access Systems engineering assessment and experimental evaluation of quality paradigms in high-mix low-volume manufacturing environments(Colorado State University. Libraries, 2023) Normand, Amanda, author; Bradley, Thomas, advisor; Miller, Erika, committee member; Vans, Marie, committee member; Zhao, Jianguo, committee member; Sullivan, Shane, committee memberThis research aimed to evaluate the effectiveness of industrial paradigm application in high-mix low-volume manufacturing (HMLV) environments using a Systems Engineering approach. An analysis of existing industrial paradigms was conducted and then compared to a needs analysis for a specific HMLV manufacturer. Several paradigm-inspired experiments were then selected for evaluation in a real-world HMLV manufacturing setting. The results of this research showed that a holistic approach to paradigm application is essential for achieving optimal performance, based on cost advantage, throughput, and flexibility, in the HMLV manufacturing environment. The findings of this research study provide insights into the importance of considering the entire manufacturing system, including both technical and human factors, when evaluating the effectiveness of industrial paradigms.
Additionally, this research highlights the importance of considering the unique characteristics of HMLV manufacturing environments, such as the high degree of variability and frequent changes in product mix, when designing manufacturing systems. Overall, this research demonstrates the value of a systems engineering approach in evaluating and implementing industrial paradigms in HMLV manufacturing environments. The results of this research provide a foundation for future research in this field and can be used to guide organizations in making informed decisions about production management practices in HMLV manufacturing environments.Item Open Access The next generation space suit: a case study of the systems engineering challenges in space suit development(Colorado State University. Libraries, 2023) Cabrera, Michael A., author; Simske, Steve, advisor; Marzolf, Greg, committee member; Miller, Erika, committee member; Delgado, Maria, committee memberThe objective for a NASA contractor, the performing organization in this case study, is to develop and deliver the next generation space suit to NASA, the customer in this case study, against a radically different level of customer expectation from previous years. In 2019, the administration proposed a return to the moon, transforming the system context of the current next generation space suit and pushing schedule expectations forward two years. This dissertation serves as a case study, with qualitative and quantitative analyses, of a new process and approach to (i) project lifecycle development and (ii) requirements engineering, with the intent that, if utilized, these tools may have contributed to improvements across the project in terms of meeting cost, scope, budget, and quality goals while appropriately accounting for risk management.
The procedure entails a research method in which the current state of the project, the current state of the art, and the identified systems engineering challenges are evaluated, and iterative models are refined through development by continual improvements informed by the evaluations of engineers on the project. The current results have produced (i) a prototype project lifecycle development method via hybrid Agile, Lean, and Scrum implementations within a traditional Waterfall framework and (ii) a prototype requirements engineering scorecard implementing failure mode and effects analysis (FMEA) and quantitative analysis for root cause identification.
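As a rough illustration of how an FMEA-based requirements scorecard can prioritize items for root cause analysis, the sketch below computes a standard Risk Priority Number (RPN = severity × occurrence × detection) per requirement and flags the highest-risk entries. All identifiers, scales, and the threshold are illustrative assumptions, not the actual scorecard from the dissertation.

```python
# Minimal FMEA-style scoring sketch (assumed structure, not the
# dissertation's actual scorecard). Each requirement is rated on the
# conventional 1-10 FMEA scales, and a Risk Priority Number is used
# to rank candidates for root cause review.

from dataclasses import dataclass


@dataclass
class RequirementRisk:
    req_id: str       # hypothetical requirement identifier
    severity: int     # 1 (negligible) .. 10 (catastrophic)
    occurrence: int   # 1 (rare) .. 10 (frequent)
    detection: int    # 1 (easily detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        """Risk Priority Number: severity x occurrence x detection."""
        return self.severity * self.occurrence * self.detection


def rank_for_root_cause_review(risks, rpn_threshold=100):
    """Return requirements whose RPN exceeds the (illustrative)
    threshold, highest risk first."""
    flagged = [r for r in risks if r.rpn > rpn_threshold]
    return sorted(flagged, key=lambda r: r.rpn, reverse=True)


if __name__ == "__main__":
    risks = [
        RequirementRisk("REQ-001", severity=8, occurrence=5, detection=4),  # RPN 160
        RequirementRisk("REQ-002", severity=3, occurrence=2, detection=2),  # RPN 12
        RequirementRisk("REQ-003", severity=9, occurrence=6, detection=3),  # RPN 162
    ]
    for r in rank_for_root_cause_review(risks):
        print(r.req_id, r.rpn)
```

In this sketch, REQ-003 and REQ-001 exceed the threshold and are surfaced in descending RPN order, while the low-risk REQ-002 is filtered out; an actual scorecard would pair such a ranking with the quantitative root-cause analysis described above.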