Browsing by Author "Grigg, Neil, advisor"
Now showing 1 - 10 of 10
Item Open Access
A collaborative planning framework for integrated urban water management with an application in dual water supply: a case study in Fort Collins, Colorado (Colorado State University. Libraries, 2018)
Cole, Jeanne Reilly, author; Sharvelle, Sybil, advisor; Grigg, Neil, advisor; Arabi, Mazdak, committee member; Goemans, Chris, committee member
Urban water management is essential to our quality of life. As much of our urban water supply infrastructure reaches the end of its useful life, water managers are using the opportunity to explore alternative strategies that may enable them to better meet modern urban water challenges. Water managers must navigate the labyrinth of balancing stakeholder needs, considering all costs and benefits, reducing decision risk, and, most importantly, ensuring public health and protecting the environment. Innovative water managers need guidance and tools to help manage this complex decision space. This dissertation proposes a collaborative, risk-informed, triple bottom line, multi-criteria decision analysis (CRTM) planning framework for integrated urban water management decisions. The CRTM framework emerged from the obstacles and stakeholder needs encountered during a study evaluating alternative dual water supply strategies in Fort Collins, Colorado. The study evaluated four strategies for the dual supply of raw and treated water, including centralized and decentralized water treatment, varying distribution system scales, and integration of existing irrigation ditches with raw water landscape irrigation systems. The results suggest that while the alternative dual water supply strategies offer many social and environmental benefits, the optimal strategies are dependent on local conditions and stakeholder priorities. The sensitivity analysis revealed that the key parameters driving uncertainty in alternative performance were regulatory and political, reinforcing the importance of participation from a wide variety of stakeholders. Evaluation of the decision process suggests the CRTM framework increased knowledge sharing between study participants. Stakeholder contributions enabled a comprehensive evaluation of the option space while examining the financial, social, and environmental benefits and trade-offs of the alternatives. Most importantly, evolving the framework successfully maintained stakeholder participation throughout the study.
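For readers unfamiliar with multi-criteria decision analysis, the sketch below illustrates how a triple bottom line, weighted-sum evaluation of supply alternatives might look. It is not taken from the dissertation; the criteria, weights, and scores are hypothetical placeholders.

```python
# Illustrative sketch of a triple bottom line, weighted-sum multi-criteria evaluation.
# All criteria, weights, and alternative scores are hypothetical examples.

alternatives = {
    "centralized treatment":   {"financial": 0.6, "social": 0.7, "environmental": 0.5},
    "decentralized treatment": {"financial": 0.4, "social": 0.8, "environmental": 0.7},
    "ditch integration":       {"financial": 0.7, "social": 0.6, "environmental": 0.8},
}

# Stakeholder-elicited weights (must sum to 1.0).
weights = {"financial": 0.4, "social": 0.3, "environmental": 0.3}

def weighted_score(scores, weights):
    """Aggregate criterion scores into a single weighted-sum value."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in alternatives.items():
    print(f"{name}: {weighted_score(scores, weights):.2f}")
```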
Item Open Access
Delay-caused claims in infrastructure projects under design-bid-build delivery systems (Colorado State University. Libraries, 2014)
Mehany, Mohammed S. Hashem M., author; Grigg, Neil, advisor; Guggemos, Angela, advisor; Fontane, Darrell, committee member; Senior, Bolivar, committee member
Delay-caused claims lead to cost overruns in infrastructure projects due to the cost of construction time and impacts on related services. The purpose of this study was to evaluate the causes and effects of these claims on road and bridge projects and to create an effective claims management system. The study was conducted on road construction because data are more readily available than for other infrastructures, but findings may apply to other categories, such as the construction of buried utilities. The study was limited to projects where the design-bid-build (DBB) delivery system was used.
The study investigated the driving factors that give rise to claims, such as the extent to which delays affect them; how other parameters affect them; how the methods to estimate and analyze cost and delay claims work; what the most significant cost and resource items for delay-induced claims under DBB contracts are; whether a practical relationship can be established to analyze claims; how a workable claims management system can be created at the lowest level; and how this system can be addressed in DBB contracts. These questions were explored by isolating a list of variables including, but not limited to, the location of the project according to the Colorado Department of Transportation (CDOT) region plan, budget variations, schedule variations, schedule type, scope of work, and road material type. Data for the study covered the six regions of CDOT and included 1,060 projects spanning 1997 to 2012, with more than 213 claims. Data on the claims were organized using several Microsoft Access databases and Excel worksheets to screen and organize all the information; for each claim, the corresponding values for all the variables were identified and catalogued. The study used three statistical methods (frequency analysis, logistic regression, and correlation analysis) to analyze delays and the driving variables. One of the issues explored was how the causes of delays are identified during the complex evolution of the construction project through different methods and techniques. The methods explored included analytical and forensic techniques such as the Net Impacted Method, Impacted as Planned (IAP), Collapsed As-Built (CAB), Schedule Window Analysis (SWA), and Time Impact Analysis (TIA). A case study was also formulated to explore and demonstrate the differences between these analytical and forensic delay analysis techniques within a construction schedule. The results indicated that schedule variation is the most significant parameter contributing to claim occurrence in the road infrastructure industry. The study identified a significant relationship between schedule and budget variations and showed an expected association between costs and time in claims. Contrary to the general consensus in the construction industry that claims occur due to issues of cost and added contract items, the study indicated that schedule delays were the main driver of claim occurrence. The case study showed the differences among schedule analysis techniques and their susceptibility to manipulation, whereby a contracting party can analyze the same delays using a technique that favors its position or reduces its responsibility in the delay claim. This in turn shows the need for standardization of delay analysis techniques between the contracting parties. The case study also showed the advantages, disadvantages, and accuracy of the different delay claim analysis techniques and demonstrated the overall superiority of the Time Impact Analysis (TIA) technique due to its accuracy and its contemporaneous and proactive approach. The study included development of a standardized delay claims management system and a set of best practices for owners and contractors to create a fair and proactive process to resolve claims and minimize disputes and delay costs.
The claims management system and best practices address in detail the major issues of delay claims management, including but not limited to delay detection and documentation, standardization of the delay analysis method (represented by the TIA), weather-related delays, scheduling specifications, and schedule and cost documentation for delay claims. Given that road agencies generally lack standardized methods of integrated cost and time estimates to facilitate the claim resolution process, a growing number of claims will create more disputes and conflict resolution issues. The identification of major parameters that explain the occurrence of claims, and of how to deal with them in a claims management system, should help to lower infrastructure costs and speed completion of projects.
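As a rough illustration of the kind of logistic regression analysis described above, the sketch below fits claim occurrence against schedule and budget variation. The data are synthetic placeholders, not CDOT records, and the variable names are assumptions.

```python
# Illustrative logistic regression of claim occurrence on project variables.
# The data below are synthetic placeholders, not the CDOT project database.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
schedule_variation = rng.normal(0.0, 0.2, n)   # fractional schedule growth
budget_variation = rng.normal(0.0, 0.1, n)     # fractional budget growth

# Assume (for illustration only) that schedule variation drives claim probability.
logit = -1.5 + 6.0 * schedule_variation + 1.0 * budget_variation
claim_occurred = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([schedule_variation, budget_variation])
model = LogisticRegression().fit(X, claim_occurred)
print("coefficients (schedule, budget):", model.coef_[0])
print("claim probability at +20% schedule, 0% budget:",
      model.predict_proba([[0.2, 0.0]])[0, 1])
```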
Item Open Access
Identification of spatial and topographical metrics for micro hydropower applications in irrigation infrastructure (Colorado State University. Libraries, 2012)
Campbell, Brian, author; Grigg, Neil, advisor; Catton, Kimberly, advisor; Zimmerle, Dan, committee member
A recent agreement between the Federal Energy Regulatory Commission and the State of Colorado seeks to streamline regulatory review of small, low-head hydropower (micro hydropower) projects located in constrained waterways (Governor's Energy Office, 2010). This regulatory change will likely encourage the development of micro hydropower projects, primarily as upgrades to existing infrastructure. Previous studies of low-head hydropower projects have estimated the combined capacity of micro hydro projects in Colorado at between 664 MW and 5,003 MW (Connor, A.M., et al. 1998; Hall, D.G., et al. 2004, 2006). However, these studies did not include existing hydraulic structures in irrigation canals as possible hydropower sites. A Colorado Department of Agriculture study (Applegate Group, 2011) identified existing infrastructure categories for low-head hydropower development in irrigation systems, which included diversion structures, line chutes, vertical drops, pipelines, check structures, and reservoir outlets. However, an accurate assessment of hydropower capacity from existing infrastructure could not be determined due to low survey responses from irrigation water districts. The current study represents the first step in a comprehensive field study to quantify the type and quantity of irrigation infrastructure that could potentially be upgraded to support micro hydropower production. Field surveys were conducted at approximately 230 sites at existing hydraulic control structures in 6 of Colorado's 7 hydrographic divisions. The United States Bureau of Reclamation contributed approximately 330 additional sample sites from the 17 western states. The work presented here describes a novel method of identifying geospatial metrics to support an estimation of total site count and resource availability of potential micro hydropower. The proposed technique is general in nature and could be utilized to assess micro hydropower resources in any region.
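The resource estimates above ultimately rest on the standard hydropower relationship P = ρ g Q H η. The sketch below applies it to a hypothetical drop structure; the flow, head, and efficiency values are illustrative assumptions, not survey data.

```python
# Standard hydropower potential: P = rho * g * Q * H * eta.
# The site values below are illustrative assumptions, not field survey data.

RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def hydro_power_kw(flow_m3s: float, head_m: float, efficiency: float = 0.7) -> float:
    """Return electrical power (kW) for a given flow, head, and turbine/generator efficiency."""
    return RHO * G * flow_m3s * head_m * efficiency / 1000.0

# Example: a check structure passing 1.5 m^3/s over a 2.0 m drop.
print(f"{hydro_power_kw(1.5, 2.0):.1f} kW")  # ~20.6 kW
```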
Item Open Access
Information-augmented building information models (BIM) to inform facilities management (FM) guidelines (Colorado State University. Libraries, 2019)
Sadeghi, Marjan, author; Grigg, Neil, advisor; Elliot, Jonathan W., advisor; Mehany, Mohammed S. Hashem M., committee member; Anderson, Charles W., committee member
The asset portfolios of Higher Education Institutions (HEI) typically incorporate a highly diverse collection of buildings with various and often shared campus uses. These facilities are typically at different points in their operational lifecycle and have different characteristics, rehabilitation costs, maintenance costs, and mission criticality. In the resource-constrained context of higher education Facilities Management (FM), building data for all facilities need to be integrated within a highly informed decision-making process to promote efficient operation. Further, efficient building FM workflows depend upon accurate, reliable, and timely information for various building-specific systems, components, and elements. Traditional Facilities Information Management (FIM) platforms and processes have been shown to be inefficient and limited for capturing and delivering the extensive and comprehensive data needed for FM decision making. Such inefficiencies include, but are not limited to, information loss, inconsistencies in the available data, and manual data re-entry at construction-to-operation handover and project closeout. Building Information Models (BIMs) are capable of integrating large quantities of data and have been recognized as a compelling tool for facility life-cycle information management. BIMs provide an object-oriented, parametric, 3D environment where meaningful objects with intelligent behavior can contain geometric and non-geometric data. This capability makes BIMs a powerful tool for use beyond building visualization. Furthermore, BIM authoring tools are capable of automatically integrating data with FM technologies. Although BIMs have the potential to provide a compelling tool to capture, deliver, validate, retrieve, exchange, and analyze facility lifecycle information, implementation of BIMs for FM handover and integration within the context of FIM remains limited. A plethora of academic and industry efforts strive to address various aspects of BIM interoperability for handing over building data for implementation in post-construction building operation workflows. Attempts to incorporate BIMs in FIM have generally focused on one of five domains: what information is to be exchanged, how, when, by whom, and why. This three-manuscript dissertation explores FM handover information exchange scenarios and provides a comprehensive, object-oriented BIM solution that identifies the requirements for model content for FM-specific needs. The results formalize an appropriate process and structured framework to deliver BIM content using FM-specific terminologies and taxonomies. BIMs created for design and construction using this framework provide a suitable 3D resource for post-handover FM and building operation. The BIM development framework presented herein can facilitate automated model data validation at project closeout and the exchange of AEC data with FIM systems. This modeling process can reduce the need for manual data re-entry or interpretation by FM stakeholders during building operation. This study defines requirements for model Exchange Objects (EOs) to meet FM data Exchange Requirements (ERs) in conjunction with the established Industry Foundation Classes (IFC). The ERs, retrieved from closeout deliverables, are mapped to appropriate IFC Model View Definition (MVD) concepts for EOs, which ultimately provide the technical solution for the FM handover exchange scenario. These concepts determine the required entities, their relationships, and properties. The author further translated the concepts to the ERs of Level of Development (LOD) definitions to provide the means for an owner to formalize conveyance of geometric requirements. To formalize a BIM's semantic requirements, which are not addressed in the LOD schema, this study introduces Level of Semantics (LOS) by mapping ERs to IFC categories and their respective property definitions. The results also include development of an implementation agreement, which customizes the FM handover IFC Model View (MV) according to an organization's specific needs. The IFC MV implementation agreement establishes a common understanding of the FM handover MV content in alignment with the buildingSMART Data Dictionary (bsDD) schema. The modularized and repeatable nature of the resulting framework facilitates its implementation to convey FIM data exchange requirements for future projects.
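To make the idea of validating model content against exchange requirements concrete, here is a minimal sketch of a rule check over BIM objects represented as plain dictionaries. The property-set names, attributes, and object structure are hypothetical and do not reproduce the dissertation's MVD concepts.

```python
# Minimal sketch of checking BIM objects against FM exchange requirements (ERs).
# Object structure, property-set names, and required attributes are hypothetical.

exchange_requirements = {
    "IfcPump": {"Pset_Asset": ["SerialNumber", "InstallationDate"],
                "Pset_Warranty": ["WarrantyEndDate"]},
    "IfcDoor": {"Pset_Asset": ["SerialNumber"]},
}

model_objects = [
    {"id": "P-101", "ifc_class": "IfcPump",
     "psets": {"Pset_Asset": {"SerialNumber": "XK-42"}, "Pset_Warranty": {}}},
    {"id": "D-201", "ifc_class": "IfcDoor",
     "psets": {"Pset_Asset": {"SerialNumber": "D-9987"}}},
]

def missing_properties(obj, requirements):
    """Return (pset, property) pairs required for the object's class but absent from the model."""
    missing = []
    for pset, props in requirements.get(obj["ifc_class"], {}).items():
        present = obj.get("psets", {}).get(pset, {})
        missing += [(pset, p) for p in props if p not in present]
    return missing

for obj in model_objects:
    gaps = missing_properties(obj, exchange_requirements)
    print(obj["id"], "OK" if not gaps else f"missing: {gaps}")
```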
Item Open Access
Model-based systems engineering application to data management for integrated sustainable human settlement modeling (Colorado State University. Libraries, 2024)
Adjahossou, Anicet, author; Grigg, Neil, advisor; Bradley, Thomas, committee member; Conrad, Steven, committee member; Willson, Bryan, committee member; Fremstad, Anders, committee member
The challenges associated with the transition from current approaches to temporary humanitarian settlement toward integrated, sustainable human settlements are largely due to a significant increase in the number of forcibly displaced people over the last few decades, the difficulties of sustainably providing the needed services to the required standards, and the prolongation of emergencies. According to the United Nations High Commissioner for Refugees (UNHCR)'s Global Appeal 2023, more than 117.2 million people were forcibly displaced or stateless in 2023, representing a little over 1% of the world's population. The average lifespan of a humanitarian settlement is between 17 and 26 years (UNHCR), and factors such as urban growth and adverse environmental changes have exacerbated the scale of the difficulties. Despite this problematic context, short-term considerations continue to guide the planning and management of humanitarian settlements, to the detriment of more integrated, longer-term perspectives. These factors call for a paradigm shift in approach to ensure greater sustainability from the planning phases onward. Recent studies often attribute the unsustainability of humanitarian settlements to poor design and inadequate provision of basic resources and services, including water, energy, housing, and employment and economic opportunities, among others. They also highlight apparent bottlenecks that hinder access to the meaningful and timely data and information that stakeholders need for planning and remediation. More often than not, humanitarian operations rely on ad hoc methods, employing parallel, fragmented, and disconnected data processing frameworks, resulting in the collection of a wide range of data without subsequent analysis or prioritization to optimize potential interconnections that could improve sustainability and performance. Furthermore, little effort has been made to investigate the trade-offs involved. As a result, major shortcomings have emerged along the way, leading to disruption, budget overruns, disorder, and more, against a backdrop of steadily declining funding for humanitarian aid. Some attempts have been made to move toward more sustainable design approaches, but these have mainly focused on vague, sector-specific themes, ignoring systemic and integrative principles.
This research contributes to filling these gaps by developing more practical and effective solutions based on an integrated, systemic vision of a human settlement, defined and conceptualized as a complex system. As part of this process, this research proposes a model-driven methodology, supported by Model-Based Systems Engineering (MBSE) and the Systems Modeling Language (SysML), to develop an integrated human settlement system model, which has been functionally and operationally executed using a Systems Engineering (SE) approach. This novel system model enables all essential sub-systems to operate within a single system and focuses on efficient data processing. The ultimate aim is to provide a global solution to the interconnection and integration challenges encountered in the processing of operational data and information, to ensure an effective transition to sustainable human settlements. With regard to the interconnectedness between the different sectors of the sub-systems, this research proposes a Triple Nexus Framework (TNF) in an attempt to integrate water, energy, and housing sector data derived from one sub-system within the single system by applying systems engineering methods. Systems engineering, based on an understanding of the synergies between water, energy, and housing, characterizes the triple nexus framework and identifies opportunities to improve the decision-making steps and processes that integrate and enhance the quality of data processing. To test and validate the performance of the system model, two scenarios are executed to illustrate how an integrated data platform enables easy access to meaningful data as a starting point for modeling an integrated system of sustainable human settlement in humanitarian contexts. With regard to framework performance, the model is simulated using a megadata nexus, as specified by the system requirements. The optimization simulation yields 67% satisfactory results, which is further confirmed by a set of surveyed practitioners. These results show that an integrated system can improve the sustainability of human settlements beyond a sufficiently acceptable threshold, and that capacity building in service delivery is beneficial and necessary. The focus on comprehensive data processing through systems integration can be a powerful tool for overcoming gaps and challenges in humanitarian operations. Structured interviews with question analysis are conducted to validate the proposed model and framework. The results show a consensus that the novel system model advances the state of the art in the current approach to the design and management of human settlements. An operational roadmap with the substantial programmatic and technical activities required to implement the triple nexus framework is recommended for adoption and scaling up. Finally, to assess the sustainability, adaptability, and applicability of the system, the proposed system model is further validated using a context-based case study, through a capacity assessment of an existing humanitarian settlement. The sustainability analysis uses cross-impact matrix multiplication applied to classification (MICMAC) methodologies, and the results show that the development of the settlement is unstable, and therefore unsustainable, since there is no apparent difference between influential and dependent data. This research tackles an important global challenge, providing valuable insights toward sustainable solutions for displaced populations, aligning with the United Nations 2030 Agenda for Sustainable Development.
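As an illustration of the MICMAC step mentioned above, the sketch below computes driving power (row sums) and dependence (column sums) from a small cross-impact matrix and classifies each variable into the usual quadrants. The variables and matrix values are hypothetical, not the study's data.

```python
# Illustrative MICMAC-style analysis: driving power vs. dependence from a cross-impact matrix.
# Variables and influence scores (0-3) are hypothetical examples.
import numpy as np

variables = ["water", "energy", "housing", "funding"]
# influence[i][j] = strength with which variable i influences variable j
influence = np.array([
    [0, 2, 3, 1],
    [2, 0, 2, 1],
    [1, 1, 0, 2],
    [3, 3, 3, 0],
])

driving = influence.sum(axis=1)     # row sums: how strongly each variable drives the system
dependence = influence.sum(axis=0)  # column sums: how strongly each variable is driven

d_mid, p_mid = dependence.mean(), driving.mean()
for name, drv, dep in zip(variables, driving, dependence):
    if drv >= p_mid and dep >= d_mid:
        quadrant = "linkage (unstable)"
    elif drv >= p_mid:
        quadrant = "driving"
    elif dep >= d_mid:
        quadrant = "dependent"
    else:
        quadrant = "autonomous"
    print(f"{name}: driving={drv}, dependence={dep} -> {quadrant}")
```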
Item Open Access
Optimal sensor placement for sewer capacity risk management (Colorado State University. Libraries, 2019)
Kimbrough, Hal Reuben, author; Duff, William, advisor; Grigg, Neil, advisor; Labadie, John, committee member; Ham, Jay, committee member
Complex linear assets, such as those found in transportation and utilities, are vital to economies and, in some cases, to public health. Wastewater collection systems in the United States are vital to both. Yet effective approaches to remediating failures in these systems remain an unresolved shortfall for system operators. This shortfall is evident in the estimated 850 billion gallons of untreated sewage that escape combined sewer pipes each year (US EPA 2004a) and the estimated 40,000 sanitary sewer overflows and 400,000 backups of untreated sewage into basements (US EPA 2001). Failures in wastewater collection systems can be prevented if they can be detected in time to apply intervention strategies such as pipe maintenance, repair, or rehabilitation. This is the essence of a risk management process. The International Council on Systems Engineering recommends that risks be prioritized as a function of severity and occurrence and that criteria be established for acceptable and unacceptable risks (INCOSE 2007). A significant impediment to applying generally accepted risk models to wastewater collection systems is the difficulty of quantifying risk likelihoods. These difficulties stem from the size and complexity of the systems, the lack of data and statistics characterizing the distribution of risk, the high cost of evaluating even a small number of components, and the lack of methods to quantify risk. This research investigates new methods to assess the likelihood of failure through a novel approach to the placement of sensors in wastewater collection systems. The hypothesis is that iterative movement of water level sensors, directed by a specialized metaheuristic search technique, can improve the efficiency of discovering locations of unacceptable risk. An agent-based simulation was constructed to validate the performance of this technique and to test its sensitivity to varying environments. The results demonstrated that a multi-phase search strategy, with a varying number of sensors deployed in each phase, could efficiently discover locations of unacceptable risk that could then be managed via a perpetual monitoring, analysis, and remediation process. A number of promising, well-defined future research opportunities also emerged from this research.
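The multi-phase, metaheuristic search idea can be illustrated with a toy simulated-annealing loop that repositions a small set of sensors to maximize coverage of high-risk pipe segments. Everything here (the network, risk scores, and coverage rule) is a synthetic stand-in for the agent-based simulation described in the abstract.

```python
# Toy simulated-annealing search for water-level sensor locations on a sewer network.
# The network, risk scores, and coverage rule are synthetic stand-ins, not the study's model.
import math
import random

random.seed(1)
n_nodes = 60
risk = [random.random() for _ in range(n_nodes)]  # per-segment risk, unknown in practice

def covers(sensor, node):
    """A sensor 'observes' segments within two pipe segments of its location."""
    return abs(sensor - node) <= 2

def coverage_score(sensors):
    """Total risk observed by the current sensor set."""
    return sum(r for node, r in enumerate(risk)
               if any(covers(s, node) for s in sensors))

sensors = random.sample(range(n_nodes), 4)
best, best_score = list(sensors), coverage_score(sensors)
temperature = 1.0
for step in range(2000):
    candidate = list(sensors)
    candidate[random.randrange(len(candidate))] = random.randrange(n_nodes)  # move one sensor
    delta = coverage_score(candidate) - coverage_score(sensors)
    if delta > 0 or random.random() < math.exp(delta / temperature):
        sensors = candidate
    if coverage_score(sensors) > best_score:
        best, best_score = list(sensors), coverage_score(sensors)
    temperature *= 0.999  # cool the search over time

print("best sensor locations:", sorted(best), "observed risk:", round(best_score, 2))
```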
Item Open Access
Quality control of front-end planning for electric power construction: a collaborative process-based approach using systems engineering (Colorado State University. Libraries, 2024)
Nguyen, Frank Bao Thai, author; Grigg, Neil, advisor; Valdes-Vasquez, Rodolfo, advisor; Gallegos, Erika, committee member; Glick, Scott, committee member
Controlling construction costs in the electric power industry will become more important as the nation responds to new energy demands due to the transition from gasoline to electric vehicles and to emerging trends such as artificial intelligence and the use of cryptocurrency. However, managing electric utility construction project costs requires that the risk of field change orders (FCOs) during construction be controlled. In the electric power industry, utility companies face increasing risk from FCOs due to the conversion from overhead to underground systems required by security and climate change factors, and subgrade work is more challenging and less predictable than the more visible overhead work. Change orders cause cost overruns and schedule slippages and can occur for reasons such as changes in scope of work, unforeseen jobsite conditions, modifications of plans to meet existing field conditions, and correction of work required by field inspectors to meet safety standards. The best opportunity to control FCOs comes during front-end planning (FEP), when the conditions leading to them can be identified and mitigated. This study utilized systems engineering methodologies to address the risk of FCOs in three phases: (1) defining the root causes and identifying severities of FCOs, (2) evaluating stakeholder responsibilities to find and mitigate root causes of FCOs, and (3) developing a process to identify and find solutions for the risk of FCOs. The first phase used a descriptive statistical analysis of the project database of an electric utility company to identify and analyze the magnitude, frequency, and causes of FCOs in overhead and underground electrical construction. The results showed that FCOs with added scope occurred more frequently in underground projects than in overhead projects. The analysis also indicated that most causes of FCOs could be managed during the FEP process, and it laid a foundation for the next phase, which promotes collaboration among stakeholders to allocate responsibility for identifying and mitigating the risk of FCOs. In the second phase, the study used Analytical Hierarchy Process (AHP) methodologies to distribute the weights of stakeholder votes and create an integrated metric of front-end planning team confidence that a desired level of quality had been achieved. This study was significant in that it showed how the effectiveness of collaborative working relationships across teams during front-end planning could be improved to create a quality control metric that captures the risk of FCOs. In the third phase, the study used results from the first two phases and additional tools based on swimlane diagrams and logical relationships between tasks and stakeholders to formulate a quality control roadmap model. This model is significant because it creates a roadmap to enhance the effectiveness of interdisciplinary teamwork through a critical path of the FEP process. The roadmap model shows a streamlined process for decision-making in each phase of front-end planning to minimize the risk of FCOs through a logical path prior to final design. While there have been efforts to improve the design process, this study is the first one known to the researcher to address quality control of FEP using a roadmap process in electric power construction projects. The primary contribution is to enrich the body of knowledge about quality control of FEP by creating a roadmap model based on systems engineering and by enhancing the effectiveness of collaborative working relationships in a logical process that captures the risk of FCOs early in the FEP process. Besides contributing a method to reduce the risk of FCOs, the study points to another important concern for the construction industry: safety on the jobsite. Due to an FCO, the contractor normally requires a time extension to complete the work, but to reduce the impact on the project schedule, overtime is normally assigned to construction workers to perform the task. Additional research on this issue is required, but it is apparent that, due to the fatigue of long working hours, this overtime may affect task performance as well as the physical and psychological well-being of the construction workers, who may lose safety awareness and face a higher risk of accidents on the construction site. Thus, reducing the risk of FCOs will lead to less overtime and is an effective way for the construction project team to reduce the risk of construction accidents.
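For readers unfamiliar with the Analytical Hierarchy Process, the sketch below derives priority weights from a pairwise comparison matrix using the principal eigenvector and reports a consistency ratio. The criteria and judgments are hypothetical, not the stakeholder votes from the study.

```python
# Illustrative AHP priority derivation from a pairwise comparison matrix.
# Criteria and judgments are hypothetical, not the study's stakeholder data.
import numpy as np

criteria = ["scope definition", "site investigation", "constructability review"]
# A[i, j] = how much more important criterion i is than criterion j (Saaty 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)     # consistency index
cr = ci / 0.58                           # Saaty's random index for n = 3 is 0.58
print(dict(zip(criteria, weights.round(3))), "CR =", round(cr, 3))
```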
Item Open Access
The development of a simplified asset management model for fixed US Air Force installations (Colorado State University. Libraries, 2014)
Gregory, Colby S., author; Grigg, Neil, advisor; Bailey, Ryan, committee member; Glick, Scott, committee member
Water utility infrastructures that support Air Force installations are not only important but also expensive to maintain and repair. While the Air Force strategic asset management structure focuses on mid- to long-term planning for budget allocation, at the installation level many issues confront the effectiveness of this program. Problems arise at every level within an installation's utility asset management program, from asset inventory to condition assessments and failure consequence assessment. With inaccurate asset inventories, data disparities, and uncertain information on system condition, installations are forced to take a "worst first" approach to maintenance operations. The largest issues confronting utility management at the installation level are time and money. Reductions in force size and spending provide the impetus to create a simplified method for asset management. To solve this complex problem, an investigation of various approaches to utility asset management was conducted to encompass the intent of the Air Force's existing activity management framework. Using pre-existing information and new methods, a risk management model was developed to bolster the efficacy of the pre-existing management system. Knowledge-based condition assessments and criticality assessments allow utilities engineers to calculate infrastructure risk for their own planning horizons, rather than strategic planning horizons. This research includes analytical and mathematical approaches that form the backbone of the simplified process. The study also provides a user-focused data model and an implementation strategy to outline the processes required to improve management conditions. By laying the groundwork for how utility infrastructures can be better managed, the study draws conclusions about feasible approaches that consider the Air Force's monetary and manpower constraints. The research was validated through a case study at Francis E. Warren Air Force Base. The study found that, through a paradigm shift in both the calculation and communication of failure consequences, improvements can be made to the process by which infrastructure is managed at the installation level. The research concludes with an analysis of the roles of key factors in the process of asset management as practiced by the defense industry and fee-based public utilities. The implications of this research primarily benefit multi-layered organizations that currently use a top-down approach to asset management. By enabling lower levels to aggregate data and determine priorities, improved levels of service, more effective mission support, and reduced outages may be realized.
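A simple way to picture the condition-and-criticality risk calculation described above is a likelihood-times-consequence score for each asset, as sketched below. The assets, rating scales, and threshold are illustrative assumptions, not the model developed in the thesis.

```python
# Illustrative risk scoring: likelihood (from condition) x consequence (from criticality).
# Assets, 1-5 rating scales, and the action threshold are hypothetical examples.

assets = [
    # (asset id, condition rating 1=good..5=failed, criticality 1=minor..5=mission critical)
    ("water main A", 4, 5),
    ("hydrant loop B", 2, 3),
    ("booster pump C", 5, 2),
    ("service line D", 3, 1),
]

RISK_THRESHOLD = 12  # scores at or above this value are flagged for action

def risk_score(condition: int, criticality: int) -> int:
    """Treat condition as a proxy for failure likelihood and criticality as consequence."""
    return condition * criticality

ranked = sorted(assets, key=lambda a: risk_score(a[1], a[2]), reverse=True)
for name, cond, crit in ranked:
    score = risk_score(cond, crit)
    flag = "ACTION" if score >= RISK_THRESHOLD else "monitor"
    print(f"{name}: risk={score:2d} ({flag})")
```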
Item Open Access
The spatial distribution patterns of snow water equivalent data for the accumulation phase across the southern Rocky Mountains (Colorado State University. Libraries, 2020)
Schrock, Isaac J. Y., author; Grigg, Neil, advisor; Fassnacht, Steven R., committee member; Sharvelle, Sybil, committee member
The spatial characteristics and patterns of snow accumulation and ablation are used to estimate runoff volume and timing from snowpack in mountainous regions across the western United States. This paper focuses on quantifying and characterizing the snow accumulation phase to investigate the spatio-temporal snow water equivalent (SWE) distribution in the Southern Rocky Mountains (SRM). Average daily SWE data were obtained from 90 Natural Resources Conservation Service (NRCS) Snow Telemetry (SNOTEL) stations from southern Wyoming to northern New Mexico for the snow years from 1982 to 2015. The stations range in elevation between 2268 and 3536 meters, and they were aggregated into seven sub-sets based on elevation (high-low), latitude (north-south), and annual maximum SWE (above average, average, and below average snow years). For the entire dataset and the seven data sub-sets, standard deviation versus mean trajectories were developed. Each trajectory was composed of average daily data points across the snow year, and each data point represented the standard deviation and mean SWE values from a sub-set of the SNOTEL stations. The trajectory can be used to describe and represent the change in the snowpack over the water year. Within each trajectory, the accumulation (increasing snowpack), hysteretic (increasing and decreasing snowpack), and ablation (decreasing snowpack) phases can be observed, characterized, and modeled. For this paper, regression techniques were applied to the accumulation phase only. The regression form, average slope, maximum slope, minimum slope, and coefficient of determination values were extracted. These data were aggregated across the elevation, latitude, and snow year sub-sets, and spatial patterns were evaluated. Although a prior study (Egli and Jonas, 2009) used snow depth data, SWE data were the focus of this study. SWE data were available for a longer period of record than snow depth data in the SRM, and since SWE measures the mass of water rather than the depth of snow, the physical effects of snow settling were eliminated from the analysis. The snow settling signature appeared as noise in the standard deviation versus mean depth trajectory plots, compared to the SWE trajectory plots. The removal of this noise, i.e., the use of SWE trajectory plots, yielded stronger correlations than were produced using snow depth data. The accumulation phase data most closely fit a truncated linear regression model, with average slopes ranging between 0.36 and 0.40 (seven sub-sets) and average standard deviation values ranging between 0.042 and 0.097. While the average accumulation slopes were fairly similar across all seven sub-sets, latitude affected snowpack variability more significantly than did elevation. Within individual years, the accumulation snowpack in the south region was frequently more homogeneous than in the north region, but when aggregated across the 34-year study, the accumulation snowpack in the south region was less consistent on an inter-annual basis. In contrast to the original hypotheses, when the SWE data were discretized by both elevation and latitude, the standard deviation of the accumulation slopes increased rather than decreased. Snow year (above average, average, below average) was found to have a negligible impact on the spatial homogeneity of the accumulation snowpack, except within the south-high sub-set, where the range in average accumulation slope was 0.10. Generally, the snowpack was found to be more homogeneous for below average snow years compared to average or above average snow years, because below average snow years exhibited the lowest average accumulation slopes of the three categories.
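The standard deviation versus mean trajectory and its accumulation-phase slope can be reproduced with a few lines of array code, as sketched below on synthetic daily SWE values. The station data and resulting slope are placeholders, not SNOTEL results.

```python
# Sketch of a standard deviation vs. mean SWE trajectory and its accumulation-phase slope.
# Daily SWE values below are synthetic placeholders, not SNOTEL records.
import numpy as np

rng = np.random.default_rng(7)
n_days, n_stations = 150, 30                  # accumulation-season days, stations in a sub-set
base = np.linspace(0, 0.5, n_days)[:, None]   # meters of SWE accumulating through the season
swe = np.clip(base * rng.uniform(0.6, 1.4, (1, n_stations))
              + rng.normal(0, 0.01, (n_days, n_stations)), 0, None)

mean_swe = swe.mean(axis=1)                   # one point per day: mean across stations
std_swe = swe.std(axis=1)                     # spatial variability across stations

# Slope of the accumulation-phase trajectory (standard deviation vs. mean), via least squares.
slope, intercept = np.polyfit(mean_swe, std_swe, 1)
r2 = np.corrcoef(mean_swe, std_swe)[0, 1] ** 2
print(f"accumulation slope = {slope:.2f}, intercept = {intercept:.3f}, R^2 = {r2:.2f}")
```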
Item Open Access
Waste to resource - beneficial use of water treatment residuals as a stormwater control measure amendment for phosphorus removal (Colorado State University. Libraries, 2020)
Shehab, Omar, author; Grigg, Neil, advisor; Sharvelle, Sybil, committee member; Hoag, Dana, committee member
The increase in nutrient pollution is an alarming issue, and innovative and cost-effective measures need to be taken. This study addressed two issues: removing dissolved phosphorus introduced through stormwater runoff using water treatment residuals (WTRs), and the economic value of diverting this waste material from landfills to be used as an amendment in stormwater best management practices for treating stormwater runoff. The City of Fort Collins has monitored a bioretention rain garden located at a municipal facility for several years and has consistently seen a slight decrease and, at times, even an increase in the total mass of phosphorus in stormwater effluent leaving these facilities. The increase in mass was primarily due to higher dissolved phosphorus concentrations in the rain garden's effluent. Based on prior research at Colorado State University, the use of water treatment residuals (WTRs) was selected for laboratory-scale analysis and field-scale evaluation. This research aimed to evaluate whether this waste material, generated during drinking water treatment operations, could be diverted from landfills and instead used as an amendment in stormwater best management practices (BMPs) for treating stormwater runoff. Simultaneously, it is hoped that this waste product's beneficial use can result in a safe and significant reduction in dissolved phosphorus input into water bodies. WTRs from the local water treatment plant were evaluated and found to have a very high adsorptive capacity for phosphorus, with a phosphorus sorption capacity (PSC) of 21.56 lbs of dissolved phosphorus per ton of WTRs, making them a strong candidate as an amendment to current BMPs. A column test was conducted to demonstrate a proof of concept for how WTRs can reduce phosphorus loads leaving BMPs. The column tests revealed that exposure time and the application location (top, mixed, or bottom) of WTRs within the BMP media were the critical factors in phosphorus removal. A study was also conducted to determine how much the phosphorus load could be reduced if WTRs were applied to BMPs throughout Fort Collins. The citywide analysis showed a significant reduction, if not an elimination, of the need to send this current waste product to local landfill facilities, thereby reducing disposal costs and increasing the useful life of local landfill operations. The current operation by the City of Fort Collins disposes of WTRs in the county's landfill.
This study estimated the cost of current operations, the cost of using WTRs in stormwater BMPs, and an additional potential scenario in which the landfill was moved twice as far away. Transportation, tipping/application, and staff time were the main cost components and were estimated for the different scenarios. It was found that using WTRs as an amendment in stormwater BMPs would save the City around $5,000 annually compared to the current operation and around $13,000 compared to disposing of WTRs at the new landfill. The outcome of such an approach was shown to be not only economical but also to provide environmental and social benefits, as it would significantly reduce dissolved phosphorus in stormwater runoff, resulting in improved water quality and the elimination of a current waste product.
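The economic comparison above reduces to simple annual cost arithmetic over transportation, tipping/application, and staff time. The sketch below shows the structure of such a comparison; all unit costs and quantities are invented placeholders, not the study's figures.

```python
# Sketch of the annual cost comparison structure for WTR disposal vs. beneficial use.
# All unit costs, haul distances, and quantities are invented placeholders.

def annual_cost(tons_per_year, haul_cost_per_ton, tip_or_apply_per_ton, staff_hours, staff_rate):
    """Sum the three cost components used in the comparison: hauling, tipping/application, staff time."""
    return tons_per_year * (haul_cost_per_ton + tip_or_apply_per_ton) + staff_hours * staff_rate

TONS = 1200  # hypothetical annual WTR production, tons

scenarios = {
    "landfill (current)":   annual_cost(TONS, haul_cost_per_ton=8.0, tip_or_apply_per_ton=25.0,
                                         staff_hours=150, staff_rate=35.0),
    "landfill (2x as far)": annual_cost(TONS, haul_cost_per_ton=16.0, tip_or_apply_per_ton=25.0,
                                         staff_hours=150, staff_rate=35.0),
    "BMP amendment":        annual_cost(TONS, haul_cost_per_ton=6.0, tip_or_apply_per_ton=22.0,
                                         staff_hours=200, staff_rate=35.0),
}

baseline = scenarios["landfill (current)"]
for name, cost in scenarios.items():
    print(f"{name}: ${cost:,.0f} per year (vs. current: {cost - baseline:+,.0f})")
```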