Browsing by Author "Woerner, Dale, committee member"
Now showing 1 - 10 of 10
Item Open Access Dairy cow management systems: handling, health, and well-being (Colorado State University. Libraries, 2012)
Adams, Ashley E., author; Román-Muñiz, Ivette N., advisor; Olea-Popelka, Francisco J., committee member; Woerner, Dale, committee member; Grandin, Temple, committee member

Dairy cows are handled more frequently than other livestock. Beyond the daily milking routine, reasons for handling dairy cattle include routine veterinary checks, reproductive management, and vaccinations. Regardless of the reason for handling cows, a working facility that meets the needs of a particular dairy is necessary. An ideal facility allows all aspects of cow management to be accomplished in one place, with minimal risk of injury to either the handler or the cow, as efficiently as possible. The main objectives of this survey were: 1) to determine whether there is a need for a new type of handling facility for working dairy cows, similar to a management rail, that would allow all injections to be administered according to Dairy Beef Quality Assurance (DBQA) standards; 2) to establish whether Colorado dairy producers were concerned with DBQA; and 3) to assess differences in responses between dairy owners and management/herd personnel. Additionally, the survey was designed to enhance current understanding of the Colorado dairy industry by improving knowledge of demographics, record keeping, culling decisions, synchronization programs, and cull cow marketing options. Of the 95 dairies contacted via electronic mail and telephone, 20 agreed to participate in the survey, for a response rate of 21%. The median number of cows per herd was 1,177.5, and 90% of the respondents represented conventional dairy herds. The most common type of working facility was headlock stanchions, with 95% of the dairies using them as a form of restraint while handling cows. Just over half of those surveyed (55%) indicated that they would be willing to install a management rail when building a new handling facility. When asked to rank, in order of importance, 7 traits to consider when designing a new handling facility, 75% of producers ranked the ability to administer injections per Beef Quality Assurance (BQA) standards last or second to last, illustrating the lack of concern for BQA protocols on Colorado dairy operations. Regarding actual drug administration practices, the majority (79%) of dairy producers stated that the preferred location for administering all intramuscular (IM) injections is the neck, although only 20% confirmed that all IM injections are given in that area 100% of the time. The results of the survey in Chapter II demonstrate a vast difference between the ideal situation and what is actually carried out in practice. These findings support the theory that, while producers may be aware that all IM injections should be given in the neck region, most dairy producers are not concerned enough about BQA to ensure that all injections are actually given in that location. By implementing stricter BQA protocols, dairy owners could prevent the risk of discounts, potentially increasing the revenue gained from the sale of these animals and ultimately increasing the percentage of income derived from cull cows. Survey results indicate a need for better DBQA practices on many Colorado dairies, and for handling facilities that allow the safe and consistent administration of all medications in the neck area of dairy cows.
More effective educational programs are needed to make the incentives tied to high-quality dairy beef more apparent to dairy producers. These educational opportunities should also focus on the responsibility of providing a wholesome product to the consumer, free of lesions and drug residues. An additional study, presented in Chapter III, investigated associations between increases in body temperature and common production diseases of dairy cattle. Body temperature monitoring is a common practice on dairy farms for detecting disease in dairy cows. Common production disorders that can cause a cow's body temperature to deviate from normal include metritis, mastitis, some causes of lameness, and pneumonia. The objective of the study was to investigate associations between increases in core body temperature and the diagnosis of metritis, mastitis, lameness, and pneumonia by dairy personnel. A prospective case-control study was completed on a 2,175-cow dairy operation in Colorado from May 2010 to April 2011. Each cow received an orally administered temperature-sensing reticular bolus after parturition, and reticular temperatures were recorded 3 times per day as lactating cows exited the milking parlor. A cow was identified as having an increased core body temperature when the TempTrack® software recorded a deviation of 0.8°C above baseline (the average of readings from the previous 10 days). During the same study period, dairy personnel without access to the reticular temperature data recorded health events and classified them according to the clinical signs observed. A total of 201 health events (cases) were included in the data analysis. Cows with clinical mastitis and pneumonia had significantly higher odds (6.7 and 7.5 times higher, respectively) of having an increased core body temperature within the 4 days preceding diagnosis when compared to control cows. No significant difference in core body temperature was found for cows diagnosed with lameness or metritis. Results of the study in Chapter III suggest that reticular temperature monitoring can be a useful tool in the early detection of mastitis and pneumonia in dairy cows.
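The flagging rule the study describes is simple enough to illustrate. Below is a minimal sketch, assuming one averaged reticular reading per cow per day; the function name and data layout are hypothetical and are not drawn from the TempTrack® software itself.

```python
# Minimal sketch of the baseline-deviation rule described above: flag a day
# when the reading exceeds the mean of the previous 10 days by >= 0.8 C.
# Hypothetical layout: one averaged reticular temperature per cow per day.
from statistics import mean

DEVIATION_C = 0.8    # deviation above baseline used in the study
BASELINE_DAYS = 10   # baseline = average of readings from the previous 10 days

def flag_fever_days(daily_temps_c):
    """Return indices of days whose reading exceeds baseline + 0.8 C."""
    flagged = []
    for day in range(BASELINE_DAYS, len(daily_temps_c)):
        baseline = mean(daily_temps_c[day - BASELINE_DAYS:day])
        if daily_temps_c[day] - baseline >= DEVIATION_C:
            flagged.append(day)
    return flagged

# A cow stable near 39.0 C that spikes on the final day is flagged once.
temps = [39.0, 39.1, 38.9, 39.0, 39.2, 39.0, 38.8, 39.1, 39.0, 39.1, 40.1]
print(flag_fever_days(temps))  # -> [10]
```

The rolling 10-day window makes the baseline specific to each animal, which is why a fixed fever cutoff is not needed.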
Item Open Access Driving change: the 2011 National Beef Quality Audit (Colorado State University. Libraries, 2012)
Igo, Jessica Leigh, author; Belk, Keith, advisor; Tatum, Daryl, committee member; Woerner, Dale, committee member; Chapman, Phillip, committee member

The National Beef Quality Audit - 2011 evaluated the current status of, and progress made toward, quality and consistency of cattle, carcasses, and beef products produced by the U.S. fed beef population since the introduction of the National Beef Quality Audit in 1991. The objectives of this research were to determine how each beef market sector defines seven identified quality categories, to estimate willingness to pay (WTP) for specified quality categories within each beef market sector, and to establish a best-worst (BW) scaling for the identified quality attributes. Face-to-face interviews were conducted using a modern, dynamic routing instrument over an 11-mo period (February to December 2011) with representatives of the following beef market sectors: Government and Allied Industries (n = 47); Feeders (n = 59); Packers (n = 26); Food Service, Distribution, and Further Processors (n = 48); and Retailers (n = 30). To accomplish the objectives, all responses were characterized using seven pre-established quality categories as the basis for the WTP and BW scaling questions. To determine WTP of the beef market sectors for U.S. fed beef, it was first important to understand what "quality" meant to each sector as it related to the U.S. fed beef products they purchase. To achieve this, "quality" was divided into seven pre-established categories: (1) How and where the cattle were raised; (2) Lean, fat, and bone; (3) Weight and size; (4) Cattle genetics; (5) Visual characteristics; (6) Food safety; and (7) Eating satisfaction, and interviewees in each beef market sector were asked to explain exactly which quality-related details and practices were important within each category. Overall, "Food safety" was the attribute of greatest importance to all beef market sectors except Feeders, who ranked "How and where the cattle were raised" as most important. "Eating satisfaction" was the second most important attribute to all beef market sectors except Feeders, who ranked "Weight and size" second. Overall, "How and where the cattle were raised" had the greatest odds (0.25) of being considered a "non-negotiable requirement" before the raw material would be considered at all for purchase, and differed (P < 0.05) from "Visual characteristics" (0.14), "Lean, fat, and bone" (0.12), "Eating satisfaction" (0.12), "Cattle genetics" (0.10), and "Weight and size" (0.06). Across all market sectors combined, "Eating satisfaction" carried the highest average percentage premium (11.1%), but differed (P < 0.05) only from "Weight and size" (8.8%). Most notably, when a sector said that "Food safety" was a "non-negotiable requirement," no sector was willing to purchase the product at a discounted price if the "Food safety" of the product could not be assured.
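Best-worst scaling admits a very simple summary statistic. The sketch below computes the count-based BW score, (times chosen best minus times chosen worst) divided by times shown; the audit reported model-based odds, which are typically estimated with a multinomial logit rather than raw counts, so the choice data and scoring here are illustrative only.

```python
# Count-based best-worst (BW) scaling sketch. Each interview round shows a
# subset of attributes; the respondent marks one "best" and one "worst".
# BW score = (# times best - # times worst) / # times shown.
# Illustrative choice data only, not responses from the audit.
from collections import Counter

rounds = [  # (attributes shown, chosen best, chosen worst)
    (["Food safety", "Eating satisfaction", "Weight and size"],
     "Food safety", "Weight and size"),
    (["Cattle genetics", "Food safety", "Visual characteristics"],
     "Food safety", "Cattle genetics"),
    (["Eating satisfaction", "Weight and size", "Visual characteristics"],
     "Eating satisfaction", "Weight and size"),
]

shown, best, worst = Counter(), Counter(), Counter()
for attrs, b, w in rounds:
    shown.update(attrs)
    best[b] += 1
    worst[w] += 1

scores = {a: (best[a] - worst[a]) / shown[a] for a in shown}
for attr, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{attr:25s} BW score {score:+.2f}")
```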
Item Open Access Effect of dietary beta-agonist supplementation on live performance, carcass characteristics, carcass fabrication yields, and strip loin tenderness and sensory traits (Colorado State University. Libraries, 2012)
Arp, Travis Steven, author; Belk, Keith, advisor; Woerner, Dale, committee member; Tatum, J. Daryl, committee member; Pendell, Dustin, committee member

Beef steers (n = 3,906) were fed at a commercial feed yard to evaluate the effects of beta-adrenergic agonist (βAA) supplementation on live performance, carcass characteristics, carcass fabrication yield, and strip loin tenderness and palatability. Steers were weighed, and ultrasonic carcass measurements were collected, for allocation into four feeding blocks. Within each block, approximately 100 steers were assigned to a pen, and each pen was assigned one of five treatments: no beta-agonist (CON); ractopamine hydrochloride (RH) fed at 200 mg/hd/d for the final 30 d of finishing (RAC200); RH fed at 300 mg/hd/d for the final 30 d of finishing (RAC300); RH fed as a 400 mg/hd/d top dress for the final 30 d of finishing (RAC400); and zilpaterol hydrochloride (ZH) fed at 6.8 g/ton beginning 23 d before slaughter, with a withdrawal period starting 3 d before slaughter (ZIL). The study design included eight replicates (pens) per treatment (two per block). Feeding blocks were harvested on consecutive weeks. Each week, carcass parameters were measured and strip loin samples were collected from 18 carcasses per pen (720 total samples) for Warner-Bratzler shear force (WBSF), slice shear force (SSF), and trained sensory analysis. Subsamples of eight carcasses per pen (320 total samples) were selected for whole-carcass fabrication yield. Final BW was not affected by treatment (P = 0.2892), but cattle receiving βAA supplementation tended to be heavier than controls (P = 0.0681). Average daily gain and F:G ratio were improved by βAA treatment (P < 0.05). Carcasses from the ZIL and RAC400 treatments had the heaviest HCW and were significantly heavier than CON and RAC200 carcasses (P < 0.05). The ZIL treatment also recorded the highest dressing percent, and its carcasses had the largest LMA of all treatments (P < 0.05). USDA yield grade and marbling score were reduced by βAA supplementation (P < 0.05). Differences in marbling score reduced the frequency of carcasses qualifying for the CAB premium in βAA-treated cattle (P < 0.05), and also accounted for a decrease in the frequency of carcasses grading Choice and an increase in the percentage grading Select for cattle receiving βAA supplementation compared to controls (P < 0.05). The percentage of YG1 carcasses increased and the frequency of YG3 carcasses decreased with βAA treatment (P < 0.05). Treatment with dietary βAA elicited the greatest subprimal yield response in cuts from the round. Carcasses from the ZIL treatment had the highest total saleable yield, greater than all RAC treatments (P < 0.05). WBSF and SSF were affected by treatment (P < 0.05), with shear force values increasing with the dose and potency of βAAs. Likewise, the percentages of steaks shearing greater than 4.4 kg (WBSF) and 20 kg (SSF) were increased by βAA supplementation (P < 0.05). Trained sensory panelists ranked tenderness attributes lower for steaks from βAA treatments (P < 0.05). Panelists detected no differences in juiciness or beef flavor attributes.
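The tenderness-threshold summary reported above (the share of steaks shearing above 4.4 kg WBSF or 20 kg SSF) is a simple exceedance proportion. A minimal sketch, with invented shear values:

```python
# Share of steaks above a tenderness cutoff (4.4 kg WBSF shown here; the
# same calculation applies to the 20 kg SSF cutoff). Values are invented.
WBSF_CUTOFF_KG = 4.4

wbsf_kg = {
    "CON": [3.1, 3.8, 4.0, 4.6, 3.5],
    "ZIL": [4.2, 4.9, 5.1, 3.9, 4.6],
}

for treatment, values in wbsf_kg.items():
    pct = 100.0 * sum(v > WBSF_CUTOFF_KG for v in values) / len(values)
    print(f"{treatment}: {pct:.0f}% of steaks > {WBSF_CUTOFF_KG} kg WBSF")
```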
Item Open Access Evaluation of risk factors and detection of selected foodborne pathogens associated with fresh produce (Colorado State University. Libraries, 2015)
Coleman, Shannon M., author; Goodridge, Lawrence, advisor; Bunning, Marisa, committee member; Woerner, Dale, committee member; Newman, Steven, committee member

The Economic Research Service (ERS) of the United States Department of Agriculture (USDA) has reported increases of greater than 40 pounds per capita in yearly fresh produce consumption over the last 30 years. Outbreaks associated with fresh produce have also increased, with an estimated 46% of foodborne outbreaks from 1998 to 2008 attributed to the consumption of various types of fresh produce. One foodborne pathogen of concern is Salmonella spp., the leading cause of foodborne illness hospitalizations and deaths in the United States. Salmonella species are ubiquitous microorganisms, necessitating increased and proper surveillance. Testing for major pathogens such as Salmonella spp. and Escherichia coli O157:H7 in produce is impractical due to large retail volume, variability of contamination, and the low sensitivity of current platforms. Irrigation water, wash water, and other agricultural sources offer a greater probability of pathogen detection when combined with appropriate sample preparation. One food commodity commonly linked to Salmonella spp. outbreaks is the tomato. Greenhouse/hydroponic production currently accounts for a large share of tomato production and has had a significant impact on the U.S. fresh-tomato market. Little is known about the possibility of contamination and internalization of foodborne pathogens in greenhouse/hydroponic commercial production, since these operations are usually considered relatively sanitary due to the closed environment. I evaluated risk factors associated with fresh produce contamination, such as contaminated irrigation water and agricultural sources, using simple sample preparation, subtyping techniques, and rapid molecular testing. This research comprises three studies: development of an irrigation water concentration method with subsequent detection of Salmonella spp. and E. coli O157:H7 using Vitek Immuno Diagnostic Assay (VIDAS) technology; comparison of molecular serotyping methods to conventional serotyping methods for Salmonella enterica subsp. enterica isolates from food and agricultural sources; and evaluation of contaminated irrigation water as a risk factor for contamination of hydroponically grown tomatoes. Novel molecular methods were used across the three studies, including VIDAS UP® technology, the Automated RiboPrinter, the Luminex® xMAP Salmonella serotyping assay, and pulsed-field gel electrophoresis, to detect foodborne pathogens. Results showed that a novel concentration method was effective for concentrating Salmonella spp. and E. coli O157:H7, with subsequent detection via mini VIDAS® technology. Molecular serotyping methods were unable to serotype isolates obtained from agricultural sources; however, they allowed us to identify serovars associated with food and clinical sources. Salmonella Typhimurium did not survive well in the nutrient solution of a conventional hydroponic system used in tomato production. We also discovered that continuous contamination with S. Typhimurium might lead to contamination of the root systems, but not of the leaves and fruit. This work illustrates the continuing need to evaluate production methods and pathogen detection techniques to improve the safety of fresh produce.

Item Open Access Investigation of the beef supply-chain microbiome and pathogen controls (Colorado State University. Libraries, 2015)
Yang, Xiang, author; Belk, Keith, advisor; Woerner, Dale, committee member; Yang, Hua, committee member; Reynolds, Stephen, committee member

Foodborne illness associated with pathogenic bacteria is a global public health and economic challenge. Understanding the ecology of foodborne pathogens within the meat industry is critical to mitigating this challenge. The diversity of microorganisms (pathogenic and non-pathogenic) within the food and meat industries complicates efforts to understand pathogen ecology, and little is known about the interaction of pathogens within the microbiome throughout the whole meat production chain. Here, the combined use of a metagenomics approach and shotgun sequencing technology was evaluated as a tool to detect pathogenic bacteria in different sectors of the beef production chain. Environmental samples were obtained at successive processing steps of the beef production chain: cattle entry to the feedlot (Arrival), exit from the feedlot, cattle transport trucks, abattoir holding pens, and the end of the fabrication system (Market-Ready). Log counts per million reads for all investigated pathogens (Salmonella enterica, Listeria monocytogenes, generic Escherichia coli, Staphylococcus aureus, Clostridium (C. botulinum,
C. perfringens), and Campylobacter (C. jejuni, C. coli, C. fetus)) were reduced from Arrival to Market-Ready samples, mainly due to reduced diversity within the microbiome. However, normalized counts for Salmonella enterica, E. coli, and C. botulinum were greater in Market-Ready samples, indicating that the proportion of these bacteria increases within the remaining bacterial community, likely as a result of the reduction or elimination of other bacteria by antimicrobial interventions applied during meat processing. Further characterization of the microbiome allowed identification of 63 virulence factors within 27 samples (31% of samples). From an ecological perspective, the data indicated that shotgun metagenomics can be used not only to evaluate the microbiome of samples collected from the beef production system, but also to observe shifts in pathogen populations along the beef production chain over time. However, our utilization of this approach presented challenges and highlighted a need for further refinement of the methodology. Specifically, identifying the origin of reads assigned to a specific pathogen in a diverse environmental sample containing thousands of other bacterial species can be difficult. Low coverage of pathogen whole genomes is another limitation of current next-generation sequencing technology for shotgun metagenomic data. Moreover, the identification of bacteria from metagenomic data relies heavily on the quality of public genome databases, which still need improvement. Our investigation demonstrates that although the metagenomic approach has promise, further refinement is needed before it can be used to confirm the presence of pathogens in environmental samples.
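The per-million-reads normalization behind "log counts per million reads" scales each taxon's read count by the sample's sequencing depth so that samples sequenced to different depths can be compared. A minimal sketch, with invented counts and depths (the thesis's actual pipeline is not specified here):

```python
# log2 counts-per-million (CPM) sketch: divide a taxon's read count by the
# sample's total reads, scale to one million, then log-transform. The
# pseudocount avoids log(0). Counts and depths are invented.
import math

def log_cpm(count, total_reads, pseudocount=1):
    return math.log2((count + pseudocount) / total_reads * 1_000_000)

samples = {
    "Arrival":      ({"Salmonella enterica": 480, "E. coli": 2200}, 8_500_000),
    "Market-Ready": ({"Salmonella enterica": 35,  "E. coli": 410}, 12_000_000),
}

for name, (counts, depth) in samples.items():
    for taxon, n in counts.items():
        print(f"{name:12s} {taxon:20s} log2 CPM = {log_cpm(n, depth):5.2f}")
```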
A second study compared the decontamination efficacy of a blend of sulfuric acid and sodium sulfate (SSS) versus lactic acid (LA) against Salmonella on the surface of hot beef carcasses. A total of 60 beef briskets, obtained directly from unchilled beef carcasses, were each cut into two sections (10 x 10 x 1 cm) and spot-inoculated with 200 µL of inoculum comprising a six-strain mixture of Salmonella; 15 minutes were allowed for pathogen attachment to reach a target level of approximately 5 to 6 log CFU/cm². One brisket section of each pair remained untreated, while the other was treated with SSS (21°C or 52°C) or LA (21°C or 52°C) using a custom-built spray cabinet at a pressure of 15 psi for 5 seconds. Treated samples were transferred into Whirl-Pak filter bags and held for 10 minutes to allow bactericidal activity before sampling, plating, and counting. Unheated and heated SSS lowered (P < 0.05) mean total bacterial counts on Tryptic Soy Agar (TSA) from 6.3 log CFU/cm² to 4.6 and 4.3 log CFU/cm², respectively. Likewise, unheated and heated LA reduced (P < 0.05) mean total bacterial counts on TSA from 6.3 log CFU/cm² to 4.7 and 4.4 log CFU/cm², respectively. On xylose lysine deoxycholate (XLD) agar, initial counts of inoculated Salmonella (6.1 to 6.2 log CFU/cm²) were reduced (P < 0.05) by 2.0 to 4.2 log CFU/cm² with unheated SSS, by 2.3 to 3.9 log CFU/cm² with heated SSS, and by 2.4 to 3.7 and 3.8 log CFU/cm² with unheated and heated LA, respectively. Overall, no (P > 0.05) chemical-by-temperature interaction effects on microbial reductions were detected on either TSA or XLD agar. Heating the chemical solutions led to an additional 0.3 log CFU/cm² reduction in total aerobic bacteria compared to unheated solutions. Fewer (by 0.3 log CFU/cm²) inoculated Salmonella were recovered on XLD agar from samples treated with LA than from samples treated with SSS; however, such a small numeric change was likely not biologically important. These results indicate that both unheated and heated SSS and LA are effective interventions for reducing Salmonella inoculated onto hot beef carcass surface tissue.
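Because the plate counts above are already reported on a log10 scale, a treatment's reduction is a simple difference of log counts; converting back to a fold change makes the magnitude concrete:

```python
# Log-reduction arithmetic: counts are in log10 CFU/cm^2, so the reduction
# is a difference; 10**reduction gives the fold decrease in survivors.
before_log, after_log = 6.3, 4.6   # e.g., untreated vs. unheated SSS on TSA

reduction = before_log - after_log          # 1.7 log units
fold_decrease = 10 ** reduction             # ~50-fold fewer survivors
print(f"{reduction:.1f}-log reduction = {fold_decrease:.0f}-fold decrease")
```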
Item Open Access Liver abscess effects on carcass performance and heritability estimates of liver abscess incidence and severity in beef on dairy heifers (Colorado State University. Libraries, 2024)
Zuvich, Miranda Lee, author; Enns, R. Mark, advisor; Speidel, Scott E., advisor; Woerner, Dale, committee member; Holt, Timothy N., committee member

The economic impact of liver abscesses has been reported to arise not only from the condemnation of livers but also from effects on performance. Prevention methods have been the primary focus in decreasing liver abscess prevalence because, with limited or no clinical signs present, diagnosing liver abscesses in live animals is complicated, and no prevention method has been highly effective. This study therefore aimed to identify the impacts of liver abscesses on carcass performance and to estimate the heritability of liver abscess incidence and severity in fed beef on dairy heifers. In the first study, 1,860 beef on dairy heifers were fed and harvested in Kansas. All had phenotypes for hot carcass weight (HCW; kg), rib eye area (REA; cm²), fat thickness (FT; cm), marbling score (MS), calculated visual yield grade (VYG), and liver abscess score; of the 1,860 individuals, 1,646 also had phenotypes for heart score (HS). Carcass impacts were estimated using fixed effects of liver abscess score, contemporary group, and age in days, where the contemporary group was a concatenation of kill lot and treatment. Liver abscess score was fit in two forms: 6 scores ("0", "A-", "A", "A+", "A+AD", "A+O") and 4 scores ("0", "A-", "A", "A+"), where "A+" included scores of "A+AD" and "A+O". A score of "0" indicates no abscess, and severity increases through "A-", "A", and "A+"; "A+AD" and "A+O" indicate adhesion of the liver to nearby organs and a ruptured abscess, respectively. Using the six-score model, a significant increase in FT was identified for animals scoring "A+O" compared to "A+", with respective least-squares means of 1.94 cm ± 0.12 and 1.59 cm ± 0.06 (P < 0.05). While not significant, tendencies toward greater FT in "A+O" animals relative to those scoring "A" and "A+AD" were identified (0.05 ≤ P < 0.1), with respective least-squares means of 1.61 cm ± 0.06, 1.61 cm ± 0.05, and 1.94 cm ± 0.12. A significant increase in VYG was also identified using the six-score model, with "A+O" animals grading higher than "A+" and "A+AD" animals; respective least-squares means were 3.75 ± 0.19, 3.20 ± 0.09, and 3.20 ± 0.08 (P < 0.05). Under the 4-score system, HCW was significantly lower for animals scoring "A+" than for those with non-abscessed livers: least-squares means were 396 kg ± 2.63 for animals with no abscesses and 391 kg ± 2.92 for those with severe abscesses (P < 0.05). In the second study, 1,492 beef on dairy heifers fed and harvested in Kansas had liver abscess scores and sire information. Nine models were used to estimate heritability, all with fixed effects of contemporary group, age in days, and number of bovine respiratory disease treatments; the contemporary group was again a concatenation of kill lot and treatment. Models 1, 4, and 7 used data sets with all sires represented, with liver abscess score fit as a continuous variable, as a binary score indicating abscess presence, and as a binary score indicating severe abscess ("A+") presence, respectively. Models 2, 5, and 8 followed the same respective scoring systems but included only heifers from sires with 10 or more progeny. Models 3, 6, and 9 followed the same respective scoring systems but included only heifers from sires with 100 to 200 progeny in the complete data set. Sire-model heritability estimates ranged from 4.26 × 10⁻⁸ to 1.06 × 10⁻⁷ for Models 1, 4, and 7; from 4.90 × 10⁻⁸ to 4.61 × 10⁻⁷ for Models 2, 5, and 8; and from 1.01 × 10⁻⁷ to 2.88 × 10⁻³ for Models 3, 6, and 9. All estimates indicate no genetic component to liver abscess severity or incidence in this data set.
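For context on the estimates above: in a standard sire model, the between-sire variance captures one quarter of the additive genetic variance (paternal half-sibs share a quarter of additive effects), so heritability is computed as shown below. This is the textbook relationship, not a formula quoted from the thesis, whose models may have included additional variance components.

```latex
% Sire-model heritability: sire variance is 1/4 of the additive genetic
% variance, hence the factor of 4 over total phenotypic variance.
h^2 = \frac{4\,\sigma^2_{s}}{\sigma^2_{s} + \sigma^2_{e}}
```

With estimated sire variances on the order of 10⁻⁸ relative to residual variance, h² is effectively zero, consistent with the study's conclusion of no detectable genetic component.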
Item Open Access Survey of the prevalence of conformational defects in feedlot receiving cattle in the United States (Colorado State University. Libraries, 2016)
Vollmar, Kaycee, author; Grandin, Temple, advisor; Edwards-Callaway, Lily, committee member; Engle, Terry, committee member; Ogden, Brenda, committee member; Woerner, Dale, committee member

A survey was conducted on large beef cattle feedlots in Colorado and Texas between March and July 2015 to assess the current status of conformational defects in U.S. fed steers and heifers. The objectives were: 1) to determine the prevalence of conformational defects in feedlot receiving cattle across multiple regions of the United States; and 2) to increase industry awareness of the structural problems in the current cattle population, ultimately helping to improve practical selection focus. The conformational traits of front and rear claw, front and rear feet angles, rear leg side view, and rear leg hind view were evaluated on a scale of 1-9, with scores of 4-6 considered most desirable. Overall soundness was evaluated from 0-100, with 66-100 considered optimal. A new scoring tool was developed to assess conformational problems in cattle shoulder and hip structure. Data from 2,886 head of feedlot cattle were used to evaluate the frequency of these conformational defects. Phenotypic evaluation revealed the highest prevalence of conformational issues in the shoulder, hip, and rear leg, covering multiple relationships with demographic characteristics. Of the entire sample, 49.97% had less-than-ideal shoulder structure, 53.33% had less-than-ideal hip structure, and 29.97% displayed less-than-ideal hock structure when viewed from the side. Heavier cattle showed a significantly higher (P < 0.0001) prevalence of scissor-type front claw abnormalities (scores 7-9) and more (P < 0.0001) impaired mobility scores (group 2). Northern cattle exhibited a significant (P < 0.0001) increase in scissor-type front claw defects (scores 7-9). Lastly, Bos indicus cattle displayed a higher prevalence (P < 0.0001) of round hip structures (scores 7-9) and more (P < 0.0001) impaired mobility scores (group 2). The remaining traits had significantly higher proportions in the desirable (normal) group; thus, the industry has shown positive development in rear claw set and in front and rear feet angles. Additionally, 85.85% of the total sample demonstrated overall comprehensive soundness scores for sound and flexible mobility (group 3). These findings will be useful to the beef industry in creating a benchmark for the conformational status of the current cattle herd, with the ultimate goal of improving skeletal structure for better welfare and performance in feedlot cattle.
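The headline prevalences above are simple proportions of the 2,886-head sample. A sketch that recomputes them with normal-approximation 95% confidence intervals, using affected counts back-calculated from the reported percentages (the survey reports percentages only):

```python
# Prevalence of less-than-ideal structure with a normal-approximation 95%
# CI. Affected counts are back-calculated from the reported percentages.
import math

def prevalence_ci(affected, n, z=1.96):
    p = affected / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

N = 2886
for trait, affected in (("shoulder", 1442), ("hip", 1539), ("hock", 865)):
    p, lo, hi = prevalence_ci(affected, N)
    print(f"{trait:8s} {p:6.2%} (95% CI {lo:.2%} to {hi:.2%})")
```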
Item Open Access Survival and persistence of foodborne pathogens in food residues on packaging materials and reduction of Escherichia coli O157:H7 and Salmonella in beef trimmings (Colorado State University. Libraries, 2012)
Nunnelly, Matthew Charles, author; Sofos, John N., advisor; Woerner, Dale, committee member; Kendall, Patricia, committee member

Foodborne pathogens continue to cause health problems for consumers of meat products despite efforts to control bacteria in food, and new approaches to controlling pathogens and identifying sources of contamination are needed. Among the most important foodborne pathogens affecting modern food supplies are Salmonella serotypes and Escherichia coli O157:H7, both associated with uncooked meat, and Listeria monocytogenes, a problematic organism for ready-to-eat foods. The objective of this thesis was to investigate the survival of E. coli O157:H7 and L. monocytogenes on food packaging materials soiled with meat-based residues, and to compare their behavior across different packaging materials and storage conditions. In addition, a study comparing the resistance of multidrug-resistant and susceptible Salmonella serotypes and E. coli O157:H7 on beef trimmings treated with decontaminating antimicrobials provides valuable information on the efficacy of current chemical interventions against Salmonella serotypes at the forefront of public health concerns. To evaluate pathogen survival on contaminated food packaging materials, meat-based homogenate (10% w/w) was inoculated with a multi-strain mixture of either L. monocytogenes or E. coli O157:H7, spot-inoculated onto packaging material samples, placed in new, empty petri dishes, and stored in incubators at either 4 or 25°C for up to 130 days. Samples were analyzed regularly until the end of the study. Survivors of the pathogens were recovered from several soiled packaging material types even at 123 or 130 days of storage (L. monocytogenes and E. coli O157:H7, respectively). When the decontamination of beef trimmings contaminated with multidrug-resistant and susceptible Salmonella was compared with that of E. coli O157:H7, there were very few statistically significant differences (P < 0.05) between the reduction of Salmonella and the response of E. coli O157:H7 to acidified sodium chlorite (1,000 ppm), peroxyacetic acid (200 ppm), and sodium metasilicate (40,000 ppm). In addition, there were only minor differences between the reductions of antibiotic-susceptible and multidrug-resistant Salmonella strains. Results of these studies will aid in quantifying the risks associated with contamination of food packaging materials as well as beef trimmings.

Item Open Access The effect of dam nutrient deprivation on lamb carcass characteristics, retail yields, and nutrient composition (Colorado State University. Libraries, 2012)
Brenman, Kristina Anne, author; Belk, Keith, advisor; Woerner, Dale, committee member; Engle, Terry, committee member; Mykles, Donald, committee member

The objective of this study was to determine the effect of dam nutrient restriction on offspring carcass characteristics, retail cut yields, and nutrient composition. Forty-one western white rams and ewes were obtained from a previous Colorado State University study of dam nutrient restriction. Prior to gestation, dams were fed 100% of their nutrient requirements; the diet was a vitamin- and mineral-rich pelleted beet pulp (77.8% total digestible nutrients [TDN], 90.0% dry matter [DM], and 9.4% crude protein [CP]). At 28 days of gestational age, dams were randomly assigned to individual pens and allocated to three treatments: control (100% of nutrient requirements), half ration (fed 50% of nutrient requirements from day 28 until term), and realimented (fed 50% of nutrient requirements from day 28 until day 78, then slowly realimented back to 100% for the remainder of gestation). All twin lambs were slaughtered, and hot carcass weight, 12th-rib fat, body wall thickness, adjusted fat, ribeye area, ribeye marbling, leg score, leg circumference, conformation, flank streaking, flank firmness, flank color, kidney fat weight, L*, a*, and b* were recorded. After slaughter, one half of each lamb carcass was fabricated into the following subprimals: rack, roast-ready, frenched, PSO 3x1" (IMPS 204C); shoulder, square-cut, boneless (IMPS 208); Denver ribs, skirt-off (IMPS 209A); foreshank (IMPS 210); loin, short-cut, trimmed, PSO 0x0" (IMPS 232A); flank, untrimmed (IMPS 232E); leg, hindshank (IMPS 233F); and leg, shank-off, boneless (IMPS 234A). Lastly, all lambs were analyzed for dry matter, moisture, crude protein, crude fat, ash, vitamins A and E, trace minerals, and fatty acids. No interactions were found between treatment and gender for any characteristic, so treatment and gender were analyzed separately. Lambs of nutritionally restricted ewes were smaller, with less fat; lambs of the realimented group had more fat than either the control or half-ration groups. Rams had a higher percentage of lean content than ewes, as expected. Results of this study provide insight into the effect of nutrient restriction on lamb growth and development, as well as the nutrient content of American lamb.

Item Open Access The value of U.S. beef exports and the traceability of pork in countries outside of North America (Colorado State University. Libraries, 2012)
Meisinger, Jessica, author; Belk, Keith, advisor; Pendell, Dustin, committee member; Woerner, Dale, committee member; Engle, Terry, committee member

Variation exists within beef cuts produced by U.S. beef packers for domestic and foreign markets, owing to differences in consumer expectations and product use. The objective of this study was to conduct an industry-wide survey to identify commonality among and between U.S. beef processor specifications, to identify differences between products sent to different countries, and to determine a more accurate value of beef exports. Countries with an Export Verification (EV) program require suppliers to be certified with the United States Department of Agriculture and to submit information about exported products. The EV information was collected and used to determine the countries receiving the highest volume of U.S. product, as well as the meat cuts common in each country.
The data were also used to assign prices to individual products to ascertain export value. Because these documents do not show individual differences in how companies cut beef products, four countries representing significant U.S. beef export markets (Japan, Mexico, Hong Kong, and Taiwan) were visited. During these visits, product was visually checked and compared against known Institutional Meat Purchase Specifications (IMPS). Animal diseases and related food safety issues have become concerns to many people in the last decade, and traceability is becoming increasingly important throughout the world as a way to control disease outbreaks before they have devastating effects on a country's livestock industries. The objective of this review was to discuss swine identification and traceability systems outside North America.