Browsing by Author "Belk, Keith, advisor"
Now showing 1 - 13 of 13
Item Open Access Characterization of the resistome and microbiome of retail meats processed from carcasses of conventionally and naturally raised cattle (Colorado State University. Libraries, 2019). Thomas, Kevin, author; Belk, Keith, advisor; Morley, Paul, committee member; Metcalf, Jessica, committee member.
Concern over human exposure to antimicrobial resistance (AMR) via consumption of meat products has raised questions about use of antimicrobial drugs in food-animal production. This concern has led to an increase in consumer demand for meat products from naturally raised cattle, i.e., those raised without use of antimicrobials. While previous studies have assessed AMR gene presence in cattle and throughout the beef supply chain, very little work has surveyed the resistome of retail meats available for consumer purchase. The objective of this study was to determine the extent of antimicrobial resistance and to characterize the microbiome in retail ground beef products from naturally raised (raised without antibiotics) and conventionally raised cattle, utilizing 16S rRNA and targeted shotgun metagenomic, high-throughput sequencing techniques. Samples of ground beef, differing in packaging type and lean point, derived from carcasses of conventionally raised (n = 50) or naturally raised (n = 50) cattle were purchased from retail outlets in six major metropolitan cities throughout the United States. Samples were shipped to Colorado State University and processed following 48 hours of refrigeration at 4°C. Thirty-gram portions of each sample were removed and subjected to DNA extraction via the DNeasy PowerFecal Microbial Kit. Cell lysates were composited by production system and city before being subjected to paired-end 16S rRNA gene sequencing and targeted shotgun metagenomic sequencing using an enrichment system developed in our laboratory. Microbiome analysis was performed on 16S data with QIIME2 v.2018.4, utilizing many of the available plugins. Resistome analysis of enriched metagenomic data was performed using a modified AMRPlusPlus pipeline. Microbiome alpha diversity analysis indicated that ground beef processed from conventionally raised animals had greater (P < 0.05) species richness than natural ground beef products. Microbiome composition differed (P < 0.05) between samples from different production systems based on abundance-weighted UniFrac distances. Additionally, when analyzed using unweighted UniFrac distances, microbial composition differed (P < 0.05) between samples from different cities. Differences in product packaging availability between cities, as well as environmental contamination or product handling during distribution, may explain these detected differences in microbiome composition. Targeted shotgun sequencing yielded a total of 4.6 trillion reads across all 60 composite samples, with only 58 samples containing hits to AMR. Of these 58 samples, 10.1 million reads were assigned to 520 groups, 101 mechanisms of resistance, and 22 classes of antibiotics. The three most abundant classes of resistance detected were tetracyclines (56% of assigned reads), multi-drug resistance (21% of reads), and beta-lactams (7% of reads). An analysis of similarity on samples ordinated using Euclidean distances suggested that the overall resistome differed (P < 0.05) by production system, likely driven by greater antimicrobial resistance group variation among conventional retail samples. Results from this study profiled resistance and characterized microbial composition of retail beef products from two major production practices. While the results do not discredit concern over imprudent use of antibiotics in beef production, differing management techniques in cattle production do not appear to have a direct impact on the resistome or microbiome of final retail products available to consumers.
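The alpha-diversity comparison described in the abstract above (species richness of conventional versus natural ground beef) was run in QIIME2; as a rough illustration of that kind of test only, the sketch below compares observed richness between two groups of samples using a Mann-Whitney U test. The count table, sample names, and group labels are invented for the example and are not the study's data or pipeline.

```python
# Illustrative sketch only: compare observed species richness between two
# production systems, analogous to the alpha-diversity test described above.
# All counts and sample names below are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import mannwhitneyu

# Rows = samples, columns = taxa; values = read counts (hypothetical).
counts = pd.DataFrame(
    np.random.default_rng(0).poisson(0.8, size=(10, 200)),
    index=[f"conv_{i}" for i in range(5)] + [f"nat_{i}" for i in range(5)],
)
groups = ["conventional"] * 5 + ["natural"] * 5

# Observed richness = number of taxa with at least one read per sample.
richness = (counts > 0).sum(axis=1)

conv = richness[[g == "conventional" for g in groups]]
nat = richness[[g == "natural" for g in groups]]

stat, p = mannwhitneyu(conv, nat, alternative="two-sided")
print(f"median richness: conventional={conv.median()}, natural={nat.median()}")
print(f"Mann-Whitney U = {stat:.1f}, P = {p:.3f}")
```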
Item Unknown Determination of antibiotic, ß-agonist, and non-steroidal anti-inflammatory drug residues in ground beef from USDA certified organic, natural, conventional, and market cow and bull sources (Colorado State University. Libraries, 2009). Bowling, Mitchell Brett, author; Belk, Keith, advisor.
In recent years, consumer demand for organic and "natural" products has increased, partly due to a perception that such products are healthier and contain fewer additives, including veterinary drugs and growth promotants. The study presented herein compared occurrence of veterinary drug residues in ground beef samples reflecting different livestock production classifications. We collected ground beef samples (N = 400) consisting of 90.0 ± 4.0% lean muscle tissue from a total of eight plants, two each reflecting production in the following categories: (1) USDA Certified Organic (n = 100); (2) USDA Process Verified Never Ever 3 (n = 100); (3) conventionally raised fed beef (n = 100); and (4) ground beef derived from carcasses of market cows and bulls (n = 100). Liquid chromatography coupled with triple-quadrupole mass spectrometry (UPLC-MS) methods were developed for the following veterinary drugs: (1) Aminoglycosides (Gentamicin, Amikacin, and Neomycin); (2) ß-lactams (Penicillin, Ampicillin, and Desfuroylceftiofur); (3) Fluoroquinolones (Danofloxacin and Ciprofloxacin); (4) Macrolides (Erythromycin, Tylosin, and Tilmicosin); (5) Phenicols (Florfenicol); (6) Sulfonamides (Sulfamethazine and Sulfadimethoxine); (7) Tetracyclines (Oxytetracycline, Chlortetracycline, and Tetracycline); (8) Streptogramins (Virginiamycin); (9) ß-agonists (Ractopamine and Zilpaterol); and (10) non-steroidal anti-inflammatory drugs (Flunixin and Phenylbutazone). Residues exceeding their respective US tolerance limits were found in six ground beef samples. Two USDA Certified Organic samples contained Ampicillin residues exceeding US tolerance limits. One USDA Process Verified Never Ever 3 sample contained a residue of Ractopamine exceeding US tolerance limits. In the market cow and bull category, one sample contained a residue of Sulfadimethoxine that exceeded US tolerance limits, one contained a residue of Ampicillin that exceeded US tolerance limits, and one contained a residue of Phenylbutazone that exceeded US tolerance limits. A residue of Phenylbutazone exceeding US tolerance limits was also found in one sample from the conventional production category. Additionally, residues (below the US tolerance limit) of several classes of veterinary drugs were found in samples from the USDA Certified Organic and USDA Process Verified Never Ever 3 production categories, a finding that clearly demonstrates violation of zero-tolerance statutes set forth by the National Organic Program and USDA Process Verified Never Ever 3 marketing descriptors. In the USDA Certified Organic production category, residues were detected in eight samples for Ampicillin, seven for Penicillin, three for Sulfamethazine, one for Sulfadimethoxine, and one for Ractopamine. In the USDA Process Verified Never Ever 3 production category, residues were detected in one sample for Ampicillin, one for Chlortetracycline, two for Tetracycline, and six for Ractopamine. These violations exceed the historical prevalence of veterinary drug residues reported by the National Residue Program and demonstrate the need for careful monitoring of animals administered veterinary drugs in order to prevent improper inclusion of unqualified animals in premium marketing programs, such as the USDA Certified Organic and USDA Process Verified Never Ever 3 programs.
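As a toy illustration of the kind of tolerance screening described in the residue study above, the sketch below flags samples whose measured residue concentration exceeds a tolerance threshold. The drug names are taken from the abstract, but every concentration and threshold value is a placeholder, not an actual US tolerance limit.

```python
# Illustrative only: flag residue results above a tolerance threshold.
# Thresholds and measured values below are placeholders, NOT real US tolerances.
import pandas as pd

tolerance_ppb = {"Ampicillin": 10.0, "Ractopamine": 30.0, "Phenylbutazone": 0.0}  # hypothetical

results = pd.DataFrame(
    {
        "sample": ["ORG-001", "NE3-014", "CB-087"],
        "drug": ["Ampicillin", "Ractopamine", "Phenylbutazone"],
        "measured_ppb": [14.2, 8.5, 1.1],  # hypothetical measurements
    }
)

results["tolerance_ppb"] = results["drug"].map(tolerance_ppb)
results["violative"] = results["measured_ppb"] > results["tolerance_ppb"]
print(results)
```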
Item Open Access Driving change: the 2011 National Beef Quality Audit (Colorado State University. Libraries, 2012). Igo, Jessica Leigh, author; Belk, Keith, advisor; Tatum, Daryl, committee member; Woerner, Dale, committee member; Chapman, Phillip, committee member.
The National Beef Quality Audit - 2011 evaluated the current status and progress being made towards quality and consistency of cattle, carcasses, and beef products produced by the U.S. fed beef population since the introduction of the National Beef Quality Audit in 1991. The objectives of this research were to determine how each beef market sector defines seven identified quality categories, to estimate willingness to pay (WTP) for specified quality categories within each beef market sector, and to establish a best-worst (BW) scaling for the identified quality attributes. Face-to-face interviews were conducted using a modern, dynamic routing instrument over an 11-mo period (February to December 2011) with representatives of the following beef market sectors: Government and Allied Industries (n = 47); Feeders (n = 59); Packers (n = 26); Food Service, Distribution, and Further Processors (n = 48); and Retailers (n = 30). To accomplish the objectives, all responses were characterized using seven pre-established quality categories as the basis for asking interviewees the WTP and BW scaling questions. To determine WTP of the beef market sectors for U.S. fed beef, it was first important to understand what "quality" meant to each sector as it related to the U.S. fed beef products they purchase. To achieve this, "quality" was divided into seven pre-established categories: (1) How and where the cattle were raised, (2) Lean, fat, and bone, (3) Weight and size, (4) Cattle genetics, (5) Visual characteristics, (6) Food safety, and (7) Eating satisfaction, and interviewees in each beef market sector were asked to explain exactly which quality-related details/practices were important within each category. Overall, "Food safety" was the attribute of greatest importance to all beef market sectors except Feeders, who ranked "How and where the cattle were raised" as the most important. "Eating satisfaction" was the second most important attribute for all beef market sectors except Feeders, who ranked "Weight and size" second. Overall, "How and where the cattle were raised" had the greatest odds (0.25) of being considered a "non-negotiable requirement" before the raw material for each sector would be considered at all for purchase, and differed (P < 0.05) from "Visual characteristics" (0.14), "Lean, fat, and bone" (0.12), "Eating satisfaction" (0.12), "Cattle genetics" (0.10), and "Weight and size" (0.06). Across all market sectors combined, "Eating satisfaction" garnered the highest average percentage premium (11.1%), but differed (P < 0.05) only from "Weight and size" (8.8%). Most notably, when "Food safety" was identified as a "non-negotiable requirement," no sector was willing to purchase the product at a discounted price if the "Food safety" of the product could not be assured.
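Best-worst scaling responses like those collected in the interviews above are often summarized, in the simplest analysis, as a standardized best-minus-worst score per attribute. The study itself reports modeled odds, so the sketch below is only a generic illustration of the counting approach, with invented choice data.

```python
# Generic best-worst (BW) scaling count analysis; data are invented.
# Score = (times chosen best - times chosen worst) / times the attribute appeared.
from collections import Counter

attributes = ["Food safety", "Eating satisfaction", "Weight and size"]
# Each tuple is one respondent's (best, worst) pick from a choice set
# containing all three attributes (hypothetical responses).
choices = [
    ("Food safety", "Weight and size"),
    ("Food safety", "Weight and size"),
    ("Eating satisfaction", "Weight and size"),
    ("Food safety", "Eating satisfaction"),
]

best = Counter(b for b, _ in choices)
worst = Counter(w for _, w in choices)
appearances = len(choices)  # every attribute appears in every choice set here

for attr in attributes:
    score = (best[attr] - worst[attr]) / appearances
    print(f"{attr}: BW score = {score:+.2f}")
```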
Item Open Access Effect of dietary beta-agonist supplementation on live performance, carcass characteristics, carcass fabrication yields, and strip loin tenderness and sensory traits (Colorado State University. Libraries, 2012). Arp, Travis Steven, author; Belk, Keith, advisor; Woerner, Dale, committee member; Tatum, J. Daryl, committee member; Pendell, Dustin, committee member.
Beef steers (n = 3,906) were fed at a commercial feed yard to evaluate the effects of beta-adrenergic agonist (βAA) supplementation on live performance, carcass characteristics, carcass fabrication yield, and strip loin tenderness and palatability. Steers were weighed and ultrasonic carcass measurements were collected for allocation into four feeding blocks. Within each block, approximately 100 steers were assigned to each pen, and each pen was assigned one of five treatments: no beta-agonist (CON); ractopamine hydrochloride (RH) fed at 200 mg/hd/d for the final 30 d of finishing (RAC200); RH fed at 300 mg/hd/d for the final 30 d of finishing (RAC300); RH fed as a 400 mg/hd/d top dress for the final 30 d of finishing (RAC400); and zilpaterol hydrochloride (ZH) fed at 6.8 g/ton beginning 23 d before slaughter, with a withdrawal period starting 3 d before slaughter (ZIL). The study design included eight replicates (pens) per treatment (two per block). Feeding blocks were harvested in consecutive weeks. Each week, carcass parameters were measured and strip loin samples were collected from 18 carcasses per pen (720 total samples) for Warner-Bratzler shear force (WBSF), slice shear force (SSF), and trained sensory analysis. Subsamples of eight carcasses per pen (320 total samples) were selected for whole carcass fabrication yield. Final BW was not affected by treatment (P = 0.2892), but there was a tendency for cattle receiving βAA supplementation to be heavier compared to controls (P = 0.0681). Average daily gain and F:G ratio were improved with βAA treatment (P < 0.05). Carcasses from the ZIL and RAC400 treatments had the heaviest HCW and were significantly heavier than those from the CON and RAC200 treatments (P < 0.05). Carcasses from the ZIL treatment also had the highest dressing percentage and the largest LMA compared to all other treatments (P < 0.05). USDA yield grade and marbling score were reduced due to βAA supplementation (P < 0.05). Differences in marbling score reduced the frequency of carcasses qualifying for the CAB premium in βAA-treated cattle (P < 0.05), and also accounted for a decrease in the frequency of carcasses grading Choice and an increase in the percentage of carcasses grading Select for cattle receiving βAA supplementation compared to controls (P < 0.05). The percentage of YG1 carcasses was increased and the frequency of YG3 carcasses was decreased due to βAA treatment (P < 0.05). Treatment with dietary βAA elicited the greatest response in subprimal yield in cuts from the round. Carcasses from the ZIL treatment had the highest total saleable yield, greater than that of all RAC treatments (P < 0.05). WBSF and SSF were affected by treatment (P < 0.05), with shear force values increasing with increased dose and potency of βAAs. Likewise, the percentage of steaks shearing greater than 4.4 and 20 kg for WBSF and SSF, respectively, was increased with βAA supplementation (P < 0.05). Trained sensory panelists rated tenderness attributes lower for steaks from βAA treatments (P < 0.05); no differences were detected by panelists for juiciness or beef flavor attributes.
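The tenderness results above report the percentage of steaks shearing above 4.4 kg (WBSF) and 20 kg (SSF); the sketch below shows that simple threshold calculation on invented shear-force values, grouped by treatment, and is not the study's dataset or statistical model.

```python
# Percentage of steaks exceeding tenderness thresholds, by treatment.
# Shear-force values below are invented; thresholds follow the abstract
# (WBSF > 4.4 kg, SSF > 20 kg).
import pandas as pd

steaks = pd.DataFrame(
    {
        "treatment": ["CON", "CON", "RAC200", "RAC200", "ZIL", "ZIL"],
        "wbsf_kg": [3.8, 4.1, 4.5, 4.0, 4.9, 4.6],
        "ssf_kg": [17.0, 19.5, 21.2, 18.9, 23.4, 20.8],
    }
)

summary = steaks.groupby("treatment").agg(
    pct_tough_wbsf=("wbsf_kg", lambda x: 100 * (x > 4.4).mean()),
    pct_tough_ssf=("ssf_kg", lambda x: 100 * (x > 20.0).mean()),
)
print(summary)
```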
Item Unknown Effects of antibiotic treatment strategies on feedlot cattle resistome and microbiome (Colorado State University. Libraries, 2016). Weinroth, Margaret, author; Belk, Keith, advisor; Morley, Paul, committee member; Martin, Jennifer, committee member.
The objective of this study was to evaluate resistome and microbiome changes in feedlot cattle exposed to commonly used antimicrobials. Sixteen pens of cattle (N = 16) were randomly assigned to one of four antimicrobial treatments (n = 4 pens per treatment), resulting in a complete 2 x 2 factorial arrangement. The first factor was administration of ceftiofur crystalline free acid (CCFA) to either the entire pen of animals (high exposure) or to one animal in the pen (low exposure). The second factor was subsequent in-feed administration of chlortetracycline (CTC) to the entire pen of cattle, or no CTC administration. Rectal fecal samples were collected from individual cattle within each pen on days 0 and 26. DNA was extracted from individual fecal samples and pooled by DNA mass, so that each pen had one composite sample on day 0 and one on day 26. DNA was sequenced on an Illumina HiSeq 2000. Sequencing data (also known as reads) were aligned to a comprehensive antimicrobial resistance gene database and assigned taxonomic labels. Sixty-eight antimicrobial resistance genes and 431 species were identified across all samples. Resistance to tetracycline was identified as the primary resistance at the class level (66.9%), with resistance to macrolide-lincosamide-streptogramin B (MLS) making up the majority of the remainder (26.2%). Resistance to tetracycline and aminoglycosides in the feces decreased (P < 0.05) in relative abundance from day 0 to day 26 when the cattle were fed CTC, regardless of CCFA exposure. Beta-lactams were the only class of resistance affected by CCFA treatment, with low-exposure CCFA pens exhibiting a smaller (P < 0.05) resistome on day 26 than high-exposure CCFA pens, regardless of CTC treatment. These results indicate that cattle exposure to tetracycline may not be directly associated with tetracycline resistance in their feces; further research is needed to explore this relationship. Additionally, the decrease in aminoglycoside resistance, despite no cattle being exposed to any aminoglycosides during the study, raises the possibility of co-selection of resistance genes. Overall, microbiome relative abundance did not differ (P > 0.05) due to CCFA or CTC treatments but differed (P < 0.05) between day 0 and day 26. Changes in the microbiome over time affected all 19 phyla identified when all treatments were pooled together. It has been well established in humans that antimicrobial treatment changes the microbiome (Khoruts et al., 2010; Preidis and Versalovic, 2009). While these findings are not as robust in livestock, ongoing investigation of these same effects may lead to further understanding of how the livestock microbiome responds to antibiotics.
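The resistome results above are expressed as relative abundance of antimicrobial-resistance classes compared between day 0 and day 26. As an illustration of that bookkeeping only (not the study's actual pipeline or statistics), the sketch below aggregates hypothetical gene-level read counts to class level, converts them to relative abundance, and runs a paired Wilcoxon test for one class.

```python
# Illustration only: class-level relative abundance of AMR genes and a paired
# day-0 vs day-26 comparison. Counts and gene-to-class mapping are invented.
import pandas as pd
from scipy.stats import wilcoxon

gene_to_class = {"tetQ": "Tetracyclines", "tetW": "Tetracyclines", "ermB": "MLS"}

# Rows = pen composites; columns = AMR genes; values = aligned read counts.
day0 = pd.DataFrame({"tetQ": [120, 90, 200, 150], "tetW": [80, 60, 110, 95], "ermB": [40, 30, 70, 55]})
day26 = pd.DataFrame({"tetQ": [60, 50, 90, 80], "tetW": [40, 35, 60, 50], "ermB": [45, 38, 80, 66]})

def class_relative_abundance(counts: pd.DataFrame) -> pd.DataFrame:
    by_class = counts.T.groupby(gene_to_class).sum().T  # sum genes within each class
    return by_class.div(by_class.sum(axis=1), axis=0)   # proportion per sample

rel0 = class_relative_abundance(day0)
rel26 = class_relative_abundance(day26)

stat, p = wilcoxon(rel0["Tetracyclines"], rel26["Tetracyclines"])
print(rel0.mean(), rel26.mean(), sep="\n")
print(f"Paired Wilcoxon for the tetracycline class: W = {stat}, P = {p:.3f}")
```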
Item Open Access Efficacy of sulfuric acid sodium sulfate on inoculated populations of Salmonella spp. and Campylobacter spp. on pork subprimals, and its effects on natural spoilage microflora, lean discoloration and off-odors (Colorado State University. Libraries, 2016). McCullough, Kathryn Rose, author; Belk, Keith, advisor; Morley, Paul, committee member; Delmore, Robert, committee member; Yang, Hua, committee member.
Salmonella and Campylobacter are pathogens commonly associated with foodborne illness. As these pathogens are often found in fresh pork, efforts to reduce or eliminate them are imperative to the pork industry. Additionally, fresh pork is highly perishable, so extending its shelf life is important to maintain product profitability and desirability. Although a variety of attributes can determine pork shelf life, reducing spoilage microflora is an important quality control point. Therefore, this study was conducted to determine the efficacy of applying sulfuric acid sodium sulfate (SA) to reduce inoculated populations of Salmonella spp. and Campylobacter spp. on pork subprimals. Additionally, this study aimed to determine the efficacy of SA application against inoculated populations of non-pathogenic Escherichia coli that could then serve as surrogates for Salmonella spp. and Campylobacter spp. on pork in in-plant trials (Experiment 1). This study also was conducted to determine effects of an SA spray on the natural spoilage microflora, off-odor characteristics, and discoloration properties of pork subprimals during vacuum storage and simulated retail display (Experiment 2). Finally, SA was evaluated in a commercial pork in-plant system against the natural microflora and inoculated populations of surrogate bacteria (Experiment 3). For Experiment 1, vacuum-packaged pork subprimals were obtained from a local retailer less than 10 days postmortem. Entire subprimals were cut into uniform sample pieces and assigned to one of the following treatments: 1.0 pH SA, 1.5 pH SA, water, or an untreated control. Samples were inoculated with cocktails before treatment to a target level of 6 log CFU/g for Salmonella spp. and surrogate E. coli, or 5.5 log CFU/g for Campylobacter spp. Surviving pathogen and non-pathogenic E. coli populations were determined at 5 minutes post-treatment and at 24 h post-treatment. For Experiment 2, boneless pork loins and bone-in backribs were obtained from a commercial pork processing facility and treated with a topical spray of SA at 1.5 pH or 1.0 pH, or left as an untreated control. After treatment, all samples were placed in dark, refrigerated storage for 14 d or 21 d, after which one half of the samples were removed from storage, overwrapped with polyvinyl chloride film, and placed into retail display cases maintained at 4°C (±2°C) for up to 96 h. At 12 h intervals for the duration of simulated retail display, trained panelists evaluated percent discoloration. Additionally, at 0, 48, and 96 h of display, trained panelists evaluated intensity of off-odors, and populations of psychrotrophic bacteria, Pseudomonas, lactic acid bacteria, and yeasts and molds were plated and enumerated. For Experiment 3, 60 carcasses were railed off and marked strategically with 5 x 10 cm zones. Half of the zones were inoculated with the surrogate bacteria; the other half remained uninoculated. Carcasses were then treated with SA using a commercial application spray cabinet. For Experiment 1, application of 1.0 pH SA was the most effective (P < 0.05) at reducing inoculated populations of both Salmonella spp. and Campylobacter spp. compared to all other treatments. However, no difference (P > 0.05) was observed between Campylobacter and surrogate bacterial populations determined at 5 min post-treatment and those determined at 24 h. Additionally, non-pathogenic E. coli strains were affected less by treatment than inoculated Salmonella spp. and Campylobacter spp. populations and can, therefore, effectively serve as surrogates for Salmonella spp. and Campylobacter spp. For Experiment 2, after 14 and 21 d of dark storage, both boneless loins and backribs sprayed with 1.0 pH SA had lower (P < 0.05) psychrotroph, Pseudomonas, lactic acid bacteria, and yeast and mold populations than control or 1.5 pH-treated samples at 0, 48, and 96 h of display. Percent discoloration of boneless loin chops increased over the duration of retail display for products stored for 14 and 21 d before simulated retail display. Boneless loin chops treated with 1.0 pH SA had greater percent discoloration at each simulated retail display test time than untreated chops or those sprayed with 1.5 pH SA. For Experiment 3, SA effectively lowered (P < 0.05) both inoculated surrogate and naturally occurring bacterial populations (TPC, EB, TCC, and ECC) on pork carcasses. However, treatment with 1.0 pH SA was more effective than treatment with 1.3 pH SA.
Item Open Access Impact of antibiotic use on resistance in beef feedlot and dairy cattle (Colorado State University. Libraries, 2017). Rovira Sanz, Pablo, author; Belk, Keith, advisor; Morley, Paul, committee member; Schmidt, John, committee member; Yang, Hua, committee member.
In recent years, consumer demand for natural and organic foods has increased, partly due to concerns about the use of antimicrobials in food-producing animals. The aim of this study was to evaluate antimicrobial resistance (AMR) in beef feedlot and dairy cattle raised without use of antibiotics (RWA) compared to cattle raised in conventional (CONV) production. Three research projects were conducted to accomplish that general goal. In the first study, a conventional feedlot, a natural feedlot, a conventional dairy, and an organic dairy were visited to collect cattle feces, wastewater from lagoons, and soil where the wastewater was applied. After DNA extraction, sequencing, and processing, metagenomic reads were aligned to reference databases for identification of antibiotic resistance genes (ARGs; i.e., the resistome) and bacteria (the microbiome). Resistome composition was influenced by rearing method, cattle type, and type of sample. Most mechanisms of resistance affected by rearing method were enriched (P < 0.05) in conventional samples. Resistome differences by rearing method were greatest for wastewater samples, but with contradictory results that suggested an impact of effluent management on the wastewater resistome. Resistance to the tetracycline and macrolide-lincosamide-streptogramin classes was more abundant in feces of feedlot cattle than of dairy cattle (P < 0.05), whereas resistance to beta-lactams was greatest in feces of dairy cattle (P < 0.05). The resistome and microbiome of feces differed (P < 0.05) from those of wastewater and soil samples. Results indicated that ARGs are widespread in beef feedlot and dairy operations, even those with restricted antibiotic use. In the second study, feces from RWA (n = 36) and CONV (n = 36) cattle lots were recovered from colons at a commercial beef processing plant. Samples were equally distributed by month and production protocol over one year (3 samples/production protocol/month). After DNA was extracted from individual samples, composite samples were prepared by mixing DNA from each lot into a single composite sample (N = 72), and the composites were sequenced on an Illumina platform. Metagenomic reads were processed similarly to those in the first study for identification of ARGs and bacteria. Resistomes of CONV and RWA cattle were significantly different by season. In general, mechanisms conferring resistance to beta-lactams, tetracyclines, macrolides, and multiple drugs were more prevalent (P < 0.05) in feces from CONV colons than from RWA colons. In the third study, a systematic review and meta-analysis was performed to assess the relationship between antimicrobial use (AMU) and AMR in feedlot cattle. After a literature search and screening of reported studies, 32 studies that addressed AMR in Escherichia coli, Enterococcus, Salmonella, Campylobacter, and Mannheimia haemolytica were selected. Overall, 60% (95% CI: 26% to 88%) of the observational studies and 50% (95% CI: 30% to 70%) of the controlled trials reported a positive association between AMU and AMR. Meta-analysis provided evidence for an increase in average relative risk (RR) associated with antibiotic use: isolates recovered from treated cattle were 2.5 times (95% confidence interval: 1.7 to 3.5) as likely to display antibiotic resistance as isolates recovered from unexposed animals. Risk of resistance increased with animal-defined daily doses (DDDs). More comprehensive studies that consider the relationship between antibiotic use in cattle and antibiotic-resistant bacteria in humans are needed as part of a farm-to-fork approach to tackle antimicrobial resistance.
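The meta-analysis above reports a pooled relative risk of 2.5 (95% CI: 1.7 to 3.5). As a generic illustration of how such a pooled estimate can be obtained (not the authors' actual model or data), the sketch below combines per-study log relative risks with fixed-effect inverse-variance weighting.

```python
# Generic fixed-effect inverse-variance meta-analysis of relative risks.
# Per-study RRs and confidence intervals below are invented.
import numpy as np

# Each row: (RR, lower 95% CI, upper 95% CI) for one hypothetical study.
studies = np.array([
    [2.1, 1.2, 3.7],
    [3.0, 1.5, 6.0],
    [1.8, 0.9, 3.6],
])

log_rr = np.log(studies[:, 0])
# Standard error recovered from the CI width on the log scale.
se = (np.log(studies[:, 2]) - np.log(studies[:, 1])) / (2 * 1.96)
w = 1 / se**2                      # inverse-variance weights

pooled_log = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
ci = np.exp(pooled_log + np.array([-1.96, 1.96]) * pooled_se)

print(f"Pooled RR = {np.exp(pooled_log):.2f} (95% CI: {ci[0]:.2f} to {ci[1]:.2f})")
```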
Item Unknown Investigation of the beef supply-chain microbiome and pathogen controls (Colorado State University. Libraries, 2015). Yang, Xiang, author; Belk, Keith, advisor; Woerner, Dale, committee member; Yang, Hua, committee member; Reynolds, Stephen, committee member.
Foodborne illness associated with pathogenic bacteria is a global public health and economic challenge. Understanding the ecology of foodborne pathogens within the meat industry is critical to mitigating this challenge. The diversity of microorganisms (pathogenic and non-pathogenic) that exists within the food and meat industries complicates efforts to understand pathogen ecology. Further, little is known about the interaction of pathogens within the microbiome throughout the whole meat production chain. Here, the combined use of a metagenomics approach and shotgun sequencing technology was evaluated as a tool to detect pathogenic bacteria in different sectors of the beef production chain. Environmental samples were obtained at different longitudinal processing steps of the beef production chain: cattle entry to feedlot (Arrival), exit from feedlot, cattle transport trucks, abattoir holding pens, and the end of the fabrication system (Market-Ready). Log-transformed counts per million reads for all investigated pathogens (Salmonella enterica, Listeria monocytogenes, generic Escherichia coli, Staphylococcus aureus, Clostridium (C. botulinum, C. perfringens), and Campylobacter (C. jejuni, C. coli, C. fetus)) were reduced from Arrival to Market-Ready samples, mainly due to reduced diversity within the microbiome. Further, normalized counts for Salmonella enterica, E. coli, and C. botulinum were greater in Market-Ready samples. This indicates that the proportion of these bacteria increases within the remaining bacterial community, likely as a result of a reduction or elimination of other bacteria via antimicrobial interventions applied during meat processing. Further characterization of the microbiome allowed for the identification of 63 virulence factors within 27 samples (31% of samples). From an ecological perspective, data indicated that shotgun metagenomics can be used not only to evaluate the microbiome of samples collected from the beef production system, but also to observe shifts in pathogen populations along the beef production chain over time. However, our utilization of this approach presented challenges and highlighted a need for further refinement of this methodology. Specifically, identifying the origin of reads assigned to a specific pathogen in a diverse environmental sample containing thousands of other bacterial species can be difficult. Additionally, low coverage of pathogen whole genomes is another limitation of current next-generation sequencing technology for shotgun metagenomic data. Moreover, the identification of bacteria from metagenomic data relies heavily on the quality of public genome databases, which still need to be improved. Our investigation demonstrates that although the metagenomic approach has promise, further refinement is needed before it can be used to confirm the presence of pathogens in environmental samples. A second study was conducted to compare the decontamination efficacy of a blend of sulfuric acid and sodium sulfate (SSS) or lactic acid (LA) against Salmonella on the surface of hot beef carcasses. A total of 60 beef briskets, obtained directly from unchilled beef carcasses, were each cut into two sections (10 x 10 x 1 cm) and spot-inoculated with 200 µl of inoculum comprised of a six-strain mixture of Salmonella, then allowed 15 minutes for pathogen attachment, to reach a target level of approximately 5 to 6 log CFU/cm2. One brisket section (of the pair) remained untreated, while the other section was treated using a custom-built spray cabinet that sprayed either SSS (at 21°C or 52°C) or LA (at 21°C or 52°C) at a pressure of 15 psi for 5 seconds. Treated samples were transferred into Whirl-Pak filter bags and held for 10 minutes to allow bactericidal activity before sampling, plating, and counting. Unheated and heated SSS lowered (P < 0.05) mean total bacterial counts on Tryptic Soy Agar (TSA) from 6.3 log CFU/cm2 to 4.6 and 4.3 log CFU/cm2, respectively. Likewise, unheated and heated LA reduced (P < 0.05) mean total bacterial counts on TSA from 6.3 log CFU/cm2 to 4.7 and 4.4 log CFU/cm2, respectively. On xylose lysine deoxycholate (XLD) agar, initial counts of inoculated Salmonella (6.1 to 6.2 log CFU/cm2) were reduced (P < 0.05) by 2.0 to 4.2 log CFU/cm2 with unheated SSS, by 2.3 to 3.9 log CFU/cm2 with heated SSS, and by 2.4 to 3.7 and 3.8 log CFU/cm2 with unheated and heated LA, respectively. Overall, no (P > 0.05) chemical-by-temperature interaction effects on microbial reductions were detected on either TSA or XLD. Heating the chemical solutions led to an additional 0.3 log CFU/cm2 reduction in total aerobic bacteria compared to unheated solutions. Fewer (by 0.3 log CFU/cm2) inoculated Salmonella were recovered on XLD agar from samples treated with LA than from samples treated with SSS. However, such a small numeric change was likely not biologically important. These results indicated that both unheated and heated SSS and LA are effective interventions for reducing Salmonella inoculated onto hot beef carcass surface tissue.
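The carcass-intervention results above are reported as log CFU/cm2 reductions relative to untreated sections. The sketch below shows the standard plate-count arithmetic behind such numbers, using invented colony counts and assumed sampling parameters (excision area, diluent volume, plated volume), not the study's raw data.

```python
# Standard plate-count arithmetic for surface sampling (illustrative values).
# Assumptions: a 100 cm2 excision rinsed in 100 ml diluent, 0.1 ml plated.
import math

AREA_CM2 = 100.0      # sampled surface area (assumed)
DILUENT_ML = 100.0    # rinse volume (assumed)
PLATED_ML = 0.1       # volume spread on the plate (assumed)

def log_cfu_per_cm2(colonies: int, dilution: int) -> float:
    """Convert a colony count at a given 10^dilution to log10 CFU/cm2."""
    cfu_per_ml = colonies * (10 ** dilution) / PLATED_ML
    cfu_per_cm2 = cfu_per_ml * DILUENT_ML / AREA_CM2
    return math.log10(cfu_per_cm2)

untreated = log_cfu_per_cm2(colonies=210, dilution=3)   # hypothetical counts
treated = log_cfu_per_cm2(colonies=95, dilution=1)

print(f"untreated: {untreated:.2f} log CFU/cm2")
print(f"treated:   {treated:.2f} log CFU/cm2")
print(f"reduction: {untreated - treated:.2f} log CFU/cm2")
```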
Item Open Access National Beef Quality Audit 2016 face to face interviews and validation of HPP pathogen destruction for use in raw pet food (Colorado State University. Libraries, 2017). Hasty, Joshua D., author; Woerner, Dale, advisor; Belk, Keith, advisor; Martin, Jennifer, committee member; Morley, Paul, committee member; Delmore, Robert, committee member.
The two studies described in this dissertation were (1) the National Beef Quality Audit (NBQA) and (2) the validation of high pressure processing (HPP) pathogen destruction for use in raw pet food. The NBQA is conducted every five years; the 2016 face-to-face interviews gauged the status and progress of the live cattle production industry in improving overall quality and consistency of beef, using the procedures set forth in NBQA-2011. This was the first time that the audit of fed steers and heifers was combined with an audit of market cow and bull beef. Face-to-face interviews were designed to elicit definitions of beef quality, estimate willingness to pay (WTP) for quality attributes, establish relative importance (RI) rankings for important quality factors, and assess images, strengths, weaknesses, potential threats (SWOT), and shifting trends in the beef industry since the 2011 audit. Individuals making purchasing decisions in five market sectors of the steer/heifer and cow/bull beef supply chain were interviewed, including packers (n = 36), retailers (including large and small supermarket companies and warehouse food sales companies; n = 35), food service operators (including quick-serve, full-service, and institutional establishments; n = 29), further processors (n = 64), and peripherally related government and trade organizations (GTO; n = 30). Interviews were conducted across the U.S. between January and November of 2016 using a dynamic routing program designed (by sequence) on the Qualtrics software platform (Qualtrics 2016; Provo, UT, USA). Interviewers from three separate land-grant universities first calibrated their administration of interviews in November of 2015 to standardize data collection. Definitions (as described by interviewees) of seven pre-determined quality factors were recorded verbatim and categorized into similar responses for analysis: (1) How and where the cattle were raised, (2) Lean, fat, and bone, (3) Weight and size, (4) Visual characteristics, (5) Food safety, (6) Eating satisfaction, and (7) Cattle genetics. It was critical to understand how interviewees perceived the meaning of each of the seven quality factor groupings in order to interpret WTP and RI responses. As in NBQA-2011, "food safety" was the most important (P < 0.05) quality factor in RI scaling. Additionally, sectors that did not list "food safety" as a non-negotiable, must-have characteristic, but were willing to pay a premium for the trait, said that they would pay an average premium of 11.1% for a guarantee of their definition of "food safety" (likely overinflated). The "eating satisfaction" quality factor, primarily defined as "customer satisfaction" by all sectors, was ranked second (P < 0.05) by all marketing sectors except packers, who ranked "lean, fat, and bone" second. Compared with NBQA-2011, a higher percentage of companies were generally willing to pay a premium for guaranteed quality attributes, but they were willing to pay lower average premiums than the companies interviewed in 2011. In the second study, non-pathogenic E. coli (ATCC BAA 1427-31) were used to validate the efficacy of HPP as a pathogen destruction tool for use in raw pet food. According to the American Pet Products Association (APPA), pet industry expenditures in the U.S. have grown more than 350% in the past 20 years. Monetarily, annual expenditures increased by approximately $2 billion each year for the past 5 years. Furthermore, APPA estimates that 2016 U.S. pet industry expenditures will exceed $62 billion. Raw pet food products are a rapidly growing sector of the pet food industry. While these formulations are increasingly attractive to pet owners, food safety has historically been a concern. This concern, together with the FDA's "zero tolerance" regulation for Salmonella, demands that raw pet food producers explore technologies for the elimination of pathogens in raw pet food products. Thus, the objective of this second experiment was to evaluate the effects of HPP and frozen storage on the destruction of surrogate pathogens in a raw pet food. Approximately 18 kg of a raw beef pet food was inoculated to a target of 7 log CFU/g with a five-strain cocktail of non-pathogenic Escherichia coli (ATCC BAA 1427-31), previously validated as surrogates for STECs and Salmonella (Dickson, 2015). Inoculated product was packaged in 227 g individual roll-stock packages and shipped to a commercial HPP facility for HPP application. Inoculated samples were subjected to HPP at 87,000 psi for 480 seconds. After HPP processing, samples were transported on ice to Colorado State University for determination of remaining bacterial populations. Samples were randomly assigned to evaluation either 24 hours post-processing (n = 10) or following 5 d of frozen storage at -23°C (n = 10). Raw product samples were serially diluted in BPW and plated onto selective (Violet Red Bile Agar; VRBA; selective for coliforms) and non-selective (Tryptic Soy Agar; TSA) media for enumeration. Survivors on TSA totaled 5.36 and 4.6 log CFU/g at 24 hours post-HPP and after frozen storage, respectively. Data were analyzed using the MIXED procedure of SAS (version 9.3; Cary, NC), and means were separated using the PDIFF statement with α = 0.05. These data suggested that HPP is an effective tool for destruction of foodborne pathogens in raw pet food diets, but that HPP alone is not sufficient to reduce pathogen loads below detection limits. Additionally, these data suggest that a frozen storage period following HPP may also be an effective method for enhancing pathogen destruction. Additional research related to the safety of raw pet food is needed.
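The HPP validation above compares surviving surrogate populations 24 hours after processing with those after 5 days of frozen storage (analyzed with SAS PROC MIXED). As a loose stand-in only, the sketch below computes log reductions from the ~7 log CFU/g inoculation target and compares the two groups with a simple two-sample t-test on invented survivor counts.

```python
# Illustration only: log reductions relative to a 7 log CFU/g inoculation
# target, and a plain two-sample t-test (the study itself used SAS PROC MIXED).
import numpy as np
from scipy.stats import ttest_ind

TARGET_LOG_CFU_G = 7.0

# Hypothetical surviving populations (log CFU/g) for the two evaluation times.
post_24h = np.array([5.4, 5.3, 5.5, 5.2, 5.4])
post_frozen_5d = np.array([4.7, 4.5, 4.6, 4.8, 4.5])

for label, survivors in [("24 h post-HPP", post_24h), ("5 d frozen", post_frozen_5d)]:
    reduction = TARGET_LOG_CFU_G - survivors.mean()
    print(f"{label}: mean survivors = {survivors.mean():.2f} log CFU/g, "
          f"mean reduction = {reduction:.2f} log")

t, p = ttest_ind(post_24h, post_frozen_5d)
print(f"t = {t:.2f}, P = {p:.4f}")
```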
Item Open Access Pre-harvest and postmortem methods of decreasing the incidence and salvaging value of non-conforming beef carcasses (Colorado State University. Libraries, 2009). Bass, Phillip Dean, author; Belk, Keith, advisor; Scanga, John, advisor.
It has been stated that non-conforming carcasses cost the beef industry millions of dollars yearly in lost revenue. The objectives of this research were to identify muscles in dark-cutting beef carcasses that were not affected by the dark-cutting condition, to identify relationships between the portion size of individual muscles in the beef carcass and carcass ribeye area, and to investigate the effects of dietary magnesium (Mg) supplementation on the quality characteristics of beef cattle subjected to stress-inducing environments. Carcasses in the dark-cutter study were divided into three classes: 1/3, 1/2, and full degree of dark cutting (DEGDC). Measurement of the ultimate pH of individual muscles within the carcasses showed that 7, 9, and 5 muscles had mean pH values considered normal in 1/3, 1/2, and full DEGDC carcasses, respectively. A nationwide survey was conducted to identify the acceptable color range of fresh beef muscles for food service chefs and retail meat merchandisers. Muscles that were within an acceptable color range for food service chefs and retail meat merchandisers had the potential to add up to $42.29 and $30.30 per side, respectively, when valued at Choice prices rather than commodity discounted prices. The portion size study evaluated 14 muscles. Seven of the 14 muscles were found to have no relationship between individual muscle portion size and ribeye area. A nationwide survey was conducted with foodservice chefs and retail meat merchandisers to evaluate the acceptability of portion sizes from carcasses varying in ribeye area. Results of the survey demonstrated that the portion sizes of many muscles were still acceptable to retail meat merchandisers and foodservice chefs even when the ribeye area of a carcass was non-conforming. The Mg supplementation study in beef cattle (N = 144) indicated that Mg was taken up into the bloodstream of cattle fed the supplement, but no effect on tenderness or reduction of quality defects (e.g., dark cutting) was observed. The results of these studies demonstrate methods of returning value to non-conforming beef carcasses; however, further research into methods of preventing non-conformity is needed.
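The salvage values above ($42.29 and $30.30 per side) come from summing, over muscles judged acceptable, the price difference between Choice and commodity-discounted pricing times each muscle's weight. The sketch below shows that arithmetic with entirely hypothetical muscle weights and prices, not the study's data.

```python
# Hypothetical salvage-value arithmetic: sum of (Choice price - discounted
# price) x muscle weight over muscles judged acceptable. All numbers invented.
muscles = [
    # (name, weight_lb_per_side, choice_price_per_lb, discount_price_per_lb, acceptable)
    ("tenderloin", 5.0, 9.50, 7.80, True),
    ("ribeye roll", 11.0, 8.20, 6.90, True),
    ("top round", 18.0, 2.60, 2.30, False),  # excluded: unacceptable color
]

salvage_per_side = sum(
    weight * (choice - discount)
    for _, weight, choice, discount, acceptable in muscles
    if acceptable
)
print(f"Potential added value: ${salvage_per_side:.2f} per side")
```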
Item Open Access Prevalence and control of Listeria, Salmonella and Escherichia coli O157:H7 in Colorado rural households (Colorado State University. Libraries, 2009). Rodríguez Marval, Mawill R., author; Belk, Keith, advisor; Sofos, John, advisor.
The household environment has been linked to multiple outbreaks of foodborne illnesses, including listeriosis and salmonellosis. The food handling habits of consumers play a critical role in the food chain continuum and need to be investigated to better prevent foodborne illnesses that originate at home. The objective of this work was to identify risk factors associated with the prevalence of Listeria, Salmonella, and Escherichia coli O157:H7 in the rural household environment, and to provide scientific data for the development of reheating instructions for frankfurters in the home setting. To study risk factors associated with Listeria, Salmonella, and Escherichia coli O157:H7 prevalence in rural Colorado households with or without ruminants, households were recruited, and samples from food and the environment, as well as behavioral data from the primary food preparer in the house, were collected. Listeria was isolated from refrigerators, kitchen sinks, shoe soles, clothes washing machines, and food samples, with higher prevalence in households with ruminants. No sample was found positive for E. coli O157:H7, and Salmonella was isolated from one refrigerator, one washing machine, one working glove, and two shoe samples. Results indicated that behavior related to handling and cooking of perishable foods affected the probability of household samples testing positive for Listeria, regardless of the presence of ruminants. Personal cleanliness habits were related to the presence of Listeria on shoe soles, clothes washing machines, and working gloves. Shoes testing positive in households with ruminants were more frequently associated with multiple positive environmental samples compared to households without ruminants. Results indicated that consumer education on handling and storing perishable foods, and on animal handling practices to prevent contamination of the household through shoes or clothes, may reduce the prevalence of Listeria in home environments. Two studies evaluated reheating of frankfurters inoculated with L. monocytogenes, with or without antimicrobials. In both cases, frankfurters were formulated with or without 1.5% potassium lactate and 0.1% sodium diacetate and were inoculated with a ten-strain composite of L. monocytogenes. After inoculation, frankfurters were vacuum-packaged and stored under conditions simulating manufacturing/retail and consumer storage. In one study, after the appropriate storage time, frankfurters were placed in a bowl with water and treated in a household microwave oven. Exposure to high power for 75 s reduced pathogen levels (0.7±0.0 to 1.0±0.1 log CFU/cm2) to below the detection limit (<-0.4 log CFU/cm2) on frankfurters with lactate/diacetate. On frankfurters without lactate/diacetate, initial levels of L. monocytogenes (1.5±0.1 to 7.2±0.5 log CFU/cm2) on untreated samples increased as storage in vacuum and aerobic packages progressed. For this formulation, exposure to high power for 75 s produced reductions between >1.5 and 5.9 log CFU/cm2. Depending on the treatment and storage time, the water used to reheat the frankfurters had viable L. monocytogenes counts of <-2.4 to 5.5±0.5 log CFU/ml. Results indicated that L. monocytogenes contamination levels <3.7 log CFU/cm2 on frankfurters can be significantly (P < 0.05) reduced by microwave oven heating at high power for at least 75 s. Higher contamination levels, such as those found on frankfurters without lactate/diacetate and stored for a prolonged period of time, require longer exposure to microwave heating in order to render the product safe for consumption. In the other study, inoculated frankfurters were treated with hot water after different storage periods to evaluate the destructiveness of different time and water-temperature combinations against L. monocytogenes. Treatments at 80°C (60, 120 s) and 94°C (30, 60 s) reduced pathogen counts on frankfurters with potassium lactate/sodium diacetate (PL/SD) to at or below the detection limit (<-0.4 log CFU/cm2) from initial levels on control (immersed in 25°C water for 300 s) samples. For frankfurters without PL/SD, where pathogen numbers reached 6.1 log CFU/cm2 on 60-day-old vacuum-packaged product subsequently stored aerobically for 7 days, hot water treatments reduced counts by 1.0 (30 s/80°C) to >6.0 (120 s/94°C and 300 s/94°C) log CFU/cm2. No survivors were detected in the heated water after any treatment (detection limit <-2.5 log CFU/ml). While low levels of L. monocytogenes on frankfurters can be inactivated with short exposure to hot water, increased contamination that may occur as the product ages requires longer times and/or higher temperatures for inactivation.
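The household study above relates handling behaviors to the probability of samples testing positive for Listeria; a common way to summarize such a risk-factor association is an odds ratio from a 2x2 table, illustrated below with invented counts (not the study's data) using Fisher's exact test.

```python
# Odds ratio and Fisher's exact test for a hypothetical risk factor
# (e.g., a particular handling behavior) versus Listeria-positive samples.
# Counts below are invented for illustration.
from scipy.stats import fisher_exact

#                Listeria+  Listeria-
table = [[12,        8],    # behavior present
         [ 5,       25]]    # behavior absent

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, P = {p_value:.4f}")
```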
Item Open Access The effect of dam nutrient deprivation on lamb carcass characteristics, retail yields, and nutrient composition (Colorado State University. Libraries, 2012). Brenman, Kristina Anne, author; Belk, Keith, advisor; Woerner, Dale, committee member; Engle, Terry, committee member; Mykles, Donald, committee member.
The objective of this study was to determine the effect of dam nutrient restriction on offspring carcass characteristics, retail cut yields, and nutrient composition. Forty-one western white rams and ewes were obtained from a previous Colorado State University study of dam nutrient restriction. Prior to gestation, dams were fed 100% of their nutrient requirements. The dams' diet was a vitamin- and mineral-rich pelleted beet pulp (77.8% total digestible nutrients [TDN], 90.0% dry matter [DM], and 9.4% crude protein [CP]). At 28 days gestational age, dams were randomly assigned to individual pens and separated into three treatments: control (100% of nutrient requirements), half ration (fed 50% of their nutrient requirements from day 28 until term), and realimented (fed 50% of their nutrient requirements from day 28 until day 78, and then slowly realimented back to 100% for the remainder of gestation). All twin lambs were slaughtered, and hot carcass weight, 12th-rib fat, body wall thickness, adjusted fat, ribeye area, ribeye marbling, leg score, leg circumference, conformation, flank streaking, flank firmness, flank color, kidney fat weight, L*, a*, and b* were obtained. After all lambs were slaughtered, one half of each lamb carcass was fabricated into the following subprimals: rack, roast-ready, frenched, PSO 3x1" (IMPS 204C); shoulder, square-cut, boneless (IMPS 208); Denver ribs, skirt-off (IMPS 209A); foreshank (IMPS 210); loin, short-cut, trimmed, PSO 0x0" (IMPS 232A); flank, untrimmed (IMPS 232E); leg, hindshank (IMPS 233F); and leg, shank-off, boneless (IMPS 234A). Lastly, samples from all lambs were analyzed to determine dry matter, moisture, crude protein, crude fat, ash, vitamins A and E, trace minerals, and fatty acids. No interactions were found between treatment and gender for any characteristic, so treatment and gender were analyzed separately. Lambs of nutritionally restricted ewes were smaller with less fat. Lambs of the realimented group had more fat than either the control or the half-ration group. Rams had a greater percent lean content than ewes, as expected. Results of this study provide insight into the effect of dam nutrient restriction on lamb growth and development, as well as on the nutrient content of American lamb.
Item Open Access The value of U.S. beef exports and the traceability of pork in countries outside of North America (Colorado State University. Libraries, 2012). Meisinger, Jessica, author; Belk, Keith, advisor; Pendell, Dustin, committee member; Woerner, Dale, committee member; Engle, Terry, committee member.
Variation exists within beef cuts produced by U.S. beef packers for domestic and foreign markets, due to differences in consumer expectations and use of the product. The objective of this study was to conduct an industry-wide survey to identify commonality among and between U.S. beef processor specifications, to identify differences between products sent to varying countries, and to determine a more accurate value of beef exports. Countries that have an Export Verification (EV) program require suppliers to be certified by the United States Department of Agriculture and to submit information about exported products. The EV information was collected and used to determine the countries receiving the highest volume of U.S. product, as well as the meat cuts common in each country. The data were also used to assign prices to individual products to ascertain export value. These documents do not, however, show individual differences in how companies cut beef products. Four significant U.S. beef export markets (Japan, Mexico, Hong Kong, and Taiwan) were visited. During these visits, product was visually checked and compared to known Institutional Meat Purchase Specifications (IMPS). Animal diseases and related food safety issues have become concerns to many people in the last decade, and traceability is becoming increasingly important throughout the world as a way to control disease outbreaks before they have devastating effects on a country's livestock industries. The objective of this review was to discuss swine identification and traceability systems outside North America.
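The export-value step described above assigns a price to each exported product and sums value by destination. The sketch below shows that kind of calculation with pandas on invented shipment volumes and prices; the column names and values are placeholders, not the EV dataset.

```python
# Illustration only: estimate export value by destination from shipment
# volumes and per-cut prices. All data and column names are placeholders.
import pandas as pd

shipments = pd.DataFrame(
    {
        "destination": ["Japan", "Japan", "Mexico", "Hong Kong"],
        "cut": ["short plate", "chuck eye roll", "outside skirt", "short rib"],
        "volume_lb": [120_000, 45_000, 80_000, 60_000],
    }
)
prices = pd.DataFrame(
    {
        "cut": ["short plate", "chuck eye roll", "outside skirt", "short rib"],
        "price_per_lb": [2.10, 3.40, 4.75, 5.60],
    }
)

merged = shipments.merge(prices, on="cut")
merged["value_usd"] = merged["volume_lb"] * merged["price_per_lb"]
print(merged.groupby("destination")["value_usd"].sum())
```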