Review of pathogen treatment reductions for onsite non ...
Communities face a challenge when implementing onsite reuse of collected waters for non-potable purposes given the lack of national microbial standards. Quantitative Microbial Risk Assessment (QMRA) can be used to predict the pathogen risks associated with the non-potable reuse of onsite-collected waters; the present work reviewed the relevant QMRA literature to prioritize knowledge gaps and identify health-protective pathogen treatment reduction targets. The review indicated that ingestion of untreated, onsite-collected graywater, rainwater, seepage water and stormwater from a variety of exposure routes resulted in gastrointestinal infection risks greater than the traditional acceptable level of risk. We found no QMRAs that estimated the pathogen risks associated with onsite, non-potable reuse of blackwater. Pathogen treatment reduction targets for non-potable, onsite reuse that included a suite of reference pathogens (i.e., including relevant bacterial, protozoan, and viral hazards) were limited to graywater (for a limited set of domestic uses) and stormwater (for domestic and municipal uses). These treatment reductions corresponded with the health benchmark of a probability of infection or illness of 10−3 per person per year or less. The pathogen treatment reduction targets varied depending on the target health benchmark, reference pathogen, source water, and water reuse application. Overall, there remains a need for pathogen reduction targets that are heal
This paper presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., locally-collected greywater, roof runoff, and stormwater). A probabilistic Quantitative Microbial Risk Assessment (QMRA) was use...
Risk-based enteric pathogen reduction targets for non-potable ...
This paper presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., locally-collected greywater, roof runoff, and stormwater). A probabilistic Quantitative Microbial Risk Assessment (QMRA) was used to derive the pathogen log10 reduction targets (LRTs) that corresponded with an infection risk of either 10−4 per person per year (ppy) or 10−2 ppy. The QMRA accounted for variation in pathogen concentration and sporadic pathogen occurrence (when data were available) in source waters for reference pathogens in the genera Rotavirus, Mastadenovirus (human adenoviruses), Norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium. Non-potable uses included indoor use (for toilet flushing and clothes washing) with occasional accidental ingestion of treated non-potable water (or cross-connection with potable water), and unrestricted irrigation for outdoor use. Various exposure scenarios captured the uncertainty from key inputs, i.e., the pathogen concentration in source water; the volume of water ingested; and for the indoor use, the frequency of and the fraction of the population exposed to accidental ingestion. Both potable and non-potable uses required pathogen treatment for the selected waters and the LRT was generally greater for potable use than non-potable indoor use and unrestricted irrigation. The difference in treatment requirements among source waters was driven by the
Pathogen Treatment Guidance and Monitoring Approaches fro ...
On-site non-potable water reuse is increasingly used to augment water supplies, but traditional fecal indicator approaches for defining and monitoring exposure risks are limited when applied to these decentralized options. This session emphasizes risk-based modeling to define pathogen log-reduction requirements coupled with alternative targets for monitoring enabled by genomic sequencing (i.e., the microbiome of reuse systems). 1. Discuss risk-based modeling to define pathogen log-reduction requirements 2. Review alternative targets for monitoring 3. Gain an understanding of how new tools can help improve successful development of sustainable on-site non-potable water reuse Presented at the Water Wastewater Equipment Treatment & Transport Show.
As decentralized water reuse continues to gain popularity, risk-based treatment guidance is increasingly sought for the protection of public health. However, efforts to evaluate pathogen risks and log-reduction requirements have been hindered by an incomplete understanding of pat...
Risk-Based Treatment Targets for Onsite Non-Potable Water ...
This presentation presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., municipal wastewater, locally-collected greywater, rainwater, and stormwater). A probabilistic, forward Quantitative Microbial Risk Assessment (QMRA) was used to derive the pathogen log10 reduction targets (LRTs) that corresponded with an infection risk of either 10-4 per person per year (ppy) or 10-2 ppy. The QMRA accounted for variation in pathogen concentration and sporadic pathogen occurrence (when data were available) in source waters for reference pathogens Rotavirus, Adenovirus, Norovirus, Campylobacter spp., Salmonella spp., Giardia spp., and Cryptosporidium spp.. Non-potable uses included indoor use (for toilet flushing and clothes washing) with accidental ingestion of treated non-potable water (or cross connection with potable water), and unrestricted irrigation for outdoor use. Various exposure scenarios captured the uncertainty from key inputs, i.e., the pathogen concentration in source water; the volume of water ingested; and for the indoor use, the frequency of and the fraction of the population exposed to accidental ingestion. Both potable and non-potable uses required pathogen treatment for the selected waters and the LRT was generally greater for potable use than nonpotable indoor use and unrestricted irrigation. The difference in treatment requirements among source waters was driven by th
Pathogen reduction of blood components.
Solheim, Bjarte G
2008-08-01
Thanks to many blood safety interventions introduced in developed countries the risk of transfusion transmitted infections has become exceedingly small in these countries. However, emerging pathogens still represent a serious challenge, as demonstrated by West Nile virus in the US and more recently by Chikungunya virus in the Indian Ocean. In addition bacterial contamination, particularly in platelets, and protozoa transmitted by blood components still represent sizeable risks in developed countries. In developing countries the risk of all transfusion transmitted infections is still high due to insufficient funding and organisation of the health service. Pathogen reduction of pooled plasma products has virtually eliminated the risk of transfusion transmitted infections, without compromising the quality of the products significantly. Pathogen reduction of blood components has been much more challenging. Solvent detergent treatment which has been so successfully applied for plasma products dissolves cell membranes, and can, therefore, only be applied for plasma and not for cellular blood components. Targeting of nucleic acids has been another method for pathogen inactivation of plasma and the only approach possible for cellular blood products. As documented in more than 15 year's track record, solvent detergent treatment of pooled plasma can yield high quality plasma. The increased risk for contamination by unknown viruses due to pooling is out weighed by elimination of TRALI, significant reduction in allergic reactions and standardisation of the product. Recently, a promising method for solvent detergent treatment of single donor plasma units has been published. Methylene blue light treatment of single donor plasma units has a similar long track record as pooled solvent detergent treated plasma; but the method is less well documented and affects coagulation factor activity more. Psoralen light treated plasma has only recently been introduced (CE marked in Europe, but not licensed by the FDA), while the method of Riboflavin light treatment of plasma still is under development. In addition to pathogen reduction the methods, however, result in some reduction of coagulation factor activity. For platelets only Psoralen and Riboflavin light treatment have been implemented. Both are CE marked products in Europe but only approved for clinical trials in the USA. The methods affect platelet activity, but result in clinically acceptable platelets with only slightly reduced CCI and increased demand for platelet transfusions. Pathogen reduction of red blood cells with FRALE (S-303) or INACTINE (PEN110) has so far resulted in the formation of antibodies against neo-epitopes on red blood cells. A promising method for Riboflavin treatment of red blood cells is under development. This manuscript reviews the current experience and discusses future trends.
Reinforcing effects of non-pathogenic bacteria and predation risk: from physiology to life history.
Janssens, Lizanne; Stoks, Robby
2014-10-01
The important ecological role of predation risk in shaping populations, communities and ecosystems is becoming increasingly clear. In this context, synergistic effects between predation risk and other natural stressors on prey organisms are gaining attention. Although non-pathogenic bacteria can be widespread in aquatic ecosystems, their role in mediating effects of predation risk has been ignored. We here address the hypothesis that non-pathogenic bacteria may reinforce the negative effects of predation risk in larvae of the damselfly Coenagrion puella. We found synergistic effects for all three life history variables studied: mortality increased, growth reductions were magnified and bacterial load was higher when both non-lethal stressors were combined. The combined exposure to the bacterium and predation risk considerably impaired the two key antipredator mechanisms of the damselfly larvae: they no longer reduced their food intake under predation risk and showed a synergistic reduction in escape swimming speed. The reinforcing negative effects on the fitness-related traits could be explained by the observed synergistic effects on food intake, swimming muscle mass, immune function and oxidative damage. These are likely widespread consequences of energetic constraints and increased metabolic rates associated with the fight-or-flight response. We therefore hypothesize that the here documented synergistic interactions with non-pathogenic bacteria may be widespread. Our results highlight the ignored ecological role of non-pathogenic bacteria in reinforcing the negative effects of predation risk on prey organisms.
Review of pathogen treatment reductions for onsite non-potable reuse of alternative source waters
Communities face a challenge when implementing onsite reuse of collected waters for non-potable purposes given the lack of national microbial standards. Quantitative Microbial Risk Assessment (QMRA) can be used to predict the pathogen risks associated with the non-potable reuse o...
Estimation of norovirus infection risks to consumers of wastewater-irrigated food crops eaten raw.
Mara, Duncan; Sleigh, Andrew
2010-03-01
A quantitative microbial risk analysis-Monte Carlo method was used to estimate norovirus infection risks to consumers of wastewater-irrigated lettuce. Using the same assumptions as used in the 2006 WHO guidelines for the safe use of wastewater in agriculture, a norovirus reduction of 6 log units was required to achieve a norovirus infection risk of approximately 10(-3) per person per year (pppy), but for a lower consumption of lettuce (40-48 g per week vs. 350 g per week) the required reduction was 5 log units. If the tolerable additional disease burden is increased from a DALY (disability-adjusted life year) loss of 10(-6) pppy (the value used in the WHO guidelines) to 10(-5) pppy, the required pathogen reduction is one order of magnitude lower. Reductions of 4-6 log units can be achieved by very simple partial treatment (principally settling to achieve a 1-log unit reduction) supplemented by very reliable post-treatment health-protection control measures such as pathogen die-off (1-2 log units), produce washing in cold water (1 log unit) and produce disinfection (3 log units).
Pathogen reduction in minimally managed composting of bovine manure
USDA-ARS?s Scientific Manuscript database
Persistence of pathogenic bacteria such as E. coli O157:H7, Salmonella spp., and Listeria monocytogenes in bovine feces and contaminated soils is an important risk factor in perpetuating the initial infection as well as re-infection of cattle and dissemination of pathogens throughout agricultural la...
Natural disturbance reduces disease risk in endangered rainforest frog populations
Roznik, Elizabeth A.; Sapsford, Sarah J.; Pike, David A.; Schwarzkopf, Lin; Alford, Ross A.
2015-01-01
Natural disturbances can drive disease dynamics in animal populations by altering the microclimates experienced by hosts and their pathogens. Many pathogens are highly sensitive to temperature and moisture, and therefore small changes in habitat structure can alter the microclimate in ways that increase or decrease infection prevalence and intensity in host populations. Here we show that a reduction of rainforest canopy cover caused by a severe tropical cyclone decreased the risk of endangered rainforest frogs (Litoria rheocola) becoming infected by a fungal pathogen (Batrachochytrium dendrobatidis). Reductions in canopy cover increased the temperatures and rates of evaporative water loss in frog microhabitats, which reduced B. dendrobatidis infection risk in frogs by an average of 11–28% in cyclone-damaged areas, relative to unaffected areas. Natural disturbances to the rainforest canopy can therefore provide an immediate benefit to frogs by altering the microclimate in ways that reduce infection risk. This could increase host survival and reduce the probability of epidemic disease outbreaks. For amphibian populations under immediate threat from this pathogen, targeted manipulation of canopy cover could increase the availability of warmer, drier microclimates and therefore tip the balance from host extinction to coexistence. PMID:26294048
Natural disturbance reduces disease risk in endangered rainforest frog populations.
Roznik, Elizabeth A; Sapsford, Sarah J; Pike, David A; Schwarzkopf, Lin; Alford, Ross A
2015-08-21
Natural disturbances can drive disease dynamics in animal populations by altering the microclimates experienced by hosts and their pathogens. Many pathogens are highly sensitive to temperature and moisture, and therefore small changes in habitat structure can alter the microclimate in ways that increase or decrease infection prevalence and intensity in host populations. Here we show that a reduction of rainforest canopy cover caused by a severe tropical cyclone decreased the risk of endangered rainforest frogs (Litoria rheocola) becoming infected by a fungal pathogen (Batrachochytrium dendrobatidis). Reductions in canopy cover increased the temperatures and rates of evaporative water loss in frog microhabitats, which reduced B. dendrobatidis infection risk in frogs by an average of 11-28% in cyclone-damaged areas, relative to unaffected areas. Natural disturbances to the rainforest canopy can therefore provide an immediate benefit to frogs by altering the microclimate in ways that reduce infection risk. This could increase host survival and reduce the probability of epidemic disease outbreaks. For amphibian populations under immediate threat from this pathogen, targeted manipulation of canopy cover could increase the availability of warmer, drier microclimates and therefore tip the balance from host extinction to coexistence.
Jimenez-Marco, Teresa; Cancino-Faure, Beatriz; Girona-Llobera, Enrique; Alcover, M Magdalena; Riera, Cristina; Fisa, Roser
2017-06-01
The parasitic Chagas disease is caused by the protozoan Trypanosoma cruzi, which is mainly transmitted by insect vectors. Other infection routes, both in endemic and in nonendemic areas, include organ and marrow transplantation, congenital transmission, and blood transfusion. Asymptomatic chronic chagasic individuals may have a low and transient parasitemia in peripheral blood and, consequently, they can unknowingly transmit the disease via blood transfusion. Riboflavin and ultraviolet (UV) light pathogen reduction is a method to reduce pathogen transfusion transmission risk based on damage to the pathogen nucleic acids. In this study, we tested the effectiveness of this technology for the elimination of T. cruzi parasites in artificially contaminated whole blood units (WBUs) and thus for decreasing the risk of T. cruzi transfusion transmission. The contaminated WBUs were leukoreduced by filtration and treated with riboflavin and UV light. The level of pathogen reduction was quantified by a real-time polymerase chain reaction (qPCR) and a real-time reverse transcription-polymerase chain reaction (RT-qPCR) as a viability assay. The RNA (cDNA) quantification of the parasites showed a more than 99% reduction of viable T. cruzi parasites after leukoreduction and a complete reduction (100%) after the riboflavin and UV light treatment. Riboflavin and UV light treatment and leukoreduction used in conjunction appears to eliminate significant amounts of viable T. cruzi in whole blood. Both strategies could complement other blood bank measures already implemented to prevent the transmission of T. cruzi via blood transfusion. © 2017 AABB.
Barker, S Fiona; Packer, Michael; Scales, Peter J; Gray, Stephen; Snape, Ian; Hamilton, Andrew J
2013-09-01
Small, remote communities often have limited access to energy and water. Direct potable reuse of treated wastewater has recently gained attention as a potential solution for water-stressed regions, but requires further evaluation specific to small communities. The required pathogen reduction needed for safe implementation of direct potable reuse of treated sewage is an important consideration but these are typically quantified for larger communities and cities. A quantitative microbial risk assessment (QMRA) was conducted, using norovirus, giardia and Campylobacter as reference pathogens, to determine the level of treatment required to meet the tolerable annual disease burden of 10(-6) DALYs per person per year, using Davis Station in Antarctica as an example of a small remote community. Two scenarios were compared: published municipal sewage pathogen loads and estimated pathogen loads during a gastroenteritis outbreak. For the municipal sewage scenario, estimated required log10 reductions were 6.9, 8.0 and 7.4 for norovirus, giardia and Campylobacter respectively, while for the outbreak scenario the values were 12.1, 10.4 and 12.3 (95th percentiles). Pathogen concentrations are higher under outbreak conditions as a function of the relatively greater degree of contact between community members in a small population, compared with interactions in a large city, resulting in a higher proportion of the population being at risk of infection and illness. While the estimates of outbreak conditions may overestimate sewage concentration to some degree, the results suggest that additional treatment barriers would be required to achieve regulatory compliance for safe drinking water in small communities. Copyright © 2013 Elsevier B.V. All rights reserved.
Salunkhe, Vishal; van der Meer, Pieter F; de Korte, Dirk; Seghatchian, Jerard; Gutiérrez, Laura
2015-02-01
Transfusion-transmitted infections (TTI) have been greatly reduced in numbers due to the strict donor selection and screening procedures, i.e. the availability of technologies to test donors for endemic infections, and routine vigilance of regulatory authorities in every step of the blood supply chain (collection, processing and storage). However, safety improvement is still a matter of concern because infection zero-risk in transfusion medicine is non-existent. Alternatives are required to assure the safety of the transfusion product and to provide a substitution to systematic blood screening tests, especially in less-developed countries or at the war-field. Furthermore, the increasing mobility of the population due to traveling poses a new challenge in the endemic screening tests routinely used, because non-endemic pathogens might emerge in a specific population. Pathogen reduction treatments sum a plethora of active approaches to eliminate or reduce potential threatening pathogen load from blood transfusion products. Despite the success of pathogen reduction treatments applied to plasma products, there is still a long way to develop and deploy pathogen reduction treatments to cellular transfusion products (such as platelets, RBCs or even to whole blood) and there is divergence on its acceptance worldwide. While the use of pathogen reduction treatments in platelets is performed routinely in a fair number of European blood banks, most of these treatments are not (or just) licensed in the USA or elsewhere in the world. The development of pathogen reduction treatments for RBC and whole blood is still in its infancy and under clinical trials. In this review, we discuss the available and emerging pathogen reduction treatments and their advantages and disadvantages. Furthermore, we highlight the importance of characterizing standard transfusion products with current and emerging approaches (OMICS) and clinical outcome, and integrating this information on a database, thinking on the benefits it might bring in the future toward personalized transfusion therapies. Copyright © 2014 Elsevier Ltd. All rights reserved.
Physical and chemical interventions to mitigate risk associated with leafy greens
USDA-ARS?s Scientific Manuscript database
Contamination of leafy green vegetables with human pathogens is a source of ongoing concern for consumers. Conventional treatments have typically been able to achieve 1-2 logs reductions of such pathogens as Salmonella, Escherichia coli O157:H7 and Listeria monocytogenes. Novel approaches and treatm...
Health Technology Assessment of pathogen reduction technologies applied to plasma for clinical use
Cicchetti, Americo; Berrino, Alexandra; Casini, Marina; Codella, Paola; Facco, Giuseppina; Fiore, Alessandra; Marano, Giuseppe; Marchetti, Marco; Midolo, Emanuela; Minacori, Roberta; Refolo, Pietro; Romano, Federica; Ruggeri, Matteo; Sacchini, Dario; Spagnolo, Antonio G.; Urbina, Irene; Vaglio, Stefania; Grazzini, Giuliano; Liumbruno, Giancarlo M.
2016-01-01
Although existing clinical evidence shows that the transfusion of blood components is becoming increasingly safe, the risk of transmission of known and unknown pathogens, new pathogens or re-emerging pathogens still persists. Pathogen reduction technologies may offer a new approach to increase blood safety. The study is the output of collaboration between the Italian National Blood Centre and the Post-Graduate School of Health Economics and Management, Catholic University of the Sacred Heart, Rome, Italy. A large, multidisciplinary team was created and divided into six groups, each of which addressed one or more HTA domains. Plasma treated with amotosalen + UV light, riboflavin + UV light, methylene blue or a solvent/detergent process was compared to fresh-frozen plasma with regards to current use, technical features, effectiveness, safety, economic and organisational impact, and ethical, social and legal implications. The available evidence is not sufficient to state which of the techniques compared is superior in terms of efficacy, safety and cost-effectiveness. Evidence on efficacy is only available for the solvent/detergent method, which proved to be non-inferior to untreated fresh-frozen plasma in the treatment of a wide range of congenital and acquired bleeding disorders. With regards to safety, the solvent/detergent technique apparently has the most favourable risk-benefit profile. Further research is needed to provide a comprehensive overview of the cost-effectiveness profile of the different pathogen-reduction techniques. The wide heterogeneity of results and the lack of comparative evidence are reasons why more comparative studies need to be performed. PMID:27403740
Sokolova, Ekaterina; Petterson, Susan R; Dienus, Olaf; Nyström, Fredrik; Lindgren, Per-Eric; Pettersson, Thomas J R
2015-09-01
Norovirus contamination of drinking water sources is an important cause of waterborne disease outbreaks. Knowledge on pathogen concentrations in source water is needed to assess the ability of a drinking water treatment plant (DWTP) to provide safe drinking water. However, pathogen enumeration in source water samples is often not sufficient to describe the source water quality. In this study, the norovirus concentrations were characterised at the contamination source, i.e. in sewage discharges. Then, the transport of norovirus within the water source (the river Göta älv in Sweden) under different loading conditions was simulated using a hydrodynamic model. Based on the estimated concentrations in source water, the required reduction of norovirus at the DWTP was calculated using quantitative microbial risk assessment (QMRA). The required reduction was compared with the estimated treatment performance at the DWTP. The average estimated concentration in source water varied between 4.8×10(2) and 7.5×10(3) genome equivalents L(-1); and the average required reduction by treatment was between 7.6 and 8.8 Log10. The treatment performance at the DWTP was estimated to be adequate to deal with all tested loading conditions, but was heavily dependent on chlorine disinfection, with the risk of poor reduction by conventional treatment and slow sand filtration. To our knowledge, this is the first article to employ discharge-based QMRA, combined with hydrodynamic modelling, in the context of drinking water. Copyright © 2015 Elsevier B.V. All rights reserved.
Risk-Based Treatment Targets for Onsite Non-Potable Water Reuse
This presentation presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., municipal wastewater, locally-collected greywater, rainwater, and stormwater). A probabilistic, forward Quantitative Micr...
Removal of pathogenic bacteria from sewage-treated effluent and biosolids for agricultural purposes
NASA Astrophysics Data System (ADS)
Al-Gheethi, A. A.; Efaq, A. N.; Bala, J. D.; Norli, I.; Abdel-Monem, M. O.; Ab. Kadir, M. O.
2018-05-01
The reuse of treated sewage for irrigation is considered as an important alternative water source in the new water management strategy of the countries that face a severe deficiency of water resources such as the Middle East countries. The organic material and fertilizing elements contained in biosolids are essential for maintaining soil fertility. However, both treated sewage and biosolids contain a large diversity of pathogens that would be transmitted to the environment and infect human directly or indirectly. Therefore, those pathogens should be reduced from the treated sewage and biosolids before the reuse in the agriculture. This paper reviews the considerations for reuse of treated sewage and biosolids in agriculture and further treatments used for reduction of pathogenic bacteria. The treatment methods used for the reduction of pathogens in these wastes have reviewed. It appeared that the main concern associated with the reduction of pathogenic bacteria lies in their ability to regrow in the treated sewage and biosolids. Therefore, the effective treatment method is that it has the potential to destruct pathogens cells and remove the nutrients to prevent the regrowth or recontamination from the surrounded environment. The removal of nutrients might be applicable in the sewage but not in the biosolids due to high nutrient contents. However, the reduction of health risk in the biosolids might be carried out by regulating the biosolid utilization and selecting the plant species grown in the fertilized soil with biosolids.
USDA-ARS?s Scientific Manuscript database
Control of highly pathogenic avian influenza (HPAI) has traditionally involved the establishment of disease containment zones, where poultry products are only permitted to move from within a containment area under permit. Non-pasteurized liquid egg (NPLE) is one such commodity for which movements ma...
McBride, Graham B; Stott, Rebecca; Miller, Woutrina; Bambic, Dustin; Wuertz, Stefan
2013-09-15
This study is the first to report a quantitative microbial risk assessment (QMRA) on pathogens detected in stormwater discharges-of-concern, rather than relying on pathogen measurements in receiving waters. The pathogen concentrations include seven "Reference Pathogens" identified by the U.S. EPA: Cryptosporidium, Giardia, Salmonella, Norovirus, Rotavirus, Enterovirus, and Adenovirus. Data were collected from 12 sites representative of seven discharge types (including residential, commercial/industrial runoff, agricultural runoff, combined sewer overflows, and forested land), mainly during wet weather conditions during which times human health risks can be substantially elevated. The risks calculated herein therefore generally apply to short-term conditions (during and just after rainfall events) and so the results can be used by water managers to potentially inform the public, even for waters that comply with current criteria (based as they are on a 30-day mean risk). Using an example waterbody and mixed source, pathogen concentrations were used in QMRA models to generate risk profiles for primary and secondary water contact (or inhalation) by adults and children. A number of critical assumptions and considerations around the QMRA analysis are highlighted, particularly the harmonization of the pathogen concentrations measured in discharges during this project with those measured (using different methods) during the published dose-response clinical trials. Norovirus was the most dominant predicted health risk, though further research on its dose-response for illness (cf. infection) is needed. Even if the example mixed-source concentrations of pathogens had been reduced 30 times (by inactivation and mixing), the predicted swimming-associated illness rates - largely driven by Norovirus infections - can still be appreciable. Rotavirus generally induced the second-highest incidence of risk among the tested pathogens while risks for the other Reference Pathogens (Giardia, Cryptosporidium, Adenovirus, Enterovirus and Salmonella) were considerably lower. Secondary contact or inhalation resulted in considerable reductions in risk compared to primary contact. Measurements of Norovirus and careful incorporation of its concentrations into risk models (harmonization) should be a critical consideration for future QMRA efforts. The discharge-based QMRA approach presented herein is particularly relevant to cases where pathogens cannot be reliably detected in receiving waters with detection limits relevant to human health effects. Copyright © 2013 Elsevier Ltd. All rights reserved.
Thompson, Sally E; Levin, Simon; Rodriguez-Iturbe, Ignacio
2014-04-01
Global change will simultaneously impact many aspects of climate, with the potential to exacerbate the risks posed by plant pathogens to agriculture and the natural environment; yet, most studies that explore climate impacts on plant pathogen ranges consider individual climatic factors separately. In this study, we adopt a stochastic modeling approach to address multiple pathways by which climate can constrain the range of the generalist plant pathogen Phytophthora cinnamomi (Pc): through changing winter soil temperatures affecting pathogen survival; spring soil temperatures and thus pathogen metabolic rates; and changing spring soil moisture conditions and thus pathogen growth rates through host root systems. We apply this model to the southwestern USA for contemporary and plausible future climate scenarios and evaluate the changes in the potential range of Pc. The results indicate that the plausible range of this pathogen in the southwestern USA extends over approximately 200,000 km(2) under contemporary conditions. While warming temperatures as projected by the IPCC A2 and B1 emissions scenarios greatly expand the range over which the pathogen can survive winter, projected reductions in spring rainfall reduce its feasible habitat, leading to spatially complex patterns of changing risk. The study demonstrates that temperature and rainfall changes associated with possible climate futures in the southwestern USA have confounding impacts on the range of Pc, suggesting that projections of future pathogen dynamics and ranges should account for multiple pathways of climate-pathogen interaction. © 2014 John Wiley & Sons Ltd.
Ferguson, Christobel M; Croke, Barry F W; Beatson, Peter J; Ashbolt, Nicholas J; Deere, Daniel A
2007-06-01
In drinking water catchments, reduction of pathogen loads delivered to reservoirs is an important priority for the management of raw source water quality. To assist with the evaluation of management options, a process-based mathematical model (pathogen catchment budgets - PCB) is developed to predict Cryptosporidium, Giardia and E. coli loads generated within and exported from drinking water catchments. The model quantifies the key processes affecting the generation and transport of microorganisms from humans and animals using land use and flow data, and catchment specific information including point sources such as sewage treatment plants and on-site systems. The resultant pathogen catchment budgets (PCB) can be used to prioritize the implementation of control measures for the reduction of pathogen risks to drinking water. The model is applied in the Wingecarribee catchment and used to rank those sub-catchments that would contribute the highest pathogen loads in dry weather, and in intermediate and large wet weather events. A sensitivity analysis of the model identifies that pathogen excretion rates from animals and humans, and manure mobilization rates are significant factors determining the output of the model and thus warrant further investigation.
The Use of Filter-feeders to Manage Disease in a Changing World.
Burge, Colleen A; Closek, Collin J; Friedman, Carolyn S; Groner, Maya L; Jenkins, Cody M; Shore-Maggio, Amanda; Welsh, Jennifer E
2016-10-01
Rapid environmental change is linked to increases in aquatic disease heightening the need to develop strategies to manage disease. Filter-feeding species are effective biofilters and can naturally mitigate disease risk to humans and wildlife. We review the role of filter-feeders, with an emphasis on bivalves, in altering disease outcomes via augmentation and reduction. Filtration can reduce transmission by removing pathogens from the water column via degradation and release of pathogens in pseudofeces. In other cases, filtration can increase pathogen transmission and disease risk. The effect of filtration on pathogen transmission depends on the selectivity of the filter-feeder, the degree of infectivity by the pathogen, the mechanism(s) of pathogen transmission and the ability of the pathogen to resist degradation. For example, some bacteria and viruses can resist degradation and accumulate within a filter-feeder leading to disease transmission to humans and other wildlife upon ingestion. Since bivalves can concentrate microorganisms, they are also useful as sentinels for the presence of pathogenic microorganisms. While somewhat less studied, other invertebrates, including ascidians and sponges may also provide ecosystem services by altering pathogen transmission. In all scenarios, climate change may affect the potential for filter-feeders to mitigate disease risk. We conclude that an assessment including empirical data and modeling of system-wide impacts should be conducted before selection of filter-feeders to mitigate disease. Such studies should consider physiology of the host and microbe and risk factors for negative impacts including augmentation of other pathogens. © The Author 2016. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
Carvajal, Guido; Roser, David J; Sisson, Scott A; Keegan, Alexandra; Khan, Stuart J
2015-11-15
Risk management for wastewater treatment and reuse have led to growing interest in understanding and optimising pathogen reduction during biological treatment processes. However, modelling pathogen reduction is often limited by poor characterization of the relationships between variables and incomplete knowledge of removal mechanisms. The aim of this paper was to assess the applicability of Bayesian belief network models to represent associations between pathogen reduction, and operating conditions and monitoring parameters and predict AS performance. Naïve Bayes and semi-naïve Bayes networks were constructed from an activated sludge dataset including operating and monitoring parameters, and removal efficiencies for two pathogens (native Giardia lamblia and seeded Cryptosporidium parvum) and five native microbial indicators (F-RNA bacteriophage, Clostridium perfringens, Escherichia coli, coliforms and enterococci). First we defined the Bayesian network structures for the two pathogen log10 reduction values (LRVs) class nodes discretized into two states (< and ≥ 1 LRV) using two different learning algorithms. Eight metrics, such as Prediction Accuracy (PA) and Area Under the receiver operating Curve (AUC), provided a comparison of model prediction performance, certainty and goodness of fit. This comparison was used to select the optimum models. The optimum Tree Augmented naïve models predicted removal efficiency with high AUC when all system parameters were used simultaneously (AUCs for C. parvum and G. lamblia LRVs of 0.95 and 0.87 respectively). However, metrics for individual system parameters showed only the C. parvum model was reliable. By contrast individual parameters for G. lamblia LRV prediction typically obtained low AUC scores (AUC < 0.81). Useful predictors for C. parvum LRV included solids retention time, turbidity and total coliform LRV. The methodology developed appears applicable for predicting pathogen removal efficiency in water treatment systems generally. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ware, A D; Jacquot, C; Tobian, A A R; Gehrie, E A; Ness, P M; Bloch, E M
2018-01-01
Transfusion-transmitted infection risk remains an enduring challenge to blood safety in Africa. A high background incidence and prevalence of the major transfusion-transmitted infections (TTIs), dependence on high-risk donors to meet demand, suboptimal testing and quality assurance collectively contribute to the increased risk. With few exceptions, donor testing is confined to serological evaluation of human immunodeficiency virus (HIV), hepatitis B and C (HBV and HCV) and syphilis. Barriers to implementation of broader molecular methods include cost, limited infrastructure and lack of technical expertise. Pathogen reduction (PR), a term used to describe a variety of methods (e.g. solvent detergent treatment or photochemical activation) that may be applied to blood following collection, offers the means to diminish the infectious potential of multiple pathogens simultaneously. This is effective against different classes of pathogen, including the major TTIs where laboratory screening is already implemented (e.g. HIV, HBV and HCV) as well pathogens that are widely endemic yet remain unaddressed (e.g. malaria, bacterial contamination). We sought to review the available and emerging PR techniques and their potential application to resource-constrained parts of Africa, focusing on the advantages and disadvantages of such technologies. PR has been slow to be adopted even in high-income countries, primarily given the high costs of use. Logistical considerations, particularly in low-resourced parts of Africa, also raise concerns about practicality. Nonetheless, PR offers a rational, innovative strategy to contend with TTIs; technologies in development may well present a viable complement or even alternative to targeted screening in the future. © 2017 International Society of Blood Transfusion.
Luber, Petra
2009-08-31
Epidemiological studies show that poultry meat and eggs are important sources for consumers' exposure to pathogens such as Salmonella and Campylobacter. There is a focus in many countries to reduce the level of human illness from food-borne pathogens. Reduction of the prevalence of contaminated poultry meat or eggs is one major area of focus. The other is risk communication to the consumer, where information aimed at changing the food preparation behaviour has been utilised as a risk management tool. The efficacy of messages such as 'cook poultry meat and eggs thoroughly' or 'wash your hands' will depend both on the ability to change consumer behaviour as well as where the risk can best be mitigated. In order to prioritise what message should be given to the consumer, the relative contribution of different exposure pathways finally leading to ingestion of the pathogens and resulting in illness needs to be known. It is important to know whether cross-contamination events or undercooking are the greatest risk lurking in consumers' kitchens. A review of studies looking at the location of pathogens in food products has been performed and data regarding internal and external (surface) contamination of poultry meat with Salmonella spp. and Campylobacter jejuni and C. coli is presented. In the case of eggs, data on internal contamination with Salmonella and for contamination of egg shells with Salmonella and Campylobacter are discussed. The results from published risk assessments for these pathogen-food commodity combinations have been evaluated and conclusions regarding the relative risk of internal and external contamination of poultry meat and eggs were drawn. In conclusion, cross-contamination events from activities such as use of the same cutting board for chicken meat and salad without intermediate cleaning or spreading of pathogens via the kitchen environment seem to be of greater importance than the risk associated with undercooking of poultry meat or eggs. Risk management options are discussed against the background of risk communication strategies used in different countries.
Drew, Victor J; Barro, Lassina; Seghatchian, Jerard; Burnouf, Thierry
2017-10-01
Over 110 million units of blood are collected yearly. The need for blood products is greater in developing countries, but so is the risk of contracting a transfusion-transmitted infection. Without efficient donor screening/viral testing and validated pathogen inactivation technology, the risk of transfusion-transmitted infections correlates with the infection rate of the donor population. The World Health Organization has published guidelines on good manufacturing practices in an effort to ensure a strong global standard of transfusion and blood product safety. Sub-Saharan Africa is a high-risk region for malaria, human immunodeficiency virus (HIV), hepatitis B virus and syphilis. Southeast Asia experiences high rates of hepatitis C virus. Areas with a tropical climate have an increased risk of Zika virus, Dengue virus, West Nile virus and Chikungunya, and impoverished countries face economical limitations which hinder efforts to acquire the most modern pathogen inactivation technology. These systems include Mirasol ® Pathogen Reduction Technology, INTERCEPT ® , and THERAFLEX ® . Their procedures use a chemical and ultraviolet or visible light for pathogen inactivation and significantly decrease the threat of pathogen transmission in plasma and platelets. They are licensed for use in Europe and are used in several other countries. The current interest in the blood industry is the development of pathogen inactivation technologies that can treat whole blood (WB) and red blood cell (RBC). The Mirasol system has recently undergone phase III clinical trials for treating WB in Ghana and has demonstrated some efficacy toward malaria inactivation and low risk of adverse effects. A 2 nd -generation of the INTERCEPT ® S-303 system for WB is currently undergoing a phase III clinical trial. Both methodologies are applicable for WB and components derived from virally reduced WB or RBC.
Drew, Victor J.; Barro, Lassina; Seghatchian, Jerard; Burnouf, Thierry
2017-01-01
Over 110 million units of blood are collected yearly. The need for blood products is greater in developing countries, but so is the risk of contracting a transfusion-transmitted infection. Without efficient donor screening/viral testing and validated pathogen inactivation technology, the risk of transfusion-transmitted infections correlates with the infection rate of the donor population. The World Health Organization has published guidelines on good manufacturing practices in an effort to ensure a strong global standard of transfusion and blood product safety. Sub-Saharan Africa is a high-risk region for malaria, human immunodeficiency virus (HIV), hepatitis B virus and syphilis. Southeast Asia experiences high rates of hepatitis C virus. Areas with a tropical climate have an increased risk of Zika virus, Dengue virus, West Nile virus and Chikungunya, and impoverished countries face economical limitations which hinder efforts to acquire the most modern pathogen inactivation technology. These systems include Mirasol® Pathogen Reduction Technology, INTERCEPT®, and THERAFLEX®. Their procedures use a chemical and ultraviolet or visible light for pathogen inactivation and significantly decrease the threat of pathogen transmission in plasma and platelets. They are licensed for use in Europe and are used in several other countries. The current interest in the blood industry is the development of pathogen inactivation technologies that can treat whole blood (WB) and red blood cell (RBC). The Mirasol system has recently undergone phase III clinical trials for treating WB in Ghana and has demonstrated some efficacy toward malaria inactivation and low risk of adverse effects. A 2nd-generation of the INTERCEPT® S-303 system for WB is currently undergoing a phase III clinical trial. Both methodologies are applicable for WB and components derived from virally reduced WB or RBC. PMID:28488960
Shrestha, S; Haramoto, E; Shindo, J
2017-11-01
To assess diarrhoeal risks from enteropathogenic Escherichia coli, Giardia and Cryptosporidium from consuming raw spinach, cabbage, carrots and tomatoes in Kathmandu Valley, Nepal. The annual infection risk was quantified using the probabilistic Quantitative Microbial Risk Assessment approach, which considered 12 vegetable washing combinations. A new model was used to estimate dose of pathogens per exposure comprising parameters such as pathogen concentration in vegetable wash water before selling and eating, vegetable consumption rate, remaining pathogen ratio after washing, remaining water on vegetables after washing and water treatment removal efficiency. When all washing combinations were considered, high infection risks above the acceptable level of -4 log 10 infection per person per year were obtained, whereas the risk was reduced when other sources excluding river water were used. Assuming use of water treated with ceramic filters by all consumers, a 0-2 log 10 reduction in the estimated risks was obtained, which was insufficient to achieve the required risk level. High risk of diarrhoea prevails among raw vegetable consumers in the valley. It is needed to protect vegetable washing water sources and establish advanced water treatment methods to achieve the required level of public health risk. © 2017 The Society for Applied Microbiology.
Pathogen inactivation in liquid dairy manure during anaerobic and aerobic digestions
NASA Astrophysics Data System (ADS)
Biswas, S.; Pandey, P.; Castillo, A. R.; Vaddella, V. K.
2014-12-01
Controlling manure-borne pathogens such as E. coli O157:H7, Salmonella spp. and Listeria monocytogenes are crucial for protecting surface and ground water as well as mitigating risks to human health. In California dairy farms, flushing of dairy manure (mainly animal feces and urine) from freestall barns and subsequent liquid-solid manure separation is a common practice for handling animal waste. The liquid manure fraction is generally pumped into the settling ponds and it goes into aerobic and/or anaerobic lagoons for extended period of time. Considering the importance of controlling pathogens in animal waste, the objective of the study was to understand the effects of anaerobic and aerobic digestions on the survival of three human pathogens in animal waste. The pathogen inactivation was assessed at four temperatures (30, 35, 42, and 50 °C), and the relationships between temperature and pathogen decay were estimated. Results showed a steady decrease of E. coli levels in aerobic and anaerobic digestion processes over the time; however, the decay rates varied with pathogens. The effect of temperature on Salmonella spp. and Listeria monocytogenes survival was different than the E. coli survival. In thermophilic temperatures (42 and 50 °C), decay rate was considerable greater compared to the mesophilic temperatures (30 and 35°C). The E. coli log reductions at 50 °C were 2.1 in both aerobic and anaerobic digestions after 13 days of incubation. The Salmonella spp. log reductions at 50 °C were 5.5 in aerobic digestion, and 5.9 in anaerobic digestion. The Listeria monocytogenes log reductions at 50 °C were 5.0 in aerobic digestion, and 5.6 in anaerobic digestion. The log reduction of E. coli, Salmonella spp., and Listeria monocytogens at 30 °C in aerobic environment were 0.1, 4.7, and 5.6, respectively. In anaerobic environment, the corresponding reductions were 0.4, 4.3, and 5.6, respectively. We anticipate that the outcomes of the study will help improving the existing animal waste management processes to control manure-borne pathogens.
Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P
2018-01-01
Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risks estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
Centrifuge separation effect on bacterial indicator reduction in dairy manure.
Liu, Zong; Carroll, Zachary S; Long, Sharon C; Roa-Espinosa, Aicardo; Runge, Troy
2017-04-15
Centrifugation is a commonly applied separation method for manure processing on large farms to separate solids and nutrients. Pathogen reduction is also an important consideration for managing manure. Appropriate treatment reduces risks from pathogen exposure when manure is used as soil amendments or the processed liquid stream is recycled to flush the barn. This study investigated the effects of centrifugation and polymer addition on bacterial indicator removal from the liquid fraction of manure slurries. Farm samples were taken from a manure centrifuge processing system. There were negligible changes of quantified pathogen indicator concentrations in the low-solids centrate compared to the influent slurry. To study if possible improvements could be made to the system, lab scale experiments were performed investigating a range of g-forces and flocculating polymer addition. The results demonstrated that polymer addition had a negligible effect on the indicator bacteria levels when centrifuged at high g forces. However, the higher g force centrifugation was capable of reducing bacterial indicator levels up to two-log 10 in the liquid stream of the manure, although at speeds higher than typical centrifuge operations currently used for manure processing applications. This study suggests manure centrifuge equipment could be redesigned to provide pathogen reduction to meet emerging issues, such as zoonotic pathogen control. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Ding, Hongliu; Fu, Tong-Jen
2016-01-01
Sprouts have been a recurring public health challenge due to microbiological contamination, and Salmonella has been the major cause of sprout-associated outbreaks. Although seed treatment and microbiological testing have been applied as risk reduction measures during sprout production, the extent to which their effectiveness in reducing the public health risks associated with sprouts has not been well investigated. We conducted a quantitative risk assessment to measure the risk posed by Salmonella contamination in sprouts and to determine whether and how mitigation strategies can achieve a satisfactory risk reduction based on the assumption that the risk reduction achieved by a microbiological sampling and testing program at a given sensitivity is equivalent to that achieved by direct inactivation of pathogens. Our results indicated that if the sprouts were produced without any risk interventions, the health impact caused by sprouts contaminated with Salmonella would be very high, with a median annual estimated loss of disability-adjusted life years (DALYs) of 691,412. Seed treatment (with 20,000 ppm of calcium hypochlorite) or microbiological sampling and testing of spent irrigation water (SIW) alone could reduce the median annual impact to 734 or 4,856 DALYs, respectively. Combining seed treatment with testing of the SIW would further decrease the risk to 58 DALYs. This number could be dramatically lowered to 3.99 DALYs if sprouts were produced under conditions that included treating seeds with 20,000 ppm of calcium hypochlorite plus microbiological testing of seeds, SIW, and finished products. Our analysis shows that the public health impact due to Salmonella contamination in sprouts could be controlled if seeds are treated to reduce pathogens and microbiological sampling and testing is implemented. Future advances in intervention strategies would be important to improve sprout safety further.
Petterson, S R; Stenström, T A
2015-09-01
To support the implementation of quantitative microbial risk assessment (QMRA) for managing infectious risks associated with drinking water systems, a simple modeling approach for quantifying Log10 reduction across a free chlorine disinfection contactor was developed. The study was undertaken in three stages: firstly, review of the laboratory studies published in the literature; secondly, development of a conceptual approach to apply the laboratory studies to full-scale conditions; and finally implementation of the calculations for a hypothetical case study system. The developed model explicitly accounted for variability in residence time and pathogen specific chlorine sensitivity. Survival functions were constructed for a range of pathogens relying on the upper bound of the reported data transformed to a common metric. The application of the model within a hypothetical case study demonstrated the importance of accounting for variable residence time in QMRA. While the overall Log10 reduction may appear high, small parcels of water with short residence time can compromise the overall performance of the barrier. While theoretically simple, the approach presented is of great value for undertaking an initial assessment of a full-scale disinfection contactor based on limited site-specific information.
Han, Il; Congeevaram, Shankar; Ki, Dong-Won; Oh, Byoung-Taek; Park, Joonhong
2011-02-01
Due to the environmental problems associated with disposal of livestock sludge, many stabilization studies emphasizing on the sludge volume reduction were performed. However, little is known about the microbial risk present in sludge and its stabilized products. This study microbiologically explored the effects of anaerobic lagoon fermentation (ALF) and autothermal thermophilic aerobic digestion (ATAD) on pathogen-related risk of raw swine manure by using culture-independent 16S rDNA cloning and sequencing methods. In raw swine manure, clones closely related to pathogens such as Dialister pneumosintes, Erysipelothrix rhusiopathiae, Succinivibrioan dextrinosolvens, and Schineria sp. were detected. Meanwhile, in the mesophilic ALF-treated swine manure, bacterial community clones closely related to pathogens such as Schineria sp. and Succinivibrio dextrinosolvens were still detected. Interestingly, the ATAD treatment resulted in no detection of clones closely related to pathogens in the stabilized thermophilic bacterial community, with the predominance of novel Clostridia class populations. These findings support the superiority of ATAD in selectively reducing potential human and animal pathogens compared to ALF, which is a typical manure stabilization method used in livestock farms.
How much reduction of virus is needed for recycled water: A continuous changing need for assessment?
Gerba, Charles P; Betancourt, Walter Q; Kitajima, Masaaki
2017-01-01
To ensure the safety of wastewater reuse for irrigation of food crops and drinking water pathogenic viruses must be reduced to levels that pose no significant risk. To achieve this goal minimum reduction of viruses by treatment trains have been suggested. For use of edible crops a 6-log reduction and for production of potable drinking water a 12-log reduction has been suggested. These reductions were based on assuming infective virus concentrations of 10 5 to 10 6 per liter. Recent application of molecular methods suggests that some pathogenic viruses may be occurring in concentrations of 10 7 to 10 9 per liter. Factors influencing these levels include the development of molecular methods for virus detection, emergence of newly recognized viruses, decrease in per capita water use due to conservation measures, and outbreaks. Since neither cell culture nor molecular methods can assess all the potentially infectious virus in wastewater conservative estimates should be used to assess the virus load in untreated wastewater. This review indicates that an additional 2- to 3-log reduction of viruses above current recommendations may be needed to ensure the safety of recycled water. Information is needed on peak loading of viruses. In addition, more virus groups need to be quantified using better methods of virus quantification, including more accurate methods for measuring viral infectivity in order to better quantify risks from viruses in recycled water. Copyright © 2016 Elsevier Ltd. All rights reserved.
Modeling environmental contamination in hospital single- and four-bed rooms.
King, M-F; Noakes, C J; Sleigh, P A
2015-12-01
Aerial dispersion of pathogens is recognized as a potential transmission route for hospital acquired infections; however, little is known about the link between healthcare worker (HCW) contacts' with contaminated surfaces, the transmission of infections and hospital room design. We combine computational fluid dynamics (CFD) simulations of bioaerosol deposition with a validated probabilistic HCW-surface contact model to estimate the relative quantity of pathogens accrued on hands during six types of care procedures in two room types. Results demonstrate that care type is most influential (P < 0.001), followed by the number of surface contacts (P < 0.001) and the distribution of surface pathogens (P = 0.05). Highest hand contamination was predicted during Personal care despite the highest levels of hand hygiene. Ventilation rates of 6 ac/h vs. 4 ac/h showed only minor reductions in predicted hand colonization. Pathogens accrued on hands decreased monotonically after patient care in single rooms due to the physical barrier of bioaerosol transmission between rooms and subsequent hand sanitation. Conversely, contamination was predicted to increase during contact with patients in four-bed rooms due to spatial spread of pathogens. Location of the infectious patient with respect to ventilation played a key role in determining pathogen loadings (P = 0.05). We present the first quantitative model predicting the surface contacts by HCW and the subsequent accretion of pathogenic material as they perform standard patient care. This model indicates that single rooms may significantly reduce the risk of cross-contamination due to indirect infection transmission. Not all care types pose the same risks to patients, and housekeeping performed by HCWs may be an important contribution in the transmission of pathogens between patients. Ventilation rates and positioning of infectious patients within four-bed rooms can mitigate the accretion of pathogens, whereby reducing the risk of missed hand hygiene opportunities. The model provides a tool to quantitatively evaluate the influence of hospital room design on infection risk. © 2015 The Authors. Indoor Air Published by John Wiley & Sons Ltd.
Reduction of salmonella on valencia oranges by cold plasma treatment
USDA-ARS?s Scientific Manuscript database
Orange juice has been the source of recurrent food borne illness outbreaks, primarily associated with Salmonella. There is a need for antimicrobial interventions which can effectively eliminate pathogens from fruit surfaces and reduce the risk of cross-contamination during peeling and processing. To...
Highmore, Callum J; Warner, Jennifer C; Rothwell, Steve D; Wilks, Sandra A; Keevil, C William
2018-04-17
The microbiological safety of fresh produce is monitored almost exclusively by culture-based detection methods. However, bacterial food-borne pathogens are known to enter a viable-but-nonculturable (VBNC) state in response to environmental stresses such as chlorine, which is commonly used for fresh produce decontamination. Here, complete VBNC induction of green fluorescent protein-tagged Listeria monocytogenes and Salmonella enterica serovar Thompson was achieved by exposure to 12 and 3 ppm chlorine, respectively. The pathogens were subjected to chlorine washing following incubation on spinach leaves. Culture data revealed that total viable L. monocytogenes and Salmonella Thompson populations became VBNC by 50 and 100 ppm chlorine, respectively, while enumeration by direct viable counting found that chlorine caused a <1-log reduction in viability. The pathogenicity of chlorine-induced VBNC L. monocytogenes and Salmonella Thompson was assessed by using Caenorhabditis elegans Ingestion of VBNC pathogens by C. elegans resulted in a significant life span reduction ( P = 0.0064 and P < 0.0001), and no significant difference between the life span reductions caused by the VBNC and culturable L. monocytogenes treatments was observed. L. monocytogenes was visualized beyond the nematode intestinal lumen, indicating resuscitation and cell invasion. These data emphasize the risk that VBNC food-borne pathogens could pose to public health should they continue to go undetected. IMPORTANCE Many bacteria are known to enter a viable-but-nonculturable (VBNC) state in response to environmental stresses. VBNC cells cannot be detected by standard laboratory culture techniques, presenting a problem for the food industry, which uses these techniques to detect pathogen contaminants. This study found that chlorine, a sanitizer commonly used for fresh produce, induces a VBNC state in the food-borne pathogens Listeria monocytogenes and Salmonella enterica It was also found that chlorine is ineffective at killing total populations of the pathogens. A life span reduction was observed in Caenorhabditis elegans that ingested these VBNC pathogens, with VBNC L. monocytogenes as infectious as its culturable counterpart. These data show that VBNC food-borne pathogens can both be generated and avoid detection by industrial practices while potentially retaining the ability to cause disease. Copyright © 2018 Highmore et al.
Effect of the Cedar River on the quality of the ground-water supply for Cedar Rapids, Iowa
Schulmeyer, P.M.
1995-01-01
Above-normal streamflow and precipitation during the study could have increased the effect the river had on the alluvial aquifer and on the possibility of contamination by a pathogen. Microscopic particulate analysis of 29 samples found no Giardia cysts or Crytosporidium oocysts in water collected from municipal wells. Data also indicate that the aquifer is filtering out large numbers of algae, diatoms, rotifers, and nematodes as well as filtering out Cryptosporidium, Giardia, and other protozoa. The number of algae, diatoms, rotifers, protozoa, and vegetative debris for selected municipal wells tested showed at least a reduction to 1 per 1,000 of the number found in the river. A relative risk factor and a log-reduction rate were determined for the aquifer in the vicinity of selected wells. One municipal well had a high-risk factor, three other wells had a moderate-risk factor, and four wells had a low-risk factor. The filtering efficiency of the aquifer is equivalent to a 3 log-reduction rate or 99.99-percent reduction in particulates.
Pathogen reduction in human plasma using an ultrashort pulsed laser
USDA-ARS?s Scientific Manuscript database
Pathogen reduction is an ideal approach to ensure the continued safety of the blood supply against emerging pathogens. However, the currently licensed pathogen reduction techniques are ineffective against non-enveloped viruses, and they introduce chemicals with concerns of side effects which prevent...
Kobayashi, Naohiro; Oshiki, Mamoru; Ito, Toshihiro; Segawa, Takahiro; Hatamoto, Masashi; Kato, Tsuyoshi; Yamaguchi, Takashi; Kubota, Kengo; Takahashi, Masanobu; Iguchi, Akinori; Tagawa, Tadashi; Okubo, Tsutomu; Uemura, Shigeki; Harada, Hideki; Motoyama, Toshiki; Araki, Nobuo; Sano, Daisuke
2017-03-01
A down-flow hanging sponge (DHS) reactor has been developed as a cost-effective wastewater treatment system that is adaptable to local conditions in low-income countries. A pilot-scale DHS reactor previously demonstrated stable reduction efficiencies for chemical oxygen demand (COD) and ammonium nitrogen over a year at ambient temperature, but the pathogen reduction efficiency of the DHS reactor has yet to be investigated. In the present study, the reduction efficiency of a pilot-scale DHS reactor fed with municipal wastewater was investigated for 10 types of human pathogenic viruses (norovirus GI, GII and GIV, aichivirus, astrovirus, enterovirus, hepatitis A and E viruses, rotavirus, and sapovirus). DHS influent and effluent were collected weekly or biweekly for 337 days, and concentrations of viral genomes were determined by microfluidic quantitative PCR. Aichivirus, norovirus GI and GII, enterovirus, and sapovirus were frequently detected in DHS influent, and the log 10 reduction (LR) of these viruses ranged from 1.5 to 3.7. The LR values for aichivirus and norovirus GII were also calculated using a Bayesian estimation model, and the average LR (±standard deviation) values for aichivirus and norovirus GII were estimated to be 1.4 (±1.5) and 1.8 (±2.5), respectively. Quantitative microbial risk assessment was conducted to calculate a threshold reduction level for norovirus GII that would be required for the use of DHS effluent for agricultural irrigation, and it was found that LRs of 2.6 and 3.7 for norovirus GII in the DHS effluent were required in order to not exceed the tolerable burden of disease at 10 -4 and 10 -6 disability-adjusted life years loss per person per year, respectively, for 95% of the exposed population during wastewater reuse for irrigation. Copyright © 2016 Elsevier Ltd. All rights reserved.
Quantitative Microbial Risk Assessment for Escherichia coli O157:H7 in Fresh-Cut Lettuce.
Pang, Hao; Lambertini, Elisabetta; Buchanan, Robert L; Schaffner, Donald W; Pradhan, Abani K
2017-02-01
Leafy green vegetables, including lettuce, are recognized as potential vehicles for foodborne pathogens such as Escherichia coli O157:H7. Fresh-cut lettuce is potentially at high risk of causing foodborne illnesses, as it is generally consumed without cooking. Quantitative microbial risk assessments (QMRAs) are gaining more attention as an effective tool to assess and control potential risks associated with foodborne pathogens. This study developed a QMRA model for E. coli O157:H7 in fresh-cut lettuce and evaluated the effects of different potential intervention strategies on the reduction of public health risks. The fresh-cut lettuce production and supply chain was modeled from field production, with both irrigation water and soil as initial contamination sources, to consumption at home. The baseline model (with no interventions) predicted a mean probability of 1 illness per 10 million servings and a mean of 2,160 illness cases per year in the United States. All intervention strategies evaluated (chlorine, ultrasound and organic acid, irradiation, bacteriophage, and consumer washing) significantly reduced the estimated mean number of illness cases when compared with the baseline model prediction (from 11.4- to 17.9-fold reduction). Sensitivity analyses indicated that retail and home storage temperature were the most important factors affecting the predicted number of illness cases. The developed QMRA model provided a framework for estimating risk associated with consumption of E. coli O157:H7-contaminated fresh-cut lettuce and can guide the evaluation and development of intervention strategies aimed at reducing such risk.
Impacts of an introduced forest pathogen on the risk of Lyme disease in California.
Swei, Andrea; Briggs, Cheryl J; Lane, Robert S; Ostfeld, Richard S
2012-08-01
Global changes such as deforestation, climate change, and invasive species have the potential to greatly alter zoonotic disease systems through impacts on biodiversity. This study examined the impact of the invasive pathogen that causes sudden oak death (SOD) on the ecology of Lyme disease in California. The Lyme disease bacterium, Borrelia burgdorferi, is maintained in the far western United States by a suite of animal reservoirs including the dusky-footed woodrat (Neotoma fuscipes) and deer mouse (Peromyscus maniculatus), and is transmitted by the western black-legged tick (Ixodes pacificus). Other vertebrates, such as the western fence lizard (Sceloporus occidentalis), are important tick hosts but are not reservoirs of the pathogen. Previous work found that higher levels of SOD are correlated with greater abundance of P. maniculatus and S. occidentalis and lower N. fuscipes abundance. Here we model the contribution of these tick hosts to Lyme disease risk and also evaluate the potential impact of SOD on infection prevalence of the tick vector. By empirically parameterizing a static model with field and laboratory data on tick hosts, we predict that SOD reduces an important index of disease risk, nymphal infection prevalence, leading to a reduction in Lyme disease risk in certain coastal woodlands. Direct observational analysis of the impact of SOD on nymphal infection prevalence supports these model results. This study underscores the important direct and indirect impacts of invasive plant pathogens on biodiversity, the transmission cycles of zoonotic diseases, and ultimately human health.
Li, Yonghong; Arellano, Andre R; Bare, Lance A; Bender, Richard A; Strom, Charles M; Devlin, James J
2017-04-01
The National Comprehensive Cancer Network recommends that women who carry gene variants that confer substantial risk for breast cancer consider risk-reduction strategies, that is, enhanced surveillance (breast magnetic resonance imaging and mammography) or prophylactic surgery. Pathogenic variants can be detected in women with a family history of breast or ovarian cancer syndromes by multigene panel testing. To investigate whether using a seven-gene test to identify women who should consider risk-reduction strategies could cost-effectively increase life expectancy. We estimated effectiveness and lifetime costs from a payer perspective for two strategies in two hypothetical cohorts of women (40-year-old and 50-year-old cohorts) who meet the National Comprehensive Cancer Network-defined family history criteria for multigene testing. The two strategies were the usual test strategy for variants in BRCA1 and BRCA2 and the seven-gene test strategy for variants in BRCA1, BRCA2, TP53, PTEN, CDH1, STK11, and PALB2. Women found to have a pathogenic variant were assumed to undergo either prophylactic surgery or enhanced surveillance. The incremental cost-effectiveness ratio for the seven-gene test strategy compared with the BRCA1/2 test strategy was $42,067 per life-year gained or $69,920 per quality-adjusted life-year gained for the 50-year-old cohort and $23,734 per life-year gained or $48,328 per quality-adjusted life-year gained for the 40-year-old cohort. In probabilistic sensitivity analysis, the seven-gene test strategy cost less than $100,000 per life-year gained in 95.7% of the trials for the 50-year-old cohort. Testing seven breast cancer-associated genes, followed by risk-reduction management, could cost-effectively improve life expectancy for women at risk of hereditary breast cancer. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Partial Tmem106b reduction does not correct abnormalities due to progranulin haploinsufficiency.
Arrant, Andrew E; Nicholson, Alexandra M; Zhou, Xiaolai; Rademakers, Rosa; Roberson, Erik D
2018-06-22
Loss of function mutations in progranulin (GRN) are a major cause of frontotemporal dementia (FTD). Progranulin is a secreted glycoprotein that localizes to lysosomes and is critical for proper lysosomal function. Heterozygous GRN mutation carriers develop FTD with TDP-43 pathology and exhibit signs of lysosomal dysfunction in the brain, with increased levels of lysosomal proteins and lipofuscin accumulation. Homozygous GRN mutation carriers develop neuronal ceroid lipofuscinosis (NCL), an earlier-onset lysosomal storage disorder caused by severe lysosomal dysfunction. Multiple genome-wide association studies have shown that risk of FTD in GRN mutation carriers is modified by polymorphisms in TMEM106B, which encodes a lysosomal membrane protein. Risk alleles of TMEM106B may increase TMEM106B levels through a variety of mechanisms. Brains from FTD patients with GRN mutations exhibit increased TMEM106B expression, and protective TMEM106B polymorphisms are associated with decreased TMEM106B expression. Together, these data raise the possibility that reduction of TMEM106B levels may protect against the pathogenic effects of progranulin haploinsufficiency. We crossed Tmem106b +/- mice with Grn +/- mice, which model the progranulin haploinsufficiency of GRN mutation carriers and develop age-dependent social deficits and lysosomal abnormalities in the brain. We tested whether partial Tmem106b reduction could normalize the social deficits and lysosomal abnormalities of Grn +/- mice. Partial reduction of Tmem106b levels did not correct the social deficits of Grn +/- mice. Tmem106b reduction also failed to normalize most lysosomal abnormalities of Grn +/- mice, except for β-glucuronidase activity, which was suppressed by Tmem106b reduction and increased by progranulin insufficiency. These data do not support the hypothesis that Tmem106b reduction protects against the pathogenic effects of progranulin haploinsufficiency, but do show that Tmem106b reduction normalizes some lysosomal phenotypes in Grn +/- mice.
West, Allison H; Blazer, Kathleen R; Stoll, Jessica; Jones, Matthew; Weipert, Caroline M; Nielsen, Sarah M; Kupfer, Sonia S; Weitzel, Jeffrey N; Olopade, Olufunmilayo I
2018-02-14
Comprehensive genomic cancer risk assessment (GCRA) helps patients, family members, and providers make informed choices about cancer screening, surgical and chemotherapeutic risk reduction, and genetically targeted cancer therapies. The increasing availability of multigene panel tests for clinical applications allows testing of well-defined high-risk genes, as well as moderate-risk genes, for which the penetrance and spectrum of cancer risk are less well characterized. Moderate-risk genes are defined as genes that, when altered by a pathogenic variant, confer a 2 to fivefold relative risk of cancer. Two such genes included on many comprehensive cancer panels are the DNA repair genes ATM and CHEK2, best known for moderately increased risk of breast cancer development. However, the impact of screening and preventative interventions and spectrum of cancer risk beyond breast cancer associated with ATM and/or CHEK2 variants remain less well characterized. We convened a large, multidisciplinary, cross-sectional panel of GCRA clinicians to review challenging, peer-submitted cases of patients identified with ATM or CHEK2 variants. This paper summarizes the inter-professional case discussion and recommendations generated during the session, the level of concordance with respect to recommendations between the academic and community clinician participants for each case, and potential barriers to implementing recommended care in various practice settings.
Page, Declan; Dillon, Peter; Toze, Simon; Bixio, Davide; Genthe, Bettina; Jiménez Cisneros, Blanca Elena; Wintgens, Thomas
2010-03-01
A quantitative microbial risk assessment (QMRA) was performed at four managed aquifer recharge (MAR) sites (Australia, South Africa, Belgium, Mexico) where reclaimed wastewater and stormwater is recycled via aquifers for drinking water supplies, using the same risk-based approach that is used for public water supplies. For each of the sites, the aquifer treatment barrier was assessed for its log(10) removal capacity much like for other water treatment technologies. This information was then integrated into a broader risk assessment to determine the human health burden from the four MAR sites. For the Australian and South African cases, managing the aquifer treatment barrier was found to be critical for the schemes to have low risk. For the Belgian case study, the large treatment trains both in terms of pre- and post-aquifer recharge ensures that the risk is always low. In the Mexico case study, the risk was high due to the lack of pre-treatment and the low residence times of the recharge water in the aquifer. A further sensitivity analysis demonstrated that human health risk can be managed if aquifers are integrated into a treatment train to attenuate pathogens. However, reduction in human health disease burden (as measured in disability adjusted life years, DALYs) varied depending upon the number of pathogens in the recharge source water. The beta-Poisson dose response curve used for translating rotavirus and Cryptosporidium numbers into DALYs coupled with their slow environmental decay rates means poor quality injectant leads to aquifers having reduced value to reduce DALYs. For these systems, like the Mexican case study, longer residence times are required to meet their DALYs guideline for drinking water. Nevertheless the results showed that the risks from pathogens can still be reduced and recharging via an aquifer is safer than discharging directly into surface water bodies. Copyright 2009 Elsevier Ltd. All rights reserved.
Sexual Safety Planning as an HIV Prevention Strategy for Survivors of Domestic Violence.
Foster, Jill; Núñez, Ana; Spencer, Susan; Wolf, Judith; Robertson-James, Candace
2016-06-01
Victims of domestic violence (DV) are not only subject to physical and emotional abuse but may also be at increased risk for less recognized dangers from infection with human immunodeficiency virus (HIV) and other sexually transmitted pathogens. Because of the close link between DV and sexual risk, women need to be educated about the consequences of acquiring a life-threatening sexually transmitted infection, risk reduction measures, and how to access appropriate HIV services for diagnosis and treatment. It is therefore critical for DV workers to receive sufficient training about the link between DV and HIV risk so that sexual safety planning can be incorporated into activities with their clients in the same way as physical safety plans. In this article, we discuss how the Many Hands Working Together project provides interactive training for workers in DV and DV-affiliated agencies to increase their knowledge about HIV and teach sexual safety planning skills to achieve HIV risk reduction.
Sexual Safety Planning as an HIV Prevention Strategy for Survivors of Domestic Violence
Foster, Jill; Spencer, Susan; Wolf, Judith; Robertson-James, Candace
2016-01-01
Abstract Victims of domestic violence (DV) are not only subject to physical and emotional abuse but may also be at increased risk for less recognized dangers from infection with human immunodeficiency virus (HIV) and other sexually transmitted pathogens. Because of the close link between DV and sexual risk, women need to be educated about the consequences of acquiring a life-threatening sexually transmitted infection, risk reduction measures, and how to access appropriate HIV services for diagnosis and treatment. It is therefore critical for DV workers to receive sufficient training about the link between DV and HIV risk so that sexual safety planning can be incorporated into activities with their clients in the same way as physical safety plans. In this article, we discuss how the Many Hands Working Together project provides interactive training for workers in DV and DV-affiliated agencies to increase their knowledge about HIV and teach sexual safety planning skills to achieve HIV risk reduction. PMID:26595667
Pangloli, Philipus; Hung, Yen-Con; Beuchat, Larry R; King, C Harold; Zhao, Zhi-Hui
2009-09-01
Treatment of fresh fruits and vegetables with electrolyzed water (EW) has been shown to kill or reduce foodborne pathogens. We evaluated the efficacy of EW in killing Escherichia coli O157:H7 on iceberg lettuce, cabbage, lemons, and tomatoes by using washing and/or chilling treatments simulating those followed in some food service kitchens. Greatest reduction levels on lettuce were achieved by sequentially washing with 14-A (amperage) acidic EW (AcEW) for 15 or 30 s followed by chilling in 16-A AcEW for 15 min. This procedure reduced the pathogen by 2.8 and 3.0 log CFU per leaf, respectively, whereas washing and chilling with tap water reduced the pathogen by 1.9 and 2.4 log CFU per leaf. Washing cabbage leaves for 15 or 30 s with tap water or 14-A AcEW reduced the pathogen by 2.0 and 3.0 log CFU per leaf and 2.5 to 3.0 log CFU per leaf, respectively. The pathogen was reduced by 4.7 log CFU per lemon by washing with 14-A AcEW and 4.1 and 4.5 log CFU per lemon by washing with tap water for 15 or 30 s. A reduction of 5.3 log CFU per lemon was achieved by washing with 14-A alkaline EW for 15 s prior to washing with 14-A AcEW for 15 s. Washing tomatoes with tap water or 14-A AcEW for 15 s reduced the pathogen by 6.4 and 7.9 log CFU per tomato, respectively. Application of AcEW using procedures mimicking food service operations should help minimize cross-contamination and reduce the risk of E. coli O157:H7 being present on produce at the time of consumption.
The application of quantitative risk assessment to microbial food safety risks.
Jaykus, L A
1996-01-01
Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been done with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. Each of the sequential steps in QRA are discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data limitations to date. Conclusions include a brief discussion of subsequent uncertainty and risk analysis methodologies, and a commentary on present and future applications of QRA in the management of the public health risks associated with the presence of pathogenic microorganisms in the food supply.
USDA-ARS?s Scientific Manuscript database
Background: The majority of human infections with H5N1 high pathogenicity avian influenza (HPAI) virus have occurred in the village setting of developing countries with the primary exposure risk being direct contact with live or dead poultry in the household or neighborhood. In Egypt, the majority o...
Billy, T J; Wachsmuth, I K
1997-08-01
Recent outbreaks of foodborne illness and studies by expert groups have established the need for fundamental change in the United States meat and poultry inspection programme to reduce the risk of foodborne illness. The Food Safety and Inspection Service (FSIS) of the United States Department of Agriculture (USDA) has embarked on a broad effort to bring about such change, with particular emphasis on the reduction of pathogenic micro-organisms in raw meat and poultry products. The publication on 25 July 1996 of the Final Rule on pathogen reduction and hazard analysis and critical control point (HACCP) systems was a major milestone in the FSIS strategy for change. The Final Rule provides a framework for change and clarifies the respective roles of industry and government in ensuring the safety of meat and poultry products. With the implementation of this Final Rule underway, the FSIS has been exploring ways in which slaughter inspection carried out under an HACCP-based system can be changed so that food safety risks are addressed more adequately and the allocation of inspection resources is improved further. In addition, the FSIS is broadening the focus of food safety activities to extend beyond slaughter and processing plants by working with industry, academia and other government agencies. Such co-operation should lead to the development of measures to improve food safety before animals reach the slaughter plant and after products leave the inspected establishment for distribution to the retail level. For the future, the FSIS believes that quantitative risk assessments will be at the core of food safety activities. Risk assessments provide the most effective means of identifying how specific pathogens and other hazards may be encountered throughout the farm-to-table chain and of measuring the potential impact of various interventions. In addition, these assessments will be used in the development and evaluation of HACCP systems. The FSIS is currently conducting a quantitative risk assessment for eggs, and several surveys and studies are being performed to supply data needed to conduct other risk assessments. The FSIS has established a food safety research agenda which will fill data gaps.
Pathogen Reduction in Human Plasma Using an Ultrashort Pulsed Laser
Tsen, Shaw-Wei D.; Kingsley, David H.; Kibler, Karen; Jacobs, Bert; Sizemore, Sara; Vaiana, Sara M.; Anderson, Jeanne; Tsen, Kong-Thon; Achilefu, Samuel
2014-01-01
Pathogen reduction is a viable approach to ensure the continued safety of the blood supply against emerging pathogens. However, the currently licensed pathogen reduction techniques are ineffective against non-enveloped viruses such as hepatitis A virus, and they introduce chemicals with concerns of side effects which prevent their widespread use. In this report, we demonstrate the inactivation of both enveloped and non-enveloped viruses in human plasma using a novel chemical-free method, a visible ultrashort pulsed laser. We found that laser treatment resulted in 2-log, 1-log, and 3-log reductions in human immunodeficiency virus, hepatitis A virus, and murine cytomegalovirus in human plasma, respectively. Laser-treated plasma showed ≥70% retention for most coagulation factors tested. Furthermore, laser treatment did not alter the structure of a model coagulation factor, fibrinogen. Ultrashort pulsed lasers are a promising new method for chemical-free, broad-spectrum pathogen reduction in human plasma. PMID:25372037
Pohler, Petra; Müller, Meike; Winkler, Carla; Schaudien, Dirk; Sewald, Katherina; Müller, Thomas H; Seltsam, Axel
2015-02-01
Residual white blood cells (WBCs) in cellular blood components induce a variety of adverse immune events, including nonhemolytic febrile transfusion reactions, alloimmunization to HLA antigens, and transfusion-associated graft-versus-host disease (TA-GVHD). Pathogen reduction (PR) methods such as the ultraviolet C (UVC) light-based THERAFLEX UV-Platelets system were developed to reduce the risk of transfusion-transmitted infection. As UVC light targets nucleic acids, it interferes with the replication of both pathogens and WBCs. This preclinical study aimed to evaluate the ability of UVC light to inactivate contaminating WBCs in platelet concentrates (PCs). The in vitro and in vivo function of WBCs from UVC-treated PCs was compared to that of WBCs from gamma-irradiated and untreated PCs by measuring cell viability, proliferation, cytokine secretion, antigen presentation in vitro, and xenogeneic GVHD responses in a humanized mouse model. UVC light was at least as effective as gamma irradiation in preventing GVHD in the mouse model. It was more effective in suppressing T-cell proliferation (>5-log reduction in the limiting dilution assay), cytokine secretion, and antigen presentation than gamma irradiation. The THERAFLEX UV-Platelets (MacoPharma) PR system can substitute gamma irradiation for TA-GVHD prophylaxis in platelet (PLT) transfusion. Moreover, UVC treatment achieves suppression of antigen presentation and inhibition of cytokine accumulation during storage of PCs, which has potential benefits for transfusion recipients. © 2014 AABB.
NASA Astrophysics Data System (ADS)
Bergion, Viktor; Sokolova, Ekaterina; Åström, Johan; Lindhe, Andreas; Sörén, Kaisa; Rosén, Lars
2017-01-01
Waterborne outbreaks of gastrointestinal diseases are of great concern to drinking water producers and can give rise to substantial costs to the society. The World Health Organisation promotes an approach where the emphasis is on mitigating risks close to the contamination source. In order to handle microbial risks efficiently, there is a need for systematic risk management. In this paper we present a framework for microbial risk management of drinking water systems. The framework incorporates cost-benefit analysis as a decision support method. The hydrological Soil and Water Assessment Tool (SWAT) model, which was set up for the Stäket catchment area in Sweden, was used to simulate the effects of four different mitigation measures on microbial concentrations. The modelling results showed that the two mitigation measures that resulted in a significant (p < 0.05) reduction of Cryptosporidium spp. and Escherichia coli concentrations were a vegetative filter strip linked to cropland and improved treatment (by one Log10 unit) at the wastewater treatment plants. The mitigation measure with a vegetative filter strip linked to grazing areas resulted in a significant reduction of Cryptosporidium spp., but not of E. coli concentrations. The mitigation measure with enhancing the removal efficiency of all on-site wastewater treatment systems (total removal of 2 Log10 units) did not achieve any significant reduction of E. coli or Cryptosporidium spp. concentrations. The SWAT model was useful when characterising the effect of different mitigation measures on microbial concentrations. Hydrological modelling implemented within an appropriate risk management framework is a key decision support element as it identifies the most efficient alternative for microbial risk reduction.
40 CFR 503.15 - Operational standards-pathogens and vector attraction reduction.
Code of Federal Regulations, 2011 CFR
2011-07-01
... vector attraction reduction. 503.15 Section 503.15 Protection of Environment ENVIRONMENTAL PROTECTION... § 503.15 Operational standards—pathogens and vector attraction reduction. (a) Pathogens—sewage sludge... reclamation site. (c) Vector attraction reduction—sewage sludge. (1) One of the vector attraction reduction...
Pathogen-reduced platelets for the prevention of bleeding
Estcourt, Lise J; Malouf, Reem; Hopewell, Sally; Trivella, Marialena; Doree, Carolyn; Stanworth, Simon J; Murphy, Michael F
2017-01-01
Background Platelet transfusions are used to prevent and treat bleeding in people who are thrombocytopenic. Despite improvements in donor screening and laboratory testing, a small risk of viral, bacterial, or protozoal contamination of platelets remains. There is also an ongoing risk from newly emerging blood transfusion-transmitted infections for which laboratory tests may not be available at the time of initial outbreak. One solution to reduce the risk of blood transfusion-transmitted infections from platelet transfusion is photochemical pathogen reduction, in which pathogens are either inactivated or significantly depleted in number, thereby reducing the chance of transmission. This process might offer additional benefits, including platelet shelf-life extension, and negate the requirement for gamma-irradiation of platelets. Although current pathogen-reduction technologies have been proven to reduce pathogen load in platelet concentrates, a number of published clinical studies have raised concerns about the effectiveness of pathogen-reduced platelets for post-transfusion platelet count recovery and the prevention of bleeding when compared with standard platelets. This is an update of a Cochrane review first published in 2013. Objectives To assess the effectiveness of pathogen-reduced platelets for the prevention of bleeding in people of any age requiring platelet transfusions. Search methods We searched for randomised controlled trials (RCTs) in the Cochrane Central Register of Controlled Trials (CENTRAL) (the Cochrane Library 2016, Issue 9), MEDLINE (from 1946), Embase (from 1974), CINAHL (from 1937), the Transfusion Evidence Library (from 1950), and ongoing trial databases to 24 October 2016. Selection criteria We included RCTs comparing the transfusion of pathogen-reduced platelets with standard platelets, or comparing different types of pathogen-reduced platelets. Data collection and analysis We used the standard methodological procedures expected by Cochrane. Main results We identified five new trials in this update of the review. A total of 15 trials were eligible for inclusion in this review, 12 completed trials (2075 participants) and three ongoing trials. Ten of the 12 completed trials were included in the original review. We did not identify any RCTs comparing the transfusion of one type of pathogen-reduced platelets with another. Nine trials compared Intercept® pathogen-reduced platelets to standard platelets, two trials compared Mirasol® pathogen-reduced platelets to standard platelets; and one trial compared both pathogen-reduced platelets types to standard platelets. Three RCTs were randomised cross-over trials, and nine were parallel-group trials. Of the 2075 participants enrolled in the trials, 1981 participants received at least one platelet transfusion (1662 participants in Intercept® platelet trials and 319 in Mirasol® platelet trials). One trial included children requiring cardiac surgery (16 participants) or adults requiring a liver transplant (28 participants). All of the other participants were thrombocytopenic individuals who had a haematological or oncological diagnosis. Eight trials included only adults. Four of the included studies were at low risk of bias in every domain, while the remaining eight included studies had some threats to validity. Overall, the quality of the evidence was low to high across different outcomes according to GRADE methodology. We are very uncertain as to whether pathogen-reduced platelets increase the risk of any bleeding (World Health Organization (WHO) Grade 1 to 4) (5 trials, 1085 participants; fixed-effect risk ratio (RR) 1.09, 95% confidence interval (CI) 1.02 to 1.15; I2 = 59%, random-effect RR 1.14, 95% CI 0.93 to 1.38; I2 = 59%; low-quality evidence). There was no evidence of a difference between pathogen-reduced platelets and standard platelets in the incidence of clinically significant bleeding complications (WHO Grade 2 or higher) (5 trials, 1392 participants; RR 1.10, 95% CI 0.97 to 1.25; I2 = 0%; moderate-quality evidence), and there is probably no difference in the risk of developing severe bleeding (WHO Grade 3 or higher) (6 trials, 1495 participants; RR 1.24, 95% CI 0.76 to 2.02; I2 = 32%; moderate-quality evidence). There is probably no difference between pathogen-reduced platelets and standard platelets in the incidence of all-cause mortality at 4 to 12 weeks (6 trials, 1509 participants; RR 0.81, 95% CI 0.50 to 1.29; I2 = 26%; moderate-quality evidence). There is probably no difference between pathogen-reduced platelets and standard platelets in the incidence of serious adverse events (7 trials, 1340 participants; RR 1.09, 95% CI 0.88 to 1.35; I2 = 0%; moderate-quality evidence). However, no bacterial transfusion-transmitted infections occurred in the six trials that reported this outcome. Participants who received pathogen-reduced platelet transfusions had an increased risk of developing platelet refractoriness (7 trials, 1525 participants; RR 2.94, 95% CI 2.08 to 4.16; I2 = 0%; high-quality evidence), though the definition of platelet refractoriness differed between trials. Participants who received pathogen-reduced platelet transfusions required more platelet transfusions (6 trials, 1509 participants; mean difference (MD) 1.23, 95% CI 0.86 to 1.61; I2 = 27%; high-quality evidence), and there was probably a shorter time interval between transfusions (6 trials, 1489 participants; MD -0.42, 95% CI -0.53 to -0.32; I2 = 29%; moderate-quality evidence). Participants who received pathogen-reduced platelet transfusions had a lower 24-hour corrected-count increment (7 trials, 1681 participants; MD -3.02, 95% CI -3.57 to -2.48; I2 = 15%; high-quality evidence). None of the studies reported quality of life. We did not evaluate any economic outcomes. There was evidence of subgroup differences in multiple transfusion trials between the two pathogen-reduced platelet technologies assessed in this review (Intercept® and Mirasol®) for all-cause mortality and the interval between platelet transfusions (favouring Intercept®). Authors' conclusions Findings from this review were based on 12 trials, and of the 1981 participants who received a platelet transfusion only 44 did not have a haematological or oncological diagnosis. In people with haematological or oncological disorders who are thrombocytopenic due to their disease or its treatment, we found high-quality evidence that pathogen-reduced platelet transfusions increase the risk of platelet refractoriness and the platelet transfusion requirement. We found moderate-quality evidence that pathogen-reduced platelet transfusions do not affect all-cause mortality, the risk of clinically significant or severe bleeding, or the risk of a serious adverse event. There was insufficient evidence for people with other diagnoses. All three ongoing trials are in adults (planned recruitment 1375 participants) with a haematological or oncological diagnosis. PMID:28756627
Pearce, L E; Smythe, B W; Crawford, R A; Oakley, E; Hathaway, S C; Shepherd, J M
2012-01-01
This is the first study to report kinetic data on the survival of a range of significant milk-borne pathogens under commercial-type pasteurization conditions. The most heat-resistant strain of each of the milk-borne pathogens Staphylococcus aureus, Yersinia enterocolitica, pathogenic Escherichia coli, Cronobacter sakazakii (formerly known as Enterobacter sakazakii), Listeria monocytogenes, and Salmonella was selected to obtain the worst-case scenario in heat inactivation trials using a pilot-plant-scale pasteurizer. Initially, approximately 30 of each species were screened using a submerged coil unit. Then, UHT milk was inoculated with the most heat-resistant pathogens at ~10(7)/mL and heat treated in a pilot-plant-scale pasteurizer under commercial-type conditions of turbulent flow for 15s over a temperature range from 56 to 66°C and at 72°C. Survivors were enumerated on nonselective media chosen for the highest efficiency of plating of heat-damaged bacteria of each of the chosen strains. The mean log(10) reductions and temperatures of inactivation of the 6 pathogens during a 15-s treatment were Staph. aureus >6.7 at 66.5°C, Y. enterocolitica >6.8 at 62.5°C, pathogenic E. coli >6.8 at 65°C, C. sakazakii >6.7 at 67.5°C, L. monocytogenes >6.9 at 65.5°C, and Salmonella ser. Typhimurium >6.9 at 61.5°C. The kinetic data from these experiments will be used by the New Zealand Ministry of Agriculture and Forestry to populate the quantitative risk assessment model being developed to investigate the risks to New Zealand consumers from pasteurized, compared with nonpasteurized, milk and milk products. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
40 CFR 503.25 - Operational standards-pathogens and vector attraction reduction.
Code of Federal Regulations, 2010 CFR
2010-07-01
... vector attraction reduction. 503.25 Section 503.25 Protection of Environment ENVIRONMENTAL PROTECTION... § 503.25 Operational standards—pathogens and vector attraction reduction. (a) Pathogens—sewage sludge... active sewage sludge unit, unless the vector attraction reduction requirement in § 503.33(b)(11) is met...
40 CFR 503.25 - Operational standards-pathogens and vector attraction reduction.
Code of Federal Regulations, 2011 CFR
2011-07-01
... vector attraction reduction. 503.25 Section 503.25 Protection of Environment ENVIRONMENTAL PROTECTION... § 503.25 Operational standards—pathogens and vector attraction reduction. (a) Pathogens—sewage sludge... active sewage sludge unit, unless the vector attraction reduction requirement in § 503.33(b)(11) is met...
Hygienic effects and gas production of plastic bio-digesters under tropical conditions.
Yen-Phi, Vo Thi; Clemens, Joachim; Rechenburg, Andrea; Vinneras, Björn; Lenssen, Christina; Kistemann, Thomas
2009-12-01
Plastic plug-flow bio-digesters have been promoted as a good option for improved treatment of manure and wastewater in developing countries although minimal information has been published on their hygienic status. This bench-scale study replicates bio-digester conditions to evaluate the reduction of pathogen and indicator microorganisms at three different hydraulic retention times (HRT) in the anaerobic treatment of pig manures at 30 degrees C for 50 days. Results showed that physicochemical values differed between HRTs. Gas production efficiency was better for longer HRTS. The accumulated sludge at the reactor's base increased with longer HRT. Phages and bacteria examined were reduced, but none was completely eliminated. Log10 reduction of bacteria ranged from 0.54 to 2.47. Phages ranged from 1.60 to 3.42. The reduction of organisms at HRT = 30 days was about one log10 unit higher than HRT = 15 days and about two log10 units higher than HRT = 3 days. The results indicate that the reduction of tested organisms increases with HRT. However the hygienic quality of the liquid effluent does not meet required quality values for surface and irrigation water. Longer HRTs are recommended to increase gas yield and achieve higher pathogen reduction. More barriers should be applied while handling bio-digester outputs to minimise risks to environmental and human health.
Shibata, A; Hiono, T; Fukuhara, H; Sumiyoshi, R; Ohkawara, A; Matsuno, K; Okamatsu, M; Osaka, H; Sakoda, Y
2018-04-01
The transportation of poultry and related products for international trade contributes to transboundary pathogen spread and disease outbreaks worldwide. To prevent pathogen incursion through poultry products, many countries have regulations about animal health and poultry product quarantine. However, in Japan, animal products have been illegally introduced into the country in baggage and confiscated at the airport. Lately, the number of illegally imported poultry and the incursion risk of transboundary pathogens through poultry products have been increasing. In this study, we isolated avian influenza viruses (AIVs) from raw poultry products illegally imported to Japan by international passengers. Highly (H5N1 and H5N6) and low (H9N2 and H1N2) pathogenic AIVs were isolated from raw chicken and duck products carried by flight passengers. H5 and H9 isolates were phylogenetically closely related to viruses isolated from poultry in China, and haemagglutinin genes of H5N1 and H5N6 isolates belonged to clades 2.3.2.1c and 2.3.4.4, respectively. Experimental infections of H5 and H9 isolates in chickens and ducks demonstrated pathogenicity and tissue tropism to skeletal muscles. To prevent virus incursion by poultry products, it is important to encourage the phased cleaning based on the disease control and eradication and promote the reduction in contamination risk in animal products. © 2017 Blackwell Verlag GmbH.
Weaver, J Todd; Malladi, Sasidhar; Spackman, Erica; Swayne, David E
2015-11-01
Control of highly pathogenic avian influenza (HPAI) outbreaks in poultry has traditionally involved the establishment of disease containment zones, where poultry products are only permitted to move from within a zone under permit. Nonpasteurized liquid egg (NPLE) is one such commodity for which movements may be permitted, considering inactivation of HPAI virus via pasteurization. Active surveillance testing at the flock level, using targeted matrix gene real-time reversed transcriptase-polymerase chain reaction testing (RRT-PCR) has been incorporated into HPAI emergency response plans as the primary on-farm diagnostic test procedure to detect HPAI in poultry and is considered to be a key risk mitigation measure. To inform decisions regarding the potential movement of NPLE to a pasteurization facility, average HPAI virus concentrations in NPLE produced from a HPAI virus infected, but undetected, commercial table-egg-layer flock were estimated for three HPAI virus strains using quantitative simulation models. Pasteurization under newly proposed international design standards (5 log10 reduction) is predicted to inactivate HPAI virus in NPLE to a very low concentration of less than 1 embryo infectious dose (EID)50 /mL, considering the predicted virus titers in NPLE from a table-egg flock under active surveillance. Dilution of HPAI virus from contaminated eggs in eggs from the same flock, and in a 40,000 lb tanker-truck load of NPLE containing eggs from disease-free flocks was also considered. Risk assessment can be useful in the evaluation of commodity-specific risk mitigation measures to facilitate safe trade in animal products from countries experiencing outbreaks of highly transmissible animal diseases. © 2015 Society for Risk Analysis.
Significance of Infectious Agents in Colorectal Cancer Development
Antonic, Vlado; Stojadinovic, Alexander; Kester, Kent E.; Weina, Peter J; Brücher, Björn LDM; Protic, Mladjan; Avital, Itzhak; Izadjoo, Mina
2013-01-01
Colorectal cancer (CRC) is a major burden to healthcare systems worldwide accounting for approximately one million of new cancer cases worldwide. Even though, CRC mortality has decreased over the last 20 years, it remains the third most common cause of cancer-related mortality, accounting for approximately 600,000 deaths in 2008 worldwide. A multitude of risk factors have been linked to CRC, including hereditary factors, environmental factors and inflammatory syndromes affecting the gastrointestinal tract. Recently, various pathogens were added to the growing list of risk factors for a number of common epithelial cancers, but despite the multitude of correlative studies, only suggestions remain about the possible relationship between selected viruses and bacteria of interest and the CRC risk. United States military service members are exposed to various risk factors impacting the incidence of cancer development. These exposures are often different from that of many sectors of the civilian population. Thereby, cancer risk identification, screening and early detection are imperative for both the military health care beneficiaries and the population as a whole. In this review, we will focus on several pathogens and their potential roles in development of CRC, highlighting the clinical trials evaluating this correlation and provide our personal opinion about the importance of risk reduction, health promotion and disease prevention for military health care beneficiaries. PMID:23459622
Hand hygiene regimens for the reduction of risk in food service environments.
Edmonds, Sarah L; McCormack, Robert R; Zhou, Sifang Steve; Macinga, David R; Fricker, Christopher M
2012-07-01
Pathogenic strains of Escherichia coli and human norovirus are the main etiologic agents of foodborne illness resulting from inadequate hand hygiene practices by food service workers. This study was conducted to evaluate the antibacterial and antiviral efficacy of various hand hygiene product regimens under different soil conditions representative of those in food service settings and assess the impact of product formulation on this efficacy. On hands contaminated with chicken broth containing E. coli, representing a moderate soil load, a regimen combining an antimicrobial hand washing product with a 70% ethanol advanced formula (EtOH AF) gel achieved a 5.22-log reduction, whereas a nonantimicrobial hand washing product alone achieved a 3.10log reduction. When hands were heavily soiled from handling ground beef containing E. coli, a wash-sanitize regimen with a 0.5% chloroxylenol antimicrobial hand washing product and the 70% EtOH AF gel achieved a 4.60-log reduction, whereas a wash-sanitize regimen with a 62% EtOH foam achieved a 4.11-log reduction. Sanitizing with the 70% EtOH AF gel alone was more effective than hand washing with a nonantimicrobial product for reducing murine norovirus (MNV), a surrogate for human norovirus, with 2.60- and 1.79-log reductions, respectively. When combined with hand washing, the 70% EtOH AF gel produced a 3.19-log reduction against MNV. A regimen using the SaniTwice protocol with the 70% EtOH AF gel produced a 4.04-log reduction against MNV. These data suggest that although the process of hand washing helped to remove pathogens from the hands, use of a wash-sanitize regimen was even more effective for reducing organisms. Use of a high-efficacy sanitizer as part of a wash-sanitize regimen further increased the efficacy of the regimen. The use of a well-formulated alcohol-based hand rub as part of a wash-sanitize regimen should be considered as a means to reduce risk of infection transmission in food service facilities.
Selective decontamination of the digestive tract.
Krueger, Wolfgang A; Unertl, Klaus E
2002-04-01
Ventilator-associated pneumonia usually originates from the patient's oropharyngeal microflora. In selective digestive decontamination, topical antibiotics are applied to the oropharynx and stomach for prevention of pneumonia and other infections, possibly reducing infection-related mortality. Selective digestive decontamination is also used for the prevention of gut-derived infections in acute necrotizing pancreatitis and liver transplantation. Despite numerous clinical trials, selective digestive decontamination remains controversial. Reduction of the incidence of pneumonia is accepted, but the extent of reduction is debated. Mortality was not reduced in most individual trials, but this finding was calculated in meta-analyses, especially for combined use of topical and systemic antibiotics in surgical ICU patients. Some investigators reported increased resistance and a shift to Gram-positive pathogens. Today, it appears that selective means not only selective suppression of pathogenic bacteria but also selection of appropriate groups of patients for underlying diseases and severity of illness, and selection of ICUs, where the endemic resistance patterns might allow the use of selective digestive decontamination at a relatively low risk for increased selection pressure.
Totton, Sarah C; Glanville, Julie M; Dzikamunhenga, Rungano S; Dickson, James S; O'Connor, Annette M
2016-06-01
In this systematic review, we summarized change in Salmonella prevalence and/or quantity associated with pathogen reduction treatments (washes, sprays, steam) on pork carcasses or skin-on carcass parts in comparative designs (natural or artificial contamination). In January 2015, CAB Abstracts (1910-2015), SCI and CPCI-Science (1900-2015), Medline® and Medline® In-Process (1946-2015) (OVIDSP), Science.gov, and Safe Pork (1996-2012) were searched with no language or publication type restrictions. Reference lists of 24 review articles were checked. Two independent reviewers screened 4001 titles/abstracts and assessed 122 full-text articles for eligibility. Only English-language records were extracted. Fourteen studies (5 in commercial abattoirs) were extracted and risk of bias was assessed by two reviewers independently. Risk of bias due to systematic error was moderate; a major source of bias was the potential differential recovery of Salmonella from treated carcasses due to knowledge of the intervention. The most consistently observed association was a positive effect of acid washes on categorical measures of Salmonella; however, this was based on individual results, not a summary effect measure. There was no strong evidence that any one intervention protocol (acid temperature, acid concentration, water temperature) was clearly superior to others for Salmonella control.
Steinmann, Eike; Gravemann, Ute; Friesland, Martina; Doerrbecker, Juliane; Müller, Thomas H; Pietschmann, Thomas; Seltsam, Axel
2013-05-01
Contamination of blood products with hepatitis C virus (HCV) can cause infections resulting in acute and chronic liver diseases. Pathogen reduction methods such as photodynamic treatment with methylene blue (MB) plus visible light as well as irradiation with shortwave ultraviolet (UVC) light were developed to inactivate viruses and other pathogens in plasma and platelet concentrates (PCs), respectively. So far, their inactivation capacities for HCV have only been tested in inactivation studies using model viruses for HCV. Recently, a HCV infection system for the propagation of infectious HCV in cell culture was developed. Inactivation studies were performed with cell culture-derived HCV and bovine viral diarrhea virus (BVDV), a model for HCV. Plasma units or PCs were spiked with high titers of cell culture-grown viruses. After treatment of the blood units with MB plus light (Theraflex MB-Plasma system, MacoPharma) or UVC (Theraflex UV-Platelets system, MacoPharma), residual viral infectivity was assessed using sensitive cell culture systems. HCV was sensitive to inactivation by both pathogen reduction procedures. HCV in plasma was efficiently inactivated by MB plus light below the detection limit already by 1/12 of the full light dose. HCV in PCs was inactivated by UVC irradiation with a reduction factor of more than 5 log. BVDV was less sensitive to the two pathogen reduction methods. Functional assays with human HCV offer an efficient tool to directly assess the inactivation capacity of pathogen reduction procedures. Pathogen reduction technologies such as MB plus light treatment and UVC irradiation have the potential to significantly reduce transfusion-transmitted HCV infections. © 2012 American Association of Blood Banks.
Pathogen reduction co-benefits of nutrient best management practices
Wainger, Lisa A.; Barber, Mary C.
2016-01-01
Background Many of the practices currently underway to reduce nitrogen, phosphorus, and sediment loads entering the Chesapeake Bay have also been observed to support reduction of disease-causing pathogen loadings. We quantify how implementation of these practices, proposed to meet the nutrient and sediment caps prescribed by the Total Maximum Daily Load (TMDL), could reduce pathogen loadings and provide public health co-benefits within the Chesapeake Bay system. Methods We used published data on the pathogen reduction potential of management practices and baseline fecal coliform loadings estimated as part of prior modeling to estimate the reduction in pathogen loadings to the mainstem Potomac River and Chesapeake Bay attributable to practices implemented as part of the TMDL. We then compare the estimates with the baseline loadings of fecal coliform loadings to estimate the total pathogen reduction potential of the TMDL. Results We estimate that the TMDL practices have the potential to decrease disease-causing pathogen loads from all point and non-point sources to the mainstem Potomac River and the entire Chesapeake Bay watershed by 19% and 27%, respectively. These numbers are likely to be underestimates due to data limitations that forced us to omit some practices from analysis. Discussion Based on known impairments and disease incidence rates, we conclude that efforts to reduce nutrients may create substantial health co-benefits by improving the safety of water-contact recreation and seafood consumption. PMID:27904807
Pathogen reduction co-benefits of nutrient best management practices.
Richkus, Jennifer; Wainger, Lisa A; Barber, Mary C
2016-01-01
Many of the practices currently underway to reduce nitrogen, phosphorus, and sediment loads entering the Chesapeake Bay have also been observed to support reduction of disease-causing pathogen loadings. We quantify how implementation of these practices, proposed to meet the nutrient and sediment caps prescribed by the Total Maximum Daily Load (TMDL), could reduce pathogen loadings and provide public health co-benefits within the Chesapeake Bay system. We used published data on the pathogen reduction potential of management practices and baseline fecal coliform loadings estimated as part of prior modeling to estimate the reduction in pathogen loadings to the mainstem Potomac River and Chesapeake Bay attributable to practices implemented as part of the TMDL. We then compare the estimates with the baseline loadings of fecal coliform loadings to estimate the total pathogen reduction potential of the TMDL. We estimate that the TMDL practices have the potential to decrease disease-causing pathogen loads from all point and non-point sources to the mainstem Potomac River and the entire Chesapeake Bay watershed by 19% and 27%, respectively. These numbers are likely to be underestimates due to data limitations that forced us to omit some practices from analysis. Based on known impairments and disease incidence rates, we conclude that efforts to reduce nutrients may create substantial health co-benefits by improving the safety of water-contact recreation and seafood consumption.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-24
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2001-D-0066] (Formerly Docket No. 2001D-0107) Expedited Review for New Animal Drug Applications for Human Pathogen... Review for New Animal Drug Applications for Human Pathogen Reduction Claims.'' The guidance predates the...
Tormey, Christopher A; Santhanakrishnan, Manjula; Smith, Nicole H; Liu, Jingchun; Marschner, Susanne; Goodrich, Raymond P; Hendrickson, Jeanne E
2016-04-01
Ultraviolet (UV) illumination/pathogen reduction effectively inactivates white blood cells (WBCs) in whole blood. Given that cotransfused WBCs may impact recipient immune responses, we hypothesized that pathogen reduction of whole blood may alter responses to RBC antigens. Transgenic mice expressing a model (HOD) antigen, authentic human (hGPA or KEL) antigens, or natural fluorescence (uGFP) on their RBCs were utilized as blood donors. Recipients were transfused with fresh whole blood to which riboflavin had been added or fresh whole blood treated by UV illumination/pathogen reduction treatment after the addition of riboflavin. Posttransfusion RBC recovery, survival, and alloimmunization were measured by flow cytometry. UV illumination/pathogen reduction treatment did not alter RBC antigen expression, and recipients of treated syngeneic RBCs had persistently negative direct antiglobulin tests. Greater than 75% of treated and untreated syngeneic RBCs were recovered 24 hours posttransfusion in all experiments, although alterations in the long-term posttransfusion survival of treated RBCs were observed. Treated and untreated KEL RBCs induced similar recipient alloimmune responses, with all recipients making anti-KEL glycoprotein immunoglobulins (p > 0.05). Alloimmune responses to treated HOD or hGPA RBCs were no different from untreated RBCs (p > 0.05). Pathogen inactivation treatment of fresh whole murine blood with riboflavin and UV illumination does not impact the rate or magnitude of RBC alloimmunization to three distinct RBC antigens. Further, UV illumination/pathogen reduction appears safe from an immunohematologic standpoint, with no immunogenic neoantigens detected on treated murine RBCs. Future studies with fresh and stored human RBCs are warranted to confirm these findings. © 2015 AABB.
Long, Elizabeth Y; Finke, Deborah L
2015-04-01
A widely cited benefit of predator diversity is greater suppression of insect herbivores, with corresponding increases in plant biomass. In the context of a vector-borne pathogen system, predator species richness may also influence plant disease risk via the direct effects of predators on the abundance and behavior of herbivores that also act as pathogen vectors. Using an assemblage of generalist insect predators, we examined the relationship between predator species richness and the prevalence of the aphid-vectored cereal yellow dwarf virus in wheat. We found that increasing predator richness enhanced suppression of the vector population and that pathogen prevalence was reduced when predators were present, but the reduction in prevalence was independent of predator species richness. To determine the mechanism(s) by which predator species richness contributes to vector suppression, but not pathogen prevalence, we evaluated vector movement and host plant occupancy in response to predator treatments. We found that pathogen prevalence was unrelated to vector suppression because host plant occupancy by vectors did not vary as a function of vector abundance. However, the presence of predators reduced pathogen prevalence because predators stimulated greater plant-to-plant movement by vectors, which likely diminished vector feeding time and reduced the transmission efficiency of this persistent pathogen. We conclude that community structure (i.e., the presence of predators), but not predator diversity, is a potential factor influencing local plant infection by this insect-vectored pathogen.
Pathogen inactivation techniques.
Pelletier, J P R; Transue, S; Snyder, E L
2006-01-01
The desire to rid the blood supply of pathogens of all types has led to the development of many technologies aimed at the same goal--eradication of the pathogen(s) without harming the blood cells or generating toxic chemical agents. This is a very ambitious goal, and one that has yet to be achieved. One approach is to shun the 'one size fits all' concept and to target pathogen-reduction agents at the Individual component types. This permits the development of technologies that might be compatible with, for example, plasma products but that would be cytocidal and thus incompatible with platelet concentrates or red blood cell units. The technologies to be discussed include solvent detergent and methylene blue treatments--designed to inactivate plasma components and derivatives; psoralens (S-59--amotosalen) designed to pathogen-reduce units of platelets; and two products aimed at red blood cells, S-303 (a Frale--frangible anchor-linker effector compound) and Inactine (a binary ethyleneimine). A final pathogen-reduction material that might actually allow one material to inactivate all three blood components--riboflavin (vitamin B2)--is also under development. The sites of action of the amotosalen (S-59), the S-303 Frale, Inactine, and riboflavin are all localized in the nucleic acid part of the pathogen. Solvent detergent materials act by dissolving the plasma envelope, thus compromising the integrity of the pathogen membrane and rendering it non-infectious. By disrupting the pathogen's ability to replicate or survive, its infectivity is removed. The degree to which bacteria and viruses are affected by a particular pathogen-reducing technology relates to its Gram-positive or Gram-negative status, to the sporulation characteristics for bacteria, and the presence of lipid or protein envelopes for viruses. Concerns related to photoproducts and other breakdown products of these technologies remain, and the toxicology of pathogen-reduction treatments is a major ongoing area of investigation. Clearly, regulatory agencies have a major role to play in the evaluation of these new technologies. This chapter will cover the several types of pathogen-reduction systems, mechanisms of action, the inactivation efficacy for specific types of pathogens, toxicology of the various systems and the published research and clinical trial data supporting their potential usefulness. Due to the nature of the field, pathogen reduction is a work in progress and this review should be considered as a snapshot in time rather than a clear picture of what the future will bring.
Engineered nanoconstructs for the multiplexed and sensitive detection of high-risk pathogens
NASA Astrophysics Data System (ADS)
Seo, Youngmin; Kim, Ji-Eun; Jeong, Yoon; Lee, Kwan Hong; Hwang, Jangsun; Hong, Jongwook; Park, Hansoo; Choi, Jonghoon
2016-01-01
Many countries categorize the causative agents of severe infectious diseases as high-risk pathogens. Given their extreme infectivity and potential to be used as biological weapons, a rapid and sensitive method for detection of high-risk pathogens (e.g., Bacillus anthracis, Francisella tularensis, Yersinia pestis, and Vaccinia virus) is highly desirable. Here, we report the construction of a novel detection platform comprising two units: (1) magnetic beads separately conjugated with multiple capturing antibodies against four different high-risk pathogens for simple and rapid isolation, and (2) genetically engineered apoferritin nanoparticles conjugated with multiple quantum dots and detection antibodies against four different high-risk pathogens for signal amplification. For each high-risk pathogen, we demonstrated at least 10-fold increase in sensitivity compared to traditional lateral flow devices that utilize enzyme-based detection methods. Multiplexed detection of high-risk pathogens in a sample was also successful by using the nanoconstructs harboring the dye molecules with fluorescence at different wavelengths. We ultimately envision the use of this novel nanoprobe detection platform in future applications that require highly sensitive on-site detection of high-risk pathogens.
75 FR 30844 - General Mills, Inc.; Withdrawal of Food Additive Petition
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-02
... for the reduction of pathogens and other microorganisms in aqueous sugar solutions and potable water... reduction of pathogens and other microorganisms in aqueous sugar solutions and potable water intended for...
Efficiency of different air filter types for pig facilities at laboratory scale
Wenke, Cindy; Pospiech, Janina; Reutter, Tobias; Truyen, Uwe
2017-01-01
Air filtration has been shown to be efficient in reducing pathogen burden in circulating air. We determined at laboratory scale the retention efficiency of different air filter types either composed of a prefilter (EU class G4) and a secondary fiberglass filter (EU class F9) or consisting of a filter mat (EU class M6 and F8-9). Four filter prototypes were tested for their capability to remove aerosol containing equine arteritis virus (EAV), porcine reproductive and respiratory syndrome virus (PRRSV), bovine enterovirus 1 (BEV), Actinobacillus pleuropneumoniae (APP), and Staphylococcus (S.) aureus from air. Depending on the filter prototype and utilisation, the airflow was set at 1,800 m3/h (combination of upstream prefilter and fiberglass filter) or 80 m3/h (filter mat). The pathogens were aerosolized and their concentration was determined in front of and behind the filter by culture or quantitative real-time RT-PCR. Furthermore, survival of the pathogens over time in the filter material was determined. Bacteria were most efficiently filtered with a reduction rate of up to 99.9% depending on the filter used. An approximately 98% reduction was achieved for the viruses tested. Viability or infectivity of APP or PRRSV in the filter material decreased below the detection limit after 4 h and 24 h, respectively, whereas S. aureus was still culturable after 4 weeks. Our results demonstrate that pathogens can efficiently be reduced by air filtration. Consequently, air filtration combined with other strict biosecurity measures markedly reduces the risk of introducing airborne transmitted pathogens to animal facilities. In addition, air filtration might be useful in reducing bioaerosols within a pig barn, hence improving respiratory health of pigs. PMID:29028843
Efficiency of different air filter types for pig facilities at laboratory scale.
Wenke, Cindy; Pospiech, Janina; Reutter, Tobias; Truyen, Uwe; Speck, Stephanie
2017-01-01
Air filtration has been shown to be efficient in reducing pathogen burden in circulating air. We determined at laboratory scale the retention efficiency of different air filter types either composed of a prefilter (EU class G4) and a secondary fiberglass filter (EU class F9) or consisting of a filter mat (EU class M6 and F8-9). Four filter prototypes were tested for their capability to remove aerosol containing equine arteritis virus (EAV), porcine reproductive and respiratory syndrome virus (PRRSV), bovine enterovirus 1 (BEV), Actinobacillus pleuropneumoniae (APP), and Staphylococcus (S.) aureus from air. Depending on the filter prototype and utilisation, the airflow was set at 1,800 m3/h (combination of upstream prefilter and fiberglass filter) or 80 m3/h (filter mat). The pathogens were aerosolized and their concentration was determined in front of and behind the filter by culture or quantitative real-time RT-PCR. Furthermore, survival of the pathogens over time in the filter material was determined. Bacteria were most efficiently filtered with a reduction rate of up to 99.9% depending on the filter used. An approximately 98% reduction was achieved for the viruses tested. Viability or infectivity of APP or PRRSV in the filter material decreased below the detection limit after 4 h and 24 h, respectively, whereas S. aureus was still culturable after 4 weeks. Our results demonstrate that pathogens can efficiently be reduced by air filtration. Consequently, air filtration combined with other strict biosecurity measures markedly reduces the risk of introducing airborne transmitted pathogens to animal facilities. In addition, air filtration might be useful in reducing bioaerosols within a pig barn, hence improving respiratory health of pigs.
Rudan, Igor; O’Brien, Katherine L.; Nair, Harish; Liu, Li; Theodoratou, Evropi; Qazi, Shamim; Lukšić, Ivana; Fischer Walker, Christa L.; Black, Robert E.; Campbell, Harry
2013-01-01
Background The recent series of reviews conducted within the Global Action Plan for Pneumonia and Diarrhoea (GAPPD) addressed epidemiology of the two deadly diseases at the global and regional level; it also estimated the effectiveness of interventions, barriers to achieving high coverage and the main implications for health policy. The aim of this paper is to provide the estimates of childhood pneumonia at the country level. This should allow national policy–makers and stakeholders to implement proposed policies in the World Health Organization (WHO) and UNICEF member countries. Methods We conducted a series of systematic reviews to update previous estimates of the global, regional and national burden of childhood pneumonia incidence, severe morbidity, mortality, risk factors and specific contributions of the most common pathogens: Streptococcus pneumoniae (SP), Haemophilus influenzae type B (Hib), respiratory syncytial virus (RSV) and influenza virus (flu). We distributed the global and regional–level estimates of the number of cases, severe cases and deaths from childhood pneumonia in 2010–2011 by specific countries using an epidemiological model. The model was based on the prevalence of the five main risk factors for childhood pneumonia within countries (malnutrition, low birth weight, non–exclusive breastfeeding in the first four months, solid fuel use and crowding) and risk effect sizes estimated using meta–analysis. Findings The incidence of community–acquired childhood pneumonia in low– and middle–income countries (LMIC) in the year 2010, using World Health Organization's definition, was about 0.22 (interquartile range (IQR) 0.11–0.51) episodes per child–year (e/cy), with 11.5% (IQR 8.0–33.0%) of cases progressing to severe episodes. This is a reduction of nearly 25% over the past decade, which is consistent with observed reductions in the prevalence of risk factors for pneumonia throughout LMIC. At the level of pneumonia incidence, RSV is the most common pathogen, present in about 29% of all episodes, followed by influenza (17%). The contribution of different pathogens varies by pneumonia severity strata, with viral etiologies becoming relatively less important and most deaths in 2010 caused by the main bacterial agents – SP (33%) and Hib (16%), accounting for vaccine use against these two pathogens. Conclusions In comparison to 2000, the primary epidemiological evidence contributing to the models of childhood pneumonia burden has improved only slightly; all estimates have wide uncertainty bounds. Still, there is evidence of a decreasing trend for all measures of the burden over the period 2000–2010. The estimates of pneumonia incidence, severe morbidity, mortality and etiology, although each derived from different and independent data, are internally consistent – lending credibility to the new set of estimates. Pneumonia continues to be the leading cause of both morbidity and mortality for young children beyond the neonatal period and requires ongoing strategies and progress to reduce the burden further. PMID:23826505
Rudan, Igor; O'Brien, Katherine L; Nair, Harish; Liu, Li; Theodoratou, Evropi; Qazi, Shamim; Lukšić, Ivana; Fischer Walker, Christa L; Black, Robert E; Campbell, Harry
2013-06-01
The recent series of reviews conducted within the Global Action Plan for Pneumonia and Diarrhoea (GAPPD) addressed epidemiology of the two deadly diseases at the global and regional level; it also estimated the effectiveness of interventions, barriers to achieving high coverage and the main implications for health policy. The aim of this paper is to provide the estimates of childhood pneumonia at the country level. This should allow national policy-makers and stakeholders to implement proposed policies in the World Health Organization (WHO) and UNICEF member countries. WE CONDUCTED A SERIES OF SYSTEMATIC REVIEWS TO UPDATE PREVIOUS ESTIMATES OF THE GLOBAL, REGIONAL AND NATIONAL BURDEN OF CHILDHOOD PNEUMONIA INCIDENCE, SEVERE MORBIDITY, MORTALITY, RISK FACTORS AND SPECIFIC CONTRIBUTIONS OF THE MOST COMMON PATHOGENS: Streptococcus pneumoniae (SP), Haemophilus influenzae type B (Hib), respiratory syncytial virus (RSV) and influenza virus (flu). We distributed the global and regional-level estimates of the number of cases, severe cases and deaths from childhood pneumonia in 2010-2011 by specific countries using an epidemiological model. The model was based on the prevalence of the five main risk factors for childhood pneumonia within countries (malnutrition, low birth weight, non-exclusive breastfeeding in the first four months, solid fuel use and crowding) and risk effect sizes estimated using meta-analysis. The incidence of community-acquired childhood pneumonia in low- and middle-income countries (LMIC) in the year 2010, using World Health Organization's definition, was about 0.22 (interquartile range (IQR) 0.11-0.51) episodes per child-year (e/cy), with 11.5% (IQR 8.0-33.0%) of cases progressing to severe episodes. This is a reduction of nearly 25% over the past decade, which is consistent with observed reductions in the prevalence of risk factors for pneumonia throughout LMIC. At the level of pneumonia incidence, RSV is the most common pathogen, present in about 29% of all episodes, followed by influenza (17%). The contribution of different pathogens varies by pneumonia severity strata, with viral etiologies becoming relatively less important and most deaths in 2010 caused by the main bacterial agents - SP (33%) and Hib (16%), accounting for vaccine use against these two pathogens. In comparison to 2000, the primary epidemiological evidence contributing to the models of childhood pneumonia burden has improved only slightly; all estimates have wide uncertainty bounds. Still, there is evidence of a decreasing trend for all measures of the burden over the period 2000-2010. The estimates of pneumonia incidence, severe morbidity, mortality and etiology, although each derived from different and independent data, are internally consistent - lending credibility to the new set of estimates. Pneumonia continues to be the leading cause of both morbidity and mortality for young children beyond the neonatal period and requires ongoing strategies and progress to reduce the burden further.
Crotta, Matteo; Paterlini, Franco; Rizzi, Rita; Guitian, Javier
2016-02-01
Foodborne disease as a result of raw milk consumption is an increasing concern in Western countries. Quantitative microbial risk assessment models have been used to estimate the risk of illness due to different pathogens in raw milk. In these models, the duration and temperature of storage before consumption have a critical influence in the final outcome of the simulations and are usually described and modeled as independent distributions in the consumer phase module. We hypothesize that this assumption can result in the computation, during simulations, of extreme scenarios that ultimately lead to an overestimation of the risk. In this study, a sensorial analysis was conducted to replicate consumers' behavior. The results of the analysis were used to establish, by means of a logistic model, the relationship between time-temperature combinations and the probability that a serving of raw milk is actually consumed. To assess our hypothesis, 2 recently published quantitative microbial risk assessment models quantifying the risks of listeriosis and salmonellosis related to the consumption of raw milk were implemented. First, the default settings described in the publications were kept; second, the likelihood of consumption as a function of the length and temperature of storage was included. When results were compared, the density of computed extreme scenarios decreased significantly in the modified model; consequently, the probability of illness and the expected number of cases per year also decreased. Reductions of 11.6 and 12.7% in the proportion of computed scenarios in which a contaminated milk serving was consumed were observed for the first and the second study, respectively. Our results confirm that overlooking the time-temperature dependency may yield to an important overestimation of the risk. Furthermore, we provide estimates of this dependency that could easily be implemented in future quantitative microbial risk assessment models of raw milk pathogens. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Modeling tools for the assessment of microbiological risks during floods: a review
NASA Astrophysics Data System (ADS)
Collender, Philip; Yang, Wen; Stieglitz, Marc; Remais, Justin
2015-04-01
Floods are a major, recurring source of harm to global economies and public health. Projected increases in the frequency and intensity of heavy precipitation events under future climate change, coupled with continued urbanization in areas with high risk of floods, may exacerbate future impacts of flooding. Improved flood risk management is essential to support global development, poverty reduction and public health, and is likely to be a crucial aspect of climate change adaptation. Importantly, floods can facilitate the transmission of waterborne pathogens by changing social conditions (overcrowding among displaced populations, interruption of public health services), imposing physical challenges to infrastructure (sewerage overflow, reduced capacity to treat drinking water), and altering fate and transport of pathogens (transport into waterways from overland flow, resuspension of settled contaminants) during and after flood conditions. Hydrological and hydrodynamic models are capable of generating quantitative characterizations of microbiological risks associated with flooding, while accounting for these diverse and at times competing physical and biological processes. Despite a few applications of such models to the quantification of microbiological risks associated with floods, there exists limited guidance as to the relative capabilities, and limitations, of existing modeling platforms when used for this purpose. Here, we review 17 commonly used flood and water quality modeling tools that have demonstrated or implicit capabilities of mechanistically representing and quantifying microbial risk during flood conditions. We compare models with respect to their capabilities of generating outputs that describe physical and microbial conditions during floods, such as concentration or load of non-cohesive sediments or pathogens, and the dynamics of high flow conditions. Recommendations are presented for the application of specific modeling tools for assessing particular flood-related microbial risks, and model improvements are suggested that may better characterize key microbial risks during flood events. The state of current tools are assessed in the context of a changing climate where the frequency, intensity and duration of flooding are shifting in some areas.
Opportunities for mitigating pathogen contamination during on-farm food production.
Doyle, Michael P; Erickson, Marilyn C
2012-01-16
Fruits, vegetables, and meat are susceptible to contamination by foodborne pathogens at many points from production through preparation in the home. This review will largely highlight approaches and progress made in the last five years to address strategies to reduce pathogen contamination in animal production but will also touch on the emerging field of preharvest produce food safety. Mitigation strategies can be divided into those that address pathogen reduction in the environment and those that target reduction/elimination of pathogen contamination in animals or plants. The former strategy has been encompassed in studies evaluating sanitation treatments of facilities as well as in numerous epidemiologic risk assessment studies (both on-farm assessments and computer simulation models) that identify management practices that impact pathogen prevalence in animals. Interventions to significantly reduce pathogen exposure via feed or water are dependent on their role as a significant contributor to pathogen contamination in the animal production system. In addition, inconsistent results obtained with interventions of dietary additives or formulation modifications (grain versus forage; inclusion of distiller's grains) on pathogen prevalence in animals have been attributed to a range of factors including target organism, grain type, level of inclusion, the animal's health or stress level, and ability to survive the gastric acidic conditions. Recent attempts to microencapsulate organic acids or bacteriophage within feed have met with only marginal improvements in reducing pathogen carriage in animals but this approach may have greater potential with other antimicrobial additives (i.e., essential oils). Bacteriophage therapy, in general, can significantly reduce pathogen carriage in animals but based on its transient nature and the potential for development of phage-resistant subpopulations, this approach should be administered to animals just prior to slaughter and preferably to animals that are suspected "super-shedders". Other promising on-farm intervention approaches have included breeding for pathogen resistance, vaccines, and dietary bacteriocins. To optimize interventions on a cost basis, studies have also determined that application of dietary interventions at specific time points in the animal's production cycle is a useful strategy to reduce pathogen carriage (e.g., probiotics to fertilized eggs and acidified feed to fattening swine). In conclusion, applicable management and intervention strategies may vary depending on the type of food under production; however, it is important to consider from a holistic view how any new intervention strategies will affect the overall production system in order to maintain a successful, efficient food production environment. Copyright © 2011 Elsevier B.V. All rights reserved.
Voordouw, Maarten J; Tupper, Haley; Önder, Özlem; Devevey, Godefroy; Graves, Christopher J; Kemps, Brian D; Brisson, Dustin
2013-04-01
Vaccinating wildlife is becoming an increasingly popular method to reduce human disease risks from pathogens such as Borrelia burgdorferi, the causative agent of Lyme disease. To successfully limit human disease risk, vaccines targeting the wildlife reservoirs of B. burgdorferi must be easily distributable and must effectively reduce pathogen transmission from infected animals, given that many animals in nature will be infected prior to vaccination. We assessed the efficacy of an easily distributable oral bait vaccine based on the immunogenic outer surface protein A (OspA) to protect uninfected mice from infection and to reduce transmission from previously infected white-footed mice, an important reservoir host of B. burgdorferi. Oral vaccination of white-footed mice effectively reduces transmission of B. burgdorferi at both critical stages of the Lyme disease transmission cycle. First, oral vaccination of uninfected white-footed mice elicits an immune response that protects mice from B. burgdorferi infection. Second, oral vaccination of previously infected mice significantly reduces the transmission of B. burgdorferi to feeding ticks despite a statistically nonsignificant immune response. We used the estimates of pathogen transmission to and from vaccinated and unvaccinated mice to model the efficacy of an oral vaccination campaign targeting wild white-footed mice. Projection models suggest that the effects of the vaccine on both critical stages of the transmission cycle of B. burgdorferi act synergistically in a positive feedback loop to reduce the nymphal infection prevalence, and thus human Lyme disease risk, well below what would be expected from either effect alone. This study suggests that oral immunization of wildlife with an OspA-based vaccine can be a promising long-term strategy to reduce human Lyme disease risk.
Flynn, Padrig B; Higginbotham, Sarah; Alshraiedeh, Nid'a H; Gorman, Sean P; Graham, William G; Gilmore, Brendan F
2015-07-01
The emergence of multidrug-resistant pathogens within the clinical environment is presenting a mounting problem in hospitals worldwide. The 'ESKAPE' pathogens (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa and Enterobacter spp.) have been highlighted as a group of causative organisms in a majority of nosocomial infections, presenting a serious health risk due to widespread antimicrobial resistance. The stagnating pipeline of new antibiotics requires alternative approaches to the control and treatment of nosocomial infections. Atmospheric pressure non-thermal plasma (APNTP) is attracting growing interest as an alternative infection control approach within the clinical setting. This study presents a comprehensive bactericidal assessment of an in-house-designed APNTP jet both against biofilms and planktonic bacteria of the ESKAPE pathogens. Standard plate counts and the XTT metabolic assay were used to evaluate the antibacterial effect of APNTP, with both methods demonstrating comparable eradication times. APNTP exhibited rapid antimicrobial activity against all of the ESKAPE pathogens in the planktonic mode of growth and provided efficient and complete eradication of ESKAPE pathogens in the biofilm mode of growth within 360s, with the exception of A. baumannii where a >4log reduction in biofilm viability was observed. This demonstrates its effectiveness as a bactericidal treatment against these pathogens and further highlights its potential application in the clinical environment for the control of highly antimicrobial-resistant pathogens. Copyright © 2015 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.
Mukhopadhyay, Sudarsan; Sokorai, Kimberly; Ukuku, Dike; Fan, Xuetong; Juneja, Vijay; Sites, Joseph; Cassidy, Jennifer
2016-10-17
The objective of this research was to evaluate and develop a method for inactivation of Salmonella enterica and Listeria monocytogenes in cantaloupe puree (CP) by high hydrostatic pressure (HHP). Cantaloupe being the most netted varieties of melons presents a greater risk of pathogen transmission. Freshly prepared CP with or without 0.1% ascorbic acid (AA) was inoculated with a bacterial cocktail composed of a three serotype mixture of S. enterica (S. Poona, S. Newport H1275 and S. Stanley H0558) and a mixture of three strains of L. monocytogenes (Scott A, 43256 and 51742) to a population of ca. 10(8)CFU/g. Double sealed and double bagged inoculated CP (ca. 5g) were pressure treated at 300, 400 and 500MPa at 8°C and 15°C for 5min. Data indicated increased inactivation of both Salmonella and Listeria spp. with higher pressure. Log reduction for CP at 300MPa, 8°C for 5min was 2.4±0.2 and 1.6±0.5logCFU/g for Salmonella and Listeria, respectively. Survivability of the pathogens was significantly compromised at 400MPa and 8°C, inactivating 4.5±0.3logCFU/g of Salmonella and 3.0±0.4logCFU/g of Listeria spp. Complete inactivation of the pathogens in the puree (log reduction >6.7logCFU/g), with or without AA, was achieved when the pressure was further increased to 500MPa, except that for Listeria containing no AA at 8°C. Listeria presented higher resistance to pressure treatment compared to Salmonella spp. Initial temperatures (8 and 15°C) had no significant influence on Salmonella log reductions. Log reduction of pathogens increased but not significantly with increase of temperature. AA did not show any significant antimicrobial activity. Viable counts were about 0.2-0.4logCFU/g less in presence of 0.1% AA. These data validate that HHP can be used as an effective method for decontamination of cantaloupe puree. Published by Elsevier B.V.
Prevention of Infection Due to Clostridium difficile.
Cooper, Christopher C; Jump, Robin L P; Chopra, Teena
2016-12-01
Clostridium difficile is one of the foremost nosocomial pathogens. Preventing infection is particularly challenging. Effective prevention efforts typically require a multifaceted bundled approach. A variety of infection control procedures may be advantageous, including strict hand decontamination with soap and water, contact precautions, and using chlorine-containing decontamination agents. Additionally, risk factor reduction can help reduce the burden of disease. The risk factor modification is principally accomplished though antibiotic stewardship programs. Unfortunately, most of the current evidence for prevention is in acute care settings. This review focuses on preventative approaches to reduce the incidence of Clostridium difficile infection in healthcare settings. Copyright © 2016 Elsevier Inc. All rights reserved.
Vibrio bacteria in raw oysters: managing risks to human health.
Froelich, Brett A; Noble, Rachel T
2016-03-05
The human-pathogenic marine bacteria Vibrio vulnificus and V. parahaemolyticus are strongly correlated with water temperature, with concentrations increasing as waters warm seasonally. Both of these bacteria can be concentrated in filter-feeding shellfish, especially oysters. Because oysters are often consumed raw, this exposes people to large doses of potentially harmful bacteria. Various models are used to predict the abundance of these bacteria in oysters, which guide shellfish harvest policy meant to reduce human health risk. Vibrio abundance and behaviour varies from site to site, suggesting that location-specific studies are needed to establish targeted risk reduction strategies. Moreover, virulence potential, rather than simple abundance, should be also be included in future modeling efforts. © 2016 The Author(s).
The Effect of Ongoing Exposure Dynamics in Dose Response Relationships
Pujol, Josep M.; Eisenberg, Joseph E.; Haas, Charles N.; Koopman, James S.
2009-01-01
Characterizing infectivity as a function of pathogen dose is integral to microbial risk assessment. Dose-response experiments usually administer doses to subjects at one time. Phenomenological models of the resulting data, such as the exponential and the Beta-Poisson models, ignore dose timing and assume independent risks from each pathogen. Real world exposure to pathogens, however, is a sequence of discrete events where concurrent or prior pathogen arrival affects the capacity of immune effectors to engage and kill newly arriving pathogens. We model immune effector and pathogen interactions during the period before infection becomes established in order to capture the dynamics generating dose timing effects. Model analysis reveals an inverse relationship between the time over which exposures accumulate and the risk of infection. Data from one time dose experiments will thus overestimate per pathogen infection risks of real world exposures. For instance, fitting our model to one time dosing data reveals a risk of 0.66 from 313 Cryptosporidium parvum pathogens. When the temporal exposure window is increased 100-fold using the same parameters fitted by our model to the one time dose data, the risk of infection is reduced to 0.09. Confirmation of this risk prediction requires data from experiments administering doses with different timings. Our model demonstrates that dose timing could markedly alter the risks generated by airborne versus fomite transmitted pathogens. PMID:19503605
NASA Astrophysics Data System (ADS)
Ramirez-Cabral, Nadiezhda Yakovleva Zitz; Kumar, Lalit; Shabani, Farzin
2018-01-01
Worldwide, crop pests (CPs) such as pathogens and insects affect agricultural production detrimentally. Species distribution models can be used for projecting current and future suitability of CPs and host crop localities. Our study overlays the distribution of two CPs (Asian soybean rust and beet armyworm) and common bean, a potential host of them, in order to determine their current and future levels of coexistence. This kind of modeling approach has rarely been performed previously in climate change studies. The soybean rust and beet armyworm model projections herein show a reduction of the worldwide area with high and medium suitability of both CPs and a movement of them away from the Equator, in 2100 more pronounced than in 2050. Most likely, heat and dry stress will be responsible for these changes. Heat and dry stress will greatly reduce and shift the future suitable cultivation area of common bean as well, in a similar manner. The most relevant findings of this study were the reduction of the suitable areas for the CPs, the reduction of the risk under future scenarios, and the similarity of trends for the CPs and host. The current results highlight the relation between and the coevolution of host and pathogens.
Monte Carlo simulation of the risk of contamination of apples with Escherichia coli O157:H7.
Duffy, Siobain; Schaffner, Donald W
2002-10-25
Quantitative descriptions of the frequency and extent of contamination of apple cider with pathogenic bacteria were obtained using literature data and computer simulation. Probability distributions were chosen to describe the risk of apple contamination by each suspected pathway. Tree-picked apples may be contaminated by birds infected with Escherichia coli O157:H7 when orchards were located near a sewage source (ocean or landfill). Dropped apples could become contaminated from either infected animal droppings or from contaminated manure if used as fertilizer. A risk assessment model was created in Analytica. The results of worst-case simulations revealed that 6-9 log CFU E. coli O157:H7 might be found on a harvest of 1000 dropped apples, while 3-4 log CFU contamination could be present on 1000 tree-picked apples. This model confirms that practices such as using dropped apples and using animal waste as fertilizer increase risk in the production of apple cider, and that pasteurization may not eliminate all contamination in juice from heavily contaminated fruit. Recently published FDA regulations for juices requiring a 5-log CFU/ml reduction of pathogenic bacteria in fresh juices should be a fail-safe measure for apples harvested in all but the worst-case scenarios.
Pathogen reduction in minimally managed composting of bovine manure.
Millner, Patricia; Ingram, David; Mulbry, Walter; Arikan, Osman A
2014-11-01
Spread of manure pathogens is of considerable concern due to use of manure for land application. In this study, the effects of four static pile treatment options for bovine manure on die-off of a generic Escherichia coli, E. coli O157:H7 surrogate, Salmonella Senftenberg, Salm. Typhimurium, and Listeria monocytogenes were evaluated. Bovine manure spiked with these bacteria were placed in cassettes at the top, middle, and bottom sections of four static pile treatments that reflect minimal changes in pile construction with and without straw. Temperatures were monitored continuously during the 28 day self-heating period. E. coli and salmonellae were reduced from 8 to 9 log10 CFU g(-1) to undetectable levels (<1.77 log10 MPN g(-1)) at 25-30 cm depths within 7 days in all pile sections except for the manure-only pile in which 3-4 logs of reduction were obtained. No L. monocytogenes initially present at 6.62 log10 CFU g(-1) were recovered from straw-amended piles after 14 days, in contrast with manure-only treatment in which this pathogen was recovered even at 28 days. Decline of target bacterial populations corresponded to exposure to temperatures above 45°C for more than 3 days and amendments of manure with straw to increase thermophilic zones. Use of straw to increase aeration, self-heating capacity, and heat retention in manure piles provides producers a minimal management option for composting that enhances pathogen die-off and thereby reduces risk of environmental spread when manure is applied to land. Published by Elsevier Ltd.
40 CFR 503.15 - Operational standards-pathogens and vector attraction reduction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... met when bulk sewage sludge is applied to a lawn or a home garden. (3) The Class A pathogen... home garden. (3) One of the vector attraction reduction requirements in § 503.33 (b)(1) through (b)(8...
40 CFR 503.15 - Operational standards-pathogens and vector attraction reduction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... met when bulk sewage sludge is applied to a lawn or a home garden. (3) The Class A pathogen... home garden. (3) One of the vector attraction reduction requirements in § 503.33 (b)(1) through (b)(8...
40 CFR 503.15 - Operational standards-pathogens and vector attraction reduction.
Code of Federal Regulations, 2012 CFR
2012-07-01
... met when bulk sewage sludge is applied to a lawn or a home garden. (3) The Class A pathogen... home garden. (3) One of the vector attraction reduction requirements in § 503.33 (b)(1) through (b)(8...
Grasso, Elizabeth M; Uribe-Rendon, Roberto M; Lee, Ken
2011-01-01
During the past decade there were more than 50 reported outbreaks involving leafy green vegetables contaminated with foodborne pathogens. Leafy greens, including cabbage, are fresh foods rarely heated before consumption, which enables foodborne illness. The need for improved safety of fresh food drives the demand for nonthermal food processes to decrease the risk of pathogens while maintaining fresh quality. This study examines the efficacy of electron-beam (e-beam) irradiation in decreasing indigenous microflora on fresh-cut cabbage and determines the optimal dosage to pasteurize fresh-cut cabbage inoculated with Escherichia coli K-12. Fresh-cut cabbage (100 g) was inoculated with ∼8 log E. coli K-12 and e-beam irradiated at doses of 0, 1.0, 2.3, or 4.0 kGy. At 2.3 kGy there was <1.0 log indigenous microflora remaining, indicating greater than a 4.0-log reduction by e-beam. At a 4.0-kGy dose there was >7-log reduction of E. coli K-12 in the fresh-cut cabbage. The D(10)-value for E. coli K-12 in fresh-cut cabbage was 0.564 kGy. E-beam irradiation is thus a viable nonthermal treatment that extends the shelf life and increases the safety of fresh cabbage by reducing or eliminating indigenous microflora and unwanted pathogens.
Multidrug-resistant pathogens in patients with pneumonia coming from the community.
Sibila, Oriol; Rodrigo-Troyano, Ana; Shindo, Yuichiro; Aliberti, Stefano; Restrepo, Marcos I
2016-05-01
Identification of patients with multidrug-resistant (MDR) pathogens at initial diagnosis is essential for the appropriate selection of empiric treatment of patients with pneumonia coming from the community. The term Healthcare-Associated Pneumonia (HCAP) is controversial for this purpose. Our goal is to summarize and interpret the data addressing the association of MDR pathogens and community-onset pneumonia. Most recent clinical studies conclude that HCAP risk factor does not accurately identify resistant pathogens. Several risk factors related to MDR pathogens, including new ones that were not included in the original HCAP definition, have been described and different risk scores have been proposed. The present review focuses on the most recent literature assessing the importance of different risk factors for MDR pathogens in patients with pneumonia coming from the community. These included generally MDR risk factors, specific risk factors related to methicillin-resistant Staphylococcus aureus or Pseudomonas aeruginosa and clinical scoring systems develop to assess the MDR risk factors and its application in clinical practice. Different MDR risk factors and prediction scores have been recently developed. However, further research is needed in order to help clinicians in distinguishing between different MDR pathogens causing pneumonia.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-18
...] Draft Risk Profile on Pathogens and Filth in Spices: Availability; Extension of Comment Period AGENCY... Profile on Pathogens and Filth in Spices: Availability'' that appeared in the Federal Register of November... Risk Profile on Pathogens and Filth in Spices: Availability.'' The notice provided a 60-day comment...
Adverse Effects of Plasma Transfusion
Pandey, Suchitra; Vyas, Girish N.
2012-01-01
Plasma utilization has increased over the last two decades, and there is a growing concern that many plasma transfusions are inappropriate. Plasma transfusion is not without risk, and certain complications are more likely with plasma than other blood components. Clinical and laboratory investigations of the patients suffering reactions following infusion of fresh frozen plasma (FFP) define the etiology and pathogenesis of the panoply of adverse effects. We review here the pathogenesis, diagnosis, and management of the risks associated with plasma transfusion. Risks commonly associated with FFP include: (1) transfusion related acute lung injury; (2) transfusion associated circulatory overload, and (3) allergic/anaphylactic reactions. Other less common risks include (1) transmission of infections, (2) febrile non-hemolytic transfusion reactions, (3) RBC allo-immunization, and (4) hemolytic transfusion reactions. The affect of pathogen inactivation/reduction methods on these risks are also discussed. Fortunately, a majority of the adverse effects are not lethal and are adequately treated in clinical practice. PMID:22578374
Oishi, Wakana; Sano, Daisuke; Decrey, Loic; Kadoya, Syunsuke; Kohn, Tamar; Funamizu, Naoyuki
2017-11-15
Volume reduction (condensation) is a key for the practical usage of human urine as a fertilizer because it enables the saving of storage space and the reduction of transportation cost. However, concentrated urine may carry infectious disease risks resulting from human pathogens frequently present in excreta, though the survival of pathogens in concentrated urine is not well understood. In this study, the inactivation of MS2 coliphage, a surrogate for single-stranded RNA human enteric viruses, in concentrated synthetic urine was investigated. The infectious titer reduction of MS2 coliphage in synthetic urine samples was measured by plaque assay, and the reduction of genome copy number was monitored by reverse transcription-quantitative PCR (RTqPCR). Among chemical-physical conditions such as pH and osmotic pressure, uncharged ammonia was shown to be the predominant factor responsible for MS2 inactivation, independently of urine concentration level. The reduction rate of the viral genome number varied among genome regions, but the comprehensive reduction rate of six genome regions was well correlated with that of the infectious titer of MS2 coliphage. This indicates that genome degradation is the main mechanism driving loss of infectivity, and that RT-qPCR targeting the six genome regions can be used as a culture-independent assay for monitoring infectivity loss of the coliphage in urine. MS2 inactivation rate constants were well predicted by a model using ion composition and speciation in synthetic urine samples, which suggests that MS2 infectivity loss can be estimated solely based on the solution composition, temperature and pH, without explicitly accounting for effects of osmotic pressure. Copyright © 2017 Elsevier B.V. All rights reserved.
Corsi, Steven R.; Borchardt, Mark A.; Carvin, Rebecca B.; Burch, Tucker R; Spencer, Susan K.; Lutz, Michelle A.; McDermott, Colleen M.; Busse, Kimberly M.; Kleinheinz, Gregory; Feng, Xiaoping; Zhu, Jun
2016-01-01
Waterborne pathogens were measured at three beaches in Lake Michigan, environmental factors for predicting pathogen concentrations were identified, and the risk of swimmer infection and illness was estimated. Waterborne pathogens were detected in 96% of samples collected at three Lake Michigan beaches in summer, 2010. Samples were quantified for 22 pathogens in four microbial categories (human viruses, bovine viruses, protozoa, and pathogenic bacteria). All beaches had detections of human and bovine viruses and pathogenic bacteria indicating influence of multiple contamination sources at these beaches. Occurrence ranged from 40 to 87% for human viruses, 65–87% for pathogenic bacteria, and 13–35% for bovine viruses. Enterovirus, adenovirus A, Salmonella spp., Campylobacter jejuni, bovine polyomavirus, and bovine rotavirus A were present most frequently. Variables selected in multiple regression models used to explore environmental factors that influence pathogens included wave direction, cloud cover, currents, and water temperature. Quantitative Microbial Risk Assessment was done for C. jejuni, Salmonella spp., and enteroviruses to estimate risk of infection and illness. Median infection risks for one-time swimming events were approximately 3 × 10–5, 7 × 10–9, and 3 × 10–7 for C. jejuni, Salmonella spp., and enteroviruses, respectively. Results highlight the importance of investigating multiple pathogens within multiple categories to avoid underestimating the prevalence and risk of waterborne pathogens.
Comprehensive Evaluation and Implementation of Improvement Actions in Butcher Shops
Leotta, Gerardo A.; Brusa, Victoria; Galli, Lucía; Adriani, Cristian; Linares, Luciano; Etcheverría, Analía; Sanz, Marcelo; Sucari, Adriana; Peral García, Pilar; Signorini, Marcelo
2016-01-01
Foodborne pathogens can cause acute and chronic diseases and produce a wide range of symptoms. Since the consumption of ground beef is a risk factor for infections with some bacterial pathogens, we performed a comprehensive evaluation of butcher shops, implemented improvement actions for both butcher shops and consumers, and verified the impact of those actions implemented. A comprehensive evaluation was made and risk was quantified on a 1–100 scale as high-risk (1–40), moderate-risk (41–70) or low-risk (71–100). A total of 172 raw ground beef and 672 environmental samples were collected from 86 butcher shops during the evaluation (2010–2011) and verification (2013) stages of the study. Ground beef samples were analyzed for mesophilic aerobic organisms, Escherichia coli and coagulase-positive Staphylococcus aureus enumeration. Salmonella spp., E. coli O157:H7, non-O157 Shiga toxin-producing E. coli (STEC), and Listeria monocytogenes were detected and isolated from all samples. Risk quantification resulted in 43 (50.0%) high-risk, 34 (39.5%) moderate-risk, and nine (10.5%) low-risk butcher shops. Training sessions for 498 handlers and 4,506 consumers were held. Re-evaluation by risk quantification and microbiological analyses resulted in 19 (22.1%) high-risk, 42 (48.8%) moderate-risk and 25 (29.1%) low-risk butcher shops. The count of indicator microorganisms decreased with respect to the 2010–2011 period. After the implementation of improvement actions, the presence of L. monocytogenes, E. coli O157:H7 and stx genes in ground beef decreased. Salmonella spp. was isolated from 10 (11.6%) ground beef samples, without detecting statistically significant differences between both study periods (evaluation and verification). The percentage of pathogens in environmental samples was reduced in the verification period (Salmonella spp., 1.5%; L. monocytogenes, 10.7%; E. coli O157:H7, 0.6%; non-O157 STEC, 6.8%). Risk quantification was useful to identify those relevant facts in butcher shops. The reduction of contamination in ground beef and the environment was possible after training handlers based on the problems identified in their own butcher shops. Our results confirm the feasibility of implementing a comprehensive risk management program in butcher shops, and the importance of information campaigns targeting consumers. Further collaborative efforts would be necessary to improve foodstuffs safety at retail level and at home. PMID:27618439
Comprehensive Evaluation and Implementation of Improvement Actions in Butcher Shops.
Leotta, Gerardo A; Brusa, Victoria; Galli, Lucía; Adriani, Cristian; Linares, Luciano; Etcheverría, Analía; Sanz, Marcelo; Sucari, Adriana; Peral García, Pilar; Signorini, Marcelo
2016-01-01
Foodborne pathogens can cause acute and chronic diseases and produce a wide range of symptoms. Since the consumption of ground beef is a risk factor for infections with some bacterial pathogens, we performed a comprehensive evaluation of butcher shops, implemented improvement actions for both butcher shops and consumers, and verified the impact of those actions implemented. A comprehensive evaluation was made and risk was quantified on a 1-100 scale as high-risk (1-40), moderate-risk (41-70) or low-risk (71-100). A total of 172 raw ground beef and 672 environmental samples were collected from 86 butcher shops during the evaluation (2010-2011) and verification (2013) stages of the study. Ground beef samples were analyzed for mesophilic aerobic organisms, Escherichia coli and coagulase-positive Staphylococcus aureus enumeration. Salmonella spp., E. coli O157:H7, non-O157 Shiga toxin-producing E. coli (STEC), and Listeria monocytogenes were detected and isolated from all samples. Risk quantification resulted in 43 (50.0%) high-risk, 34 (39.5%) moderate-risk, and nine (10.5%) low-risk butcher shops. Training sessions for 498 handlers and 4,506 consumers were held. Re-evaluation by risk quantification and microbiological analyses resulted in 19 (22.1%) high-risk, 42 (48.8%) moderate-risk and 25 (29.1%) low-risk butcher shops. The count of indicator microorganisms decreased with respect to the 2010-2011 period. After the implementation of improvement actions, the presence of L. monocytogenes, E. coli O157:H7 and stx genes in ground beef decreased. Salmonella spp. was isolated from 10 (11.6%) ground beef samples, without detecting statistically significant differences between both study periods (evaluation and verification). The percentage of pathogens in environmental samples was reduced in the verification period (Salmonella spp., 1.5%; L. monocytogenes, 10.7%; E. coli O157:H7, 0.6%; non-O157 STEC, 6.8%). Risk quantification was useful to identify those relevant facts in butcher shops. The reduction of contamination in ground beef and the environment was possible after training handlers based on the problems identified in their own butcher shops. Our results confirm the feasibility of implementing a comprehensive risk management program in butcher shops, and the importance of information campaigns targeting consumers. Further collaborative efforts would be necessary to improve foodstuffs safety at retail level and at home.
Kouamé-Sina, Sylvie Mireille; Makita, Kohei; Costard, Solenne; Grace, Delia; Dadié, Adjehi; Dje, Marcellin; Bonfoh, Bassirou
2012-12-01
Animal-source foods are important causes of food-borne illness, and milk and dairy products can contain pathogenic microorganisms. We conducted a stochastic assessment of the risk of ingesting milk contaminated with specific microbial pathogens (Escherichia coli, Staphylococcus aureus, and Enterococcus spp.) in Abidjan, Côte d'Ivoire. We carried out structured interviews and focus group discussions with farmers (n = 15), vendors (n = 17), and consumers (n = 188) to characterize dairy production systems and milk consumption behavior. Microbiological sampling was conducted at different points between milking and sale. A risk model was developed, and the risk of consuming contaminated raw milk was estimated by Monte Carlo simulation. The investigation into local raw milk consumption patterns showed that the proportion of raw milk consumption was 51.6% in people who consume milk. The probability of ingestion of marketed raw milk that failed to meet standards for this group of bacteria was 29.9% and about 652 consumers per day were estimated to ingest contaminated milk. Microbiological tests from the farm showed that 7.2% of samples taken from milkers' hands, 4.4% of water samples (water used to rinse milk containers or milking utensils (calabash, plastic bottle, filters, buckets), 4.4% of environmental samples (air pollution), 13.2% of samples from milking utensils, and 4.9% of samples from cows' udders were contaminated with one or more of these pathogens. About 624.6 L of marketed raw milk would need to be discarded per day if discarding milk was chosen as the option of risk reduction. The destruction of this milk would result in a potential loss of Euro623.9 per day for all producers. The risk of human illness from consumption of raw milk could be mitigated by raising awareness about heat treatment of milk and good hygiene practices in the dairy chain.
Pouillot, Régis; Garin, Benoit; Ravaonindrina, Noro; Diop, Kane; Ratsitorahina, Mahery; Ramanantsoa, Domoina; Rocourt, Jocelyne
2012-10-01
We used a quantitative microbiological risk assessment model to describe the risk of Campylobacter and Salmonella infection linked to chicken meals prepared in households in Dakar, Senegal. The model uses data collected specifically for this study, such as the prevalence and level of bacteria on the neck skin of chickens bought in Dakar markets, time-temperature profiles recorded from purchase to consumption, an observational survey of meal preparation in private kitchens, and detection and enumeration of pathogens on kitchenware and cooks' hands. Thorough heating kills all bacteria present on chicken during cooking, but cross-contamination of cooked chicken or ready-to-eat food prepared for the meal via kitchenware and cooks' hands leads to a high expected frequency of pathogen ingestion. Additionally, significant growth of Salmonella is predicted during food storage at ambient temperature before and after meal preparation. These high exposures lead to a high estimated risk of campylobacteriosis and/or salmonellosis in Dakar households. The public health consequences could be amplified by the high level of antimicrobial resistance of Salmonella and Campylobacter observed in this setting. A significant decrease in the number of ingested bacteria and in the risk could be achieved through a reduction of the prevalence of chicken contamination at slaughter, and by the use of simple hygienic measures in the kitchen. There is an urgent need to reinforce the hygiene education of food handlers in Senegal. © 2012 Society for Risk Analysis.
A Quantitative Prioritisation of Human and Domestic Animal Pathogens in Europe
McIntyre, K. Marie; Setzkorn, Christian; Hepworth, Philip J.; Morand, Serge; Morse, Andrew P.; Baylis, Matthew
2014-01-01
Disease or pathogen risk prioritisations aid understanding of infectious agent impact within surveillance or mitigation and biosecurity work, but take significant development. Previous work has shown the H-(Hirsch-)index as an alternative proxy. We present a weighted risk analysis describing infectious pathogen impact for human health (human pathogens) and well-being (domestic animal pathogens) using an objective, evidence-based, repeatable approach; the H-index. This study established the highest H-index European pathogens. Commonalities amongst pathogens not included in previous surveillance or risk analyses were examined. Differences between host types (humans/animals/zoonotic) in pathogen H-indices were explored as a One Health impact indicator. Finally, the acceptability of the H-index proxy for animal pathogen impact was examined by comparison with other measures. 57 pathogens appeared solely in the top 100 highest H-indices (1) human or (2) animal pathogens list, and 43 occurred in both. Of human pathogens, 66 were zoonotic and 67 were emerging, compared to 67 and 57 for animals. There were statistically significant differences between H-indices for host types (humans, animal, zoonotic), and there was limited evidence that H-indices are a reasonable proxy for animal pathogen impact. This work addresses measures outlined by the European Commission to strengthen climate change resilience and biosecurity for infectious diseases. The results include a quantitative evaluation of infectious pathogen impact, and suggest greater impacts of human-only compared to zoonotic pathogens or scientific under-representation of zoonoses. The outputs separate high and low impact pathogens, and should be combined with other risk assessment methods relying on expert opinion or qualitative data for priority setting, or could be used to prioritise diseases for which formal risk assessments are not possible because of data gaps. PMID:25136810
Hoyer, Andrea B; Schladow, S Geoffrey; Rueda, Francisco J
2015-10-15
Pathogen contamination of drinking water lakes and reservoirs is a severe threat to human health worldwide. A major source of pathogens in surface sources of drinking waters is from body-contact recreation in the water body. However, dispersion pathways of human waterborne pathogens from recreational beaches, where body-contact recreation is known to occur to drinking water intakes, and the associated risk of pathogens entering the drinking water supply remain largely undocumented. A high spatial resolution, three-dimensional hydrodynamic and particle tracking modeling approach has been developed to analyze the risk and mechanisms presented by pathogen dispersion. The pathogen model represents the processes of particle release, transport and survival. Here survival is a function of both water temperature and cumulative exposure to ultraviolet (UV) radiation. Pathogen transport is simulated using a novel and computationally efficient technique of tracking particle trajectories backwards, from a drinking water intake toward their source areas. The model has been applied to a large, alpine lake - Lake Tahoe, CA-NV (USA). The dispersion model results reveal that for this particular lake (1) the risk of human waterborne pathogens to enter drinking water intakes is low, but significant; (2) this risk is strongly related to the depth of the thermocline in relation to the depth of the intake; (3) the risk increases with the seasonal deepening of the surface mixed layer; and (4) the risk increases at night when the surface mixed layer deepens through convective mixing and inactivation by UV radiation is eliminated. While these risk factors will quantitatively vary in different lakes, these same mechanisms will govern the process of transport of pathogens. Copyright © 2015 Elsevier Ltd. All rights reserved.
Cap, Andrew P; Pidcoke, Heather F; Keil, Shawn D; Staples, Hilary M; Anantpadma, Manu; Carrion, Ricardo; Davey, Robert A; Frazer-Abel, Ashley; Taylor, Audra L; Gonzales, Richard; Patterson, Jean L; Goodrich, Raymond P
2016-03-01
Transfusion of plasma from recovered patients after Ebolavirus (EBOV) infection, typically called "convalescent plasma," is an effective treatment for active disease available in endemic areas, but carries the risk of introducing other pathogens, including other strains of EBOV. A pathogen reduction technology using ultraviolet light and riboflavin (UV+RB) is effective against multiple enveloped, negative-sense, single-stranded RNA viruses that are similar in structure to EBOV. We hypothesized that UV+RB is effective against EBOV in blood products without activating complement or reducing protective immunoglobulin titers that are important for the treatment of Ebola virus disease (EVD). Four in vitro experiments were conducted to evaluate effects of UV+RB on green fluorescent protein EBOV (EBOV-GFP), wild-type EBOV in serum, and whole blood, respectively, and on immunoglobulins and complement in plasma. Initial titers for Experiments 1 to 3 were 4.21 log GFP units/mL, 4.96 log infectious units/mL, and 4.23 log plaque-forming units/mL. Conditions tested in the first three experiments included the following: 1-EBOV-GFP plus UV+RB; 2-EBOV-GFP plus RB only; 3-EBOV-GFP plus UV only; 4-EBOV-GFP without RB or UV; 5-virus-free control plus UV only; and 6-virus-free control without RB or UV. UV+RB reduced EBOV titers to nondetectable levels in both nonhuman primate serum (≥2.8- to 3.2-log reduction) and human whole blood (≥3.0-log reduction) without decreasing protective antibody titers in human plasma. Our in vitro results demonstrate that the UV+RB treatment efficiently reduces EBOV titers to below limits of detection in both serum and whole blood. In vivo testing to determine whether UV+RB can improve convalescent blood product safety is indicated. © 2016 AABB.
Cap, Andrew P.; Pidcoke, Heather F.; Keil, Shawn D.; Staples, Hilary M.; Anantpadma, Manu; Carrion, Ricardo; Davey, Robert A.; Frazer-Abel, Ashley; Taylor, Audra L.; Gonzales, Richard; Patterson, Jean L.; Goodrich, Raymond P.
2018-01-01
BACKGROUND Transfusion of plasma from recovered patients after Ebolavirus (EBOV) infection, typically called ‘convalescent plasma,’ is an effective treatment for active disease available in endemic areas, but carries the risk of introducing other pathogens, including other strains of EBOV. A pathogen reduction technology using ultraviolet light and riboflavin (UV + RB) is effective against multiple enveloped, negative-sense, single-stranded RNA viruses that are similar in structure to EBOV. We hypothesized that UV + RB is effective against EBOV in blood products without activating complement or reducing protective immunoglobulin titers that are important for the treatment of ebolavirus disease (EVD). STUDY DESIGN AND METHODS Four in vitro experiments were conducted to evaluate effects of UV + RB on green fluorescent protein EBOV (EBOV-GFP), wild-type EBOV in serum and whole blood, respectively, and on immunoglobulins and complement in plasma. Initial titers for Experiments 1–3 were: 4.21 log10 GFP units/mL, 4.96 log10 infectious units per mL, and 4.23 log10 plaque forming units per mL (PFU/mL). Conditions tested in the first three experiments included: 1. EBOV-GFP + UV + RB; 2. EBOV-GFP + RB only; 3 EBOV-GFP + UV only; 4. EBOV-GFP without RB or UV; 5. Virus-free control + UV only; and 6. Virus-free control without RB or UV. RESULTS UV + RB reduced EBOV titers to non-detectable levels in both non-human primate serum (≥ 2.8 to 3.2 log reduction) and human whole blood (≥ 3.0 log reduction) without decreasing protective antibody titers in human plasma. CONCLUSION Our in vitro results demonstrate that the UV + RB treatment efficiently reduces EBOV titers to below limits of detection in both serum and whole blood. In vivo testing to determine whether UV + RB can improve convalescent blood product safety is indicated. PMID:27001363
Brouwer, Andrew F; Masters, Nina B; Eisenberg, Joseph N S
2018-04-20
Waterborne enteric pathogens remain a global health threat. Increasingly, quantitative microbial risk assessment (QMRA) and infectious disease transmission modeling (IDTM) are used to assess waterborne pathogen risks and evaluate mitigation. These modeling efforts, however, have largely been conducted independently for different purposes and in different settings. In this review, we examine the settings where each modeling strategy is employed. QMRA research has focused on food contamination and recreational water in high-income countries (HICs) and drinking water and wastewater in low- and middle-income countries (LMICs). IDTM research has focused on large outbreaks (predominately LMICs) and vaccine-preventable diseases (LMICs and HICs). Human ecology determines the niches that pathogens exploit, leading researchers to focus on different risk assessment research strategies in different settings. To enhance risk modeling, QMRA and IDTM approaches should be integrated to include dynamics of pathogens in the environment and pathogen transmission through populations.
Kern, Aurelie; Zhou, Chensheng W; Jia, Feng; Xu, Qiaobing; Hu, Linden T
2016-08-31
The incidence of Lyme disease has continued to rise despite attempts to control its spread. Vaccination of zoonotic reservoirs of human pathogens has been successfully used to decrease the incidence of rabies in raccoons and foxes. We have previously reported on the efficacy of a vaccinia virus vectored vaccine to reduce carriage of Borrelia burgdorferi in reservoir mice and ticks. One potential drawback to vaccinia virus vectored vaccines is the risk of accidental infection of humans. To reduce this risk, we developed a process to encapsulate vaccinia virus with a pH-sensitive polymer that inactivates the virus until it is ingested and dissolved by stomach acids. We demonstrate that the vaccine is inactive both in vitro and in vivo until it is released from the polymer. Once released from the polymer by contact with an acidic pH solution, the virus regains infectivity. Vaccination with coated vaccinia virus confers protection against B. burgdorferi infection and reduction in acquisition of the pathogen by naïve feeding ticks. Copyright © 2016. Published by Elsevier Ltd.
Microbial minimalism: genome reduction in bacterial pathogens.
Moran, Nancy A
2002-03-08
When bacterial lineages make the transition from free-living or facultatively parasitic life cycles to permanent associations with hosts, they undergo a major loss of genes and DNA. Complete genome sequences are providing an understanding of how extreme genome reduction affects evolutionary directions and metabolic capabilities of obligate pathogens and symbionts.
Antibody Levels to Persistent Pathogens and Incident Stroke in Mexican Americans
Sealy-Jefferson, Shawnita; Gillespie, Brenda W.; Aiello, Allison E.; Haan, Mary N.; Morgenstern, Lewis B.; Lisabeth, Lynda D.
2013-01-01
Background Persistent pathogens have been proposed as risk factors for stroke; however, the evidence remains inconclusive. Mexican Americans have an increased risk of stroke especially at younger ages, as well as a higher prevalence of infections caused by several persistent pathogens. Methodology/Principal Findings Using data from the Sacramento Area Latino Study on Aging (n = 1621), the authors used discrete-time regression to examine associations between stroke risk and (1) immunoglobulin G antibody levels to Helicobacter pylori (H. pylori), Cytomegalovirus, Varicella Zoster Virus, Toxoplasma gondii and Herpes simplex virus 1, and (2) concurrent exposure to several pathogens (pathogen burden), defined as: (a) summed sero-positivity, (b) number of pathogens eliciting high antibody levels, and (c) average antibody level. Models were adjusted for socio-demographics and stroke risk factors. Antibody levels to H. pylori predicted incident stroke in fully adjusted models (Odds Ratio: 1.58; 95% Confidence Interval: 1.09, 2.28). No significant associations were found between stroke risk and antibody levels to the other four pathogens. No associations were found for pathogen burden and incident stroke in fully adjusted models. Conclusions/Significance Our results suggest that exposure to H. pylori may be a stroke risk factor in Mexican Americans and may contribute to ethnic differences in stroke risk given the increased prevalence of exposure to H. pylori in this population. Future studies are needed to confirm this association. PMID:23799066
General Suppression of Escherichia coli O157:H7 in Sand-Based Dairy Livestock Bedding▿ †
Westphal, Andreas; Williams, Michele L.; Baysal-Gurel, Fulya; LeJeune, Jeffrey T.; McSpadden Gardener, Brian B.
2011-01-01
Sand bedding material is frequently used in dairy operations to reduce the occurrence of mastitis and enhance cow comfort. One objective of this work was to determine if sand-based bedding also supported the microbiologically based suppression of an introduced bacterial pathogen. Bedding samples were collected in summer, fall, and winter from various locations within a dairy operation and tested for their ability to suppress introduced populations of Escherichia coli O157:H7. All sources of bedding displayed a heat-sensitive suppressiveness to the pathogen. Differences in suppressiveness were also noted between different samples at room temperature. At just 1 day postinoculation (dpi), the recycled sand bedding catalyzed up to a 1,000-fold reduction in E. coli counts, typically 10-fold greater than the reduction achieved with other substrates, depending on the sampling date. All bedding substrates were able to reduce E. coli populations by over 10,000-fold within 7 to 15 dpi, regardless of sampling date. Terminal restriction fragment length polymorphism (T-RFLP) analysis was used to identify bacterial populations potentially associated with the noted suppression of E. coli O157:H7 in sand bedding. Eleven terminal restriction fragments (TRFs) were overrepresented in paired comparisons of suppressive and nonsuppressive specimens at multiple sampling points, indicating that they may represent environmentally stable populations of pathogen-suppressing bacteria. Cloning and sequencing of these TRFs indicated that they represent a diverse subset of bacteria, belonging to the Cytophaga-Flexibacter-Bacteroidetes, Gammaproteobacteria, and Firmicutes, only a few of which have previously been identified in livestock manure. Such data indicate that microbial suppression may be harnessed to develop new options for mitigating the risk and dispersal of zoonotic bacterial pathogens on dairy farms. PMID:21257815
Trends in Reported Foodborne Illness in the United States; 1996-2013.
Powell, Mark R
2016-08-01
Retrospective review is a key to designing effective food safety measures. The analysis examines trends in the reported incidence of illness due to bacterial pathogens commonly transmitted by food in the United States during 1996-2013 with and without specifying a model form for trend. The findings indicate early declines in reported incidence followed by a period of no significant trend for Campylobacter, Listeria, Shiga toxin-producing Escherichia coli O157, and Yersinia. The results are inconclusive about whether there is no trend or an increasing trend for Salmonella. While Shigella exhibits a continuous decline, Vibrio exhibits a continuous increase. Overall, the findings indicate a lack of evidence for continuous reduction in illness due to bacterial pathogens commonly transmitted by food in the United States during 1996-2013. © 2015 Society for Risk Analysis.
Daud, Aziah Binti; Mohd Fuzi, Nik Mohd Hafiz; Wan Mohammad, Wan Mohd Zahiruddin; Amran, Fairuz; Ismail, Nabilah; Arshad, Mohd Mokhtar; Kamarudin, Suratan
2018-04-01
Leptospirosis is an emerging zoonosis and its occurrence has been reported to be rising globally. The environment plays an important role in the survival of Leptospira and determines the risk of infection. Those who were exposed to and had contact with contaminated environment through their occupational, recreational and other activities can be infected with the organism. To determine the seroprevalence of leptospirosis among cattle farmers, prevalence of pathogenic Leptospira, and the workplace environmental risk factors for leptospirosis among cattle farmers in northeastern Malaysia. A cross-sectional study involving 120 cattle farmers was conducted. The participants answered an interviewer-guided questionnaire that consisted of sociodemographic and workplace environment characteristics questionnaire, before having their blood sample taken for microscopic agglutination test (MAT). Seropositivity was determined using a cut-off titer of ≥1:100. 248 environmental samples were also collected from the cattle farms for polymerase chain reaction (PCR). The overall seroprevalence of leptospiral antibodies was 72.5% (95% CI 63.5% to 80.1%) and the prevalence of pathogenic Leptospira in the cattle farms environment was 12.1% (95% CI 8.4% to 17.0%). The independent factors associated with seropositivity of leptospirosis among cattle farmers were positive pathogenic Leptospira in the environment (Adj OR 5.90, 95% CI 1.34 to 26.01) and presence of garbage dumping in the farm (Adj OR 2.40, 95% CI 1.02 to 5.65). Preventing leptospirosis incidence among cattle farmers necessitates changes in work environment. Identifying modifiable factors may also contribute to the reduction of infection.
Weight-control behaviour and weight-concerns in young elite athletes – a systematic review
2013-01-01
Weight-control behaviour is commonly observed in a wide range of elite sports, especially leanness sports, where control over body weight is crucial for high peak performance. Nonetheless, there is only a fine line between purely functional behaviour and clinically relevant eating disorders. Especially the rapid form of weight manipulation seems to foster later eating disorders. So far, most studies have focussed on adult athletes and concentrated on manifest eating disorders. In contrast, our review concentrates on young athletes and weight-control behaviour as a risk factor for eating disorders. An electronic search according to PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Statement was performed using Pubmed, PsychInfo and Spolit. The following search terms were used: weight-control, weight-control behaviour, weight gain, weight loss, pathogenic weight-control behaviour and weight-concerns, each of them combined with elite athlete, young elite athlete, adolescent elite athlete and elite sports. Overall, data are inconsistent. In general, athletes do not seem to be at a higher risk for pathogenic weight concerns and weight-control behaviour. It does seem to be more prevalent in leanness sports, though. There is evidence for pathogenic weight-control behaviour in both genders; male athletes mostly trying to gain weight whereas females emphasise weight reduction. There is not enough data to make predictions about connections with age of onset. Young elite athletes do show weight-control behaviour with varying degrees of frequency and severity. In particular, leanness sports seem to be a risk factor for weight manipulation. Further research is needed for more details and possible connections. PMID:24999399
Chappell, Thomas M; Kennedy, George G
2018-06-21
Imidacloprid is widely used to manage tomato spotted wilt disease (TSW) in tobacco, tomato, and pepper, caused by Tomato spotted wilt orthotospovirus (TSWV) and spread by the tobacco thrips, Frankliniella fusca Hinds (Thysanoptera: Thripidae). Imidacloprid suppresses transmission of TSWV by reducing probing and feeding by adult thrips on treated plants, thereby reducing the probability of transmission by infectious thrips. Because imidacloprid does not reduce probing and feeding on treated plants to zero, the reduction in transmission probability per viruliferous thrips can be offset by an increase in the number of viruliferous thrips challenging treated plants. A composite of these effects which we call 'pathogen pressure' experienced by plants is a function of thrips population size, the proportion of those thrips that are viruliferous, and the probability that viruliferous thrips successfully inoculate plants. To better understand the relationship between imidacloprid's effect on virus transmission, pathogen pressure, and TSW incidence in tobacco, we modeled TSW incidence as a function of the two most important variables affecting components of pathogen pressure, temperature, and precipitation, and the dependence of imidacloprid's effect on pathogen pressure. A model incorporating imidacloprid's effect as a reduction in pathogen pressure was found to be more descriptive than models incorporating the effect as a reduction in TSW incidence. Results reveal maximum proportional reduction in TSW incidence resulting from imidacloprid use is associated with minimal potential TSW incidence. As pathogen pressure increases, potential TSW incidence approaches 100%, and the benefits of imidacloprid use are highest at intermediate levels of pathogen pressure.
Ishida, Tadashi; Ito, Akihiro; Washio, Yasuyoshi; Yamazaki, Akio; Noyama, Maki; Tokioka, Fumiaki; Arita, Machiko
2017-01-01
The new acronym, PES pathogens (Pseudomonas aeruginosa, Enterobacteriaceae extended-spectrum beta-lactamase-positive, and methicillin-resistant Staphylococcus aureus), was recently proposed to identify drug-resistant pathogens associated with community-acquired pneumonia. To evaluate the risk factors for antimicrobial-resistant pathogens in immunocompetent patients with pneumonia and to validate the role of PES pathogens. A retrospective analysis of a prospective observational study of immunocompetent patients with pneumonia between March 2009 and June 2015 was conducted. We clarified the risk factors for PES pathogens. Of the total 1559 patients, an etiological diagnosis was made in 705 (45.2%) patients. PES pathogens were identified in 51 (7.2%) patients, with 53 PES pathogens (P. aeruginosa, 34; ESBL-positive Enterobacteriaceae, 6; and MRSA, 13). Patients with PES pathogens had tendencies toward initial treatment failure, readmission within 30 days, and a prolonged hospital stay. Using multivariate analysis, female sex (adjusted odds ratio [AOR] 1.998, 95% confidence interval [CI] 1.047-3.810), admission within 90 days (AOR 2.827, 95% CI 1.250-6.397), poor performance status (AOR 2.380, 95% CI 1.047-5.413), and enteral feeding (AOR 5.808, 95% CI 1.813-18.613) were independent risk factors for infection with PES pathogens. The area under the receiver operating characteristics curve for the risk factors was 0.66 (95% CI 0.577-0.744). We believe the definition of PES pathogens is an appropriate description of drug-resistant pathogens associated with pneumonia in immunocompetent patients. The frequency of PES pathogens is quite low. However, recognition is critical because they can cause refractory pneumonia and different antimicrobial treatment is required. Copyright © 2016 The Author(s). Published by Elsevier Ltd.. All rights reserved.
Inhaled antibiotics in non-cystic fibrosis bronchiectasis: A meta-analysis.
Xu, Li; Zhang, Fei; Du, Shuai; Yu, Qi; Chen, Lin; Long, Li-Hui; Li, Ya-Ming; Jia, Ai-Hua
2016-09-01
To evaluate the efficacy and safety of inhaled antibiotics for the treatment of non-cystic fibrosis bronchiectasis (NCFB). Pubmed, Cochrane library, Embase, Elsevier, OVID, Springerlink, Web of knowledge and NEJM were searched for randomized controlled trials (RCTs) on inhaled antibiotics in treatment of NCFB from inception until April 2015. Meta-analysis was conducted to assess the efficacy and safety of inhaled antibiotics in the treatment of NCFB. Twelve RCTs involving 1154 participants were included. They showed that inhaled antibiotics were more effective in reduction of sputum bacterial density, eradication of P. aeruginosa, prolonged time to exacerbation and reduction of new pathogens emergence with no significant difference in adverse events compared with control groups. However, we did not find significant benefits of inhaled antibiotics in reducing the risk of acute exacerbation, improving health-related quality of life and reduction of P. aeruginosa resistance. Moreover, inhaled antibiotics exerted a statistically significant reduction in FEV1%. Inhaled antibiotics may be an alternative pathway to inhibit airway inflammation with no more adverse events in patients with NCFB.
Mai, Xiaodan; Genco, Robert J.; LaMonte, Michael J.; Hovey, Kathleen M.; Freudenheim, Jo L.; Andrews, Christopher A.; Wactawski-Wende, Jean
2016-01-01
Background Extraoral translocation of oral bacteria may contribute to associations between periodontal disease and cancer. The associations among the presence of three orange-complex periodontal pathogens (Fusobacterium nucleatum, Prevotella intermedia, and Campylobacter rectus), two red-complex periodontal pathogens (Porphyromonas gingivalis and Tannerella forsythia), and cancer risk were investigated. Methods A total of 1,252 postmenopausal females enrolled in the Buffalo Osteoporosis and Periodontal Disease Study were followed prospectively. Baseline subgingival plaque samples were assessed for the presence of periodontal pathogens using indirect immunofluorescence. Incident cancer cases were adjudicated by staff physicians via review of medical records. Cox proportional hazards regression was used to calculate hazard ratios (HRs) and 95% confidence intervals (CIs) for the associations of periodontal pathogens with total cancer and site-specific cancer risk in unadjusted and multivariable-adjusted models. Results Neither the presence of individual pathogens nor the presence of any red-complex pathogens was associated with total cancer or site-specific cancers. Borderline associations were seen among the presence of any orange-complex pathogens (F. nucleatum, P. intermedia, and C. rectus), total cancer risk (HR = 1.35, 95% CI = 1.00 to 1.84), and lung cancer risk (HR = 3.02, 95% CI = 0.98 to 9.29). Conclusions No associations were found between the presence of individual subgingival pathogens and cancer risk. However, there were suggestions of borderline positive associations of the presence of any orange-complex pathogens with total cancer and lung cancer risk. The study is limited by the small number of cancer cases and the assessment of only five oral bacteria. Additional research is needed to understand the possi ble role of periodontal disease in carcinogenesis. PMID:26513268
Purnell, Sarah; Ebdon, James; Buck, Austen; Tupper, Martyn; Taylor, Huw
2016-09-01
The aim of this study was to demonstrate how seasonal variability in the removal efficacy of enteric viral pathogens from an MBR-based water recycling system might affect risks to human health if the treated product were to be used for the augmentation of potable water supplies. Samples were taken over a twelve month period (March 2014-February 2015), from nine locations throughout a water recycling plant situated in East London and tested for faecal indicator bacteria (thermotolerant coliforms, intestinal enterococci n = 108), phages (somatic coliphage, F-specific RNA phage and Bacteroides phage (GB-124) n = 108), pathogenic viruses (adenovirus, hepatitis A, norovirus GI/GII n = 48) and a range of physico-chemical parameters (suspended solids, DO, BOD, COD). Thermotolerant coliforms and intestinal enterococci were removed effectively by the water recycling plant throughout the study period. Significant mean log reductions of 3.9-5.6 were also observed for all three phage groups monitored. Concentrations of bacteria and phages did not vary significantly according to season (P < 0.05; Kruskal-Wallis), though recorded levels of norovirus (GI) were significantly higher during autumn/winter months (P = 0.027; Kruskal-Wallis). Log reduction values for norovirus and adenovirus following MBR treatment were 2.3 and 4.4, respectively. However, both adenovirus and norovirus were detected at low levels (2000 and 3240 gene copies/L, respectively) post chlorination in single samples. Whilst phage concentrations did correlate with viral pathogens, the results of this study suggest that phages may not be suitable surrogates, as viral pathogen concentrations varied to a greater degree seasonally than did the phage indicators and were detected on a number of occasions on which phages were not detected (false negative sample results). Copyright © 2016 Elsevier Ltd. All rights reserved.
Mootian, Gabriel K; Flimlin, George E; Karwe, Mukund V; Schaffner, Donald W
2013-02-01
Shellfish may internalize dangerous pathogens during filter feeding. Traditional methods of depuration have been found ineffective against certain pathogens. The objective was to explore high hydrostatic pressure (HHP) as an alternative to the traditional depuration process. The effect of HHP on the survival of Vibrio parahaemolyticus in live clams (Mercanaria mercanaria) and the impact of HHP on physical characteristics of clam meat were investigated. Clams were inoculated with up to 7 log CFU/g of a cocktail of V. parahaemolyticus strains via filter feeding. Clams were processed at pressures ranging from 250 to 552 MPa for hold times ranging between 2 and 6 min. Processing conditions of 450 MPa for 4 min and 350 MPa for 6 min reduced the initial concentration of V. parahaemolyticus to a nondetectable level (<10(1) CFU/g), achieving >5 log reductions. The volume of clam meat (processed in shell) increased with negligible change in mass after exposure to pressure at 552 MPa for 3 min, while the drip loss was reduced. Clams processed at 552 MPa were softer compared to those processed at 276 MPa. However, all HHP processed clams were found to be harder compared to unprocessed. The lightness (L*) of the meat increased although the redness (a*) decreased with increasing pressure. Although high pressure-processed clams may pose a significantly lower risk from V. parahaemolyticus, the effect of the accompanied physical changes on the consumer's decision to purchase HHP clams remains to be determined. Shellfish may contain dangerous foodborne pathogens. Traditional methods of removing those pathogen have been found ineffective against certain pathogens. The objective of this research was to determine the effect of high hydrostatic pressure on V. parahaemolyticus in clams. Processing conditions of 450 MPa for 4 min and 350 MPa for 6 min reduced the initial concentration of V. parahaemolyticus to a nondetectable level, achieving >5 log reductions. © 2013 Institute of Food Technologists®
Development of a qualitative pathogen risk-assessment methodology for municipal-sludge landfilling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-04-01
This report addresses potential risks from microbiological pathogens present in municipal sludge disposal in landfills. Municipal sludges contain a wide variety of bacteria, viruses, protozoa, helminths, and fungi. Survival characteristics of pathogens are critical factors in assessing the risks associated with potential transport of microorganisms from the sludge-soil matrix to the ground-water environment of landfills. Various models are discussed for predicting microbial die-off. The order of persistence in the environment from longest to shortest survival time appears to be helminth eggs > viruses > bacteria > protozoan cysts. Whether or not a pathogen reaches ground-water and is transported to drinking-watermore » wells depends on a number of factors, including initial concentration of the pathogen, survival of the pathogen, number of pathogens that reach the sludge-soil interface, degree of removal through the unsaturated and saturated-soil zones, and the hydraulic gradient. The degree to which each of these factors will influence the probability of pathogens entering ground-water cannot be determined precisely. Information on the fate of pathogens at existing landfills is sorely lacking. Additional laboratory and field studies are needed to determine the degree of pathogen leaching, survival and transport in ground-water in order to estimate potential risks from pathogens at sludge landfills with reasonable validity.« less
Goss, Michael; Richards, Charlene
2008-06-01
Source water protection planning (SWPP) is an approach to prevent contamination of ground and surface water in watersheds where these resources may be abstracted for drinking or used for recreation. For SWPP the hazards within a watershed that could contribute to water contamination are identified together with the pathways that link them to the water resource. In rural areas, farms are significant potential sources of pathogens. A risk-based index can be used to support the assessment of the potential for contamination following guidelines on safety and operational efficacy of processes and practices developed as beneficial approaches to agricultural land management. Evaluation of the health risk for a target population requires knowledge of the strength of the hazard with respect to the pathogen load (massxconcentration). Manure handling and on-site wastewater treatment systems form the most important hazards, and both can comprise confined and unconfined source elements. There is also a need to understand the modification of pathogen numbers (attenuation) together with characteristics of the established pathways (surface or subsurface), which allow the movement of the contaminant species from a source to a receptor (water source). Many practices for manure management have not been fully evaluated for their impact on pathogen survival and transport in the environment. A key component is the identification of potential pathways of contaminant transport. This requires the development of a suitable digital elevation model of the watershed for surface movement and information on local groundwater aquifer systems for subsurface flows. Both require detailed soils and geological information. The pathways to surface and groundwater resources can then be identified. Details of land management, farm management practices (including animal and manure management) and agronomic practices have to be obtained, possibly from questionnaires completed by each producer within the watershed. To confirm that potential pathways are active requires some microbial source tracking. One possibility is to identify the molecular types of Escherichia coli present in each hazard on a farm. An essential part of any such index is the identification of mitigation strategies and practices that can reduce the magnitude of the hazard or block open pathways.
USDA-ARS?s Scientific Manuscript database
In the western United States where dairy wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks after inhalation exposure of pathogens aero...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
... ENVIRONMENTAL PROTECTION AGENCY [FRL-9311-4] Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids'' EPA/600/R-08/035F...
Hwang, Jusun; Lee, Kyunglee; Walsh, Daniel P.; Kim, SangWha; Sleeman, Jonathan M.; Lee, Hang
2018-01-01
Wildlife-associated diseases and pathogens have increased in importance; however, management of a large number of diseases and diversity of hosts is prohibitively expensive. Thus, the determination of priority wildlife pathogens and risk factors for disease emergence is warranted. We used an online questionnaire survey to assess release and exposure risks, and consequences of wildlife-associated diseases and pathogens in the Republic of Korea (ROK). We also surveyed opinions on pathways for disease exposure, and risk factors for disease emergence and spread. For the assessment of risk, we employed a two-tiered, statistical K-means clustering algorithm to group diseases into three levels (high, medium and low) of perceived risk based on release and exposure risks, societal consequences and the level of uncertainty of the experts’ opinions. To examine the experts’ perceived risk of routes of introduction of pathogens and disease amplification and spread, we used a Bayesian, multivariate normal order-statistics model. Six diseases or pathogens, including four livestock and two wildlife diseases, were identified as having high risk with low uncertainty. Similarly, 13 diseases were characterized as having high risk with medium uncertainty with three of these attributed to livestock, six associated with human disease, and the remainder having the potential to affect human, livestock and wildlife (i.e., One Health). Lastly, four diseases were described as high risk with high certainty, and were associated solely with fish diseases. Experts identified migration of wildlife, international human movement and illegal importation of wildlife as the three routes posing the greatest risk of pathogen introduction into ROK. Proximity of humans, livestock and wildlife was the most significant risk factor for promoting the spread of wildlife-associated diseases and pathogens, followed by high density of livestock populations, habitat loss and environmental degradation, and climate change. This study provides useful information to decision makers responsible for allocating resources to address disease risks. This approach provided a rapid, cost-effective method of risk assessment of wildlife-associated diseases and pathogens for which the published literature is sparse.
Cui, Qijia; Fang, Tingting; Huang, Yong; Dong, Peiyan; Wang, Hui
2017-07-01
The microbial quality of urban recreational water is of great concern to public health. The monitoring of indicator organisms and several pathogens alone is not sufficient to accurately and comprehensively identify microbial risks. To assess the levels of bacterial pathogens and health risks in urban recreational water, we analyzed pathogen diversity and quantified four pathogens in 46 water samples collected from waterbodies in Beijing Olympic Forest Park in one year. The pathogen diversity revealed by 16S rRNA gene targeted next-generation sequencing (NGS) showed that 16 of 40 genera and 13 of 76 reference species were present. The most abundant species were Acinetobacter johnsonii, Mycobacterium avium and Aeromonas spp. Quantitative polymerase chain reaction (qPCR) of Escherichia coli (uidA), Aeromonas (aerA), M. avium (16S rRNA), Pseudomonas aeruginosa (oaa) and Salmonella (invA) showed that the aerA genes were the most abundant, occurring in all samples with concentrations of 10 4-6 genome copies/100mL, followed by oaa, invA and M. avium. In total, 34.8% of the samples harbored all genes, indicating the prevalence of these pathogens in this recreational waterbody. Based on the qPCR results, a quantitative microbial risk assessment (QMRA) showed that the annual infection risks of Salmonella, M. avium and P. aeruginosa in five activities were mostly greater than the U.S. EPA risk limit for recreational contacts, and children playing with water may be exposed to the greatest infection risk. Our findings provide a comprehensive understanding of bacterial pathogen diversity and pathogen abundance in urban recreational water by applying both NGS and qPCR. Copyright © 2016. Published by Elsevier B.V.
Pathogen reduction of whole blood: utility and feasibility.
Allain, J-P; Goodrich, R
2017-10-01
To collect information on pathogen reduction applied to whole blood. Pathogen reduction (PR) of blood components has been developed over the past two decades, and pathogen-reduced fresh-frozen plasma and platelet concentrates are currently in clinical use. High cost and incomplete coverage of components make PR out of reach for low- and middle-income countries (LMIC). However, should PR become applicable to whole blood (WB), the main product transfused in sub-Saharan Africa, and be compatible with the preparation of clinically suitable components, cost would be minimised, and a range of safety measures in place at high cost in developed areas would become redundant. All articles called with "pathogen reduction", "pathogen inactivation" and "whole blood" were retrieved from Medline. References in articles were utilised. One such PR technology (PRT) applied to WB has been developed and has shown efficacious against viruses, bacteria and parasites in vitro; and has been able to inactivate nucleated blood cells whilst retaining the ability to prepare components with acceptable characteristics. The efficacy of this WB PRT has been demonstrated in vivo using the inactivation of Plasmodium falciparum as a model and showing a high degree of correlation between in vitro and in vivo data. Obtaining further evidence of efficacy on other suitable targets is warranted. Shortening of the process, which is currently around 50 min, or increasing the number of units simultaneously processed would be necessary to make PRT WB conducive to LMIC blood services' needs. Even if not 100% effective against agents that are present in high pathogen load titres, WB PRT could massively impact blood safety in LMIC by providing safer products at an affordable cost. © 2017 British Blood Transfusion Society.
Shih, Chia-Jen; Tarng, Der-Cherng; Yang, Wu-Chang; Yang, Chih-Yu
2014-07-01
Due to lifelong immunosuppression, renal transplant recipients (RTRs) are at risk of infectious complications such as pneumonia. Severe pneumonia results in respiratory failure and is life‑threatening. We aimed to examine the influence of immunosuppressant dose reduction on RTRs with bacterial pneumonia and respiratory failure. From January 2001 to January 2011, 33 of 1,146 RTRs at a single centre developed bacterial pneumonia with respiratory failure. All patients were treated using mechanical ventilation and aggressive therapies in the intensive care unit. Average time from kidney transplantation to pneumonia with respiratory failure was 6.8 years. In-hospital mortality rate was 45.5% despite intensive care and aggressive therapies. Logistic regression analysis indicated that a high serum creatinine level at the time of admission to the intensive care unit (odds ratio 1.77 per mg/dL, 95% confidence interval 1.01-3.09; p = 0.045) was a mortality determinant. Out of the 33 patients, immunosuppressive agents were reduced in 17 (51.5%). We found that although immunosuppressant dose reduction tended to improve in-hospital mortality, this was not statistically significant. Nevertheless, during a mean follow-up period of two years, none of the survivors (n = 18) developed acute rejection or allograft necrosis. In RTRs with bacterial pneumonia and respiratory failure, higher serum creatinine levels were a mortality determinant. Although temporary immunosuppressant dose reduction might not reduce mortality, it was associated with a minimal risk of acute rejection during the two-year follow-up. Our results suggest that early immunosuppressant reduction in RTRs with severe pneumonia of indeterminate microbiology may be safe even when pathogens are bacterial in nature.
TREATMENT OF MUNICIPAL SLUDGE FOR PATHOGEN REDUCTION
This presentation reviews the pathogenic microorganisms that may be found in municipal sewage sludge and the commonly employed Class A and B processes for controlling pathogens. It notes how extensively they are used and discusses issues and concerns with their application. The...
Prevention of infections in an ART laboratory: a reflection on simplistic methods.
Huyser, C
2014-01-01
Preventative measures combined with reactive remedial actions are generic management tools to optimize and protect an entity's core businesses. Differences between assisted reproduction technology (ART) laboratories in developing versus developed countries include restricted access to, or availability of resources, and the prevalence of pathological conditions that are endemic or common in non-industrialized regions. The aim of this paper is to discuss the prevention of infections in an ART laboratory in a low to middle-income country, with reference to simplistic risk reduction applications to avoid the introduction and transmission of pathogens. Diagnostic and procedural phases will be examined, i.e. (i) screening for microbes during patient evaluation, and (ii-iii) prevention of environmental and procedural contamination. Preventative action is enabled by knowledge of threats and the degree of risk involved. Awareness and understanding of the vulnerabilities in an ART system, wherein laboratory personnel operate, are invaluable assets when unforeseen equipment failure occurs or instant decisions have to be made to safeguard procedures. An inter-connective team approach to patient treatment, biosafety training and utilization of practical procedures such as semen decontamination, are fundamental tools in a laboratory's risk-reduction armoury to prevent and eliminate infectious elements.
Masumoto, Shota; Uchida, Masaki; Tojo, Motoaki; Herrero, Maria Luz; Mori, Akira S; Imura, Satoshi
2018-03-01
In Arctic tundra, plant pathogens have substantial effects on the growth and survival of hosts, and impacts on the carbon balance at the scale of ecological systems. To understand these effects on carbon dynamics across different scales including plant organ, individual, population and ecosystem, we focused on two primary factors: host productivity reduction and carbon consumption by the pathogen. We measured the effect of the pathogen on photosynthetic and respiratory activity in the host. We also measured respiration and the amount of carbon in the pathogen. We constructed a model based on these two factors, and calculated pathogenic effects on the carbon balance at different organismal and ecological scales. We found that carbon was reduced in infected leaves by 118% compared with healthy leaves; the major factor causing this loss was pathogenic carbon consumption. The carbon balance at the population and ecosystem levels decreased by 35% and 20%, respectively, at an infection rate of 30%. This case study provides the first evidence that a host plant can lose more carbon through pathogenic carbon consumption than through a reduction in productivity. Such a pathogenic effect could greatly change ecosystem carbon cycling without decreasing annual productivity.
Recycling of treated domestic effluent from an on-site wastewater treatment system for hydroponics.
Oyama, N; Nair, J; Ho, G E
2005-01-01
An alternative method to conserve water and produce crops in arid regions is through hydroponics. Application of treated wastewater for hydroponics will help in stripping off nutrients from wastewater, maximising reuse through reduced evaporation losses, increasing control on quality of water and reducing risk of pathogen contamination. This study focuses on the efficiency of treated wastewater from an on-site aerobic wastewater treatment unit. The experiment aimed to investigate 1) nutrient reduction 2) microbial reduction and 3) growth rate of plants fed on wastewater compared to a commercial hydroponics medium. The study revealed that the chemical and microbial quality of wastewater after hydroponics was safe and satisfactory for irrigation and plant growth rate in wastewater hydroponics was similar to those grown in a commercial medium.
Quantitative assessment of risk reduction from hand washing with antibacterial soaps.
Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A
2002-01-01
The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved from using different soap formulations after diaper changing using a microbial quantitative risk assessment approach. To achieve this, a probability of infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, it was determined that the probability of infection ranged from 24/100 to 91/100 for those changing diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (chlorohexadine 4%), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. Those with asymptomatic shigellosis who used a non-antibacterial control soap had a risk between 49/100,000 and 53/100, those who used the 4% chlorohexadine-containing soap had a risk between 43/100,000 and 51/100, and for those who used a 1.5% triclosan soap had a risk between 21/100,000 and 43/100. The adequate washing of hands after diapering reduces risk and can be further reduced by a factor of 20% by the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low risk outcomes.
OVERVIEW OF RISK ASSESSMENT FOR TOXIC AND PATHOGENIC AGENTS
Risk assessment is a process that defines the adverse health consequences of exposure to toxic or pathogenic agents. hen used in regulatory decision making, risk assessment is an important component of risk management, which "combines the risk assessment with the directives of re...
What is the risk for exposure to vector-borne pathogens in United States national parks?
Eisen, Lars; Wong, David; Shelus, Victoria; Eisen, Rebecca J
2013-03-01
United States national parks attract > 275 million visitors annually and collectively present risk of exposure for staff and visitors to a wide range of arthropod vector species (most notably fleas, mosquitoes, and ticks) and their associated bacterial, protozoan, or viral pathogens. We assessed the current state of knowledge for risk of exposure to vector-borne pathogens in national parks through a review of relevant literature, including internal National Park Service documents and organismal databases. We conclude that, because of lack of systematic surveillance for vector-borne pathogens in national parks, the risk of pathogen exposure for staff and visitors is unclear. Existing data for vectors within national parks were not based on systematic collections and rarely include evaluation for pathogen infection. Extrapolation of human-based surveillance data from neighboring communities likely provides inaccurate estimates for national parks because landscape differences impact transmission of vector-borne pathogens and human-vector contact rates likely differ inside versus outside the parks because of differences in activities or behaviors. Vector-based pathogen surveillance holds promise to define when and where within national parks the risk of exposure to infected vectors is elevated. A pilot effort, including 5-10 strategic national parks, would greatly improve our understanding of the scope and magnitude of vector-borne pathogen transmission in these high-use public settings. Such efforts also will support messaging to promote personal protection measures and inform park visitors and staff of their responsibility for personal protection, which the National Park Service preservation mission dictates as the core strategy to reduce exposure to vector-borne pathogens in national parks.
Nou, Xiangwu; Luo, Yaguang
2010-06-01
Currently, most fresh-cut processing facilities in the United States use chlorinated water or other sanitizer solutions for microbial reduction after lettuce is cut. Freshly cut lettuce releases significant amounts of organic matter that negatively impacts the effectiveness of chlorine or other sanitizers for microbial reduction. The objective of this study is to evaluate whether a sanitizer wash before cutting improves microbial reduction efficacy compared to a traditional postcutting sanitizer wash. Romaine lettuce leaves were quantitatively inoculated with E. coli O157:H7 strains and washed in chlorinated water before or after cutting, and E. coli O157:H7 cells that survived the washing process were enumerated to determine the effectiveness of microbial reduction for the 2 cutting and washing sequences. Whole-leaf washing in chlorinated water improved pathogen reduction by approximately 1 log unit over traditional cut-leaf sanitization. Similar improvement in the reduction of background microflora was also observed. Inoculated "Lollo Rossa" red lettuce leaves were mixed with noninoculated Green-Leaf lettuce leaves to evaluate pathogen cross-contamination during processing. High level (96.7% subsamples, average MPN 0.6 log CFU/g) of cross-contamination of noninoculated green leaves by inoculated red leaves was observed when mixed lettuce leaves were cut prior to washing in chlorinated water. In contrast, cross-contamination of noninoculated green leaves was significantly reduced (3.3% of subsamples, average MPN
Besner, Marie-Claude; Prévost, Michèle; Regli, Stig
2011-01-01
Low and negative pressure events in drinking water distribution systems have the potential to result in intrusion of pathogenic microorganisms if an external source of contamination is present (e.g., nearby leaking sewer main) and there is a pathway for contaminant entry (e.g., leaks in drinking water main). While the public health risk associated with such events is not well understood, quantitative microbial risk assessment can be used to estimate such risk. A conceptual model is provided and the state of knowledge, current assumptions, and challenges associated with the conceptual model parameters are presented. This review provides a characterization of the causes, magnitudes, durations and frequencies of low/negative pressure events; pathways for pathogen entry; pathogen occurrence in external sources of contamination; volumes of water that may enter through the different pathways; fate and transport of pathogens from the pathways of entry to customer taps; pathogen exposure to populations consuming the drinking water; and risk associated with pathogen exposure. Copyright © 2010 Elsevier Ltd. All rights reserved.
Aliberti, Stefano; Di Pasquale, Marta; Zanaboni, Anna Maria; Cosentini, Roberto; Brambilla, Anna Maria; Seghezzi, Sonia; Tarsia, Paolo; Mantero, Marco; Blasi, Francesco
2012-02-15
Not all risk factors for acquiring multidrug-resistant (MDR) organisms are equivalent in predicting pneumonia caused by resistant pathogens in the community. We evaluated risk factors for acquiring MDR bacteria in patients coming from the community who were hospitalized with pneumonia. Our evaluation was based on actual infection with a resistant pathogen and clinical outcome during hospitalization. An observational, prospective study was conducted on consecutive patients coming from the community who were hospitalized with pneumonia. Data on admission and during hospitalization were collected. Logistic regression models were used to evaluate risk factors for acquiring MDR bacteria independently associated with the actual presence of a resistant pathogen and in-hospital mortality. Among the 935 patients enrolled in the study, 473 (51%) had at least 1 risk factor for acquiring MDR bacteria on admission. Of all risk factors, hospitalization in the preceding 90 days (odds ratio [OR], 4.87 95% confidence interval {CI}, 1.90-12.4]; P = .001) and residency in a nursing home (OR, 3.55 [95% CI, 1.12-11.24]; P = .031) were independent predictors for an actual infection with a resistant pathogen. A score able to predict pneumonia caused by a resistant pathogen was computed, including comorbidities and risk factors for MDR. Hospitalization in the preceding 90 days and residency in a nursing home were also independent predictors for in-hospital mortality. Risk factors for acquiring MDR bacteria should be weighted differently, and a probabilistic approach to identifying resistant pathogens among patients coming from the community with pneumonia should be embraced.
Neonatal resuscitation equipment: A hidden risk for our babies?
Winckworth, Lucinda C; McLaren, Emma; Lingeswaran, Arvin; Kelsey, Michael
2016-05-01
Neonatal infections carry a heavy burden of morbidity and mortality. Poor practice can result in unintentional colonisation of medical equipment with potentially pathogenic organisms. This study will determine the prevalence and type of bacterial contamination on exposed neonatal resuscitation equipment in different clinical settings and explore simple measures to reduce contamination risk. A survey determined the rates of resuscitation equipment usage. All environmentally exposed items were identified on resuscitaires hospital-wide and swabbed for bacterial contamination. A new cleaning and storage policy was implemented and the prevalence of environmentally exposed equipment re-measured post-intervention. Resuscitation equipment was used in 28% of neonatal deliveries. Bacterial colony forming units were present on 44% of the 236 exposed equipment pieces swabbed. There was no significant difference in contamination rates between equipment types. Coagulase negative staphylococcus was the most prevalent species (59 pieces, 25%) followed by Escherichia coli and Enterobacter cloacae (20 pieces, 9% each). Opened items stored inside plastic remained sterile, whilst those in low-use areas had significantly less contamination than those in high-use areas (22% vs. 51%, P < 0.05). Implementing a simple educational programme led to a significant reduction in environmentally exposed equipment (79% reduction, P < 0.01). Pathogenic bacteria can colonise commonly used pieces of neonatal resuscitation equipment. Whilst the clinical significance remains uncertain, equipment should be kept packaged until required and discarded once open, even if unused. Standardising cleaning policies results in rapid and significant improvements in equipment storage conditions, reducing microbial colonisation opportunities. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).
Understanding growers' decisions to manage invasive pathogens at the farm level.
Breukers, Annemarie; van Asseldonk, Marcel; Bremmer, Johan; Beekman, Volkert
2012-06-01
Globalization causes plant production systems to be increasingly threatened by invasive pests and pathogens. Much research is devoted to support management of these risks. Yet, the role of growers' perceptions and behavior in risk management has remained insufficiently analyzed. This article aims to fill this gap by addressing risk management of invasive pathogens from a sociopsychological perspective. An analytical framework based on the Theory of Planned Behavior was used to explain growers' decisions on voluntary risk management measures. Survey information from 303 Dutch horticultural growers was statistically analyzed, including regression and cluster analysis. It appeared that growers were generally willing to apply risk management measures, and that poor risk management was mainly due to perceived barriers, such as high costs and doubts regarding efficacy of management measures. The management measures applied varied considerably among growers, depending on production sector and farm-specific circumstances. Growers' risk perception was found to play a role in their risk management, although the causal relation remained unclear. These results underscore the need to apply a holistic perspective to farm level management of invasive pathogen risk, considering the entire package of management measures and accounting for sector- and farm-specific circumstances. Moreover, they demonstrate that invasive pathogen risk management can benefit from a multidisciplinary approach that incorporates growers' perceptions and behavior.
Hwang, J; Lee, K; Walsh, D; Kim, S W; Sleeman, J M; Lee, H
2018-02-01
Wildlife-associated diseases and pathogens have increased in importance; however, management of a large number of diseases and diversity of hosts is prohibitively expensive. Thus, the determination of priority wildlife pathogens and risk factors for disease emergence is warranted. We used an online questionnaire survey to assess release and exposure risks, and consequences of wildlife-associated diseases and pathogens in the Republic of Korea (ROK). We also surveyed opinions on pathways for disease exposure, and risk factors for disease emergence and spread. For the assessment of risk, we employed a two-tiered, statistical K-means clustering algorithm to group diseases into three levels (high, medium and low) of perceived risk based on release and exposure risks, societal consequences and the level of uncertainty of the experts' opinions. To examine the experts' perceived risk of routes of introduction of pathogens and disease amplification and spread, we used a Bayesian, multivariate normal order-statistics model. Six diseases or pathogens, including four livestock and two wildlife diseases, were identified as having high risk with low uncertainty. Similarly, 13 diseases were characterized as having high risk with medium uncertainty with three of these attributed to livestock, six associated with human disease, and the remainder having the potential to affect human, livestock and wildlife (i.e., One Health). Lastly, four diseases were described as high risk with high certainty, and were associated solely with fish diseases. Experts identified migration of wildlife, international human movement and illegal importation of wildlife as the three routes posing the greatest risk of pathogen introduction into ROK. Proximity of humans, livestock and wildlife was the most significant risk factor for promoting the spread of wildlife-associated diseases and pathogens, followed by high density of livestock populations, habitat loss and environmental degradation, and climate change. This study provides useful information to decision makers responsible for allocating resources to address disease risks. This approach provided a rapid, cost-effective method of risk assessment of wildlife-associated diseases and pathogens for which the published literature is sparse. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
Reduction of microbiological risk in minced meat by a combination of natural antimicrobials.
Klančnik, Anja; Piskernik, Saša; Bucar, Franz; Vučković, Darinka; Možina, Sonja Smole; Jeršek, Barbara
2014-10-01
Responsibility for food safety must be taken through the entire food-production chain, to avoid consumer cross-contamination. The antimicrobial activities of an Alpinia katsumadai seed extract and epigallocatechin gallate (EGCG), and their combination, were evaluated against individual food-borne pathogenic strains of Listeria monocytogenes, Escherichia coli and Campylobacter jejuni, individually and as a cocktail, in chicken-meat juice and sterile minced meat as food models, and in minced meat with the naturally present microflora, as an actual food sample. The antimicrobial combination of the A. katsumadai extract and EGCG was the most efficient for C. jejuni growth inhibition, followed by inhibition of L. monocytogenes, which was reduced more efficiently in the bacterial cocktail than as an individual strain. The antimicrobial combination added to minced meat at refrigeration temperatures used in the food chain (8 °C) revealed inhibition of these pathogens and inhibition of the naturally present bacteria after 5 days. The antibacterial efficiencies of the tested combinations are influenced by storage temperature. Food safety can be improved by using the appropriate combination of natural antimicrobials to reduce the microbiological risk of minced meat. © 2014 Society of Chemical Industry.
Eastern equine encephalitis virus: high seroprevalence in horses from Southern Quebec, Canada, 2012.
Rocheleau, Jean-Philippe; Arsenault, Julie; Lindsay, L Robbin; DiBernardo, Antonia; Kulkarni, Manisha A; Côté, Nathalie; Michel, Pascal
2013-10-01
Eastern equine encephalitis virus (EEEV) is a highly pathogenic arbovirus that infects humans, horses, and other animals. There has been a significant increase in EEEV activity in southeastern Canada since 2008. Few data are available regarding nonlethal EEEV infections in mammals, and consequently the distribution and pathogenicity spectrum of EEEV infections in these hosts is poorly understood. This cross-sectional study focuses on the evaluation of viral activity in southern Quebec's horses by seroprevalence estimation. A total of 196 horses, 18 months and older, which had never been vaccinated against EEEV and have never traveled outside Canada, were sampled from 92 barns distributed throughout three administrative regions of southern Quebec. Blood samples were taken from each horse and titrated for EEEV antibodies by plaque reduction neutralization test (PRNT). Equine population vaccination coverage was estimated by surveying horse owners and equine practitioners. PRNT results revealed an EEEV seroprevalence up to 8.7%, with 95% confidence limits ranging from 4.4% to 13.0%. Vaccination coverage was estimated to be at least 79%. Our study reveals for the first time in Canada a measure of EEEV seroprevalence in horses. High seroprevalence in unvaccinated animals challenges the perception that EEEV is a highly lethal pathogen in horses. Monitoring high-risk vector-borne infections such as EEEV in animal populations can be an important element of a public health surveillance strategy, population risk assessment and early detection of epidemics.
NASA Astrophysics Data System (ADS)
Thupila, Nunticha; Ratana-arporn, Pattama; Wilaipun, Pongtep
2011-07-01
In Thailand, white scar oyster ( Crassostrea belcheri) was ranked for premium quality, being most expensive and of high demand. This oyster is often eaten raw, hence it may pose health hazards to consumers when contaminated with food-borne pathogens. As limited alternative methods are available to sterilize the oyster while preserving the raw characteristic, irradiation may be considered as an effective method for decontamination. In this study, the radiation resistance of pathogenic bacteria commonly contaminating the oyster and the optimum irradiation doses for sterilization of the most radiation resistant bacteria were investigated. The radiation decimal reduction doses ( D10) of Salmonella Weltevreden DMST 33380, Vibrio parahaemolyticus ATCC 17802 and Vibrio vulnificus DMST 5852 were determined in broth culture and inoculated oyster homogenate. The D10 values of S. Weltevreden, V. parahaemolyticus and V. vulnificus in broth culture were 0.154, 0.132 and 0.059 kGy, while those of inoculated oyster homogenate were 0.330, 0.159 and 0.140 kGy, respectively. It was found that among the pathogens tested, S. Weltevreden was proved to be the most resistant species. An irradiation dose of 1.5 kGy reduced the counts of 10 5 CFU/g S. Weltevreden inoculated in oyster meat to an undetectable level. The present study indicated that a low-dose irradiation can improve the microbial quality of oyster and further reduce the risks from the food-borne pathogens without adversely affecting the sensory attributes.
Richgels, Katherine L. D.; Russell, Robin E.; Adams, Michael J.; White, C. LeAnn; Campbell Grant, Evan H.
2016-01-01
A newly identified fungal pathogen, Batrachochytrium salamandrivorans (Bsal), is responsible for mass mortality events and severe population declines in European salamanders. The eastern USA has the highest diversity of salamanders in the world and the introduction of this pathogen is likely to be devastating. Although data are inevitably limited for new pathogens, disease-risk assessments use best available data to inform management decisions. Using characteristics of Bsal ecology, spatial data on imports and pet trade establishments, and salamander species diversity, we identify high-risk areas with both a high likelihood of introduction and severe consequences for local salamanders. We predict that the Pacific coast, southern Appalachian Mountains and mid-Atlantic regions will have the highest relative risk from Bsal. Management of invasive pathogens becomes difficult once they are established in wildlife populations; therefore, import restrictions to limit pathogen introduction and early detection through surveillance of high-risk areas are priorities for preventing the next crisis for North American salamanders.
What is the Risk for Exposure to Vector-Borne Pathogens in United States National Parks?
EISEN, LARS; WONG, DAVID; SHELUS, VICTORIA; EISEN, REBECCA J.
2015-01-01
United States national parks attract >275 million visitors annually and collectively present risk of exposure for staff and visitors to a wide range of arthropod vector species (most notably fleas, mosquitoes, and ticks) and their associated bacterial, protozoan, or viral pathogens. We assessed the current state of knowledge for risk of exposure to vector-borne pathogens in national parks through a review of relevant literature, including internal National Park Service documents and organismal databases. We conclude that, because of lack of systematic surveillance for vector-borne pathogens in national parks, the risk of pathogen exposure for staff and visitors is unclear. Existing data for vectors within national parks were not based on systematic collections and rarely include evaluation for pathogen infection. Extrapolation of human-based surveillance data from neighboring communities likely provides inaccurate estimates for national parks because landscape differences impact transmission of vector-borne pathogens and human-vector contact rates likely differ inside versus outside the parks because of differences in activities or behaviors. Vector-based pathogen surveillance holds promise to define when and where within national parks the risk of exposure to infected vectors is elevated. A pilot effort, including 5–10 strategic national parks, would greatly improve our understanding of the scope and magnitude of vector-borne pathogen transmission in these high-use public settings. Such efforts also will support messaging to promote personal protection measures and inform park visitors and staff of their responsibility for personal protection, which the National Park Service preservation mission dictates as the core strategy to reduce exposure to vector-borne pathogens in national parks. PMID:23540107
Chronic Azithromycin Use in Cystic Fibrosis and Risk of Treatment-Emergent Respiratory Pathogens.
Cogen, Jonathan D; Onchiri, Frankline; Emerson, Julia; Gibson, Ronald L; Hoffman, Lucas R; Nichols, David P; Rosenfeld, Margaret
2018-02-23
Azithromycin has been shown to improve lung function and reduce the number of pulmonary exacerbations in cystic fibrosis patients. Concerns remain, however, regarding the potential emergence of treatment-related respiratory pathogens. To determine if chronic azithromycin use (defined as thrice weekly administration) is associated with increased rates of detection of eight specific respiratory pathogens. We performed a new-user, propensity-score matched retrospective cohort study utilizing data from the Cystic Fibrosis Foundation Patient Registry. Incident azithromycin users were propensity-score matched 1:1 with contemporaneous non-users. Kaplan-Meier curves and Cox proportional hazards regression were used to evaluate the association between chronic azithromycin use and incident respiratory pathogen detection. Analyses were performed separately for each pathogen, limited to patients among whom that pathogen had not been isolated in the two years prior to cohort entry. After propensity score matching, mean age of the cohorts was ~12 years. Chronic azithromycin users had a significantly lower risk of detection of new methicillin-resistant Staphylococcus aureus, non-tuberculous mycobacteria, and Burkholderia cepacia complex compared to non-users. The risk of acquiring the remaining five pathogens was not significantly different between users and non-users. Using an innovative new-user, propensity-score matched study design to minimize indication and selection biases, we found in a predominantly pediatric cohort that chronic azithromycin users had a lower risk of acquiring several cystic fibrosis-related respiratory pathogens. These results may ease concerns that chronic azithromycin exposure increases the risk of acquiring new respiratory pathogens among pediatric cystic fibrosis patients.
The Pathogen Equivalency Committee has updated the criteria it uses to make recommendations of equivalency on innovative or alternative sludge pathogen reduction processes. To assist new applicants through the equivalency recommendation process the pathogen equivalency committee ...
Forest species diversity reduces disease risk in a generalist plant pathogen invasion
Haas, Sarah E.; Hooten, Mevin B.; Rizzo, David M.; Meentemeyer, Ross K.
2011-01-01
Empirical evidence suggests that biodiversity loss can increase disease transmission, yet our understanding of the 'diversity-disease hypothesis' for generalist pathogens in natural ecosystems is limited. We used a landscape epidemiological approach to examine two scenarios regarding diversity effects on the emerging plant pathogen Phytophthora ramorum across a broad, heterogeneous ecoregion: (1) an amplification effect exists where disease risk is greater in areas with higher plant diversity due to the pathogen's wide host range, or (2) a dilution effect where risk is reduced with increasing diversity due to lower competency of alternative hosts. We found evidence for pathogen dilution, whereby disease risk was lower in sites with higher species diversity, after accounting for potentially confounding effects of host density and landscape heterogeneity. Our results suggest that although nearly all plants in the ecosystem are hosts, alternative hosts may dilute disease transmission by competent hosts, thereby buffering forest health from infectious disease.
ESTIMATING THE RISK OF INFECTIOUS DISEASE ASSOCIATED WITH PATHOGENS IN DRINKING WATER
Most of the microorganisms present in aquatic environments seem to have no effect upon the health of humans. However, some clearly do represent a public health risk, and for this reason the latter are considered to be pathogenic in nature and referred to as being "pathogens". The...
UV Influence on the Re-Growth of Pathogens in Cow Fecal Extract
The health risks pathogens pose to water and food resources are highly dependent on their fate and transport in agricultural settings. In order to assess these risks, and understanding of the factors that influence pathogen fate in agricultural settings is needed and is critical ...
Elving, Josefine; Vinnerås, Björn; Albihn, Ann; Ottoson, Jakob R
2014-01-01
Thermal treatment at temperatures between 46.0°C and 55.0°C was evaluated as a method for sanitization of organic waste, a temperature interval less commonly investigated but important in connection with biological treatment processes. Samples of dairy cow feces inoculated with Salmonella Senftenberg W775, Enterococcus faecalis, bacteriophage ϕX174, and porcine parvovirus (PPV) were thermally treated using block thermostats at set temperatures in order to determine time-temperature regimes to achieve sufficient bacterial and viral reduction, and to model the inactivation rate. Pasteurization at 70°C in saline solution was used as a comparison in terms of bacterial and viral reduction and was proven to be effective in rapidly reducing all organisms with the exception of PPV (decimal reduction time of 1.2 h). The results presented here can be used to construct time-temperature regimes in terms of bacterial inactivation, with D-values ranging from 0.37 h at 55°C to 22.5 h at 46.0°C and 0.45 h at 55.0°C to 14.5 h at 47.5°C for Salmonella Senftenberg W775 and Enterococcus faecalis, respectively and for relevant enteric viruses based on the ϕX174 phage with decimal reduction times ranging from 1.5 h at 55°C to 16.5 h at 46°C. Hence, the study implies that considerably lower treatment temperatures than 70°C can be used to reach a sufficient inactivation of bacterial pathogens and potential process indicator organisms such as the ϕX174 phage and raises the question whether PPV is a valuable process indicator organism considering its extreme thermotolerance.
Modern banking, collection, compatibility testing and storage of blood and blood components.
Green, L; Allard, S; Cardigan, R
2015-01-01
The clinical practice of blood transfusion has changed considerably over the last few decades. The potential risk of transfusion transmissible diseases has directed efforts towards the production of safe and high quality blood. All transfusion services now operate in an environment of ever-increasing regulatory controls encompassing all aspects of blood collection, processing and storage. Stringent donor selection, identification of pathogens that can be transmitted through blood, and development of technologies that can enhance the quality of blood, have all led to a substantial reduction in potential risks and complications associated with blood transfusion. In this article, we will discuss the current standards required for the manufacture of blood, starting from blood collection, through processing and on to storage. © 2014 The Association of Anaesthetists of Great Britain and Ireland.
Interaction of probiotics and pathogens--benefits to human health?
Salminen, Seppo; Nybom, Sonja; Meriluoto, Jussi; Collado, Maria Carmen; Vesterlund, Satu; El-Nezami, Hani
2010-04-01
The probiotic terminology has matured over the years and currently a unified definition has been formed. Lactic acid bacteria (LAB) and bifidobacteria have been reported to remove heavy metals, cyanotoxins and mycotoxins from aqueous solutions. The binding processes appear to be species and strain specific. The most efficient microbial species and strains in the removal of these compounds vary between components tested. However, it is of interest to note that most strains characterized until now do not bind positive components or nutrients in the diet. This has significant implications to future detoxification biotechnology development. In a similar manner, lactic acid bacteria and bifidobacteria interact directly with viruses and pathogens in food and water as well as toxin producing microbes and some toxins. This review updates information and aims to characterize these interactions in association. The target is to understand probiotic health effects and to relate the mechanisms and actions to future potential of specific probiotic bacteria on decontamination of foods and water, and diets. The same aim is targeted in characterizing the role of probiotics in inactivating pathogens and viruses of health importance to facilitate the establishment of novel means of disease risk reduction related health benefits. Copyright 2010. Published by Elsevier Ltd.
Computational Modeling in Support of Global Eradication of Infectious Diseases
NASA Astrophysics Data System (ADS)
Eckhoff, Philip A.; Gates, William H., III; Myhrvold, Nathan P.; Wood, Lowell
2014-07-01
The past century has seen tremendous advances in global health, with broad reductions in the worldwide burden of infectious disease. Science has fundamentally advanced our understanding of disease etiology and medicine has provided remarkable capabilities to diagnose many syndromes and to target the causative pathogen. The advent and proliferation of antibiotics has dramatically lowered the impact of infections that were once near certain death sentences. Vaccination has provided a route to protect each new birth cohort from pathogens which once killed a substantial fraction of each generation, and in some countries, vaccination coverage has been raised to sufficiently high levels to fully interrupt transmission of major pathogens. There were 7 million deaths among children under 5 years of age in 2010, substantially down from decades past, and even more so in terms of deaths per capita per year of populations at risk. However, the annual rate globally is 1,070 per 100,000, while in developed countries the rate is only 137 per 100,000 (IHME GBD, 2010). Therefore, bringing global rates down to rates already achieved in developed countries represents the huge gains currently available via means such as vaccination and access to modern health care...
Comprehensive bactericidal activity of an ethanol-based hand gel in 15 seconds.
Kampf, Günter; Hollingsworth, Angela
2008-01-22
Some studies indicate that the commonly recommended 30 s application time for the post contamination treatment of hands may not be necessary as the same effect may be achieved with some formulations in a shorter application time such as 15 s. We evaluated the bactericidal activity of an ethanol-based hand gel (Sterillium Comfort Gel) within 15 s in a time-kill-test against 11 Gram-positive, 16 Gram-negative bacteria and 11 emerging bacterial pathogens. Each strain was evaluated in quadruplicate. The hand gel (85% ethanol, w/w) was found to reduce all 11 Gram-positive and all 16 Gram-negative bacteria by more than 5 log10 steps within 15 s, not only against the ATCC test strains but also against corresponding clinical isolates. In addition, a log10 reduction > 5 was observed against all tested emerging bacterial pathogens. The ethanol-based hand gel was found to have a broad spectrum of bactericidal activity in only 15 s which includes the most common species causing nosocomial infections and the relevant emerging pathogens. Future research will hopefully help to find out if a shorter application time for the post contamination treatment of hands provides more benefits or more risks.
Amoah, Isaac Dennis; Reddy, Poovendhree; Seidu, Razak; Stenström, Thor Axel
2018-05-01
Wastewater may contain contaminants harmful to human health; hence, there is the need for treatment before discharge. Centralized wastewater treatment systems are the favored treatment options globally, but these are not necessarily superior in reduction of pathogens as compared to decentralized wastewater treatment systems (collectively called DEWATS). This study was therefore undertaken to assess the soil-transmitted helminth (STH) and Taenia sp. egg reduction efficiency of selected anaerobic baffled reactors and planted gravel filters compared to centralized wastewater treatment plants in South Africa and Lesotho. The risk of ascariasis with exposure to effluents from the centralized wastewater treatment plants was also assessed using the quantitative microbial risk assessment (QMRA) approach. Eggs of Ascaris spp., hookworm, Trichuris spp., Taenia spp., and Toxocara spp. were commonly detected in the untreated wastewater. The DEWATS plants removed between 95 and 100% of the STH and Taenia sp. eggs, with centralized plants removing between 67 and 100%. Helminth egg concentrations in the final effluents from the centralized wastewater treatment plants were consistently higher than those in the WHO recommended guideline (≤ 1 helminth egg/L) for agricultural use resulting in higher risk of ascariasis. Therefore, in conclusion, DEWATS plants may be more efficient in reducing the concentration of helminth eggs in wastewater, resulting in lower risks of STH infections upon exposure.
Predicting pathogen introduction: West Nile virus spread to Galáipagos.
Kilpatrick, A Marm; Daszak, Peter; Goodman, Simon J; Rogg, Helmuth; Kramer, Laura D; Cedeño, Virna; Cunningham, Andrew A
2006-08-01
Emerging infectious diseases are a key threat to conservation and public health, yet predicting and preventing their emergence is notoriously difficult. We devised a predictive model for the introduction of a zoonotic vector-borne pathogen by considering each of the pathways by which it may be introduced to a new area and comparing the relative risk of each pathway. This framework is an adaptation of pest introduction models and estimates the number of infectious individuals arriving in a location and the duration of their infectivity. We used it to determine the most likely route for the introduction of West Nile virus to Galápagos and measures that can be taken to reduce the risk of introduction. The introduction of this highly pathogenic virus to this unique World Heritage Site could have devastating consequences, similar to those seen following introductions of pathogens into other endemic island faunas. Our model identified the transport of mosquitoes on airplanes as the highest risk for West Nile virus introduction. Pathogen dissemination through avian migration and the transportation of day-old chickens appeared to be less important pathways. Infected humans and mosquitoes transported in sea containers, in tires, or by wind all represented much lower risk. Our risk-assessment framework has broad applicability to other pathogens and other regions and depends only on the availability of data on the transport of goods and animals and the epidemiology of the pathogen.
Visser, M; Stephan, D; Jaynes, J M; Burger, J T
2012-06-01
Natural and synthetic antimicrobial peptides (AMPs) are of increasing interest as potential resistance conferring elements in plants against pathogen infection. The efficacy of AMPs against pathogens is prescreened by in vitro assays, and promising AMP candidates are introduced as transgenes into plants. As in vitro and in planta environments differ, a prescreening procedure of the AMP efficacy in the plant environment is desired. Here, we report the efficacy of the purified synthetic peptide D4E1 against the grapevine-infecting bacterial pathogens Agrobacterium vitis and Xylophilus ampelinus in vitro and describe for the first time an in planta prescreening procedure based on transiently expressed D4E1. The antimicrobial effect of D4E1 against Ag. vitis and X. ampelinus was shown by a reduction in colony-forming units in vitro in a traditional plate-based assay and by a reduction in bacterial titres in planta as measured by quantitative real-time PCR (qPCR) in grapevine leaves transiently expressing D4E1. A statistically significant reduction in titre was shown for X. ampelinus, but for Ag. vitis, a significant reduction in titre was only observed in a subset of plants. The titres of both grapevine-infecting bacterial pathogens were reduced in an in vitro assay and for X. ampelinus in an in planta assay by D4E1 application. This widens the applicability of D4E1 as a potential resistance-enhancing element to additional pathogens and in a novel plant species. D4E1 is a promising candidate to confer enhanced resistance against the two tested grapevine bacterial pathogens, and the applied transient expression system proved to be a valuable tool for prescreening of D4E1 efficacy in an in planta environment. The described prescreening procedure can be used for other AMPs and might be adapted to other plant species and pathogens before the expensive and tedious development of stably transgenic lines is started. © 2012 The Authors. Letters in Applied Microbiology © 2012 The Society for Applied Microbiology.
78 FR 66010 - Draft Risk Profile on Pathogens and Filth in Spices; Availability
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-04
... about the frequency and levels of pathogen and/or filth contamination of spices throughout the food... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2013-N-1204] Draft Risk Profile on Pathogens and Filth in Spices; Availability AGENCY: Food and Drug Administration...
Kwan, Grace; Charkowski, Amy O; Barak, Jeri D
2013-02-12
Although enteric human pathogens are usually studied in the context of their animal hosts, a significant portion of their life cycle occurs on plants. Plant disease alters the phyllosphere, leading to enhanced growth of human pathogens; however, the impact of human pathogens on phytopathogen biology and plant health is largely unknown. To characterize the interaction between human pathogens and phytobacterial pathogens in the phyllosphere, we examined the interactions between Pectobacterium carotovorum subsp. carotovorum and Salmonella enterica or Escherichia coli O157:H7 with regard to bacterial populations, soft rot progression, and changes in local pH. The presence of P. carotovorum subsp. carotovorum enhanced the growth of both S. enterica and E. coli O157:H7 on leaves. However, in a microaerophilic environment, S. enterica reduced P. carotovorum subsp. carotovorum populations and soft rot progression by moderating local environmental pH. Reduced soft rot was not due to S. enterica proteolytic activity. Limitations on P. carotovorum subsp. carotovorum growth, disease progression, and pH elevation were not observed on leaves coinoculated with E. coli O157:H7 or when leaves were coinoculated with S. enterica in an aerobic environment. S. enterica also severely undermined the relationship between the phytobacterial population and disease progression of a P. carotovorum subsp. carotovorum budB mutant defective in the 2,3-butanediol pathway for acid neutralization. Our results show that S. enterica and E. coli O157:H7 interact differently with the enteric phytobacterial pathogen P. carotovorum subsp. carotovorum. S. enterica inhibition of soft rot progression may conceal a rapidly growing human pathogen population. Whereas soft rotted produce can alert consumers to the possibility of food-borne pathogens, healthy-looking produce may entice consumption of contaminated vegetables. Salmonella enterica and Escherichia coli O157:H7 may use plants to move between animal and human hosts. Their populations are higher on plants cocolonized with the common bacterial soft rot pathogen Pectobacterium carotovorum subsp. carotovorum, turning edible plants into a risk factor for human disease. We inoculated leaves with P. carotovorum subsp. carotovorum and S. enterica or E. coli O157:H7 to study the interactions between these bacteria. While P. carotovorum subsp. carotovorum enhanced the growth of both S. enterica and E. coli O157:H7, these human pathogens affected P. carotovorum subsp. carotovorum fundamentally differently. S. enterica reduced P. carotovorum subsp. carotovorum growth and acidified the environment, leading to less soft rot on leaves; E. coli O157:H7 had no such effects. As soft rot signals a food safety risk, the reduction of soft rot symptoms in the presence of S. enterica may lead consumers to eat healthy-looking but S. enterica-contaminated produce.
Reggeti, Mariana; Romero, Emilse; Eblen-Zajjur, Antonio
2016-06-01
There is a risk for an avian influenza AH5N1 virus pandemia. To estimate the magnitude and impact of an AH5N1 pandemic in areas of Latin-America in order to design interventions and to reduce morbidity-mortality. The InfluSim program was used to simulate a highly pathogenic AH5N1 aviar virus epidemic outbreak with human to human transmission in Valencia, Venezuela. We estimated the day of maximal number of cases, number of moderately and severely ill patients, exposed individuals, deaths and associated costs for 5 different interventions: absence of any intervention; implementation of antiviral treatment; reduction of 20% in population general contacts; closure of 20% of educational institutions; and reduction of 50% in massive public gatherings. Simulation parameters used were: population: 829.856 persons, infection risk 6-47%, contagiousness Index Rh o 2,5; relative contagiousness 90%, overall lethality 64,1% and, costs according to the official basic budget. For an outbreak lasting 200 days direct and indirect deaths by intervention strategies would be: 29,907; 29,900; 9,701; 29,295 and 14,752. Costs would follow a similar trend. Reduction of 20% in general population contacts results in a significant reduction of up to 68% of cases. The outbreak would collapse the health care system. Antiviral treatment would not be efficient during the outbreak. Interpersonal contact reduction proved to be the best sanitary measure to control an AH5N1 theoretical epidemic outbreak.
BEACH Act amendment to Clean Water Act requires EPA to establish more expeditious methods for the timely detection of pathogens and pathogen indicators in coastal waters New methods should demonstrate utility for and be compatible with all CWA 304(a) criteria needs including:...
Respiratory Viruses and Treatment Failure in Children With Asthma Exacerbation.
Merckx, Joanna; Ducharme, Francine M; Martineau, Christine; Zemek, Roger; Gravel, Jocelyn; Chalut, Dominic; Poonai, Naveen; Quach, Caroline
2018-06-04
: media-1vid110.1542/5771275574001PEDS-VA_2017-4105 Video Abstract OBJECTIVES: Respiratory pathogens commonly trigger pediatric asthma exacerbations, but their impact on severity and treatment response remains unclear. We performed a secondary analysis of the Determinants of Oral Corticosteroid Responsiveness in Wheezing Asthmatic Youth (DOORWAY) study, a prospective cohort study of children (aged 1-17 years) presenting to the emergency department with moderate or severe exacerbations. Nasopharyngeal specimens were analyzed by RT-PCR for 27 respiratory pathogens. We investigated the association between pathogens and both exacerbation severity (assessed with the Pediatric Respiratory Assessment Measure) and treatment failure (hospital admission, emergency department stay >8 hours, or relapse) of a standardized severity-specific treatment. Logistic multivariate regressions were used to estimate average marginal effects (absolute risks and risk differences [RD]). Of 958 participants, 61.7% were positive for ≥1 pathogen (rhinovirus was the most prevalent [29.4%]) and 16.9% experienced treatment failure. The presence of any pathogen was not associated with higher baseline severity but with a higher risk of treatment failure (20.7% vs 12.5%; RD = 8.2% [95% confidence interval: 3.3% to 13.1%]) compared to the absence of a pathogen. Nonrhinovirus pathogens were associated with an increased absolute risk (RD) of treatment failure by 13.1% (95% confidence interval: 6.4% to 19.8%), specifically, by 8.8% for respiratory syncytial virus, 24.9% for influenza, and 34.1% for parainfluenza. Although respiratory pathogens were not associated with higher severity on presentation, they were associated with increased treatment failure risk, particularly in the presence of respiratory syncytial virus, influenza, and parainfluenza. This supports influenza prevention in asthmatic children, consideration of pathogen identification on presentation, and exploration of treatment intensification for infected patients at higher risk of treatment failure. Copyright © 2018 by the American Academy of Pediatrics.
Bueno, I; Smith, K M; Sampedro, F; Machalaba, C C; Karesh, W B; Travis, D A
2016-06-01
Wildlife trade (both formal and informal) is a potential driver of disease introduction and emergence. Legislative proposals aim to prevent these risks by banning wildlife imports, and creating 'white lists' of species that are cleared for importation. These approaches pose economic harm to the pet industry, and place substantial burden on importers and/or federal agencies to provide proof of low risk for importation of individual species. As a feasibility study, a risk prioritization tool was developed to rank the pathogens found in rodent species imported from Latin America into the United States with the highest risk of zoonotic consequence in the United States. Four formally traded species and 16 zoonotic pathogens were identified. Risk scores were based on the likelihood of pathogen release and human exposure, and the severity of the disease (consequences). Based on the methodology applied, three pathogens (Mycobacterium microti, Giardia spp. and Francisella tularensis) in one species (Cavia porcellus) were ranked as highest concern. The goal of this study was to present a methodological approach by which preliminary management resources can be allocated to the identified high-concern pathogen-species combinations when warranted. This tool can be expanded to other taxa and geographic locations to inform policy surrounding the wildlife trade. © 2015 Blackwell Verlag GmbH.
Prioritizing Risks and Uncertainties from Intentional Release of Selected Category A Pathogens
Hong, Tao; Gurian, Patrick L.; Huang, Yin; Haas, Charles N.
2012-01-01
This paper synthesizes available information on five Category A pathogens (Bacillus anthracis, Yersinia pestis, Francisella tularensis, Variola major and Lassa) to develop quantitative guidelines for how environmental pathogen concentrations may be related to human health risk in an indoor environment. An integrated model of environmental transport and human health exposure to biological pathogens is constructed which 1) includes the effects of environmental attenuation, 2) considers fomite contact exposure as well as inhalational exposure, and 3) includes an uncertainty analysis to identify key input uncertainties, which may inform future research directions. The findings provide a framework for developing the many different environmental standards that are needed for making risk-informed response decisions, such as when prophylactic antibiotics should be distributed, and whether or not a contaminated area should be cleaned up. The approach is based on the assumption of uniform mixing in environmental compartments and is thus applicable to areas sufficiently removed in time and space from the initial release that mixing has produced relatively uniform concentrations. Results indicate that when pathogens are released into the air, risk from inhalation is the main component of the overall risk, while risk from ingestion (dermal contact for B. anthracis) is the main component of the overall risk when pathogens are present on surfaces. Concentrations sampled from untracked floor, walls and the filter of heating ventilation and air conditioning (HVAC) system are proposed as indicators of previous exposure risk, while samples taken from touched surfaces are proposed as indicators of future risk if the building is reoccupied. A Monte Carlo uncertainty analysis is conducted and input-output correlations used to identify important parameter uncertainties. An approach is proposed for integrating these quantitative assessments of parameter uncertainty with broader, qualitative considerations to identify future research priorities. PMID:22412915
Ao, Dong; Chen, Rong; Wang, Xiaochang C; Liu, Yanzheng; Dzakpasu, Mawuli; Zhang, Lu; Huang, Yue; Xue, Tao; Wang, Nan
2018-05-01
The extensive use of reclaimed wastewater (RW) as a source of urban landscape pond replenishment, stimulated by the lack of surface water (SW) resources, has raised public concern. Greater attention should be paid to pond sediments, which act as 'sinks' and 'sources' of contaminants to the overlying pond water. Three ponds replenished with RW (RW ponds) in three Chinese cities were chosen to investigate 22 indices of sediment quality in four categories: eutrophication, heavy metal, ecotoxicity and pathogens risk. RW ponds were compared with other ponds of similar characteristics in the same cities that were replenished with SW (SW ponds). Our results show a strong impact of RW to the eutrophication and pathogenic risks, which are represented by organic matter, water content, total nitrogen, total phosphorus and phosphorus fractions, and pathogens. In particular, total phosphorus concentrations in the RW pond sediments were, on average, 50% higher than those of SW ponds. Moreover, the content of phosphorus, extracted by bicarbonate/dithionite (normally represented by BD-P) and NaOH (NaOH-P), were 2.0- and 2.83-times higher in RW ponds, respectively. For pathogens, the concentrations of norovirus and rotavirus in RW pond sediments were, on average, 0.52 and 0.30- log times those of SW ponds. The duration of RW replenishment was proved to have a marked impact on the eutrophication and pathogens risks from sediments. The continued use of RW for replenishment increases the eutrophication risk, and the pathogens risk, especially by viral pathogens, becomes greater. Copyright © 2018 Elsevier Ltd. All rights reserved.
El-Mougy, Nehal S.; Abdel-Kader, Mokhtar M.
2013-01-01
Evaluation of the efficacy of blue-green algal compounds against the growth of either pathogenic or antagonistic microorganisms as well as their effect on the antagonistic ability of bioagents was studied under in vitro conditions. The present study was undertaken to explore the inhibitory effect of commercial algal compounds, Weed-Max and Oligo-Mix, against some soil-borne pathogens. In growth medium supplemented with these algal compounds, the linear growth of pathogenic fungi decreased by increasing tested concentrations of the two algal compounds. Complete reduction in pathogenic fungal growth was observed at 2% of both Weed-Max and Oligo-Mix. Gradual significant reduction in the pathogenic fungal growth was caused by the two bioagents and by increasing the concentrations of algal compounds Weed-Max and Oligo-Mix. The present work showed that commercial algal compounds, Weed-Max and Oligo-Mix, have potential for the suppression of soil-borne fungi and enhance the antagonistic ability of fungal, bacterial, and yeast bio-agents. PMID:24307948
Reyher, K K; Dohoo, I R; Scholl, D T; Keefe, G P
2012-07-01
Major mastitis pathogens such as Staphylococcus aureus, Streptococcus uberis, Streptococcus dysgalactiae, and coliforms are usually considered more virulent and damaging to the udder than minor mastitis pathogens such as Corynebacterium spp. and coagulase-negative staphylococci (CNS). The current literature comprises several studies (n=38) detailing analyses with conflicting results as to whether intramammary infections (IMI) with the minor pathogens decrease, increase, or have no effect on the risk of a quarter acquiring a new IMI (NIMI) with a major pathogen. The Canadian Bovine Mastitis Research Network has a large mastitis database derived from a 2-yr data collection on a national cohort of dairy farms, and data from this initiative were used to further investigate the effect of IMI with minor pathogens on the acquisition of new major pathogen infections (defined as a culture-positive quarter sample in a quarter that had been free of that major pathogen in previous samples in the sampling period). Longitudinal milk samplings of clinically normal udders taken over several 6-wk periods as well as samples from cows pre-dry-off and postcalving were used to this end (n=80,397 quarter milk samples). The effects of CNS and Corynebacterium spp. on the major mastitis pathogens Staph. aureus, Strep. uberis, Strep. dysgalactiae, and coliform bacteria (Escherichia coli and Klebsiella spp.) were investigated using risk ratio analyses and multilevel logistic regression models. Quarter-, cow- and herd-level susceptibility parameters were also evaluated and were able to account for the increased susceptibility that exists within herds, cows and quarters, removing it from estimates for the effects of the minor pathogens. Increased quarter-level susceptibility was associated with increased risk of major pathogen NIMI for all pathogens except the coliforms. Increased somatic cell count was consistently associated with elevated risk of new major pathogen infections, but this was assumed to be a result of low sensitivity of bacteriology to diagnose major pathogen NIMI expediently and accurately. The presence of CNS in the sample 2 samplings before the occurrence of a NIMI increased the odds of experiencing a Staph. aureus NIMI 2.0 times, making the presence of CNS a risk factor for acquiring a Staph. aureus NIMI. Even with this extensive data set, power was insufficient to make a definitive statement about the effect of minor pathogen IMI on the acquisition of major pathogen NIMI. Definitively answering questions of this nature are likely to require an extremely large data set dedicated particularly to minor pathogen presence and NIMI with major pathogens. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Gkana, E; Chorianopoulos, N; Grounta, A; Koutsoumanis, K; Nychas, G-J E
2017-04-01
The objective of the present study was to determine the factors affecting the transfer of foodborne pathogens from inoculated beef fillets to non-inoculated ones, through food processing surfaces. Three different levels of inoculation of beef fillets surface were prepared: a high one of approximately 10 7 CFU/cm 2 , a medium one of 10 5 CFU/cm 2 and a low one of 10 3 CFU/cm 2 , using mixed-strains of Listeria monocytogenes, or Salmonella enterica Typhimurium, or Escherichia coli O157:H7. The inoculated fillets were then placed on 3 different types of surfaces (stainless steel-SS, polyethylene-PE and wood-WD), for 1 or 15 min. Subsequently, these fillets were removed from the cutting boards and six sequential non-inoculated fillets were placed on the same surfaces for the same period of time. All non-inoculated fillets were contaminated with a progressive reduction trend of each pathogen's population level from the inoculated fillets to the sixth non-inoculated ones that got in contact with the surfaces, and regardless the initial inoculum, a reduction of approximately 2 log CFU/g between inoculated and 1st non-inoculated fillet was observed. S. Typhimurium was transferred at lower mean population (2.39 log CFU/g) to contaminated fillets than E. coli O157:H7 (2.93 log CFU/g), followed by L. monocytogenes (3.12 log CFU/g; P < 0.05). Wooden surfaces (2.77 log CFU/g) enhanced the transfer of bacteria to subsequent fillets compared to other materials (2.66 log CFU/g for SS and PE; P < 0.05). Cross-contamination between meat and surfaces is a multifactorial process strongly depended on the species, initial contamination level, kind of surface, contact time and the number of subsequent fillet, according to analysis of variance. Thus, quantifying the cross-contamination risk associated with various steps of meat processing and food establishments or households can provide a scientific basis for risk management of such products. Copyright © 2016 Elsevier Ltd. All rights reserved.
A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters
USDA-ARS?s Scientific Manuscript database
In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...
Thorrington, Dominic; Andrews, Nick; Stowe, Julia; Miller, Elizabeth; van Hoek, Albert Jan
2018-02-08
The seven-valent pneumococcal conjugate vaccine (PCV) was introduced in England in September 2006, changing to the 13-valent vaccine in April 2010. PCV impact on invasive pneumococcal disease (IPD) has been extensively reported, but less described is its impact on the burden of pneumonia, sepsis and otitis media in the hospital. Using details on all admissions to hospitals in England, we compared the incidence of pneumococcal-specific and syndromic disease endpoints in a 24-month pre-PCV period beginning April 2004 to the 24-month period ending March 2015 to derive incidence rate ratios (IRRs). To adjust for possible secular trends in admission practice, IRRs were compared to the IRRs for five control conditions over the same period and the relative change assessed using the geometric mean of the five control IRRs as a composite, and individually for each control condition to give the min-max range. Relative changes were also compared with IRRs for IPD from the national laboratory database. The effect of stratifying cases into those with and without clinical risk factors for pneumococcal infection was explored. Relative reductions in pneumococcal pneumonia were seen in all age groups and in those with and without risk factors; in children under 15 years old reductions were similar in magnitude to reductions in IPD. For pneumonia of unspecified cause, relative reductions were seen in those under 15 years old (maximum reduction in children under 2 years of 34%, min-max: 11-49%) with a relative increase in 65+ year olds most marked in those with underlying risk conditions (41%, min-max: 0-82%). Reductions in pneumococcal sepsis were seen in all age groups, with the largest reduction in children younger than 2 years (67%, min-max 56-75%). Reductions in empyema and lung abscess were also seen in under 15 year olds. Results for other disease endpoints were varied. For disease endpoints showing an increase in raw IRR, the increase was generally reduced when expressed as a relative change. Use of a composite control and stratification by risk group status can help elucidate the impact of PCV on non-IPD disease endpoints and in vulnerable population groups. We estimate a substantial reduction in the hospitalised burden of pneumococcal pneumonia in all age groups and pneumonia of unspecified cause, empyema and lung abscess in children under 15 years of age since PCV introduction. The increase in unspecified pneumonia in high-risk 65+ year olds may in part reflect their greater susceptibility to develop pneumonia from less pathogenic serotypes that are replacing vaccine types in the nasopharynx.
Non-hospital based registered nurses and the risk of bloodborne pathogen exposure.
Gershon, Robyn R M; Qureshi, Kristine A; Pogorzelska, Monika; Rosen, Jonathan; Gebbie, Kristine M; Brandt-Rauf, Paul W; Sherman, Martin F
2007-10-01
The aim of this study was to assess the risk of blood and body fluid exposure among non-hospital based registered nurses (RNs) employed in New York State. The study population was mainly unionized public sector workers, employed in state institutions. A self-administered questionnaire was completed by a random stratified sample of members of the New York State Nurses Association and registered nurse members of the New York State Public Employees Federation. Results were reviewed by participatory action research (PAR) teams to identify opportunities for improvement. Nine percent of respondents reported at least one needlestick injury in the 12-month period prior to the study. The percutaneous injury (PI) rate was 13.8 per 100 person years. Under-reporting was common; 49% of all PIs were never formally reported and 70% never received any post-exposure care. Primary reasons for not reporting included: time constraints, fear, and lack of information on reporting. Significant correlates of needlestick injuries included tenure, patient load, hours worked, lack of compliance with standard precautions, handling needles and other sharps, poor safety climate, and inadequate training and availability of safety devices (p<0.05). PAR teams identified several risk reduction strategies, with an emphasis on safety devices. Non-hospital based RNs are at risk for bloodborne exposure at rates comparable to hospital based RNs; underreporting is an important obstacle to infection prevention, and primary and secondary risk management strategies appeared to be poorly implemented. Intervention research is warranted to evaluate improved risk reduction practices tailored to this population of RNs.
Barker, S Fiona; Amoah, Philip; Drechsel, Pay
2014-07-15
With a rapidly growing urban population in Kumasi, Ghana, the consumption of street food is increasing. Raw salads, which often accompany street food dishes, are typically composed of perishable vegetables that are grown in close proximity to the city using poor quality water for irrigation. This study assessed the risk of gastroenteritis illness (caused by rotavirus, norovirus and Ascaris lumbricoides) associated with the consumption of street food salads using Quantitative Microbial Risk Assessment (QMRA). Three different risk assessment models were constructed, based on availability of microbial concentrations: 1) Water - starting from irrigation water quality, 2) Produce - starting from the quality of produce at market, and 3) Street - using microbial quality of street food salad. In the absence of viral concentrations, published ratios between faecal coliforms and viruses were used to estimate the quality of water, produce and salad, and annual disease burdens were determined. Rotavirus dominated the estimates of annual disease burden (~10(-3)Disability Adjusted Life Years per person per year (DALYs pppy)), although norovirus also exceeded the 10(-4)DALY threshold for both Produce and Street models. The Water model ignored other on-farm and post-harvest sources of contamination and consistently produced lower estimates of risk; it likely underestimates disease burden and therefore is not recommended. Required log reductions of up to 5.3 (95th percentile) for rotavirus were estimated for the Street model, demonstrating that significant interventions are required to protect the health and safety of street food consumers in Kumasi. Estimates of virus concentrations were a significant source of model uncertainty and more data on pathogen concentrations is needed to refine QMRA estimates of disease burden. Copyright © 2014 Elsevier B.V. All rights reserved.
Tanner, Benjamin D
2009-02-01
Surface-mediated infectious disease transmission is a major concern in various settings, including schools, hospitals, and food-processing facilities. Chemical disinfectants are frequently used to reduce contamination, but many pose significant risks to humans, surfaces, and the environment, and all must be properly applied in strict accordance with label instructions to be effective. This study set out to determine the capability of a novel chemical-free, saturated steam vapor disinfection system to kill microorganisms, reduce surface-mediated infection risks, and serve as an alternative to chemical disinfectants. High concentrations of Escherichia coli, Shigella flexneri, vancomycin-resistant Enterococcus faecalis (VRE), methicillin-resistant Staphylococcus aureus (MRSA), Salmonella enterica, methicillin-sensitive Staphylococcus aureus, MS2 coliphage (used as a surrogate for nonenveloped viruses including norovirus), Candida albicans, Aspergillus niger, and the endospores of Clostridium difficile were dried individually onto porous clay test surfaces. Surfaces were treated with the saturated steam vapor disinfection system for brief periods and then numbers of surviving microorganisms were determined. Infection risks were calculated from the kill-time data using microbial dose-response relationships published in the scientific literature, accounting for surface-to-hand and hand-to-mouth transfer efficiencies. A diverse assortment of pathogenic microorganisms was rapidly killed by the steam disinfection system; all of the pathogens tested were completely inactivated within 5 seconds. Risks of infection from the contaminated surfaces decreased rapidly with increasing periods of treatment by the saturated steam vapor disinfection system. The saturated steam vapor disinfection system tested for this study is chemical-free, broadly active, rapidly efficacious, and therefore represents a novel alternative to liquid chemical disinfectants.
USDA-ARS?s Scientific Manuscript database
Inhouse composting is a management practice to reduce pathogen in poultry litter. In between flocks, growers windrow the litter inside the broiler houses. This results in high temperatures that can reduce some pathogens in the litter. However, this practice is likely to increase emissions of NH3 and...
Food decontamination using nanomaterials
USDA-ARS?s Scientific Manuscript database
The research indicates that nanomaterials including nanoemulsions are promising decontamination media for the reduction of food contaminating pathogens. The inhibitory effect of nanoparticles for pathogens could be due to deactivate cellular enzymes and DNA; disrupting of membrane permeability; and/...
Mathurand, Prateek; Schaffner, Donald W
2013-06-01
Ceviche is a raw fish dish common in Peru and other Latin American counties. The most characteristic feature of ceviche is the use of lime juice for marinating or "cooking" the raw fish. Confirmed cases of cholera in Peru, New Jersey, and Florida have been associated with ceviche. Although the effect of organic acids on pathogenic bacteria has been well characterized, few data exist on the effect of these acids in seafood systems. The objective of the study was to evaluate the effects of lime juice marination on pathogens likely to be present in ceviche. Tilapia (Oreochromis niloticus) fillet pieces were inoculated with Vibrio parahaemolyticus and Salmonella enterica (>7 log CFU/g) and incubated at 25 and 4°C for 30 or 120 min in the presence of fresh lime juice at concentrations typical for the preparation of ceviche. Similar levels of cells were also inoculated into fresh lime juice without tilapia. Surviving cells were enumerated on selective (xylose lysine Tergitol 4 and thiosulfate-bile-citrate-sucrose) and nonselective (tryptic soy agar) media. V. parahaemolyticus levels were reduced to below detection limits (∼5-log reduction) under all conditions studied. Salmonella strains on tilapia were much more resistant to inactivation and were only slightly reduced (∼1- to 2-log reduction). Salmonella and V. parahaemolyticus inoculated directly into lime juice without tilapia were all reduced to below detection limits (∼5-log reduction). A typical ceviche recipe reduces V. parahaemolyticus risk significantly but is less effective for control of S. enterica.
Future research needs involving pathogens in groundwater
NASA Astrophysics Data System (ADS)
Bradford, Scott A.; Harvey, Ronald W.
2017-06-01
Contamination of groundwater by enteric pathogens has commonly been associated with disease outbreaks. Proper management and treatment of pathogen sources are important prerequisites for preventing groundwater contamination. However, non-point sources of pathogen contamination are frequently difficult to identify, and existing approaches for pathogen detection are costly and only provide semi-quantitative information. Microbial indicators that are readily quantified often do not correlate with the presence of pathogens. Pathogens of emerging concern and increasing detections of antibiotic resistance among bacterial pathogens in groundwater are topics of growing concern. Adequate removal of pathogens during soil passage is therefore critical for safe groundwater extraction. Processes that enhance pathogen transport (e.g., high velocity zones and preferential flow) and diminish pathogen removal (e.g., reversible retention and enhanced survival) are of special concern because they increase the risk of groundwater contamination, but are still incompletely understood. Improved theory and modeling tools are needed to analyze experimental data, test hypotheses, understand coupled processes and controlling mechanisms, predict spatial and/or temporal variability in model parameters and uncertainty in pathogen concentrations, assess risk, and develop mitigation and best management approaches to protect groundwater.
Future research needs involving pathogens in groundwater
Bradford, Scott A.; Harvey, Ronald W.
2017-01-01
Contamination of groundwater by enteric pathogens has commonly been associated with disease outbreaks. Proper management and treatment of pathogen sources are important prerequisites for preventing groundwater contamination. However, non-point sources of pathogen contamination are frequently difficult to identify, and existing approaches for pathogen detection are costly and only provide semi-quantitative information. Microbial indicators that are readily quantified often do not correlate with the presence of pathogens. Pathogens of emerging concern and increasing detections of antibiotic resistance among bacterial pathogens in groundwater are topics of growing concern. Adequate removal of pathogens during soil passage is therefore critical for safe groundwater extraction. Processes that enhance pathogen transport (e.g., high velocity zones and preferential flow) and diminish pathogen removal (e.g., reversible retention and enhanced survival) are of special concern because they increase the risk of groundwater contamination, but are still incompletely understood. Improved theory and modeling tools are needed to analyze experimental data, test hypotheses, understand coupled processes and controlling mechanisms, predict spatial and/or temporal variability in model parameters and uncertainty in pathogen concentrations, assess risk, and develop mitigation and best management approaches to protect groundwater.
Food Safety Impacts from Post-Harvest Processing Procedures of Molluscan Shellfish.
Baker, George L
2016-04-18
Post-harvest Processing (PHP) methods are viable food processing methods employed to reduce human pathogens in molluscan shellfish that would normally be consumed raw, such as raw oysters on the half-shell. Efficacy of human pathogen reduction associated with PHP varies with respect to time, temperature, salinity, pressure, and process exposure. Regulatory requirements and PHP molluscan shellfish quality implications are major considerations for PHP usage. Food safety impacts associated with PHP of molluscan shellfish vary in their efficacy and may have synergistic outcomes when combined. Further research for many PHP methods are necessary and emerging PHP methods that result in minimal quality loss and effective human pathogen reduction should be explored.
Gorham, T J; Lee, J
2016-05-01
Canada geese (Branta canadensis) faeces have been shown to contain pathogenic protozoa and bacteria in numerous studies over the past 15 years. Further, increases in both the Canada geese populations and their ideal habitat requirements in the United States (US) translate to a greater presence of these human pathogens in public areas, such as recreational freshwater beaches. Combining these factors, the potential health risk posed by Canada geese faeces at freshwater beaches presents an emerging public health issue that warrants further study. Here, literature concerning human pathogens in Canada geese faeces is reviewed and the potential impacts these pathogens may have on human health are discussed. Pathogens of potential concern include Campylobacter jejuni, Salmonella Typhimurium, Listeria monocytogenes, Helicobacter canadensis, Arcobacter spp., Enterohemorragic Escherichia coli pathogenic strains, Chlamydia psitacci, Cryptosporidium parvum and Giardia lamblia. Scenarios presenting potential exposure to pathogens eluted from faeces include bathers swimming in lakes, children playing with wet and dry sand impacted by geese droppings and other common recreational activities associated with public beaches. Recent recreational water-associated disease outbreaks in the US support the plausibility for some of these pathogens, including Cryptosporidium spp. and C. jejuni, to cause human illness in this setting. In view of these findings and the uncertainties associated with the real health risk posed by Canada geese faecal pathogens to users of freshwater lakes, it is recommended that beach managers use microbial source tracking and conduct a quantitative microbial risk assessment to analyse the local impact of Canada geese on microbial water quality during their decision-making process in beach and watershed management. © 2015 Blackwell Verlag GmbH.
Estimated health risks to swimmers from seagull and bather sources of fecal contamination at Doheny Beach, California were compared using quantitative microbial risk assessment (QMRA) with a view to aiding beach closure decisions. Surfzone pathogens from seagulls were thought to...
Di Pietro, A; Picerno, I; Visalli, G; Chirico, C; Scoglio, M E
2004-01-01
In order to improve the knowledge of host/pathogenic agent interaction and to obtain a more careful estimation of risk related to ingestion of food contaminated by Vibrio spp., the effects of bile extracts have been studied. The growth of one V. fluvialis, two V. alginolyticus, and three V. parahaemolyticus strains, isolated from mollusks and crustaceans, has been determined to evaluate their adaptability to intestinal environment. Moreover, the expression of virulence factors responsible for the colonization, as bacterial "swarming mobility", biofilm production, adherence on epithelial cells and hydrophobicity, has been evaluated. Using a bile concentration of 1.5%, all examined strains showed a constant inhibitory effect, quite moderate in the first growth phases. Bile increased the "swarming mobility" and biofilm production; also the adherence was favored, but only after adaptation and during the early logarithmic phase. The decreased hydrophobicity could explain the reduction of adherence during the stationary phase. Studying the phenotypic expression of virulence factors in "minor vibrios" in the presence of bile, it was possible to extend the knowledge about their pathogenetic mechanisms owing to the ingestion of contaminated food. That permits a more careful estimation of risk related to the contamination, considering the high frequency of isolation of these species in some seafood.
Enteric pathogen sampling of tourist restaurants in Bangkok, Thailand.
Teague, Nathan S; Srijan, Apichai; Wongstitwilairoong, Boonchai; Poramathikul, Kamonporn; Champathai, Thanaporn; Ruksasiri, Supaporn; Pavlin, Julie; Mason, Carl J
2010-01-01
Travelers' diarrhea (TD) is the most prevalent disorder affecting travelers to developing countries. Thailand is considered "moderately risky" for TD acquisition, but the risk by city visited or behavior of the visitor has yet to be definitely defined. Restaurant eating is consistently associated with the acquisition of diarrhea while traveling, and pathogen-free meals serve as a marker of public health success. This study seeks to ascertain a traveler's risk of exposure to certain bacterial gastric pathogens while eating at Bangkok restaurants recommended in popular tourist guide books. A cross-sectional tourist restaurant survey was conducted. Thirty-five restaurants recommended in the two top selling Bangkok guidebooks on Amazon.com were sampled for bacterial pathogens known to cause diarrhea in Thailand, namely Salmonella, Campylobacter, and Arcobacter (a Campylobacter-like organism). A total of 70 samples from two meals at each restaurant were obtained. Suspected bacterial pathogens were isolated by differential culture and tested for antibiotic resistance. Salmonella group E was isolated from one meal (2%), and Arcobacter butzleri from nine meals (13%). Campylobacter spp. were not found. The large majority of A butzleri isolates were resistant to azithromycin but susceptible to ciprofloxacin and an aminoglycoside. A traveler's risk of exposure to established bacterial pathogens, Salmonella and Campylobacter, by eating in recommended restaurants is small. Arcobacter butzleri exposure risk is 13% per meal eaten, and rises to 75% when 10 meals are eaten. All restaurants, regardless of price, appear to be equally "risky." Current evidence points to Arcobacter being pathogenic in humans; however, further research is needed to conclusively define pathogenicity. Routine prophylaxis for diarrhea is not recommended; however, travelers should be aware of the risk and come prepared with adequate and appropriate self-treatment medications.
MANAGING MICROBIAL CONTAMINATION IN URBAN WATERSHEDS
This paper presents different approaches for controlling pathogen contamination in urban watersheds for contamination resulting from point and diffuse sources. Point sources of pathogens can be treated by a disinfection technology of known effectiveness, and a desired reduction ...
MANAGING MICROBIAL CONTAMINATION IN URBAN WATERSHEDS
This paper presents different approaches for controlling pathogen contamination in urban watersheds for contamination resulting from point and diffuses sources. Point sources of pathogens can be treated by a disinfection technology of known effectiveness, and a desired reduction ...
Reduced MHC Alloimmunization and Partial Tolerance Protection With Pathogen Reduction Of Whole Blood
Jackman, Rachael P.; Muench, Marcus O.; Inglis, Heather; Heitman, John W.; Marschner, Susanne; Goodrich, Raymond P.; Norris, Philip J.
2017-01-01
BACKGROUND Allogeneic blood transfusion can result in an immune response against major histocompatibility complex (MHC) antigens, potentially complicating future transfusions or transplants. We have previously shown that pathogen reduction of platelet-rich plasma (PRP) with riboflavin and UV light (UV+R) can prevent alloimmunization in mice. A similar pathogen reduction treatment is currently under development for the treatment of whole blood using riboflavin and a higher dose of UV light. We sought to determine the effectiveness of this treatment in prevention of alloimmunization. STUDY DESIGN AND METHODS BALB/c mice were transfused with untreated or UV+R treated allogeneic C57Bl/6 whole blood with or without leukoreduction. Mice were evaluated for donor specific antibodies and ex vivo splenocyte cytokine responses, as well as for changes in the frequency of regulatory T (Treg) cells. RESULTS UV+R treatment blocked cytokine priming and reduced anti-MHC alloantibody responses to transfused whole blood. Leukoreduction reduced alloantibody levels in both the untreated and UV+R groups. Mice transfused with UV+R treated whole blood had reduced alloantibody and cytokine responses when subsequently transfused with untreated blood from the same donor type. This reduction in responses was not associated with increased Treg cells. CONCLUSIONS Pathogen reduction of whole blood with UV+R significantly reduces, but does not eliminate the alloimmune response. Exposure to UV+R treated whole blood transfusion does appear to induce tolerance to alloantigens resulting in reduced anti-MHC alloantibody and cytokine responses to subsequent exposures to the same alloantigens. This tolerance does not appear to be driven by an increase in Treg cells. PMID:27859333
Evaluation of radio-frequency heating in controlling Salmonella enterica in raw shelled almonds.
Jeong, Seul-Gi; Baik, Oon-Doo; Kang, Dong-Hyun
2017-08-02
This study was conducted to investigate the efficacy of radio-frequency (RF) heating to reduce Salmonella enterica serovars Enteritidis, Typhimurium, and Senftenberg in raw shelled almonds compared to conventional convective heating, and the effect of RF heating on quality by measuring changes in the color and degree of lipid oxidation. Agar-grown cells of three pathogens were inoculated onto the surface or inside of raw shelled almonds using surface inoculation or the vacuum perfusion method, respectively, and subjected to RF or conventional heating. RF heating for 40s achieved 3.7-, 6.0-, and 5.6-log reductions in surface-inoculated S. Enteritidis, S. Typhimurium, and S. Senftenberg, respectively, whereas the reduction of these pathogens following convective heating for 600s was 1.7, 2.5, and 3.7 log, respectively. RF heating reduced internally inoculated pathogens to below the detection limit (0.7 logCFU/g) after 30s. However, conventional convective heating did not attain comparable reductions even at the end of treatment (600s). Color values, peroxide values, and acid values of RF-treated (40-s treatment) almonds were not significantly (P>0.05) different from those of nontreated samples. These results suggest that RF heating can be applied to control internalized pathogens as well as surface-adhering pathogens in raw almonds without affecting product quality. Copyright © 2017. Published by Elsevier B.V.
Esrey, S. A.; Feachem, R. G.; Hughes, J. M.
1985-01-01
A theoretical model is proposed that relates the level of ingestion of diarrhoea-causing pathogens to the frequency of diarrhoea in the community. The implications of this model are that, in poor communities with inadequate water supply and excreta disposal, reducing the level of enteric pathogen ingestion by a given amount will have a greater impact on diarrhoea mortality rates than on morbidity rates, a greater impact on the incidence rate of severe diarrhoea than on that of mild diarrhoea, and a greater impact on diarrhoea caused by pathogens having high infectious doses than on diarrhoea caused by pathogens of a low infectious dose. The impact of water supply and sanitation on diarrhoea, related infections, nutritional status, and mortality is analysed by reviewing 67 studies from 28 countries. The median reductions in diarrhoea morbidity rates are 22% from all studies and 27% from a few better-designed studies. All studies of the impact on total mortality rates show a median reduction of 21%, while the few better-designed studies give a median reduction of 30%. Improvements in water quality have less of an impact than improvements in water availability or excreta disposal. PMID:3878742
Endogenous System Microbes as Treatment Process ...
Monitoring the efficacy of treatment strategies to remove pathogens in decentralized systems remains a challenge. Evaluating log reduction targets by measuring pathogen levels is hampered by their sporadic and low occurrence rates. Fecal indicator bacteria are used in centralized systems to indicate the presence of fecal pathogens, but are ineffective decentralized treatment process indicators as they generally occur at levels too low to assess log reduction targets. System challenge testing by spiking with high loads of fecal indicator organisms, like MS2 coliphage, has limitations, especially for large systems. Microbes that are endogenous to the decentralized system, occur in high abundances and mimic removal rates of bacterial, viral and/or parasitic protozoan pathogens during treatment could serve as alternative treatment process indicators to verify log reduction targets. To identify abundant microbes in wastewater, the bacterial and viral communities were examined using deep sequencing. Building infrastructure-associated bacteria, like Zoogloea, were observed as dominant members of the bacterial community in graywater. In blackwater, bacteriophage of the order Caudovirales constituted the majority of contiguous sequences from the viral community. This study identifies candidate treatment process indicators in decentralized systems that could be used to verify log removal during treatment. The association of the presence of treatment process indic
Tinidazole inhibitory and cidal activity against anaerobic periodontal pathogens.
Alou, L; Giménez, M J; Manso, F; Sevillano, D; Torrico, M; González, N; Granizo, J J; Bascones, A; Prieto, J; Maestre, J R; Aguilar, L
2009-05-01
The in vitro activity of tinidazole against anaerobic periodontal pathogens (25 Prevotella buccae, 18 Prevotella denticola, 10 Prevotella intermedia, 6 Prevotella melaninogenica, 5 Prevotella oralis, 10 Fusobacterium nucleatum and 8 Veillonella spp.) was determined by agar dilution. MIC(90) values (minimum inhibitory concentration for 90% of the organisms) were 8 microg/mL for Veillonella spp., 4 microg/mL for P. intermedia, 2 microg/mL for P. buccae, 1 microg/mL for Fusobacterium spp. and 0.5 microg/mL for other Prevotella spp. Cidal activity was studied by killing curves with tinidazole and amoxicillin (alone and in combination) at concentrations similar to those achieved in crevicular fluid (41.2 microg/mL tinidazole and 14.05 microg/mL amoxicillin) against an inoculum of ca. 10(7)colony-forming units/mL of four bacterial groups, each one composed of four different strains of the following periodontal isolates: Prevotella spp., Fusobacterium spp. and Veillonella spp. (anaerobes) and one amoxicillin-susceptible Streptococcus spp. (facultative) in a proportion of 1:1:1:1. When only beta-lactamase-negative Prevotella or Fusobacterium strains were tested, significantly higher reductions were found with amoxicillin (>4 log reduction at 48 h) versus controls. The presence of beta-lactamase-positive Prevotella spp. or F. nucleatum strains rendered amoxicillin inactive (no reductions at 48 h), with no differences from controls. Amoxicillin+tinidazole produced >3 log reduction at 24h and >4 log reduction at 48 h regardless of the presence or not of beta-lactamase-positive strains. The presence in crevicular fluid of beta-lactamases produced by beta-lactamase-positive periodontal pathogens may have ecological and therapeutic consequences since it may protect beta-lactamase-negative periodontal pathogens from amoxicillin treatment. In vitro, tinidazole offered high antianaerobic activity against beta-lactamase-positive and -negative periodontal pathogens, avoiding amoxicillin inactivation.
Irradiation and additive combinations on the pathogen reduction and quality of poultry meat.
Ahn, Dong U; Kim, Il Suk; Lee, Eun Joo
2013-02-01
Reduction of foodborne illnesses and deaths by improving the safety of poultry products is one of the priority areas in the United States, and developing and implementing effective food processing technologies can be very effective to accomplish that goal. Irradiation is an effective processing technology for eliminating pathogens in poultry meat. Addition of antimicrobial agents during processing can be another approach to control pathogens in poultry products. However, the adoption of irradiation technology by the meat industry is limited because of quality and health concerns about irradiated meat products. Irradiation produces a characteristic aroma as well as alters meat flavor and color that significantly affect consumer acceptance. The generation of a pink color in cooked poultry and off-odor in poultry by irradiation is a critical issue because consumers associate the presence of a pink color in cooked poultry breast meat as contaminated or undercooked, and off-odor in raw meat and off-flavor in cooked meat with undesirable chemical reactions. As a result, the meat industry has difficulties in using irradiation to achieve its food safety benefits. Antimicrobials such as sodium lactate, sodium diacetate, and potassium benzoate are extensively used to extend the shelf-life and ensure the safety of meat products. However, the use of these antimicrobial agents alone cannot guarantee the safety of poultry products. It is known that some of the herbs, spices, and antimicrobials commonly used in meat processing can have synergistic effects with irradiation in controlling pathogens in meat. Also, the addition of spices or herbs in irradiated meat improves the quality of irradiated poultry by reducing lipid oxidation and production of off-odor volatiles or masking off-flavor. Therefore, combinations of irradiation with these additives can accomplish better pathogen reduction in meat products than using them alone even at lower levels of antimicrobials/herbs and irradiation doses. Effects of irradiation and additive combinations on the pathogen reduction and quality of poultry meat will be discussed in detail.
Ants avoid superinfections by performing risk-adjusted sanitary care.
Konrad, Matthias; Pull, Christopher D; Metzler, Sina; Seif, Katharina; Naderlinger, Elisabeth; Grasse, Anna V; Cremer, Sylvia
2018-03-13
Being cared for when sick is a benefit of sociality that can reduce disease and improve survival of group members. However, individuals providing care risk contracting infectious diseases themselves. If they contract a low pathogen dose, they may develop low-level infections that do not cause disease but still affect host immunity by either decreasing or increasing the host's vulnerability to subsequent infections. Caring for contagious individuals can thus significantly alter the future disease susceptibility of caregivers. Using ants and their fungal pathogens as a model system, we tested if the altered disease susceptibility of experienced caregivers, in turn, affects their expression of sanitary care behavior. We found that low-level infections contracted during sanitary care had protective or neutral effects on secondary exposure to the same (homologous) pathogen but consistently caused high mortality on superinfection with a different (heterologous) pathogen. In response to this risk, the ants selectively adjusted the expression of their sanitary care. Specifically, the ants performed less grooming and more antimicrobial disinfection when caring for nestmates contaminated with heterologous pathogens compared with homologous ones. By modulating the components of sanitary care in this way the ants acquired less infectious particles of the heterologous pathogens, resulting in reduced superinfection. The performance of risk-adjusted sanitary care reveals the remarkable capacity of ants to react to changes in their disease susceptibility, according to their own infection history and to flexibly adjust collective care to individual risk.
The pathogen transmission avoidance theory of sexual selection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loehle, C.
1997-08-01
The current theory that sexual selection results from female preference for males with good genes suffers from several problems. An alternative explanation, the pathogen transmission avoidance hypothesis, argues that the primary function of showy traits is to provide a reliable signal of current disease status, so that sick individuals can be avoided during mating. This study shows that a significant risk of pathogen transmission occurs during mating and that showy traits are reliable indicators of current disease status. The origin of female choosiness is argued to lie in a general tendency to avoid sick individuals, even in the absence ofmore » showy traits, which originate as exaggerations of normal traits that are indicative of good health (bright feathers, vigorous movement, large size). Thus, in this new model the origins of both showy traits and female choosiness are not problematic and there is no threshold effect. This model predicts that when the possession of male showy traits does not help to reduce disease in the female, showy traits are unlikely to occur. This case corresponds to thorough exposure of every animal to all group pathogens, on average, in large groups. Such species are shown with a large data set on birds to be less likely to exhibit showy traits. The good-genes model does not make this prediction. The pathogen transmission avoidance model can also lead to the evolution of showy traits even when selection is not effective against a given pathogen (e.g., when there is no heritable variation for resistance), but can result in selection for resistance if such genes are present. Monogamy is argued to reduce selection pressures for showy traits; data show monogamous species to be both less parasitized and less showy. In the context of reduction of pathogen transmission rates in showy populations, selection pressure becomes inversely frequency-dependent, which makes showy traits likely to be self-limiting rather than runaway.« less
Bryan, Brett A; Kandulu, John; Deere, Daniel A; White, Monique; Frizenschaf, Jacqueline; Crossman, Neville D
2009-07-01
Water-borne pathogens such as Cryptosporidium pose a significant human health risk and catchments provide the first critical pollution 'barrier' in mitigating risk in drinking water supply. In this paper we apply an adaptive management framework to mitigating Cryptosporidium risk in source water using a case study of the Myponga catchment in South Australia. Firstly, we evaluated the effectiveness of past water quality management programs in relation to the adoption of practices by landholders using a socio-economic survey of land use and management in the catchment. The impact of past management on the mitigation of Cryptosporidium risk in source water was also evaluated based on analysis of water quality monitoring data. Quantitative risk assessment was used in planning the next round of management in the adaptive cycle. Specifically, a pathogen budget model was used to identify the major remaining sources of Cryptosporidium in the catchment and estimate the mitigation impact of 30 alternative catchment management scenarios. Survey results show that earlier programs have resulted in the comprehensive adoption of best management practices by dairy farmers including exclusion of stock from watercourses and effluent management from 2000 to 2007. Whilst median Cryptosporidium concentrations in source water have decreased since 2004 they remain above target levels and put pressure on other barriers to mitigate risk, particularly the treatment plant. Non-dairy calves were identified as the major remaining source of Cryptosporidium in the Myponga catchment. The restriction of watercourse access of non-dairy calves could achieve a further reduction in Cryptosporidium export to the Myponga reservoir of around 90% from current levels. The adaptive management framework applied in this study was useful in guiding learning from past management, and in analysing, planning and refocusing the next round of catchment management strategies to achieve water quality targets.
Approach to the health-risk management on municipal reclaimed water reused in landscape water system
NASA Astrophysics Data System (ADS)
Liu, X.; Li, J.; Liu, W.
2008-12-01
Water pollution and water heavily shortage are both main environmental conflicts in China. Reclaimed water reuse is an important approach to lessen water pollution and solve the water shortage crisis in the city. The heath risk of reclaimed water has become the focus of the public. It is impending to evaluate the health risk of reclaimed water with risk assessment technique. Considering the ways of the reclaimed water reused, it is studied that health risk produced by toxic pollutants and pathogenic microbes in the processes of reclaimed water reused in landscape water system. The pathogenic microbes monitoring techniques in wastewater and reclaimed water are discussed and the hygienic indicators, risk assessment methods, concentration limitations of pathogenic microbes for various reclaimed water uses are studied. The principle of health risk assessment is used to research the exposure level and the health risk of concerned people in a wastewater reuse project where the reclaimed water is applied for green area irrigation in a public park in Beijing. The exposure assessment method and model of various reclaimed water uses are built combining with Beijing reclaimed water project. Firstly the daily ingesting dose and lifetime average daily dose(LADD) of exposure people are provided via field work and monitoring analysis, which could be used in health risk assessment as quantitative reference. The result shows that the main risk comes from the pathology pollutants, the toxic pollutants, the eutrophication pollutants, pathogenic microbes and the secondary pollutants when municipal wastewater is reclaimed for landscape water. The major water quality limited should include pathogenic microbes, toxic pollutants, and heavy metals. Keywords: municipal wastewater, reclaimed water, landscape water, health risk
Code of Federal Regulations, 2011 CFR
2011-07-01
... DISPOSAL OF SEWAGE SLUDGE Pathogens and Vector Attraction Reduction § 503.30 Scope. (a) This subpart... land, forest, or a reclamation site. (d) This subpart contains alternative vector attraction reduction...
Code of Federal Regulations, 2010 CFR
2010-07-01
... DISPOSAL OF SEWAGE SLUDGE Pathogens and Vector Attraction Reduction § 503.30 Scope. (a) This subpart... land, forest, or a reclamation site. (d) This subpart contains alternative vector attraction reduction...
NASA Astrophysics Data System (ADS)
Kim, S.; Park, J.; Park, J. K.; Park, S.; Jeon, H.; Kwon, H.
2017-12-01
Foot and mouth disease outbreaks globally occur. Although livestock suspected to be infected or actually infected by animal infectious diseases is typically treated with various methods including burial, burning, incineration, rendering, and composting, burial into soil is currently the major treatment method in Korea. However, buried carcasses are often found to remain undecomposed or incompletely decomposed even after the legal burial period (3 years). To reuse the land used for the burial purposes, Korea government is considering a novel approach to conduct in-situ burial treatment and then to move remaining carcasses from the burial sites to other sites designated for further ex-situ stabilization treatment (burial-composting sequential treatment). In this work, the feasibility of the novel approach was evaluated at a pilot scale facility. For the ex-situ stabilization, we tested the validity of use of a bio-augmented aerobic composting with carcass-degrading microorganisms, with emphasis on examining if the novel aerobic composting has reducing effects on potential pathogenic bacteria. As results, the decreased chemical oxygen demand (COD, 160,000 mg/kg to 40,000 mg/kg) and inorganic nitrogen species (total nitrogen, 5,000 mg/kg to 2,000 mg/kg) indicated effective bio-stabilization of carcasses. During the stabilization, bacterial community structure and dynamics determined by bacterial 16S rRNA sequencing were significantly changed. The prediction of potential pathogenic bacteria showed that bacterial pathogenic risk was significantly reduced up to a normal soil level during the ex-situ stabilization. The conclusion was confirmed by the following functional analysis of dominant bacteria using PICRUST. The findings support the microbiological safety of the ex-site use of the novel burial-composting sequential treatment. Acknowledgement : This study is supported by Korea Ministry of Environmental as "The GAIA Project"
Bishai, David; Liu, Liang; Shiau, Stephanie; Wang, Harrison; Tsai, Cindy; Liao, Margaret; Prakash, Shivaani; Howard, Tracy
2011-06-01
The purpose of this study was to estimate the risk of acquiring pathogenic bacteria as a result of shaking hands at graduation ceremonies. School officials participating in graduation ceremonies at elementary, secondary, and postsecondary schools were recruited. Specimens were collected before and immediately following graduation. Cultures identified any pathogenic bacteria in each specimen. Subjects shook a total of 5,209 hands. Staphylococcus aureus was separately detected on one pregraduation right hand, one postgraduation right hand, and one postgraduation left hand. Nonpathogenic bacteria were collected in 93% of specimens. Pregraduation and postgraduation specimens were of different strains. We measured a risk of one new bacterial acquisition in a sample exposed to 5,209 handshakes yielding an overall estimate of 0.019 pathogens acquired per handshake. We conclude that a single handshake at a graduation offers only a small risk of bacterial pathogen acquisition.
Collender, Philip A.; Cooke, Olivia C.; Bryant, Lee D.; Kjeldsen, Thomas R.; Remais, Justin V.
2017-01-01
Flooding is known to facilitate infectious disease transmission, yet quantitative research on microbiological risks associated with floods has been limited. Pathogen fate and transport models provide a framework to examine interactions between landscape characteristics, hydrology, and waterborne disease risks, but have not been widely developed for flood conditions. We critically examine capabilities of current hydrological models to represent unusual flow paths, non-uniform flow depths, and unsteady flow velocities that accompany flooding. We investigate the theoretical linkages between hydrodynamic processes and spatio-temporally variable suspension and deposition of pathogens from soils and sediments; pathogen dispersion in flow; and concentrations of constituents influencing pathogen transport and persistence. Identifying gaps in knowledge and modeling practice, we propose a research agenda to strengthen microbial fate and transport modeling applied to inland floods: 1) development of models incorporating pathogen discharges from flooded sources (e.g., latrines), effects of transported constituents on pathogen persistence, and supply-limited pathogen transport; 2) studies assessing parameter identifiability and comparing model performance under varying degrees of process representation, in a range of settings; 3) development of remotely sensed datasets to support modeling of vulnerable, data-poor regions; and 4) collaboration between modelers and field-based researchers to expand the collection of useful data in situ. PMID:28757789
Evaluating the importance of faecal sources in human-impacted waters.
Schoen, Mary E; Soller, Jeffrey A; Ashbolt, Nicholas J
2011-04-01
Quantitative microbial risk assessment (QMRA) was used to evaluate the relative contribution of faecal indicators and pathogens when a mixture of human sources impacts a recreational waterbody. The waterbody was assumed to be impacted with a mixture of secondary-treated disinfected municipal wastewater and untreated (or poorly treated) sewage, using Norovirus as the reference pathogen and enterococci as the reference faecal indicator. The contribution made by each source to the total waterbody volume, indicator density, pathogen density, and illness risk was estimated for a number of scenarios that accounted for pathogen and indicator inactivation based on the age of the effluent (source-to-receptor), possible sedimentation of microorganisms, and the addition of a non-pathogenic source of faecal indicators (such as old sediments or an animal population with low occurrence of human-infectious pathogens). The waterbody indicator density was held constant at 35 CFU 100 mL(-1) enterococci to compare results across scenarios. For the combinations evaluated, either the untreated sewage or the non-pathogenic source of faecal indicators dominated the recreational waterbody enterococci density assuming a culture method. In contrast, indicator density assayed by qPCR, pathogen density, and bather gastrointestinal illness risks were largely dominated by secondary disinfected municipal wastewater, with untreated sewage being increasingly less important as the faecal indicator load increased from a non-pathogenic source. The results support the use of a calibrated qPCR total enterococci indicator, compared to a culture-based assay, to index infectious human enteric viruses released in treated human wastewater, and illustrate that the source contributing the majority of risk in a mixture may be overlooked when only assessing faecal indicators by a culture-based method. Published by Elsevier Ltd.
Baron, Julianne L; Peters, Tammy; Shafer, Raymond; MacMurray, Brian; Stout, Janet E
2014-11-01
Opportunistic waterborne pathogens (eg, Legionella, Pseudomonas) may persist in water distribution systems despite municipal chlorination and secondary disinfection and can cause health care-acquired infections. Point-of-use (POU) filtration can limit exposure to pathogens; however, their short maximum lifetime and membrane clogging have limited their use. A new faucet filter rated at 62 days was evaluated at a cancer center in Northwestern Pennsylvania. Five sinks were equipped with filters, and 5 sinks served as controls. Hot water was collected weekly for 17 weeks and cultured for Legionella, Pseudomonas, and total bacteria. Legionella was removed from all filtered samples for 12 weeks. One colony was recovered from 1 site at 13 weeks; however, subsequent tests were negative through 17 weeks of testing. Total bacteria were excluded for the first 2 weeks, followed by an average of 1.86 log reduction in total bacteria compared with controls. No Pseudomonas was recovered from filtered or control faucets. This next generation faucet filter eliminated Legionella beyond the 62 day manufacturers' recommended maximum duration of use. These new POU filters will require fewer change-outs than standard filters and could be a cost-effective method for preventing exposure to Legionella and other opportunistic waterborne pathogens in hospitals with high-risk patients. Copyright © 2014 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Hogan, Jennifer N.; Daniels, Miles E.; Watson, Fred G.; Conrad, Patricia A.; Oates, Stori C.; Miller, Melissa A.; Hardin, Dane; Byrne, Barbara A.; Dominik, Clare; Melli, Ann; Jessup, David A.
2012-01-01
Fecal pathogen contamination of watersheds worldwide is increasingly recognized, and natural wetlands may have an important role in mitigating fecal pathogen pollution flowing downstream. Given that waterborne protozoa, such as Cryptosporidium and Giardia, are transported within surface waters, this study evaluated associations between fecal protozoa and various wetland-specific and environmental risk factors. This study focused on three distinct coastal California wetlands: (i) a tidally influenced slough bordered by urban and agricultural areas, (ii) a seasonal wetland adjacent to a dairy, and (iii) a constructed wetland that receives agricultural runoff. Wetland type, seasonality, rainfall, and various water quality parameters were evaluated using longitudinal Poisson regression to model effects on concentrations of protozoa and indicator bacteria (Escherichia coli and total coliform). Among wetland types, the dairy wetland exhibited the highest protozoal and bacterial concentrations, and despite significant reductions in microbe concentrations, the wetland could still be seen to influence water quality in the downstream tidal wetland. Additionally, recent rainfall events were associated with higher protozoal and bacterial counts in wetland water samples across all wetland types. Notably, detection of E. coli concentrations greater than a 400 most probable number (MPN) per 100 ml was associated with higher Cryptosporidium oocyst and Giardia cyst concentrations. These findings show that natural wetlands draining agricultural and livestock operation runoff into human-utilized waterways should be considered potential sources of pathogens and that wetlands can be instrumental in reducing pathogen loads to downstream waters. PMID:22427504
A Review of Zoonotic Infection Risks Associated with the Wild Meat Trade in Malaysia.
Cantlay, Jennifer Caroline; Ingram, Daniel J; Meredith, Anna L
2017-06-01
The overhunting of wildlife for food and commercial gain presents a major threat to biodiversity in tropical forests and poses health risks to humans from contact with wild animals. Using a recent survey of wildlife offered at wild meat markets in Malaysia as a basis, we review the literature to determine the potential zoonotic infection risks from hunting, butchering and consuming the species offered. We also determine which taxa potentially host the highest number of pathogens and discuss the significant disease risks from traded wildlife, considering how cultural practices influence zoonotic transmission. We identify 51 zoonotic pathogens (16 viruses, 19 bacteria and 16 parasites) potentially hosted by wildlife and describe the human health risks. The Suidae and the Cervidae families potentially host the highest number of pathogens. We conclude that there are substantial gaps in our knowledge of zoonotic pathogens and recommend performing microbial food safety risk assessments to assess the hazards of wild meat consumption. Overall, there may be considerable zoonotic risks to people involved in the hunting, butchering or consumption of wild meat in Southeast Asia, and these should be considered in public health strategies.
Managing dynamic epidemiological risks through trade
Horan, Richard D.; Fenichel, Eli P.; Finnoff, David; Wolf, Christopher A.
2015-01-01
There is growing concern that trade, by connecting geographically isolated regions, unintentionally facilitates the spread of invasive pathogens and pests – forms of biological pollution that pose significant risks to ecosystem and human health. We use a bioeconomic framework to examine whether trade always increases private risks, focusing specifically on pathogen risks from live animal trade. When the pathogens have already established and traders bear some private risk, we find two results that run counter to the conventional wisdom on trade. First, uncertainty about the disease status of individual animals held in inventory may increase the incentives to trade relative to the disease-free case. Second, trade may facilitate reduced long-run disease prevalence among buyers. These results arise because disease risks are endogenous due to dynamic feedback processes involving valuable inventories, and markets facilitate the management of private risks that producers face with or without trade. PMID:25914431
Impacts of climate change on indirect human exposure to pathogens and chemicals from agriculture.
Boxall, Alistair B A; Hardy, Anthony; Beulke, Sabine; Boucard, Tatiana; Burgin, Laura; Falloon, Peter D; Haygarth, Philip M; Hutchinson, Thomas; Kovats, R Sari; Leonardi, Giovanni; Levy, Leonard S; Nichols, Gordon; Parsons, Simon A; Potts, Laura; Stone, David; Topp, Edward; Turley, David B; Walsh, Kerry; Wellington, Elizabeth M H; Williams, Richard J
2009-04-01
Climate change is likely to affect the nature of pathogens and chemicals in the environment and their fate and transport. Future risks of pathogens and chemicals could therefore be very different from those of today. In this review, we assess the implications of climate change for changes in human exposures to pathogens and chemicals in agricultural systems in the United Kingdom and discuss the subsequent effects on health impacts. In this review, we used expert input and considered literature on climate change; health effects resulting from exposure to pathogens and chemicals arising from agriculture; inputs of chemicals and pathogens to agricultural systems; and human exposure pathways for pathogens and chemicals in agricultural systems. We established the current evidence base for health effects of chemicals and pathogens in the agricultural environment; determined the potential implications of climate change on chemical and pathogen inputs in agricultural systems; and explored the effects of climate change on environmental transport and fate of different contaminant types. We combined these data to assess the implications of climate change in terms of indirect human exposure to pathogens and chemicals in agricultural systems. We then developed recommendations on future research and policy changes to manage any adverse increases in risks. Overall, climate change is likely to increase human exposures to agricultural contaminants. The magnitude of the increases will be highly dependent on the contaminant type. Risks from many pathogens and particulate and particle-associated contaminants could increase significantly. These increases in exposure can, however, be managed for the most part through targeted research and policy changes.
Miller, Ryan S.; Sweeney, Steven J.; Slootmaker, Chris; Grear, Daniel A.; DiSalvo, Paul A.; Kiser, Deborah; Shwiff, Stephanie A.
2017-01-01
Cross-species disease transmission between wildlife, domestic animals and humans is an increasing threat to public and veterinary health. Wild pigs are increasingly a potential veterinary and public health threat. Here we investigate 84 pathogens and the host species most at risk for transmission with wild pigs using a network approach. We assess the risk to agricultural and human health by evaluating the status of these pathogens and the co-occurrence of wild pigs, agriculture and humans. We identified 34 (87%) OIE listed swine pathogens that cause clinical disease in livestock, poultry, wildlife, and humans. On average 73% of bacterial, 39% of viral, and 63% of parasitic pathogens caused clinical disease in other species. Non-porcine livestock in the family Bovidae shared the most pathogens with swine (82%). Only 49% of currently listed OIE domestic swine diseases had published wild pig surveillance studies. The co-occurrence of wild pigs and farms increased annually at a rate of 1.2% with as much as 57% of all farms and 77% of all agricultural animals co-occurring with wild pigs. The increasing co-occurrence of wild pigs with livestock and humans along with the large number of pathogens shared is a growing risk for cross-species transmission.
Miller, Ryan S; Sweeney, Steven J; Slootmaker, Chris; Grear, Daniel A; Di Salvo, Paul A; Kiser, Deborah; Shwiff, Stephanie A
2017-08-10
Cross-species disease transmission between wildlife, domestic animals and humans is an increasing threat to public and veterinary health. Wild pigs are increasingly a potential veterinary and public health threat. Here we investigate 84 pathogens and the host species most at risk for transmission with wild pigs using a network approach. We assess the risk to agricultural and human health by evaluating the status of these pathogens and the co-occurrence of wild pigs, agriculture and humans. We identified 34 (87%) OIE listed swine pathogens that cause clinical disease in livestock, poultry, wildlife, and humans. On average 73% of bacterial, 39% of viral, and 63% of parasitic pathogens caused clinical disease in other species. Non-porcine livestock in the family Bovidae shared the most pathogens with swine (82%). Only 49% of currently listed OIE domestic swine diseases had published wild pig surveillance studies. The co-occurrence of wild pigs and farms increased annually at a rate of 1.2% with as much as 57% of all farms and 77% of all agricultural animals co-occurring with wild pigs. The increasing co-occurrence of wild pigs with livestock and humans along with the large number of pathogens shared is a growing risk for cross-species transmission.
Transport and fate of microbial pathogens in agricultural settings
USDA-ARS?s Scientific Manuscript database
An understanding of the transport and survival of microbial pathogens (pathogens hereafter) in agricultural settings is needed to assess the risk of pathogen contamination to water and food resources, and to develop control strategies and treatment options. However, many knowledge gaps still remain ...
Insight into the risk of replenishing urban landscape ponds with reclaimed wastewater.
Chen, Rong; Ao, Dong; Ji, Jiayuan; Wang, Xiaochang C; Li, Yu-You; Huang, Yue; Xue, Tao; Guo, Hongbing; Wang, Nan; Zhang, Lu
2017-02-15
Increasing use of reclaimed wastewater (RW) for replenishing urban landscape ponds has aroused public concern about the water quality. Three ponds replenished with RW in three cities in China were chosen to investigate 22 indexes of water quality in five categories. This was achieved by comparing three pairs of ponds in the three different cities, where one pond in each pair was replenished with RW and the other with surface water (SW). The nutrients condition, heavy metal concentration and ecotoxicity did not differ significantly between RW- and SW-replenished ponds. By contrast, significant differences were observed in algal growth and pathogen risk. RW ponds presented a Cyanophyta-Chlorophyta-Bacillariophyta type with high algal diversity while SW ponds presented a Cyanophyta type with low diversity. Regrowth of bacterial pathogens and especially survival of viral pathogens in RW, was the main driver behind the higher risk for RW ponds compared with SW ones. The duration of RW replenishment was proved to have a marked impact on the algal growth and pathogen risk. With continued RW replenishment, non-dominant algal species subjected to decrease while dominant species were enhanced resulting in the biomass increasing but diversity declining, and the risk posed by viral pathogens might become greater. Copyright © 2016 Elsevier B.V. All rights reserved.
Han, Il; Congeevaram, Shankar; Park, Joonhong
2009-01-01
In this study, we microbiologically evaluated antibiotic resistance and pathogenicity in livestock (swine) manure as well as its biologically stabilized products. One of new livestock manure stabilization techniques is ATAD (Autothermal Thermophilic Aerobic Digestion). Because of its high operation temperature (60-65 degrees C), it has been speculated to have effective microbial risk control in livestock manure. This hypothesis was tested by evaluating microbial risk in ATAD-treated swine manure. Antibiotic resistance, multiple antibiotic resistance (MAR), and pathogenicity were microbiologically examined for swine manure as well as its conventionally stabilized (anaerobically fermented) and ATAD-stabilized products. In the swine manure and its conventionally stabilized product, antibiotic resistant (tetracycline-, kanamycine-, ampicillin-, and rifampicin-resistant) bacteria and the pathogen indicator bacteria were detected. Furthermore, approximately 2-5% of the Staphylococcus and Salmonella colonies from their selective culture media were found to exhibit a MAR-phenotypes, suggesting a serious level of microbe induced health risk. In contrast, after the swine manure was stabilized with a pilot-scale ATAD treatment for 3 days at 60-65 degrees C, antibiotic resistant bacteria, pathogen indicator bacteria, and MAR-exhibiting pathogens were all undetected. These findings support the improved control of microbial risk in livestock wastes by ATAD treatment.
Alternative Treatment Technologies – Working With the Pathogen Equivalency Committee
Under current Federal regulations (40 CFR 503), municipal sludge must be treated prior to land application. The regulations identify two classes of treatment with respect to pathogen reduction: Class B (three alternatives) which provides a minimum acceptable level of treatment;...
USDA-ARS?s Scientific Manuscript database
Hydrothermal carbonization (HTC), utilizing high temperature and pressure, has the potential to treat agricultural waste and inactivate pathogens, antibiotic resistance genes (ARG), and contaminants of emerging concern (CEC) in an environmentally and economically friendly manner. Livestock mortality...
Assessment of Environmental Contamination with Pathogenic Bacteria at a Hospital Laundry Facility.
Michael, Karen E; No, David; Daniell, William E; Seixas, Noah S; Roberts, Marilyn C
2017-11-10
Little is known about exposure to pathogenic bacteria among industrial laundry workers who work with soiled clinical linen. To study worker exposures, an assessment of surface contamination was performed at an industrial laundry facility serving hospitals in Seattle, WA, USA. Surface swab samples (n = 240) from the environment were collected during four site visits at 3-month intervals. These samples were cultured for Clostridium difficile, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant enterococci (VRE). Voluntary participation of 23 employees consisted of nasal swabs for detection of MRSA, observations during work, and questionnaires. Contamination with all three pathogens was observed in both dirty (laundry handling prior to washing) and clean areas (subsequent to washing). The dirty area had higher odds of overall contamination (≥1 pathogen) than the clean area (odds ratio, OR = 18.0, 95% confidence interval 8.9-36.5, P < 0.001). The odds of contamination were high for each individual pathogen: C. difficile, OR = 15.5; MRSA, OR = 14.8; and VRE, OR = 12.6 (each, P < 0.001). The highest odds of finding surface contamination occurred in the primary and secondary sort areas where soiled linens were manually sorted by employees (OR = 63.0, P < 0.001). The study substantiates that the laundry facility environment can become contaminated by soiled linens. Workers who handle soiled linen may have a higher risk of exposure to C. difficile, MRSA, and VRE than those who handle clean linens. Improved protocols for prevention and reduction of environmental contamination were implemented because of this study. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Ha, Jae-Won; Ryu, Sang-Ryeol; Kang, Dong-Hyun
2012-09-01
This study was conducted to investigate the efficacy of near-infrared (NIR) heating to reduce Salmonella enterica serovar Typhimurium, Escherichia coli O157:H7, and Listeria monocytogenes in ready-to-eat (RTE) sliced ham compared to conventional convective heating, and the effect of NIR heating on quality was determined by measuring the color and texture change. A cocktail of three pathogens was inoculated on the exposed or protected surfaces of ham slices, followed by NIR or conventional heating at 1.8 kW. NIR heating for 50 s achieved 4.1-, 4.19-, and 3.38-log reductions in surface-inoculated S. Typhimurium, E. coli O157:H7, and L. monocytogenes, respectively, whereas convective heating needed 180 s to attain comparable reductions for each pathogen. There were no statistically significant (P > 0.05) differences in reduction between surface- and internally inoculated pathogens at the end of NIR treatment (50 s). However, when treated with conventional convective heating, significant (P < 0.05) differences were observed at the final stages of the treatment (150 and 180 s). Color values and texture parameters of NIR-treated (50-s treatment) ham slices were not significantly (P > 0.05) different from those of nontreated samples. These results suggest that NIR heating can be applied to control internalized pathogens as well as surface-adhering pathogens in RTE sliced meats without affecting product quality.
Ha, Jae-Won; Ryu, Sang-Ryeol
2012-01-01
This study was conducted to investigate the efficacy of near-infrared (NIR) heating to reduce Salmonella enterica serovar Typhimurium, Escherichia coli O157:H7, and Listeria monocytogenes in ready-to-eat (RTE) sliced ham compared to conventional convective heating, and the effect of NIR heating on quality was determined by measuring the color and texture change. A cocktail of three pathogens was inoculated on the exposed or protected surfaces of ham slices, followed by NIR or conventional heating at 1.8 kW. NIR heating for 50 s achieved 4.1-, 4.19-, and 3.38-log reductions in surface-inoculated S. Typhimurium, E. coli O157:H7, and L. monocytogenes, respectively, whereas convective heating needed 180 s to attain comparable reductions for each pathogen. There were no statistically significant (P > 0.05) differences in reduction between surface- and internally inoculated pathogens at the end of NIR treatment (50 s). However, when treated with conventional convective heating, significant (P < 0.05) differences were observed at the final stages of the treatment (150 and 180 s). Color values and texture parameters of NIR-treated (50-s treatment) ham slices were not significantly (P > 0.05) different from those of nontreated samples. These results suggest that NIR heating can be applied to control internalized pathogens as well as surface-adhering pathogens in RTE sliced meats without affecting product quality. PMID:22773635
Hoen, Anne Gatewood; Rollend, Lindsay G; Papero, Michele A; Carroll, John F; Daniels, Thomas J; Mather, Thomas N; Schulze, Terry L; Stafford, Kirby C; Fish, Durland
2009-08-01
We evaluated the effects of tick control by acaricide self-treatment of white-tailed deer on the infection prevalence and entomologic risk for three Ixodes scapularis-borne bacteria in host-seeking ticks. Ticks were collected from vegetation in areas treated with the "4-Poster" device and from control areas over a 6-year period in five geographically diverse study locations in the Northeastern United States and tested for infection with two known agents of human disease, Borrelia burgdorferi and Anaplasma phagocytophilum, and for a novel relapsing fever-group spirochete related to Borrelia miyamotoi. Overall, 38.2% of adults and 12.5% of nymphs were infected with B. burgdorferi; 8.5% of adults and 4.2% of nymphs were infected with A. phagocytophilum; and 1.9% of adults and 0.8% of nymphs were infected with B. miyamotoi. In most cases, treatment with the 4-Poster device was not associated with changes in the prevalence of infection with any of these three microorganisms among nymphal or adult ticks. However, the density of nymphs infected with B. burgdorferi, and consequently the entomologic risk for Lyme disease, was reduced overall by 68% in treated areas compared to control areas among the five study sites at the end of the study. The frequency of bacterial coinfections in ticks was generally equal to the product of the proportion of ticks infected with a single bacterium, indicating that enzootic maintenance of these pathogens is independent. We conclude that controlling ticks on deer by self-application of acaricide results in an overall decrease in the human risk for exposure to these three bacterial agents, which is due solely to a reduction in tick density.
Viau, Sabrina; Chabrand, Lucie; Eap, Sandy; Lorant, Judith; Rouger, Karl; Goudaliez, Francis; Sumian, Chryslain; Delorme, Bruno
2017-01-01
We recently developed and characterized a standardized and clinical grade human Platelet Lysate (hPL) that constitutes an advantageous substitute for fetal bovine serum (FBS) for human mesenchymal stem cell (hMSC) expansion required in cell therapy procedures, avoiding xenogenic risks (virological and immunological) and ethical issues. Because of the progressive use of pathogen-reduced (PR) labile blood components, and the requirement of ensuring the viral safety of raw materials for cell therapy products, we evaluated the impact of the novel procedure known as THERAFLEX UV-Platelets for pathogen reduction on hPL quality (growth factors content) and efficacy (as a medium supplement for hMSC expansion). This technology is based on short-wave ultraviolet light (UV-C) that induces non-reversible damages in DNA and RNA of pathogens while preserving protein structures and functions, and has the main advantage of not needing the addition of any photosensitizing additives (that might secondarily interfere with hMSCs). We applied the THERAFLEX UV-Platelets procedure on fresh platelet concentrates (PCs) suspended in platelet additive solution and prepared hPL from these treated PCs. We compared the quality and efficacy of PR-hPL with the corresponding non-PR ones. We found no impact on the content of five cytokines tested (EGF, bFGF, PDGF-AB, VEGF and IGF-1) but a significant decrease in TGF-ß1 (-21%, n = 11, p<0.01). We performed large-scale culture of hMSCs from bone marrow (BM) during three passages and showed that hPL or PR-hPL at 8% triggered comparable BM-hMSC proliferation as FBS at 10% plus bFGF. Moreover, after proliferation of hMSCs in an hPL- or PR-hPL-containing medium, their profile of membrane marker expression, their clonogenic potential and immunosuppressive properties were maintained, in comparison with BM-hMSCs cultured under FBS conditions. The potential to differentiate towards the adipogenic and osteogenic lineages of hMSCs cultured in parallel in the three conditions also remained identical. We demonstrated the feasibility of using UV-C-treated platelets to subsequently obtain pathogen-reduced hPL, while preserving its optimal quality and efficacy for hMSC expansion in cell therapy applications.
Viau, Sabrina; Chabrand, Lucie; Eap, Sandy; Lorant, Judith; Rouger, Karl; Goudaliez, Francis; Sumian, Chryslain; Delorme, Bruno
2017-01-01
Background We recently developed and characterized a standardized and clinical grade human Platelet Lysate (hPL) that constitutes an advantageous substitute for fetal bovine serum (FBS) for human mesenchymal stem cell (hMSC) expansion required in cell therapy procedures, avoiding xenogenic risks (virological and immunological) and ethical issues. Because of the progressive use of pathogen-reduced (PR) labile blood components, and the requirement of ensuring the viral safety of raw materials for cell therapy products, we evaluated the impact of the novel procedure known as THERAFLEX UV-Platelets for pathogen reduction on hPL quality (growth factors content) and efficacy (as a medium supplement for hMSC expansion). This technology is based on short-wave ultraviolet light (UV-C) that induces non-reversible damages in DNA and RNA of pathogens while preserving protein structures and functions, and has the main advantage of not needing the addition of any photosensitizing additives (that might secondarily interfere with hMSCs). Methodology / Principal findings We applied the THERAFLEX UV-Platelets procedure on fresh platelet concentrates (PCs) suspended in platelet additive solution and prepared hPL from these treated PCs. We compared the quality and efficacy of PR-hPL with the corresponding non-PR ones. We found no impact on the content of five cytokines tested (EGF, bFGF, PDGF-AB, VEGF and IGF-1) but a significant decrease in TGF-ß1 (-21%, n = 11, p<0.01). We performed large-scale culture of hMSCs from bone marrow (BM) during three passages and showed that hPL or PR-hPL at 8% triggered comparable BM-hMSC proliferation as FBS at 10% plus bFGF. Moreover, after proliferation of hMSCs in an hPL- or PR-hPL-containing medium, their profile of membrane marker expression, their clonogenic potential and immunosuppressive properties were maintained, in comparison with BM-hMSCs cultured under FBS conditions. The potential to differentiate towards the adipogenic and osteogenic lineages of hMSCs cultured in parallel in the three conditions also remained identical. Conclusion / Significance We demonstrated the feasibility of using UV-C-treated platelets to subsequently obtain pathogen-reduced hPL, while preserving its optimal quality and efficacy for hMSC expansion in cell therapy applications. PMID:28763452
Williams, Michael S; Cao, Yong; Ebel, Eric D
2013-07-15
Levels of pathogenic organisms in food and water have steadily declined in many parts of the world. A consequence of this reduction is that the proportion of samples that test positive for the most contaminated product-pathogen pairings has fallen to less than 0.1. While this is unequivocally beneficial to public health, datasets with very few enumerated samples present an analytical challenge because a large proportion of the observations are censored values. One application of particular interest to risk assessors is the fitting of a statistical distribution function to datasets collected at some point in the farm-to-table continuum. The fitted distribution forms an important component of an exposure assessment. A number of studies have compared different fitting methods and proposed lower limits on the proportion of samples where the organisms of interest are identified and enumerated, with the recommended lower limit of enumerated samples being 0.2. This recommendation may not be applicable to food safety risk assessments for a number of reasons, which include the development of new Bayesian fitting methods, the use of highly sensitive screening tests, and the generally larger sample sizes found in surveys of food commodities. This study evaluates the performance of a Markov chain Monte Carlo fitting method when used in conjunction with a screening test and enumeration of positive samples by the Most Probable Number technique. The results suggest that levels of contamination for common product-pathogen pairs, such as Salmonella on poultry carcasses, can be reliably estimated with the proposed fitting method and samples sizes in excess of 500 observations. The results do, however, demonstrate that simple guidelines for this application, such as the proportion of positive samples, cannot be provided. Published by Elsevier B.V.
Reyher, K K; Haine, D; Dohoo, I R; Revie, C W
2012-11-01
Major mastitis pathogens such as Staphylococcus aureus, Streptococcus agalactiae, Streptococcus uberis, Streptococcus dysgalactiae, and the coliforms are usually considered more virulent and damaging to the udder than minor mastitis pathogens such as Corynebacterium bovis and coagulase-negative staphylococci (CNS). The current literature contains several studies detailing analyses with conflicting results as to whether intramammary infection (IMI) with the minor pathogens decreases, increases, or has no effect on the risk of a quarter acquiring a new intramammary infection (NIMI) with a major pathogen. To investigate the available scientific evidence regarding the effect of IMI with minor pathogens on the acquisition of NIMI with major pathogens, a systematic review and meta-analysis were conducted. The total extant English- and French-language literature in electronic databases was searched and all publications cited by relevant papers were investigated. Results from 68 studies were extracted from 38 relevant papers. Random-effects models were used to investigate the effects of CNS and C. bovis on acquisition of new IMI with any of the major pathogens, as well as individually for the minor pathogens and Staph. aureus. Significant heterogeneity among studies exists, some of which could be accounted for by using meta-regression. Overall, observational studies showed no effect, whereas challenge studies showed strong and significant protective effects, specifically when major pathogens were introduced into the mammary gland via methods bypassing the teat end. Underlying risk can account for several unmeasured factors, and studies with higher underlying risk found more protective effects of minor pathogens. Larger doses of challenge organisms reduced the protective effect of minor pathogens, and studies with more stringent diagnostic criteria for pathogen IMI identified less protection. Smaller studies (those utilizing fewer than 40 cows) also showed a greater protective effect than larger studies. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Calle, Alexandra; Porto-Fett, Anna C S; Shoyer, Bradley A; Luchansky, John B; Thippareddi, Harshavardhan
2015-12-01
Boneless beef rib eye roasts were surface inoculated on the fat side with ca. 5.7 log CFU/g of a five-strain cocktail of Salmonella for subsequent searing, cooking, and warm holding using preparation methods practiced by restaurants surveyed in a medium-size Midwestern city. A portion of the inoculated roasts was then passed once through a mechanical blade tenderizer. For both intact and nonintact roasts, searing for 15 min at 260°C resulted in reductions in Salmonella populations of ca. 0.3 to 1.3 log CFU/g. For intact (nontenderized) rib eye roasts, cooking to internal temperatures of 37.8 or 48.9°C resulted in additional reductions of ca. 3.4 log CFU/g. For tenderized (nonintact) rib eye roasts, cooking to internal temperatures of 37.8 or 48.9°C resulted in additional reductions of ca. 3.1 or 3.4 log CFU/g, respectively. Pathogen populations remained relatively unchanged for intact roasts cooked to 37.8 or 48.9°C and for nonintact roasts cooked to 48.9°C when held at 60.0°C for up to 8 h. In contrast, pathogen populations increased ca. 2.0 log CFU/g in nonintact rib eye cooked to 37.8°C when held at 60.0°C for 8 h. Thus, cooking at low temperatures and extended holding at relatively low temperatures as evaluated herein may pose a food safety risk to consumers in terms of inadequate lethality and/or subsequent outgrowth of Salmonella, especially if nonintact rib eye is used in the preparation of prime rib, if on occasion appreciable populations of Salmonella are present in or on the meat, and/or if the meat is not cooked adequately throughout.
Crook, Jennifer A; Rossitto, Paul V; Parko, Jared; Koutchma, Tatiana; Cullor, James S
2015-06-01
Nonthermal technologies are being investigated as viable alternatives to, or supplemental utilization, with thermal pasteurization in the food-processing industry. In this study, the effect of ultraviolet (UV)-C light on the inactivation of seven milkborne pathogens (Listeria monocytogenes, Serratia marcescens, Salmonella Senftenberg, Yersinia enterocolitica, Aeromonas hydrophila, Escherichia coli, and Staphylococcus aureus) was evaluated. The pathogens were suspended in ultra-high-temperature whole milk and treated at UV doses between 0 and 5000 J/L at a flow rate of 4300 L/h in a thin-film turbulent flow-through pilot system. Of the seven milkborne pathogens tested, L. monocytogenes was the most UV resistant, requiring 2000 J/L of UV-C exposure to reach a 5-log reduction. The most sensitive bacterium was S. aureus, requiring only 1450 J/L to reach a 5-log reduction. This study demonstrated that the survival curves were nonlinear. Sigmoidal inactivation curves were observed for all tested bacterial strains. Nonlinear modeling of the inactivation data was a better fit than the traditional log-linear approach. Results obtained from this study indicate that UV illumination has the potential to be used as a nonthermal method to reduce microorganism populations in milk.
Detection of hepatitis E virus and other livestock-related pathogens in Iowa streams
Givens, Carrie E.; Kolpin, Dana W.; Borchardt, Mark A.; Duris, Joseph W.; Moorman, Thomas B.; Spencer, Susan K.
2016-01-01
Manure application is a source of pathogens to the environment. Through overland runoff and tile drainage, zoonotic pathogens can contaminate surface water and streambed sediment and could affect both wildlife and human health. This study examined the environmental occurrence of gene markers for livestock-related bacterial, protozoan, and viral pathogens and antibiotic resistance in surface waters within the South Fork Iowa River basin before and after periods of swine manure application on agricultural land. Increased concentrations of indicator bacteria after manure application exceeding Iowa's state bacteria water quality standards suggest that swine manure contributes to diminished water quality and may pose a risk to human health. Additionally, the occurrence of HEV and numerous bacterial pathogen genes for Escherichia coli, Enterococcus spp., Salmonella sp., and Staphylococcus aureus in both manure samples and in corresponding surface water following periods of manure application suggests a potential role for swine in the spreading of zoonotic pathogens to the surrounding environment. During this study, several zoonotic pathogens were detected including Shiga-toxin producing E. coli, Campylobacter jejuni, pathogenic enterococci, and S. aureus; all of which can pose mild to serious health risks to swine, humans, and other wildlife. This research provides the foundational understanding required for future assessment of the risk to environmental health from livestock-related zoonotic pathogen exposures in this region. This information could also be important for maintaining swine herd biosecurity and protecting the health of wildlife near swine facilities.
Kiryluk, Krzysztof; Li, Yifu; Scolari, Francesco; Sanna-Cherchi, Simone; Choi, Murim; Verbitsky, Miguel; Fasel, David; Lata, Sneh; Prakash, Sindhuri; Shapiro, Samantha; Fischman, Clara; Snyder, Holly J.; Appel, Gerald; Izzi, Claudia; Viola, Battista Fabio; Dallera, Nadia; Vecchio, Lucia Del; Barlassina, Cristina; Salvi, Erika; Bertinetto, Francesca Eleonora; Amoroso, Antonio; Savoldi, Silvana; Rocchietti, Marcella; Amore, Alessandro; Peruzzi, Licia; Coppo, Rosanna; Salvadori, Maurizio; Ravani, Pietro; Magistroni, Riccardo; Ghiggeri, Gian Marco; Caridi, Gianluca; Bodria, Monica; Lugani, Francesca; Allegri, Landino; Delsante, Marco; Maiorana, Mariarosa; Magnano, Andrea; Frasca, Giovanni; Boer, Emanuela; Boscutti, Giuliano; Ponticelli, Claudio; Mignani, Renzo; Marcantoni, Carmelita; Di Landro, Domenico; Santoro, Domenico; Pani, Antonello; Polci, Rosaria; Feriozzi, Sandro; Chicca, Silvana; Galliani, Marco; Gigante, Maddalena; Gesualdo, Loreto; Zamboli, Pasquale; Maixnerová, Dita; Tesar, Vladimir; Eitner, Frank; Rauen, Thomas; Floege, Jürgen; Kovacs, Tibor; Nagy, Judit; Mucha, Krzysztof; Pączek, Leszek; Zaniew, Marcin; Mizerska-Wasiak, Małgorzata; Roszkowska-Blaim, Maria; Pawlaczyk, Krzysztof; Gale, Daniel; Barratt, Jonathan; Thibaudin, Lise; Berthoux, Francois; Canaud, Guillaume; Boland, Anne; Metzger, Marie; Panzer, Ulf; Suzuki, Hitoshi; Goto, Shin; Narita, Ichiei; Caliskan, Yasar; Xie, Jingyuan; Hou, Ping; Chen, Nan; Zhang, Hong; Wyatt, Robert J.; Novak, Jan; Julian, Bruce A.; Feehally, John; Stengel, Benedicte; Cusi, Daniele; Lifton, Richard P.; Gharavi, Ali G.
2014-01-01
We performed a genome-wide association study (GWAS) of IgA nephropathy (IgAN), the most common form of glomerulonephritis, with discovery and follow-up in 20,612 individuals of European and East Asian ancestry. We identified six novel genome-wide significant associations, four in ITGAM-ITGAX, VAV3 and CARD9 and two new independent signals at HLA-DQB1 and DEFA. We replicated the nine previously reported signals, including known SNPs in the HLA-DQB1 and DEFA loci. The cumulative burden of risk alleles is strongly associated with age at disease onset. Most loci are either directly associated with risk of inflammatory bowel disease (IBD) or maintenance of the intestinal epithelial barrier and response to mucosal pathogens. The geo-spatial distribution of risk alleles is highly suggestive of multi-locus adaptation and the genetic risk correlates strongly with variation in local pathogens, particularly helminth diversity, suggesting a possible role for host-intestinal pathogen interactions in shaping the genetic landscape of IgAN. PMID:25305756
Quarantine Regulations and the Impact of Modern Detection Methods.
Martin, Robert R; Constable, Fiona; Tzanetakis, Ioannis E
2016-08-04
Producers worldwide need access to the best plant varieties and cultivars available to be competitive in global markets. This often means moving plants across international borders as soon as they are available. At the same time, quarantine agencies are tasked with minimizing the risk of introducing exotic pests and pathogens along with imported plant material, with the goal to protect domestic agriculture and native fauna and flora. These two drivers, the movement of more plant material and reduced risk of pathogen introduction, are at odds. Improvements in large-scale or next-generation sequencing (NGS) and bioinformatics for data analysis have resulted in improved speed and accuracy of pathogen detection that could facilitate plant trade with reduced risk of pathogen movement. There are concerns to be addressed before NGS can replace existing tools used for pathogen detection in plant quarantine and certification programs. Here, we discuss the advantages and possible pitfalls of this technology for meeting the needs of plant quarantine and certification.
[Occupational exposure to blood in multiple trauma care].
Wicker, S; Wutzler, S; Schachtrupp, A; Zacharowski, K; Scheller, B
2015-01-01
Trauma care personnel are at risk of occupational exposure to blood-borne pathogens. Little is known regarding compliance with standard precautions or occupational exposure to blood and body fluids among multiple trauma care personnel in Germany. Compliance rates of multiple trauma care personnel in applying standard precautions, knowledge about transmission risks of blood-borne pathogens, perceived risks of acquiring hepatitis B, hepatitis C and human immunodeficiency virus (HIV) and the personal attitude towards testing of the index patient for blood-borne pathogens after a needlestick injury were evaluated. In the context of an advanced multiple trauma training an anonymous questionnaire was administered to the participants. Almost half of the interviewees had sustained a needlestick injury within the last 12 months. Approximately three quarters of the participants were concerned about the risk of HIV and hepatitis. Trauma care personnel had insufficient knowledge of the risk of blood-borne pathogens, overestimated the risk of hepatitis C infection and underused standard precautionary measures. Although there was excellent compliance for using gloves, there was poor compliance in using double gloves (26.4 %), eye protectors (19.7 %) and face masks (15.8 %). The overwhelming majority of multiple trauma care personnel believed it is appropriate to test an index patient for blood-borne pathogens following a needlestick injury. The process of treatment in prehospital settings is less predictable than in other settings in which invasive procedures are performed. Periodic training and awareness programs for trauma care personnel are required to increase the knowledge of occupational infections and the compliance with standard precautions. The legal and ethical aspects of testing an index patient for blood-borne pathogens after a needlestick injury of a healthcare worker have to be clarified in Germany.
Endogenous System Microbes as Treatment Process Indicators for Decentralized Non-potable Water Reuse
Monitoring the efficacy of treatment strategies to remove pathogens in decentralized systems remains a challenge. Evaluating log reduction targets by measuring pathogen levels is hampered by their sporadic and low occurrence rates. Fecal indicator bacteria are used in centraliz...
Xu, Sheng-Gen; Mao, Zhao-Guang; Liu, Bin-Sheng; Zhu, Hui-Hua; Pan, Hui-Lin
2015-02-01
Widespread overuse and inappropriate use of antibiotics contribute to increasingly antibiotic-resistant pathogens and higher health care costs. It is not clear whether routine antibiotic prophylaxis can reduce the rate of surgical site infection (SSI) in low-risk patients undergoing orthopaedic surgery. We designed a simple scorecard to grade SSI risk factors and determined whether routine antibiotic prophylaxis affects SSI occurrence during open reduction and internal fixation (ORIF) orthopaedic surgeries in trauma patients at low risk of developing SSI. The SSI risk scorecard (possible total points ranged from 5 to 25) was designed to take into account a patient's general health status, the primary cause of fractures, surgical site tissue condition or wound class, types of devices implanted, and surgical duration. Patients with a low SSI risk score (≤8 points) who were undergoing clean ORIF surgery were divided into control (routine antibiotic treatment, cefuroxime) and evaluation (no antibiotic treatment) groups and followed up for 13-17 months after surgery. The infection rate was much higher in patients with high SSI risk scores (≥9 points) than in patients with low risk scores assigned to the control group (10.7% vs. 2.2%, P<0.0001). SSI occurred in 11 of 499 patients in the control group and in 13 of 534 patients in the evaluation group during the follow-up period of 13-17 months. The SSI occurrence rate did not differ significantly (2.2% vs. 2.4%, P=0.97) between the control and evaluation groups. Routine antibiotic prophylaxis does not significantly decrease the rate of SSI in ORIF surgical patients with a low risk score. Implementation of this scoring system could guide the rational use of perioperative antibiotics and ultimately reduce antibiotic resistance, health care costs, and adverse reactions to antibiotics. Copyright © 2014 Elsevier Ltd. All rights reserved.
Impacts of Climate Change on Indirect Human Exposure to Pathogens and Chemicals from Agriculture
Boxall, Alistair B.A.; Hardy, Anthony; Beulke, Sabine; Boucard, Tatiana; Burgin, Laura; Falloon, Peter D.; Haygarth, Philip M.; Hutchinson, Thomas; Kovats, R. Sari; Leonardi, Giovanni; Levy, Leonard S.; Nichols, Gordon; Parsons, Simon A.; Potts, Laura; Stone, David; Topp, Edward; Turley, David B.; Walsh, Kerry; Wellington, Elizabeth M.H.; Williams, Richard J.
2009-01-01
Objective Climate change is likely to affect the nature of pathogens and chemicals in the environment and their fate and transport. Future risks of pathogens and chemicals could therefore be very different from those of today. In this review, we assess the implications of climate change for changes in human exposures to pathogens and chemicals in agricultural systems in the United Kingdom and discuss the subsequent effects on health impacts. Data sources In this review, we used expert input and considered literature on climate change; health effects resulting from exposure to pathogens and chemicals arising from agriculture; inputs of chemicals and pathogens to agricultural systems; and human exposure pathways for pathogens and chemicals in agricultural systems. Data synthesis We established the current evidence base for health effects of chemicals and pathogens in the agricultural environment; determined the potential implications of climate change on chemical and pathogen inputs in agricultural systems; and explored the effects of climate change on environmental transport and fate of different contaminant types. We combined these data to assess the implications of climate change in terms of indirect human exposure to pathogens and chemicals in agricultural systems. We then developed recommendations on future research and policy changes to manage any adverse increases in risks. Conclusions Overall, climate change is likely to increase human exposures to agricultural contaminants. The magnitude of the increases will be highly dependent on the contaminant type. Risks from many pathogens and particulate and particle-associated contaminants could increase significantly. These increases in exposure can, however, be managed for the most part through targeted research and policy changes. PMID:19440487
Gregg A. DeNitto; Philip Cannon; Andris Eglitis; Jessie A. Glaeser; Helen Maffei; Sheri Smith
2015-01-01
The unmitigated risk potential of the introduction of exotic insects and pathogens to Hawai'i was evaluated for its impact on native plants, specifically Acacia koa, Cibotium spp., Dicranopteris linearis, Diospyros sandwicensis, Dodonaea viscosa, ...
Microbial risk assessment in heterogeneous aquifers: 2. Infection risk sensitivity
NASA Astrophysics Data System (ADS)
Molin, S.; Cvetkovic, V.; StenströM, T. A.
2010-05-01
The entire chain of events of human disease transmitted through contaminated water, from pathogen introduction into the source (E. coli, rotavirus, and Hepatitis A), pathogen migration through the aquifer pathway, to ingestion via a supply well, and finally, the potential infection in the human host, is investigated. The health risk calculations are based on a relevant hazardous event with safe setback distances estimated by considering the infection risk from peak exposure in compliance with an acceptable level defined by a regulatory agency. A site-specific hypothetical scenario is illustrated for an aquifer with similar characteristics as the Cape Cod site, Massachusetts (United States). Relatively large variation of safe distances for the three index pathogens is found; individually, none of the index pathogens could predict the safe distance under the wide range of conditions investigated. It is shown that colloid filtration theory (CFT) with spatially variable attachment-detachment rates yields significantly different results from the effective CFT model (i.e., assuming spatially constant parameters).
Transmission of Bacterial Zoonotic Pathogens between Pets and Humans: The Role of Pet Food.
Lambertini, Elisabetta; Buchanan, Robert L; Narrod, Clare; Pradhan, Abani K
2016-01-01
Recent Salmonella outbreaks associated with dry pet food and treats raised the level of concern for these products as vehicle of pathogen exposure for both pets and their owners. The need to characterize the microbiological and risk profiles of this class of products is currently not supported by sufficient specific data. This systematic review summarizes existing data on the main variables needed to support an ingredients-to-consumer quantitative risk model to (1) describe the microbial ecology of bacterial pathogens in the dry pet food production chain, (2) estimate pet exposure to pathogens through dry food consumption, and (3) assess human exposure and illness incidence due to contact with pet food and pets in the household. Risk models populated with the data here summarized will provide a tool to quantitatively address the emerging public health concerns associated with pet food and the effectiveness of mitigation measures. Results of such models can provide a basis for improvements in production processes, risk communication to consumers, and regulatory action.
Gao, Ke; Lai, Yutian; Huang, Jian; Wang, Yifan; Wang, Xiaowei; Che, Guowei
2017-04-20
Surgical procedure is the main method of treating lung cancer. Meanwhile, postoperative pneumonia (POP) is the major cause of perioperative mortality in lung cancer surgery. The preoperative pathogenic airway bacterial colonization is an independent risk factor causing postoperative pulmonary complications (PPC). This cross-sectional study aimed to explore the relationship between preoperative pathogenic airway bacterial colonization and POP in lung cancer and to identify the high-risk factors of preoperative pathogenic airway bacterial colonization. A total of 125 patients with non-small cell lung cancer (NSCLC) underwent thoracic surgery in six hospitals of Chengdu between May 2015 and January 2016. Preoperative pathogenic airway bacterial colonization was detected in all patients via fiber bronchoscopy. Patients' PPC, high-risk factors, clinical characteristics, and the serum surfactant protein D (SP-D) level were also analyzed. The incidence of preoperative pathogenic airway bacterial colonization among NSCLC patients was 15.2% (19/125). Up to 22 strains were identified in the colonization positive group, with Gram-negative bacteria being dominant (86.36%, 19/22). High-risk factors of pathogenic airway bacterial colonization were age (≥75 yr) and smoking index (≥400 cigarettes/year). PPC incidence was significantly higher in the colonization-positive group (42.11%, 8/19) than that in the colonization-negative group (16.04%, 17/106)(P=0.021). POP incidence was significantly higher in the colonization-positive group (26.32%, 5/19) than that in the colonization-negative group (6.60%, 7/106)(P=0.019). The serum SP-D level of patients in the colonization-positive group was remarkably higher than that in the colonization-negative group [(31.25±6.09) vs (28.17±5.23)](P=0.023). The incidence of preoperative pathogenic airway bacterial colonization among NSCLC patients with POP was 41.67% (5/12). This value was 3.4 times higher than that among the patients without POP (OR=3.363, 95%CI: 1.467-7.711). An intimate correlation was observed between POP and pathogenic airway bacterial colonization in lung cancer. The high-risk factors of pathogenic airway bacterial colonization were age and smoking index.
Burch, Tucker R.; Spencer, Susan K.; Stokdyk, Joel P.; Kieke, Burney A.; Larson, Rebecca A.; Firnstahl, Aaron D.; Rule, Ana M.
2017-01-01
Background: Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. Objectives: We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. Methods: We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irrigation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. Results: Median risk estimates from Monte Carlo simulations ranged from 10−5 to 10−2 and decreased with distance from the source. Risk estimates for Salmonella or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as a significant predictor more often than relative humidity, solar irradiation, or temperature. Conclusions: Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk. https://doi.org/10.1289/EHP283 PMID:28885976
Burch, Tucker R; Spencer, Susan K.; Stokdyk, Joel; Kieke, Burney A; Larson, Rebecca A; Firnstahl, Aaron; Rule, Ana M; Borchardt, Mark A.
2017-01-01
BACKGROUND: Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. OBJECTIVES: We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. METHODS: We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irri- gation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. RESULTS: Median risk estimates from Monte Carlo simulations ranged from 10−5 to 10−2 and decreased with distance from the source. Risk estimates for Salmonella or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as a significant predictor more often than relative humidity, solar irradiation, or temperature. CONCLUSIONS: Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk.
What’s the risk? Identifying potential human pathogens within grey-headed flying foxes faeces
Galbraith, Penelope; Coutts, Scott; Prosser, Toby; Boyce, John; McCarthy, David T.
2018-01-01
Pteropus poliocephalus (grey-headed flying foxes) are recognised vectors for a range of potentially fatal human pathogens. However, to date research has primarily focused on viral disease carriage, overlooking bacterial pathogens, which also represent a significant human disease risk. The current study applied 16S rRNA amplicon sequencing, community analysis and a multi-tiered database OTU picking approach to identify faecal-derived zoonotic bacteria within two colonies of P. poliocephalus from Victoria, Australia. Our data show that sequences associated with Enterobacteriaceae (62.8% ± 24.7%), Pasteurellaceae (19.9% ± 25.7%) and Moraxellaceae (9.4% ± 11.8%) dominate flying fox faeces. Further colony specific differences in bacterial faecal colonisation patterns were also identified. In total, 34 potential pathogens, representing 15 genera, were identified. However, species level definition was only possible for Clostridium perfringens, which likely represents a low infectious risk due to the low proportion observed within the faeces and high infectious dose required for transmission. In contrast, sequences associated with other pathogenic species clusters such as Haemophilus haemolyticus-H. influenzae and Salmonella bongori-S. enterica, were present at high proportions in the faeces, and due to their relatively low infectious doses and modes of transmissions, represent a greater potential human disease risk. These analyses of the microbial community composition of Pteropus poliocephalus have significantly advanced our understanding of the potential bacterial disease risk associated with flying foxes and should direct future epidemiological and quantitative microbial risk assessments to further define the health risks presented by these animals. PMID:29360880
Orthobunyavirus antibodies among humans in selected parts of the Rift Valley and northeastern Kenya.
Odhiambo, Collins; Venter, Marietjie; Swanepoel, Robert; Sang, Rosemary
2015-05-01
Ngari, Bunyamwera, Ilesha, and Germiston viruses are among the mosquito-borne human pathogens in the Orthobunyavirus genus, family Bunyaviridae, associated with febrile illness. Although the four orthobunyaviruses have been isolated from mosquito and/or tick vectors sampled from different geographic regions in Kenya, little is known of human exposure in such areas. We conducted a serologic investigation to determine whether orthobunyaviruses commonly infect humans in Kenya. Orthobunyavirus-specific antibodies were detected by plaque reduction neutralization tests in 89 (25.8%) of 345 persons tested. Multivariable analysis revealed age and residence in northeastern Kenya as risk factors. Implementation of acute febrile illness surveillance in northeastern Kenya will help to detect such infections.
Hemovigilance monitoring of platelet septic reactions with effective bacterial protection systems.
Benjamin, Richard J; Braschler, Thomas; Weingand, Tina; Corash, Laurence M
2017-12-01
Delayed, large-volume bacterial culture and amotosalen/ultraviolet-A light pathogen reduction are effective at reducing the risk of bacterial proliferation in platelet concentrates (PCs). Hemovigilance programs continue to receive reports of suspected septic transfusion reactions, most with low imputability. Here, we compile national hemovigilance data to determine the relative efficacy of these interventions. Annual reports from the United Kingdom, France, Switzerland, and Belgium were reviewed between 2005 and 2016 to assess the risk of bacterial contamination and septic reactions. Approximately 1.65 million delayed, large-volume bacterial culture-screened PCs in the United Kingdom and 2.3 million amotosalen/ultraviolet-A-treated PCs worldwide were issued with no reported septic fatalities. One definite, one possible, and 12 undetermined/indeterminate septic reactions and eight contaminated "near misses" were reported with delayed, large-volume bacterial cultures between 2011 and 2016, for a lower false-negative culture rate than that in the previous 5 years (5.4 vs. 16.3 per million: odds ratio, 3.0; 95% confidence interval, 1.4-6.5). Together, the Belgian, Swiss, and French hemovigilance programs documented zero probable or definite/certain septic reactions with 609,290 amotosalen/ultraviolet-A-treated PCs (<1.6 per million). The rates were significantly lower than those reported with concurrently transfused, nonpathogen-reduced PCs in Belgium (<4.4 vs. 35.6 per million: odds ratio, 8.1; 95% confidence interval,1.1-353.3) and with historic septic reaction rates in Switzerland (<6.0 vs. 82.9 per million: odds ratio, 13.9; 95% confidence interval, 2.1-589.2), and the rates tended to be lower than those from concurrently transfused, nonpathogen-reduced PCs in France (<4.7 vs. 19.0 per million: odds ratio, 4.1; 95% confidence interval, 0.7-164.3). Pathogen reduction and bacterial culture both reduced the incidence of septic reactions, although under-reporting and strict imputability criteria resulted in an underestimation of risk. © 2017 The Authors Transfusion published by Wiley Periodicals, Inc. on behalf of AABB.
Petterson, S R
2016-02-01
The aim of this study was to develop a modified quantitative microbial risk assessment (QMRA) framework that could be applied as a decision support tool to choose between alternative drinking water interventions in the developing context. The impact of different household water treatment (HWT) interventions on the overall incidence of diarrheal disease and disability adjusted life years (DALYs) was estimated, without relying on source water pathogen concentration as the starting point for the analysis. A framework was developed and a software tool constructed and then implemented for an illustrative case study for Nepal based on published scientific data. Coagulation combined with free chlorine disinfection provided the greatest estimated health gains in the short term; however, when long-term compliance was incorporated into the calculations, the preferred intervention was porous ceramic filtration. The model demonstrates how the QMRA framework can be used to integrate evidence from different studies to inform management decisions, and in particular to prioritize the next best intervention with respect to estimated reduction in diarrheal incidence. This study only considered HWT interventions; it is recognized that a systematic consideration of sanitation, recreation, and drinking water pathways is important for effective management of waterborne transmission of pathogens, and the approach could be expanded to consider the broader water-related context. © 2015 Society for Risk Analysis.
Schijven, Jack; Bouwknegt, Martijn; de Roda Husman, Ana Maria; Rutjes, Saskia; Sudre, Bertrand; Suk, Jonathan E; Semenza, Jan C
2013-12-01
Climate change may impact waterborne and foodborne infectious disease, but to what extent is uncertain. Estimating climate-change-associated relative infection risks from exposure to viruses, bacteria, or parasites in water or food is critical for guiding adaptation measures. We present a computational tool for strategic decision making that describes the behavior of pathogens using location-specific input data under current and projected climate conditions. Pathogen-pathway combinations are available for exposure to norovirus, Campylobacter, Cryptosporidium, and noncholera Vibrio species via drinking water, bathing water, oysters, or chicken fillets. Infection risk outcomes generated by the tool under current climate conditions correspond with those published in the literature. The tool demonstrates that increasing temperatures lead to increasing risks for infection with Campylobacter from consuming raw/undercooked chicken fillet and for Vibrio from water exposure. Increasing frequencies of drought generally lead to an elevated infection risk of exposure to persistent pathogens such as norovirus and Cryptosporidium, but decreasing risk of exposure to rapidly inactivating pathogens, like Campylobacter. The opposite is the case with increasing annual precipitation; an upsurge of heavy rainfall events leads to more peaks in infection risks in all cases. The interdisciplinary tool presented here can be used to guide climate change adaptation strategies focused on infectious diseases. © 2013 Society for Risk Analysis.
Dagher, Fadi
2017-01-01
Traditional surface disinfectants that have long been applied in medicine, animal husbandry, manufacturing and institutions are inconvenient at best and dangerous at worst. Moreover, some of these substances have adverse environmental impacts: for example, quaternary ammonium compounds (“quats”) are reproductive toxicants in both fish and mammals. Halogens are corrosive both to metals and living tissues, are highly reactive, can be readily neutralized by metals, and react with organic matter to form toxic, persistent by-products such as dioxins and furans. Aldehydes may be carcinogenic to both human and animals upon repeated exposures, are corrosive, cross-link living tissues and many synthetic materials, and may lose efficacy when pathogens enzymatically adapt to them. Alcohols are flammable and volatile and can be enzymatically degraded by certain bacterial pathogens. Quats are highly irritating to mucous membranes and over time can induce pathogen resistance, especially if they are not alternated with functionally different disinfectants. In contrast, peracetic acid (PAA), a potent oxidizer, liberates hydrogen peroxide (itself a disinfectant), biodegrades to carbon dioxide, water and oxygen, and is at least as efficacious as contact biocides e.g., halogens and aldehydes. Nevertheless, the standard form of liquid PAA is highly corrosive, is neutralized by metals and organic matter, gives off noxious odours and must be stored in vented containers. For the reasons stated above, Bioxy formulations were developed, a series of powder forms of PAA, which are odourless, stable in storage and safe to transport and handle. They generate up to 10% PAA in situ when dissolved in water. A 0.2% aqueous solution of Bioxy (equivalent to 200 ppm PAA) effected a 6.76 log reduction in Methicillin-resistant Staphylococcus aureus (MRSA) within 2 minutes after application. A 5% aqueous solution of Bioxy achieved a 3.93 log reduction in the bovine tuberculosis bacillus Mycobacterium bovis, within 10 minutes after contact. A 1% solution of Bioxy reduced vancomycin-resistant enterococci (VRE) and Pseudomonas aeruginosa by 6.31 and 7.18 logs, respectively, within 3 minutes after application. A 0.5% solution of Bioxy inactivated porcine epidemic diarrhea virus (PEDV) within 15 minutes of contact, and a 5% solution of Bioxy realized a 5.36 log reduction in the spores of Clostridium difficile within 10 minutes of application. In summary, Bioxy is safe and easy to transport and store, poses negligible human, animal and environmental health risks, shows high levels of pathogen control efficacy and does not induce microbial resistance. Further investigations are recommended to explore its use as an industrial biocide. PMID:28207828
Dagher, Dori; Ungar, Ken; Robison, Richard; Dagher, Fadi
2017-01-01
Traditional surface disinfectants that have long been applied in medicine, animal husbandry, manufacturing and institutions are inconvenient at best and dangerous at worst. Moreover, some of these substances have adverse environmental impacts: for example, quaternary ammonium compounds ("quats") are reproductive toxicants in both fish and mammals. Halogens are corrosive both to metals and living tissues, are highly reactive, can be readily neutralized by metals, and react with organic matter to form toxic, persistent by-products such as dioxins and furans. Aldehydes may be carcinogenic to both human and animals upon repeated exposures, are corrosive, cross-link living tissues and many synthetic materials, and may lose efficacy when pathogens enzymatically adapt to them. Alcohols are flammable and volatile and can be enzymatically degraded by certain bacterial pathogens. Quats are highly irritating to mucous membranes and over time can induce pathogen resistance, especially if they are not alternated with functionally different disinfectants. In contrast, peracetic acid (PAA), a potent oxidizer, liberates hydrogen peroxide (itself a disinfectant), biodegrades to carbon dioxide, water and oxygen, and is at least as efficacious as contact biocides e.g., halogens and aldehydes. Nevertheless, the standard form of liquid PAA is highly corrosive, is neutralized by metals and organic matter, gives off noxious odours and must be stored in vented containers. For the reasons stated above, Bioxy formulations were developed, a series of powder forms of PAA, which are odourless, stable in storage and safe to transport and handle. They generate up to 10% PAA in situ when dissolved in water. A 0.2% aqueous solution of Bioxy (equivalent to 200 ppm PAA) effected a 6.76 log reduction in Methicillin-resistant Staphylococcus aureus (MRSA) within 2 minutes after application. A 5% aqueous solution of Bioxy achieved a 3.93 log reduction in the bovine tuberculosis bacillus Mycobacterium bovis, within 10 minutes after contact. A 1% solution of Bioxy reduced vancomycin-resistant enterococci (VRE) and Pseudomonas aeruginosa by 6.31 and 7.18 logs, respectively, within 3 minutes after application. A 0.5% solution of Bioxy inactivated porcine epidemic diarrhea virus (PEDV) within 15 minutes of contact, and a 5% solution of Bioxy realized a 5.36 log reduction in the spores of Clostridium difficile within 10 minutes of application. In summary, Bioxy is safe and easy to transport and store, poses negligible human, animal and environmental health risks, shows high levels of pathogen control efficacy and does not induce microbial resistance. Further investigations are recommended to explore its use as an industrial biocide.
Nonindigenous Pathogenic Shrimp Virus Introductions into the United States: Developing a Qualitative Ecological Risk Assessment. Austin, R.K.; van der Schalie, W.R.; U.S. Environmental Protection Agency, Washington, DC; Menzie, C.; Menzie-Cura and Associates, Chelmsford, MA; Fair...
USEPA PATHOGEN EQUIVALENCY COMMITTEE RETREAT
The Pathogen Equivalency Committee held its retreat from September 20-21, 2005 at Hueston Woods State Park in College Corner, Ohio. This presentation will update the PEC’s membership on emerging pathogens, analytical methods, disinfection techniques, risk analysis, preparat...
Microbial risk assessment in heterogeneous aquifers: 1. Pathogen transport
NASA Astrophysics Data System (ADS)
Molin, S.; Cvetkovic, V.
2010-05-01
Pathogen transport in heterogeneous aquifers is investigated for microbial risk assessment. A point source with time-dependent input of pathogens is assumed, exemplified as a simple on-site sanitation installation, intermingled with water supply wells. Any pathogen transmission pathway (realization) to the receptor from a postulated infection hazard is viewed as a random event, with the hydraulic conductivity varying spatially. For aquifers where VAR[lnK] < 1 and the integral scale is finite, we provide relatively simple semianalytical expressions for pathogen transport that incorporate the colloid filtration theory. We test a wide range of Damkohler numbers in order to assess the significance of rate limitations on the aquifer barrier function. Even slow immobile inactivation may notably affect the retention of pathogens. Analytical estimators for microbial peak discharge are evaluated and are shown to be applicable using parameters representative of rotavirus and Hepatitis A with input of 10-20 days duration.
2011-01-01
In the past the root rot pathogen Roesleria subterranea (Ascomycota) was generally considered as a minor parasite, a view with which we were often confronted during field work in German wine-growing regions where this ascomycete recently caused serious problems in established vineyards and at replant sites. To irrevocably demonstrate that R. subterranea is not a minor, but a primary pathogen of grapevines (and fruit trees) a pest risk analysis was carried out according to the guidelines defined by EPPO standard series PM 5, which defines the information needed, and contains standardised, detailed key questions and a decision support scheme for risk analysis. Following the provided decision scheme, it becomes apparent that R. subterranea must be considered as a serious, primary pathogen for grapevines and fruit trees that can cause massive economic losses. Based on the literature, the pathogen seems to be ubiquitous in wine growing regions in cool climates of the northern hemisphere. It is likely that because of its growth below ground, the small fruiting bodies, and ambiguous symptoms above ground, R. subterranea has been overlooked in the past and therefore, has not been considered as primary pathogen for grapevine. Available published information together with experience from field trials was implemented into a diagnostic decision scheme which will, together with the comprehensive literature provided, be the basis (a) to implement quick and efficient diagnosis of this pathogen in the field and (b) to conduct risk analysis and management in areas where R. subterranea has not established yet. PMID:22318129
Mills, Freya; Petterson, Susan; Norman, Guy
2018-01-01
Public health benefits are often a key political driver of urban sanitation investment in developing countries, however, pathogen flows are rarely taken systematically into account in sanitation investment choices. While several tools and approaches on sanitation and health risks have recently been developed, this research identified gaps in their ability to predict faecal pathogen flows, to relate exposure risks to the existing sanitation services, and to compare expected impacts of improvements. This paper outlines a conceptual approach that links faecal waste discharge patterns with potential pathogen exposure pathways to quantitatively compare urban sanitation improvement options. An illustrative application of the approach is presented, using a spreadsheet-based model to compare the relative effect on disability-adjusted life years of six sanitation improvement options for a hypothetical urban situation. The approach includes consideration of the persistence or removal of different pathogen classes in different environments; recognition of multiple interconnected sludge and effluent pathways, and of multiple potential sites for exposure; and use of quantitative microbial risk assessment to support prediction of relative health risks for each option. This research provides a step forward in applying current knowledge to better consider public health, alongside environmental and other objectives, in urban sanitation decision making. Further empirical research in specific locations is now required to refine the approach and address data gaps. PMID:29360775
Sheen, Shiowshuh; Huang, Chi-Yun; Ramos, Rommel; Chien, Shih-Yung; Scullen, O Joseph; Sommers, Christopher
2018-03-01
Pathogenic Escherichia coli, intestinal (O157:H7) as well as extraintestinal types (for example, Uropathogenic E. coli [UPEC]) are commonly found in many foods including raw chicken meat. The resistance of E. coli O157:H7 to UPEC in chicken meat under the stresses of high hydrostatic Pressure (HHP, also known as HPP-high pressure processing) and trans-cinnamaldehyde (an essential oil) was investigated and compared. UPEC was found slightly less resistant than O157:H7 in our test parameter ranges. With the addition of trans-cinnamaldehyde as an antimicrobial to meat, HPP lethality enhanced both O157:H7 and UPEC inactivation. To facilitate the predictive model development, a central composite design (CCD) was used to assess the 3-parameter effects, that is, pressure (300 to 400 MPa), trans-cinnamaldehyde dose (0.2 to 0.5%, w/w), and pressure-holding time (15 to 25 min), on the inactivation of E. coli O157:H7 and UPEC in ground chicken. Linear models were developed to estimate the lethality of E. coli O157:H7 (R 2 = 0.86) and UPEC (R 2 = 0.85), as well as dimensionless nonlinear models. All models were validated with data obtained from separated CCD combinations. Because linear models of O157:H7 and UPEC had similar R 2 and the significant lethality difference of CCD points was only 9 in 20; all data were combined to generate models to include both O157:H7 and UPEC. The results provide useful information/tool to predict how pathogenic E. coli may survive HPP in the presence of trans-cinnamaldehyde and to achieve a great than 5 log CFU/g reduction in chicken meat. The models may be used for process optimization, product development and to assist the microbial risk assessment. The study provided an effective means to reduce the high hydrostatic pressure level with incorporation of antimicrobial compound to achieve a 5-log reduction of pathogenic E. coli without damaging the raw meat quality. The developed models may be used to predict the high pressure processing lethality (and process optimization), product development (ingredient selection), and to assist the microbial risk assessment. © 2018 Institute of Food Technologists®.
US EPA's Pathogen Equivalency Committee (PEC) has updated the evaluation criteria it uses to make recommendations of equivalency (to processes acceptable under 40CFR503) on innovative or alternative sludge pathogen reduction processes. These criteria will be presented along with ...
Enhanced salmonella reduction on tomatoes washed in chlorinated water with wash aid T-128
USDA-ARS?s Scientific Manuscript database
Chlorine is widely used by the fresh and fresh-cut produce industries to reduce microbial populations and to prevent potential pathogen cross contamination during produce washing. However, the organic materials released from produce quickly react with chlorine and degrade its efficacy for pathogen i...
Microbial risk assessment (MRA) in the food industry is used to support HACCP – which largely focuses on bacterial pathogen control in processing foodstuffs Potential role of microbially-contaminated water used in food production is not as well understood Emergence...
We evaluate the influence of multiple sources of faecal indicator bacteria in recreational water bodies on potential human health risk by considering waters impacted by human and animal sources, human and non-pathogenic sources, and animal and non-pathogenic sources. We illustrat...
Driscoll, Amanda J; Deloria Knoll, Maria; Hammitt, Laura L; Baggett, Henry C; Brooks, W Abdullah; Feikin, Daniel R; Kotloff, Karen L; Levine, Orin S; Madhi, Shabir A; O'Brien, Katherine L; Scott, J Anthony G; Thea, Donald M; Howie, Stephen R C; Adrian, Peter V; Ahmed, Dilruba; DeLuca, Andrea N; Ebruke, Bernard E; Gitahi, Caroline; Higdon, Melissa M; Kaewpan, Anek; Karani, Angela; Karron, Ruth A; Mazumder, Razib; McLellan, Jessica; Moore, David P; Mwananyanda, Lawrence; Park, Daniel E; Prosperi, Christine; Rhodes, Julia; Saifullah, Md; Seidenberg, Phil; Sow, Samba O; Tamboura, Boubou; Zeger, Scott L; Murdoch, David R
2017-06-15
Antibiotic exposure and specimen volume are known to affect pathogen detection by culture. Here we assess their effects on bacterial pathogen detection by both culture and polymerase chain reaction (PCR) in children. PERCH (Pneumonia Etiology Research for Child Health) is a case-control study of pneumonia in children aged 1-59 months investigating pathogens in blood, nasopharyngeal/oropharyngeal (NP/OP) swabs, and induced sputum by culture and PCR. Antibiotic exposure was ascertained by serum bioassay, and for cases, by a record of antibiotic treatment prior to specimen collection. Inoculated blood culture bottles were weighed to estimate volume. Antibiotic exposure ranged by specimen type from 43.5% to 81.7% in 4223 cases and was detected in 2.3% of 4863 controls. Antibiotics were associated with a 45% reduction in blood culture yield and approximately 20% reduction in yield from induced sputum culture. Reduction in yield of Streptococcus pneumoniae from NP culture was approximately 30% in cases and approximately 32% in controls. Several bacteria had significant but marginal reductions (by 5%-7%) in detection by PCR in NP/OP swabs from both cases and controls, with the exception of S. pneumoniae in exposed controls, which was detected 25% less frequently compared to nonexposed controls. Bacterial detection in induced sputum by PCR decreased 7% for exposed compared to nonexposed cases. For every additional 1 mL of blood culture specimen collected, microbial yield increased 0.51% (95% confidence interval, 0.47%-0.54%), from 2% when volume was ≤1 mL to approximately 6% for ≥3 mL. Antibiotic exposure and blood culture volume affect detection of bacterial pathogens in children with pneumonia and should be accounted for in studies of etiology and in clinical management. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
Park, Sang-Hyun; Kang, Dong-Hyun
2015-08-17
The objective of this study was to evaluate the antimicrobial effect of chlorine dioxide (ClO2) gas and aerosolized sanitizer, when applied alone or in combination, on the survival of Escherichia coli O157:H7, Salmonella Typhimurium, and Listeria monocytogenes inoculated onto spinach leaves and tomato surfaces. Spinach leaves and tomatoes were inoculated with a cocktail of three strains each of the three foodborne pathogens. ClO2 gas (5 or 10 ppmv) and aerosolized peracetic acid (PAA) (80 ppm) were applied alone or in combination for 20 min. Exposure to 10 ppmv of ClO2 gas for 20 min resulted in 3.4, 3.3, and 3.4 log reductions of E. coli O157:H7, S. Typhimurium, and L. monocytogenes on spinach leaves, respectively. Treatment with 80 ppm of aerosolized PAA for 20 min caused 2.3, 1.9, and 0.8 log reductions of E. coli O157:H7, S. Typhimurium, and L. monocytogenes, respectively. Combined treatment of ClO2 gas (10 ppmv) and aerosolized PAA (80 ppm) for 20 min caused 5.4, 5.1, and 4.1 log reductions of E. coli O157:H7, S. Typhimurium, and L. monocytogenes, respectively. E. coli O157:H7, S. Typhimurium, and L. monocytogenes on tomatoes experienced similar reduction patterns to those on spinach leaves. As treatment time increased, most combinations of ClO2 gas and aerosolized PAA showed additive effects in the inactivation of the three pathogens. Combined treatment of ClO2 gas and aerosolized PAA produced injured cells of three pathogens on spinach leaves while generally did not produce injured cells of these pathogens on tomatoes. Combined treatment of ClO2 gas (10 ppmv) and aerosolized PAA (80 ppm) did not significantly (p>0.05) affect the color and texture of samples during 7 days of storage. Copyright © 2015. Published by Elsevier B.V.
A stochastic agent-based model of pathogen propagation in dynamic multi-relational social networks
Khan, Bilal; Dombrowski, Kirk; Saad, Mohamed
2015-01-01
We describe a general framework for modeling and stochastic simulation of epidemics in realistic dynamic social networks, which incorporates heterogeneity in the types of individuals, types of interconnecting risk-bearing relationships, and types of pathogens transmitted across them. Dynamism is supported through arrival and departure processes, continuous restructuring of risk relationships, and changes to pathogen infectiousness, as mandated by natural history; dynamism is regulated through constraints on the local agency of individual nodes and their risk behaviors, while simulation trajectories are validated using system-wide metrics. To illustrate its utility, we present a case study that applies the proposed framework towards a simulation of HIV in artificial networks of intravenous drug users (IDUs) modeled using data collected in the Social Factors for HIV Risk survey. PMID:25859056
Hinz, Rebecca
2015-01-01
Chronic inflammation, which is caused by recurrent infections, is one of the factors contributing to the pathogenesis of cholesteatoma. If reimplantation of autologous ossicles after a surgical intervention is intended, inactivation of planktonic bacteria and biofilms is desirable. High hydrostatic pressure treatment is a procedure, which has been used to inactivate cholesteatoma cells on ossicles. Here we discuss the potential inactivating effect of high hydrostatic pressure on microbial pathogens including biofilms. Recent experimental data suggest an incomplete inactivation at a pressure level, which is tolerable for the bone substance of ossicles and results at least in a considerable reduction of pathogen load. Further studies are necessary to access how far this quantitative reduction of pathogens is sufficient to prevent ongoing chronic infections, for example, due to forming of biofilms. PMID:25705686
Pathogens at the livestock-wildlife interface in Western Alberta: does transmission route matter?
2014-01-01
In southwestern Alberta, interactions between beef cattle and free-ranging elk (Cervus elaphus) may provide opportunities for pathogen transmission. To assess the importance of the transmission route on the potential for interspecies transmission, we conducted a cross-sectional study on four endemic livestock pathogens with three different transmission routes: Bovine Viral Diarrhea Virus and Bovine Herpesvirus 1 (predominantly direct transmission), Mycobacterium avium subsp. paratuberculosis (MAP) (indirect fecal-oral transmission), Neospora caninum (indirect transmission with definitive host). We assessed the occurrence of these pathogens in 28 cow-calf operations exposed or non-exposed to elk, and in 10 elk herds exposed or not to cattle. We characterized the effect of species commingling as a risk factor of pathogen exposure and documented the perceived risk of pathogen transmission at this wildlife-livestock interface in the rural community. Herpesviruses found in elk were elk-specific gamma-herpesviruses unrelated to cattle viruses. Pestivirus exposure in elk could not be ascertained to be of livestock origin. Evidence of MAP circulation was found in both elk and cattle, but there was no statistical effect of the species commingling. Finally, N. caninum was more frequently detected in elk exposed to cattle and this association was still significant after adjustment for herd and sampling year clustering, and individual elk age and sex. Only indirectly transmitted pathogens co-occurred in cattle and elk, indicating the potential importance of the transmission route in assessing the risk of pathogen transmission in multi-species grazing systems. PMID:24517283
Codony, Francesc; Pérez, Leonardo Martín; Adrados, Bárbara; Agustí, Gemma; Fittipaldi, Mariana; Morató, Jordi
2012-01-01
Culture-based methods for fecal indicator microorganisms are the standard protocol to assess potential health risk from drinking water systems. However, these traditional fecal indicators are inappropriate surrogates for disinfection-resistant fecal pathogens and the indigenous pathogens that grow in drinking water systems. There is now a range of molecular-based methods, such as quantitative PCR, which allow detection of a variety of pathogens and alternative indicators. Hence, in addition to targeting total Escherichia coli (i.e., dead and alive) for the detection of fecal pollution, various amoebae may be suitable to indicate the potential presence of pathogenic amoeba-resisting microorganisms, such as Legionellae. Therefore, monitoring amoeba levels by quantitative PCR could be a useful tool for directly and indirectly evaluating health risk and could also be a complementary approach to current microbial quality control strategies for drinking water systems.
Lethal exposure: An integrated approach to pathogen transmission via environmental reservoirs.
Turner, Wendy C; Kausrud, Kyrre L; Beyer, Wolfgang; Easterday, W Ryan; Barandongo, Zoë R; Blaschke, Elisabeth; Cloete, Claudine C; Lazak, Judith; Van Ert, Matthew N; Ganz, Holly H; Turnbull, Peter C B; Stenseth, Nils Chr; Getz, Wayne M
2016-06-06
To mitigate the effects of zoonotic diseases on human and animal populations, it is critical to understand what factors alter transmission dynamics. Here we assess the risk of exposure to lethal concentrations of the anthrax bacterium, Bacillus anthracis, for grazing animals in a natural system over time through different transmission mechanisms. We follow pathogen concentrations at anthrax carcass sites and waterholes for five years and estimate infection risk as a function of grass, soil or water intake, age of carcass sites, and the exposure required for a lethal infection. Grazing, not drinking, seems the dominant transmission route, and transmission is more probable from grazing at carcass sites 1-2 years of age. Unlike most studies of virulent pathogens that are conducted under controlled conditions for extrapolation to real situations, we evaluate exposure risk under field conditions to estimate the probability of a lethal dose, showing that not all reservoirs with detectable pathogens are significant transmission pathways.
Disease Risk in a Dynamic Environment: The Spread of Tick-Borne Pathogens in Minnesota, USA
Robinson, Stacie J.; Neitzel, David F.; Moen, Ronald A.; Craft, Meggan E.; Hamilton, Karin E.; Johnson, Lucinda B.; Mulla, David J.; Munderloh, Ulrike G.; Redig, Patrick T.; Smith, Kirk E.; Turner, Clarence L.; Umber, Jamie K.; Pelican, Katharine M.
2015-01-01
As humans and climate change alter the landscape, novel disease risk scenarios emerge. Understanding the complexities of pathogen emergence and subsequent spread as shaped by landscape heterogeneity is crucial to understanding disease emergence, pinpointing high-risk areas, and mitigating emerging disease threats in a dynamic environment. Tick-borne diseases present an important public health concern and incidence of many of these diseases are increasing in the United States. The complex epidemiology of tick-borne diseases includes strong ties with environmental factors that influence host availability, vector abundance, and pathogen transmission. Here, we used 16 years of case data from the Minnesota Department of Health to report spatial and temporal trends in Lyme disease (LD), human anaplasmosis, and babesiosis. We then used a spatial regression framework to evaluate the impact of landscape and climate factors on the spread of LD. Finally, we use the fitted model, and landscape and climate datasets projected under varying climate change scenarios, to predict future changes in tick-borne pathogen risk. Both forested habitat and temperature were important drivers of LD spread in Minnesota. Dramatic changes in future temperature regimes and forest communities predict rising risk of tick-borne disease. PMID:25281302
Disease risk in a dynamic environment: the spread of tick-borne pathogens in Minnesota, USA.
Robinson, Stacie J; Neitzel, David F; Moen, Ronald A; Craft, Meggan E; Hamilton, Karin E; Johnson, Lucinda B; Mulla, David J; Munderloh, Ulrike G; Redig, Patrick T; Smith, Kirk E; Turner, Clarence L; Umber, Jamie K; Pelican, Katharine M
2015-03-01
As humans and climate change alter the landscape, novel disease risk scenarios emerge. Understanding the complexities of pathogen emergence and subsequent spread as shaped by landscape heterogeneity is crucial to understanding disease emergence, pinpointing high-risk areas, and mitigating emerging disease threats in a dynamic environment. Tick-borne diseases present an important public health concern and incidence of many of these diseases are increasing in the United States. The complex epidemiology of tick-borne diseases includes strong ties with environmental factors that influence host availability, vector abundance, and pathogen transmission. Here, we used 16 years of case data from the Minnesota Department of Health to report spatial and temporal trends in Lyme disease (LD), human anaplasmosis, and babesiosis. We then used a spatial regression framework to evaluate the impact of landscape and climate factors on the spread of LD. Finally, we use the fitted model, and landscape and climate datasets projected under varying climate change scenarios, to predict future changes in tick-borne pathogen risk. Both forested habitat and temperature were important drivers of LD spread in Minnesota. Dramatic changes in future temperature regimes and forest communities predict rising risk of tick-borne disease.
Walker, Logan C; Marquart, Louise; Pearson, John F; Wiggins, George A R; O'Mara, Tracy A; Parsons, Michael T; Barrowdale, Daniel; McGuffog, Lesley; Dennis, Joe; Benitez, Javier; Slavin, Thomas P; Radice, Paolo; Frost, Debra; Godwin, Andrew K; Meindl, Alfons; Schmutzler, Rita Katharina; Isaacs, Claudine; Peshkin, Beth N; Caldes, Trinidad; Hogervorst, Frans BL; Lazaro, Conxi; Jakubowska, Anna; Montagna, Marco; Chen, Xiaoqing; Offit, Kenneth; Hulick, Peter J; Andrulis, Irene L; Lindblom, Annika; Nussbaum, Robert L; Nathanson, Katherine L; Chenevix-Trench, Georgia; Antoniou, Antonis C; Couch, Fergus J; Spurdle, Amanda B
2017-01-01
Genome-wide studies of patients carrying pathogenic variants (mutations) in BRCA1 or BRCA2 have reported strong associations between single-nucleotide polymorphisms (SNPs) and cancer risk. To conduct the first genome-wide association analysis of copy-number variants (CNVs) with breast or ovarian cancer risk in a cohort of 2500 BRCA1 pathogenic variant carriers, CNV discovery was performed using multiple calling algorithms and Illumina 610k SNP array data from a previously published genome-wide association study. Our analysis, which focused on functionally disruptive genomic deletions overlapping gene regions, identified a number of loci associated with risk of breast or ovarian cancer for BRCA1 pathogenic variant carriers. Despite only including putative deletions called by at least two or more algorithms, detection of selected CNVs by ancillary molecular technologies only confirmed 40% of predicted common (>1% allele frequency) variants. These include four loci that were associated (unadjusted P<0.05) with breast cancer (GTF2H2, ZNF385B, NAALADL2 and PSG5), and two loci associated with ovarian cancer (CYP2A7 and OR2A1). An interesting finding from this study was an association of a validated CNV deletion at the CYP2A7 locus (19q13.2) with decreased ovarian cancer risk (relative risk=0.50, P=0.007). Genomic analysis found this deletion coincides with a region displaying strong regulatory potential in ovarian tissue, but not in breast epithelial cells. This study highlighted the need to verify CNVs in vitro, but also provides evidence that experimentally validated CNVs (with plausible biological consequences) can modify risk of breast or ovarian cancer in BRCA1 pathogenic variant carriers. PMID:28145423
Transstadial Effects of Bti on Traits of Aedes aegypti and Infection with Dengue Virus
Alto, Barry W.; Lord, Cynthia C.
2016-01-01
Most mosquito control efforts are primarily focused on reducing the adult population size mediated by reductions in the larval population, which should lower risk of disease transmission. Although the aim of larviciding is to reduce larval abundance and thus recruitment of adults, nonlethal effects on adults are possible, including transstadial effects on phenotypes of adults such as survival and pathogen infection and transmission. In addition, the mortality induced by control efforts may act in conjunction with other sources of mosquito mortality in nature. The consequences of these effects and interactions may alter the potential of the population to transmit pathogens. We tested experimentally the combined effects of a larvicide (Bacillus thuringiensis ssp. israelensis, Bti) and competition during the larval stages on subsequent Aedes aegypti (Linnaeus) traits, population performance, and susceptibility to dengue-1 virus infection. Ae. aegypti that survived exposure to Bti experienced accelerated development, were larger, and produced more eggs with increasing amounts of Bti, consistent with competitive release among surviving mosquitoes. Changing larval density had no significant interactive effect with Bti treatment on development and growth to adulthood. Larval density, but not Bti or treatment interaction, had a strong effect on survival of adult Ae. aegypti females. There were sharper declines in cumulative daily survival of adults from crowded than uncrowded larval conditions, suggesting that high competition conditions of larvae may be an impediment to transmission of dengue viruses. Rates of infection and dengue-1 virus disseminated infections were found to be 87±13% and 88±12%, respectively. There were no significant treatment effects on infection measurements. Our findings suggest that larvicide campaigns using Bti may reduce the number of emerged adults, but survivors will have a fitness advantage (growth, development, enhanced production of eggs) relative to conspecifics that are not under larvicide pressure. However, under most circumstances, these transstadial effects are unlikely to outweigh reductions in the adult population by Bti and altered risk of disease transmission. PMID:26871951
Riera, Cristina; Girona-Llobera, Enrique; Guillen, Carmen; Iniesta, Laura; Alcover, Magdalena; Berenguer, Diana; Pujol, Alba; Tomás-Pérez, Miriam; Cancino-Faure, Beatriz; Serra, Teresa; Mascaró, Martín; Gascó, Joan; Fisa, Roser
2018-01-01
Background In the Balearic Islands, as in other areas of the Mediterranean basin, there is a significant proportion of asymptomatic Leishmania (L.) infantum-infected blood donors, who may represent an important threat to transfusion safety. The Balearic Islands blood bank, located in an area endemic for L. infantum, carried out a study of donors and patients to investigate the impact of this infectious disease on blood safety in the region. Materials and methods Twenty asymptomatic Leishmania-infected blood donors were followed-up between 2008 and 2011 to investigate the evolution of Leishmania infection in asymptomatic carriers. Their blood was periodically tested for anti-Leishmania antibodies by western blot and for Leishmania DNA by quantitative polymerase chain reaction (qPCR). Additionally, the prevalence of L. infantum infection was investigated in a group of 68 multiply transfused patients to ascertain the risk of transfusion-transmitted leishmaniasis (TTL) in the region, taking into account regular blood component production practices such as pre-storage leucodepletion and pathogen reduction technology. Results All 20 donors remained asymptomatic over the study period (2008–2011). Most donors had repeatedly positive qPCR results, either persistently or intermittently, but showed no symptoms of Leishmaniasis. Levels of parasitaemia were remarkably low in asymptomatic donors, with values ≤1 parasite/mL. Despite multiple transfusions received over 15 years, no transfused patient studied was infected with L. infantum. Discussion L. infantum-infected donors can remain asymptomatic for at least 3 years. In our region, no cases of TTL were detected, despite an active search in multiply transfused patients. This seems to be related to two independent variables: (i) a low concentration of the parasite in the peripheral blood of asymptomatic carriers and (ii) the application of methods with proven efficacy against TTL, such as leucodepletion and pathogen reduction technology. PMID:28488962
Gilbert, Gregory S; Magarey, Roger; Suiter, Karl; Webb, Campbell O
2012-01-01
Assessing risk from a novel pest or pathogen requires knowing which local plant species are susceptible. Empirical data on the local host range of novel pests are usually lacking, but we know that some pests are more likely to attack closely related plant species than species separated by greater evolutionary distance. We use the Global Pest and Disease Database, an internal database maintained by the United States Department of Agriculture Animal and Plant Health Inspection Service – Plant Protection and Quarantine Division (USDA APHIS-PPQ), to evaluate the strength of the phylogenetic signal in host range for nine major groups of plant pests and pathogens. Eight of nine groups showed significant phylogenetic signal in host range. Additionally, pests and pathogens with more known hosts attacked a phylogenetically broader range of hosts. This suggests that easily obtained data – the number of known hosts and the phylogenetic distance between known hosts and other species of interest – can be used to predict which plant species are likely to be susceptible to a particular pest. This can facilitate rapid assessment of risk from novel pests and pathogens when empirical host range data are not yet available and guide efficient collection of empirical data for risk evaluation. PMID:23346231
Gilbert, Gregory S; Magarey, Roger; Suiter, Karl; Webb, Campbell O
2012-12-01
Assessing risk from a novel pest or pathogen requires knowing which local plant species are susceptible. Empirical data on the local host range of novel pests are usually lacking, but we know that some pests are more likely to attack closely related plant species than species separated by greater evolutionary distance. We use the Global Pest and Disease Database, an internal database maintained by the United States Department of Agriculture Animal and Plant Health Inspection Service - Plant Protection and Quarantine Division (USDA APHIS-PPQ), to evaluate the strength of the phylogenetic signal in host range for nine major groups of plant pests and pathogens. Eight of nine groups showed significant phylogenetic signal in host range. Additionally, pests and pathogens with more known hosts attacked a phylogenetically broader range of hosts. This suggests that easily obtained data - the number of known hosts and the phylogenetic distance between known hosts and other species of interest - can be used to predict which plant species are likely to be susceptible to a particular pest. This can facilitate rapid assessment of risk from novel pests and pathogens when empirical host range data are not yet available and guide efficient collection of empirical data for risk evaluation.
Early cancer diagnoses through BRCA1/2 screening of unselected adult biobank participants
Buchanan, Adam H; Manickam, Kandamurugu; Meyer, Michelle N; Wagner, Jennifer K; Hallquist, Miranda L G; Williams, Janet L; Rahm, Alanna Kulchak; Williams, Marc S; Chen, Zong-Ming E; Shah, Chaitali K; Garg, Tullika K; Lazzeri, Amanda L; Schwartz, Marci L B; Lindbuchler, D'Andra M; Fan, Audrey L; Leeming, Rosemary; Servano, Pedro O; Smith, Ashlee L; Vogel, Victor G; Abul-Husn, Noura S; Dewey, Frederick E; Lebo, Matthew S; Mason-Suares, Heather M; Ritchie, Marylyn D; Davis, F Daniel; Carey, David J; Feinberg, David T; Faucett, W Andrew; Ledbetter, David H; Murray, Michael F
2018-01-01
Purpose The clinical utility of screening unselected individuals for pathogenic BRCA1/2 variants has not been established. Data on cancer risk management behaviors and diagnoses of BRCA1/2-associated cancers can help inform assessments of clinical utility. Methods Whole-exome sequences of participants in the MyCode Community Health Initiative were reviewed for pathogenic/likely pathogenic BRCA1/2 variants. Clinically confirmed variants were disclosed to patient–participants and their clinicians. We queried patient–participants’ electronic health records for BRCA1/2-associated cancer diagnoses and risk management that occurred within 12 months after results disclosure, and calculated the percentage of patient–participants of eligible age who had begun risk management. Results Thirty-seven MyCode patient–participants were unaware of their pathogenic/likely pathogenic BRCA1/2 variant, had not had a BRCA1/2-associated cancer, and had 12 months of follow-up. Of the 33 who were of an age to begin BRCA1/2-associated risk management, 26 (79%) had performed at least one such procedure. Three were diagnosed with an early-stage, BRCA1/2-associated cancer—including a stage 1C fallopian tube cancer—via these procedures. Conclusion Screening for pathogenic BRCA1/2 variants among unselected individuals can lead to occult cancer detection shortly after disclosure. Comprehensive outcomes data generated within our learning healthcare system will aid in determining whether population-wide BRCA1/2 genomic screening programs offer clinical utility. PMID:29261187
2013-01-01
Background The US CDC estimates over 2 million foodborne illnesses are annually caused by 4 major enteropathogens: non-typhoid Salmonella spp., Campylobacter spp., Shigella spp. and Yersinia enterocoltica. While data suggest a number of costly and morbid chronic sequelae associated with these infections, pathogen-specific risk estimates are lacking. We utilized a US Department of Defense medical encounter database to evaluate the risk of several gastrointestinal disorders following select foodborne infections. Methods We identified subjects with acute gastroenteritis between 1998 to 2009 attributed to Salmonella (nontyphoidal) spp., Shigella spp., Campylobacter spp. or Yersinia enterocolitica and matched each with up to 4 unexposed subjects. Medical history was analyzed for the duration of military service time (or a minimum of 1 year) to assess for incident chronic gastrointestinal disorders. Relative risks were calculated using modified Poisson regression while controlling for the effect of covariates. Results A total of 1,753 pathogen-specific gastroenteritis cases (Campylobacter: 738, Salmonella: 624, Shigella: 376, Yersinia: 17) were identified and followed for a median of 3.8 years. The incidence (per 100,000 person-years) of PI sequelae among exposed was as follows: irritable bowel syndrome (IBS), 3.0; dyspepsia, 1.8; constipation, 3.9; gastroesophageal reflux disease (GERD), 9.7. In multivariate analyses, we found pathogen-specific increased risk of IBS, dyspepsia, constipation and GERD. Conclusions These data confirm previous studies demonstrating risk of chronic gastrointestinal sequelae following bacterial enteric infections and highlight additional preventable burden of disease which may inform better food security policies and practices, and prompt further research into pathogenic mechanisms. PMID:23510245
Holstege, Henne; van der Lee, Sven J; Hulsman, Marc; Wong, Tsz Hang; van Rooij, Jeroen GJ; Weiss, Marjan; Louwersheimer, Eva; Wolters, Frank J; Amin, Najaf; Uitterlinden, André G; Hofman, Albert; Ikram, M Arfan; van Swieten, John C; Meijers-Heijboer, Hanne; van der Flier, Wiesje M; Reinders, Marcel JT; van Duijn, Cornelia M; Scheltens, Philip
2017-01-01
Accumulating evidence suggests that genetic variants in the SORL1 gene are associated with Alzheimer disease (AD), but a strategy to identify which variants are pathogenic is lacking. In a discovery sample of 115 SORL1 variants detected in 1908 Dutch AD cases and controls, we identified the variant characteristics associated with SORL1 variant pathogenicity. Findings were replicated in an independent sample of 103 SORL1 variants detected in 3193 AD cases and controls. In a combined sample of the discovery and replication samples, comprising 181 unique SORL1 variants, we developed a strategy to classify SORL1 variants into five subtypes ranging from pathogenic to benign. We tested this pathogenicity screen in SORL1 variants reported in two independent published studies. SORL1 variant pathogenicity is defined by the Combined Annotation Dependent Depletion (CADD) score and the minor allele frequency (MAF) reported by the Exome Aggregation Consortium (ExAC) database. Variants predicted strongly damaging (CADD score >30), which are extremely rare (ExAC-MAF <1 × 10−5) increased AD risk by 12-fold (95% CI 4.2–34.3; P=5 × 10−9). Protein-truncating SORL1 mutations were all unknown to ExAC and occurred exclusively in AD cases. More common SORL1 variants (ExAC-MAF≥1 × 10−5) were not associated with increased AD risk, even when predicted strongly damaging. Findings were independent of gender and the APOE-ε4 allele. High-risk SORL1 variants were observed in a substantial proportion of the AD cases analyzed (2%). Based on their effect size, we propose to consider high-risk SORL1 variants next to variants in APOE, PSEN1, PSEN2 and APP for personalized risk assessments in clinical practice. PMID:28537274
Pearson, Rachel E. Goeriz; Miller, Amy K.; Ziobro, George C.
2012-01-01
Although flies are important vectors of food-borne pathogens, there is little information to accurately assess the food-related health risk of the presence of individual flies, especially in urban areas. This study quantifies the prevalence and the relative risk of food-borne pathogens associated with the body surfaces and guts of individual wild flies. One hundred flies were collected from the dumpsters of 10 randomly selected urban restaurants. Flies were identified using taxonomic keys before being individually dissected. Cronobacter spp., Salmonella spp., and Listeria monocytogenes were detected using the PCR-based BAX system Q7. Positive samples were confirmed by culture on specific media and through PCR amplification and sequencing or ribotyping. Among collected flies were the housefly, Musca domestica (47%), the blowflies, Lucilia cuprina (33%) and Lucilia sericata (14%), and others (6%). Cronobacter species were detected in 14% of flies, including C. sakazakii, C. turicensis, and C. universalis, leading to the proposal of flies as a natural reservoir of this food-borne pathogen. Six percent of flies carried Salmonella enterica, including the serovars Poona, Hadar, Schwarzengrund, Senftenberg, and Brackenridge. L. monocytogenes was detected in 3% of flies. Overall, the prevalence of food-borne pathogens was three times greater in the guts than on the body surfaces of the flies. The relative risk of flies carrying any of the three pathogens was associated with the type of pathogen, the body part of the fly, and the ambient temperature. These data enhance the ability to predict the microbiological risk associated with the presence of individual flies in food and food facilities. PMID:22941079
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
... Draft Microbial Risk Assessment Guideline: Pathogenic Microorganisms With Focus on Food and Water AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: The Agency is announcing that Eastern Research... Water. EPA previously announced the release of the draft guidance for a 60 day comment period (76 FR...
Re-growth of fecal indicator bacteria and Escherichia coli 0157:H7 B6914 in cow fecal extract
The health risks that pathogens pose to water and food resources are highly dependent on their fate and transport in agricultural settings. In order to assess these risks, an understanding of the factors that influence pathogen fate in agricultural settings is needed and is criti...
Modeling Dental Health Care Workers' Risk of Occupational Infection from Bloodborne Pathogens.
ERIC Educational Resources Information Center
Capilouto, Eli; And Others
1990-01-01
The brief paper offers a model which permits quantification of the dental health care workers' risk of occupationally acquiring infection from bloodborne pathogens such as human immunodeficiency virus and hepatitis B virus. The model incorporates five parameters such as the probability that any individual patient is infected and number of patients…
Human milk pasteurization: benefits and risks.
O'Connor, Deborah L; Ewaschuk, Julia B; Unger, Sharon
2015-05-01
Recent findings substantiate that the optimal method of nourishing preterm, very low birth weight infants (VLBW, born <1500 g) is through appropriately nutrient-enriched human milk, which is frequently provided as pasteurized donor milk. The availability of donor milk for VLBW infants during initial hospitalization continues to increase with the launch of new milk banks in North America. The majority of North American neonatal ICUs now have written policies governing the provision of donor milk. The purpose of this review is to summarize recent evidence regarding the risks and benefits of pasteurization of human milk and outcomes associated with its provision to VLBW preterm infants. Studies investigating the impact of collection, storage and pasteurization on the bacteriostatic, immunologic and nutritional aspects of human milk continue to be published, generally revealing a partial, but not complete reduction in bioactivity. Risk of contamination of pasteurized donor human milk with pathogenic agents is mitigated through pasteurization. New pasteurization methods aiming to maintain the safety of pooled human milk while better preserving bioactivity are under investigation. Provision of a human milk-derived diet to preterm VLBW infants is associated with improved outcomes.
Gyawali, P
2018-02-01
Raw and partially treated wastewater has been widely used to maintain the global water demand. Presence of viable helminth ova and larvae in the wastewater raised significant public health concern especially when used for agriculture and aquaculture. Depending on the prevalence of helminth infections in communities, up to 1.0 × 10 3 ova/larvae can be presented per litre of wastewater and 4 gm (dry weight) of sludge. Multi-barrier approaches including pathogen reduction, risk assessment, and exposure reduction have been suggested by health regulators to minimise the potential health risk. However, with a lack of a sensitive and specific method for the quantitative detection of viable helminth ova from wastewater, an accurate health risk assessment is difficult to achieve. As a result, helminth infections are difficult to control from the communities despite two decades of global effort (mass drug administration). Molecular methods can be more sensitive and specific than currently adapted culture-based and vital stain methods. The molecular methods, however, required more and thorough investigation for its ability with accurate quantification of viable helminth ova/larvae from wastewater and sludge samples. Understanding different cell stages and corresponding gene copy numbers is pivotal for accurate quantification of helminth ova/larvae in wastewater samples. Identifying specific genetic markers including protein, lipid, and metabolites using multiomics approach could be utilized for cheap, rapid, sensitive, specific and point of care detection tools for helminth ova and larva in the wastewater.
40 CFR 503.10 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-07-01
... pathogen requirements in § 503.32(a); and one of the vector attraction reduction requirements in § 503.33(b... of the vector attraction reduction requirements in § 503.33(b)(1) through (b)(8). (2) The Regional... requirements in § 503.32(a); and one of the vector attraction reduction requirements in § 503.33(b)(1) through...
USDA-ARS?s Scientific Manuscript database
The produce safety research objectives of Research Project 1935-41420-011 are to 1) understand pathogen microbial ecology and its effects on decontamination efficacy; 2) develop biological-based intervention strategies for pathogen reduction; and 3) develop new effective chemical and physical decont...
USDA-ARS?s Scientific Manuscript database
Washing and sanitizing agents have been effective in reducing bacterial populations and pathogen presence on carcasses. Thermal interventions consistently provide the greatest pathogen reductions and can be applied during slaughter in a number of different forms, either as a whole carcass wash, or t...
USDA-ARS?s Scientific Manuscript database
Chlorinated water is widely used as the primary anti-microbial intervention during fresh-cut produce processing. Free chlorine in chlorinated water can provide effective reduction of potential contaminations by microbial pathogens, and, more importantly, effectively prevent cross contamination of p...
Assessment of the pathogenicity of cell-culture-adapted Newcastle disease virus strain Komarov.
Visnuvinayagam, Sivam; Thangavel, K; Lalitha, N; Malmarugan, S; Sukumar, Kuppannan
2015-01-01
Newcastle disease vaccines hitherto in vogue are produced from embryonated chicken eggs. Egg-adapted mesogenic vaccines possess several drawbacks such as paralysis and mortality in 2-week-old chicks and reduced egg production in the egg-laying flock. Owing to these possible drawbacks, we attempted to reduce the vaccine virulence for safe vaccination by adapting the virus in a chicken embryo fibroblast cell culture (CEFCC) system. Eighteen passages were carried out by CEFCC, and the pathogenicity was assessed on the basis of the mean death time, intracerebral pathogenicity index, and intravenous pathogenicity index, at equal passage intervals. Although the reduction in virulence demonstrated with increasing passage levels in CEFCC was encouraging, 20% of the 2-week-old birds showed paralytic symptoms with the virus vaccine from the 18(th)(final) passage. Thus, a tissue-culture-adapted vaccine would demand a few more passages by CEFCC in order to achieve a complete reduction in virulence for use as a safe and effective vaccine, especially among younger chicks. Moreover, it can be safely administered even to unprimed 8-week-old birds.
Removal of contaminants and pathogens from secondary effluents using intermittent sand filters.
Bali, Mahmoud; Gueddari, Moncef; Boukchina, Rachid
2011-01-01
Intermittent infiltration percolation of wastewater through unsaturated sand bed is an extensive treatment technique aimed at eliminating organic matter, oxidizing ammonium and removing pathogens. The main purpose of this study was to determine the depuration efficiencies of a sand filter to remove contaminants from secondary wastewater effluents. Elimination of pathogenic bacteria (total and faecal coliforms, streptococci) and their relationship with the filter depth were investigated. Results showed a high capacity of infiltration percolation process to treat secondary effluents. Total elimination of suspended solids was obtained. Mean removal rate of BOD(5) and COD was more than 97 and more than 81%, respectively. Other water quality parameters such as NH(4)-N, TKN and PO(4)-P showed significant reduction except NO(3)-N which increased significantly in the filtered water. Efficiency of pathogenic bacteria removal was shown to mainly depend on the filter depth. Average reductions of 2.35 log total coliforms, 2.47 log faecal coliforms and 2.11 log faecal streptococci were obtained. The experimental study has shown the influence of the temperature on the output purification of infiltration percolation process.
Olivos-García, Alfonso; Saavedra, Emma; Nequiz, Mario; Santos, Fabiola; Luis-García, Erika Rubí; Gudiño, Marco; Pérez-Tamayo, Ruy
2016-05-01
Several species belonging to the genus Entamoeba can colonize the mouth or the human gut; however, only Entamoeba histolytica is pathogenic to the host, causing the disease amoebiasis. This illness is responsible for one hundred thousand human deaths per year worldwide, affecting mainly underdeveloped countries. Throughout its entire life cycle and invasion of human tissues, the parasite is constantly subjected to stress conditions. Under in vitro culture, this microaerophilic parasite can tolerate up to 5 % oxygen concentrations; however, during tissue invasion the parasite has to cope with the higher oxygen content found in well-perfused tissues (4-14 %) and with reactive oxygen and nitrogen species derived from both host and parasite. In this work, the role of the amoebic oxygen reduction pathway (ORP) and heat shock response (HSP) are analyzed in relation to E. histolytica pathogenicity. The data suggest that in contrast with non-pathogenic E. dispar, the higher level of ORP and HSPs displayed by E. histolytica enables its survival in tissues by diminishing and detoxifying intracellular oxidants and repairing damaged proteins to allow metabolic fluxes, replication and immune evasion.
Ravishankar, Sadhana; Zhu, Libin; Olsen, Carl W; McHugh, Tara H; Friedman, Mendel
2009-10-01
Apple-based edible films containing plant antimicrobials were evaluated for their activity against pathogenic bacteria on meat and poultry products. Salmonella enterica or E. coli O157:H7 (10(7) CFU/g) cultures were surface inoculated on chicken breasts and Listeria monocytogenes (10(6) CFU/g) on ham. The inoculated products were then wrapped with edible films containing 3 concentrations (0.5%, 1.5%, and 3%) of cinnamaldehyde or carvacrol. Following incubation at either 23 or 4 degrees C for 72 h, samples were stomached in buffered peptone water, diluted, and plated for enumeration of survivors. The antimicrobial films exhibited concentration-dependent activities against the pathogens tested. At 23 degrees C on chicken breasts, films with 3% antimicrobials showed the highest reductions (4.3 to 6.8 log CFU/g) of both S. enterica and E. coli O157:H7. Films with 1.5% and 0.5% antimicrobials showed 2.4 to 4.3 and 1.6 to 2.8 log reductions, respectively. At 4 degrees C, carvacrol exhibited greater activity than did cinnamaldehyde. Films with 3%, 1.5%, and 0.5% carvacrol reduced the bacterial populations by about 3, 1.6 to 3, and 0.8 to 1 logs, respectively. Films with 3% and 1.5% cinnamaldehyde induced 1.2 to 2.8 and 1.2 to 1.3 log reductions, respectively. For L. monocytogenes on ham, carvacrol films induced greater reductions than did cinnamaldehyde films at all concentrations tested. In general, the reduction of L. monocytogenes on ham at 23 degrees C was greater than at 4 degrees C. Added antimicrobials had minor effects on physical properties of the films. The results suggest that the food industry and consumers could use these films as wrappings to control surface contamination by foodborne pathogenic microorganisms.
Schijven, Jack; Forêt, Jean Marie; Chardon, Jurgen; Teunis, Peter; Bouwknegt, Martijn; Tangena, Ben
2016-06-01
Drinking water distribution networks are vulnerable to accidental or intentional contamination events. The objective of this study was to investigate the effects of seeding duration and concentration, exposure pathway (ingestion via drinking of water and tooth brushing and inhalation by taking a shower) and pathogen infectivity on exposure and infection risk in the case of an intentional pathogenic contamination in a drinking water distribution network. Seeding of a pathogen for 10 min and 120 min, and subsequent spreading through a drinking water distribution network were simulated. For exposure via drinking, actual data on drinking events and volumes were used. Ingestion of a small volume of water by tooth brushing twice a day by every person in the network was assumed. Inhalation of contaminated aerosol droplets took place when taking a shower. Infection risks were estimated for pathogens with low (r = 0.0001) and high (r = 0.1) infectivity. In the served population (48 000 persons) and within 24 h, about 1400 persons were exposed to the pathogen by ingestion of water in the 10-min seeding scenario and about 3400 persons in the 120-min scenario. The numbers of exposed persons via tooth brushing were about the same as via drinking of water. Showering caused (inhalation) exposure in about 450 persons in the 10-min scenario and about 1500 in the 120-min scenario. Regardless of pathogen infectivity, if the seeding concentration is 10(6) pathogens per litre or more, infection risks are close to one. Exposure by taking a shower is of relevance if the pathogen is highly infectious via inhalation. A longer duration of the seeding of a pathogen increases the probability of exposure. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fate and transport of pathogens in lakes and reservoirs.
Brookes, Justin D; Antenucci, Jason; Hipsey, Matthew; Burch, Michael D; Ashbolt, Nicholas J; Ferguson, Christobel
2004-07-01
Outbreaks of water-borne disease via public water supplies continue to be reported in developed countries even though there is increased awareness of, and treatment for, pathogen contamination. Pathogen episodes in lakes and reservoirs are often associated with rain events, and the riverine inflow is considered to be major source of pathogens. Consequently, the behaviour of these inflows is of particular importance in determining pathogen transport and distribution. Inflows are controlled by their density relative to that of the lake, such that warm inflows will flow over the surface of the lake as a buoyant surface flow and cold, dense inflows will sink beneath the lake water where they will flow along the bathymetry towards the deepest point. The fate of pathogens is determined by loss processes including settling and inactivation by temperature, UV and grazing. The general trend is for the insertion timescale to be shortest, followed by sedimentation losses and temperature inactivity. The fate of Cryptosporidium due to UV light inactivation can occur at opposite ends of the scale, depending on the location of the oocysts in the water column and the extinction coefficient for UV light. For this reason, the extinction coefficient for UV light appears to be a vitally important parameter for determining the risk of Cryptosporidium contamination. For risk assessment of pathogens in supply reservoirs, it is important to understand the role of hydrodynamics in determining the timescale of transport to the off-take relative to the timescale of inactivation. The characteristics of the riverine intrusion must also be considered when designing a sampling program for pathogens. A risk management framework is presented that accounts for pathogen fate and transport for reservoirs.
Zhao, Dongjun; Barrientos, Jessie Usaga; Wang, Qing; Markland, Sarah M; Churey, John J; Padilla-Zakour, Olga I; Worobo, Randy W; Kniel, Kalmia E; Moraru, Carmen I
2015-04-01
Thermal pasteurization can achieve the U. S. Food and Drug Administration-required 5-log reduction of pathogenic Escherichia coli O157:H7 and Cryptosporidium parvum in apple juice and cider, but it can also negatively affect the nutritional and organoleptic properties of the treated products. In addition, thermal pasteurization is only marginally effective against the acidophilic, thermophilic, and spore-forming bacteria Alicyclobacillus spp., which is known to cause off-flavors in juice products. In this study, the efficiency of a combined microfiltration (MF) and UV process as a nonthermal treatment for the reduction of pathogenic and nonpathogenic E. coli, C. parvum, and Alicyclobacillus acidoterrestris from apple cider was investigated. MF was used to physically remove suspended solids and microorganisms from apple cider, thus enhancing the effectiveness of UV and allowing a lower UV dose to be used. MF, with ceramic membranes (pore sizes, 0.8 and 1.4 μm), was performed at a temperature of 10 °C and a transmembrane pressure of 155 kPa. The subsequent UV treatment was conducted using at a low UV dose of 1.75 mJ/cm(2). The combined MF and UV achieved more than a 5-log reduction of E. coli, C. parvum, and A. acidoterrestris. MF with the 0.8-μm pore size performed better than the 1.4-μm pore size on removal of E. coli and A. acidoterrestris. The developed nonthermal hurdle treatment has the potential to significantly reduce pathogens, as well as spores, yeasts, molds, and protozoa in apple cider, and thus help juice processors improve the safety and quality of their products.
Nitric Oxide in the Offensive Strategy of Fungal and Oomycete Plant Pathogens
Arasimowicz-Jelonek, Magdalena; Floryszak-Wieczorek, Jolanta
2016-01-01
In the course of evolutionary changes pathogens have developed many invasion strategies, to which the host organisms responded with a broad range of defense reactions involving endogenous signaling molecules, such as nitric oxide (NO). There is evidence that pathogenic microorganisms, including two most important groups of eukaryotic plant pathogens, also acquired the ability to synthesize NO via non-unequivocally defined oxidative and/or reductive routes. Although the both kingdoms Chromista and Fungi are remarkably diverse, the experimental data clearly indicate that pathogen-derived NO is an important regulatory molecule controlling not only developmental processes, but also pathogen virulence and its survival in the host. An active control of mitigation or aggravation of nitrosative stress within host cells seems to be a key determinant for the successful invasion of plant pathogens representing different lifestyles and an effective mode of dispersion in various environmental niches. PMID:26973690
Praveen, Chandni; Jesudhasan, Palmy R; Reimers, Robert S; Pillai, Suresh D
2013-09-01
Microbial pathogens in municipal sewage sludges need to be inactivated prior to environmental disposal. The efficacy of high energy (10 MeV) e-beam irradiation to inactivate a variety of selected microbial pathogens and indicator organisms in aerobically and anaerobically digested sewage sludge was evaluated. Both bacterial and viral pathogens and indicator organisms are susceptible to e-beam irradiation. However, as expected there was a significant difference in their respective e-beam irradiation sensitivity. Somatic coliphages, bacterial endospores and enteric viruses were more resistant compared to bacterial pathogens. The current US EPA mandated 10 kGy minimum dose was capable of achieving significant reduction of both bacterial and viral pathogens. Somatic coliphages can be used as a microbial indicator for monitoring e-beam processes in terms of pathogen inactivation in sewage sludges. Copyright © 2013 Elsevier Ltd. All rights reserved.
Carter, C. J.
2011-01-01
Many genes have been implicated in schizophrenia as have viral prenatal or adult infections and toxoplasmosis or Lyme disease. Several autoantigens also target key pathology-related proteins. These factors are interrelated. Susceptibility genes encode for proteins homologous to those of the pathogens while the autoantigens are homologous to pathogens' proteins, suggesting that the risk-promoting effects of genes and risk factors are conditional upon each other, and dependent upon protein matching between pathogen and susceptibility gene products. Pathogens' proteins may act as dummy ligands, decoy receptors, or via interactome interference. Many such proteins are immunogenic suggesting that antibody mediated knockdown of multiple schizophrenia gene products could contribute to the disease, explaining the immune activation in the brain and lymphocytes in schizophrenia, and the preponderance of immune-related gene variants in the schizophrenia genome. Schizophrenia may thus be a “pathogenetic” autoimmune disorder, caused by pathogens, genes, and the immune system acting together, and perhaps preventable by pathogen elimination, or curable by the removal of culpable antibodies and antigens. PMID:22567321
Evaluation of a microwave based reactor for the treatment of blackwater sludge
Mawioo, Peter M.; Rweyemamu, Audax; Garcia, Hector A.; Hooijmans, Christine M.; Brdjanovic, Damir
2016-01-01
A laboratory-scale microwave (MW) unit was applied to treat fresh blackwater sludge that represented fecal sludge (FS) produced at heavily used toilet facilities. The sludge was exposed to MW irradiation at different power levels and for various durations. Variables such as sludge volume and pathogen reduction were observed. The results demonstrated that the MW is a rapid and efficient technology that can reduce the sludge volume by over 70% in these experimental conditions. The concentration of bacterial pathogenic indicator E. coli also decreased to below the analytical detection levels. Furthermore, the results indicated that the MW operational conditions including radiation power and contact time can be varied to achieve the desired sludge volume and pathogen reduction. MW technology can be further explored for the potential scaling-up as an option for rapid treatment of FS from intensively used sanitation facilities such as in emergency situations. PMID:26799809
NASA Astrophysics Data System (ADS)
Chandrasena, G. I.; Deletic, A.; McCarthy, D. T.
2016-06-01
Knowledge of pathogen removal in stormwater biofilters (also known as stormwater bioretention systems or rain gardens) has predominately been determined using bacterial indicators, and the removal of reference pathogens in these systems has rarely been investigated. Furthermore, current understanding of indicator bacteria removal in these systems is largely built upon laboratory-scale work. This paper examines whether indicator organism removal from urban stormwater using biofilters in laboratory settings are representative of the removal of pathogens in field conditions, by studying the removal of Escherichia coli (a typical indicator microorganism) and Campylobacter spp. (a typical reference pathogen) from urban stormwater by two established field-scale biofilters. It was found that E. coli log reduction was higher than that of Campylobacter spp. in both biofilters, and that there was no correlation between E. coli and Campylobacter spp. log removal performance. This confirms that E. coli behaves significantly differently to this reference pathogen, reinforcing that single organisms should not be employed to understand faecal microorganism removal in urban stormwater treatment systems. The average reduction in E. coli from only one of the tested biofilters was able to meet the log reduction targets suggested in the current Australian stormwater harvesting guidelines for irrigating sports fields and golf courses. The difference in the performance of the two biofilters is likely a result of a number of design and operational factors; the most important being that the biofilter that did not meet the guidelines was tested using extremely high influent volumes and microbial concentrations, and long antecedent dry weather periods. As such, the E. coli removal performances identified in this study confirmed laboratory findings that inflow concentration and antecedent dry period impact overall microbial removal. In general, this paper emphasizes the need for the validation of stormwater harvesting systems, namely, the testing of treatment systems under challenging operational conditions using multiple indicators and reference pathogens.
Miller, Melissa A.; Byrne, Barbara A.; Jang, Spencer S.; Dodd, Erin M.; Dorfmeier, Elene; Harris, Michael D.; Ames, Jack; Paradies, David; Worcester, Karen; Jessup, David A.; Miller, Woutrina A.
2009-01-01
Although protected for nearly a century, California’s sea otters have been slow to recover, in part due to exposure to fecally-associated protozoal pathogens like Toxoplasma gondii and Sarcocystis neurona. However, potential impacts from exposure to fecal bacteria have not been systematically explored. Using selective media, we examined feces from live and dead sea otters from California for specific enteric bacterial pathogens (Campylobacter, Salmonella, Clostridium perfringens, C. difficile and Escherichia coli O157:H7), and pathogens endemic to the marine environment (Vibrio cholerae, V. parahaemolyticus and Plesiomonas shigelloides). We evaluated statistical associations between detection of these pathogens in otter feces and demographic or environmental risk factors for otter exposure, and found that dead otters were more likely to test positive for C. perfringens, Campylobacter and V. parahaemolyticus than were live otters. Otters from more urbanized coastlines and areas with high freshwater runoff (near outflows of rivers or streams) were more likely to test positive for one or more of these bacterial pathogens. Other risk factors for bacterial detection in otters included male gender and fecal samples collected during the rainy season when surface runoff is maximal. Similar risk factors were reported in prior studies of pathogen exposure for California otters and their invertebrate prey, suggesting that land-sea transfer and/or facilitation of pathogen survival in degraded coastal marine habitat may be impacting sea otter recovery. Because otters and humans share many of the same foods, our findings may also have implications for human health. PMID:19720009
Background/Question/Methods Bacterial pathogens in surface water present disease risks to aquatic communities and for human recreational activities. Sources of these pathogens include runoff from urban, suburban, and agricultural point and non-point sources, but hazardous micr...
Soobhany, Nuhaa
2018-01-15
The use of composts or vermicomposts derived from organic fraction of Municipal Solid Waste (OFMSW) brought about certain disagreement in terms of high level of bacterial pathogens, thereby surpassing the legal restrictions. This preliminary study was undertaken to compare the evolution of pathogenic bacteria on OFMSW compost against vermicompost (generated by Eudrilus eugeniae) with promises of achieving sanitation goals. Analysis to quality data showed that OFMSW vermicomposting caused a moderately higher reduction in total coliforms in contrast to composting. E. coli in OFMSW composts was found to be in the range of 4.72-4.96 log 10 CFU g -1 whilst on a clear contrary, E. coli was undetectable in the final vermicomposts (6.01-6.14 logs of reduction) which might be explained by the involvement of the digestive processes in worms' guts. Both OFMSW composts and vermicomposts generated Salmonella-free products which were acceptable for agricultural usage and soil improvement. In comparison to compost, the analysis of this research indicated that earthworm activity can effectively destroy bacterial pathogenic load in OFMSW vermicomposts. But still, this study necessitates extra research in order to comprehend the factors that direct pathogenic bacteria in vermicomposting and earthworm-free decomposition systems. Copyright © 2017 Elsevier Ltd. All rights reserved.
Biswal, Basanta Kumar; Mazza, Alberto; Masson, Luke; Gehr, Ronald
2013-01-01
Effluents discharged from wastewater treatment plants are possible sources of pathogenic bacteria, including Escherichia coli, in the freshwater environment, and determining the possible selection of pathogens is important. This study evaluated the impact of activated sludge and physicochemical wastewater treatment processes on the prevalence of potentially virulent E. coli. A total of 719 E. coli isolates collected from four municipal plants in Québec before and after treatment were characterized by using a customized DNA microarray to determine the impact of treatment processes on the frequency of specific pathotypes and virulence genes. The percentages of potentially pathogenic E. coli isolates in the plant influents varied between 26 and 51%, and in the effluents, the percentages were 14 to 31%, for a reduction observed at all plants ranging between 14 and 45%. Pathotypes associated with extraintestinal pathogenic E. coli (ExPEC) were the most abundant at three of the four plants and represented 24% of all isolates, while intestinal pathogenic E. coli pathotypes (IPEC) represented 10% of the isolates. At the plant where ExPEC isolates were not the most abundant, a large number of isolates were classified as both ExPEC and IPEC; overall, 6% of the isolates were classified in both groups, with the majority being from the same plant. The reduction of the proportion of pathogenic E. coli could not be explained by the preferential loss of one virulence gene or one type of virulence factor; however, the quinolone resistance gene (qnrS) appears to enhance the loss of virulence genes, suggesting a mechanism involving the loss of pathogenicity islands. PMID:23160132
Prado-Silva, Leonardo; Cadavez, Vasco; Gonzales-Barron, Ursula; Rezende, Ana Carolina B.
2015-01-01
The aim of this study was to perform a meta-analysis of the effects of sanitizing treatments of fresh produce on Salmonella spp., Escherichia coli O157:H7, and Listeria monocytogenes. From 55 primary studies found to report on such effects, 40 were selected based on specific criteria, leading to more than 1,000 data on mean log reductions of these three bacterial pathogens impairing the safety of fresh produce. Data were partitioned to build three meta-analytical models that could allow the assessment of differences in mean log reductions among pathogens, fresh produce, and sanitizers. Moderating variables assessed in the meta-analytical models included type of fresh produce, type of sanitizer, concentration, and treatment time and temperature. Further, a proposal was done to classify the sanitizers according to bactericidal efficacy by means of a meta-analytical dendrogram. The results indicated that both time and temperature significantly affected the mean log reductions of the sanitizing treatment (P < 0.0001). In general, sanitizer treatments led to lower mean log reductions when applied to leafy greens (for example, 0.68 log reductions [0.00 to 1.37] achieved in lettuce) compared to other, nonleafy vegetables (for example, 3.04 mean log reductions [2.32 to 3.76] obtained for carrots). Among the pathogens, E. coli O157:H7 was more resistant to ozone (1.6 mean log reductions), while L. monocytogenes and Salmonella presented high resistance to organic acids, such as citric acid, acetic acid, and lactic acid (∼3.0 mean log reductions). With regard to the sanitizers, it has been found that slightly acidic electrolyzed water, acidified sodium chlorite, and the gaseous chlorine dioxide clustered together, indicating that they possessed the strongest bactericidal effect. The results reported seem to be an important achievement for advancing the global understanding of the effectiveness of sanitizers for microbial safety of fresh produce. PMID:26362982
Barbieri, Nicolle L.; Vande Vorde, Jessica A.; Baker, Alison R.; Horn, Fabiana; Li, Ganwu; Logue, Catherine M.; Nolan, Lisa K.
2017-01-01
Avian pathogenic Escherichia coli (APEC) is the etiologic agent of colibacillosis, an important cause of morbidity and mortality in poultry. Though, many virulence factors associated with APEC pathogenicity are known, their regulation remains unclear. FNR (fumarate and nitrate reduction) is a well-known global regulator that works as an oxygen sensor and has previously been described as a virulence regulator in bacterial pathogens. The goal of this study was to examine the role of FNR in the regulation of APEC virulence factors, such as Type I fimbriae, and processes such as adherence and invasion, type VI secretion, survival during oxidative stress, and growth in iron-restricted environments. To accomplish this goal, APEC O1, a well-characterized, highly virulent, and fully sequenced strain of APEC harboring multiple virulence mechanisms, some of which are plasmid-linked, was compared to its FNR mutant for expression of various virulence traits. Deletion of FNR was found to affect APEC O1's adherence, invasion and expression of ompT, a plasmid-encoded outer membrane protein, type I fimbriae, and aatA, encoding an autotransporter. Indeed, the fnr− mutant showed an 8-fold reduction in expression of type I fimbriae and a highly significant (P < 0.0001) reduction in expression of fimA, ompT (plasmid-borne), and aatA. FNR was also found to regulate expression of the type VI secretion system, affecting the expression of vgrG. Further, FNR was found to be important to APEC O1's growth in iron-deficient media and survival during oxidative stress with the mutant showing a 4-fold decrease in tolerance to oxidative stress, as compared to the wild type. Thus, our results suggest that FNR functions as an important regulator of APEC virulence. PMID:28690981
[Benefit-risk assessment of vaccination strategies].
Hanslik, Thomas; Boëlle, Pierre Yves
2007-04-01
This article summarises the various stages of the risk/benefit assessment of vaccination strategies. Establishing the awaited effectiveness of a vaccination strategy supposes to have an epidemiologic description of the disease to be prevented. The effectiveness of the vaccine strategy will be thus expressed in numbers of cases, hospitalizations or deaths avoided. The effectiveness can be direct, expressed as the reduction of the incidence of the infectious disease in the vaccinated subjects compared to unvaccinated subjects. It can also be indirect, the unvaccinated persons being protected by the suspension in circulation of the pathogenic agent, consecutive to the implementation of the vaccination campaign. The risks of vaccination related to the adverse effects detected during the clinical trials preceding marketing are well quantified, but other risks can occur after marketing: e.g., serious and unexpected adverse effects detected by vaccinovigilance systems, or risk of increase in the age of cases if the vaccination coverage is insufficient. The medico-economic evaluation forms a part of the risks/benefit assessment, by positioning the vaccine strategy comparatively with other interventions for health. Epidemiologic and vaccinovigilance informations must be updated very regularly, which underlines the need for having an operational and reliable real time monitoring system to accompany the vaccination strategies. Lastly, in the context of uncertainty which often accompanies the risks/benefit assessments, it is important that an adapted communication towards the public and the doctors is planned.
Perez, Keila L; Lucia, Lisa M; Cisneros-Zevallos, Luis; Castillo, Alejandro; Taylor, T Matthew
2012-05-01
While there is strong focus on eliminating pathogens from produce at a commercial level, consumers can employ simple methods to achieve additional pathogen reductions in the domestic kitchen. To determine the ability of antimicrobials to decontaminate peppers, samples of green bell pepper were inoculated with Salmonella enterica and Escherichia coli O157:H7 and then immersed in 3% (v/v) hydrogen peroxide (H₂O₂), 2.5% (v/v) acetic acid (AA), 70% (v/v) ethyl alcohol (EtOH), or sterile distilled water (SDW). The potential for transfer of pathogens from contaminated peppers to other non-contaminated produce items, and the effect of knife disinfection in preventing this cross contamination, were also tested. Knife disinfection procedures were evaluated by chopping inoculated peppers into 1 cm² pieces with kitchen knives. Experimental knives were then treated by either no treatment (control), wiping with a dry sterile cotton towel, rinsing under running warm water for 5 or 10s, or applying a 1% (v/v) lauryl sulfate-based detergent solution followed by rinsing with warm running water for 10s. Following disinfection treatment, knives were used to slice cucumbers. Exposure to H₂O₂ for 5 min and EtOH for 1 min resulted in reductions of 1.3±0.3 log₁₀ CFU/cm² for both pathogens. A 5 min exposure to AA resulted in a reduction of S. enterica of 1.0±0.7 log₁₀ CFU/cm² and E. coli of 0.7±0.8 log₁₀ CFU/cm². No differences (p ≥ 0.05) were found between numbers of pathogens on knives and numbers of pathogens transferred to cucumber slices, suggesting that organisms remaining on knife surfaces were transferred to cucumbers during slicing. Findings suggest that EtOH and H₂O₂ may be effective antimicrobials for in-home decontamination of peppers, and that use of detergent and warm water is effective for decontamination of implements used during meal preparation. Published by Elsevier B.V.
Resiliency or restoration: management of sudden oak death before and after outbreak
Richard Cobb; Peter Hartsough; Noam Ross; Janet Klein; David LaFever; Susan Frankel; David Rizzo
2017-01-01
Forests at risk to diseases caused by invasive Phytophthora pathogens can be grouped into two broad classes: those already invaded by the focal pathogen where disease has emerged or those at significant risk of invasion and subsequent emergence of disease. This dichotomy represents distinct management scenarios â treating after or before disease...
Wagner, Denae C; Kass, Philip H; Hurley, Kate F
2018-01-01
Upper respiratory infection (URI) is not an inevitable consequence of sheltering homeless cats. This study documents variation in risk of URI between nine North American shelters; determines whether this reflects variation in pathogen frequency on intake or differences in transmission and expression of disease; and identifies modifiable environmental and group health factors linked to risk for URI. This study demonstrated that although periodic introduction of pathogens into shelter populations may be inevitable, disease resulting from those pathogens is not. Housing and care of cats, particularly during their first week of stay in an animal shelter environment, significantly affects the rate of upper respiratory infection.
Refugee camps, fire disasters and burn injuries.
Atiyeh, B S; Gunn, S W A
2017-09-30
In the past five years, no fewer than 15 conflicts have brought unspeakable tragedy and misery to millions across the world. At present, nearly 20 people are forcibly displaced every minute as a result of conflict or persecution, representing a crisis of historic proportions. Many displaced persons end up in camps generally developing in an impromptu fashion, and are totally dependent on humanitarian aid. The precarious condition of temporary installations puts the nearly 700 refugee camps worldwide at high risk of disease, child soldier and terrorist recruitment, and physical and sexual violence. Poorly planned, densely packed refugee settlements are also one of the most pathogenic environments possible, representing high risk for fires with potential for uncontrolled fire spread and development over sometimes quite large areas. Moreover, providing healthcare to refugees comes with its own unique challenges. Internationally recognized guidelines for minimum standards in shelters and settlements have been set, however they remain largely inapplicable. As for fire risk reduction, and despite the high number of fire incidents, it is not evident that fire safety can justify a higher priority. In that regard, a number of often conflicting influences will need to be considered. The greatest challenge remains in balancing the various risks, such as the need/cost of shelter against the fire risk/cost of fire protection.
Refugee camps, fire disasters and burn injuries
Atiyeh, B.S.; Gunn, S.W.A.
2017-01-01
Summary In the past five years, no fewer than 15 conflicts have brought unspeakable tragedy and misery to millions across the world. At present, nearly 20 people are forcibly displaced every minute as a result of conflict or persecution, representing a crisis of historic proportions. Many displaced persons end up in camps generally developing in an impromptu fashion, and are totally dependent on humanitarian aid. The precarious condition of temporary installations puts the nearly 700 refugee camps worldwide at high risk of disease, child soldier and terrorist recruitment, and physical and sexual violence. Poorly planned, densely packed refugee settlements are also one of the most pathogenic environments possible, representing high risk for fires with potential for uncontrolled fire spread and development over sometimes quite large areas. Moreover, providing healthcare to refugees comes with its own unique challenges. Internationally recognized guidelines for minimum standards in shelters and settlements have been set, however they remain largely inapplicable. As for fire risk reduction, and despite the high number of fire incidents, it is not evident that fire safety can justify a higher priority. In that regard, a number of often conflicting influences will need to be considered. The greatest challenge remains in balancing the various risks, such as the need/cost of shelter against the fire risk/cost of fire protection. PMID:29849526
Ma, Hon Ming; Ip, Margaret; Woo, Jean; Hui, David S C
2014-05-01
Health care-associated pneumonia (HCAP) and drug-resistant bacterial pneumonia may not share identical risk factors. We have shown that bronchiectasis, recent hospitalization and severe pneumonia (confusion, blood urea level, respiratory rate, low blood pressure and 65 year old (CURB-65) score ≥ 3) were independent predictors of pneumonia caused by potentially drug-resistant (PDR) pathogens. This study aimed to develop and validate a clinical risk score for predicting drug-resistant bacterial pneumonia in older patients. We derived a risk score by assigning a weighting to each of these risk factors as follows: 14, bronchiectasis; 5, recent hospitalization; 2, severe pneumonia. A 0.5 point was defined for the presence of other risk factors for HCAP. We compared the areas under the receiver-operating characteristics curve (AUROC) of our risk score and the HCAP definition in predicting PDR pathogens in two cohorts of older patients hospitalized with non-nosocomial pneumonia. The derivation and validation cohorts consisted of 354 and 96 patients with bacterial pneumonia, respectively. PDR pathogens were isolated in 48 and 21 patients in the derivation and validation cohorts, respectively. The AUROCs of our risk score and the HCAP definition were 0.751 and 0.650, respectively, in the derivation cohort, and were 0.782 and 0.671, respectively, in the validation cohort. The differences between our risk score and the HCAP definition reached statistical significance. A score ≥ 2.5 had the best balance between sensitivity and specificity. Our risk score outperformed the HCAP definition to predict pneumonia caused by PDR pathogens. A history of bronchiectasis or recent hospitalization is the major indication of starting empirical broad-spectrum antibiotics. © 2014 Asian Pacific Society of Respirology.
Gilbert, Lucy; Medlock, Jolyon; Hansford, Kayleigh; Thompson, Des BA; Biek, Roman
2017-01-01
Landscape change and altered host abundance are major drivers of zoonotic pathogen emergence. Conservation and biodiversity management of landscapes and vertebrate communities can have secondary effects on vector-borne pathogen transmission that are important to assess. Here we review the potential implications of these activities on the risk of Lyme borreliosis in the United Kingdom. Conservation management activities include woodland expansion, management and restoration, deer management, urban greening and the release and culling of non-native species. Available evidence suggests that increasing woodland extent, implementing biodiversity policies that encourage ecotonal habitat and urban greening can increase the risk of Lyme borreliosis by increasing suitable habitat for hosts and the tick vectors. However, this can depend on whether deer population management is carried out as part of these conservation activities. Exclusion fencing or culling deer to low densities can decrease tick abundance and Lyme borreliosis risk. As management actions often constitute large-scale perturbation experiments, these hold great potential to understand underlying drivers of tick and pathogen dynamics. We recommend integrating monitoring of ticks and the risk of tick-borne pathogens with conservation management activities. This would help fill knowledge gaps and the production of best practice guidelines to reduce risks. This article is part of the themed issue ‘Conservation, biodiversity and infectious disease: scientific evidence and policy implications’. PMID:28438912
Millins, Caroline; Gilbert, Lucy; Medlock, Jolyon; Hansford, Kayleigh; Thompson, Des Ba; Biek, Roman
2017-06-05
Landscape change and altered host abundance are major drivers of zoonotic pathogen emergence. Conservation and biodiversity management of landscapes and vertebrate communities can have secondary effects on vector-borne pathogen transmission that are important to assess. Here we review the potential implications of these activities on the risk of Lyme borreliosis in the United Kingdom. Conservation management activities include woodland expansion, management and restoration, deer management, urban greening and the release and culling of non-native species. Available evidence suggests that increasing woodland extent, implementing biodiversity policies that encourage ecotonal habitat and urban greening can increase the risk of Lyme borreliosis by increasing suitable habitat for hosts and the tick vectors. However, this can depend on whether deer population management is carried out as part of these conservation activities. Exclusion fencing or culling deer to low densities can decrease tick abundance and Lyme borreliosis risk. As management actions often constitute large-scale perturbation experiments, these hold great potential to understand underlying drivers of tick and pathogen dynamics. We recommend integrating monitoring of ticks and the risk of tick-borne pathogens with conservation management activities. This would help fill knowledge gaps and the production of best practice guidelines to reduce risks.This article is part of the themed issue 'Conservation, biodiversity and infectious disease: scientific evidence and policy implications'. © 2017 The Authors.
Lee, Eun-Gyeong; Kang, Hyok Jo; Lim, Myong Cheol; Park, Boyoung; Park, Soo Jin; Jung, So-Youn; Lee, Seeyoun; Kang, Han-Sung; Park, Sang-Yoon; Park, Boram; Joo, Jungnam; Han, Jai Hong; Kong, Sun-Young; Lee, Eun Sook
2018-05-04
The purpose of this study was to investigate decision patterns to reduce the risks of BRCA-related breast and gynecologic cancers in carriers of BRCA pathogenic variants. We found a change in risk-reducing (RR) management patterns after December 2012, when the National Health Insurance System (NHIS) of Korea began to pay for BRCA testing and risk-reducing salpingo-oophorectomy (RRSO) in pathogenic-variant carriers. The study group consisted of 992 patients, including 705 with breast cancer (BC), 23 with ovarian cancer (OC), 10 with both, and 254 relatives of high-risk patients who underwent BRCA testing at the National Cancer Center of Korea from January 2008 to December 2016.We analyzed patterns of and factors in RR management. Of the 992 patients, 220 (22.2%) were carriers of BRCA pathogenic variants. About 92.3% (203/220) had a family history of BC and/or OC, which significantly differed between BRCA1 and BRCA2 carriers (p<0.001). All 41 male carriers chose surveillance. Of the 179 female carriers, 59 (71.1%) of the 83 carriers with BC and the 39 (49.4%) of 79 unaffected carriers underwent RR management. None of the carriers affected with OC underwent RR management. Of the management types, RRSO had the highest rate (42.5%) of patient choice. The rate of risk-reducing surgery was significantly higher after 2013 than before 2013 (46.3% [74/160] vs. 31.6% [6/19], p<0.001). RRSO was the preferred management for carriers of BRCA pathogenic variants. The most important factors in treatment choice were NHIS reimbursement and/or the severity of illness.
NASA Astrophysics Data System (ADS)
Henry, Rebekah; Schang, Christelle; Kolotelo, Peter; Coleman, Rhys; Rooney, Graham; Schmidt, Jonathan; Deletic, Ana; McCarthy, David T.
2016-06-01
Current World Health Organisation figures estimate that ∼2.5 million deaths per year result from recreational contact with contaminated water sources. Concerns about quantitative risk assessments of waterways using faecal indicator organisms (FIOs) as surrogates to infer pathogenic risk currently exist. In Melbourne, Australia, the Yarra River has come under public scrutiny due to perceived public health risks associated with aquatic recreation; a characteristic shared with urban estuaries worldwide. A 10-month study of the Yarra estuary investigated the processes that affect FIOs and pathogens within this system. A total of 74 samples were collected from three estuarine and two upstream, freshwater, locations under different climatic and hydrological conditions, and the levels of Escherichia coli, enterococci, Clostridium perfringens, fRNA coliphages, Campylobacter spp. Cryptosporidium oocysts, Giardia cysts, adenoviruses, and enteroviruses were monitored. Reference pathogenic bacteria, protozoa, and viruses were detected in 81%, 19%, and 8% of samples, respectively. Variations in FIO concentrations were found to be associated with changes in specific climatic and hydrological variables including: temperature, flow, humidity and rainfall. In contrast, pathogen levels remained unaffected by all variables investigated. Limitations of current national and international culture-based standard methods may have played a significant role in limiting the identification of correlative relationships The data demonstrate the differences between FIOs and microbial pathogens in terms of sources, sinks, and survival processes within an urban estuary and provide further evidence of the inadequacy of FIO inclusion in the development of worldwide regulatory water quality criteria and risk assessment models.
Hobday, R A; Dancer, S J
2013-08-01
Infections caught in buildings are a major global cause of sickness and mortality. Understanding how infections spread is pivotal to public health yet current knowledge of indoor transmission remains poor. To review the roles of natural ventilation and sunlight for controlling infection within healthcare environments. Comprehensive literature search was performed, using electronic and library databases to retrieve English language papers combining infection; risk; pathogen; and mention of ventilation; fresh air; and sunlight. Foreign language articles with English translation were included, with no limit imposed on publication date. In the past, hospitals were designed with south-facing glazing, cross-ventilation and high ceilings because fresh air and sunlight were thought to reduce infection risk. Historical and recent studies suggest that natural ventilation offers protection from transmission of airborne pathogens. Particle size, dispersal characteristics and transmission risk require more work to justify infection control practices concerning airborne pathogens. Sunlight boosts resistance to infection, with older studies suggesting potential roles for surface decontamination. Current knowledge of indoor transmission of pathogens is inadequate, partly due to lack of agreed definitions for particle types and mechanisms of spread. There is recent evidence to support historical data on the effects of natural ventilation but virtually none for sunlight. Modern practice of designing healthcare buildings for comfort favours pathogen persistence. As the number of effective antimicrobial agents declines, further work is required to clarify absolute risks from airborne pathogens along with any potential benefits from additional fresh air and sunlight. Copyright © 2013 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
Sterk, Ankie; de Man, Heleen; Schijven, Jack F; de Nijs, Ton; de Roda Husman, Ana Maria
2016-11-15
Climate change is expected to influence infection risks while bathing downstream of sewage emissions from combined sewage overflows (CSOs) or waste water treatment plants (WWTPs) due to changes in pathogen influx, rising temperatures and changing flow rates of the receiving waters. In this study, climate change impacts on the surface water concentrations of Campylobacter, Cryptosporidium and norovirus originating from sewage were modelled. Quantitative microbial risk assessment (QMRA) was used to assess changes in risks of infection. In general, infection risks downstream of WWTPs are higher than downstream CSOs. Even though model outputs show an increase in CSO influxes, in combination with changes in pathogen survival, dilution within the sewage system and bathing behaviour, the effects on the infection risks are limited. However, a decrease in dilution capacity of surface waters could have significant impact on the infection risks of relatively stable pathogens like Cryptosporidium and norovirus. Overall, average risks are found to be higher downstream WWTPs compared to CSOs. Especially with regard to decreased flow rates, adaptation measures on treatment at WWTPs may be more beneficial for human health than decreasing CSO events. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
Olff, H; Hoorens, B; de Goede, R G M; van der Putten, W H; Gleichman, J M
2000-10-01
We analyzed the dynamics of dominant plant species in a grazed grassland over 17 years, and investigated whether local shifts in these dominant species, leading to vegetation mosaics, could be attributed to interactions between plants and soil-borne pathogens. We found that Festuca rubra and Carex arenaria locally alternated in abundance, with different sites close together behaving out of phase, resulting in a shifting mosaic. The net effect of killing all soil biota on the growth of these two species was investigated in a greenhouse experiment using gamma radiation, controlling for possible effects of sterilization on soil chemistry. Both plant species showed a strong net positive response to soil sterilization, indicating that pathogens (e.g., nematodes, pathogenic fungi) outweighed the effect of mutualists (e.g., mycorrhizae). This positive growth response towards soil sterilization appeared not be due to effects of sterilization on soil chemistry. Growth of Carex was strongly reduced by soil-borne pathogens (86% reduction relative to its growth on sterilized soil) on soil from a site where this species decreased during the last decade (and Festuca increased), while it was reduced much less (50%) on soil from a nearby site where it increased in abundance during the last decade. Similarly, Festuca was reduced more (67%) on soil from the site where it decreased (and Carex increased) than on soil from the site where it increased (55%, the site where Carex decreased). Plant-feeding nematodes showed high small-scale variation in densities, and we related this variation to the observed growth reductions in both plant species. Carex growth on unsterilized soil was significantly more reduced at higher densities of plant-feeding nematodes, while the growth reduction in Festuca was independent of plant-feeding nematode densities. At high plant-feeding nematode densities, growth of Carex was reduced more than Festuca, while at low nematode densities the opposite was found. Each plant species thus seems to be affected by different (groups of) soil-borne pathogens. The resulting interaction web of plants and soil-borne pathogens is discussed. We hypothesize that soil disturbances by digging ants and rabbits may explain the small-scale variation in nematode densities, by locally providing "fresh" sand. We conclude that soil-borne pathogens may contribute to plant diversity and spatial mosaics of plants in grasslands.
A reassessment of the risk of rust fungi developing resistance to fungicides.
Oliver, Richard P
2014-11-01
Rust fungi are major pathogens of many annual and perennial crops. Crop protection is largely based on genetic and chemical control. Fungicide resistance is a significant issue that has affected many crop pathogens. Some pathogens have rapidly developed resistance and hence are regarded as high-risk species. Rust fungi have been classified as being low risk, in spite of sharing many relevant features with high-risk pathogens. An examination of the evidence suggests that rust fungi may be wrongly classified as low risk. Of the nine classes of fungicide to which resistance has developed, six are inactive against rusts. The three remaining classes are quinone outside inhibitors (QoIs), demethylation inhibitors (DMIs) and succinate dehydrogenase inhibitors (SDHIs). QoIs have been protected by a recently discovered intron that renders resistant mutants unviable. Low levels of resistance have developed to DMIs, but with limited field significance. Older SDHI fungicides were inactive against rusts. Some of the SDHIs introduced since 2003 are active against rusts, so it may be that insufficient time has elapsed for resistance to develop, especially as SDHIs are generally sold in mixtures with other actives. It would therefore seem prudent to increase the level of vigilance for possible cases of resistance to established and new fungicides in rusts. © 2014 Society of Chemical Industry.
Park, Sang-Hyun; Kang, Dong-Hyun
2018-06-20
The objective of this study was to evaluate how treatment temperature influences the solubility of ClO 2 gas and the antimicrobial effect of ClO 2 gas against Escherichia coli O157:H7, Salmonella Typhimurium, and Listeria monocytogenes on produce and food contact surfaces. Produce and food contact surfaces inoculated with a combined culture cocktail of three strains each of the three foodborne pathogens were processed in a treatment chamber with 20 ppmv ClO 2 gas at 15 or 25 °C under the same conditions of absolute humidity (11.2-12.3 g/m 3 ) for up to 30 min. As treatment time increased, ClO 2 gas treatment at 15 °C caused significantly more (p < 0.05) inactivation of the three pathogens than treatment at 25 °C. ClO 2 gas treatment at 25 °C for 30 min resulted in 1.15 to 1.54, 1.53 to 1.88, and 1.00 to 1.78 log reductions of the three pathogens on spinach leaves, tomatoes, and stainless steel No.4, respectively. ClO 2 gas treatment at 15 °C for 30 min caused 2.53 to 2.88, 2.82 to 3.23, and 2.37 to 3.03 log reductions of the three pathogens on spinach leaves, tomatoes, and stainless steel No.4, respectively. Treatment with ClO 2 gas at 25 °C for 20 min resulted in 1.88 to 2.31 log reductions of the three pathogens on glass while >5.91 to 6.82 log reductions of these pathogens occurred after 20 min when treated at 15 °C. Residual ClO 2 levels after gas treatment at 15 °C were significantly (p < 0.05) higher than those at 25 °C. The results of this study can help the food processing industry establish optimum ClO 2 gas treatment conditions for maximizing the antimicrobial efficacy of ClO 2 gas. Published by Elsevier B.V.
Prudent Use of Antimicrobials in Exotic Animal Medicine.
Broens, Els M; van Geijlswijk, Ingeborg M
2018-05-01
Reduction of antimicrobial use can result in reduction of resistance in commensal bacteria. In exotic animals, information on use of antimicrobials and resistance in commensals and pathogens is scarce. However, use of antimicrobials listed as critically important antimicrobials for human medicine seems high in exotic animals. Ideally, the selection of a therapy should be based on an accurate diagnosis and antimicrobial susceptibility testing. When prescribing antimicrobials based on empiricism, knowledge of the most common pathogens causing specific infections and the antimicrobial spectrum of antimicrobial agents is indispensable. Implementing antimicrobial stewardship promotes the prudent use of antimicrobials in exotic animals. Copyright © 2018 Elsevier Inc. All rights reserved.
USDA-ARS?s Scientific Manuscript database
One of the largest risks to the continued stability of the swine industry is by pathogens like porcine reproductive and respiratory syndrome virus (PRRSV) that can decimate production as it spreads among individuals. These infections can be low or highly pathogenic, and because it infects monocytic ...
Subcontinental impacts of an invasive tree disease on forest structure and dynamics
Jeffrey R. Garnas; Matthew P. Ayres; Andrew M. Liebhold; Celia. Evans
2011-01-01
Introduced pests and pathogens are a major source of disturbance to ecosystems world-wide. The famous examples have produced dramatic reductions in host abundance, including virtual extirpation, but most introductions have more subtle impacts that are hard to quantify but are potentially at least as important due to the pathogens' effects on host reproduction,...
Incidence and risk factors of surgical site infection in general surgery in a developing country.
Alp, Emine; Elmali, Ferhan; Ersoy, Safiye; Kucuk, Can; Doganay, Mehmet
2014-04-01
To investigate the incidence of surgical site infections (SSIs) according to risk factors, etiological agents, antimicrobial resistance rates of pathogens, and antimicrobial prophylaxis (AMP) in a developing country. Prospective surveillance of SSIs was carried out in general surgery (GS) units between May 2005 and April 2009. SSI was diagnosed in 415 (10.8%) patients. Cefazolin was used as AMP in 780 (49%) operations, whereas broad-spectrum antibiotics were used in the remaining operations. AMP was administered for >24 h in 69 and 64% of the GS patients. The most significant risk factors for SSI after GS were total parenteral nutrition, transfusion, and a drainage catheter. The most common pathogen was Escherichia coli, but all the isolated pathogens were multiresistant. AMP is effective for reducing the risk of SSI; however, the prolonged use of AMP and broad-spectrum antibiotics may be associated with the emergence of resistant bacterial strains.
Risk factors for community-acquired bacterial meningitis.
Lundbo, Lene Fogt; Benfield, Thomas
2017-06-01
Bacterial meningitis is a significant burden of disease and mortality in all age groups worldwide despite the development of effective conjugated vaccines. The pathogenesis of bacterial meningitis is based on complex and incompletely understood host-pathogen interactions. Some of these are pathogen-specific, while some are shared between different bacteria. We searched the database PubMed to identify host risk factors for bacterial meningitis caused by the pathogens Streptococcus pneumoniae, Neisseria meningitidis and Haemophilus influenzae type b, because they are three most common causative bacteria beyond the neonatal period. We describe a number of risk factors; including socioeconomic factors, age, genetic variation of the host and underlying medical conditions associated with increased susceptibility to invasive bacterial infections in both children and adults. As conjugated vaccines are available for these infections, it is of utmost importance to identify high risk patients to be able to prevent invasive disease.
Transport and fate of microbial pathogens in agricultural settings
Bradford, Scott A.; Morales, Veronica L.; Zhang, Wei; Harvey, Ronald W.; Packman, Aaron I.; Mohanram, Arvind; Welty, Claire
2013-01-01
An understanding of the transport and survival of microbial pathogens (pathogens hereafter) in agricultural settings is needed to assess the risk of pathogen contamination to water and food resources, and to develop control strategies and treatment options. However, many knowledge gaps still remain in predicting the fate and transport of pathogens in runoff water, and then through the shallow vadose zone and groundwater. A number of transport pathways, processes, factors, and mathematical models often are needed to describe pathogen fate in agricultural settings. The level of complexity is dramatically enhanced by soil heterogeneity, as well as by temporal variability in temperature, water inputs, and pathogen sources. There is substantial variability in pathogen migration pathways, leading to changes in the dominant processes that control pathogen transport over different spatial and temporal scales. For example, intense rainfall events can generate runoff and preferential flow that can rapidly transport pathogens. Pathogens that survive for extended periods of time have a greatly enhanced probability of remaining viable when subjected to such rapid-transport events. Conversely, in dry seasons, pathogen transport depends more strongly on retention at diverse environmental surfaces controlled by a multitude of coupled physical, chemical, and microbiological factors. These interactions are incompletely characterized, leading to a lack of consensus on the proper mathematical framework to model pathogen transport even at the column scale. In addition, little is known about how to quantify transport and survival parameters at the scale of agricultural fields or watersheds. This review summarizes current conceptual and quantitative models for pathogen transport and fate in agricultural settings over a wide range of spatial and temporal scales. The authors also discuss the benefits that can be realized by improved modeling, and potential treatments to mitigate the risk of waterborne disease transmission.
Lethal exposure: An integrated approach to pathogen transmission via environmental reservoirs
Turner, Wendy C.; Kausrud, Kyrre L.; Beyer, Wolfgang; Easterday, W. Ryan; Barandongo, Zoë R.; Blaschke, Elisabeth; Cloete, Claudine C.; Lazak, Judith; Van Ert, Matthew N.; Ganz, Holly H.; Turnbull, Peter C. B.; Stenseth, Nils Chr.; Getz, Wayne M.
2016-01-01
To mitigate the effects of zoonotic diseases on human and animal populations, it is critical to understand what factors alter transmission dynamics. Here we assess the risk of exposure to lethal concentrations of the anthrax bacterium, Bacillus anthracis, for grazing animals in a natural system over time through different transmission mechanisms. We follow pathogen concentrations at anthrax carcass sites and waterholes for five years and estimate infection risk as a function of grass, soil or water intake, age of carcass sites, and the exposure required for a lethal infection. Grazing, not drinking, seems the dominant transmission route, and transmission is more probable from grazing at carcass sites 1–2 years of age. Unlike most studies of virulent pathogens that are conducted under controlled conditions for extrapolation to real situations, we evaluate exposure risk under field conditions to estimate the probability of a lethal dose, showing that not all reservoirs with detectable pathogens are significant transmission pathways. PMID:27265371
Atwood, Todd C.; Duncan, Colleen G.; Patyk, Kelly A.; Nol, Pauline; Rhyan, Jack; McCollum, Matthew; McKinney, Melissa A.; Ramey, Andy M.; Cerqueira-Cezar, Camila; Kwok, Oliver C H; Dubey, Jitender P; Hennager, S.G.
2017-01-01
Recent decline of sea ice habitat has coincided with increased use of land by polar bears (Ursus maritimus) from the southern Beaufort Sea (SB), which may alter the risks of exposure to pathogens and contaminants. We assayed blood samples from SB polar bears to assess prior exposure to the pathogens Brucella spp., Toxoplasma gondii, Coxiella burnetii, Francisella tularensis, and Neospora caninum, estimate concentrations of persistent organic pollutants (POPs), and evaluate risk factors associated with exposure to pathogens and POPs. We found that seroprevalence of Brucella spp. and T. gondii antibodies likely increased through time, and provide the first evidence of exposure of polar bears to C. burnetii, N. caninum, and F. tularensis. Additionally, the odds of exposure to T. gondii were greater for bears that used land than for bears that remained on the sea ice during summer and fall, while mean concentrations of the POP chlordane (ΣCHL) were lower for land-based bears. Changes in polar bear behavior brought about by climate-induced modifications to the Arctic marine ecosystem may increase exposure risk to certain pathogens and alter contaminant exposure pathways.
Forest species diversity reduces disease risk in a generalist plant pathogen invasion
Sarah E. Haas; Mevin B. Hooten; David M. Rizzo; Ross K. Meentemeyer
2011-01-01
Empirical evidence suggests that biodiversity loss can increase disease transmission, yet our understanding of the diversity-disease hypothesis for generalist pathogens in natural ecosystems is limited. We used a landscape epidemiological approach to examine two scenarios regarding diversity effects on the emerging plant pathogen Phytophthora ramorum...
A generic risk-based surveying method for invading plant pathogens
USDA-ARS?s Scientific Manuscript database
Invasive plant pathogens are increasing with international trade and travel with damaging environmental and economic consequences. Recent examples include tree diseases such as Sudden Oak Death in the Western US and Ash Dieback in Europe. To control an invading pathogen it is crucial that newly in...
USDA-ARS?s Scientific Manuscript database
Waterborne pathogens were detected in 96% of samples collected at three Lake Michigan beaches during the summer of 2010. Linear regression models were developed to explore environmental factors that may be influential for pathogen prevalence. Simulation of pathogen concentration using these models, ...
This project focuses on providing basic data to bound risk estimates resulting from pathogens associated with pipe biofilms. Researchers will compare biofilm pathogen effects under two different disinfection scenarios (free chlorine or chloramines) for a conventionally treated s...
Widerström, Micael; Schönning, Caroline; Lilja, Mikael; Lebbad, Marianne; Ljung, Thomas; Allestam, Görel; Ferm, Martin; Björkholm, Britta; Hansen, Anette; Hiltula, Jari; Långmark, Jonas; Löfdahl, Margareta; Omberg, Maria; Reuterwall, Christina; Samuelsson, Eva; Widgren, Katarina; Wallensten, Anders; Lindh, Johan
2014-04-01
In November 2010, ≈27,000 (≈45%) inhabitants of Östersund, Sweden, were affected by a waterborne outbreak of cryptosporidiosis. The outbreak was characterized by a rapid onset and high attack rate, especially among young and middle-aged persons. Young age, number of infected family members, amount of water consumed daily, and gluten intolerance were identified as risk factors for acquiring cryptosporidiosis. Also, chronic intestinal disease and young age were significantly associated with prolonged diarrhea. Identification of Cryptosporidium hominis subtype IbA10G2 in human and environmental samples and consistently low numbers of oocysts in drinking water confirmed insufficient reduction of parasites by the municipal water treatment plant. The current outbreak shows that use of inadequate microbial barriers at water treatment plants can have serious consequences for public health. This risk can be minimized by optimizing control of raw water quality and employing multiple barriers that remove or inactivate all groups of pathogens.
Schönning, Caroline; Lilja, Mikael; Lebbad, Marianne; Ljung, Thomas; Allestam, Görel; Ferm, Martin; Björkholm, Britta; Hansen, Anette; Hiltula, Jari; Långmark, Jonas; Löfdahl, Margareta; Omberg, Maria; Reuterwall, Christina; Samuelsson, Eva; Widgren, Katarina; Wallensten, Anders; Lindh, Johan
2014-01-01
In November 2010, ≈27,000 (≈45%) inhabitants of Östersund, Sweden, were affected by a waterborne outbreak of cryptosporidiosis. The outbreak was characterized by a rapid onset and high attack rate, especially among young and middle-aged persons. Young age, number of infected family members, amount of water consumed daily, and gluten intolerance were identified as risk factors for acquiring cryptosporidiosis. Also, chronic intestinal disease and young age were significantly associated with prolonged diarrhea. Identification of Cryptosporidium hominis subtype IbA10G2 in human and environmental samples and consistently low numbers of oocysts in drinking water confirmed insufficient reduction of parasites by the municipal water treatment plant. The current outbreak shows that use of inadequate microbial barriers at water treatment plants can have serious consequences for public health. This risk can be minimized by optimizing control of raw water quality and employing multiple barriers that remove or inactivate all groups of pathogens. PMID:24655474
Safety in the SEM laboratory--1981 update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bance, G.N.; Barber, V.C.; Sholdice, J.A.
1981-01-01
The article reviews recent information on hazards as they relate to safety in SEM laboratories. The first section lists the safety equipment that should be available in a SEM laboratory. Flammable and combustible liquids are discussed, and particular warnings are given concerning the fire and explosion risks associated with diethyl ether and diisopropyl ether. The possible hazards associated with electrical equipment, and the risk of X-ray emissions from EM's are briefly outlined. The hazards associated with acute and chronic toxicity of chemicals used in the EM laboratory are discussed. The need to reduce exposure to a growing list of recognizablemore » hazardous chemicals is emphasized. This reduction can be accomplished by more extensive use of functioning fume hoods, and the use of more appropriate and effective protective gloves. Allergies and the hazards of dangerous pathogens in the SEM laboratory are discussed. The explosion and other hazards associated with cryogens, vacuum evaporators, critical point dryers, and compressed gas cylinders are emphasized.« less
Removal of enteric bacteria in constructed treatment wetlands with emergent macrophytes: a review.
Vymazal, Jan
2005-01-01
Domestic and municipal sewage contains various pathogenic or potentially pathogenic microorganisms which, depending on species concentration, pose a potential risk to human health and whose presence must therefore be reduced in the course of wastewater treatment. The removal of microbiological pollution is seldom a primary target for constructed treatment wetlands (CWs). However, wetlands are known to act as excellent biofilters through a complex of physical, chemical and biological factors which all participate in the reduction of the number of bacteria. Measurement of human pathogenic organisms in untreated and treated wastewater is expensive and technically challenging. Consequently, environmental engineers have sought indicator organisms that are (1) easy to monitor and (2) correlate with population of pathogenic organisms. The most frequently used indicators are total coliforms, fecal coliforms, fecal streptococci and Escherichia coli. The literature survey of 60 constructed wetlands with emergent vegetation around the world revealed that removal of total and fecal coliforms in constructed wetlands with emergent macrophytes is high, usually 95 to > 99% while removal of fecal streptococci is lower, usually 80-95%. Because bacterial removal efficiency is a function of inflow bacteria number, the high removal effects are achieved for untreated or mechanically pretreated wastewater. Therefore, the outflow numbers of bacteria are more important. For TC and FC the outflow concentrations are usually in the range of 10(2) to 10(5) CFU/ 100 ml while for FS the range is between 10(2) and 10(4) CFU/ 100 ml. Results from operating systems suggest that enteric microbe removal efficiency in CWs with emergent macrophytes is primarily influenced by hydraulic loading rate (HLR) and the resultant hydraulic residence time (HRT) and the presence of vegetation. Removal of enteric bacteria follows approximately a first-order relationship.
Sattar, Syed A; Kibbee, Richard J; Zargar, Bahram; Wright, Kathryn E; Rubino, Joseph R; Ijaz, M Khalid
2016-10-01
Although indoor air can spread many pathogens, information on the airborne survival and inactivation of such pathogens remains sparse. Staphylococcus aureus and Klebsiella pneumoniae were nebulized separately into an aerobiology chamber (24.0 m 3 ). The chamber's relative humidity and air temperature were at 50% ± 5% and 20°C ± 2°C, respectively. The air was sampled with a slit-to-agar sampler. Between tests, filtered air purged the chamber of any residual airborne microbes. The challenge in the air varied between 4.2 log 10 colony forming units (CFU)/m 3 and 5.0 log 10 CFU/m 3 , sufficient to show a ≥3 log 10 (≥99.9%) reduction in microbial viability in air over a given contact time by the technologies tested. The rates of biologic decay of S aureus and K pneumoniae were 0.0064 ± 0.00015 and 0.0244 ± 0.009 log 10 CFU/m 3 /min, respectively. Three commercial devices, with ultraviolet light and HEPA (high-efficiency particulate air) filtration, met the product efficacy criterion in 45-210 minutes; these rates were statistically significant compared with the corresponding rates of biologic decay of the bacteria. One device was also tested with repeated challenges with aerosolized S aureus to simulate ongoing fluctuations in indoor air quality; it could reduce each such recontamination to an undetectable level in approximately 40 minutes. The setup described is suitable for work with all major classes of pathogens and also complies with the U.S. Environmental Protection Agency's guidelines (2012) for testing air decontamination technologies. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Signorelli, Salvatore Santo; Ferrante, Margherita
2017-05-01
A wide body of evidence indicates that environmental and occupational risk factors are associated with the development of pathological disorders. The pathogenic role of many environmental pollutants or occupational contaminants is already known and has been extensively investigated. However, the molecular mechanisms of action and the pathogenic effects of many substances remain unknown. Therefore, there is a need to better investigate the role of new environmental and occupational risk factors that may cause the development of several diseases.
Bergeron, V; Reboux, G; Poirot, J L; Laudinet, N
2007-10-01
To evaluate the performance of a new mobile air-treatment unit that uses nonthermal-plasma reactors for lowering the airborne bioburden in critical hospital environments and reducing the risk of nosocomial infection due to opportunistic airborne pathogens, such as Aspergillus fumigatus. Tests were conducted in 2 different high-risk hospital areas: an operating room under simulated conditions and rooms hosting patients in a pediatric hematology ward. Operating room testing provided performance evaluations of removal rates for airborne contamination (ie, particles larger than 0.5 microm) and overall lowering of the airborne bioburden (ie, colony-forming units of total mesophilic flora and fungal flora per cubic meter of air). In the hematology service, opportunistic and nonpathogenic airborne fungal levels in a patient's room equipped with an air-treatment unit were compared to those in a control room. In an operating room with a volume of 118 m(3), the time required to lower the concentration of airborne particles larger than 0.5 microm by 90% was decreased from 12 minutes with the existing high-efficiency particulate air filtration system to less than 2 minutes with the units tested, with a 2-log decrease in the steady-state levels of such particles (P<.01). Concurrently, total airborne mesophilic flora concentrations dropped by a factor of 2, and the concentrations of fungal species were reduced to undetectable levels (P<.01). The 12-day test period in the hematology ward revealed a significant reduction in airborne fungus levels (P<.01), with average reductions of 75% for opportunistic species and 82% for nonpathogenic species. Our data indicate that the mobile, nonthermal-plasma air treatment unit tested in this study can rapidly reduce the levels of airborne particles and significantly lower the airborne bioburden in high-risk hospital environments.
Olivares, Marta; Benítez-Páez, Alfonso; de Palma, Giada; Capilla, Amalia; Nova, Esther; Castillejo, Gemma; Varea, Vicente; Marcos, Ascensión; Garrote, José Antonio; Polanco, Isabel; Donat, Ester; Ribes-Koninckx, Carmen; Calvo, Carmen; Ortigosa, Luis; Palau, Francesc; Sanz, Yolanda
2018-04-19
Celiac disease (CD) is an immune-mediated enteropathy involving genetic and environmental factors, whose interaction influences disease risk. The intestinal microbiota, including viruses and bacteria, could play a role in the pathological process leading to gluten intolerance. In this study, we investigated the prevalence of pathogens in the intestinal microbiota of infants at familial risk of developing CD. We included 127 full-term newborns with at least one first-degree relative with CD. Infants were classified according to milk-feeding practice (breastfeeding or formula feeding) and HLA-DQ genotype (low, intermediate or high genetic risk). The prevalence of pathogenic bacteria and viruses was assessed in the faeces of the infants at 7 days, 1 month and 4 months of age. The prevalence of Clostridium perfringens was higher in formula-fed infants than in breast-fed over the study period, and that of C. difficile at 4 months. Among breastfed infants, a higher prevalence of enterotoxigenic E. coli (ETEC) was found in infants with the highest genetic risk compared either to those with a low or intermediate risk. Among formula-fed infants, a higher prevalence of ETEC was also found in infants with a high genetic risk compared to those of intermediate risk. Our results show that specific factors, such as formula feeding and the HLA-DQ2 genotype, previously linked to a higher risk of developing CD, influence the presence of pathogenic bacteria differently in the intestinal microbiota in early life. Further studies are warranted to establish whether these associations are related to CD onset later in life.
Cow-specific risk factors for clinical mastitis in Brazilian dairy cattle.
Oliveira, C S F; Hogeveen, H; Botelho, A M; Maia, P V; Coelho, S G; Haddad, J P A
2015-10-01
Information related to mastitis risk factors is useful for the design and implementation of clinical mastitis (CM) control programs. The first objective of our study was to model the risk of CM under Brazilian conditions, using cow-specific risk factors. Our second objective was to explore which risk factors were associated with the occurrence of the most common pathogens involved in Brazilian CM infections. The analyses were based on 65 months of data from 9,789 dairy cows and 12,464 CM cases. Cow-specific risk factors that could easily be measured in standard Brazilian dairy farms were used in the statistical analyses, which included logistic regression and multinomial logistic regression. The first month of lactation, high somatic cell count, rainy season and history of clinical mastitis cases were factors associated with CM for both primiparous and multiparous cows. In addition, parity and breed were also associated risk factors for multiparous cows. Of all CM cases, 54% showed positive bacteriological culturing results from which 57% were classified as environmental pathogens, with a large percentage of coliforms (35%). Coagulase-negative Staphylococcus (16%), Streptococcus uberis (9%), Streptococcus agalactiae (7%) and other Streptococci (9%) were also common pathogens. Among the pathogens analyzed, the association of cow-specific risk factors, such as Zebu breed (OR=5.84, 95%CI 3.77-10.77) and accumulated history of SCC (1.76, 95%CI 1.37-2.27), was different for CM caused by Coagulase-negative Staphylococcus and S. agalactiae in comparison to CM caused by coliforms. Our results suggest that CM control programs in Brazil should specially consider the recent history of clinical mastitis cases and the beginning of the lactations, mainly during the rainy season as important risk factor for mastitis. Copyright © 2015 Elsevier B.V. All rights reserved.
Espinosa, Ana Cecilia; Jesudhasan, Palmy; Arredondo, René; Cepeda, Martha; Mazari-Hiriart, Marisa; Mena, Kristi D.
2012-01-01
Fresh produce, such as lettuce and spinach, serves as a route of food-borne illnesses. The U.S. FDA has approved the use of ionizing irradiation up to 4 kGy as a pathogen kill step for fresh-cut lettuce and spinach. The focus of this study was to determine the inactivation of poliovirus and rotavirus on lettuce and spinach when exposed to various doses of high-energy electron beam (E-beam) irradiation and to calculate the theoretical reduction in infection risks that can be achieved under different contamination scenarios and E-beam dose applications. The D10 value (dose required to reduce virus titers by 90%) (standard error) of rotavirus on spinach and lettuce was 1.29 (± 0.64) kGy and 1.03 (± 0.05) kGy, respectively. The D10 value (standard error) of poliovirus on spinach and lettuce was 2.35 (± 0.20) kGy and 2.32 (± 0.08) kGy, respectively. Risk assessment of data showed that if a serving (∼14 g) of lettuce was contaminated with 10 PFU/g of poliovirus, E-beam irradiation at 3 kGy will reduce the risk of infection from >2 in 10 persons to approximately 6 in 100 persons. Similarly, if a serving size (∼0.8 g) of spinach is contaminated with 10 PFU/g of rotavirus, E-beam irradiation at 3 kGy will reduce infection risks from >3 in 10 persons to approximately 5 in 100 persons. The results highlight the value of employing E-beam irradiation to reduce public health risks but also the critical importance of adhering to good agricultural practices that limit enteric virus contamination at the farm and in packing houses. PMID:22179244
ERIC Educational Resources Information Center
Cuming, Richard G.
2009-01-01
Exposure to certain bloodborne pathogens can prematurely end a person's life. Healthcare workers (HCWs), especially those who are members of surgical teams, are at increased risk of exposure to these pathogens. The proper use of personal protective equipment (PPE) during operative/invasive procedures reduces that risk. Despite this, some HCWs fail…
Impact of biodiversity and seasonality on Lyme-pathogen transmission.
Lou, Yijun; Wu, Jianhong; Wu, Xiaotian
2014-11-28
Lyme disease imposes increasing global public health challenges. To better understand the joint effects of seasonal temperature variation and host community composition on the pathogen transmission, a stage-structured periodic model is proposed by integrating seasonal tick development and activity, multiple host species and complex pathogen transmission routes between ticks and reservoirs. Two thresholds, one for tick population dynamics and the other for Lyme-pathogen transmission dynamics, are identified and shown to fully classify the long-term outcomes of the tick invasion and disease persistence. Seeding with the realistic parameters, the tick reproduction threshold and Lyme disease spread threshold are estimated to illustrate the joint effects of the climate change and host community diversity on the pattern of Lyme disease risk. It is shown that climate warming can amplify the disease risk and slightly change the seasonality of disease risk. Both the "dilution effect" and "amplification effect" are observed by feeding the model with different possible alternative hosts. Therefore, the relationship between the host community biodiversity and disease risk varies, calling for more accurate measurements on the local environment, both biotic and abiotic such as the temperature and the host community composition.
Maclean, Michelle; Anderson, John G.; MacGregor, Scott J.; White, Tracy
2016-01-01
Bacterial contamination of injectable stored biological fluids such as blood plasma and platelet concentrates preserved in plasma at room temperature is a major health risk. Current pathogen reduction technologies (PRT) rely on the use of chemicals and/or ultraviolet light, which affects product quality and can be associated with adverse events in recipients. 405 nm violet-blue light is antibacterial without the use of photosensitizers and can be applied at levels safe for human exposure, making it of potential interest for decontamination of biological fluids such as plasma. As a pilot study to test whether 405 nm light is capable of inactivating bacteria in biological fluids, rabbit plasma and human plasma were seeded with bacteria and treated with a 405 nm light emitting diode (LED) exposure system (patent pending). Inactivation was achieved in all tested samples, ranging from low volumes to prebagged plasma. 99.9% reduction of low density bacterial populations (≤103 CFU mL−1), selected to represent typical “natural” contamination levels, was achieved using doses of 144 Jcm−2. The penetrability of 405 nm light, permitting decontamination of prebagged plasma, and the nonrequirement for photosensitizing agents provide a new proof of concept in bacterial reduction in biological fluids, especially injectable fluids relevant to transfusion medicine. PMID:27774337
POU/POE may be a cost-effective option for reductions of a particular chemical to achieve water quality compliance under certain situations and given restrictions. Proactive consumers seeking to reduce exposure to potential pathogens, trace chemicals, and nanoparticles not curre...
Improving Pathogen Reduction by Chlorine Wash Prior to Cutting in Fresh-Cut Processing
USDA-ARS?s Scientific Manuscript database
Introduction: Currently, most fresh-cut processing facilities in the United States use chlorinated water or other sanitizer solutions for microbial reduction after lettuce is cut. Freshly cut lettuce releases significant amounts of organic matter that negatively impacts the effectiveness of chlorine...
Merchant, Sanjay; Proudfoot, Emma M; Quadri, Hafsa N; McElroy, Heather J; Wright, William R; Gupta, Ankur; Sarpong, Eric M
2018-02-15
Treating infections of Gram-negative pathogens, in particular Pseudomonas aeruginosa, is a challenge for clinicians in the Asia-Pacific region due to inherent and acquired resistance to antimicrobials. This systematic review and meta-analysis provides updated information of the risk factors for P. aeruginosa infections in Asia-Pacific, and consequences (e.g., mortality, costs) of initial inappropriate antimicrobial therapy (IIAT). EMBASE and MEDLINE databases were searched for Asia-Pacific studies reporting the consequences of IIAT versus initial appropriate antimicrobial therapy (IAAT) in Gram-negative infections, and risk factors for serious P. aeruginosa infections. A meta-analysis of unadjusted mortality was performed using random-effects model. Twenty-two studies reporting mortality and 13 reporting risk factors were identified. The meta-analysis demonstrated that mortality was significantly lower in patients receiving IAAT versus IIAT, with 67% reduction observed for 28- or 30-day all-cause mortality (OR 0.33; 95% CI 0.20, 0.55; P <0.001). Risk factors for serious P. aeruginosa infection include previous exposure to antimicrobials, mechanical ventilation, and previous hospitalization. The high rates of antimicrobial resistance in Asia-Pacific, as well as increased mortality associated with IIAT and the presence of risk factors for serious infection, highlight the importance of access to newer and appropriate antimicrobials. Copyright © 2018. Published by Elsevier Ltd.
Yang, Jian; Lee, Delores; Afaisen, Shayna; Gadi, Rama
2013-01-01
Lemon juice, a major source of acidulant citric acid, is frequently used in the preparation of ethnic foods. Raw or partially cooked meats are marinated with lemon juice in the preparation of a popular Chamorro dish called kelaguen, which is, unfortunately, strongly associated with foodborne illness outbreaks in Guam. We investigated the efficacy of lemon juice in reducing numbers of Escherichia coli O157:H7, Salmonella Enteritidis, and Listeria monocytogenes at stationary phase during marination. Beef inoculated with a three-strain mixture of E. coli O157:H7, S. Enteritidis, or L. monocytogenes at 10(6)CFU/mL was marinated with lemon juice from 0.2 to 10mL/g for 48h at 28°C. The decline of the pathogens during marination exhibited various degrees of deviation from first-order kinetics. Based on calculations with both linear regression and Weibull models, the decimal reduction time (4-D values) over the range of lemon concentrations was 366-5.1h for E. coli O157:H7, 282-2.4h for S. Enteritidis, and 104-2.4h for L. monocytogenes, indicating that E. coli O157:H7 was the most lemon-juice-resistant of the three. The pathogen reduction time (log 4-D values) plotted against undissociated titratable citric acid exhibited a biphasic pattern. The pathogen reduction time (log 4-D or δ values) was linearly correlated with the pH of the marinating beef (R(2)=0.92 to 0.98). The Z(pH) values (pH dependence of death rate) with beef marination were 1.03 for E. coli O157:H7, 0.92 for S. Enteritidis, and 1.29 for L. monocytogenes, indicating that L. monocytogenes was the most pH resistant of the three. L. monocytogenes exhibited less resistance to lemon juice than S. Enteritidis at pH of 3.5-4.4 but more resistance at pH of 2.6-2.8. In addition, at 4°C, all three pathogens exhibited 4-D values 1.7-4.1 times greater than those at 24°C at 5mL lemon juice/g beef. In conclusion, the usual beef marinating practice for kelaguen preparation (<0.5mL lemon juice/g beef for 1-12h) did not sufficiently inactivate E. coli O157:H7, S. Enteritidis, and L. monocytogenes to meet minimum food-safety requirements. To reduce the risk of kelaguen-associated foodborne illness, kelaguen preparation must include heat treatment in addition to marination with lemon juice. Published by Elsevier B.V.
Robust Derivation of Risk Reduction Strategies
NASA Technical Reports Server (NTRS)
Richardson, Julian; Port, Daniel; Feather, Martin
2007-01-01
Effective risk reduction strategies can be derived mechanically given sufficient characterization of the risks present in the system and the effectiveness of available risk reduction techniques. In this paper, we address an important question: can we reliably expect mechanically derived risk reduction strategies to be better than fixed or hand-selected risk reduction strategies, given that the quantitative assessment of risks and risk reduction techniques upon which mechanical derivation is based is difficult and likely to be inaccurate? We consider this question relative to two methods for deriving effective risk reduction strategies: the strategic method defined by Kazman, Port et al [Port et al, 2005], and the Defect Detection and Prevention (DDP) tool [Feather & Cornford, 2003]. We performed a number of sensitivity experiments to evaluate how inaccurate knowledge of risk and risk reduction techniques affect the performance of the strategies computed by the Strategic Method compared to a variety of alternative strategies. The experimental results indicate that strategies computed by the Strategic Method were significantly more effective than the alternative risk reduction strategies, even when knowledge of risk and risk reduction techniques was very inaccurate. The robustness of the Strategic Method suggests that its use should be considered in a wide range of projects.
Arbaeen, Ahmad F; Schubert, Peter; Serrano, Katherine; Carter, Cedric J; Culibrk, Brankica; Devine, Dana V
2017-05-01
Trauma transfusion packages for hemorrhage control consist of red blood cells, plasma, and platelets at a set ratio. Although pathogen reduction improves the transfusion safety of platelet and plasma units, there is an associated reduction in quality. This study aimed to investigate the impact of riboflavin/ultraviolet light-treated plasma or platelets in transfusion trauma packages composed of red blood cell, plasma, and platelet units in a ratio of 1:1:1 in vitro by modeling transfusion scenarios for trauma patients and assessing function by rotational thromboelastometry. Pathogen-reduced or untreated plasma and buffy coat platelet concentrate units produced in plasma were used in different combinations with red blood cells in trauma transfusion packages. After reconstitution of these packages with hemodiluted blood, the hemostatic functionality was analyzed by rotational thromboelastometry. Hemostatic profiles of pathogen-inactivated buffy coat platelet concentrate and plasma indicated decreased activity compared with their respective controls. Reconstitution of hemodiluted blood (hematocrit = 20%) with packages that contained treated or nontreated components resulted in increased alpha and maximum clot firmness and enhanced clot-formation time. Simulating transfusion scenarios based on 30% blood replacement with a transfusion trauma package resulted in a nonsignificant difference in rotational thromboelastometry parameters between packages containing treated and nontreated blood components (p ≥ 0.05). Effects of pathogen inactivation treatment were evident when the trauma package percentage was 50% or greater and contained both pathogen inactivation-treated plasma and buffy coat platelet concentrate. Rotational thromboelastometry investigations suggest that there is relatively little impact of pathogen inactivation treatment on whole blood clot formation unless large amounts of treated components are used. © 2017 AABB.
Sturrock, Craig J.; Woodhall, James; Brown, Matthew; Walker, Catherine; Mooney, Sacha J.; Ray, Rumiana V.
2015-01-01
Rhizoctonia solani is a plant pathogenic fungus that causes significant establishment and yield losses to several important food crops globally. This is the first application of high resolution X-ray micro Computed Tomography (X-ray μCT) and real-time PCR to study host–pathogen interactions in situ and elucidate the mechanism of Rhizoctonia damping-off disease over a 6-day period caused by R. solani, anastomosis group (AG) 2-1 in wheat (Triticum aestivum cv. Gallant) and oil seed rape (OSR, Brassica napus cv. Marinka). Temporal, non-destructive analysis of root system architectures was performed using RooTrak and validated by the destructive method of root washing. Disease was assessed visually and related to pathogen DNA quantification in soil using real-time PCR. R. solani AG2-1 at similar initial DNA concentrations in soil was capable of causing significant damage to the developing root systems of both wheat and OSR. Disease caused reductions in primary root number, root volume, root surface area, and convex hull which were affected less in the monocotyledonous host. Wheat was more tolerant to the pathogen, exhibited fewer symptoms and developed more complex root systems. In contrast, R. solani caused earlier damage and maceration of the taproot of the dicot, OSR. Disease severity was related to pathogen DNA accumulation in soil only for OSR, however, reductions in root traits were significantly associated with both disease and pathogen DNA. The method offers the first steps in advancing current understanding of soil-borne pathogen behavior in situ at the pore scale, which may lead to the development of mitigation measures to combat disease influence in the field. PMID:26157449
Sturrock, Craig J; Woodhall, James; Brown, Matthew; Walker, Catherine; Mooney, Sacha J; Ray, Rumiana V
2015-01-01
Rhizoctonia solani is a plant pathogenic fungus that causes significant establishment and yield losses to several important food crops globally. This is the first application of high resolution X-ray micro Computed Tomography (X-ray μCT) and real-time PCR to study host-pathogen interactions in situ and elucidate the mechanism of Rhizoctonia damping-off disease over a 6-day period caused by R. solani, anastomosis group (AG) 2-1 in wheat (Triticum aestivum cv. Gallant) and oil seed rape (OSR, Brassica napus cv. Marinka). Temporal, non-destructive analysis of root system architectures was performed using RooTrak and validated by the destructive method of root washing. Disease was assessed visually and related to pathogen DNA quantification in soil using real-time PCR. R. solani AG2-1 at similar initial DNA concentrations in soil was capable of causing significant damage to the developing root systems of both wheat and OSR. Disease caused reductions in primary root number, root volume, root surface area, and convex hull which were affected less in the monocotyledonous host. Wheat was more tolerant to the pathogen, exhibited fewer symptoms and developed more complex root systems. In contrast, R. solani caused earlier damage and maceration of the taproot of the dicot, OSR. Disease severity was related to pathogen DNA accumulation in soil only for OSR, however, reductions in root traits were significantly associated with both disease and pathogen DNA. The method offers the first steps in advancing current understanding of soil-borne pathogen behavior in situ at the pore scale, which may lead to the development of mitigation measures to combat disease influence in the field.
Ewig, S; Torres, A; El-Ebiary, M; Fábregas, N; Hernández, C; González, J; Nicolás, J M; Soto, L
1999-01-01
We prospectively evaluated the relation of upper airway, lower airway, and gastric colonization patterns with the development of pneumonia and its etiology in 48 patients with surgical (n = 25) and medical (n = 23) head injury. Initial colonization was assessed by cultures of nasal and pharyngeal swabs, tracheobronchial aspirates, gastric juice, and bronchoscopically retrieved protected specimen brush. Follow-up colonization was determined until the end points extubation, suspected ventilator-associated pneumonia (VAP), or death. The initial colonization rate at any site at ICU admission was 39/47 (83%). It mainly accounted for Group I pathogens (Streptococcus pneumoniae, Staphylococcus aureus, Hemophilus influenzae) of the upper and lower airways. At follow-up, colonization rates with Group II pathogens (Gram-negative enteric bacilli and Pseudomonas spp.) increased significantly. The high initial bacterial load with Group I pathogens of the upper airways and trachea decreased during Days 2 to 4, whereas that of Group II pathogens increased. Upper airway colonization was an independent predictor of follow-up tracheobronchial colonization (odds ratio [OR], 9.9; 95% confidence interval [CI], 1.8 to 56.3 for initial colonization with Group I pathogens; OR, 23.9; 95% CI, 3.8 to 153.3 for follow-up colonization with Group II pathogens). Previous (short-term) antibiotics had a protective effect against colonization with Group I pathogens of the lower respiratory tract (OR, 0.2; 95% CI, 0.05 to 0.86), but they were a risk factor for colonization with Group II pathogens (OR, 6.1; 95% CI, 1.3 to 29). Initial tracheobronchial colonization with Group I pathogens was associated with a higher probability of early onset pneumonia (OR, 4. 1; 95% CI, 0.7 to 23.3), whereas prolonged antibiotic treatment (> 24 h) independently predicted late-onset pneumonia (OR, 9.2; 95% CI, 1.7 to 51.3). We conclude that patients with head injury are colonized in the airways mainly by Group I pathogens early in the evolution of illness. The upper airways represent the main reservoir for subsequent lower airway colonization with Group I pathogens. Previous (short-term) antibiotic treatment is protective against initial tracheobronchial colonization with Group I pathogens, but it represents a risk factor for subsequent lower airway colonization by Group II pathogens.
Review of literature on climate change and forest diseases of western North America
John T. Kliejunas; Brian W. Geils; Jessie Micales Glaeser; Ellen Michaels Goheen; Paul Hennon; Mee-Sook Kim; Harry Kope; Jeff Stone; Rona Sturrock; Susan J. Frankel
2009-01-01
A summary of the literature on relationships between climate and various types of tree diseases, and the potential effects of climate change on pathogens in western North American forests is provided. Climate change generally will lead to reductions in tree health and will improve conditions for some highly damaging pathogens. Sections on abiotic diseases, declines,...
Prevalence, incidence and risk factors of heifer mastitis.
Fox, L K
2009-02-16
Traditionally heifers, as calves and as primiparae, have been thought of as a group as free of mastitis. Without appreciable lacteal secretion, there is reduced nutrient fluid available to support growth of intramammary pathogens. Contagious mastitis is primarily transmitted at milking time and the milking process affects the patency of the teat orifice which can increase the risk of development of environmental mastitis. Logically therefore prepartum heifers should be free of intramammary infections. During the last 20 years there have been numerous investigations describing the nature of mastitis in heifers and thus the dogma that heifers are free of this disease has been challenged. The purpose of this manuscript is to review that literature describing heifer intramammary infections that cause both subclinical and clinical disease. Mammary quarter infection prevalence ranges between 28.9-74.6% prepartum, and 12.3-45.5% at parturition. Generally, the pathogens that cause mastitis in heifers are the same as those that cause infections in the older cows. In all but one study reviewed, coagulase-negative staphylococci (CNS) are the most prevalent cause of subclinical intramammary infections in heifers. Coagulase-positive staphylococci (CPS) in some studies are the second most prevalent pathogens, while in other studies the environmental mastitis pathogens are more prevalent. The risk factors for subclinical mastitis appear to be season, herd location, and trimester of pregnancy; all suggesting that management can have an impact in control of this disease prepartum. With respect to clinical mastitis, the most prevalent mastitis pathogen has been reported to be CNS in one study and CPS, or environmental mastitis pathogens, in other studies. The heifer is most at risk for clinical mastitis during the periparturient period. Risk factors found are related to diet, mammary gland factors such as edema and leaking of milk, and factors associated with the change in management and introduction of the heifer to the milking herd.
Prasai, Tanka P.; Walsh, Kerry B.; Bhattarai, Surya P.; Midmore, David J.; Van, Thi T. H.; Moore, Robert J.; Stanley, Dragana
2016-01-01
A range of feed supplements, including antibiotics, have been commonly used in poultry production to improve health and productivity. Alternative methods are needed to suppress pathogen loads and maintain productivity. As an alternative to antibiotics use, we investigated the ability of biochar, bentonite and zeolite as separate 4% feed additives, to selectively remove pathogens without reducing microbial richness and diversity in the gut. Neither biochar, bentonite nor zeolite made any significant alterations to the overall richness and diversity of intestinal bacterial community. However, reduction of some bacterial species, including some potential pathogens was detected. The microbiota of bentonite fed animals were lacking all members of the order Campylobacterales. Specifically, the following operational taxonomic units (OTUs) were absent: an OTU 100% identical to Campylobacter jejuni; an OTU 99% identical to Helicobacter pullorum; multiple Gallibacterium anatis (>97%) related OTUs; Bacteroides dorei (99%) and Clostridium aldenense (95%) related OTUs. Biochar and zeolite treatments had similar but milder effects compared to bentonite. Zeolite amended feed was also associated with significant reduction in the phylum Proteobacteria. All three additives showed potential for the control of major poultry zoonotic pathogens. PMID:27116607
Cleanliness in context: reconciling hygiene with a modern microbial perspective.
Vandegrift, Roo; Bateman, Ashley C; Siemens, Kyla N; Nguyen, May; Wilson, Hannah E; Green, Jessica L; Van Den Wymelenberg, Kevin G; Hickey, Roxana J
2017-07-14
The concept of hygiene is rooted in the relationship between cleanliness and the maintenance of good health. Since the widespread acceptance of the germ theory of disease, hygiene has become increasingly conflated with sterilization. In reviewing studies across the hygiene literature (most often hand hygiene), we found that nearly all studies of hand hygiene utilize bulk reduction in bacterial load as a proxy for reduced transmission of pathogenic organisms. This treatment of hygiene may be insufficient in light of recent microbial ecology research, which has demonstrated that humans have intimate and evolutionarily significant relationships with a diverse assemblage of microorganisms (our microbiota). The human skin is home to a diverse and specific community of microorganisms, which include members that exist across the ecological spectrum from pathogen through commensal to mutualist. Most evidence suggests that the skin microbiota is likely of direct benefit to the host and only rarely exhibits pathogenicity. This complex ecological context suggests that the conception of hygiene as a unilateral reduction or removal of microbes has outlived its usefulness. As such, we suggest the explicit definition of hygiene as "those actions and practices that reduce the spread or transmission of pathogenic microorganisms, and thus reduce the incidence of disease."
Qi, Zhongqiang; Wang, Qi; Dou, Xianying; Wang, Wei; Zhao, Qian; Lv, Ruili; Zhang, Haifeng; Zheng, Xiaobo; Wang, Ping; Zhang, Zhengguang
2011-01-01
Magnaporthe oryzae MAPK MoMps1 plays a critical role in regulating various developmental processes including cell wall integrity, stress responses, and pathogenicity. To identify potential effectors of MoMps1, we characterized the function of MoSwi6, a homolog of Saccharomyces cerevisiae Swi6 downstream of MAPK Slt2 signaling. MoSwi6 interacted with MoMps1 both in vivo and in vitro, suggesting a possible functional link analogous to Swi6-Slt2 in S. cerevisiae. Targeted gene disruption of MoSWI6 resulted in multiple developmental defects, including reduced hyphal growth, abnormal formation of conidia and appressoria, and impaired appressorium function. The reduction in appressorial turgor pressure also contributed to an attenuation of pathogenicity. The ΔMoswi6 mutant also displayed a defect in cell wall integrity, was hypersensitive to the oxidative stress, and showed significant reduction in transcription and activities of extracellular enzymes including peroxidases and laccases. Collectively, these roles are similar to those of MoMps1, confirming that MoSwi6 functions in the MoMps1 pathway to govern growth, development, and full pathogenicity. PMID:22321443
Ha, Jae-Won; Back, Kyeong-Hwan; Kim, Yoon-Hee; Kang, Dong-Hyun
2016-08-01
In this study, the efficacy of using UV-C light to inactivate sliced cheese inoculated with Escherichia coli O157:H7, Salmonella Typhimurium, and Listeria monocytogenes and, packaged with 0.07 mm films of polyethylene terephthalate (PET), polyvinylchloride (PVC), polypropylene (PP), and polyethylene (PE) was investigated. The results show that compared with PET and PVC, PP and PE films showed significantly reduced levels of the three pathogens compared to inoculated but non-treated controls. Therefore, PP and PE films of different thicknesses (0.07 mm, 0.10 mm, and 0.13 mm) were then evaluated for pathogen reduction of inoculated sliced cheese samples. Compared with 0.10 and 0.13 mm, 0.07 mm thick PP and PE films did not show statistically significant reductions compared to non-packaged treated samples. Moreover, there were no statistically significant differences between the efficacy of PP and PE films. These results suggest that adjusted PP or PE film packaging in conjunction with UV-C radiation can be applied to control foodborne pathogens in the dairy industry. Copyright © 2016. Published by Elsevier Ltd.
Land Application of Wastes: An Educational Program. Pathogens - Module 9.
ERIC Educational Resources Information Center
Clarkson, W. W.; And Others
This module is intended to help engineers evaluate the relative health risks from pathogens at land treatment sites versus conventional waste treatment systems. Among the topics considered are the following: (1) the relationship between survival time of pathogens and the chance of disease transmission to humans; (2) the factors that favor survival…
Exploitation of microbial forensics and nanotechnology for the monitoring of emerging pathogens.
Bokhari, Habib
2018-03-07
Emerging infectious diseases remain among the leading causes of global mortality. Traditional laboratory diagnostic approaches designed to detect and track infectious disease agents provide a framework for surveillance of bio threats. However, surveillance and outbreak investigations using such time-consuming approaches for early detection of pathogens remain the major pitfall. Hence, reasonable real-time surveillance systems to anticipate threats to public health and environment are critical for identifying specific aetiologies and preventing the global spread of infectious disease. The current review discusses the growing need for monitoring and surveillance of pathogens with the same zeal and approach as adopted by microbial forensics laboratories, and further strengthening it by integrating with the innovative nanotechnology for rapid detection of microbial pathogens. Such innovative diagnostics platforms will help to track pathogens from high risk areas and environment by pre-emptive approach that will minimize damages. The various scenarios with the examples are discussed where the high risk associated human pathogens in particular were successfully detected using various nanotechnology approaches with potential future prospects in the field of microbial forensics.
Carratalà, Anna; Rodriguez-Manzano, Jesús; Hundesa, Ayalkibet; Rusiñol, Marta; Fresno, Sandra; Cook, Nigel; Girones, Rosina
2013-06-17
Determining the stability, or persistence in an infectious state, of foodborne viral pathogens attached to surfaces of soft fruits and salad vegetables is essential to underpin risk assessment studies in food safety. Here, we evaluate the effect of temperature and sunlight on the stability of infectious human adenoviruses type 2 and MS2 bacteriophages on lettuce and strawberry surfaces as representative fresh products. Human adenoviruses have been selected because of their double role as viral pathogens and viral indicators of human fecal contamination. Stability assays were performed with artificially contaminated fresh samples kept in the dark or under sunlight exposure at 4 and 30°C over 24h. The results indicate that temperature is the major factor affecting HAdV stability in fresh produce surfaces, effecting decay between 3 and 4 log after 24h at 30°C. The inactivation times to achieve a reduction between 1 and 4-log are calculated for each experimental condition. This work provides useful information to be considered for improving food safety regarding the transmission of foodborne viruses through supply chains. Copyright © 2013 Elsevier B.V. All rights reserved.
Kidd, Timothy J.; Geake, James B.; Bell, Scott C.; Currie, Bart J.
2017-01-01
ABSTRACT Cystic fibrosis (CF) is a genetic disorder characterized by progressive lung function decline. CF patients are at an increased risk of respiratory infections, including those by the environmental bacterium Burkholderia pseudomallei, the causative agent of melioidosis. Here, we compared the genomes of B. pseudomallei isolates collected between ~4 and 55 months apart from seven chronically infected CF patients. Overall, the B. pseudomallei strains showed evolutionary patterns similar to those of other chronic infections, including emergence of antibiotic resistance, genome reduction, and deleterious mutations in genes involved in virulence, metabolism, environmental survival, and cell wall components. We documented the first reported B. pseudomallei hypermutators, which were likely caused by defective MutS. Further, our study identified both known and novel molecular mechanisms conferring resistance to three of the five clinically important antibiotics for melioidosis treatment. Our report highlights the exquisite adaptability of microorganisms to long-term persistence in their environment and the ongoing challenges of antibiotic treatment in eradicating pathogens in the CF lung. Convergent evolution with other CF pathogens hints at a degree of predictability in bacterial evolution in the CF lung and potential targeted eradication of chronic CF infections in the future. PMID:28400528
Codigestion of manure and organic wastes in centralized biogas plants: status and future trends.
Angelidaki, I; Ellegaard, L
2003-01-01
Centralized biogas plants in Denmark codigest mainly manure, together with other organic waste such as industrial organic waste, source sorted household waste, and sewage sludge. Today 22 large-scale centralized biogas plants are in operation in Denmark, and in 2001 they treated approx 1.2 million tons of manure as well as approx 300,000 of organic industrial waste. Besides the centralized biogas plants there are a large number of smaller farm-scale plants. The long-term energy plan objective is a 10-fold increase of the 1998 level of biogas production by the year 2020. This will help to achieve a target of 12-14% of the national energy consumption being provided by renewable energy by the year 2005 and 33% by the year 2030. A major part of this increase is expected to come from new centralized biogas plants. The annual potential for biogas production from biomass resources available in Denmark is estimated to be approx 30 Peta Joule (PJ). Manure comprises about 80% of this potential. Special emphasis has been paid to establishing good sanitation and pathogen reduction of the digested material, to avoid risk of spreading pathogens when applying the digested manure as fertilizer to agricultural soils.
Wang, Jing; Chen, Lin; Zhou, Cong; Wang, Li; Xie, Hanbin; Xiao, Yuanyuan; Zhu, Hongmei; Hu, Ting; Zhang, Zhu; Zhu, Qian; Liu, Zhiying; Liu, Shanlin; Wang, He; Xu, Mengnan; Ren, Zhilin; Yu, Fuli; Cram, David S; Liu, Hongqian
2018-05-28
Next generation sequencing (NGS) is emerging as a viable alternative to chromosome microarray analysis for the diagnosis of chromosome disease syndromes. One NGS methodology, copy number variation sequencing (CNV-Seq), has been shown to deliver high reliability, accuracy and reproducibility for detection of fetal CNVs in prenatal samples. However, its clinical utility as a first tier diagnostic method has yet to be demonstrated in a large cohort of pregnant women referred for fetal chromosome testing. To evaluate CNV-Seq as a first tier diagnostic method for detection of fetal chromosome anomalies in a general population of pregnant women with high-risk prenatal indications. Prospective analysis of 3429 pregnant women referred for amniocentesis and fetal chromosome testing for different risk indications, including advanced maternal age (AMA), high-risk maternal serum screening (HR-MSS), and positivity for an ultrasound soft marker (USM). Amniocentesis was performed by standard procedures. Amniocyte DNA was analyzed by CNV-Seq with a chromosome resolution of 0.1 Mb. Fetal chromosome anomalies including whole chromosome aneuploidy and segmental imbalances were independently confirmed by gold standard cytogenetic and molecular methods and their pathogenicity determined following guidelines of the American College of Medical Genetics for sequence variants. Clear interpretable CNV-Seq results were obtained for all 3429 amniocentesis samples. CNV-Seq identified 3293 (96%) samples with a normal molecular karyotype and 136 samples (4%) with an altered molecular karyotype. A total of 146 fetal chromosome anomalies were detected, comprising 46 whole chromosome aneuploidies (pathogenic), 29 submicroscopic microdeletions/microduplications with known or suspected associations with chromosome disease syndromes (pathogenic), 22 other microdeletions/microduplications (likely pathogenic) and 49 variants of uncertain significance (VUS). Overall, the cumulative frequency of pathogenic/likely pathogenic and VUS chromosome anomalies in the patient cohort was 2.83% and 1.43%, respectively. In the three high-risk AMA, HR-MSS and USM groups, the most common whole chromosome aneuploidy detected was trisomy 21, followed by sex chromosome aneuploidies, trisomy 18 and trisomy 13. Across all clinical indications, there was a similar incidence of submicroscopic CNVs, with approximately equal proportions of pathogenic/likely pathogenic and VUS CNVs. If karyotyping had been used as an alternate cytogenetics detection method, CNV-Seq would have returned a 1% higher yield of pathogenic or likely pathogenic CNVs. In a large prospective clinical study, CNV-Seq delivered high reliability and accuracy for identifying clinically significant fetal anomalies in prenatal samples. Based on key performance criteria, CNV-Seq appears to be a well-suited methodology for first tier diagnosis of pregnant women in the general population at risk of having a fetal chromosome abnormality. Copyright © 2018. Published by Elsevier Inc.
Fournier, A; Young, I; Rajić, A; Greig, J; LeJeune, J
2015-09-01
Wildlife is a known reservoir of pathogenic bacteria, including Mycobacterium bovis and Brucella spp. Transmission of these pathogens between wildlife and food animals can lead to damaging impacts on the agri-food industry and public health. Several international case studies have highlighted the complex and cross-sectoral challenges involved in preventing and managing these potential transmission risks. The objective of our study was to develop a better understanding of the socio-economic aspects of the transmission of pathogenic bacteria between wildlife and food animals to support more effective and sustainable risk mitigation strategies. We conducted qualitative thematic analysis on a purposive sample of 30/141 articles identified in a complementary scoping review of the literature in this area and identified two key themes. The first related to the framing of this issue as a 'wicked problem' that depends on a complex interaction of social factors and risk perceptions, governance and public policy, and economic implications. The second theme consisted of promising approaches and strategies to prevent and mitigate the potential risks from transmission of pathogenic bacteria between wildlife and food animals. These included participatory, collaborative and multidisciplinary decision-making approaches and the proactive incorporation of credible scientific evidence and local contextual factors into solutions. The integration of these approaches to address 'wicked problems' in this field may assist stakeholders and decision-makers in improving the acceptability and sustainability of future strategies to reduce the transmission of pathogenic bacteria between wildlife and food animals. © 2015 Zoonoses and Public Health © 2015 Her Majesty the Queen in Right of Canada Reproduced with the permission of the Minister of the Public Health Agency of Canada.
QMRA for Drinking Water: 2. The Effect of Pathogen Clustering in Single-Hit Dose-Response Models.
Nilsen, Vegard; Wyller, John
2016-01-01
Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson-distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional "single-hit" dose-response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose-response models in terms of probability generating functions. It is shown formally that the theoretical single-hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single-hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single-hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose-response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose-response assessment as well as practical risk characterization are discussed. © 2016 Society for Risk Analysis.
Zeolite food supplementation reduces abundance of enterobacteria.
Prasai, Tanka P; Walsh, Kerry B; Bhattarai, Surya P; Midmore, David J; Van, Thi T H; Moore, Robert J; Stanley, Dragana
2017-01-01
According to the World Health Organisation, antibiotics are rapidly losing potency in every country of the world. Poultry are currently perceived as a major source of pathogens and antimicrobial resistance. There is an urgent need for new and natural ways to control pathogens in poultry and humans alike. Porous, cation rich, aluminosilicate minerals, zeolites can be used as a feed additive in poultry rations, demonstrating multiple productivity benefits. Next generation sequencing of the 16S rRNA marker gene was used to phylogenetically characterize the fecal microbiota and thus investigate the ability and dose dependency of zeolite in terms of anti-pathogenic effects. A natural zeolite was used as a feed additive in laying hens at 1, 2, and 4% w/w for a 23 week period. At the end of this period cloacal swabs were collected to sample faecal microbial communities. A significant reduction in carriage of bacteria within the phylum Proteobacteria, especially in members of the pathogen-rich family Enterobacteriaceae, was noted across all three concentrations of zeolite. Zeolite supplementation of feed resulted in a reduction in the carriage of a number of poultry pathogens without disturbing beneficial bacteria. This effect was, in some phylotypes, correlated with the zeolite concentration. This result is relevant to zeolite feeding in other animal production systems, and for human pathogenesis. Copyright © 2016 Elsevier GmbH. All rights reserved.
Bara, Jeffrey; Rapti, Zoi; Cáceres, Carla E; Muturi, Ephantus J
2015-01-01
Despite the growing awareness that larval competition can influence adult mosquito life history traits including susceptibility to pathogens, the net effect of larval competition on human risk of exposure to mosquito-borne pathogens remains poorly understood. We examined how intraspecific larval competition affects dengue-2 virus (DENV-2) extrinsic incubation period and vectorial capacity of its natural vector Aedes albopictus. Adult Ae. albopictus from low and high-larval density conditions were orally challenged with DENV-2 and then assayed for virus infection and dissemination rates following a 6, 9, or 12-day incubation period using real-time quantitative reverse transcription PCR. We then modeled the effect of larval competition on vectorial capacity using parameter estimates obtained from peer-reviewed field and laboratory studies. Larval competition resulted in significantly longer development times, lower emergence rates, and smaller adults, but did not significantly affect the extrinsic incubation period of DENV-2 in Ae. albopictus. Our vectorial capacity models suggest that the effect of larval competition on adult mosquito longevity likely has a greater influence on vectorial capacity relative to any competition-induced changes in vector competence. Furthermore, we found that large increases in the viral dissemination rate may be necessary to compensate for small competition-induced reductions in daily survivorship. Our results indicate that mosquito populations that experience stress from larval competition are likely to have a reduced vectorial capacity, even when susceptibility to pathogens is enhanced.
Klerks, M M; van Gent-Pelzer, M; Franz, E; Zijlstra, C; van Bruggen, A H C
2007-08-01
This paper describes the physiological and molecular interactions between the human-pathogenic organism Salmonella enterica serovar Dublin and the commercially available mini Roman lettuce cv. Tamburo. The association of S. enterica serovar Dublin with lettuce plants was first determined, which indicated the presence of significant populations outside and inside the plants. The latter was evidenced from significant residual concentrations after highly efficient surface disinfection (99.81%) and fluorescence microscopy of S. enterica serovar Dublin in cross sections of lettuce at the root-shoot transition region. The plant biomass was reduced significantly compared to that of noncolonized plants upon colonization with S. enterica serovar Dublin. In addition to the physiological response, transcriptome analysis by cDNA amplified fragment length polymorphism analysis also provided clear differential gene expression profiles between noncolonized and colonized lettuce plants. From these, generally and differentially expressed genes were selected and identified by sequence analysis, followed by reverse transcription-PCR displaying the specific gene expression profiles in time. Functional grouping of the expressed genes indicated a correlation between colonization of the plants and an increase in expressed pathogenicity-related genes. This study indicates that lettuce plants respond to the presence of S. enterica serovar Dublin at physiological and molecular levels, as shown by the reduction in growth and the concurrent expression of pathogenicity-related genes. In addition, it was confirmed that Salmonella spp. can colonize the interior of lettuce plants, thus potentially imposing a human health risk when processed and consumed.
Bara, Jeffrey; Rapti, Zoi; Cáceres, Carla E.; Muturi, Ephantus J.
2015-01-01
Despite the growing awareness that larval competition can influence adult mosquito life history traits including susceptibility to pathogens, the net effect of larval competition on human risk of exposure to mosquito-borne pathogens remains poorly understood. We examined how intraspecific larval competition affects dengue-2 virus (DENV-2) extrinsic incubation period and vectorial capacity of its natural vector Aedes albopictus. Adult Ae. albopictus from low and high-larval density conditions were orally challenged with DENV-2 and then assayed for virus infection and dissemination rates following a 6, 9, or 12-day incubation period using real-time quantitative reverse transcription PCR. We then modeled the effect of larval competition on vectorial capacity using parameter estimates obtained from peer-reviewed field and laboratory studies. Larval competition resulted in significantly longer development times, lower emergence rates, and smaller adults, but did not significantly affect the extrinsic incubation period of DENV-2 in Ae. albopictus. Our vectorial capacity models suggest that the effect of larval competition on adult mosquito longevity likely has a greater influence on vectorial capacity relative to any competition-induced changes in vector competence. Furthermore, we found that large increases in the viral dissemination rate may be necessary to compensate for small competition-induced reductions in daily survivorship. Our results indicate that mosquito populations that experience stress from larval competition are likely to have a reduced vectorial capacity, even when susceptibility to pathogens is enhanced. PMID:25951173
Benami, Maya; Busgang, Allison; Gillor, Osnat; Gross, Amit
2016-08-15
Greywater (GW) reuse can alleviate water stress by lowering freshwater consumption. However, GW contains pathogens that may compromise public health. During the GW-treatment process, bioaerosols can be produced and may be hazardous to human health if inhaled, ingested, or come in contact with skin. Using air-particle monitoring, BioSampler®, and settle plates we sampled bioaerosols emitted from recirculating vertical flow constructed wetlands (RVFCW) - a domestic GW-treatment system. An array of pathogens and indicators were monitored using settle plates and by culturing the BioSampler® liquid. Further enumeration of viable pathogens in the BioSampler® liquid utilized a newer method combining the benefits of enrichment with molecular detection (MPN-qPCR). Additionally, quantitative microbial risk assessment (QMRA) was applied to assess risks of infection from a representative skin pathogen, Staphylococcus aureus. According to the settle-plate technique, low amounts (0-9.7×10(4)CFUm(-2)h(-1)) of heterotrophic bacteria, Staphylococcus spp., Pseudomonas spp., Klebsiella pneumoniae, Enterococcus spp., and Escherichia coli were found to aerosolize up to 1m away from the GW systems. At the 5m distance amounts of these bacteria were not statistically different (p>0.05) from background concentrations tested over 50m away from the systems. Using the BioSampler®, no bacteria were detected before enrichment of the GW-aerosols. However, after enrichment, using an MPN-qPCR technique, viable indicators and pathogens were occasionally detected. Consequently, the QMRA results were below the critical disability-adjusted life year (DALY) safety limits, a measure of overall disease burden, for S. aureus under the tested exposure scenarios. Our study suggests that health risks from aerosolizing pathogens near RVFCW GW-treatment systems are likely low. This study also emphasizes the growing need for standardization of bioaerosol-evaluation techniques to provide more accurate quantification of small amounts of viable, aerosolized bacterial pathogens. Copyright © 2016 Elsevier B.V. All rights reserved.
40 CFR 503.27 - Recordkeeping.
Code of Federal Regulations, 2010 CFR
2010-07-01
... penalty of law, that the information that will be used to determine compliance with the pathogen... requirements is met) and the vector attraction reduction requirement in (insert one of the vector attraction... met. (iv) A description of how one of the vector attraction reduction requirements in § 503.33 (b)(1...
40 CFR 503.27 - Recordkeeping.
Code of Federal Regulations, 2014 CFR
2014-07-01
... penalty of law, that the information that will be used to determine compliance with the pathogen... requirements is met) and the vector attraction reduction requirement in (insert one of the vector attraction... met. (iv) A description of how one of the vector attraction reduction requirements in § 503.33 (b)(1...
40 CFR 503.27 - Recordkeeping.
Code of Federal Regulations, 2012 CFR
2012-07-01
... penalty of law, that the information that will be used to determine compliance with the pathogen... requirements is met) and the vector attraction reduction requirement in (insert one of the vector attraction... met. (iv) A description of how one of the vector attraction reduction requirements in § 503.33 (b)(1...
40 CFR 503.27 - Recordkeeping.
Code of Federal Regulations, 2013 CFR
2013-07-01
... penalty of law, that the information that will be used to determine compliance with the pathogen... requirements is met) and the vector attraction reduction requirement in (insert one of the vector attraction... met. (iv) A description of how one of the vector attraction reduction requirements in § 503.33 (b)(1...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-03
... reduction and Hazard Analysis and Critical Control Point (HACCP) Systems requirements because OMB approval... February 28, 2014. FSIS has established requirements applicable to meat and poultry establishments designed.... coli by slaughter establishments to verify the adequacy of the establishment's process controls for the...
Effect of radiant catalytic ionization on reduction of foodborne pathogens on beef
USDA-ARS?s Scientific Manuscript database
The objective of this study was to evaluate the effect of radiant catalytic ionization (RCI) on reduction of Shiga toxin-producing Escherichia coli (STEC) as well as antimicrobial resistant (AMR) and non-AMR Salmonella strains on inoculated beef flanks. The RCI technology utilizes a combination of U...
An assay was developed to assess the ability of oyster, Crassostrea virginica, hemocytes to kill the human pathogenic bacterium, Vibrio parahaemolyticus (ATCC 17802). Bacterial killing was estimated colorimetrically by the enzymatic reduction of a tetrazolium dye, 3-(4,5-dimethyl...
Pangloli, Philipus; Hung, Yen-Con
2011-08-01
The objective of this study was to evaluate the efficacy of slightly acidic electrolyzed (SAEO) water in killing or removing Escherichia coli O157:H7 on iceberg lettuce and tomatoes by washing and chilling treatment simulating protocols used in food service kitchens. Whole lettuce leaves and tomatoes were spot-inoculated with 100 μL of a mixture of 5 strains of E. coli O157:H7. Washing lettuce with SAEO water for 15 s reduced the pathogen by 1.4 to 1.6 log CFU/leaf, but the treatments did not completely inactivate the pathogen in the wash solution. Increasing the washing time to 30 s increased the reductions to 1.7 to 2.3 log CFU/leaf. Sequential washing in SAEO water for 15 s and then chilling in SAEO water for 15 min also increased the reductions to 2.0 to 2.4 log CFU/leaf, and no cell survived in chilling solution after treatment. Washing tomatoes with SAEO water for 8 s reduced E. coli O157:H7 by 5.4 to 6.3 log CFU/tomato. The reductions were increased to 6.6 to 7.6 log CFU/tomato by increasing the washing time to 15 s. Results suggested that application of SAEO water to wash and chill lettuce and tomatoes in food service kitchens could minimize cross-contamination and reduce the risk of E. coli O157:H7 present on the produce. SAEO water is equally or slightly better than acidic electrolyzed (AEO) water for inactivation of bacteria on lettuce and tomato surfaces. In addition, SAEO water may have the advantages over AEO water on its stability, no chlorine smell, and low corrosiveness. Therefore, SAEO water may have potential for produce wash to enhance food safety. © 2011 Institute of Food Technologists®
Hosseini, Parviez R; Mills, James N; Prieur-Richard, Anne-Hélène; Ezenwa, Vanessa O; Bailly, Xavier; Rizzoli, Annapaola; Suzán, Gerardo; Vittecoq, Marion; García-Peña, Gabriel E; Daszak, Peter; Guégan, Jean-François; Roche, Benjamin
2017-06-05
Biodiversity is of critical value to human societies, but recent evidence that biodiversity may mitigate infectious-disease risk has sparked controversy among researchers. The majority of work on this topic has focused on direct assessments of the relationship between biodiversity and endemic-pathogen prevalence, without disentangling intervening mechanisms; thus study outcomes often differ, fuelling more debate. Here, we suggest two critical changes to the approach researchers take to understanding relationships between infectious disease, both endemic and emerging, and biodiversity that may help clarify sources of controversy. First, the distinct concepts of hazards versus risks need to be separated to determine how biodiversity and its drivers may act differently on each. This distinction is particularly important since it illustrates that disease emergence drivers in humans could be quite different to the general relationship between biodiversity and transmission of endemic pathogens. Second, the interactive relationship among biodiversity, anthropogenic change and zoonotic disease risk, including both direct and indirect effects, needs to be recognized and accounted for. By carefully disentangling these interactions between humans' activities and pathogen circulation in wildlife, we suggest that conservation efforts could mitigate disease risks and hazards in novel ways that complement more typical disease control efforts.This article is part of the themed issue 'Conservation, biodiversity and infectious disease: scientific evidence and policy implications'. © 2017 The Author(s).
Risks to farm animals from pathogens in composted catering waste containing meat.
Gale, P
2004-07-17
Uncooked meat may contain animal pathogens, including bovine spongiform encephalopathy, foot-and-mouth disease virus, African swine fever virus and classical swine fever virus, and to prevent outbreaks of these diseases in farm animals, the disposal of meat from catering waste is controlled under the Animal By-Products Regulations. This paper estimates the risks to farm animals of grazing land on to which compost, produced by the composting of catering waste containing meat, has been applied. The factors controlling the level of risk are the separation of the meat at source, the efficiency of the composting process, and the decay and dilution of the pathogens in soil. The net pathogen destruction by the composting process is determined largely by the degree of bypass, and to accommodate the possibility of large joints or even whole carcases being discarded uncooked to catering waste, a time/temperature condition of 60 degrees C for two days is recommended. Where data are lacking, worst-case assumptions have been applied. According to the model, classical swine fever virus constitutes the highest risk, but the assessment shows that a two-barrier composting approach, together with a two-month grazing ban, reduces the risk to one infection in pigs every 190 years in England and Wales. This work defined the operational conditions for the composting of catering waste as set out in the Animal By-Products Regulations 2003 (SI 1482).
Hosseini, Parviez R.; Mills, James N.; Prieur-Richard, Anne-Hélène; Bailly, Xavier; Rizzoli, Annapaola; Suzán, Gerardo; Vittecoq, Marion; Daszak, Peter; Guégan, Jean-François
2017-01-01
Biodiversity is of critical value to human societies, but recent evidence that biodiversity may mitigate infectious-disease risk has sparked controversy among researchers. The majority of work on this topic has focused on direct assessments of the relationship between biodiversity and endemic-pathogen prevalence, without disentangling intervening mechanisms; thus study outcomes often differ, fuelling more debate. Here, we suggest two critical changes to the approach researchers take to understanding relationships between infectious disease, both endemic and emerging, and biodiversity that may help clarify sources of controversy. First, the distinct concepts of hazards versus risks need to be separated to determine how biodiversity and its drivers may act differently on each. This distinction is particularly important since it illustrates that disease emergence drivers in humans could be quite different to the general relationship between biodiversity and transmission of endemic pathogens. Second, the interactive relationship among biodiversity, anthropogenic change and zoonotic disease risk, including both direct and indirect effects, needs to be recognized and accounted for. By carefully disentangling these interactions between humans' activities and pathogen circulation in wildlife, we suggest that conservation efforts could mitigate disease risks and hazards in novel ways that complement more typical disease control efforts. This article is part of the themed issue ‘Conservation, biodiversity and infectious disease: scientific evidence and policy implications’. PMID:28438918
The risk of sustained sexual transmission of Zika is underestimated
2017-01-01
Pathogens often follow more than one transmission route during outbreaks—from needle sharing plus sexual transmission of HIV to small droplet aerosol plus fomite transmission of influenza. Thus, controlling an infectious disease outbreak often requires characterizing the risk associated with multiple mechanisms of transmission. For example, during the Ebola virus outbreak in West Africa, weighing the relative importance of funeral versus health care worker transmission was essential to stopping disease spread. As a result, strategic policy decisions regarding interventions must rely on accurately characterizing risks associated with multiple transmission routes. The ongoing Zika virus (ZIKV) outbreak challenges our conventional methodologies for translating case-counts into route-specific transmission risk. Critically, most approaches will fail to accurately estimate the risk of sustained sexual transmission of a pathogen that is primarily vectored by a mosquito—such as the risk of sustained sexual transmission of ZIKV. By computationally investigating a novel mathematical approach for multi-route pathogens, our results suggest that previous epidemic threshold estimates could under-estimate the risk of sustained sexual transmission by at least an order of magnitude. This result, coupled with emerging clinical, epidemiological, and experimental evidence for an increased risk of sexual transmission, would strongly support recent calls to classify ZIKV as a sexually transmitted infection. PMID:28934370
Hamilton, A J; Stagnitti, F S; Premier, R; Boland, A M
2006-01-01
The use of reclaimed wastewater for irrigation of horticultural crops is commonplace in many parts of the world and is likely to increase. Concerns about risks to human health arising from such practice, especially with respect to infection with microbial pathogens, are common. Several factors need to be considered when attempting to quantify the risk posed to a population, such as the concentration of pathogens in the source water, water treatment efficiency, the volume of water coming into contact with the crop, and the die-off rate of pathogens in the environment. Another factor, which has received relatively less attention, is the amount of food consumed. Plainly, higher consumption rates place one at greater risk of becoming infected. The amount of vegetables consumed is known to vary among ethic groups. We use Quantitative Microbial Risk Assessment Modelling (QMRA) to see if certain ethnic groups are exposed to higher risks by virtue of their consumption behaviour. The results suggest that despite the disparities in consumption rates by different ethnic groups they generally all faced comparable levels of risks. We conclude by suggesting that QMRA should be used to assess the relative levels of risk faced by groups based on divisions other than ethnicity, such as those with compromised immune systems.
Host pathogen relations: exploring animal models for fungal pathogens.
Harwood, Catherine G; Rao, Reeta P
2014-06-30
Pathogenic fungi cause superficial infections but pose a significant public health risk when infections spread to deeper tissues, such as the lung. Within the last three decades, fungi have been identified as the leading cause of nosocomial infections making them the focus of research. This review outlines the model systems such as the mouse, zebrafish larvae, flies, and nematodes, as well as ex vivo and in vitro systems available to study common fungal pathogens.
[Listeria monocytogenes outbreaks: a review of the routes that favor bacterial presence].
Rossi, M Laura; Paiva, Analía; Tornese, Mariela; Chianelli, Sabrina; Troncoso, Alcides
2008-10-01
Listeria monocytogenes is a foodborne pathogen that causes serious invasive illness, mainly in certain well-defined high-risk groups, including immunocompromised patients, pregnant women and neonates. L. monocytogenes primarily causes abortion, septicaemia or infections of the central nervous systems. Listeriosis outbreaks have mostly been linked to consumption of raw milk or cheese made of unpasteurized milk. Previous outbreaks of listeriosis have been linked to a variety of foods especially processed meats (such as hot dogs, deli meats, and páté). The public health importance of listeriosis is not always recognized, particularly since listeriosis is a relatively rare disease compared with other common foodborne illnesses such as salmonellosis or botulism. However, because of its high case fatality rate, listeriosis ranks among the most frequent causes of death due to foodborne illness: second after salmonellosis. Changes in the manner food is produced, distributed and stored have created the potential for widespread outbreaks involving many countries. The pasteurization of raw milk, which destroys L. monocytogenes, does not eliminate later risk of L. monocytogenes contamination in dairy products. Extensive work has been ongoing in many countries during the last decade to prevent outbreaks and decrease the incidence of listeriosis. A marked reduction has occurred in its incidence in some of these countries during the 1990s, suggesting a relationship between preventive measures and reduction on human cases listeriosis.
Eandi, Jonathan A; Nanigian, Dana K; Smith, William H; Low, Roger K
2008-12-01
The transmission risk to surgeons performing percutaneous renal surgery on patients who are infected with human immunodeficiency virus/acquired immunodeficiency syndrome, hepatitis B, or hepatitis C is unknown. A recent study found 55% of surgeons' masks contain evidence of blood splash contamination after percutaneous nephrolithotomy. While the risk of infectious disease transmission to the surgeon after mucocutaneous exposure is unknown, the incapacitating disease these pathogens cause can have a devastating and permanent effect on a surgeon's career. We describe our use of a surgical helmet system when performing percutaneous renal surgery on high-risk patients to minimize risk of splash injury and transmission of blood-borne pathogens.
Fryk, Jesse J; Marks, Denese C; Hobson-Peters, Jody; Prow, Natalie A; Watterson, Daniel; Hall, Roy A; Young, Paul R; Reichenberg, Stefan; Sumian, Chryslain; Faddy, Helen M
2016-09-01
Arboviruses, such as dengue viruses (DENV) and chikungunya virus (CHIKV), pose a risk to the safe transfusion of blood components, including plasma. Pathogen inactivation is an approach to manage this transfusion transmission risk, with a number of techniques being used worldwide for the treatment of plasma. In this study, the efficacy of the THERAFLEX MB-Plasma system to inactivate all DENV serotypes (DENV-1, DENV-2, DENV-3, DENV-4) or CHIKV in plasma, using methylene blue and light illumination at 630 nm, was investigated. Pooled plasma units were spiked with DENV-1, DENV-2, DENV-3 DENV-4, or CHIKV and treated with the THERAFLEX MB-Plasma system at four light illumination doses: 20, 40, 60, and 120 (standard dose) J/cm(2) . Pre- and posttreatment samples were collected and viral infectivity was determined. The reduction in viral infectivity was calculated for each dose. Treatment of plasma with the THERAFLEX MB-Plasma system resulted in at least a 4.46-log reduction in all DENV serotypes and CHIKV infectious virus. The residual infectivity for each was at the detection limit of the assay used at 60 J/cm(2) , with dose dependency also observed. Our study demonstrated the THERAFLEX MB-Plasma system can reduce the infectivity of all DENV serotypes and CHIKV spiked into plasma to the detection limit of the assay used at half of the standard illumination dose. This suggests this system has the capacity to be an effective option for managing the risk of DENV or CHIKV transfusion transmission in plasma. © 2016 AABB.
Assavasilavasukul, Prapakorn; Lau, Boris L T; Harrington, Gregory W; Hoffman, Rebecca M; Borchardt, Mark A
2008-05-01
The presence of waterborne enteric pathogens in municipal water supplies contributes risk to public health. To evaluate the removal of these pathogens in drinking water treatment processes, previous researchers have spiked raw waters with up to 10(6) pathogens/L in order to reliably detect the pathogens in treated water. These spike doses are 6-8 orders of magnitude higher than pathogen concentrations routinely observed in practice. In the present study, experiments were conducted with different sampling methods (i.e., grab versus continuous sampling) and initial pathogen concentrations ranging from 10(1) to 10(6) pathogens/L. Results showed that Cryptosporidium oocyst and Giardia cyst removal across conventional treatment were dependent on initial pathogen concentrations, with lower pathogen removals observed when lower initial pathogen spike doses were used. In addition, higher raw water turbidity appeared to result in higher log removal for both Cryptosporidium oocysts and Giardia cysts.
Survival of selected foodborne pathogens on dry cured pork loins.
Morales-Partera, Ángela M; Cardoso-Toset, Fernando; Jurado-Martos, Francisco; Astorga, Rafael J; Huerta, Belén; Luque, Inmaculada; Tarradas, Carmen; Gómez-Laguna, Jaime
2017-10-03
The safety of ready-to-eat products such as cured pork loins must be guaranteed by the food industry. In the present study, the efficacy of the dry curing process of pork loins obtained from free-range pigs in the reduction of three of the most important foodborne pathogens is analysed. A total of 28 pork loin segments, with an average weight of 0.57±0.12kg, were divided into four groups with three being inoculated by immersion with 7logCFU/ml of either Salmonella Typhimurium, Campylobacter coli or Listeria innocua and the last one inoculated by immersion with sterile medium (control group). The loin segments were treated with a seasoning mixture of curing agents and spices, packed in a synthetic sausage casing and cured for 64days. Microbiological analysis, pH and water activity (a w ) were assessed at four stages. The values of pH and a w decreased with curing time as expected. S. Typhimurium and C. coli dropped significantly (3.28 and 2.14 log units, respectively), but limited reduction of L. innocua (0.84 log unit) was observed along the curing process. In our study, three factors were considered critical: the initial concentration of the bacteria, the progressive reduction of pH and the reduction of a w values. Our results encourage performing periodic analysis at different stages of the manufacturing of dry cured pork loins to ensure the absence of the three evaluated foodborne pathogens. Copyright © 2017 Elsevier B.V. All rights reserved.
Strayer, David R; Carter, William A; Stouch, Bruce C; Stittelaar, Koert J; Thoolen, Robert J M M; Osterhaus, Albert D M E; Mitchell, William M
2014-10-01
Using an established nonhuman primate model for H5N1 highly pathogenic influenza virus infection in humans, we have been able to demonstrate the prophylactic mitigation of the pulmonary damage characteristic of human fatal cases from primary influenza virus pneumonia with a low dose oral formulation of a commercially available parenteral natural human interferon alpha (Alferon N Injection®). At the highest oral dose (62.5IU/kg body weight) used there was a marked reduction in the alveolar inflammatory response with minor evidence of alveolar and interstitial edema in contrast to the hemorrhage and inflammatory response observed in the alveoli of control animals. The mitigation of severe damage to the lower pulmonary airway was observed without a parallel reduction in viral titers. Clinical trial data will be necessary to establish its prophylactic human efficacy for highly pathogenic influenza viruses. Copyright © 2014. Published by Elsevier B.V.
Saini, Parmesh K; Marks, Harry M; Dreyfuss, Moshe S; Evans, Peter; Cook, L Victor; Dessai, Uday
2011-08-01
Measuring commonly occurring, nonpathogenic organisms on poultry products may be used for designing statistical process control systems that could result in reductions of pathogen levels. The extent of pathogen level reduction that could be obtained from actions resulting from monitoring these measurements over time depends upon the degree of understanding cause-effect relationships between processing variables, selected output variables, and pathogens. For such measurements to be effective for controlling or improving processing to some capability level within the statistical process control context, sufficiently frequent measurements would be needed to help identify processing deficiencies. Ultimately the correct balance of sampling and resources is determined by those characteristics of deficient processing that are important to identify. We recommend strategies that emphasize flexibility, depending upon sampling objectives. Coupling the measurement of levels of indicator organisms with practical emerging technologies and suitable on-site platforms that decrease the time between sample collections and interpreting results would enhance monitoring process control.
Microbiological safety of drinking water: United States and global perspectives.
Ford, T E
1999-01-01
Waterborne disease statistics only begin to estimate the global burden of infectious diseases from contaminated drinking water. Diarrheal disease is dramatically underreported and etiologies seldom diagnosed. This review examines available data on waterborne disease incidence both in the United States and globally together with its limitations. The waterborne route of transmission is examined for bacterial, protozoal, and viral pathogens that either are frequently associated with drinking water (e.g., Shigella spp.), or for which there is strong evidence implicating the waterborne route of transmission (e.g., Leptospira spp.). In addition, crucial areas of research are discussed, including risks from selection of treatment-resistant pathogens, importance of environmental reservoirs, and new methodologies for pathogen-specific monitoring. To accurately assess risks from waterborne disease, it is necessary to understand pathogen distribution and survival strategies within water distribution systems and to apply methodologies that can detect not only the presence, but also the viability and infectivity of the pathogen. Images Figure 1 Figure 2 PMID:10229718
Systemic Analysis of Foodborne Disease Outbreak in Korea.
Lee, Jong-Kyung; Kwak, No-Seong; Kim, Hyun Jung
2016-02-01
This study systemically analyzed data on the prevalence of foodborne pathogens and foodborne disease outbreaks to identify the priorities of foodborne infection risk management in Korea. Multiple correspondence analysis was applied to three variables: origin of food source, phase of food supply chain, and 12 pathogens using 358 cases from 76 original papers and official reports published in 1998-2012. In addition, correspondence analysis of two variables--place and pathogen--was conducted based on epidemiological data of 2357 foodborne outbreaks in 2002-2011 provided by the Korean Ministry of Food and Drug Safety. The results of this study revealed three distinct areas of food monitoring: (1) livestock-derived raw food contaminated with Campylobacter spp., pathogenic Escherichia coli, Salmonella spp., and Listeria monocytogenes; (2) multi-ingredient and ready-to-eat food related to Staphylococcus aureus; and (3) water associated with norovirus. Our findings emphasize the need to track the sources and contamination pathways of foodborne pathogens for more effective risk management.
Himsworth, Chelsea G; Parsons, Kirbee L; Jardine, Claire; Patrick, David M
2013-06-01
Urban Norway and black rats (Rattus norvegicus and Rattus rattus) are the source of a number of pathogens responsible for significant human morbidity and mortality in cities around the world. These pathogens include zoonotic bacteria (Leptospira interrogans, Yersina pestis, Rickettsia typhi, Bartonella spp., Streptobacillus moniliformis), viruses (Seoul hantavirus), and parasites (Angiostrongylus cantonensis). A more complete understanding of the ecology of these pathogens in people and rats is critical for determining the public health risks associated with urban rats and for developing strategies to monitor and mitigate those risks. Although the ecology of rat-associated zoonoses is complex, due to the multiple ways in which rats, people, pathogens, vectors, and the environment may interact, common determinants of human disease can still be identified. This review summarizes the ecology of zoonoses associated with urban rats with a view to identifying similarities, critical differences, and avenues for further study.
Bacterial and parasitic diseases of parrots.
Doneley, Robert J T
2009-09-01
As wild-caught birds become increasingly rare in aviculture, there is a corresponding decline in the incidence of bacterial and parasitic problems and an increase in the recognition of the importance of maintaining health through better nutrition and husbandry. Nevertheless, the relatively close confines of captivity mean an increased pathogen load in the environment in which companion and aviary parrots live. This increased pathogen load leads to greater exposure of these birds to bacteria and parasites, and consequently a greater risk of infection and disease. This article discusses bacterial and parasitic infections in companion and aviary parrots. It includes the origins, pathogens, diagnosis, treatment, and some of the associated risk factors.
Edmiston, Charles E; Zhou, S Steve; Hoerner, Pierre; Krikorian, Raffi; Krepel, Candace J; Lewis, Brian D; Brown, Kellie R; Rossi, Peter J; Graham, Mary Beth; Seabrook, Gary R
2013-02-01
Percutaneous injuries associated with cutting instruments, needles, and other sharps (eg, metallic meshes, bone fragments, etc) occur commonly during surgical procedures, exposing members of surgical teams to the risk for contamination by blood-borne pathogens. This study evaluated the efficacy of an innovative integrated antimicrobial glove to reduce transmission of the human immunodeficiency virus (HIV) following a simulated surgical-glove puncture injury. A pneumatically activated puncturing apparatus was used in a surgical-glove perforation model to evaluate the passage of live HIV-1 virus transferred via a contaminated blood-laden needle, using a reference (standard double-layer glove) and an antimicrobial benzalkonium chloride (BKC) surgical glove. The study used 2 experimental designs. In method A, 10 replicates were used in 2 cycles to compare the mean viral load following passage through standard and antimicrobial gloves. In method B, 10 replicates were pooled into 3 aliquots and were used to assess viral passage though standard and antimicrobial test gloves. In both methods, viral viability was assessed by observing the cytopathic effects in human lymphocytic C8166 T-cell tissue culture. Concurrent viral and cell culture viability controls were run in parallel with the experiment's studies. All controls involving tissue culture and viral viability were performed according to study design. Mean HIV viral loads (log(10)TCID(50)) were significantly reduced (P < .01) following passage through the BKC surgical glove compared to passage through the nonantimicrobial glove. The reduction (log reduction and percent viral reduction) of the HIV virus ranged from 1.96 to 2.4 and from 98.9% to 99.6%, respectively, following simulated surgical-glove perforation. Sharps injuries in the operating room pose a significant occupational risk for surgical practitioners. The findings of this study suggest that an innovative antimicrobial glove was effective at significantly (P < .01) reducing the risk for blood-borne virus transfer in a model of simulated glove perforation. Copyright © 2013 Mosby, Inc. All rights reserved.
Fegan, Narelle; Jenson, Ian
2018-04-20
Meat has featured prominently as a source of foodborne disease and a public health concern. For about the past 20 years the risk management paradigm has dominated international thinking about food safety. Control through the supply chain is supported by risk management concepts, as the public health risk at the point of consumption becomes the accepted outcome based measure. Foodborne pathogens can be detected at several points in the supply chain and determining the source of where these pathogens arise and how they behave throughout meat production and processing are important parts of risk based approaches. Recent improvements in molecular and genetic based technologies and data analysis for investigating source attribution and pathogen behaviour have enabled greater insights into how foodborne outbreaks occur and where controls can be implemented. These new approaches will improve our understanding of the role of meat in foodborne disease and are expected to have a significant impact on our understanding in the coming years. Copyright © 2018 Elsevier Ltd. All rights reserved.
de la Fuente, José; Antunes, Sandra; Bonnet, Sarah; Cabezas-Cruz, Alejandro; Domingos, Ana G.; Estrada-Peña, Agustín; Johnson, Nicholas; Kocan, Katherine M.; Mansfield, Karen L.; Nijhof, Ard M.; Papa, Anna; Rudenko, Nataliia; Villar, Margarita; Alberdi, Pilar; Torina, Alessandra; Ayllón, Nieves; Vancova, Marie; Golovchenko, Maryna; Grubhoffer, Libor; Caracappa, Santo; Fooks, Anthony R.; Gortazar, Christian; Rego, Ryan O. M.
2017-01-01
Ticks and the pathogens they transmit constitute a growing burden for human and animal health worldwide. Vector competence is a component of vectorial capacity and depends on genetic determinants affecting the ability of a vector to transmit a pathogen. These determinants affect traits such as tick-host-pathogen and susceptibility to pathogen infection. Therefore, the elucidation of the mechanisms involved in tick-pathogen interactions that affect vector competence is essential for the identification of molecular drivers for tick-borne diseases. In this review, we provide a comprehensive overview of tick-pathogen molecular interactions for bacteria, viruses, and protozoa affecting human and animal health. Additionally, the impact of tick microbiome on these interactions was considered. Results show that different pathogens evolved similar strategies such as manipulation of the immune response to infect vectors and facilitate multiplication and transmission. Furthermore, some of these strategies may be used by pathogens to infect both tick and mammalian hosts. Identification of interactions that promote tick survival, spread, and pathogen transmission provides the opportunity to disrupt these interactions and lead to a reduction in tick burden and the prevalence of tick-borne diseases. Targeting some of the similar mechanisms used by the pathogens for infection and transmission by ticks may assist in development of preventative strategies against multiple tick-borne diseases. PMID:28439499
Revenko, Alexey S; Gao, Dacao; Crosby, Jeff R; Bhattacharjee, Gourab; Zhao, Chenguang; May, Chris; Gailani, David; Monia, Brett P; MacLeod, A Robert
2011-11-10
Recent studies indicate that the plasma contact system plays an important role in thrombosis, despite being dispensable for hemostasis. For example, mice deficient in coagulation factor XII (fXII) are protected from arterial thrombosis and cerebral ischemia-reperfusion injury. We demonstrate that selective reduction of prekallikrein (PKK), another member of the contact system, using antisense oligonucleotide (ASO) technology results in an antithrombotic phenotype in mice. The effects of PKK deficiency were compared with those of fXII deficiency produced by specific ASO-mediated reduction of fXII. Mice with reduced PKK had ∼ 3-fold higher plasma levels of fXII, and reduced levels of fXIIa-serpin complexes, consistent with fXII being a substrate for activated PKK in vivo. PKK or fXII deficiency reduced thrombus formation in both arterial and venous thrombosis models, without an apparent effect on hemostasis. The amount of reduction of PKK and fXII required to produce an antithrombotic effect differed between venous and arterial models, suggesting that these factors may regulate thrombus formation by distinct mechanisms. Our results support the concept that fXII and PKK play important and perhaps nonredundant roles in pathogenic thrombus propagation, and highlight a novel, specific and safe pharmaceutical approach to target these contact system proteases.
Revenko, Alexey S.; Gao, Dacao; Crosby, Jeff R.; Bhattacharjee, Gourab; Zhao, Chenguang; May, Chris; Gailani, David; Monia, Brett P.
2011-01-01
Recent studies indicate that the plasma contact system plays an important role in thrombosis, despite being dispensable for hemostasis. For example, mice deficient in coagulation factor XII (fXII) are protected from arterial thrombosis and cerebral ischemia-reperfusion injury. We demonstrate that selective reduction of prekallikrein (PKK), another member of the contact system, using antisense oligonucleotide (ASO) technology results in an antithrombotic phenotype in mice. The effects of PKK deficiency were compared with those of fXII deficiency produced by specific ASO-mediated reduction of fXII. Mice with reduced PKK had ∼ 3-fold higher plasma levels of fXII, and reduced levels of fXIIa-serpin complexes, consistent with fXII being a substrate for activated PKK in vivo. PKK or fXII deficiency reduced thrombus formation in both arterial and venous thrombosis models, without an apparent effect on hemostasis. The amount of reduction of PKK and fXII required to produce an antithrombotic effect differed between venous and arterial models, suggesting that these factors may regulate thrombus formation by distinct mechanisms. Our results support the concept that fXII and PKK play important and perhaps nonredundant roles in pathogenic thrombus propagation, and highlight a novel, specific and safe pharmaceutical approach to target these contact system proteases. PMID:21821705
Anoop, Valar; Rotaru, Sever; Shwed, Philip S; Tayabali, Azam F; Arvanitakis, George
2015-09-01
Most industrial Saccharomyces cerevisiae strains used in food or biotechnology processes are benign. However, reports of S. cerevisiae infections have emerged and novel strains continue to be developed. In order to develop recommendations for the human health risk assessment of S. cerevisiae strains, we conducted a literature review of current methods used to characterize their pathogenic potential and evaluated their relevance towards risk assessment. These studies revealed that expression of virulence traits in S. cerevisiae is complex and depends on many factors. Given the opportunistic nature of this organism, an approach using multiple lines of evidence is likely necessary for the reasonable prediction of the pathogenic potential of a particular strain. Risk assessment of S. cerevisiae strains would benefit from more research towards the comparison of virulent and non-virulent strains in order to better understand those genotypic and phenotypic traits most likely to be associated with pathogenicity. © Her Majesty the Queen in Right of Canada 2015. Reproduced with the permission of the Minister of Health.
Innovative Approaches to Improve Anti-Infective Vaccine Efficacy.
Yeaman, Michael R; Hennessey, John P
2017-01-06
Safe and efficacious vaccines are arguably the most successful medical interventions of all time. Yet the ongoing discovery of new pathogens, along with emergence of antibiotic-resistant pathogens and a burgeoning population at risk of such infections, imposes unprecedented public health challenges. To meet these challenges, innovative strategies to discover and develop new or improved anti-infective vaccines are necessary. These approaches must intersect the most meaningful insights into protective immunity and advanced technologies with capabilities to deliver immunogens for optimal immune protection. This goal is considered through several recent advances in host-pathogen relationships, conceptual strides in vaccinology, and emerging technologies. Given a clear and growing risk of pandemic disease should the threat of infection go unmet, developing vaccines that optimize protective immunity against high-priority and antibiotic-resistant pathogens represents an urgent and unifying imperative.
Prey choice and habitat use drive sea otter pathogen exposure in a resource-limited coastal system
Johnson, Christine K.; Tinker, M. Tim; Estes, James A.; Conrad, Patricia A.; Staedler, Michelle M.; Miller, Melissa A.; Jessup, David A.; Mazet, Jonna A.K.
2014-01-01
The processes promoting disease in wild animal populations are highly complex, yet identifying these processes is critically important for conservation when disease is limiting a population. By combining field studies with epidemiologic tools, we evaluated the relationship between key factors impeding southern sea otter (Enhydra lutris nereis) population growth: disease and resource limitation. This threatened population has struggled to recover despite protection, so we followed radio-tagged sea otters and evaluated infection with 2 disease-causing protozoal pathogens, Toxoplasma gondii and Sarcocystis neurona, to reveal risks that increased the likelihood of pathogen exposure. We identified patterns of pathogen infection that are linked to individual animal behavior, prey choice, and habitat use. We detected a high-risk spatial cluster of S. neurona infections in otters with home ranges in southern Monterey Bay and a coastal segment near San Simeon and Cambria where otters had high levels of infection with T. gondii. We found that otters feeding on abalone, which is the preferred prey in a resource-abundant marine ecosystem, had a very low risk of infection with either pathogen, whereas otters consuming small marine snails were more likely to be infected with T. gondii. Individual dietary specialization in sea otters is an adaptive mechanism for coping with limited food resources along central coastal California. High levels of infection with protozoal pathogens may be an adverse consequence of dietary specialization in this threatened species, with both depleted resources and disease working synergistically to limit recovery.
Prey choice and habitat use drive sea otter pathogen exposure in a resource-limited coastal system
Johnson, Christine K.; Tinker, Martin T.; Estes, James A.; Conrad, Patricia A.; Staedler, Michelle; Miller, Melissa A.; Jessup, David A.; Mazet, Jonna A. K.
2009-01-01
The processes promoting disease in wild animal populations are highly complex, yet identifying these processes is critically important for conservation when disease is limiting a population. By combining field studies with epidemiologic tools, we evaluated the relationship between key factors impeding southern sea otter (Enhydra lutris nereis) population growth: disease and resource limitation. This threatened population has struggled to recover despite protection, so we followed radio-tagged sea otters and evaluated infection with 2 disease-causing protozoal pathogens, Toxoplasma gondii and Sarcocystis neurona, to reveal risks that increased the likelihood of pathogen exposure. We identified patterns of pathogen infection that are linked to individual animal behavior, prey choice, and habitat use. We detected a high-risk spatial cluster of S. neurona infections in otters with home ranges in southern Monterey Bay and a coastal segment near San Simeon and Cambria where otters had high levels of infection with T. gondii. We found that otters feeding on abalone, which is the preferred prey in a resource-abundant marine ecosystem, had a very low risk of infection with either pathogen, whereas otters consuming small marine snails were more likely to be infected with T. gondii. Individual dietary specialization in sea otters is an adaptive mechanism for coping with limited food resources along central coastal California. High levels of infection with protozoal pathogens may be an adverse consequence of dietary specialization in this threatened species, with both depleted resources and disease working synergistically to limit recovery. PMID:19164513
Fate of pathogens present in livestock wastes spread onto fescue plots.
Hutchison, Mike L; Walters, Lisa D; Moore, Tony; Thomas, D John I; Avery, Sheryl M
2005-02-01
Fecal wastes from a variety of farmed livestock were inoculated with livestock isolates of Escherichia coli O157, Listeria monocytogenes, Salmonella, Campylobacter jejuni, and Cryptosporidium parvum oocysts at levels representative of the levels found in naturally contaminated wastes. The wastes were subsequently spread onto a grass pasture, and the decline of each of the zoonotic agents was monitored over time. There were no significant differences among the decimal reduction times for the bacterial pathogens. The mean bacterial decimal reduction time was 1.94 days. A range of times between 8 and 31 days for a 1-log reduction in C. parvum levels was obtained, demonstrating that the protozoans were significantly more hardy than the bacteria. Oocyst recovery was more efficient from wastes with lower dry matter contents. The levels of most of the zoonotic agents had declined to below detectable levels by 64 days. However, for some waste types, 128 days was required for the complete decline of L. monocytogenes levels. We were unable to find significant differences between the rates of pathogen decline in liquid (slurry) and solid (farmyard manure) wastes, although concerns have been raised that increased slurry generation as a consequence of more intensive farming practices could lead to increased survival of zoonotic agents in the environment.
Food safety in raw milk production: risk factors associated to bacterial DNA contamination.
Cerva, Cristine; Bremm, Carolina; Reis, Emily Marques dos; Bezerra, André Vinícius Andrade; Loiko, Márcia Regina; Cruz, Cláudio Estêvão Farias da; Cenci, Alexander; Mayer, Fabiana Quoos
2014-06-01
While human illness from milkborne pathogens may be linked to contamination of the product after pasteurization or improper pasteurization, such diseases are usually associated with consumption of raw milk or its by-products. Molecular biology tools were applied to investigate contamination by Listeria monocytogenes, Salmonella spp., some pathogenic strains of Escherichia coli, and Campylobacter jejuni in 548 raw milk samples from 125 dairy farms established in two regions from southern Brazil. Moreover, 15 variables were evaluated for their association with raw milk contamination levels, and the risk factors were determined by multiple regression analysis. Salmonella spp. were more frequently detected, followed by pathogenic E. coli. There was difference in contamination index between the regions, in which risk factors such as temporary cattle confinement, low milk production, low milking machine cleaning frequency, and milk storage area without tile walls were identified. The risk factors were specific to each region studied. Nevertheless, the data can be used to improve milk quality of dairy farms/herds with similar management practices.
Barrett, Damien; Parr, Mervyn; Fagan, John; Johnson, Alan; Tratalos, Jamie; Lively, Francis; Diskin, Michael; Kenny, David
2018-01-06
There are limited data available, in Ireland or elsewhere, to determine the extent of exposure to various endemic diseases among beef cows and factors associated with exposure to causative pathogens. The objectives of this study were to determine the herd and within herd prevalence of Bovine Viral Diarrhoea Virus (BVDV), Bovine Herpes Virus 1 (BHV-1), Leptospirosis and Neosporosis in a large scale study of commercial beef herds on the island of Ireland, and to examine herd level factors associated with exposure to these pathogens in these herds. The average number of cows tested per herd was 35.5 (median 30). Herd level seroprevalence to Bovine Herpesvirus-1(BHV-1), Bovine Viral-Diarrhoea Virus (BVDV), Leptospirosis and Neosporosis was 90%, 100%, 91% and 67%, respectively, while the mean within herd prevalence for the these pathogens was 40%, 77.7%, 65.7% and 5.7%, respectively. The study confirms that the level of seroconversion for the four pathogens of interest increases with herd size. There was also evidence that exposure to one pathogen may increase the risk of exposure to another pathogen. Herd level seroprevalences were in excess of 90% for BVDV, BHV-1 and Leptosporosis. Larger herds were subject to increased exposure to disease pathogens. This study suggests that exposure to several pathogens may be associated with the further exposure to other pathogens.
A case-control study of pathogen and lifestyle risk factors for diarrhoea in dogs.
Stavisky, Jenny; Radford, Alan David; Gaskell, Rosalind; Dawson, Susan; German, Alex; Parsons, Bryony; Clegg, Simon; Newman, Jenny; Pinchbeck, Gina
2011-05-01
Diarrhoea is a common and multi-factorial condition in dogs, the aetiology of which is often incompletely understood. A case-control study was carried out to compare the carriage of some common canine enteric pathogens (enteric coronavirus, parvovirus, distemper, endoparasites, Campylobacter and Salmonella spp.), as well as lifestyle factors such as vaccination history, diet and contact with other species, in dogs presenting at first opinion veterinary practices with and without diarrhoea. Multivariable conditional logistic regression showed that dogs in the study which scavenged or had had a recent change of diet (OR 3.5, p=0.002), had recently stayed in kennels (OR 9.5, p=0.01), or were fed a home-cooked diet (OR 4, p=0.002) were at a significantly greater risk of diarrhoea, whilst being female (OR 0.4, p=0.01), currently up to date with routine vaccinations (OR 0.4, p=0.05) and having contact with horse faeces (OR 0.4, p=0.06) were associated with a reduced risk. None of the pathogens tested for was a significant factor in the final multivariable model suggesting that in this predominantly vaccinated population, diarrhoea may be more associated with lifestyle risk factors than specific pathogens. Copyright © 2011 Elsevier B.V. All rights reserved.
Interacting effects of land use and climate on rodent-borne pathogens in central Kenya.
Young, Hillary S; McCauley, Douglas J; Dirzo, Rodolfo; Nunn, Charles L; Campana, Michael G; Agwanda, Bernard; Otarola-Castillo, Erik R; Castillo, Eric R; Pringle, Robert M; Veblen, Kari E; Salkeld, Daniel J; Stewardson, Kristin; Fleischer, Robert; Lambin, Eric F; Palmer, Todd M; Helgen, Kristofer M
2017-06-05
Understanding the effects of anthropogenic disturbance on zoonotic disease risk is both a critical conservation objective and a public health priority. Here, we evaluate the effects of multiple forms of anthropogenic disturbance across a precipitation gradient on the abundance of pathogen-infected small mammal hosts in a multi-host, multi-pathogen system in central Kenya. Our results suggest that conversion to cropland and wildlife loss alone drive systematic increases in rodent-borne pathogen prevalence, but that pastoral conversion has no such systematic effects. The effects are most likely explained both by changes in total small mammal abundance, and by changes in relative abundance of a few high-competence species, although changes in vector assemblages may also be involved. Several pathogens responded to interactions between disturbance type and climatic conditions, suggesting the potential for synergistic effects of anthropogenic disturbance and climate change on the distribution of disease risk. Overall, these results indicate that conservation can be an effective tool for reducing abundance of rodent-borne pathogens in some contexts (e.g. wildlife loss alone); however, given the strong variation in effects across disturbance types, pathogen taxa and environmental conditions, the use of conservation as public health interventions will need to be carefully tailored to specific pathogens and human contexts.This article is part of the themed issue 'Conservation, biodiversity and infectious disease: scientific evidence and policy implications'. © 2017 The Authors.
NASA Astrophysics Data System (ADS)
Bernstein, N.
2009-04-01
The use of wastewater for agricultural irrigation is steadily increasing world-wide and due to shortages of fresh water is common today in most arid regions of the world. The use of treated wastewater for agricultural irrigation may result in soil exposure to pathogens, creating potential public health problems. A variety of human pathogens are present in raw sewage water. Although their concentrations decrease during the wastewater reclamation process, the secondary treated effluents most commonly used for irrigation today still contain bacterial human pathogens. A range of bacterial pathogens, introduced through contaminated irrigation water or manure, are capable of surviving for long periods in soil and water where they have the potential to contaminate crops in the field. Therefore, there is a risk of direct contamination of crops by human pathogens from the treated effluents used for irrigation, as well as a risk of indirect contamination of the crops from contaminated soil at the agricultural site. Contradictory to previous notion, recent studies have demonstrated that human pathogens can enter plants through their roots and translocate and survive in edible, aerial plant tissues. The practical implications of these new findings for food safety are still not clear, but no doubt reflect the pathogenic microorganisms' ability to survive and multiply in the irrigated soil, water, and the harvested edible crop.
Interacting effects of land use and climate on rodent-borne pathogens in central Kenya
McCauley, Douglas J.; Dirzo, Rodolfo; Campana, Michael G.; Agwanda, Bernard; Otarola-Castillo, Erik R.; Castillo, Eric R.; Pringle, Robert M.; Veblen, Kari E.; Salkeld, Daniel J.; Stewardson, Kristin; Fleischer, Robert; Lambin, Eric F.; Palmer, Todd M.; Helgen, Kristofer M.
2017-01-01
Understanding the effects of anthropogenic disturbance on zoonotic disease risk is both a critical conservation objective and a public health priority. Here, we evaluate the effects of multiple forms of anthropogenic disturbance across a precipitation gradient on the abundance of pathogen-infected small mammal hosts in a multi-host, multi-pathogen system in central Kenya. Our results suggest that conversion to cropland and wildlife loss alone drive systematic increases in rodent-borne pathogen prevalence, but that pastoral conversion has no such systematic effects. The effects are most likely explained both by changes in total small mammal abundance, and by changes in relative abundance of a few high-competence species, although changes in vector assemblages may also be involved. Several pathogens responded to interactions between disturbance type and climatic conditions, suggesting the potential for synergistic effects of anthropogenic disturbance and climate change on the distribution of disease risk. Overall, these results indicate that conservation can be an effective tool for reducing abundance of rodent-borne pathogens in some contexts (e.g. wildlife loss alone); however, given the strong variation in effects across disturbance types, pathogen taxa and environmental conditions, the use of conservation as public health interventions will need to be carefully tailored to specific pathogens and human contexts. This article is part of the themed issue ‘Conservation, biodiversity and infectious disease: scientific evidence and policy implications’. PMID:28438909
de Sousa Guedes, Jossana Pereira; da Costa Medeiros, José Alberto; de Souza E Silva, Richard Sidney; de Sousa, Janaína Maria Batista; da Conceição, Maria Lúcia; de Souza, Evandro Leite
2016-12-05
This study evaluated the ability of the essential oil from Mentha arvensis L. (MAEO) and M. piperita L. (MPEO) to induce ≥5-log reductions in counts (CFU/mL) of E. coli, L. monocytogenes, and Salmonella enterica serovar Enteritidis in Brain-Heart Infusion broth (BHIB) and cashew, guava, mango, and pineapple juices during refrigerated storage (4±0.5°C). The effects of the incorporation of these essential oils on some physicochemical and sensory parameters of juices were also evaluated. The incorporation of 5, 2.5, 1.25, or 0.625μL/mL of MAEO in BHIB caused a ≥5-log reduction in counts of E. coli and Salmonella Enteritidis after 24h of storage; but only 5μL/mL was able to cause the same reduction in counts of L.monocytogenes. The incorporation of 10μL/mL of MPEO in BHIB caused a ≥5-log reduction in counts of E. coli, Salmonella Enteritidis, and L. monocytogenes after 24h of storage; smaller reductions were observed in BHIB containing 5, 2.5, and 1.25μL/mL of MPEO. Similar reductions were observed when the MAEO or MPEO was incorporated at the same concentrations in mango juice. The incorporation of MAEO or MPEO at all tested concentrations in cashew, guava, and pineapple juices resulted in a ≥5-log reduction in pathogen counts within 1h. The incorporation of MAEO and MPEO (0.625 and 1.25μL/mL, respectively) in fruit juices did not induce alterations in °Brix, pH, and acidity, but negatively affected the taste, aftertaste, and overall acceptance. The use of MAEO or MPEO at low concentrations could constitute an interesting tool to achieve the required 5-log reduction of pathogenic bacteria in cashew, guava, mango, and pineapple fruit juices. However, new methods combining the use of MAEO or MPEO with other technologies are necessary to reduce their negative impacts on specific sensory properties of these juices. Copyright © 2016 Elsevier B.V. All rights reserved.
Money for microbes-Pathogen avoidance and out-group helping behaviour.
Laakasuo, Michael; Köbis, Nils; Palomäki, Jussi; Jokela, Markus
2017-02-23
Humans have evolved various adaptations against pathogens, including the physiological immune system. However, not all of these adaptations are physiological: the cognitive mechanisms whereby we avoid potential sources of pathogens-for example, disgust elicited by uncleanliness-can be considered as parts of a behavioural immune system (BIS). The mechanisms of BIS extend also to inter-group relations: Pathogen cues have been shown to increase xenophobia/ethnocentrism, as people prefer to keep their societal in-group norms unaltered and "clean." Nonetheless, little is known how pathogen cues influence people's willingness to provide humanitarian aid to out-group members. We examined how pathogen cues affected decisions of providing humanitarian aid in either instrumental (sending money) or non-instrumental form (sending personnel to help, or accepting refugees), and whether these effects were moderated by individual differences in BIS sensitivity. Data were collected in two online studies (Ns: 188 and 210). When the hypothetical humanitarian crisis involved a clear risk of infection, participants with high BIS sensitivity preferred to send money rather than personnel or to accept refugees. The results suggest that pathogen cues influence BIS-sensitive individuals' willingness to provide humanitarian aid when there is a risk of contamination to in-group members. © 2017 International Union of Psychological Science.
40 CFR 503.17 - Recordkeeping.
Code of Federal Regulations, 2011 CFR
2011-07-01
... in § 503.32(a) and the vector attraction reduction requirement in [insert one of the vector attraction reduction requirements in § 503.33(b)(1) through § 503.33(b)(8)] was prepared under my direction... pathogen requirements in § 503.32(a) are met. (iv) A description of how one of the vector attraction...
Geraldo, Ingrid M; Gilman, Allan; Shintre, Milind S; Modak, Shanta M
2008-08-01
To evaluate the antimicrobial efficacy of and risk of organisms developing resistance to 2 novel hand soaps: (1) a soap containing triclosan, polyhexamethylene biguanide, and benzethonium chloride added to a soap base (TPB soap); and (2) a soap containing farnesol, polyhexamethylene biguanide, and benzethonium chloride added to a soap base (FPB soap). Tests also included soaps containing only triclosan. The risk of emergence of resistant bacterial mutants was investigated by determining the susceptibility changes after repeated exposure of bacteria to the drugs and soaps in vitro. The effectiveness of the soaps was evaluated using an in vitro tube dilution method, a volunteer method (the ASTM standard), and 2 pig skin methods. The minimum inhibitory concentration and minimum bactericidal concentration of triclosan against Staphylococcus aureus increased 8- to 62.5-fold, whereas those of TPB and FPB (both alone and in soap) were unchanged. In vitro, TPB and FPB soaps produced higher log(10) reductions in colony-forming units of all tested organisms (4.95-8.58) than did soaps containing triclosan alone (0.29-4.86). In the test using the pig skin and volunteer methods, TPB soap produced a higher log(10) reduction in colony-forming units (3.1-3.3) than did the soap containing triclosan alone (2.6-2.8). The results indicate that TPB and FPB soaps may provide superior rapid and broad-spectrum efficacy with a lower risk of organisms developing resistance than do soaps containing triclosan alone. Pig skin methods may be used to predict the efficacy of antibacterial soaps in the rapid disinfection of contaminated hands. Hand washing with TPB and FPB soaps by healthcare workers and the general population may reduce the transmission of pathogens, with a lower risk of promoting the emergence of resistant organisms.
Reed, Robert N
2005-06-01
The growing international trade in live wildlife has the potential to result in continuing establishment of nonnative animal populations in the United States. Snakes may pose particularly high risks as potentially invasive species, as exemplified by the decimation of Guam's vertebrate fauna by the accidentally introduced brown tree snake. Herein, ecological and commercial predictors of the likelihood of establishment of invasive populations were used to model risk associated with legal commercial imports of 23 species of boas, pythons, and relatives into the United States during the period 1989-2000. Data on ecological variables were collected from multiple sources, while data on commercial variables were collated from import records maintained by the U.S. Fish and Wildlife Service. Results of the risk-assessment models indicate that species including boa constrictors (Boa constrictor), ball pythons (Python regius), and reticulated pythons (P. reticulatus) may pose particularly high risks as potentially invasive species. Recommendations for reducing risk of establishment of invasive populations of snakes and/or pathogens include temporary quarantine of imports to increase detection rates of nonnative pathogens, increasing research attention to reptile pathogens, reducing the risk that nonnative snakes will reach certain areas with high numbers of federally listed species (such as the Florida Keys), and attempting to better educate individuals purchasing reptiles.
Koseki, Shigenobu; Mizuno, Yasuko; Yamamoto, Kazutaka
2011-09-01
The route of pathogen contamination (from roots versus from leaves) of spinach leaves was investigated with a hydroponic cultivation system. Three major bacterial pathogens, Escherichia coli O157:H7, Salmonella, and Listeria monocytogenes, were inoculated into the hydroponic solution, in which the spinach was grown to give concentrations of 10⁶ and 10³ CFU/ml. In parallel, the pathogens were inoculated onto the growing leaf surface by pipetting, to give concentrations of 10⁶ and 10³ CFU per leaf. Although contamination was observed at a high rate through the root system by the higher inoculum (10⁶ CFU) for all the pathogens tested, the contamination was rare when the lower inoculum (10³ CFU) was applied. In contrast, contamination through the leaf occurred at a very low rate, even when the inoculum level was high. For all the pathogens tested in the present study, the probability of contamination was promoted through the roots and with higher inoculum levels. The probability of contamination was analyzed with logistic regression. The logistic regression model showed that the odds ratio of contamination from the roots versus from the leaves was 6.93, which suggested that the risk of contamination from the roots was 6.93 times higher than the risk of contamination from the leaves. In addition, the risk of contamination by L. monocytogenes was about 0.3 times that of Salmonella enterica subsp. enterica serovars Typhimurium and Enteritidis and E. coli O157:H7. The results of the present study indicate that the principal route of pathogen contamination of growing spinach leaves in a hydroponic system is from the plant's roots, rather than from leaf contamination itself.
Adalsteinsson, Solny A; Shriver, W Gregory; Hojgaard, Andrias; Bowman, Jacob L; Brisson, Dustin; D'Amico, Vincent; Buler, Jeffrey J
2018-01-23
Forests in urban landscapes differ from their rural counterparts in ways that may alter vector-borne disease dynamics. In urban forest fragments, tick-borne pathogen prevalence is not well characterized; mitigating disease risk in densely-populated urban landscapes requires understanding ecological factors that affect pathogen prevalence. We trapped blacklegged tick (Ixodes scapularis) nymphs in urban forest fragments on the East Coast of the United States and used multiplex real-time PCR assays to quantify the prevalence of four zoonotic, tick-borne pathogens. We used Bayesian logistic regression and WAIC model selection to understand how vegetation, habitat, and landscape features of urban forests relate to the prevalence of B. burgdorferi (the causative agent of Lyme disease) among blacklegged ticks. In the 258 nymphs tested, we detected Borrelia burgdorferi (11.2% of ticks), Borrelia miyamotoi (0.8%) and Anaplasma phagocytophilum (1.9%), but we did not find Babesia microti (0%). Ticks collected from forests invaded by non-native multiflora rose (Rosa multiflora) had greater B. burgdorferi infection rates (mean = 15.9%) than ticks collected from uninvaded forests (mean = 7.9%). Overall, B. burgdorferi prevalence among ticks was positively related to habitat features (e.g. coarse woody debris and total understory cover) favorable for competent reservoir host species. Understory structure provided by non-native, invasive shrubs appears to aggregate ticks and reservoir hosts, increasing opportunities for pathogen transmission. However, when we consider pathogen prevalence among nymphs in context with relative abundance of questing nymphs, invasive plants do not necessarily increase disease risk. Although pathogen prevalence is greater among ticks in invaded forests, the probability of encountering an infected tick remains greater in uninvaded forests characterized by thick litter layers, sparse understories, and relatively greater questing tick abundance in urban landscapes.
Hussain, Arif; Shaik, Sabiha; Ranjan, Amit; Nandanwar, Nishant; Tiwari, Sumeet K.; Majid, Mohammad; Baddam, Ramani; Qureshi, Insaf A.; Semmler, Torsten; Wieler, Lothar H.; Islam, Mohammad A.; Chakravortty, Dipshikha; Ahmed, Niyaz
2017-01-01
Multidrug-resistant Escherichia coli infections are a growing public health concern. This study analyzed the possibility of contamination of commercial poultry meat (broiler and free-range) with pathogenic and or multi-resistant E. coli in retail chain poultry meat markets in India. We analyzed 168 E. coli isolates from broiler and free-range retail poultry (meat/ceca) sampled over a wide geographical area, for their antimicrobial sensitivity, phylogenetic groupings, virulence determinants, extended-spectrum-β-lactamase (ESBL) genotypes, fingerprinting by Enterobacterial Repetitive Intergenic Consensus (ERIC) PCR and genetic relatedness to human pathogenic E. coli using whole genome sequencing (WGS). The prevalence rates of ESBL producing E. coli among broiler chicken were: meat 46%; ceca 40%. Whereas, those for free range chicken were: meat 15%; ceca 30%. E. coli from broiler and free-range chicken exhibited varied prevalence rates for multi-drug resistance (meat 68%; ceca 64% and meat 8%; ceca 26%, respectively) and extraintestinal pathogenic E. coli (ExPEC) contamination (5 and 0%, respectively). WGS analysis confirmed two globally emergent human pathogenic lineages of E. coli, namely the ST131 (H30-Rx subclone) and ST117 among our poultry E. coli isolates. These results suggest that commercial poultry meat is not only an indirect public health risk by being a possible carrier of non-pathogenic multi-drug resistant (MDR)-E. coli, but could as well be the carrier of human E. coli pathotypes. Further, the free-range chicken appears to carry low risk of contamination with antimicrobial resistant and extraintestinal pathogenic E. coli (ExPEC). Overall, these observations reinforce the understanding that poultry meat in the retail chain could possibly be contaminated by MDR and/or pathogenic E. coli. PMID:29180984
Hussain, Arif; Shaik, Sabiha; Ranjan, Amit; Nandanwar, Nishant; Tiwari, Sumeet K; Majid, Mohammad; Baddam, Ramani; Qureshi, Insaf A; Semmler, Torsten; Wieler, Lothar H; Islam, Mohammad A; Chakravortty, Dipshikha; Ahmed, Niyaz
2017-01-01
Multidrug-resistant Escherichia coli infections are a growing public health concern. This study analyzed the possibility of contamination of commercial poultry meat (broiler and free-range) with pathogenic and or multi-resistant E. coli in retail chain poultry meat markets in India. We analyzed 168 E. coli isolates from broiler and free-range retail poultry (meat/ceca) sampled over a wide geographical area, for their antimicrobial sensitivity, phylogenetic groupings, virulence determinants, extended-spectrum-β-lactamase (ESBL) genotypes, fingerprinting by Enterobacterial Repetitive Intergenic Consensus (ERIC) PCR and genetic relatedness to human pathogenic E. coli using whole genome sequencing (WGS). The prevalence rates of ESBL producing E. coli among broiler chicken were: meat 46%; ceca 40%. Whereas, those for free range chicken were: meat 15%; ceca 30%. E. coli from broiler and free-range chicken exhibited varied prevalence rates for multi-drug resistance (meat 68%; ceca 64% and meat 8%; ceca 26%, respectively) and extraintestinal pathogenic E. coli (ExPEC) contamination (5 and 0%, respectively). WGS analysis confirmed two globally emergent human pathogenic lineages of E. coli , namely the ST131 ( H 30-Rx subclone) and ST117 among our poultry E. coli isolates. These results suggest that commercial poultry meat is not only an indirect public health risk by being a possible carrier of non-pathogenic multi-drug resistant (MDR)- E. coli , but could as well be the carrier of human E. coli pathotypes. Further, the free-range chicken appears to carry low risk of contamination with antimicrobial resistant and extraintestinal pathogenic E. coli (ExPEC). Overall, these observations reinforce the understanding that poultry meat in the retail chain could possibly be contaminated by MDR and/or pathogenic E. coli.
Two pathogenic species of Pythium: P. aphanidermatum and P. diclinum from a wheat field.
Al-Sheikh, Hashem
2010-10-01
During a survey of pathogenic and non-pathogenic Pythium spp. in different localities in Egypt, several isolates of Pythia were obtained and maintained on corn meal agar. Among these isolates, Pythium aphanidermatum and Pythium diclinum were obtained from rhizosphere of wheat plants grown in Dear Attia village, Minia, Egypt. Identification was made using morphological and molecular analyses. P. aphanidermatum and P. diclinum were able to cause reductions in emergence and adulating in wheat in laboratory scale. P. aphanidermatum appeared to be the most aggressive parasite under agar and pot experimental conditions.
Two pathogenic species of Pythium: P. aphanidermatum and P. diclinum from a wheat field
Al-Sheikh, Hashem
2010-01-01
During a survey of pathogenic and non-pathogenic Pythium spp. in different localities in Egypt, several isolates of Pythia were obtained and maintained on corn meal agar. Among these isolates, Pythium aphanidermatum and Pythium diclinum were obtained from rhizosphere of wheat plants grown in Dear Attia village, Minia, Egypt. Identification was made using morphological and molecular analyses. P. aphanidermatum and P. diclinum were able to cause reductions in emergence and adulating in wheat in laboratory scale. P. aphanidermatum appeared to be the most aggressive parasite under agar and pot experimental conditions. PMID:23961096
Microbiological Quantitative Risk Assessment
NASA Astrophysics Data System (ADS)
Dominguez, Silvia; Schaffner, Donald W.
The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.
Robert C. Venette
2013-01-01
Climate change may alter the distribution and activity of native and alien pathogens that infect trees and, in severe cases, cause tree death. In this study, potential future changes in climate suitability are investigated for three forest pathogens that occur in western North America: the native Arceuthobium tsugense subsp tsugense...
S. E. Meyer; M. Masi; S. Clement; T. L. Davis; J. Beckstead
2015-01-01
Pyrenophora semeniperda, an important pathogen in Bromus tectorum seed banks in semi-arid western North America, exhibits >4-fold variation in mycelial growth rate. Host seeds exhibit seasonal changes in dormancy that affect the risk of pathogen-caused mortality. The hypothesis tested is that contrasting seed dormancy phenotypes select for contrasting strategies...
Woolhouse, Mark
2017-07-01
Transmissibility is the defining characteristic of infectious diseases. Quantifying transmission matters for understanding infectious disease epidemiology and designing evidence-based disease control programs. Tracing individual transmission events can be achieved by epidemiological investigation coupled with pathogen typing or genome sequencing. Individual infectiousness can be estimated by measuring pathogen loads, but few studies have directly estimated the ability of infected hosts to transmit to uninfected hosts. Individuals' opportunities to transmit infection are dependent on behavioral and other risk factors relevant given the transmission route of the pathogen concerned. Transmission at the population level can be quantified through knowledge of risk factors in the population or phylogeographic analysis of pathogen sequence data. Mathematical model-based approaches require estimation of the per capita transmission rate and basic reproduction number, obtained by fitting models to case data and/or analysis of pathogen sequence data. Heterogeneities in infectiousness, contact behavior, and susceptibility can have substantial effects on the epidemiology of an infectious disease, so estimates of only mean values may be insufficient. For some pathogens, super-shedders (infected individuals who are highly infectious) and super-spreaders (individuals with more opportunities to transmit infection) may be important. Future work on quantifying transmission should involve integrated analyses of multiple data sources.
Fryk, Jesse J; Marks, Denese C; Hobson-Peters, Jody; Watterson, Daniel; Hall, Roy A; Young, Paul R; Reichenberg, Stefan; Tolksdorf, Frank; Sumian, Chryslain; Gravemann, Ute; Seltsam, Axel; Faddy, Helen M
2017-11-01
Zika virus (ZIKV) has emerged as a potential threat to transfusion safety worldwide. Pathogen inactivation is one approach to manage this risk. In this study, the efficacy of the THERAFLEX UV-Platelets system and THERAFLEX MB-Plasma system to inactivate ZIKV in platelet concentrates (PCs) and plasma was investigated. PCs spiked with ZIKV were treated with the THERAFLEX UV-Platelets system at 0.05, 0.10, 0.15, and 0.20 J/cm 2 UVC. Plasma spiked with ZIKV was treated with the THERAFLEX MB-Plasma system at 20, 40, 60, and 120 J/cm 2 light at 630 nm with at least 0.8 µmol/L methylene blue (MB). Samples were taken before the first and after each illumination dose and tested for residual virus. For each system the level of viral reduction was determined. Treatment of PCs with THERAFLEX UV-Platelets system resulted in a mean of 5 log reduction in ZIKV infectivity at the standard UVC dose (0.20 J/cm 2 ), with dose dependency observed with increasing UVC dose. For plasma treated with MB and visible light, ZIKV infectivity was reduced by a mean of at least 5.68 log, with residual viral infectivity reaching the detection limit of the assay at 40 J/cm 2 (one-third the standard dose). Our study demonstrates that the THERAFLEX UV-Platelets system and THERAFLEX MB-Plasma system can reduce ZIKV infectivity in PCs and pooled plasma to the detection limit of the assays used. These findings suggest both systems have the capacity to be an effective option to manage potential ZIKV transfusion transmission risk. © 2017 AABB.
Biocontrol of foliar pathogens: mechanisms and application.
Elad, Y
2003-01-01
Biocontrol offers attractive alternatives or supplements to the use of conventional methods for plant disease management. Vast experience has been gained in the biocontrol of plant diseases. Prevention of infection by biocontrol agents or suppression of disease is based on various modes of action. Pathogens are typically affected by certain modes of actions and not by others according to their nature (i.e. biotrophs vs. necrotrophs). Resistance in the host plant may be induced locally or systemically by either live or dead cells of the biocontrol agent and may affect pathogens of various groups. As some pathogens are negatively affected by lake of nutrients in the infection court, competition for nutrients and space was long recognized as antagonism trait. Antibiosis and hyperparasitism affect pathogens of various groups. Other valid mechanisms are reduction of the saprophytic ability and reducing spore dissemination. Recently it was revealed that restraining of pathogenicity factors of the pathogens, i.e. host hydrolyzing proteins or reactive oxygen species takes place when biocontrol is used. It is likely that several modes of action concomitantly participate in pathogens suppression but the relative importance of each one of them is not clear. Examples of effective prevention of infection in the phyllosphere that rely on multiple modes of action will be demonstrated with Trichodermo harzianum T39 (TRICHODEX), Bacillus mycoides and Pichia guilermondii, a filamentous fungus, bacterium and yeast biocontrol agents, respectively. Several commercial products based on microorganisms have been developed and are starting to penetrate the market. However, large-scale use is still limited because of variability and inconsistency of biocontrol activity. In some cases this may be caused by sensitivity of the biocontrol agents to environmental influences. Ways to overcome biocontrol limitations and to improve its efficacy are i. integration of biocontrol with chemical fungicides on a calendar basis or according to ecological requirements of the biocontrol agents relying on the advise of a decision support system; ii. introduction of two or more biocontrol agents in a mixture, assuming that each one of them has different ecological requirements and/or different modes of action. Implementation of one (or more) of these approaches, using biocontrol preparations mentioned above lowered the variability and increased the consistency of disease suppression. The expected long-term result of the implementation of these suggested strategies is reduced risk of uncontrolled epidemics and increase of confidence of growers in using this non-chemical control measure on a large scale.
Udasin, I G; Gochfeld, M
1994-05-01
On December 6, 1991. The Occupational Health and Safety Administration (OSHA) issued its final regulation concerning occupational exposure to bloodborne pathogens (29 CFR 1910.1030). OSHA has determined that workers in a variety of settings face a significant health risk as the result of occupational exposure to blood and other body fluids. The pathogens that are of the most concern include human immunodeficiency type 1 (HIV) and hepatitis B virus (HBV). OSHA concludes that the hazard can be minimized via engineering and work practice controls, personal protective equipment, HBV vaccination, training and education, and appropriate use of signs and labels. Occupational health professionals, including physicians, nurses, industrial hygienists, and safety officers, are faced with the challenge of writing and periodically updating exposure control plans that are unique to their settings, as well as advising colleagues in other settings. They are charged with identifying the appropriate at-risk groups within their workplace, and providing them with the appropriate training to enable employees to understand the rationale for the safety procedures that prevent exposures to blood-borne pathogens. This review of HIV/HBV articles pertinent to the occupational setting analyzes six topics including: (1) occupational risk of transmission of HIV, (2) occupational risk of transmission of HBV, (3) special concerns of dental practices, (4) risk of HIV/HBV outside the hospital, medical, or dental office setting, (5) legal and ethical issues involved in HIV testing, and (6) the United States Public Health Service postexposure HIV/HBV prophylaxis/treatment recommendations.
Dekić, Svjetlana; Klobučar, Göran; Ivanković, Tomislav; Zanella, Davor; Vucić, Matej; Bourdineaud, Jean-Paul; Hrenović, Jasna
2018-05-08
Bacterium Acinetobacter baumannii is an emerging human pathogen whose presence in the aquatic environment raises the issue of public health risk. Fish colonization represents the potential route of pathogen transmission to humans. The aim was to examine the colonization of A. baumannii to freshwater fish Poecilia reticulata. An extensively drug-resistant A. baumannii was tested at three concentrations in natural spring water. Additionally, 70 fish from the Sava River (Croatia) were screened for the presence of A. baumannii, which was not found in gill swabs or analysed gut. The colonization potential of A. baumannii in freshwater fish is dependent upon its concentration in surrounding water. The low concentration of A. baumannii in natural waters represents low colonization potential of freshwater fish. The risk for public health exists in closed water bodies where there is constant inflow of water polluted by A. baumannii in concentrations above 3 log CFU mL -1 .
Hii, S F; Traub, R J; Thompson, M F; Henning, J; O'Leary, C A; Burleigh, A; McMahon, S; Rees, R L; Kopp, S R
2015-03-01
To estimate the proportion of canine tick-borne disease (CTBD) pathogens in dogs from northern states of Australia presenting with and without clinical signs/laboratory abnormalities suggestive of CTBD and to evaluate associated risk factors. Client-owned dogs presented to a general practice clinic in the Northern Territory (NT; n = 138) and five referral hospitals in south-east Queensland (SEQ; n = 100) were grouped into CTBD-suspect and -control groups based on clinical and laboratory criteria. Blood and sera were screened for haemotropic Mycoplasma spp., Babesia spp., Anaplasma spp., Ehrlichia spp. and Hepatozoon spp. using microscopic examination, in-clinic ELISA testing and PCR assays. Dog-specific risk factors associated with the presence of CTBD pathogens were evaluated. Overall, 24.4% of the suspect group and 12.2% of the control group dogs were infected. The proportions of M. haemocanis, B. vogeli, A. platys, Candidatus Mycoplasma haematoparvum, and C. Mycoplasma haemobos were 7.1%, 5.0%, 3.8%, 1.7% and 0.4%, respectively. Dogs originating from the NT were 3.6-fold (95% confidence interval (CI) 1.51-8.62; P = 0.004) more likely to be infected with CTBD pathogens than those from SEQ. Male dogs were 2.3-fold (95% CI 1.17-4.80, P = 0.024) more likely to be PCR-positive to CTBD pathogens than female dogs. Dogs presenting with clinical signs consistent with CTBD and thrombocytopenia were more likely to be infected by CTBD pathogens (odds ratio 2.85; 95% CI 1.16, 7.02; P = 0.019). Haemotropic mycoplasmas were the most common tick-borne pathogen infecting client-owned dogs. Subclinical cases were common in dogs from the NT. Veterinary practitioners should be aware of the proportion of CTBD pathogens and the presenting features of clinical and subclinical disease in their area. © 2015 Australian Veterinary Association.
A Spike Cocktail Approach to Improve Microbial Performance Monitoring for Water Reuse.
Zimmerman, Brian D; Korajkic, Asja; Brinkman, Nichole E; Grimm, Ann C; Ashbolt, Nicholas J; Garland, Jay L
Water reuse, via either centralized treatment of traditional wastewater or decentralized treatment and on-site reuse, is becoming an increasingly important element of sustainable water management. Despite advances in waterborne pathogen detection methods, low and highly variable pathogen levels limit their utility for routine evaluation of health risks in water reuse systems. Therefore, there is a need to improve our understanding of the linkage between pathogens and more readily measured process indicators during treatment. This paper describes an approach for constructing spiking experiments to relate the behavior of viral, bacterial, and protozoan pathogens with relevant process indicators. General issues are reviewed, and the spiking protocol is applied as a case study example to improve microbial performance monitoring and health risk evaluation in a water reuse system. This approach provides a foundation for the development of novel approaches to improve real or near-real time performance monitoring of water recycling systems.
Khromenkova, E P; Dimidova, L L; Dumbadze, O S; Aidinov, G T; Shendo, G L; Agirov, A Kh; Batchaev, Kh Kh
2015-01-01
Sanitary and parasitological studies of the waste effluents and surface reservoir waters were conducted in the south of Russia. The efficiency of purification of waste effluents from the pathogens of parasitic diseases was investigated in the region's sewage-purification facilities. The water of the surface water reservoirs was found to contain helminthic eggs and larvae and intestinal protozoan cysts because of the poor purification and disinfection of service fecal sewage waters. The poor purification and disinvasion of waste effluents in the region determine the potential risk of contamination of the surface water reservoirs and infection of the population with the pathogens of human parasitic diseases.
Pathogen Inactivated Plasma Concentrated: Preparation and Uses
2004-09-01
REPORT DATE 01 SEP 2004 2 . REPORT TYPE N/A 3. DATES COVERED - 4. TITLE AND SUBTITLE Pathogen Inactivated Plasma Concentrated: Preparation...Concentrated: Preparation and Uses 22 - 2 RTO-MP-HFM-109 Results: Both UVC and ozone yielded a PPV logarithmic reduction factor (LRF) of 6, for a...technology to be marketed; the industry name is Plas+SD [ 2 ]. This process functions by attacking the lipid sheathes that surround enveloped viruses
Sattar, Syed A; Zargar, Bahram; Wright, Kathryn E; Rubino, Joseph R; Ijaz, M Khalid
2017-05-15
Family cars represent ∼74% of the yearly global output of motorized vehicles. With a life expectancy of ∼8 decades in many countries, the average person spends >100 min daily inside the confined and often shared space of the car, with exposure to a mix of potentially harmful microbes. Can commercial in-car microbial air decontamination devices mitigate the risk? Three such devices (designated devices 1 to 3) with HEPA filters were tested in the modified passenger cabin (3.25 m 3 ) of a four-door sedan housed within a biosafety level 3 containment facility. Staphylococcus aureus (ATCC 6538) was suspended in a soil load to simulate the presence of body fluids and aerosolized into the car's cabin with a 6-jet Collison nebulizer. A muffin fan (80 mm by 80 mm, with an output of 0.17 m 3 /min) circulated the air inside. Plates (150 mm diameter) of Trypticase soy agar (TSA), placed inside a programmable slit-to-agar sampler, were held at 36 ± 1°C for 18 to 24 h and examined for CFU. The input dose of the test bacterium, its rate of biological decay, and the log 10 reductions by the test devices were analyzed. The arbitrarily set performance criterion was the time in hours a device took for a 3-log 10 reduction in the level of airborne challenge bacterium. On average, the level of S. aureus challenge in the air varied between 4.2 log 10 CFU/m 3 and 5.5 log 10 CFU/m 3 , and its rate of biological decay was -0.0213 ± 0.0021 log 10 CFU/m 3 /min. Devices 1 to 3 took 2.3, 1.5, and 9.7 h, respectively, to meet the performance criterion. While the experimental setup was tested using S. aureus as an archetypical airborne pathogen, it can be readily adapted to test other types of pathogens and technologies. IMPORTANCE This study was designed to test the survival of airborne pathogens in the confined and shared space of a family automobile as well as to assess claims of devices marketed for in-car air decontamination. The basic experimental setup and the test protocols reported are versatile enough for work with all major types of airborne human pathogens and for testing a wide variety of air decontamination technologies. This study could also lay the foundation for a standardized test protocol for use by device makers as well as regulators for the registration of such devices. Copyright © 2017 American Society for Microbiology.
Zargar, Bahram; Wright, Kathryn E.; Rubino, Joseph R.; Ijaz, M. Khalid
2017-01-01
ABSTRACT Family cars represent ∼74% of the yearly global output of motorized vehicles. With a life expectancy of ∼8 decades in many countries, the average person spends >100 min daily inside the confined and often shared space of the car, with exposure to a mix of potentially harmful microbes. Can commercial in-car microbial air decontamination devices mitigate the risk? Three such devices (designated devices 1 to 3) with HEPA filters were tested in the modified passenger cabin (3.25 m3) of a four-door sedan housed within a biosafety level 3 containment facility. Staphylococcus aureus (ATCC 6538) was suspended in a soil load to simulate the presence of body fluids and aerosolized into the car's cabin with a 6-jet Collison nebulizer. A muffin fan (80 mm by 80 mm, with an output of 0.17 m3/min) circulated the air inside. Plates (150 mm diameter) of Trypticase soy agar (TSA), placed inside a programmable slit-to-agar sampler, were held at 36 ± 1°C for 18 to 24 h and examined for CFU. The input dose of the test bacterium, its rate of biological decay, and the log10 reductions by the test devices were analyzed. The arbitrarily set performance criterion was the time in hours a device took for a 3-log10 reduction in the level of airborne challenge bacterium. On average, the level of S. aureus challenge in the air varied between 4.2 log10 CFU/m3 and 5.5 log10 CFU/m3, and its rate of biological decay was −0.0213 ± 0.0021 log10 CFU/m3/min. Devices 1 to 3 took 2.3, 1.5, and 9.7 h, respectively, to meet the performance criterion. While the experimental setup was tested using S. aureus as an archetypical airborne pathogen, it can be readily adapted to test other types of pathogens and technologies. IMPORTANCE This study was designed to test the survival of airborne pathogens in the confined and shared space of a family automobile as well as to assess claims of devices marketed for in-car air decontamination. The basic experimental setup and the test protocols reported are versatile enough for work with all major types of airborne human pathogens and for testing a wide variety of air decontamination technologies. This study could also lay the foundation for a standardized test protocol for use by device makers as well as regulators for the registration of such devices. PMID:28389537
Use of cationic polymers to reduce pathogen levels during dairy manure separation.
Liu, Zong; Carroll, Zachary S; Long, Sharon C; Gunasekaran, Sundaram; Runge, Troy
2016-01-15
Various separation technologies are used to deal with the enormous amounts of animal waste that large livestock operations generate. When the recycled waste stream is land applied, it is essential to lower the pathogen load to safeguard the health of livestock and humans. We investigated whether cationic polymers, used as a flocculent in the solid/liquid separation process, could reduce the pathogen indicator load in the animal waste stream. The effects of low charge density cationic polyacrylamide (CPAM) and high charge density cationic polydicyandiamide (PDCD) were investigated. Results demonstrated that CPAM was more effective than PDCD for manure coagulation and flocculation, while PDCD was more effective than CPAM in reducing the pathogen indicator loads. However, their combined use, CPAM followed by PDCD, resulted in both improved solids separation and pathogen indicator reduction. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R
2011-01-01
Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, also the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
Hwang, Cheng-An; Porto-Fett, Anna C S; Juneja, Vijay K; Ingham, Steven C; Ingham, Barbara H; Luchansky, John B
2009-02-28
This study quantified and modeled the survival of Escherichia coli O157:H7, Listeria monocytogenes and Salmonella Typhimurium in soudjouk-style fermented sausage during fermentation, drying, and storage. Batter prepared from ground beef (20% fat), seasonings, starter culture, and dextrose was separately inoculated with a multi-strain mixture of each pathogen to an initial inoculum of ca. 6.5 log(10) CFU/g in the batter. The sausages were subsequently fermented at 24 degrees C with a relative humidity (RH) of 90% to 95% for 3 to 5 days to ca. pH 5.2, pH 4.9 or pH 4.6, then dried at 22 degrees C to a(w) 0.92, a(w) 0.89, or a(w) 0.86, respectively, and then stored at 4, 21, or 30 degrees C for up to 60 days. Lethality of the three pathogens was modeled as a function of pH, a(w) and/or storage temperature. During fermentation to pH 5.2 to pH 4.6, cell reductions ranged from 0 to 0.9 log(10) CFU/g for E. coli O157:H7, 0.1 to 0.5 log(10) CFU/g for L. monocytogenes, and 0 to 2.2 log(10) CFU/g for S. Typhimurium. Subsequent drying of sausages of pH 5.2 to pH 4.6 at 22 degrees C with 80% to 85% RH for 3 to 7 days to a(w) of 0.92 to a(w) 0.86 resulted in additional reductions that ranged from 0 to 3.5 log(10) CFU/g for E. coli O157:H7, 0 to 0.4 log(10) CFU/g for L. monocytogenes, and 0.3 to 2.4 log(10) CFU/g for S. Typhimurium. During storage at 4, 21, or 30 degrees C the reduction rates of the three pathogens were generally higher (p<0.05) in sausages with lower pH and lower a(w) that were stored at higher temperatures. Polynomial equations were developed to describe the inactivation of the three pathogens during fermentation, drying, and storage. The applicability of the resulting models for fermented sausage was evaluated by comparing model predictions with published data. Pathogen reductions estimated by the models for E. coli O157:H7 and S. Typhimurium were comparable to 67% and 73% of published data, respectively. Due to limited published data for L. monocytogenes, the models for L. monocytogenes would need additional validations. Results of pathogen reductions from this study may be used as a reference to assist manufacturers of soudjouk-style sausages to adopt manufacturing processes that meet the regulatory requirements. The resulting models may also be used for estimating the survival of E. coli O157:H7 and S. Typhimurium in other similar fermented sausage during fermentation and storage.
Li, Taotao; Wu, Qixian; Wang, Yong; John, Afiya; Qu, Hongxia; Gong, Liang; Duan, Xuewu; Zhu, Hong; Yun, Ze; Jiang, Yueming
2017-01-01
Fusarium proliferatum is an important pathogen and causes a great economic loss to fruit industry. Environmental pH-value plays a regulatory role in fungi pathogenicity, however, the mechanism needs further exploration. In this study, F. proliferatum was cultured under two initial pH conditions of 5 and 10. No obvious difference was observed in the growth rate of F. proliferatum between two pH-values. F. proliferatum cultured under both pH conditions infected banana fruit successfully, and smaller lesion diameter was presented on banana fruit inoculated with pH 10-cultured fungi. Proteomic approach based on two-dimensional electrophoresis (2-DE) was used to investigate the changes in secretome of this fungus between pH 5 and 10. A total of 39 differential spots were identified using matrix-assisted laser desorption/ionization tandem time-of-flight mass spectrometry (MALDI-TOF/TOF-MS) and liquid chromatography electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS). Compared to pH 5 condition, proteins related to cell wall degrading enzymes (CWDEs) and proteolysis were significantly down-regulated at pH 10, while proteins related to oxidation-reduction process and transport were significantly up-regulated under pH 10 condition. Our results suggested that the downregulation of CWDEs and other virulence proteins in the pH 10-cultured F. proliferatum severely decreased its pathogenicity, compared to pH 5-cultured fungi. However, the alkaline environment did not cause a complete loss of the pathogenic ability of F. proliferatum , probably due to the upregulation of the oxidation-reduction related proteins at pH 10, which may partially compensate its pathogenic ability.
De Vliegher, S; Fox, L K; Piepers, S; McDougall, S; Barkema, H W
2012-03-01
Heifer mastitis is a disease that potentially threatens production and udder health in the first and subsequent lactations. In general, coagulase-negative staphylococci (CNS) are the predominant cause of intramammary infection and subclinical mastitis in heifers around parturition, whereas Staphylococcus aureus and environmental pathogens cause a minority of the cases. Clinical heifer mastitis is typically caused by the major pathogens. The variation in proportions of causative pathogens between studies, herds, and countries is considerable. The magnitude of the effect of heifer mastitis on an individual animal is influenced by the form of mastitis (clinical versus subclinical), the virulence of the causative pathogen(s) (major versus minor pathogens), the time of onset of infection relative to calving, cure or persistence of the infection when milk production has started, and the host's immunity. Intramammary infection in early lactation caused by CNS does not generally have a negative effect on subsequent productivity. At the herd level, the impact will depend on the prevalence and incidence of the disease, the nature of the problem (clinical, subclinical, nonfunctional quarters), the causative pathogens involved (major versus minor pathogens), the ability of the animals to cope with the disease, and the response of the dairy manager to control the disease through management changes. Specific recommendations to prevent and control mastitis in late gestation in periparturient heifers are not part of the current National Mastitis Council mastitis and prevention program. Control and prevention is currently based on avoidance of inter-sucking among young stock, fly control, optimal nutrition, and implementation of hygiene control and comfort measures, especially around calving. More risk factors for subclinical and clinical heifer mastitis have been identified (e.g., season, location of herd, stage of pregnancy) although they do not lend themselves to the development of specific intervention strategies designed to prevent the disease. Pathogen-specific risk factors and associated control measures need to be identified due to the pathogen-related variation in epidemiology and effect on future performance. Prepartum intramammary treatment with antibiotics has been proposed as a simple and effective way of controlling heifer mastitis but positive long-lasting effects on somatic cell count and milk yield do not always occur, ruling out universal recommendation of this practice. Moreover, use of antibiotics in this manner is off-label and results in an increased risk of antibiotic residues in milk. Prepartum treatment can be implemented only as a short-term measure to assist in the control of a significant heifer mastitis problem under supervision of the herd veterinarian. When CNS are the major cause of intramammary infection in heifers, productivity is not affected, making prepartum treatment redundant and even unwanted. In conclusion, heifer mastitis can affect the profitability of dairy farming because of a potential long-term negative effect on udder health and milk production and an associated culling risk, specifically when major pathogens are involved. Prevention and control is not easy but is possible through changes in young stock and heifer management. However, the pathogenesis and epidemiology of the disease remain largely unknown and more pathogen-specific risk factors should be identified to optimize current prevention programs. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Effects of Cancer Genetic Panel Testing on at-Risk Individuals.
Frost, Anja S; Toaff, Miriam; Biagi, Tara; Stark, Elizabeth; McHenry, Allison; Kaltman, Rebecca
2018-06-01
To evaluate the role of screening patients at increased risk for hereditary cancer syndromes with an extended panel of cancer predisposition genes to identify actionable genetic mutations. A retrospective chart review was conducted of all patients presenting to a multidisciplinary cancer program for genetic counseling and testing from January 2015 to December 2016. Individuals presenting to the program were identified as at-risk by a personal or family history of cancer, by their health care provider, or by self-referral. All participants met current National Comprehensive Cancer Network criteria for genetic risk evaluation for hereditary cancer. The results of testing and its implications for management, based on National Comprehensive Cancer Network guidelines, were recorded. Of 670 at-risk patients who underwent genetic testing, 66 (9.9%) had BRCA-limited testing; of these, 26 of 670 (3.9%) had a deleterious or likely pathogenic mutation. Expanded panel testing was done for 560 of the 670 patients (83.4%), and abnormal results were found in 65 of 670 (9.7%); non-BRCA mutations (predominantly CHEK2) were found in 49 of the 65 (75%). Abnormal genetic testing was associated with increased surveillance in 96% of those with deleterious mutations, whereas negative testing for a known familial mutation in 45 patients was associated with a downgrade of their risk and reduction of subsequent surveillance and management. Guideline-based management is frequently altered by genetic testing, including panel testing, in patients at risk for cancer. We recommend that obstetrics and gynecology providers routinely refer at-risk patients for genetic counseling and testing when clinically appropriate.
Kroupa, Radek; Jurankova, Jana; Dastych, Milan; Senkyrik, Michal; Pavlik, Tomas; Prokesova, Jitka; Jecmenova, Marketa; Dolina, Jiri; Hep, Ales
2014-01-01
The aim of this study was to monitor oropharyngeal bacterial colonization in patients indicated for percutaneous endoscopic gastronomy (PEG). Oropharyngeal swabs were obtained from patients prior to PEG placement. A development of peristomal infection was evaluated. The analysis of oropharyngeal and peristomal site pathogens was done. Consecutive 274 patients referred for PEG due to neurological disorder or cancer completed the study. Oropharyngeal colonization with pathogens was observed in 69% (190/274), dominantly in the neurologic subgroup of patients (P < 0.001). Peristomal infection occurred in 30 (10.9%) of patients and in 57% of them the correlation between oropharyngeal and peristomal agents was present. The presence of oropharyngeal pathogens was assessed as an important risk factor for the development of peristomal infection only in oncological patients (OR = 8.33, 95% CI: 1.66-41.76). Despite a high prevalence of pathogens in neurological patients, it did not influence the risk of peristomal infection with the exception for methicillin resistant Staphylococcus aureus (MRSA) carriers (OR 4.5, 95% CI: 1.08-18.76). During oropharyngeal microbial screening prior to the PEG insertion, the detection of pathogens may be a marker of the increased risk of peristomal infection in cancer patients only. In neurological patients the benefit of the screening is limited to the detection of MRSA carriers.
Amarasiri, Mohan; Kitajima, Masaaki; Nguyen, Thanh H; Okabe, Satoshi; Sano, Daisuke
2017-09-15
The multiple-barrier concept is widely employed in international and domestic guidelines for wastewater reclamation and reuse for microbiological risk management, in which a wastewater reclamation system is designed to achieve guideline values of the performance target of microbe reduction. Enteric viruses are one of the pathogens for which the target reduction values are stipulated in guidelines, but frequent monitoring to validate human virus removal efficacy is challenging in a daily operation due to the cumbersome procedures for virus quantification in wastewater. Bacteriophages have been the first choice surrogate for this task, because of the well-characterized nature of strains and the presence of established protocols for quantification. Here, we performed a meta-analysis to calculate the average log 10 reduction values (LRVs) of somatic coliphages, F-specific phages, MS2 coliphage and T4 phage by membrane bioreactor, activated sludge, constructed wetlands, pond systems, microfiltration and ultrafiltration. The calculated LRVs of bacteriophages were then compared with reported human enteric virus LRVs. MS2 coliphage LRVs in MBR processes were shown to be lower than those of norovirus GII and enterovirus, suggesting it as a possible validation and operational monitoring tool. The other bacteriophages provided higher LRVs compared to human viruses. The data sets on LRVs of human viruses and bacteriophages are scarce except for MBR and conventional activated sludge processes, which highlights the necessity of investigating LRVs of human viruses and bacteriophages in multiple treatment unit processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
The epidemiology of microbial keratitis with silicone hydrogel contact lenses.
Stapleton, Fiona; Keay, Lisa; Edwards, Katie; Holden, Brien
2013-01-01
It was widely anticipated that after the introduction of silicone hydrogel lenses, the risk of microbial keratitis would be lower than with hydrogel lenses because of the reduction in hypoxic effects on the corneal epithelium. Large-scale epidemiological studies have confirmed that the absolute and relative risk of microbial keratitis is unchanged with overnight use of silicone hydrogel materials. The key findings include the following: (1) The risk of infection with 30 nights of silicone hydrogel use is equivalent to 6 nights of hydrogel extended wear; (2) Occasional overnight lens use is associated with a greater risk than daily lens use; (3) The rate of vision loss due to corneal infection with silicone hydrogel contact lenses is similar to that seen in hydrogel lenses; (4) The spectrum of causative organisms is similar to that seen in hydrogel lenses, and the material type does not impact the corneal location of presumed microbial keratitis; and (5) Modifiable risk factors for infection include overnight lens use, the degree of exposure, failing to wash hands before lens handling, and storage case hygiene practice. The lack of change in the absolute risk of disease would suggest that exposure to large number of pathogenic organisms can overcome any advantages obtained from eliminating the hypoxic effects of contact lenses. Epidemiological studies remain important in the assessment of new materials and modalities. Consideration of an early adopter effect with studies involving new materials and modalities and further investigation of the impact of second-generation silicone hydrogel materials is warranted.
Zavizion, B; Serebryanik, D; Chapman, J; Alford, B; Purmal, A
2004-10-01
The risk of transfusion-transmitted bacterial infections as a result of the presence of bacteria in blood is one of the major concerns in transfusion medicine. The purpose of this study was to investigate whether bacteria inoculated into red blood cell concentrates can be inactivated by the INACTINE PEN110 pathogen-reduction process. Four bacterial species were chosen for the study: anaerobic Gram-positive Clostridium perfringens and Propionibacterium acnes, known to be transfusion-transmitted; and two Gram-negative species, Acinetobacter johnsonii and Acinetobacter lwoffii, recently reported to be a common cause of transfusion-associated infections in Europe. Identical units of leucoreduced red cell concentrates were inoculated with A. johnsonii, A. lwoffii, C. perfringens, or P. acnes. The 4 degrees C control units were put on storage immediately after receiving the spike. The test units were subjected to PEN110 treatment and then stored. The bacterial titre in all units was monitored during a 6-week storage period. The PEN110 inactivation of all tested bacterial strains was time- and titre-dependent. For A. johnsonii and A. lwoffii, no viable bacteria were detected in the units spiked with up to 10(4) colony-forming units (CFU)/ml and treated with PEN110. For red cell units spiked with 10(4)-10(5) CFU/ml of C. perfringens and P. acnes, no viable bacteria were detected in the units treated with PEN110. In control units, there was a gradual decrease in A. johnsonii, A. lwoffii and C. perfringens titres during cold storage, while P. acnes titres remained stable. The PEN110 pathogen-reduction process was demonstrated to inactivate high titres of A. johnsonii, A. lwoffii, C. perfringens and P. acnes in red cell concentrates.
Brazeau, Randi H.; Edwards, Marc A.
2013-01-01
Abstract Residential water heating is linked to growth of pathogens in premise plumbing, which is the primary source of waterborne disease in the United States. Temperature and disinfectant residual are critical factors controlling increased concentration of pathogens, but understanding of how each factor varies in different water heater configurations is lacking. A direct comparative study of electric water heater systems was conducted to evaluate temporal variations in temperature and water quality parameters including dissolved oxygen levels, hydrogen evolution, total and soluble metal concentrations, and disinfectant decay. Recirculation tanks had much greater volumes of water at temperature ranges with potential for increased pathogen growth when set at 49°C compared with standard tank systems without recirculation. In contrast, when set at the higher end of acceptable ranges (i.e., 60°C), this relationship was reversed and recirculation systems had less volume of water at risk for pathogen growth compared with conventional systems. Recirculation tanks also tended to have much lower levels of disinfectant residual (standard systems had 40–600% higher residual), 4–6 times as much hydrogen, and 3–20 times more sediment compared with standard tanks without recirculation. On demand tankless systems had very small volumes of water at risk and relatively high levels of disinfectant residual. Recirculation systems may have distinct advantages in controlling pathogens via thermal disinfection if set at 60°C, but these systems have lower levels of disinfectant residual and greater volumes at risk if set at lower temperatures. PMID:24170969
USDA-ARS?s Scientific Manuscript database
Currently, nearly all fresh-cut lettuce processing facilities in the United States use chlorinated water or other sanitizer solutions for microbial reduction after lettuce is cut. It is believed that freshly cut lettuce releases significant amounts of organic matters that negatively impact the effec...
Cardoso, Teresa; Ribeiro, Orquídea; Aragão, Irene César; Costa-Pereira, Altamiro; Sarmento, António Eugénio
2012-12-26
There is a lack of consensus regarding the definition of risk factors for healthcare-associated infection (HCAI). The purpose of this study was to identify additional risk factors for HCAI, which are not included in the current definition of HCAI, associated with infection by multidrug-resistant (MDR) pathogens, in all hospitalized infected patients from the community. This 1-year prospective cohort study included all patients with infection admitted to a large, tertiary care, university hospital. Risk factors not included in the HCAI definition, and independently associated with MDR pathogen infection, namely MDR Gram-negative (MDR-GN) and ESKAPE microorganisms (vancomycin-resistant Enterococcus faecium, methicillin-resistant Staphylococcus aureus, extended-spectrum beta-lactamase-producing Escherichia coli and Klebsiella species, carbapenem-hydrolyzing Klebsiella pneumonia and MDR Acinetobacter baumannii, Pseudomonas aeruginosa, Enterobacter species), were identified by logistic regression among patients admitted from the community (either with community-acquired or HCAI). There were 1035 patients with infection, 718 from the community. Of these, 439 (61%) had microbiologic documentation; 123 were MDR (28%). Among MDR: 104 (85%) had MDR-GN and 41 (33%) had an ESKAPE infection. Independent risk factors associated with MDR and MDR-GN infection were: age (adjusted odds ratio (OR) = 1.7 and 1.5, p = 0.001 and p = 0.009, respectively), and hospitalization in the previous year (between 4 and 12 months previously) (adjusted OR = 2.0 and 1,7, p = 0.008 and p = 0.048, respectively). Infection by pathogens from the ESKAPE group was independently associated with previous antibiotic therapy (adjusted OR = 7.2, p < 0.001) and a Karnofsky index <70 (adjusted OR = 3.7, p = 0.003). Patients with infection by MDR, MDR-GN and pathogens from the ESKAPE group had significantly higher rates of inadequate antibiotic therapy than those without (46% vs 7%, 44% vs 10%, 61% vs 15%, respectively, p < 0.001). This study suggests that the inclusion of additional risk factors in the current definition of HCAI for MDR pathogen infection, namely age >60 years, Karnofsky index <70, hospitalization in the previous year, and previous antibiotic therapy, may be clinically beneficial for early diagnosis, which may decrease the rate of inadequate antibiotic therapy among these patients.
2012-01-01
Background There is a lack of consensus regarding the definition of risk factors for healthcare-associated infection (HCAI). The purpose of this study was to identify additional risk factors for HCAI, which are not included in the current definition of HCAI, associated with infection by multidrug-resistant (MDR) pathogens, in all hospitalized infected patients from the community. Methods This 1-year prospective cohort study included all patients with infection admitted to a large, tertiary care, university hospital. Risk factors not included in the HCAI definition, and independently associated with MDR pathogen infection, namely MDR Gram-negative (MDR-GN) and ESKAPE microorganisms (vancomycin-resistant Enterococcus faecium, methicillin-resistant Staphylococcus aureus, extended-spectrum beta-lactamase-producing Escherichia coli and Klebsiella species, carbapenem-hydrolyzing Klebsiella pneumonia and MDR Acinetobacter baumannii, Pseudomonas aeruginosa, Enterobacter species), were identified by logistic regression among patients admitted from the community (either with community-acquired or HCAI). Results There were 1035 patients with infection, 718 from the community. Of these, 439 (61%) had microbiologic documentation; 123 were MDR (28%). Among MDR: 104 (85%) had MDR-GN and 41 (33%) had an ESKAPE infection. Independent risk factors associated with MDR and MDR-GN infection were: age (adjusted odds ratio (OR) = 1.7 and 1.5, p = 0.001 and p = 0.009, respectively), and hospitalization in the previous year (between 4 and 12 months previously) (adjusted OR = 2.0 and 1,7, p = 0.008 and p = 0.048, respectively). Infection by pathogens from the ESKAPE group was independently associated with previous antibiotic therapy (adjusted OR = 7.2, p < 0.001) and a Karnofsky index <70 (adjusted OR = 3.7, p = 0.003). Patients with infection by MDR, MDR-GN and pathogens from the ESKAPE group had significantly higher rates of inadequate antibiotic therapy than those without (46% vs 7%, 44% vs 10%, 61% vs 15%, respectively, p < 0.001). Conclusions This study suggests that the inclusion of additional risk factors in the current definition of HCAI for MDR pathogen infection, namely age >60 years, Karnofsky index <70, hospitalization in the previous year, and previous antibiotic therapy, may be clinically beneficial for early diagnosis, which may decrease the rate of inadequate antibiotic therapy among these patients. PMID:23267668
Sze To, G N; Chao, C Y H
2010-02-01
Infection risk assessment is very useful in understanding the transmission dynamics of infectious diseases and in predicting the risk of these diseases to the public. Quantitative infection risk assessment can provide quantitative analysis of disease transmission and the effectiveness of infection control measures. The Wells-Riley model has been extensively used for quantitative infection risk assessment of respiratory infectious diseases in indoor premises. Some newer studies have also proposed the use of dose-response models for such purpose. This study reviews and compares these two approaches to infection risk assessment of respiratory infectious diseases. The Wells-Riley model allows quick assessment and does not require interspecies extrapolation of infectivity. Dose-response models can consider other disease transmission routes in addition to airborne route and can calculate the infectious source strength of an outbreak in terms of the quantity of the pathogen rather than a hypothetical unit. Spatial distribution of airborne pathogens is one of the most important factors in infection risk assessment of respiratory disease. Respiratory deposition of aerosol induces heterogeneous infectivity of intake pathogens and randomness on the intake dose, which are not being well accounted for in current risk models. Some suggestions for further development of the risk assessment models are proposed. This review article summarizes the strengths and limitations of the Wells-Riley and the dose-response models for risk assessment of respiratory diseases. Even with many efforts by various investigators to develop and modify the risk assessment models, some limitations still persist. This review serves as a reference for further development of infection risk assessment models of respiratory diseases. The Wells-Riley model and dose-response model offer specific advantages. Risk assessors can select the approach that is suitable to their particular conditions to perform risk assessment.
MANAGING URBAN WATERSHED PATHOGEN CONTAMINATION
This presentation is a summary of the EPA National Risk Management Research Laboratory (NRMRL) publication entitled Managing Urban Watershed Pathogen Contamination, EPA/600/R-03/111 (September 2003). It is available on the internet at http://www.epa.gov/ednnrmrl/repository/water...
MetaCompare: A computational pipeline for prioritizing environmental resistome risk.
Oh, Min; Pruden, Amy; Chen, Chaoqi; Heath, Lenwood S; Xia, Kang; Zhang, Liqing
2018-04-26
The spread of antibiotic resistance is a growing public health concern. While numerous studies have highlighted the importance of environmental sources and pathways of the spread of antibiotic resistance, a systematic means of comparing and prioritizing risks represented by various environmental compartments is lacking. Here we introduce MetaCompare, a publicly-available tool for ranking 'resistome risk,' which we define as the potential for antibiotic resistance genes (ARGs) to be associated with mobile genetic elements (MGEs) and mobilize to pathogens based on metagenomic data. A computational pipeline was developed in which each ARG is evaluated based on relative abundance, mobility, and presence within a pathogen. This is determined through assembly of shotgun sequencing data and analysis of contigs containing ARGs to determine if they contain sequence similarity to MGEs or human pathogens. Based on the assembled metagenomes, samples are projected into a 3-D hazard space and assigned resistome risk scores. To validate, we tested previously published metagenomic data derived from distinct aquatic environments. Based on unsupervised machine learning, the test samples clustered in the hazard space in a manner consistent with their origin. The derived scores produced a well-resolved ascending resistome risk ranking of: wastewater treatment plant effluent, dairy lagoon, hospital sewage.
Kamins, Alexandra O; Rowcliffe, J Marcus; Ntiamoa-Baidu, Yaa; Cunningham, Andrew A; Wood, James L N; Restif, Olivier
2015-03-01
Emerging zoonotic pathogens from wildlife pose increasing public health threats globally. Bats, in particular, host an array of zoonotic pathogens, yet there is little research on how bats and humans interact, how people perceive bats and their accompanying disease risk, or who is most at risk. Eidolon helvum, the largest and most abundant African fruit bat species, is widely hunted and eaten in Ghana and also carries potentially zoonotic pathogens. This combination raises concerns, as hunting and butchering bushmeat are common sources of zoonotic transmission. Through a combination of interviews with 577 Ghanaians across southern Ghana, we identified the characteristics of people involved in the bat-bushmeat trade and we explored their perceptions of risk. Bat hunting, selling and consumption are widely distributed across regional and ethnic lines, with hotspots in certain localities, while butchering is predominantly done by women and active hunters. Interviewees held little belief of disease risk from bats, saw no ecological value in fruit bats and associated the consumption of bats with specific tribes. These data can be used to inform disease and conservation management plans, drawing on social contexts and ensuring that local voices are heard within the larger global effort to study and mitigate outbreaks.
Significance of Viable but Nonculturable Escherichia coli: Induction, Detection, and Control.
Ding, Tian; Suo, Yuanjie; Xiang, Qisen; Zhao, Xihong; Chen, Shiguo; Ye, Xingqian; Liu, Donghong
2017-03-28
Diseases caused by foodborne or waterborne pathogens are emerging. Many pathogens can enter into the viable but nonculturable (VBNC) state, which is a survival strategy when exposed to harsh environmental stresses. Pathogens in the VBNC state have the ability to evade conventional microbiological detection methods, posing a significant and potential health risk. Therefore, controlling VBNC bacteria in food processing and the environment is of great importance. As the typical one of the gram-negatives, Escherichia coli ( E. coli ) is a widespread foodborne and waterborne pathogenic bacterium and is able to enter into a VBNC state in extreme conditions (similar to the other gram-negative bacteria), including inducing factors and resuscitation stimulus. VBNC E. coli has the ability to recover both culturability and pathogenicity, which may bring potential health risk. This review describes the concrete factors (nonthermal treatment, chemical agents, and environmental factors) that induce E. coli into the VBNC state, the condition or stimulus required for resuscitation of VBNC E. coli , and the methods for detecting VBNC E. coli . Furthermore, the mechanism of genes and proteins involved in the VBNC E. coli is also discussed in this review.
Li, Sen; Gilbert, Lucy; Harrison, Paula A; Rounsevell, Mark D A
2016-03-01
Lyme disease is the most prevalent vector-borne disease in the temperate Northern Hemisphere. The abundance of infected nymphal ticks is commonly used as a Lyme disease risk indicator. Temperature can influence the dynamics of disease by shaping the activity and development of ticks and, hence, altering the contact pattern and pathogen transmission between ticks and their host animals. A mechanistic, agent-based model was developed to study the temperature-driven seasonality of Ixodes ricinus ticks and transmission of Borrelia burgdorferi sensu lato across mainland Scotland. Based on 12-year averaged temperature surfaces, our model predicted that Lyme disease risk currently peaks in autumn, approximately six weeks after the temperature peak. The risk was predicted to decrease with increasing altitude. Increases in temperature were predicted to prolong the duration of the tick questing season and expand the risk area to higher altitudinal and latitudinal regions. These predicted impacts on tick population ecology may be expected to lead to greater tick-host contacts under climate warming and, hence, greater risks of pathogen transmission. The model is useful in improving understanding of the spatial determinants and system mechanisms of Lyme disease pathogen transmission and its sensitivity to temperature changes. © 2016 The Author(s).
Gilbert, Lucy; Harrison, Paula A.; Rounsevell, Mark D. A.
2016-01-01
Lyme disease is the most prevalent vector-borne disease in the temperate Northern Hemisphere. The abundance of infected nymphal ticks is commonly used as a Lyme disease risk indicator. Temperature can influence the dynamics of disease by shaping the activity and development of ticks and, hence, altering the contact pattern and pathogen transmission between ticks and their host animals. A mechanistic, agent-based model was developed to study the temperature-driven seasonality of Ixodes ricinus ticks and transmission of Borrelia burgdorferi sensu lato across mainland Scotland. Based on 12-year averaged temperature surfaces, our model predicted that Lyme disease risk currently peaks in autumn, approximately six weeks after the temperature peak. The risk was predicted to decrease with increasing altitude. Increases in temperature were predicted to prolong the duration of the tick questing season and expand the risk area to higher altitudinal and latitudinal regions. These predicted impacts on tick population ecology may be expected to lead to greater tick–host contacts under climate warming and, hence, greater risks of pathogen transmission. The model is useful in improving understanding of the spatial determinants and system mechanisms of Lyme disease pathogen transmission and its sensitivity to temperature changes. PMID:27030039
Infectious disease risks in xenotransplantation.
Fishman, Jay A
2018-03-07
Hurdles exist to clinical xenotransplantation including potential infectious transmission from nonhuman species to xenograft recipients. In anticipation of clinical trials of xenotransplantation, the associated infectious risks have been investigated. Swine and immunocompromised humans share some potential pathogens. Swine herpesviruses including porcine cytomegalovirus (PCMV) and porcine lymphotropic herpesvirus (PLHV) are largely species-specific and do not, generally, infect human cells. Human cellular receptors exist for porcine endogenous retrovirus (PERV), which infects certain human-derived cell lines in vitro. PERV-inactivated pigs have been produced recently. Human infection due to PERV has not been described. A screening paradigm can be applied to exclude potential human pathogens from "designated pathogen free" breeding colonies. Various microbiological assays have been developed for screening and diagnosis including antibody-based tests and qualitative and quantitative molecular assays for viruses. Additional assays may be required to diagnose pig-specific organisms in human xenograft recipients. Significant progress has been made in the evaluation of the potential infectious risks of clinical xenotransplantation. Infectious risk would be amplified by intensive immunosuppression. The available data suggest that risks of xenotransplant-associated recipient infection are manageable and that clinical trials can be performed safely. Possible infectious risks of xenotransplantation to the community at large are undefined but merit consideration. © 2018 The American Society of Transplantation and the American Society of Transplant Surgeons.
Null expectations for disease dynamics in shrinking habitat: dilution or amplification?
McCallum, Hamish I.; Gillespie, Thomas R.
2017-01-01
As biodiversity declines with anthropogenic land-use change, it is increasingly important to understand how changing biodiversity affects infectious disease risk. The dilution effect hypothesis, which points to decreases in biodiversity as critical to an increase in infection risk, has received considerable attention due to the allure of a win–win scenario for conservation and human well-being. Yet some empirical data suggest that the dilution effect is not a generalizable phenomenon. We explore the response of pathogen transmission dynamics to changes in biodiversity that are driven by habitat loss using an allometrically scaled multi-host model. With this model, we show that declining habitat, and thus declining biodiversity, can lead to either increasing or decreasing infectious-disease risk, measured as endemic prevalence. Whether larger habitats, and thus greater biodiversity, lead to a decrease (dilution effect) or increase (amplification effect) in infection prevalence depends upon the pathogen transmission mode and how host competence scales with body size. Dilution effects were detected for most frequency-transmitted pathogens and amplification effects were detected for density-dependent pathogens. Amplification effects were also observed over a particular range of habitat loss in frequency-dependent pathogens when we assumed that host competence was greatest in large-bodied species. By contrast, only amplification effects were observed for density-dependent pathogens; host competency only affected the magnitude of the effect. These models can be used to guide future empirical studies of biodiversity–disease relationships across gradients of habitat loss. The type of transmission, the relationship between host competence and community assembly, the identity of hosts contributing to transmission, and how transmission scales with area are essential factors to consider when elucidating the mechanisms driving disease risk in shrinking habitat. This article is part of the themed issue ‘Conservation, biodiversity and infectious disease: scientific evidence and policy implications'. PMID:28438921
Null expectations for disease dynamics in shrinking habitat: dilution or amplification?
Faust, Christina L; Dobson, Andrew P; Gottdenker, Nicole; Bloomfield, Laura S P; McCallum, Hamish I; Gillespie, Thomas R; Diuk-Wasser, Maria; Plowright, Raina K
2017-06-05
As biodiversity declines with anthropogenic land-use change, it is increasingly important to understand how changing biodiversity affects infectious disease risk. The dilution effect hypothesis, which points to decreases in biodiversity as critical to an increase in infection risk, has received considerable attention due to the allure of a win-win scenario for conservation and human well-being. Yet some empirical data suggest that the dilution effect is not a generalizable phenomenon. We explore the response of pathogen transmission dynamics to changes in biodiversity that are driven by habitat loss using an allometrically scaled multi-host model. With this model, we show that declining habitat, and thus declining biodiversity, can lead to either increasing or decreasing infectious-disease risk, measured as endemic prevalence. Whether larger habitats, and thus greater biodiversity, lead to a decrease (dilution effect) or increase (amplification effect) in infection prevalence depends upon the pathogen transmission mode and how host competence scales with body size. Dilution effects were detected for most frequency-transmitted pathogens and amplification effects were detected for density-dependent pathogens. Amplification effects were also observed over a particular range of habitat loss in frequency-dependent pathogens when we assumed that host competence was greatest in large-bodied species. By contrast, only amplification effects were observed for density-dependent pathogens; host competency only affected the magnitude of the effect. These models can be used to guide future empirical studies of biodiversity-disease relationships across gradients of habitat loss. The type of transmission, the relationship between host competence and community assembly, the identity of hosts contributing to transmission, and how transmission scales with area are essential factors to consider when elucidating the mechanisms driving disease risk in shrinking habitat.This article is part of the themed issue 'Conservation, biodiversity and infectious disease: scientific evidence and policy implications'. © 2017 The Author(s).
Teillant, Aude; Gandra, Sumanth; Barter, Devra; Morgan, Daniel J; Laxminarayan, Ramanan
2015-12-01
The declining efficacy of existing antibiotics potentially jeopardises outcomes in patients undergoing medical procedures. We investigated the potential consequences of increases in antibiotic resistance on the ten most common surgical procedures and immunosuppressing cancer chemotherapies that rely on antibiotic prophylaxis in the USA. We searched the published scientific literature and identified meta-analyses and reviews of randomised controlled trials or quasi-randomised controlled trials (allocation done on the basis of a pseudo-random sequence-eg, odd/even hospital number or date of birth, alternation) to estimate the efficacy of antibiotic prophylaxis in preventing infections and infection-related deaths after surgical procedures and immunosuppressing cancer chemotherapy. We varied the identified effect sizes under different scenarios of reduction in the efficacy of antibiotic prophylaxis (10%, 30%, 70%, and 100% reductions) and estimated the additional number of infections and infection-related deaths per year in the USA for each scenario. We estimated the percentage of pathogens causing infections after these procedures that are resistant to standard prophylactic antibiotics in the USA. We estimate that between 38·7% and 50·9% of pathogens causing surgical site infections and 26·8% of pathogens causing infections after chemotherapy are resistant to standard prophylactic antibiotics in the USA. A 30% reduction in the efficacy of antibiotic prophylaxis for these procedures would result in 120,000 additional surgical site infections and infections after chemotherapy per year in the USA (ranging from 40,000 for a 10% reduction in efficacy to 280,000 for a 70% reduction in efficacy), and 6300 infection-related deaths (range: 2100 for a 10% reduction in efficacy, to 15,000 for a 70% reduction). We estimated that every year, 13,120 infections (42%) after prostate biopsy are attributable to resistance to fluoroquinolones in the USA. Increasing antibiotic resistance potentially threatens the safety and efficacy of surgical procedures and immunosuppressing chemotherapy. More data are needed to establish how antibiotic prophylaxis recommendations should be modified in the context of increasing rates of resistance. DRIVE-AB Consortium. Copyright © 2015 Elsevier Ltd. All rights reserved.
Honda, Hitoshi; Iwata, Kentaro
2016-08-01
Personal protective equipment (PPE) protects healthcare workers (HCWs) from infection by highly virulent pathogens via exposure to body fluids and respiratory droplets. Given the recent outbreaks of contagious infectious diseases worldwide, including Ebola virus and Middle Eastern respiratory syndrome, there is urgent need for further research to determine optimal PPE use in high-risk settings. This review intends to provide a general understanding of PPE and to provide guidelines for appropriate use based on current evidence. Although previous studies have focused on the efficacy of PPE in preventing transmission of pathogens, recent studies have examined the dangers to HCWs during removal of PPE when risk of contamination is highest. Access to adequate PPE supplies is crucial to preventing transmission of pathogens, especially in resource-limited settings. Adherence to appropriate PPE use is a challenge due to inadequate education on its usage, technical difficulties, and tolerability of PPE in the workplace. Future projects aim at ameliorating this situation, including redesigning PPE which is crucial to improving the safety of HCWs. PPE remains the most important strategy for protecting HCW from potentially fatal pathogens. Further research into optimal PPE design and use to improve the safety of HCWs is urgently needed.
[Important vector-borne infectious diseases among humans in Germany. Epidemiological aspects].
Frank, C; Faber, M; Hellenbrand, W; Wilking, H; Stark, K
2014-05-01
Vector-borne infections pathogenic to humans play an important role in Germany. The relevant zoonotic pathogens are either endemic throughout Germany (e.g. Borrelia burgdorferi sensu latu) or only in specific regions, e.g. tick-borne encephalitis (TBE) virus and hantavirus. They cause a substantial burden of disease. Prevention and control largely rely on public advice and the application of personal protective measures (e.g. TBE virus vaccination and protection against vectors). High quality surveillance and targeted epidemiological studies are fundamental for the evaluation of temporal and spatial risks of infection and the effectiveness of preventive measures. Aside from endemic pathogens, vector-borne infections acquired abroad, mostly transmitted by mosquitoes, have to be systematically and intensively monitored as well, to assess the risk of infection for German residents traveling abroad and to adequately evaluate the risk of autochthonous transmission. Related issues, such as invasive species of mosquitoes in Germany and climate change, have to be taken into consideration. Such pathogens include West Nile, dengue and chikungunya viruses, as well as malaria parasites (Plasmodium species). The article presents an overview of the epidemiological situation of selected relevant vector-borne infections in Germany.
RISK ASSESSMENT AND EPIDEMIOLOGICAL INFORMATION FOR PATHOGENIC MICROORGANISMS APPLIED TO SOIL
There is increasing interest in the development of a microbial risk assessment methodology for regulatory and operational decision making. Initial interests in microbial risk assessments focused on drinking, recreational, and reclaimed water issues. More recently risk assessmen...
MICROBIOLOGICAL RISK ASSESSMENT FOR LAND APPLICATION OF MUNICIPAL SLUDGE
Each major option for the disposal/reuse of municipal sludges poses potential risks to human health or the environment because of the microbial contaminants in sludge. Therefore, risk assessment methodology appropriate for pathogen risk evaluation for land application and distrib...
Liao, Chien-Wei; Chuang, Ting-Wu; Huang, Ying-Chieh; Chou, Chia-Mei; Chiang, Chia-Lien; Lee, Fei-Peng; Hsu, Yun-Ting; Lin, Jia-Wei; Briand, Kennar; Tu, Chia-Ying; Fan, Chia-Kwung
2017-12-01
Intestinal parasitic infections (IPIs) among schoolchildren in Republic of Marshall Islands (RMI) largely remains unknown, thus investigation on IPIs status to establish the baseline data is urgently needed. This cross-sectional study intended to investigate the current IPIs status and associated risk factors among schoolchildren at capital of RMI. Single stool sample from 400 schoolchildren (207 boys and 193 girls) aged 9.73±2.50 yrs old was examined by employing merthiolate-iodine-formaldehyde concentration method. Demographic characteristics, uncomfortable symptoms and risk factors were obtained by questionnaires investigation. The overall prevalence of IPIs in schoolchildren was 22.8% (91/400), of them 24.2% harbored at least 2 different parasites. Notably, the majority was infected by waterborne protozoan parasites (82.4%, 75/91). Nine different intestinal parasites have been identified, of which six were pathogenic including Hook worm, Trichuris trichiura, Enterobius vermicularis, Entamoeba histolytica/dispar, Giardia intestinalis and Blastocystis hominis. Schoolchildren who ever complained dizziness or headache showed a significant higher prevalence of pathogenic IPIs than those who did not (p<0.05). Schoolchildren who lived in urban area than rural area had higher chance to acquire pathogenic IPIs (p=0.03). However, none of risk factors were identified to be associated with pathogenic IPIs. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
VanDerslice, James; Briscoe, James
1993-07-01
Storing drinking water in the home is common in the developing world. Several studies have documented increased concentrations of fecal coliforms during household storage. This has led to the belief that in-house water contamination is an important transmission route for enteric pathogens and, moreover, that improving water source quality is not warranted until that quality can be maintained in the home. We contend that in-house water contamination does not pose a serious risk of diarrhea because family members would likely develop some level of immunity to pathogens commonly encountered in the household environment. Even when there is no such immunity, transmission of these pathogens via stored water may be inefficient relative to other household transmission routes, such as person-to-person contact or food contamination. A contaminated water source poses much more of a risk since it may introduce new pathogens into the household, The effects of water source and in-house contamination on diarrheal disease are estimated for 2355 Filipino infants. The results confirm our hypothesis: contaminated water sources pose a serious risk of diarrhea while contamination of drinking water in the home does not. Water boiling is shown to eliminate the risk of diarrhea due to water source contamination. The results imply that improvements in water source quality are more important than improving water storage practices.
Alendronate for fracture prevention in postmenopause.
Holder, Kathryn K; Kerley, Sara Shelton
2008-09-01
Osteoporosis is an abnormal reduction in bone mass and bone deterioration leading to increased fracture risk. Alendronate (Fosamax) belongs to the bisphosphonate class of drugs, which act to inhibit bone resorption by interfering with the activity of osteoclasts. To assess the effectiveness of alendronate in the primary and secondary prevention of osteoporotic fractures in postmenopausal women. The authors searched Central, Medline, and EMBASE for relevant randomized controlled trials published from 1966 to 2007. The authors undertook study selection and data abstraction in duplicate. The authors performed meta-analysis of fracture outcomes using relative risks, and a relative change greater than 15 percent was considered clinically important. The authors assessed study quality through reporting of allocation concealment, blinding, and withdrawals. Eleven trials representing 12,068 women were included in the review. Relative and absolute risk reductions for the 10-mg dose were as follows. For vertebral fractures, a 45 percent relative risk reduction was found (relative risk [RR] = 0.55; 95% confidence interval [CI], 0.45 to 0.67). This was significant for primary prevention, with a 45 percent relative risk reduction (RR = 0.55; 95% CI, 0.38 to 0.80) and 2 percent absolute risk reduction; and for secondary prevention, with 45 percent relative risk reduction (RR = 0.55; 95% CI, 0.43 to 0.69) and 6 percent absolute risk reduction. For nonvertebral fractures, a 16 percent relative risk reduction was found (RR = 0.84; 95% CI, 0.74 to 0.94). This was significant for secondary prevention, with a 23 percent relative risk reduction (RR = 0.77; 95% CI, 0.64 to 0.92) and a 2 percent absolute risk reduction, but not for primary prevention (RR = 0.89; 95% CI, 0.76 to 1.04). There was a 40 percent relative risk reduction in hip fractures (RR = 0.60; 95% CI, 0.40 to 0.92), but only secondary prevention was significant, with a 53 percent relative risk reduction (RR = 0.47; 95% CI, 0.26 to 0.85) and a 1 percent absolute risk reduction. The only significance found for wrist fractures was in secondary prevention, with a 50 percent relative risk reduction (RR = 0.50; 95% CI, 0.34 to 0.73) and a 2 percent absolute risk reduction. For adverse events, the authors found no statistically significant difference in any included study. However, observational data raise concerns about potential risk for upper gastrointestinal injury and, less commonly, osteonecrosis of the jaw. At 10 mg of alendronate per day, clinically important and statistically significant reductions in vertebral, nonvertebral, hip, and wrist fractures were observed for secondary prevention. The authors found no statistically significant results for primary prevention, with the exception of vertebral fractures, for which the reduction was clinically important.
A Simple Model to Rank Shellfish Farming Areas Based on the Risk of Disease Introduction and Spread.
Thrush, M A; Pearce, F M; Gubbins, M J; Oidtmann, B C; Peeler, E J
2017-08-01
The European Union Council Directive 2006/88/EC requires that risk-based surveillance (RBS) for listed aquatic animal diseases is applied to all aquaculture production businesses. The principle behind this is the efficient use of resources directed towards high-risk farm categories, animal types and geographic areas. To achieve this requirement, fish and shellfish farms must be ranked according to their risk of disease introduction and spread. We present a method to risk rank shellfish farming areas based on the risk of disease introduction and spread and demonstrate how the approach was applied in 45 shellfish farming areas in England and Wales. Ten parameters were used to inform the risk model, which were grouped into four risk themes based on related pathways for transmission of pathogens: (i) live animal movement, (ii) transmission via water, (iii) short distance mechanical spread (birds) and (iv) long distance mechanical spread (vessels). Weights (informed by expert knowledge) were applied both to individual parameters and to risk themes for introduction and spread to reflect their relative importance. A spreadsheet model was developed to determine quantitative scores for the risk of pathogen introduction and risk of pathogen spread for each shellfish farming area. These scores were used to independently rank areas for risk of introduction and for risk of spread. Thresholds were set to establish risk categories (low, medium and high) for introduction and spread based on risk scores. Risk categories for introduction and spread for each area were combined to provide overall risk categories to inform a risk-based surveillance programme directed at the area level. Applying the combined risk category designation framework for risk of introduction and spread suggested by European Commission guidance for risk-based surveillance, 4, 10 and 31 areas were classified as high, medium and low risk, respectively. © 2016 Crown copyright.
Code of Federal Regulations, 2010 CFR
2010-07-01
... identify and estimate safety and environmental management risks and appropriate risk reduction strategies... responsible for identifying/estimating risks and for appropriate risk reduction strategies? 102-80.50 Section... Environmental Management Risks and Risk Reduction Strategies § 102-80.50 Are Federal agencies responsible for...
Szabados, Florian; Mohner, Amelie; Kleine, Britta; Gatermann, Sören G
2013-10-01
Staphylococcal lipases have been proposed as pathogenicity factors. In Staphylococcus saprophyticus the surface-associated protein (Ssp) has been previously characterized as a cell wall-associated true lipase. A S. saprophyticus Δssp::ermB mutant has been described as less virulent in an in vivo model of urinary tract infection compared with its wild-type. This is the first report showing that S. saprophyticus induced a lifespan reduction in Caenorhabditis elegans similar to that of S. aureus RN4220. In two S. saprophyticus Δssp::ermB mutants lifespan reduction in C. elegans was partly abolished. In order to attribute virulence to the lipase activity itself and distinguish this phenomenon from the presence of the Ssp-protein, the conserved active site of the lipase was modified by site-directed ligase-independent mutagenesis and lipase activity-deficient mutants were constructed. These results indicate that the Ssp is associated with pathogenicity in C. elegans and one could speculate that the lipase activity itself is responsible for this virulence.
Szabados, Florian; Mohner, Amelie; Kleine, Britta; Gatermann, Sören G
2013-01-01
Staphylococcal lipases have been proposed as pathogenicity factors. In Staphylococcus saprophyticus the surface-associated protein (Ssp) has been previously characterized as a cell wall-associated true lipase. A S. saprophyticus Δssp::ermB mutant has been described as less virulent in an in vivo model of urinary tract infection compared with its wild-type. This is the first report showing that S. saprophyticus induced a lifespan reduction in Caenorhabditis elegans similar to that of S. aureus RN4220. In two S. saprophyticus Δssp::ermB mutants lifespan reduction in C. elegans was partly abolished. In order to attribute virulence to the lipase activity itself and distinguish this phenomenon from the presence of the Ssp-protein, the conserved active site of the lipase was modified by site-directed ligase-independent mutagenesis and lipase activity-deficient mutants were constructed. These results indicate that the Ssp is associated with pathogenicity in C. elegans and one could speculate that the lipase activity itself is responsible for this virulence. PMID:23959029
Krempely, Kate; Karam, Rachid
2018-05-24
Most truncating CDH1 pathogenic alterations confer an elevated lifetime risk of diffuse gastric cancer and lobular breast cancer. However, transcripts containing carboxyl-terminal (C-terminal) premature stop codons have been demonstrated to escape the nonsense-mediated mRNA decay (NMD) pathway, and gastric and breast cancer risks associated with these truncations should be carefully evaluated. A female patient underwent multigene panel testing due to a personal history of invasive lobular breast cancer diagnosed at age 54, which identified the germline CDH1 nonsense alteration, c.2506G>T (p.E836*), in the last exon of the gene. Subsequent parental testing for the alteration was negative and additional short tandem repeat analysis confirmed the familial relationships and the de novo occurrence in the proband. Based on the de novo occurrence, clinical history, and rarity in general population databases, this alteration was classified as a likely pathogenic variant. This is the most C-terminal pathogenic alteration reported to date. Additionally, this alteration contributed to the classification of six other upstream CDH1 C-terminal truncating variants as pathogenic or likely pathogenic. Identifying the most distal pathogenic alteration provides evidence to classify other C-terminal truncating variants as either pathogenic or benign, a fundamental step to offering pre-symptomatic screening and prophylactic procedures to the appropriate patients. Cold Spring Harbor Laboratory Press.
[Mosquitoes as vectors for exotic pathogens in Germany].
Becker, N; Krüger, A; Kuhn, C; Plenge-Bönig, A; Thomas, S M; Schmidt-Chanasit, J; Tannich, E
2014-05-01
As a result of intensified globalization of international trade and of substantial travel activities, mosquito-borne exotic pathogens are becoming an increasing threat for Europe. In Germany some 50 different mosquito species are known, several of which have vector competence for pathogens. During the last few years a number of zoonotic arboviruses that are pathogenic for humans have been isolated from mosquitoes in Germany including Usutu, Sindbis and Batai viruses. In addition, filarial worms, such as Dirofilaria repens have been repeatedly detected in mosquitoes from the federal state of Brandenburg. Other pathogens, in particular West Nile virus, are expected to emerge sooner or later in Germany as the virus is already circulating in neighboring countries, e.g. France, Austria and the Czech Republic. In upcoming years the risk for arbovirus transmission might increase in Germany due to increased occurrence of new so-called "invasive" mosquito species, such as the Asian bush mosquito Ochlerotatus japonicus or the Asian tiger mosquito Aedes albopictus. These invasive species are characterized by high vector competence for a broad range of pathogens and a preference for human blood meals. For risk assessment, a number of mosquito and pathogen surveillance projects have been initiated in Germany during the last few years; however, mosquito control strategies and plans of action have to be developed and put into place to allow early and efficient action against possible vector-borne epidemics.
Ashbolt, Nicholas J.
2015-01-01
Major waterborne (enteric) pathogens are relatively well understood and treatment controls are effective when well managed. However, water-based, saprozoic pathogens that grow within engineered water systems (primarily within biofilms/sediments) cannot be controlled by water treatment alone prior to entry into water distribution and other engineered water systems. Growth within biofilms or as in the case of Legionella pneumophila, primarily within free-living protozoa feeding on biofilms, results from competitive advantage. Meaning, to understand how to manage water-based pathogen diseases (a sub-set of saprozoses) we need to understand the microbial ecology of biofilms; with key factors including biofilm bacterial diversity that influence amoebae hosts and members antagonistic to water-based pathogens, along with impacts from biofilm substratum, water temperature, flow conditions and disinfectant residual—all control variables. Major saprozoic pathogens covering viruses, bacteria, fungi and free-living protozoa are listed, yet today most of the recognized health burden from drinking waters is driven by legionellae, non-tuberculous mycobacteria (NTM) and, to a lesser extent, Pseudomonas aeruginosa. In developing best management practices for engineered water systems based on hazard analysis critical control point (HACCP) or water safety plan (WSP) approaches, multi-factor control strategies, based on quantitative microbial risk assessments need to be developed, to reduce disease from largely opportunistic, water-based pathogens. PMID:26102291
Cunniffe, Nik J; Gilligan, Christopher A
2011-06-07
We develop and analyse a flexible compartmental model of the interaction between a plant host, a soil-borne pathogen and a microbial antagonist, for use in optimising biological control. By extracting invasion and persistence thresholds of host, pathogen and biological control agent, performing an equilibrium analysis, and numerical investigation of sensitivity to parameters and initial conditions, we determine criteria for successful biological control. We identify conditions for biological control (i) to prevent a pathogen entering a system, (ii) to eradicate a pathogen that is already present and, if that is not possible, (iii) to reduce the density of the pathogen. Control depends upon the epidemiology of the pathogen and how efficiently the antagonist can colonise particular habitats (i.e. healthy tissue, infected tissue and/or soil-borne inoculum). A sharp transition between totally effective control (i.e. eradication of the pathogen) and totally ineffective control can follow slight changes in biologically interpretable parameters or to the initial amounts of pathogen and biological control agent present. Effective biological control requires careful matching of antagonists to pathosystems. For preventative/eradicative control, antagonists must colonise susceptible hosts. However, for reduction in disease prevalence, the range of habitat is less important than the antagonist's bulking-up efficiency. Copyright © 2011 Elsevier Ltd. All rights reserved.
Evolution of pathogen virulence across space during an epidemic
Osnas, Erik; Hurtado, Paul J.; Dobson, Andrew P.
2015-01-01
We explore pathogen virulence evolution during the spatial expansion of an infectious disease epidemic in the presence of a novel host movement trade-off, using a simple, spatially explicit mathematical model. This work is motivated by empirical observations of the Mycoplasma gallisepticum invasion into North American house finch (Haemorhous mexicanus) populations; however, our results likely have important applications to other emerging infectious diseases in mobile hosts. We assume that infection reduces host movement and survival and that across pathogen strains the severity of these reductions increases with pathogen infectiousness. Assuming these trade-offs between pathogen virulence (host mortality), pathogen transmission, and host movement, we find that pathogen virulence levels near the epidemic front (that maximize wave speed) are lower than those that have a short-term growth rate advantage or that ultimately prevail (i.e., are evolutionarily stable) near the epicenter and where infection becomes endemic (i.e., that maximize the pathogen basic reproductive ratio). We predict that, under these trade-offs, less virulent pathogen strains will dominate the periphery of an epidemic and that more virulent strains will increase in frequency after invasion where disease is endemic. These results have important implications for observing and interpreting spatiotemporal epidemic data and may help explain transient virulence dynamics of emerging infectious diseases.
Horigan, V; De Nardi, M; Simons, R R L; Bertolini, S; Crescio, M I; Estrada-Peña, A; Léger, A; Maurella, C; Ru, G; Schuppers, M; Stärk, K D C; Adkin, A
2018-05-01
We present a novel approach of using the multi-criteria pathogen prioritisation methodology as a basis for selecting the most appropriate case studies for a generic risk assessment framework. The approach uses selective criteria to rank exotic animal health pathogens according to the likelihood of introduction and the impact of an outbreak if it occurred in the European Union (EU). Pathogens were evaluated based on their impact on production at the EU level and international trade. A subsequent analysis included criteria of relevance to quantitative risk assessment case study selection, such as the availability of data for parameterisation, the need for further research and the desire for the case studies to cover different routes of transmission. The framework demonstrated is flexible with the ability to adjust both the criteria and their weightings to the user's requirements. A web based tool has been developed using the RStudio shiny apps software, to facilitate this. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
Ensuring privacy in the study of pathogen genetics
Mehta, Sanjay R.; Vinterbo, Staal A.; Little, Susan J.
2014-01-01
Rapid growth in the genetic sequencing of pathogens in recent years has led to the creation of large sequence databases. This aggregated sequence data can be very useful for tracking and predicting epidemics of infectious diseases. However, the balance between the potential public health benefit and the risk to personal privacy for individuals whose genetic data (personal or pathogen) are included in such work has been difficult to delineate, because neither the true benefit nor the actual risk to participants has been adequately defined. Existing approaches to minimise the risk of privacy loss to participants are based on de-identification of data by removal of a predefined set of identifiers. These approaches neither guarantee privacy nor protect the usefulness of the data. We propose a new approach to privacy protection that will quantify the risk to participants, while still maximising the usefulness of the data to researchers. This emerging standard in privacy protection and disclosure control, which is known as differential privacy, uses a process-driven rather than data-centred approach to protecting privacy. PMID:24721230
Ensuring privacy in the study of pathogen genetics.
Mehta, Sanjay R; Vinterbo, Staal A; Little, Susan J
2014-08-01
Rapid growth in the genetic sequencing of pathogens in recent years has led to the creation of large sequence databases. This aggregated sequence data can be very useful for tracking and predicting epidemics of infectious diseases. However, the balance between the potential public health benefit and the risk to personal privacy for individuals whose genetic data (personal or pathogen) are included in such work has been difficult to delineate, because neither the true benefit nor the actual risk to participants has been adequately defined. Existing approaches to minimise the risk of privacy loss to participants are based on de-identification of data by removal of a predefined set of identifiers. These approaches neither guarantee privacy nor protect the usefulness of the data. We propose a new approach to privacy protection that will quantify the risk to participants, while still maximising the usefulness of the data to researchers. This emerging standard in privacy protection and disclosure control, which is known as differential privacy, uses a process-driven rather than data-centred approach to protecting privacy. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Du, Fangzhou; Keller, Jürg; Yuan, Zhiguo; Batstone, Damien J.; Freguia, Stefano; Pikaar, Ilje
2016-12-01
Sludge management is a major issue for water utilities globally. Poor digestibility and dewaterability are the main factors determining the cost for sludge management, whereas pathogen and toxic metal concentrations limit beneficial reuse. In this study, the effects of low level nitrite addition to acidified sludge to simultaneously enhance digestibility, toxic metal removal, dewaterability and pathogen reduction were investigated. Waste activated sludge (WAS) from a full-scale waste water treatment plant was treated at pH 2 with 10 mg NO2--N/L for 5 h. Biochemical methane potential tests showed an increase in the methane production of 28%, corresponding to an improvement from 247 ± 8 L CH4/kg VS to 317 ± 1 L CH4/kg VS. The enhanced removal of toxic metals further increased the methane production by another 18% to 360 ± 6 L CH4/kg VS (a total increase of 46%). The solids content of dewatered sludge increased from 14.6 ± 1.4% in the control to 18.2 ± 0.8%. A 4-log reduction for both total coliforms and E. coli was achieved. Overall, this study highlights the potential of acidification with low level nitrite addition as an effective and simple method achieving multiple improvements in terms of sludge management.
Du, Fangzhou; Keller, Jürg; Yuan, Zhiguo; Batstone, Damien J.; Freguia, Stefano; Pikaar, Ilje
2016-01-01
Sludge management is a major issue for water utilities globally. Poor digestibility and dewaterability are the main factors determining the cost for sludge management, whereas pathogen and toxic metal concentrations limit beneficial reuse. In this study, the effects of low level nitrite addition to acidified sludge to simultaneously enhance digestibility, toxic metal removal, dewaterability and pathogen reduction were investigated. Waste activated sludge (WAS) from a full-scale waste water treatment plant was treated at pH 2 with 10 mg NO2−-N/L for 5 h. Biochemical methane potential tests showed an increase in the methane production of 28%, corresponding to an improvement from 247 ± 8 L CH4/kg VS to 317 ± 1 L CH4/kg VS. The enhanced removal of toxic metals further increased the methane production by another 18% to 360 ± 6 L CH4/kg VS (a total increase of 46%). The solids content of dewatered sludge increased from 14.6 ± 1.4% in the control to 18.2 ± 0.8%. A 4-log reduction for both total coliforms and E. coli was achieved. Overall, this study highlights the potential of acidification with low level nitrite addition as an effective and simple method achieving multiple improvements in terms of sludge management. PMID:28004811
Schaffner, Donald W; Schaffner, Kristin M
2007-01-01
This research was undertaken to determine the effectiveness of an alcohol-based hand sanitizer on hands contaminated with a nonpathogen surrogate for Escherichia coli O157:H7, where the source of the contamination was frozen hamburger patties. A nonpathogenic nalidixic acid-resistant food-grade strain of Enterobacter aerogenes was used to inoculate frozen hamburger patties composed of 76% lean beef and 24% fat. Thirty-two individuals participated to produce the data used in this study. Each participant handled nine patties at least three times, a sample for microbiological analysis was collected from the surface of one hand, the participant sanitized both hands, and a sample was collected from the other hand. Burger handling created perceptible and visible food debris on the hands of most participants. Computer simulations also were used to perform a variety of risk calculations. The average reduction in bacteria from the use of sanitizer on hands contaminated by frozen burgers containing E. aerogenes was 2.6 +/- 0.7 log CFU per hand. An experiment designed to simultaneously test the effect of sanitizer on E. aerogenes and E. coli O157:H7 also revealed no significant difference in sanitizer effectiveness against the two organisms. The results of the real-world risk estimation calculations (using the actual prevalence and concentration of E. coli O157:H7 in ground beef) predict that once in 1 million trials, a single pathogen cell will be transferred to a single lettuce piece. The effectiveness of this sanitizer intervention was similar to that for hand washing and glove use previously reported. The person-to-person microbial reduction variability from sanitizer use is similar to published data for glove use and was less variable than published data on hand washing effectiveness.
de Jong, Aarieke E. I.; van Asselt, Esther D.; Zwietering, Marcel H.; Nauta, Maarten J.; de Jonge, Rob
2012-01-01
The aim of this research was to determine the decimal reduction times of bacteria present on chicken fillet in boiling water. The experiments were conducted with Campylobacter jejuni, Salmonella, and Escherichia coli. Whole chicken breast fillets were inoculated with the pathogens, stored overnight (4°C), and subsequently cooked. The surface temperature reached 70°C within 30 sec and 85°C within one minute. Extremely high decimal reduction times of 1.90, 1.97, and 2.20 min were obtained for C. jejuni, E. coli, and S. typhimurium, respectively. Chicken meat and refrigerated storage before cooking enlarged the heat resistance of the food borne pathogens. Additionally, a high challenge temperature or fast heating rate contributed to the level of heat resistance. The data were used to assess the probability of illness (campylobacteriosis) due to consumption of chicken fillet as a function of cooking time. The data revealed that cooking time may be far more critical than previously assumed. PMID:22389647
Rajkowski, Kathleen T; Ashurst, Kean
2009-11-01
To achieve the production of pathogen-free sprouts, there must be appropriate mixing of liquid sanitizer with the seeds to assure contact. Commercial treatments by irradiation or ozone gas of Salmonella spp. artificially inoculated seeds were compared, and these resulted in a 1 log reduction after all treatments. Use of peroxyacetic acid (1%) sanitizer on Salmonella spp. or Escherichia coli O157:H7 inoculated alfalfa seeds consistently resulted in a greater than 1 log reduction. In addition, during these studies debris was noted after the seeds were removed. Based on this observation, an air-mixing wash basin was developed for commercial use. Validation was done by commercial growers using 1% peroxyacetic acid sanitizer to wash seeds in the air-mixing basin, followed by sprouting the seeds. No positive or false-positive pathogen results were reported after the required testing of the sprout water (run-off during sprouting). Use of 1% peroxyacetic acid sanitizer in the air-mixing wash basin does provide the sprout grower an effective means of sanitizing sprout seeds.
Mokhtari, Amir; Oryang, David; Chen, Yuhuan; Pouillot, Regis; Van Doren, Jane
2018-01-08
We developed a probabilistic mathematical model for the postharvest processing of leafy greens focusing on Escherichia coli O157:H7 contamination of fresh-cut romaine lettuce as the case study. Our model can (i) support the investigation of cross-contamination scenarios, and (ii) evaluate and compare different risk mitigation options. We used an agent-based modeling framework to predict the pathogen prevalence and levels in bags of fresh-cut lettuce and quantify spread of E. coli O157:H7 from contaminated lettuce to surface areas of processing equipment. Using an unbalanced factorial design, we were able to propagate combinations of random values assigned to model inputs through different processing steps and ranked statistically significant inputs with respect to their impacts on selected model outputs. Results indicated that whether contamination originated on incoming lettuce heads or on the surface areas of processing equipment, pathogen prevalence among bags of fresh-cut lettuce and batches was most significantly impacted by the level of free chlorine in the flume tank and frequency of replacing the wash water inside the tank. Pathogen levels in bags of fresh-cut lettuce were most significantly influenced by the initial levels of contamination on incoming lettuce heads or surface areas of processing equipment. The influence of surface contamination on pathogen prevalence or levels in fresh-cut bags depended on the location of that surface relative to the flume tank. This study demonstrates that developing a flexible yet mathematically rigorous modeling tool, a "virtual laboratory," can provide valuable insights into the effectiveness of individual and combined risk mitigation options. © 2018 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Heusinkveld, M; Mughini-Gras, L; Pijnacker, R; Vennema, H; Scholts, R; van Huisstede-Vlaanderen, K W; Kortbeek, T; Kooistra-Smid, M; van Pelt, W
2016-10-01
Acute gastroenteritis (AGE) morbidity remains high amongst preschool children, posing a significant societal burden. Empirical data on AGE-causing agents is needed to gauge their clinical relevance and identify agent-specific targets for control. We assessed the prevalence, risk factors and association with symptoms for enteropathogens in households with preschool children. A monthly-repeated cross-sectional survey of enteropathogens in households with preschool children was performed. A parent-child pair per household (n = 907 households) provided faecal samples and reported their symptoms and potential risk exposures. Samples were tested by multiplex reverse transcription polymerase chain reaction (RT-PCR) for 19 enteropathogens. Associations were assessed using logistic regression. 28.3 % of children (n = 981) and 15.6 % of parents (n = 971) carried pathogenic bacteria and/or Escherichia coli-associated pathogenicity genes, and 6.5 % and 3.3 % carried viruses, respectively. Giardia lamblia (4.6 % of children, 2.5 % of parents) and Dientamoeba fragilis (36 %, 39 %, respectively) were the main parasites, and were associated with pet exposure. Living in rural areas was associated with carriage of pathogenic E. coli, norovirus I and D. fragilis. Pathogenic E. coli was associated with summertime and livestock exposure. Attending day-care centres increased the risk of carrying norovirus, sapovirus and G. lamblia. Viruses occurred mainly in winter and were associated with AGE symptoms. Child-parent associations were found for bacterial pathogenicity genes, viruses, G. lamblia and D. fragilis. Enteropathogens spread widely in households with preschool children, particularly viruses, which more often cause symptoms. While bacteria predominate during summer and in those exposed to livestock, viruses predominate in wintertime and, like G. lamblia, are widespread amongst day-care centre attendees.
Smith, R A; Griffin, D D; Dargatz, D A
1997-08-01
There are currently no scientifically defined critical management points or critical control points to manage foodborne pathogens at the pre-harvest level. Research is ongoing: much of the pre-harvest research is funded by producer organisations. The beef industry has Beef Quality Assurance (BQA) programmes in place and these are dynamic. Groups of cattlemen have made a very strong commitment to reducing foodborne pathogens in beef. Fewer Escherichia coli O157:H7 organisms are shed by feedlot cattle near the end of the feeding period than by newly arrived cattle. Moreover, there is less shedding of the organisms in cattle of slaughter age than in younger cattle. The prevalence of E. coli O157:H7 in feedlot cattle is similar to that in range cattle. This suggests that concentrating cattle in feedlot dirt pens does not increase the risk of shedding E. coli organisms. Pen maintenance, considered a good management practice, appears to be an adequate means of keeping pathogen levels in pens low. It is not likely that pre-harvest food safety programmes will eliminate the threat of pathogens such as E. coli O157:H7 or Salmonella. The management of foodborne pathogens will become part of an integrated programme to enhance food safety which includes the producer, the packer, the distributors, retailers and the consumer. The feedlot industry initiated a residue avoidance programme several years ago. As a result, the risk of chemical residues in beef from feedlots in the United States of America is near zero. Hazard analysis and critical control point-type prevention programmes, using scientifically based critical management points, will help ensure that the risk remains negligible.
Interrelationships of food safety and plant pathology: the life cycle of human pathogens on plants.
Barak, Jeri D; Schroeder, Brenda K
2012-01-01
Bacterial food-borne pathogens use plants as vectors between animal hosts, all the while following the life cycle script of plant-associated bacteria. Similar to phytobacteria, Salmonella, pathogenic Escherichia coli, and cross-domain pathogens have a foothold in agricultural production areas. The commonality of environmental contamination translates to contact with plants. Because of the chronic absence of kill steps against human pathogens for fresh produce, arrival on plants leads to persistence and the risk of human illness. Significant research progress is revealing mechanisms used by human pathogens to colonize plants and important biological interactions between and among bacteria in planta. These findings articulate the difficulty of eliminating or reducing the pathogen from plants. The plant itself may be an untapped key to clean produce. This review highlights the life of human pathogens outside an animal host, focusing on the role of plants, and illustrates areas that are ripe for future investigation.
Nguyen-Dumont, Tú; Teo, Zhi L; Hammet, Fleur; Roberge, Alexis; Mahmoodi, Maryam; Tsimiklis, Helen; Park, Daniel J; Pope, Bernard J; Lonie, Andrew; Kapuscinski, Miroslav K; Mahmood, Khalid; Goldgar, David E; Giles, Graham G; Winship, Ingrid; Hopper, John L; Southey, Melissa C
2018-02-08
Breast cancer risk for BRCA1 and BRCA2 pathogenic mutation carriers is modified by risk factors that cluster in families, including genetic modifiers of risk. We considered genetic modifiers of risk for carriers of high-risk mutations in other breast cancer susceptibility genes. In a family known to carry the high-risk mutation PALB2:c.3113G>A (p.Trp1038*), whole-exome sequencing was performed on germline DNA from four affected women, three of whom were mutation carriers. RNASEL:p.Glu265* was identified in one of the PALB2 carriers who had two primary invasive breast cancer diagnoses before 50 years. Gene-panel testing of BRCA1, BRCA2, PALB2 and RNASEL in the Australian Breast Cancer Family Registry identified five carriers of RNASEL:p.Glu265* in 591 early onset breast cancer cases. Three of the five women (60%) carrying RNASEL:p.Glu265* also carried a pathogenic mutation in a breast cancer susceptibility gene compared with 30 carriers of pathogenic mutations in the 586 non-carriers of RNASEL:p.Glu265* (5%) (p < 0.002). Taqman genotyping demonstrated that the allele frequency of RNASEL:p.Glu265* was similar in affected and unaffected Australian women, consistent with other populations. Our study suggests that RNASEL:p.Glu265* may be a genetic modifier of risk for early-onset breast cancer predisposition in carriers of high-risk mutations. Much larger case-case and case-control studies are warranted to test the association observed in this report.
Borrelia miyamotoi, Other Vector-Borne Agents in Cat Blood and Ticks in Eastern Maryland.
Shannon, Avery B; Rucinsky, Renee; Gaff, Holly D; Brinkerhoff, R Jory
2017-12-01
We collected blood and tick samples in eastern Maryland to quantify vector-borne pathogen exposure and infection in healthy cats and to assess occupational disease risk to veterinary professionals and others who regularly interact with household pets. Thirty-six percent of healthy cats parasitized by ticks at time of examination (9/25) were exposed to, and 14% of bloods (7/49) tested PCR-positive for, at least one vector-borne pathogen including several bloods and ticks with Borrelia miyamotoi, a recently recognized tick-borne zoonotic bacterium. There was no indication that high tick burdens were associated with exposure to vector-borne pathogens. Our results underscore the potential importance of cats to human vector-borne disease risk.
Vector-borne diseases and climate change: a European perspective
Suk, Jonathan E
2017-01-01
Abstract Climate change has already impacted the transmission of a wide range of vector-borne diseases in Europe, and it will continue to do so in the coming decades. Climate change has been implicated in the observed shift of ticks to elevated altitudes and latitudes, notably including the Ixodes ricinus tick species that is a vector for Lyme borreliosis and tick-borne encephalitis. Climate change is also thought to have been a factor in the expansion of other important disease vectors in Europe: Aedes albopictus (the Asian tiger mosquito), which transmits diseases such as Zika, dengue and chikungunya, and Phlebotomus sandfly species, which transmits diseases including Leishmaniasis. In addition, highly elevated temperatures in the summer of 2010 have been associated with an epidemic of West Nile Fever in Southeast Europe and subsequent outbreaks have been linked to summer temperature anomalies. Future climate-sensitive health impacts are challenging to project quantitatively, in part due to the intricate interplay between non-climatic and climatic drivers, weather-sensitive pathogens and climate-change adaptation. Moreover, globalisation and international air travel contribute to pathogen and vector dispersion internationally. Nevertheless, monitoring forecasts of meteorological conditions can help detect epidemic precursors of vector-borne disease outbreaks and serve as early warning systems for risk reduction. PMID:29149298
Hancock, P.A; Thomas, M.B; Godfray, H.C.J
2008-01-01
It has recently been proposed that mosquito vectors of human diseases, particularly malaria, may be controlled by spraying with fungal biopesticides that increase the rate of adult mortality. Though fungal pathogens do not cause instantaneous mortality, they can kill mosquitoes before they are old enough to transmit disease. A model is developed (i) to explore the potential for fungal entomopathogens to reduce significantly infectious mosquito populations, (ii) to assess the relative value of the many different fungal strains that might be used, and (iii) to help guide the tactical design of vector-control programmes. The model follows the dynamics of different classes of adult mosquitoes with the risk of mortality due to the fungus being assumed to be a function of time since infection (modelled using the Weibull distribution). It is shown that substantial reductions in mosquito numbers are feasible for realistic assumptions about mosquito, fungus and malaria biology and moderate to low daily fungal infection probability. The choice of optimal fungal strain and spraying regime is shown to depend on local mosquito and malaria biology. Fungal pathogens may also influence the ability of mosquitoes to transmit malaria and such effects are shown to further reduce vectorial capacity. PMID:18765347